In terms of neural network theory, resting-state activity is enabled by the brain's intrinsic dynamics, which arise from recurrent connections. Recurrent connections mean that neurons are not arranged in successive layers where the signal travels in only one direction: instead, the outputs of some neurons are fed back to the very neurons that provided their input. A neuron's output can even be fed back to that neuron itself. With such feedback, neurons can learn to sustain each other's activity: neuron A activates neuron B, which by recurrence activates neuron A again, and so on. Even a single neuron can sustain its own activity by sending feedback activation to itself (Hopfield, 1982; Hochreiter and Schmidhuber, 1997). Such recurrent connections are extremely common in the brain, whereas the most commonly used neural network models have none; this is an important discrepancy.
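The self-sustaining feedback described above can be sketched in a few lines. The following toy simulation (not from the source; the weight and bias values are illustrative choices that make the unit bistable) shows a single neuron whose output is fed back to itself: a brief external input switches it into a high-activity state, and the self-connection alone then keeps that activity going after the input is removed.

```python
import math

def step(a, inp=0.0, w_rec=8.0, bias=-4.0):
    """One update of a self-connected neuron: sigmoid of its own
    fed-back activity plus any external input.
    (w_rec, bias, and inp values are illustrative, not from the source.)"""
    return 1.0 / (1.0 + math.exp(-(w_rec * a + bias + inp)))

# Briefly drive the neuron with external input...
a_driven = 0.0
for _ in range(5):
    a_driven = step(a_driven, inp=8.0)

# ...then remove the input: self-feedback alone sustains the activity.
for _ in range(50):
    a_driven = step(a_driven)

# A neuron that was never driven settles at its low-activity state.
a_silent = 0.0
for _ in range(50):
    a_silent = step(a_silent)

print(a_driven, a_silent)  # driven unit stays high, silent unit stays low
```

Because the recurrent weight is strong enough relative to the negative bias, the unit has two stable fixed points, one near 0 and one near 1; the transient input simply selects which one the dynamics settle into. This is the single-neuron analogue of the attractor dynamics associated with Hopfield-style recurrent networks.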