Neural Mass Dynamics
The Wilson–Cowan model describes the dynamics of interactions between populations of very simple excitatory and inhibitory model neurons. In the Wilson–Cowan model, the activity of a neural population can be understood as its instantaneous firing rate, i.e. the proportion of neurons in the population that are firing at a given moment.
In its simplest form, the dynamics of a single neural population can then be modeled with an equation of the following form:

τ dr/dt = −r + F(x)

Here, r represents the population firing rate, x is an external input applied to the neural mass, and τ is a time constant that sets the timescale of changes in activity. In the following section, we will learn in more detail how the dynamics of such a system evolve, but, for now, we will focus on F(x).
F(x) is called the activation function, since it describes how the neural mass ‘activates’ in response to an external perturbation.
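To build intuition for the equation above, here is a minimal sketch that integrates τ dr/dt = −r + F(x) with the forward-Euler method. The sigmoid form of F and its parameters (gain a, threshold theta) are illustrative assumptions, not the notebook's exact definition:

```python
import numpy as np

def F(x, a=1.0, theta=0.0):
    # Assumed sigmoid activation; a (gain) and theta (threshold)
    # are illustrative parameter names, not fixed by the text.
    return 1.0 / (1.0 + np.exp(-a * (x - theta)))

def simulate(x, tau=1.0, r0=0.0, dt=0.01, t_max=10.0):
    # Forward-Euler integration of tau * dr/dt = -r + F(x)
    steps = int(t_max / dt)
    r = np.empty(steps)
    r[0] = r0
    for t in range(1, steps):
        r[t] = r[t - 1] + dt * (-r[t - 1] + F(x)) / tau
    return r

r = simulate(x=1.0)
```

For a constant input x, the rate relaxes exponentially (with time constant τ) toward the fixed point r* = F(x), which is why the shape of F largely determines the population's steady-state behavior.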
We will start our tutorial by defining F(x).
(Definition quoted from en.wikipedia.org.)

Activation Function
There are several types of activation functions. Here is a list of the most common ones used in machine learning:
- Rectified Linear Unit (ReLU)
- Leaky ReLU
- Hyperbolic tangent
- Sigmoid
- (…)
We will not go into detail here, but this article discusses the most common functions and their drawbacks and benefits in detail.
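For reference, the functions listed above can each be written in a line or two of NumPy; this sketch shows one common formulation of each (the leaky-ReLU slope alpha is a conventional default, not specified in the text):

```python
import numpy as np

def relu(x):
    # Rectified Linear Unit: zero for negative inputs, identity otherwise
    return np.maximum(0.0, x)

def leaky_relu(x, alpha=0.01):
    # Like ReLU, but with a small slope alpha for negative inputs
    return np.where(x > 0, x, alpha * x)

def tanh(x):
    # Hyperbolic tangent: squashes inputs into (-1, 1)
    return np.tanh(x)

def sigmoid(x):
    # Logistic sigmoid: squashes inputs into (0, 1)
    return 1.0 / (1.0 + np.exp(-x))
```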

(Article: towardsdatascience.com)

Exercise 1
Now, we will implement F(x) in Python and explore different parameter combinations to find out what each parameter does.
Please complete Exercise 1 in the Colab Notebook that you have opened in your drive and try to answer the following questions:
- What characteristics of the activation function can you vary by changing each parameter?
- How do you think each parameter impacts population dynamics?
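As a hint for exploring the questions above, the sketch below varies two candidate parameters of a sigmoid F(x): a gain a and a threshold theta. Both the functional form and the parameter names are assumptions for illustration; the notebook's F(x) may be parameterized differently:

```python
import numpy as np

def F(x, a, theta):
    # Assumed sigmoid activation: a controls the gain (steepness),
    # theta the threshold (horizontal position).
    return 1.0 / (1.0 + np.exp(-a * (x - theta)))

x = np.linspace(-5.0, 5.0, 1001)

# Larger gain a -> steeper curve: the maximum slope grows with a
slopes = {a: np.max(np.gradient(F(x, a, 0.0), x)) for a in (0.5, 1.0, 4.0)}

# Larger theta -> curve shifts right: half-activation (F = 0.5)
# always occurs exactly at x = theta
midpoints = {theta: F(theta, 1.0, theta) for theta in (-2.0, 0.0, 2.0)}
```

Plotting F(x) for each parameter combination (e.g. with matplotlib) makes these effects visible at a glance and connects directly to the second question: a steeper or shifted activation function changes where and how sharply the population responds to input.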