Tutorials
CompNeuro 101
Building spiking neural network models in GeNN
Neurons
Create a model consisting of a population of Izhikevich neurons with heterogeneous parameters, driven by a stimulus current. Simulate and record state variables.
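As a rough illustration of what this tutorial builds, the following PyGeNN sketch assumes the GeNN 4.x Python API; the population size, the classic Izhikevich parameter sets and the stimulus amplitude are illustrative placeholders rather than the tutorial's exact values:

    import numpy as np
    from pygenn.genn_model import GeNNModel

    model = GeNNModel("float", "izhikevich_neurons")
    model.dT = 0.1  # simulation timestep in ms

    # "IzhikevichVariable" exposes a, b, c and d as per-neuron state variables,
    # so each neuron can be given different dynamics (values here are illustrative)
    izk_init = {"V": -65.0, "U": -20.0,
                "a": [0.02, 0.1, 0.02, 0.02],
                "b": [0.2, 0.2, 0.2, 0.25],
                "c": [-65.0, -65.0, -50.0, -65.0],
                "d": [8.0, 2.0, 2.0, 2.0]}
    pop = model.add_neuron_population("Neurons", 4, "IzhikevichVariable", {}, izk_init)

    # Constant stimulus current driving the whole population
    model.add_current_source("Stim", "DC", pop, {"amp": 10.0}, {})

    model.build()
    model.load()

    # Simulate 200 ms, copying the membrane voltage back from the GPU every step
    voltages = []
    while model.t < 200.0:
        model.step_time()
        pop.pull_var_from_device("V")
        voltages.append(pop.vars["V"].view.copy())
    voltages = np.vstack(voltages)

Each row of voltages then holds the membrane potentials of the whole population at one timestep, ready for plotting.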
Synapses
Create a simple balanced random network with two sparsely connected populations of leaky integrate-and-fire neurons. Simulate and record spikes.
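A condensed sketch of this kind of network, again assuming the GeNN 4.x Python API; population sizes, LIF parameters, weights and connection probability are placeholders, and only two of the four recurrent projections are shown:

    from pygenn.genn_model import GeNNModel, init_connectivity

    model = GeNNModel("float", "balanced_network")
    model.dT = 1.0

    # Two populations of LIF neurons (parameter values are placeholders)
    lif_params = {"C": 1.0, "TauM": 20.0, "Vrest": -49.0, "Vreset": -60.0,
                  "Vthresh": -50.0, "Ioffset": 0.0, "TauRefrac": 5.0}
    lif_init = {"V": -55.0, "RefracTime": 0.0}
    exc = model.add_neuron_population("E", 3200, "LIF", lif_params, lif_init)
    inh = model.add_neuron_population("I", 800, "LIF", lif_params, lif_init)
    exc.spike_recording_enabled = True
    inh.spike_recording_enabled = True

    # Sparse random excitatory and inhibitory projections (a full balanced
    # network would also wire up the recurrent E->E and I->I connections)
    model.add_synapse_population(
        "EI", "SPARSE_GLOBALG", 0, exc, inh,
        "StaticPulse", {}, {"g": 0.1}, {}, {},
        "ExpCurr", {"tau": 5.0}, {},
        init_connectivity("FixedProbability", {"prob": 0.1}))
    model.add_synapse_population(
        "IE", "SPARSE_GLOBALG", 0, inh, exc,
        "StaticPulse", {}, {"g": -0.5}, {}, {},
        "ExpCurr", {"tau": 10.0}, {},
        init_connectivity("FixedProbability", {"prob": 0.1}))

    model.build()
    model.load(num_recording_timesteps=1000)

    # Simulate 1000 timesteps, then download and unpack the recorded spikes
    for _ in range(1000):
        model.step_time()
    model.pull_recording_buffers_from_device()
    exc_spike_times, exc_spike_ids = exc.spike_recording_data

Using the built-in spike recording system (spike_recording_enabled plus num_recording_timesteps) keeps spikes on the GPU until the end of the run, which is much faster than pulling them every timestep.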
MNIST inference
Perform MNIST inference by converting a pre-trained ANN to an SNN.
Presenting a single image
Create a simple three-layer network of integrate-and-fire neurons, densely connected with pre-trained weights. Present a single MNIST image and visualise spiking activity.
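A minimal sketch of such a network, assuming the GeNN 4.x Python API; the hidden-layer size, thresholds, input scaling and the weight/image file names (weights_input_hidden.npy and so on) are hypothetical placeholders:

    import numpy as np
    from pygenn.genn_model import (GeNNModel, create_custom_neuron_class,
                                   create_custom_current_source_class)

    # Simple non-leaky integrate-and-fire neuron, as typically used for ANN->SNN conversion
    if_model = create_custom_neuron_class(
        "IF", param_names=["Vthr"],
        var_name_types=[("V", "scalar"), ("SpikeCount", "unsigned int")],
        sim_code="$(V) += $(Isyn) * DT;",
        threshold_condition_code="$(V) >= $(Vthr)",
        reset_code="$(V) = 0.0;\n$(SpikeCount)++;")

    # Current source whose per-neuron magnitude is set from the image pixels
    cs_model = create_custom_current_source_class(
        "image_current", var_name_types=[("magnitude", "scalar")],
        injection_code="$(injectCurrent, $(magnitude));")

    model = GeNNModel("float", "mnist_single_image")
    model.dT = 1.0

    if_init = {"V": 0.0, "SpikeCount": 0}
    inp = model.add_neuron_population("Input", 784, if_model, {"Vthr": 5.0}, if_init)
    hid = model.add_neuron_population("Hidden", 128, if_model, {"Vthr": 5.0}, if_init)
    out = model.add_neuron_population("Output", 10, if_model, {"Vthr": 5.0}, if_init)
    hid.spike_recording_enabled = True

    cs = model.add_current_source("InputCurrent", cs_model, inp, {}, {"magnitude": 0.0})

    # Dense connections initialised from pre-trained weight matrices (hypothetical files)
    w_ih = np.load("weights_input_hidden.npy")   # shape (784, 128)
    w_ho = np.load("weights_hidden_output.npy")  # shape (128, 10)
    model.add_synapse_population("IH", "DENSE_INDIVIDUALG", 0, inp, hid,
                                 "StaticPulse", {}, {"g": w_ih.flatten()}, {}, {},
                                 "DeltaCurr", {}, {})
    model.add_synapse_population("HO", "DENSE_INDIVIDUALG", 0, hid, out,
                                 "StaticPulse", {}, {"g": w_ho.flatten()}, {}, {},
                                 "DeltaCurr", {}, {})

    model.build()
    model.load(num_recording_timesteps=100)

    # Present one image as input current and run for 100 timesteps
    image = np.load("test_image.npy").flatten()  # hypothetical 28x28 image
    cs.vars["magnitude"].view[:] = image * 0.01  # scaling is a placeholder
    cs.push_var_to_device("magnitude")
    for _ in range(100):
        model.step_time()
    model.pull_recording_buffers_from_device()
    spike_times, spike_ids = hid.spike_recording_data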
Classifying entire test set
Present the entire MNIST test set to the model from the previous tutorial and calculate classification accuracy.
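A sketch of the evaluation loop, reusing the model, current source cs and populations inp, hid and out from the previous sketch, and assuming MNIST arrays images (shape N x 784) and labels are already loaded; the presentation time and input scaling are placeholders:

    import numpy as np

    num_correct = 0
    present_timesteps = 100

    for image, label in zip(images, labels):
        # Reset neuron state between images
        for layer in [inp, hid, out]:
            layer.vars["V"].view[:] = 0.0
            layer.push_var_to_device("V")
        out.vars["SpikeCount"].view[:] = 0
        out.push_var_to_device("SpikeCount")

        # Set the input current from the image and simulate
        cs.vars["magnitude"].view[:] = image * 0.01
        cs.push_var_to_device("magnitude")
        for _ in range(present_timesteps):
            model.step_time()

        # Classify by the output neuron that spiked most
        out.pull_var_from_device("SpikeCount")
        if np.argmax(out.vars["SpikeCount"].view) == label:
            num_correct += 1

    print(f"Accuracy: {100.0 * num_correct / len(images):.2f}%")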
Improve classification performance
Use parallel batching and custom updates to improve inference performance by over 30x compared to the previous tutorial.
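A heavily hedged sketch of the two ideas, applied to the previous model definition before it is built; it assumes the GeNN 4.6-era batching and custom-update API (model.batch_size, create_custom_custom_update_class, create_var_ref and model.custom_update), so exact names and signatures may differ between GeNN versions:

    from pygenn.genn_model import (create_custom_custom_update_class,
                                   create_var_ref)

    # Custom update that zeroes a referenced state variable; every instance is
    # added to the "Reset" group so the whole network can be reset in one call
    reset_model = create_custom_custom_update_class(
        "reset", var_refs=[("V", "scalar")],
        update_code="$(V) = 0.0;")

    # Process 128 images in parallel on the GPU
    model.batch_size = 128

    for name, layer in zip(["Input", "Hidden", "Output"], [inp, hid, out]):
        model.add_custom_update("reset_" + name, "Reset", reset_model,
                                {}, {}, {"V": create_var_ref(layer, "V")})
    # (a similar custom update would reset the output layer's SpikeCount)

    model.build()
    model.load(num_recording_timesteps=100)

    # Between batches, reset state entirely on the device (no host<->device
    # copies) and then write a whole batch of images at once
    model.custom_update("Reset")
    cs.vars["magnitude"].view[:] = image_batch * 0.01  # with batching, the view covers all 128 images
    cs.push_var_to_device("magnitude")

Batching amortises kernel-launch overheads across many images, while the custom update removes the per-image variable transfers of the naive loop.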
Insect-inspired MNIST classification
Train a model of the insect mushroom body using an STDP learning rule to classify MNIST.
Projection Neurons
Create the first layer of Projection Neurons which convert input images into a sparse temporal code.
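A simplified sketch of the input layer, assuming the GeNN 4.x Python API; here the PNs are plain LIF neurons driven by currents proportional to pixel intensity, so brighter pixels fire earlier (a simple latency code). The tutorial's actual PN model and parameter values may differ:

    from pygenn.genn_model import GeNNModel, create_custom_current_source_class

    model = GeNNModel("float", "mushroom_body")
    model.dT = 0.1

    # One Projection Neuron per pixel (parameter values are placeholders)
    pn_params = {"C": 1.0, "TauM": 20.0, "Vrest": -60.0, "Vreset": -60.0,
                 "Vthresh": -50.0, "Ioffset": 0.0, "TauRefrac": 2.0}
    pn = model.add_neuron_population("PN", 28 * 28, "LIF", pn_params,
                                     {"V": -60.0, "RefracTime": 0.0})
    pn.spike_recording_enabled = True

    # Current source whose per-neuron magnitude is set from the image pixels
    cs_model = create_custom_current_source_class(
        "image_current", var_name_types=[("magnitude", "scalar")],
        injection_code="$(injectCurrent, $(magnitude));")
    pn_input = model.add_current_source("PNInput", cs_model, pn, {}, {"magnitude": 0.0})

Presenting an image then amounts to writing scaled pixel intensities into magnitude and pushing the variable to the device, as in the MNIST sketches above.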
Kenyon Cells
Add a second, randomly-connected layer of Kenyon Cells to the model.
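Continuing the sketch, the KC layer is a much larger population receiving sparse random input from the PNs; the population size, weight and connection probability are placeholders, and the tutorial may instead fix the number of PN inputs per KC:

    from pygenn.genn_model import init_connectivity

    # Large Kenyon Cell population, each cell receiving sparse random PN input
    kc = model.add_neuron_population("KC", 20000, "LIF", pn_params,
                                     {"V": -60.0, "RefracTime": 0.0})
    kc.spike_recording_enabled = True

    model.add_synapse_population(
        "PN_KC", "SPARSE_GLOBALG", 0, pn, kc,
        "StaticPulse", {}, {"g": 0.2}, {}, {},
        "ExpCurr", {"tau": 3.0}, {},
        init_connectivity("FixedProbability", {"prob": 0.01}))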
Kenyon Cell gain control
Add a recurrent inhibition circuit, inspired by the Giant GABAergic Neuron in locusts, to improve the sparse coding of the Kenyon Cells.
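A simplified spiking sketch of the feedback loop: a single "giant" inhibitory neuron pools the activity of all KCs and inhibits the whole population in return. The tutorial's GGN model (the biological GGN is non-spiking) and the weights used here may differ:

    # Single inhibitory neuron pooling KC activity and feeding it back
    ggn = model.add_neuron_population("GGN", 1, "LIF", pn_params,
                                      {"V": -60.0, "RefracTime": 0.0})

    # KC -> GGN: every KC excites the GGN
    model.add_synapse_population(
        "KC_GGN", "DENSE_GLOBALG", 0, kc, ggn,
        "StaticPulse", {}, {"g": 0.05}, {}, {},
        "ExpCurr", {"tau": 5.0}, {})

    # GGN -> KC: the GGN inhibits all KCs, keeping the KC code sparse
    model.add_synapse_population(
        "GGN_KC", "DENSE_GLOBALG", 0, ggn, kc,
        "StaticPulse", {}, {"g": -1.0}, {}, {},
        "ExpCurr", {"tau": 10.0}, {})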
Mushroom Body Output Neurons
Add Mushroom Body Output Neurons with STDP learning and train the model on the MNIST training set.
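A sketch of this step using a generic additive STDP rule written with create_custom_weight_update_class; the tutorial's actual learning rule, parameter values and training protocol differ in detail:

    from pygenn.genn_model import create_custom_weight_update_class, init_var

    # Generic additive STDP: depress on pre-after-post, potentiate on
    # post-after-pre, with weights clamped to [wMin, wMax]
    stdp_model = create_custom_weight_update_class(
        "stdp_additive",
        param_names=["tau", "aPlus", "aMinus", "wMin", "wMax"],
        var_name_types=[("g", "scalar")],
        sim_code="""
            $(addToInSyn, $(g));
            const scalar dt = $(t) - $(sT_post);
            $(g) = fmax($(wMin), $(g) - ($(aMinus) * exp(-dt / $(tau))));
            """,
        learn_post_code="""
            const scalar dt = $(t) - $(sT_pre);
            $(g) = fmin($(wMax), $(g) + ($(aPlus) * exp(-dt / $(tau))));
            """,
        is_pre_spike_time_required=True,
        is_post_spike_time_required=True)

    # One Mushroom Body Output Neuron per MNIST class, learning KC->MBON weights
    mbon = model.add_neuron_population("MBON", 10, "LIF", pn_params,
                                       {"V": -60.0, "RefracTime": 0.0})
    kc_mbon = model.add_synapse_population(
        "KC_MBON", "DENSE_INDIVIDUALG", 0, kc, mbon,
        stdp_model, {"tau": 20.0, "aPlus": 0.01, "aMinus": 0.012,
                     "wMin": 0.0, "wMax": 0.1},
        {"g": init_var("Uniform", {"min": 0.0, "max": 0.1})}, {}, {},
        "ExpCurr", {"tau": 5.0}, {})

Training then consists of presenting each MNIST training image to the PN layer while the STDP rule updates the KC->MBON weights on the device.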
Testing
Create a simplified copy of the model without learning, load in the trained weights and calculate inference accuracy on the MNIST test set.
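A sketch of the weight hand-over, assuming the GeNN 4.x Python API; the file name kc_mbon_g.npy is a hypothetical placeholder, and the testing model is built exactly like the training one except for this synapse population:

    import numpy as np

    # After training: download the learned KC->MBON weights and save them
    kc_mbon.pull_var_from_device("g")
    np.save("kc_mbon_g.npy", kc_mbon.vars["g"].view.copy())  # hypothetical file name

    # In the testing model, the plastic STDP synapses are replaced by static
    # ones initialised from the saved weights (construction otherwise as before)
    trained_g = np.load("kc_mbon_g.npy")
    model.add_synapse_population(
        "KC_MBON", "DENSE_INDIVIDUALG", 0, kc, mbon,
        "StaticPulse", {}, {"g": trained_g}, {}, {},
        "ExpCurr", {"tau": 5.0}, {})

Accuracy is then calculated much as in the earlier MNIST evaluation loop: present each test image, count MBON spikes and take the most active MBON as the predicted class.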