Custom update models

The neuron groups, synapse groups and current sources described in previous sections are all updated automatically every timestep. However, in many types of model, there are also processes that would benefit from GPU acceleration but only need to be triggered occasionally. For example, such updates could be used in a classifier to reset the state of neurons after a stimulus has been presented, or in a model which uses gradient-based learning to optimize network weights based on gradients accumulated over several timesteps.

Custom updates allow such updates to be described as models, similar to the neuron and synapse models described in the preceding sections. The custom update system also provides functionality for efficiently calculating the transpose of variables associated with synapse groups with SynapseMatrixType::DENSE_INDIVIDUALG connectivity. The predefined CustomUpdateModels::Transpose model can be used to do just this for a synapse group with a single variable.
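
As an illustration, a hypothetical sketch of attaching this predefined model to a pair of dense synapse groups might look like the following (the custom update name, update group name, synapse group pointers fwdSyn and bwdSyn, and the weight variable name "g" are all assumptions for this example):

    // Hypothetical sketch: fwdSyn and bwdSyn are assumed to be pointers to
    // SynapseMatrixType::DENSE_INDIVIDUALG synapse groups already added to a
    // ModelSpec named model, each with a weight variable named "g"
    model.addCustomUpdate<CustomUpdateModels::Transpose>(
        "WeightTranspose",       // name of this custom update
        "CalculateTranspose",    // update group used to trigger it from user code
        {}, {},                  // the Transpose model has no parameters or state variables
        {createWUVarRef(fwdSyn, "g", bwdSyn, "g")});  // write bwdSyn's g as the transpose of fwdSyn's g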

Defining your own custom update model

In order to define a new custom update model for use in a GeNN application, it is necessary to define a new class derived from CustomUpdateModels::Base. For convenience, the methods this class should implement can be implemented using macros:

  • SET_PARAM_NAMES(), SET_DERIVED_PARAMS(), SET_VARS() and SET_EXTRA_GLOBAL_PARAMS() perform the same roles as they do in the neuron models discussed in Defining your own neuron type.
  • DECLARE_CUSTOM_UPDATE_MODEL(TYPE, NUM_PARAMS, NUM_VARS, NUM_VAR_REFS) declares the boilerplate for a custom update model with NUM_PARAMS parameters, NUM_VARS state variables and NUM_VAR_REFS variable references.
  • SET_VAR_REFS() defines the names, type strings (e.g. "float", "double", etc) and (optionally) access mode of the variable references. The variables defined here as NAME can then be used in the syntax $(NAME) in the update code string. Variable reference types must match those of the underlying variables. Supported access modes include VarAccessMode::READ_WRITE, VarAccessMode::READ_ONLY and reduction modes such as VarAccessMode::REDUCE_NEURON_MAX.
  • SET_EXTRA_GLOBAL_PARAM_REFS() defines the names and type strings (e.g. "float", "double", etc) of the extra global parameter references. The extra global parameters defined here as NAME can then be used in the syntax $(NAME) in the update code string. Extra global parameter reference types must match those of the underlying extra global parameters and only pointer-type extra global parameters are supported.
  • SET_UPDATE_CODE(UPDATE_CODE): where UPDATE_CODE contains the code to perform the custom update.

For example, using these macros, we can define a custom update which will set a referenced variable to the value of a custom update model state variable:
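
A minimal sketch of such a model might look like the following (the class name Setter and the variable names V and R are illustrative):

    class Setter : public CustomUpdateModels::Base
    {
    public:
        DECLARE_CUSTOM_UPDATE_MODEL(Setter, 0, 1, 1);   // 0 parameters, 1 variable, 1 variable reference

        // Copy the custom update's own state variable V into the referenced variable R
        SET_UPDATE_CODE("$(R) = $(V);\n");

        SET_VARS({{"V", "scalar"}});
        SET_VAR_REFS({{"R", "scalar", VarAccessMode::READ_WRITE}});
    };
    IMPLEMENT_MODEL(Setter);

Such a model would then be added to the network with ModelSpec::addCustomUpdate, passing a variable reference (created with a helper such as createVarRef) to the variable that should be overwritten.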

When used in a model with batch size > 1, whether custom updates of this sort are batched or not depends on the variables their references point to. If any referenced variables have access modes which are duplicated across batches (such as VarAccess::READ_WRITE or VarAccess::READ_ONLY_DUPLICATE), then the update will be batched and any variables associated with the custom update which also have duplicated access modes will be duplicated across the batches.

Batch reduction

As well as the standard variable access modes described in Defining neuron populations, custom updates support variables with several 'batch reduction' access modes, such as VarAccess::REDUCE_BATCH_SUM and VarAccess::REDUCE_BATCH_MAX.

These access modes allow values read from variables duplicated across batches to be reduced into variables that are shared across batches. For example, in a gradient-based learning scenario, a model like this could be used to sum gradients from across all batches so they can be used as the input to a learning rule operating on shared synaptic weights:
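
A sketch of such a model, assuming the per-batch gradients are accumulated in a synaptic variable referenced here as gradient, might be:

    class GradientBatchReduce : public CustomUpdateModels::Base
    {
    public:
        DECLARE_CUSTOM_UPDATE_MODEL(GradientBatchReduce, 0, 1, 1);

        // Assigning to reducedGradient (REDUCE_BATCH_SUM) sums the assigned values
        // across batches; the referenced per-batch gradient is then zeroed, ready
        // for the next batch of stimuli
        SET_UPDATE_CODE(
            "$(reducedGradient) = $(gradient);\n"
            "$(gradient) = 0;\n");

        SET_VARS({{"reducedGradient", "scalar", VarAccess::REDUCE_BATCH_SUM}});
        SET_VAR_REFS({{"gradient", "scalar", VarAccessMode::READ_WRITE}});
    };
    IMPLEMENT_MODEL(GradientBatchReduce);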

Custom updates can also perform the same sort of reduction operation into variable references with the equivalent 'reduction' access modes:
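
For instance, the same batch summation could instead be written into a second variable reference with a reduction access mode (names again illustrative):

    class GradientBatchReduceToRef : public CustomUpdateModels::Base
    {
    public:
        DECLARE_CUSTOM_UPDATE_MODEL(GradientBatchReduceToRef, 0, 0, 2);

        // The reduction is now performed into the referenced variable reducedGradient
        SET_UPDATE_CODE(
            "$(reducedGradient) = $(gradient);\n"
            "$(gradient) = 0;\n");

        SET_VAR_REFS({{"gradient", "scalar", VarAccessMode::READ_WRITE},
                      {"reducedGradient", "scalar", VarAccessMode::REDUCE_SUM}});
    };
    IMPLEMENT_MODEL(GradientBatchReduceToRef);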

Neuron reductions

Similarly to the batch reduction modes discussed previously, custom updates also support variables with several 'neuron reduction' access modes, such as VarAccess::REDUCE_NEURON_SUM and VarAccess::REDUCE_NEURON_MAX.

These access modes allow values read from per-neuron variables to be reduced into variables that are shared across neurons. For example, a model like this could be used to calculate the maximum value of a state variable in a population of neurons:
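
A sketch of such a model (with illustrative variable names) might be:

    class NeuronMax : public CustomUpdateModels::Base
    {
    public:
        DECLARE_CUSTOM_UPDATE_MODEL(NeuronMax, 0, 1, 1);

        // Assigning to maxV (REDUCE_NEURON_MAX) takes the maximum of the assigned
        // values across all neurons in the population whose variable V is referenced
        SET_UPDATE_CODE("$(maxV) = $(V);\n");

        SET_VARS({{"maxV", "scalar", VarAccess::REDUCE_NEURON_MAX}});
        SET_VAR_REFS({{"V", "scalar", VarAccessMode::READ_ONLY}});
    };
    IMPLEMENT_MODEL(NeuronMax);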

Again, like batch reductions, neuron reductions can also be performed into variable references with the corresponding reduction access modes.

Note
Reading from variables with a reduction access mode is undefined behaviour.
