Putting the Neural back into Networks


Part 3: I got 99 problems but a spike ain't one

13th January, 2020

Ocean scene

Photo by @tgerz on Unsplash

In the last post, we saw that spike generation can cause major problems for gradient descent, by zeroing out the gradients completely. We also learned how to work around this issue by faking the neuron output with a differentiable spiking surrogate.
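To make that idea concrete, here is a minimal sketch of a surrogate spike function in PyTorch: the forward pass emits hard spikes, while the backward pass substitutes a smooth pseudo-derivative so gradients can flow. The class name and the fast-sigmoid surrogate are illustrative choices, not the code from the post.

```python
import torch

class SurrogateSpike(torch.autograd.Function):
    """Heaviside spike in the forward pass; smooth surrogate gradient in the backward pass."""

    @staticmethod
    def forward(ctx, v_mem):
        ctx.save_for_backward(v_mem)
        # Emit a spike wherever the membrane potential is above threshold (here: zero)
        return (v_mem > 0.0).float()

    @staticmethod
    def backward(ctx, grad_output):
        (v_mem,) = ctx.saved_tensors
        # Replace the zero-almost-everywhere derivative of the step function
        # with the derivative of a fast sigmoid (one common surrogate choice)
        surrogate_grad = 1.0 / (1.0 + 10.0 * v_mem.abs()) ** 2
        return grad_output * surrogate_grad


spike_fn = SurrogateSpike.apply

# Usage: gradients now flow through the spike-generation step
v = torch.randn(8, requires_grad=True)
spikes = spike_fn(v)
spikes.sum().backward()
print(v.grad)
```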

Machine Learning with SNNs for low-power inference


Presentation at UWA

3rd May, 2024

LIF neuron model

An LIF neuron as a small recurrent unit

AI is extremely power-hungry, but brain-inspired computing units can perform machine-learning inference at sub-milliwatt power. Learn about the low-power computing architectures from SynSense, which use quantised spiking neurons for ML inference. I presented this slide deck at the UWA Computer Science seminar in Perth.
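As a rough illustration of the slide caption above, the update below treats a leaky integrate-and-fire neuron as a tiny recurrent unit over its own membrane state. This is a minimal sketch in NumPy; the parameter names and values are assumptions for illustration, not taken from the slides.

```python
import numpy as np

def lif_step(v_mem, x_in, tau=20.0, dt=1.0, threshold=1.0):
    """One discrete-time update of a leaky integrate-and-fire neuron.

    The membrane potential decays towards zero, integrates the input,
    and emits a spike (with a subtractive reset) when it crosses threshold,
    so the neuron acts as a small recurrent unit over its own state.
    """
    alpha = np.exp(-dt / tau)           # leak factor per time step
    v_mem = alpha * v_mem + x_in        # leaky integration of the input
    spikes = (v_mem >= threshold).astype(float)
    v_mem = v_mem - spikes * threshold  # subtractive reset after spiking
    return v_mem, spikes

# Drive a small population of 4 neurons with random input for 100 steps
v = np.zeros(4)
for _ in range(100):
    v, s = lif_step(v, x_in=0.1 * np.random.rand(4))
```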

Hands-on with Rockpool and Xylo


OpenNeuromorphic

26th April, 2023

Learn how to build and deploy a spiking neural network with Rockpool and Xylo.
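For context, this is roughly what defining such a network looks like with Rockpool's public API (Sequential, LinearTorch, LIFTorch), as I understand it from the library's documentation. Module names and the Xylo deployment steps differ between Rockpool versions, so treat this as a hedged sketch rather than the tutorial's code; the layer sizes are illustrative.

```python
import torch
from rockpool.nn.combinators import Sequential
from rockpool.nn.modules import LinearTorch, LIFTorch

# Network sizes: illustrative values, not from the tutorial
n_in, n_hidden, n_out = 16, 63, 8

# A small feed-forward SNN: linear weights feeding LIF spiking layers
net = Sequential(
    LinearTorch((n_in, n_hidden)),
    LIFTorch(n_hidden),
    LinearTorch((n_hidden, n_out)),
    LIFTorch(n_out),
)

# Evolve the network over a random spike raster: (batch, time, channels)
x = torch.rand(1, 100, n_in)
output, state, recordings = net(x)
print(output.shape)

# Deployment to the Xylo chip is handled by the device-specific tooling
# in rockpool.devices.xylo (mapping, quantisation and configuration);
# see the tutorial and the Rockpool documentation for the exact steps.
```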