Putting the Neural back into Networks


Part 1: Why spikes?

11th January, 2020

A rock pool

Photo by @silasbaisch on Unsplash

Not long ago, one of the gods of modern machine learning made a slightly controversial statement. In the final slide of his ISSCC 2019 keynote [1], Yann LeCun [2, 3, 4] (that’s “Mr CNN” to you) remarked, almost as a throwaway, that he was skeptical about the usefulness of spiking neural networks.

Machine Learning with SNNs for low-power inference


Presentation at UWA

3rd May, 2024

LIF neuron model

An LIF neuron as a small recurrent unit

AI is extremely power-hungry, but brain-inspired computing units can perform machine-learning inference at sub-milliwatt power. Learn about low-power computing architectures from SynSense, which use quantised spiking neurons for ML inference. I presented this slide deck at the UWA Computer Science seminar in Perth.
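To make the caption above concrete, here is a minimal sketch of a leaky integrate-and-fire (LIF) neuron unrolled over time like a small recurrent unit. This is my own plain-Python illustration with made-up parameter values, not SynSense's implementation or Rockpool's API:

```python
# A minimal LIF sketch (illustrative only): leak, integrate, fire, reset.

def lif_step(v, i_in, tau=20.0, v_th=1.0, dt=1.0):
    """One Euler step of an LIF membrane.

    v: membrane potential carried between steps (the recurrent state)
    i_in: input current at this step
    tau, v_th, dt: time constant, firing threshold, time step (assumed values)
    """
    v = v + dt * (-v + i_in) / tau   # leaky integration of the input current
    spike = v >= v_th                # crossing the threshold emits a spike
    if spike:
        v = 0.0                      # hard reset of the membrane after a spike
    return v, spike

def simulate(currents):
    """Unroll the neuron over an input trace, like stepping a recurrent unit."""
    v, spikes = 0.0, []
    for i_in in currents:
        v, s = lif_step(v, i_in)
        spikes.append(s)
    return spikes

# A constant supra-threshold current drives regular spiking;
# a zero current produces no spikes at all.
spike_train = simulate([2.0] * 100)
```

The membrane potential is the only state carried between steps, which is why an LIF neuron can be drawn as a one-unit recurrent network, as in the figure.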

Hands-on with Rockpool and Xylo


OpenNeuromorphic

26th April, 2023

Learn how to build and deploy a spiking neural network with Rockpool and Xylo.