Norse

Expands PyTorch with primitives for bio-inspired neural components that are sparse and event-driven.

Overview

Norse is a deep learning Python library for simulating spiking neural networks (SNNs) that extends PyTorch with bio-inspired neural primitives. Norse is maintained and developed by Christian Pehle and Jens Egholm Pedersen, with funding from the EC Horizon 2020 Framework Programme and the DFG, German Research Foundation. Norse is also a community-driven project that encourages contributions and development from the community.

Norse is accompanied by extensive documentation, including tutorials on classification tasks with datasets such as MNIST and CIFAR, and on cartpole balancing with policy gradients, showcasing Norse's compatibility with PyTorch Lightning. While using PyTorch for CPU/GPU acceleration, Norse extends it with its own spiking neuron models. This approach leverages the sparse and event-driven nature of biological neural networks to create energy-efficient computational models. The framework provides a variety of neuron models and is designed to be adaptable, allowing for custom neuron model creation and integration with existing deep learning models.
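
To illustrate how Norse layers slot into an ordinary PyTorch model, here is a minimal sketch of a small spiking classifier. It assumes that norse.torch exposes LIFCell and SequentialState as in recent Norse releases; the layer sizes and random input are placeholders rather than an example taken from the Norse documentation.

```python
# Minimal sketch (assumptions: norse.torch provides LIFCell and SequentialState,
# as in recent Norse releases; layer sizes and inputs are placeholders).
import torch
import norse.torch as norse

# SequentialState behaves like nn.Sequential but also threads the hidden
# neuron state through the spiking layers.
model = norse.SequentialState(
    torch.nn.Linear(784, 128),
    norse.LIFCell(),            # leaky integrate-and-fire neuron layer
    torch.nn.Linear(128, 10),
)

x = torch.randn(32, 784)        # one batch of flattened MNIST-sized inputs
spikes, state = model(x)        # output activity plus the per-layer neuron state
print(spikes.shape)             # torch.Size([32, 10])
```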

Norse aims to be a foundational tool for understanding the transition from standard deep learning models to spiking models. It enables the creation of new neural network models and the adaptation of existing models with spiking capabilities. Norse also acknowledges the resource-intensive nature of spiking neural networks and provides guidance on hardware acceleration to optimize simulation performance.
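
Since spiking neurons carry state that evolves over time, simulating a model usually means stepping it through a sequence of inputs and passing the neuron state from one step to the next. The loop below is a hedged sketch of that pattern, assuming LIFCell accepts an optional state argument and returns a (spikes, state) pair as in recent Norse releases; the input tensor is a placeholder.

```python
# Hedged sketch of time-stepped simulation (assumes LIFCell(input, state)
# returns (spikes, state), as in recent Norse releases).
import torch
import norse.torch as norse

cell = norse.LIFCell()
inputs = torch.randn(100, 32, 128)  # placeholder: 100 time steps, batch of 32

state = None                        # neuron state (e.g. membrane potential) starts empty
outputs = []
for t in range(inputs.shape[0]):
    spikes, state = cell(inputs[t], state)  # state carries over between steps
    outputs.append(spikes)
```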

NorthPole, IBM's latest Neuromorphic AI Hardware

  • Fabrizio Ottati

Translating the NorthPole paper from IBM to human language.

Spiking Neural Network (SNN) Library Benchmarks

  • Gregor Lenz, Kade Heckel, Sumit Bam Shrestha, Cameron Barker, Jens Egholm Pedersen

Discover the fastest Spiking Neural Network (SNN) frameworks for deep learning-based optimization. Performance, flexibility, and more, analyzed in depth.

Spiking Neurons: A Digital Hardware Implementation

  • Fabrizio Ottati

Learn how to model Leaky Integrate and Fire (LIF) neurons in digital hardware. Understand spike communication, synapse integration, and more for hardware implementation.