Overview
By defining a common set of computational primitives (such as Leaky Integrate-and-Fire neurons and convolutions), NIR lets researchers and developers define a model once and then translate it to run on different backends, without rewriting it from scratch for each platform. This decouples the model definition from hardware- and software-specific implementation details.
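To make this concrete, the sketch below builds a small graph from NIR primitives and round-trips it through the NIR file format. It assumes the `nir` Python package's node classes (`Input`, `Affine`, `LIF`, `Output`) and its `nir.write`/`nir.read` helpers; the parameter values, shapes, and the `my_network.nir` filename are purely illustrative.

```python
import numpy as np
import nir

# A 2-input, 2-neuron network: input -> affine projection -> LIF population.
graph = nir.NIRGraph(
    nodes={
        "input": nir.Input(input_type=np.array([2])),
        "affine": nir.Affine(
            weight=np.array([[1.0, 0.5], [0.2, 1.0]]),
            bias=np.array([0.0, 0.0]),
        ),
        "lif": nir.LIF(
            tau=np.array([0.02, 0.02]),        # membrane time constants
            r=np.array([1.0, 1.0]),            # membrane resistances
            v_leak=np.array([0.0, 0.0]),       # leak (resting) potentials
            v_threshold=np.array([1.0, 1.0]),  # firing thresholds
        ),
        "output": nir.Output(output_type=np.array([2])),
    },
    edges=[("input", "affine"), ("affine", "lif"), ("lif", "output")],
)

# Serialize once; a supported simulator or hardware backend can load
# the file instead of re-implementing the model.
nir.write("my_network.nir", graph)
restored = nir.read("my_network.nir")
```

A backend-specific importer then converts the loaded graph into its own model representation, which is where the decoupling pays off.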
NIR is designed to be extensible and currently supports a range of popular SNN frameworks and hardware, including:
- Simulators: Lava-DL, Nengo, Norse, Rockpool, Sinabs, snnTorch, Spyx
- Hardware: Intel Loihi 2, SpiNNaker 2, SynSense Speck, SynSense Xylo
More information can be found in the NIR documentation and the Nature Communications paper.
NIR is actively developed on GitHub, together with additional tooling for PyTorch integration.