Spyx Hackathon: Speeding Up Neuromorphic Computing

Explore the power of Spyx in a hands-on hackathon session and dive into the world of neuromorphic frameworks with Kade Heckel.

Join us on December 13th for an exciting Spyx hackathon and ONM talk! Learn how to use and contribute to Spyx, a high-performance spiking neural network library, and gain insights into the latest developments in neuromorphic frameworks. The session will cover how Spyx uses memory and the GPU to maximize training throughput, along with a discussion of the evolving landscape of neuromorphic computing.
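To give a flavor of the kind of workload Spyx accelerates, here is a minimal sketch of surrogate-gradient training for a spiking neuron in plain JAX. This is not Spyx's actual API — the function names and parameters below are illustrative assumptions — but it shows the core technique: a hard spike threshold in the forward pass, a smooth surrogate derivative in the backward pass, and a time loop expressed with `jax.lax.scan` so XLA can fuse it for the GPU.

```python
import jax
import jax.numpy as jnp

# Illustrative sketch (not Spyx's API): a leaky integrate-and-fire (LIF)
# neuron with a surrogate gradient, the pattern SNN libraries JIT-compile.

@jax.custom_jvp
def spike(v):
    # Forward pass: hard threshold at v = 1.0 (non-differentiable).
    return (v >= 1.0).astype(v.dtype)

@spike.defjvp
def spike_jvp(primals, tangents):
    (v,), (dv,) = primals, tangents
    primal_out = (v >= 1.0).astype(v.dtype)
    # Backward pass: replace the zero/undefined derivative with a smooth
    # surrogate (here, the derivative of a fast sigmoid around threshold).
    surrogate = 1.0 / (1.0 + 10.0 * jnp.abs(v - 1.0)) ** 2
    return primal_out, surrogate * dv

def lif_step(v, current, beta=0.9):
    """One LIF step: leak, integrate input current, fire, subtractive reset."""
    v = beta * v + current
    s = spike(v)
    return v - s, s

def run(currents, v0):
    # Unrolling the time loop with scan lets XLA fuse it into one GPU kernel.
    return jax.lax.scan(lif_step, v0, currents)

v_final, spikes = run(jnp.full((100,), 0.2), jnp.zeros(()))
```

With a constant input current of 0.2 the membrane potential charges up and the neuron fires periodically; because `spike` carries a surrogate derivative, a loss on `spikes` can be differentiated end to end with `jax.grad`.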

Don’t miss this opportunity to engage with experts, collaborate on cutting-edge projects, and explore the potential of Spyx in shaping the future of neuromorphic computing. Whether you’re a seasoned developer or just curious about the field, this event promises valuable insights and hands-on experience.

Agenda:

  • 18:00 - 19:00: Spyx Introduction
    • Dive into Spyx, its features, and how to contribute
    • Hands-on session: Explore Spyx functionalities and tackle real-world challenges
    • Q&A and collaborative discussions
  • 19:00 - 20:00: Hackathon
    • Collaborate on projects and explore the potential of Spyx
    • Wrap-up Q&A and discussion

Speakers:

  • Kade Heckel

Note: The event will be hosted virtually. Stay tuned for the video link and further updates. Let’s come together to push the boundaries of neuromorphic computing!



About the Speaker

Kade studied Computer Science and Computer Engineering at the U.S. Naval Academy. Studying in the UK as a Marshall Scholar, Kade completed an MSc in A.I. and Adaptive Systems with distinction from the University of Sussex and is currently pursuing an MPhil in Machine Learning and Machine Intelligence at the University of Cambridge. His dissertation at Sussex compared surrogate-gradient and large-scale neuroevolutionary algorithms for optimizing spiking neural networks.

Inspired? Share your work.

Share your expertise with the community by speaking at a workshop, student talk, or hacking hour. It’s a great way to get feedback and help others learn.

Related Workshops

PEPITA - A Forward-Forward Alternative to Backpropagation

Explore PEPITA, a forward-forward approach as an alternative to backpropagation, presented by Giorgia Dellaferrera. Learn about its advantages and implementation with PyTorch.

Towards Training Robust Computer Vision Models for Neuromorphic Hardware

Join Gregor Lenz as he delves into the world of event cameras and spiking neural networks, exploring their potential for low-power applications on SynSense's Speck chip, and the challenges that arise in the data, training, and deployment stages.

The ELM Neuron: An Efficient and Expressive Cortical Neuron Model Can Solve Long-Horizon Tasks

Aaron tells us about the Expressive Leaky Memory (ELM) neuron model, a biologically inspired phenomenological model of a cortical neuron.