Spyx Hackathon: Speeding Up Neuromorphic Computing

Explore the power of Spyx in a hands-on hackathon session and dive into the world of neuromorphic frameworks with Kade Heckel.

Join us on December 13th for an exciting Spyx hackathon and ONM talk! Learn how to use and contribute to Spyx, a high-performance spiking neural network library, and gain insights into the latest developments in neuromorphic frameworks. The session will cover how Spyx uses memory and the GPU to maximize training throughput, along with discussions on the evolving landscape of neuromorphic computing.
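To give a flavor of the kind of workload Spyx accelerates, here is a generic JAX sketch of a leaky integrate-and-fire (LIF) layer rolled over time with `jax.lax.scan` and JIT-compiled into one fused computation. Note this is illustrative only: `lif_step`, `run_lif`, and the parameter values are hypothetical and do not reflect Spyx's actual API (which, for training, also replaces the spike function's gradient with a surrogate).

```python
import jax
import jax.numpy as jnp

def lif_step(v, x, beta=0.9, threshold=1.0):
    """One LIF timestep: leak, integrate input, spike, soft reset."""
    v = beta * v + x                             # leaky integration
    spk = (v >= threshold).astype(jnp.float32)   # Heaviside spike
    v = v - spk * threshold                      # reset by subtraction
    return v, spk

@jax.jit
def run_lif(inputs):
    """Roll the LIF layer over the time axis: (T, N) inputs -> (T, N) spikes."""
    v0 = jnp.zeros(inputs.shape[1])
    _, spikes = jax.lax.scan(lif_step, v0, inputs)
    return spikes

key = jax.random.PRNGKey(0)
x = jax.random.uniform(key, (100, 8))  # 100 timesteps, 8 neurons
spikes = run_lif(x)
print(spikes.shape)  # (100, 8)
```

Because the whole time loop lives inside one jitted `scan`, XLA can keep the membrane state on-device and fuse the per-step arithmetic, which is the throughput trick the session discusses.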

Don’t miss this opportunity to engage with experts, collaborate on cutting-edge projects, and explore the potential of Spyx in shaping the future of neuromorphic computing. Whether you’re a seasoned developer or just curious about the field, this event promises valuable insights and hands-on experience.

Agenda:

  • 18:00 - 19:00: Spyx Introduction
    • Dive into Spyx, its features, and how to contribute
    • Hands-on session: Explore Spyx functionalities and tackle real-world challenges
    • Q&A and collaborative discussions
  • 19:00 - 20:00: Hackathon
    • Collaborate on cutting-edge projects and explore the potential of Spyx
    • Q&A and collaborative discussions

Speakers:

  • Kade Heckel

Note: The event will be hosted virtually. Stay tuned for the video link and further updates. Let’s come together to push the boundaries of neuromorphic computing!

About the Speaker

Kade studied Computer Science and Computer Engineering at the U.S. Naval Academy. Studying in the UK as a Marshall Scholar, Kade completed an MSc in A.I. and Adaptive Systems with distinction from the University of Sussex and is currently pursuing an MPhil in Machine Learning and Machine Intelligence at the University of Cambridge. His dissertation at Sussex focused on comparing surrogate gradient and large-scale neuroevolutionary algorithms for optimizing spiking neural networks.

Related Workshops

Accelerating Inference and Training at the Edge

Join us for a talk by Maxence Ernoult, Research Scientist at Rain, on accelerating inference and training at the edge.

Lava: An Open-Source Software Framework for Developing Neuro-Inspired Applications

Discover Lava, an open-source software framework for neuro-inspired applications, presented by Andreas Wild and Mathis Richter. Dive into the future of neuromorphic computing.

Evolutionary Optimization for Neuromorphic Systems

Dive into evolutionary optimization techniques for neuromorphic systems with Catherine Schuman, an expert in the field. Watch the recorded workshop for valuable insights.