PEPITA - A Forward-Forward Alternative to Backpropagation

Explore PEPITA, a forward-forward approach proposed as an alternative to backpropagation, presented by Giorgia Dellaferrera. Learn about its advantages and how to implement it in PyTorch.
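
For readers curious how such a forward-only update might look in code, below is a minimal PEPITA-style sketch in PyTorch. It is not the speaker's reference implementation: the two-layer network, layer sizes, learning rate, and names such as W1, W2, and F_proj are illustrative assumptions.

```python
# Minimal PEPITA-style sketch (illustrative assumptions, not the reference code):
# train a two-layer MLP with two forward passes and local updates, no backward pass.
import torch

torch.manual_seed(0)

n_in, n_hidden, n_out = 784, 256, 10
lr = 0.01

W1 = 0.05 * torch.randn(n_hidden, n_in)   # input -> hidden weights
W2 = 0.05 * torch.randn(n_out, n_hidden)  # hidden -> output weights
F_proj = 0.05 * torch.randn(n_in, n_out)  # fixed random error-to-input projection

def forward(x):
    h = torch.relu(W1 @ x)
    y = W2 @ h
    return h, y

# One training step on a single (input, one-hot target) pair.
x = torch.rand(n_in)
target = torch.zeros(n_out)
target[3] = 1.0

# 1) Standard forward pass and output error.
h, y = forward(x)
e = y - target

# 2) Second, "modulated" forward pass on the error-perturbed input.
x_mod = x + F_proj @ e
h_mod, _ = forward(x_mod)

# 3) Local weight updates from the difference between the two passes.
W1 -= lr * torch.outer(h - h_mod, x_mod)
W2 -= lr * torch.outer(e, h_mod)
```

The key property this sketch tries to illustrate is locality: each layer's update depends only on that layer's activations in the two forward passes, so no error signal has to be propagated backward through the network.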


About the Speaker

Giorgia Dellaferrera completed her PhD in computational neuroscience at the Institute of Neuroinformatics (ETH Zurich and the University of Zurich) and IBM Research Zurich with Prof. Indiveri, Prof. Eleftheriou, and Dr. Pantazi. Her doctoral thesis focused on the interplay between neuroscience and artificial intelligence, with an emphasis on learning mechanisms in brains and machines. During her PhD, she visited the lab of Prof. Kreiman at Harvard Medical School (US), where she developed a biologically inspired training strategy for artificial neural networks. Before her PhD, Giorgia obtained a master's degree in Applied Physics at the Swiss Federal Institute of Technology Lausanne (EPFL) and worked as an intern at the Okinawa Institute of Science and Technology, Logitech, Imperial College London, and EPFL.

Inspired? Share your work.

Share your expertise with the community by speaking at a workshop, student talk, or hacking hour. It’s a great way to get feedback and help others learn.

Learn How to Present

Related Workshops

Towards Training Robust Computer Vision Models for Neuromorphic Hardware

Join Gregor Lenz as he delves into the world of event cameras and spiking neural networks, exploring their potential for low-power applications on SynSense's Speck chip. Discover the challenges at the data, training, and deployment stages. Don't miss this talk on training robust computer vision models for neuromorphic hardware.

Programming Scalable Neuromorphic Algorithms with Fugu

Explore neural-inspired computing with Brad Aimone, a leading neuroscientist at Sandia Labs. Join us for insights into next-gen technology and neuroscience.

C-DNN and C-Transformer: mixing ANNs and SNNs for the best of both worlds

Join us for a talk by Sangyeob Kim, a postdoctoral researcher at KAIST, on designing efficient accelerators that mix SNNs and ANNs.