PEPITA - A Forward-Forward Alternative to Backpropagation

Explore PEPITA, a forward-forward learning approach proposed as an alternative to backpropagation, presented by Giorgia Dellaferrera. Learn about its advantages and its implementation in PyTorch.
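As context for the talk, PEPITA (Dellaferrera & Kreiman, 2022) replaces the backward pass with a second forward pass on an error-modulated input: the output error is projected back to the input through a fixed random matrix, and each layer updates locally from the difference between its activations in the two passes. The sketch below is a minimal NumPy illustration of that two-pass scheme; the network sizes, learning rate, and the scale of the fixed projection `F` are illustrative choices, not the settings used in the paper or the talk.

```python
# Minimal sketch of a PEPITA-style two-forward-pass update (illustrative
# shapes and hyperparameters; see the original paper for the exact scheme).
import numpy as np

rng = np.random.default_rng(0)

def relu(z):
    return np.maximum(z, 0.0)

# Tiny two-layer network: input 8 -> hidden 16 -> output 4
W1 = rng.standard_normal((16, 8)) * 0.1
W2 = rng.standard_normal((4, 16)) * 0.1
# Fixed random feedback projection from output error to input space
F = rng.standard_normal((8, 4)) * 0.05

def pepita_step(x, y_target, W1, W2, F, lr=0.05):
    # First (standard) forward pass
    h = relu(W1 @ x)
    y = W2 @ h
    e = y - y_target                       # output error

    # Second forward pass on the error-modulated input
    x_mod = x + F @ e
    h_mod = relu(W1 @ x_mod)

    # Layer-local updates: hidden layer uses the activation difference
    # between the two passes; output layer uses the error directly.
    W1 = W1 - lr * np.outer(h - h_mod, x_mod)
    W2 = W2 - lr * np.outer(e, h_mod)
    return W1, W2, e

# Train on a single toy example and track the error norm
x = rng.standard_normal(8)
y_target = np.eye(4)[1]                    # one-hot target
errors = []
for _ in range(300):
    W1, W2, e = pepita_step(x, y_target, W1, W2, F)
    errors.append(np.linalg.norm(e))
print(errors[0], errors[-1])
```

Note that no gradients are propagated backward here: both passes reuse the same forward computation, and each weight matrix is updated only from quantities available at its own layer, which is the property that makes the approach attractive for neuromorphic hardware.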


Upcoming Workshops

Open-Source Neuromorphic Research Infrastructure: A Community Panel
Jens E. Pedersen, Hananel Hazan, James Knight, Alexandre Marcireau, Gregor Lenz, Dylan Muir, Christian Pehle, Terry Stewart, Marcel Stimberg
July 30, 2025
17:00 - 18:30 CEST

About the Speaker

Giorgia Dellaferrera completed her PhD in computational neuroscience at the Institute of Neuroinformatics (ETH Zurich and the University of Zurich) and IBM Research Zurich with Prof. Indiveri, Prof. Eleftheriou, and Dr. Pantazi. Her doctoral thesis focused on the interplay between neuroscience and artificial intelligence, with an emphasis on learning mechanisms in brains and machines. During her PhD, she visited the lab of Prof. Kreiman at Harvard Medical School (US), where she developed a biologically inspired training strategy for artificial neural networks. Before her PhD, Giorgia obtained a master's degree in Applied Physics at the Swiss Federal Institute of Technology Lausanne (EPFL) and worked as an intern at the Okinawa Institute of Science and Technology, Logitech, Imperial College London, and EPFL.

Inspired? Share your work.

Share your expertise with the community by speaking at a workshop, student talk, or hacking hour. It’s a great way to get feedback and help others learn.

Related Workshops

Spyx Hackathon: Speeding up Neuromorphic Computing

Explore the power of Spyx in a hands-on hackathon session and dive into the world of neuromorphic frameworks with Kade Heckel.

Programming Scalable Neuromorphic Algorithms with Fugu

Explore neural-inspired computing with Brad Aimone, a leading neuroscientist at Sandia Labs. Join us for insights into next-gen technology and neuroscience.

C-DNN and C-Transformer: mixing ANNs and SNNs for the best of both worlds

Join us for a talk by Sangyeob Kim, Postdoctoral researcher at KAIST, on designing efficient accelerators that mix SNNs and ANNs.