C-DNN and C-Transformer: Mixing ANNs and SNNs for the Best of Both Worlds

Join us for a talk by Sangyeob Kim, a postdoctoral researcher at KAIST, on designing efficient accelerators that mix SNNs and ANNs.

  • Sangyeob Kim
  • May 4, 2024, 11:00 - 12:15 CEST

Sangyeob and his team have developed a C-DNN processor that efficiently handles object-recognition workloads, achieving 51.3% higher energy efficiency than the previous state-of-the-art processor. They have since extended C-DNN beyond image classification to other applications and developed the C-Transformer, which applies the same technique to a Large Language Model (LLM). With it, they demonstrate that the energy consumed in LLM processing can be reduced by 30% to 72% using the C-DNN technique, compared to the previous state-of-the-art processor. In this talk, we will introduce the processors developed for C-DNN and C-Transformer, and discuss how neuromorphic computing can be used in real applications in the future.
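To make the complementary ANN/SNN idea concrete, below is a minimal PyTorch sketch of one hybrid layer: large-magnitude activations are handled by a conventional dense (ANN) pass, while the small-magnitude remainder is rate-coded into spikes and integrated by leaky integrate-and-fire neurons over a few timesteps. The magnitude-based split, the thresholds, the timestep count, and the HybridLinear name are illustrative assumptions for this sketch only, not the actual dataflow of the C-DNN or C-Transformer chips.

# Illustrative sketch: complementary ANN/SNN processing of one layer.
# The split criterion and all constants below are assumptions for illustration,
# not the dataflow of the actual C-DNN hardware.
import torch
import torch.nn as nn

class HybridLinear(nn.Module):
    def __init__(self, in_features, out_features, threshold=0.5, timesteps=4):
        super().__init__()
        # Weights shared by both paths; bias omitted so the partial results can be summed.
        self.fc = nn.Linear(in_features, out_features, bias=False)
        self.threshold = threshold   # split point between ANN and SNN paths (assumed)
        self.timesteps = timesteps   # number of SNN integration steps (assumed)
        self.beta = 0.9              # leaky integrate-and-fire decay (assumed)
        self.v_th = 1.0              # firing threshold of the LIF neurons (assumed)

    def forward(self, x):
        # ANN path: process the large-magnitude part of the input in one dense pass.
        large = torch.where(x.abs() >= self.threshold, x, torch.zeros_like(x))
        ann_out = self.fc(large)

        # SNN path: rate-code the small-magnitude remainder as spike trains and
        # accumulate the layer's response over several timesteps.
        small = x - large
        mem = torch.zeros_like(ann_out)       # LIF membrane potential
        snn_out = torch.zeros_like(ann_out)   # accumulated output spikes
        for _ in range(self.timesteps):
            spikes_in = (torch.rand_like(small) < small.clamp(0, 1)).float()
            mem = self.beta * mem + self.fc(spikes_in)
            spikes_out = (mem >= self.v_th).float()
            mem = mem - spikes_out * self.v_th  # soft reset after firing
            snn_out = snn_out + spikes_out

        # Combine both paths; dividing the spike count by the number of timesteps
        # is a crude rate-decoding step, again an assumption for this sketch.
        return ann_out + snn_out / self.timesteps

x = torch.rand(8, 128)        # toy batch of activations in [0, 1]
layer = HybridLinear(128, 64)
print(layer(x).shape)         # torch.Size([8, 64])

Broadly, the motivation behind complementary ANN/SNN processing is that the sparse, low-magnitude part of the workload maps cheaply onto event-driven spiking computation, while the dense part stays on an efficient ANN datapath; how the talk's processors realize this split is exactly what the presentation covers.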

About the Speaker

Sangyeob Kim (Student Member, IEEE) received the B.S., M.S., and Ph.D. degrees from the School of Electrical Engineering, Korea Advanced Institute of Science and Technology (KAIST), Daejeon, South Korea, in 2018, 2020, and 2023, respectively. He is currently a Post-Doctoral Associate at KAIST. His research interests include energy-efficient system-on-chip design, with a particular focus on deep neural network accelerators, neuromorphic hardware, and computing-in-memory accelerators.
