BEGIN:VCALENDAR
VERSION:2.0
PRODID:IEEE vTools.Events//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
DTSTART:20250309T030000
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
RRULE:FREQ=YEARLY;BYDAY=2SU;BYMONTH=3
TZNAME:EDT
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:20251102T010000
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
RRULE:FREQ=YEARLY;BYDAY=1SU;BYMONTH=11
TZNAME:EST
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20250508T224440Z
UID:6AF7C6F8-5B84-4768-888F-3953E63021EC
DTSTART;TZID=America/New_York:20250506T190000
DTEND;TZID=America/New_York:20250506T200000
DESCRIPTION:This talk unveils the transformative potential of achieving sub
 -10-watt language models (LMs) by drawing inspiration from the brain’s e
 nergy efficiency. We introduce a groundbreaking approach to language model
  design\, featuring a matrix-multiplication-free architecture that scales 
 to billions of parameters. To validate this paradigm\, we developed custom
  hardware solutions (FPGA) as well as leveraged pre-existing neuromorphic 
 hardware (Intel Loihi 2)\, optimized for lightweight operations that outpe
 rform traditional GPU capabilities. Our system achieves human-surpassing t
 hroughput on billion-parameter models at just 13 watts\, setting a new ben
 chmark for energy-efficient AI. This work not only redefines what's possib
 le for low-power LLMs but also highlights the critical operations future a
 ccelerators must prioritize to enable the next wave of sustainable AI inno
 vation.\n\nSpeaker(s): Dr. Eshraghian\n\nVirtual: https://events.vtools.ie
 ee.org/m/479501
LOCATION:Virtual: https://events.vtools.ieee.org/m/479501
ORGANIZER:mailto:joseph.palko@ieee.org
SEQUENCE:66
SUMMARY:BRAIN-INSPIRED LOW-POWER LANGUAGE MODEL
URL;VALUE=URI:https://events.vtools.ieee.org/m/479501
END:VEVENT
END:VCALENDAR