BEGIN:VCALENDAR
VERSION:2.0
PRODID:IEEE vTools.Events//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:Europe/Zurich
BEGIN:DAYLIGHT
DTSTART:20260329T030000
TZOFFSETFROM:+0100
TZOFFSETTO:+0200
RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=3
TZNAME:CEST
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:20251026T020000
TZOFFSETFROM:+0200
TZOFFSETTO:+0100
RRULE:FREQ=YEARLY;BYDAY=-1SU;BYMONTH=10
TZNAME:CET
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20260319T082905Z
UID:F4FC1410-1E41-4665-B95F-FE5CC957D37A
DTSTART;TZID=Europe/Zurich:20260310T110000
DTEND;TZID=Europe/Zurich:20260310T130000
DESCRIPTION:Distinguished Lecture by Jason K. Eshraghian\n\nNeuromorphic
 Large Language Models (LLMs)\n\nAbstract: The brain is the perfect plac
 e to look for inspiration to develop more efficient neural networks. In
 spired by the recurrent dynamics of biological neurons\, this talk will
 present several frontier reasoning LLMs developed in my lab\, from soft
 ware to device deployments. Trained end-to-end in an academic lab on a f
 ull production pipeline (from data curation and pre-training to post-tra
 ining and alignment)\, these models surpass all leading LLMs from Meta\,
 Google\, and every other over-resourced company in the ~10-billion-para
 meter regime\, despite being ~5x smaller. We have deployed several of ou
 r models on neuromorphic hardware at 2 watts\, bringing SoTA-level reaso
 ning from the datacenter to the edge.\n\nRegistration is free but manda
 tory (limited number of places)\; please contact the chair by email.\n\n
 Speaker(s): Jason K. Eshraghian\n\nEPFL\, Lausanne\, Switzerland
LOCATION:EPFL\, Lausanne\, Switzerland
ORGANIZER:mailto:lbegon@ethz.ch
SEQUENCE:79
SUMMARY:CAS Distinguished Lecture at EPFL - J. K. Eshraghian (UC Santa C
 ruz) - Neuromorphic LLMs
URL;VALUE=URI:https://events.vtools.ieee.org/m/539827
END:VEVENT
END:VCALENDAR