BEGIN:VCALENDAR
VERSION:2.0
PRODID:IEEE vTools.Events//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:Australia/Melbourne
BEGIN:DAYLIGHT
DTSTART:20261004T020000
TZOFFSETFROM:+1000
TZOFFSETTO:+1100
RRULE:FREQ=YEARLY;BYDAY=1SU;BYMONTH=10
TZNAME:AEDT
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:20260405T030000
TZOFFSETFROM:+1100
TZOFFSETTO:+1000
RRULE:FREQ=YEARLY;BYDAY=1SU;BYMONTH=4
TZNAME:AEST
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20260429T051629Z
UID:0F6D0732-AA1A-441D-BBF2-85D598FB8117
DTSTART;TZID=Australia/Melbourne:20260429T140000
DTEND;TZID=Australia/Melbourne:20260429T150000
DESCRIPTION:Abstract:\n\nBackpropagation has powered the modern era of comp
 utational intelligence\, enabling breakthroughs in perception\, language\,
  control\, and autonomous systems. Yet as intelligent systems move into dy
 namic\, real-world environments\, new demands emerge: continual adaptation
 \, robustness under uncertainty\, energy efficiency\, and scalable autonom
 y. These challenges invite a deeper question — are our learning algorith
 ms fundamentally aligned with how intelligence itself operates?\n\nThis le
 cture explores predictive coding as a compelling\, brain-inspired alternat
 ive for credit assignment in deep systems. Rather than relying on staged f
 orward and backward passes with global error transport\, predictive coding
  formulates learning as the continuous minimization of hierarchical predic
 tion errors through local\, parallel\, and bidirectional interactions. Rec
 ent theoretical advances demonstrate that such dynamics can approximate gr
 adient-based optimization\, offering a principled bridge between neuroscie
 nce and modern machine learning.\n\nThis perspective reframes learning as 
 an energy-minimizing dynamical process\, opening new directions in distrib
 uted credit assignment\, continual learning\, robust inference\, and neuro
 morphic implementation. By revisiting the principles of biological intelli
 gence\, this lecture argues that the next generation of computational inte
 lligence systems may emerge not from scaling existing algorithms\, but fro
 m rethinking the foundations of learning itself.\n\nCo-sponsored by: IEEE 
 VIC CIS Chapter\; IEEE VIC Section\n\nSpeaker(s): Narayan Srinivasa\n\n
 Virtual: https://events.vtools.ieee.org/m/554620
LOCATION:Virtual: https://events.vtools.ieee.org/m/554620
ORGANIZER:mailto:malka_nisha@ieee.org
SEQUENCE:46
SUMMARY:Rethinking Learning: Beyond Backpropagation Toward Brain-Inspired C
 omputational Intelligence
URL;VALUE=URI:https://events.vtools.ieee.org/m/554620
X-ALT-DESC:Description: <p dir="ltr"><img src="https://events.vtools.iee
 e.org/vtools_ui/media/display/011a8392-8438-459f-a515-684eec256d9e"></p>
 \n<p dir="ltr"><strong>Abstract:</strong></p>\n<p dir="ltr">Backpropagat
 ion has powered the modern era of computational intelligence\, enabling 
 breakthroughs in perception\, language\, control\, and autonomous system
 s. Yet as intelligent systems move into dynamic\, real-world environment
 s\, new demands emerge: continual adaptation\, robustness under uncertai
 nty\, energy efficiency\, and scalable autonomy. These challenges invite
  a deeper question &mdash\; are our learning algorithms fundamentally al
 igned with how intelligence itself operates?</p>\n<p dir="ltr">This lect
 ure explores predictive coding as a compelling\, brain-inspired alternat
 ive for credit assignment in deep systems. Rather than relying on staged
  forward and backward passes with global error transport\, predictive co
 ding formulates learning as the continuous minimization of hierarchical 
 prediction errors through local\, parallel\, and bidirectional interacti
 ons. Recent theoretical advances demonstrate that such dynamics can appr
 oximate gradient-based optimization\, offering a principled bridge betwe
 en neuroscience and modern machine learning.</p>\n<p dir="ltr">This pers
 pective reframes learning as an energy-minimizing dynamical process\, op
 ening new directions in distributed credit assignment\, continual learni
 ng\, robust inference\, and neuromorphic implementation. By revisiting t
 he principles of biological intelligence\, this lecture argues that the 
 next generation of computational intelligence systems may emerge not fro
 m scaling existing algorithms\, but from rethinking the foundations of l
 earning itself.</p>
END:VEVENT
END:VCALENDAR