BEGIN:VCALENDAR
VERSION:2.0
PRODID:IEEE vTools.Events//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:Asia/Shanghai
BEGIN:STANDARD
DTSTART:19910915T010000
TZOFFSETFROM:+0900
TZOFFSETTO:+0800
TZNAME:CST
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20251118T031808Z
UID:E2AA9D21-17C7-4983-B7F6-C6C49EFE2A90
DTSTART;TZID=Asia/Shanghai:20251120T160000
DTEND;TZID=Asia/Shanghai:20251120T170000
DESCRIPTION:Continual learning is a vital area of deep learning that enable
 s trained models to acquire new tasks incrementally\, without retraining f
 rom scratch for each task. This talk introduces Analytic Continual Learnin
 g (ACL)\, a novel branch of continual learning that leverages classical re
 cursive least squares methods. By employing simple linear algebra\, ACL br
 ings closed-form recursive algorithms into deep continual learning\, achie
 ving an equivalence between continual learning and joint training. As a re
 sult\, deep models can be trained in a single epoch while maintaining high
  training speed and near-zero forgetting. This approach has been successfu
 lly applied to conventional continual learning settings as well as highly 
 challenging scenarios involving large models.\n\nBinary Coffee\, Qingshuih
 e Campus\, University of Electronic Science and Technology of China\, Che
 ngdu\, Sichuan\, China
LOCATION:Binary Coffee\, Qingshuihe Campus\, University of Electronic Scien
 ce and Technology of China\, Chengdu\, Sichuan\, China
ORGANIZER:mailto:windof47@gmail.com
SEQUENCE:7
SUMMARY:Analytic Continual Learning with a Fast and Non-forgetting Closed-f
 orm Solution by Prof. Zhiping Lin
URL;VALUE=URI:https://events.vtools.ieee.org/m/514731
X-ALT-DESC:Description: &lt;br /&gt;&lt;p&gt;&lt;span lang=&quot;EN-US
 &quot;&gt;Continual learning is a vital area of deep learning that enables
 trained models to acquire new tasks incrementally\, without retraining
 from scratch for each task. This talk introduces Analytic Continual Lear
 ning (ACL)\, a novel branch of continual learning that leverages classic
 al recursive least squares methods. By employing simple linear algebra\,
 ACL brings closed-form recursive algorithms into deep continual learning
 \, achieving an equivalence between continual learning and joint trainin
 g. As a result\, deep models can be trained in a single epoch while main
 taining high training speed and near-zero forgetting. This approach has
 been successfully applied to conventional continual learning settings as
 well as highly challenging scenarios involving large models.&lt;/span&gt
 ;&lt;/p&gt;
END:VEVENT
END:VCALENDAR