BEGIN:VCALENDAR
VERSION:2.0
PRODID:IEEE vTools.Events//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:America/Los_Angeles
BEGIN:DAYLIGHT
DTSTART:20240310T020000
TZOFFSETFROM:-0800
TZOFFSETTO:-0700
RRULE:FREQ=YEARLY;BYDAY=2SU;BYMONTH=3
TZNAME:PDT
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:20231105T020000
TZOFFSETFROM:-0700
TZOFFSETTO:-0800
RRULE:FREQ=YEARLY;BYDAY=1SU;BYMONTH=11
TZNAME:PST
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20240304T201135Z
UID:BE46018F-2C62-4E2F-8313-2264AEA432F3
DTSTART;TZID=America/Los_Angeles:20240222T110000
DTEND;TZID=America/Los_Angeles:20240222T120000
DESCRIPTION:In the past decade\, Deep Learning (DL) has achieved
  unprecedented improvement in the inference accuracy for several
  hard-to-tackle applications such as natural language processing\,
  image classification\, object detection and identification\, etc.
  The state-of-the-art DL models that achieve close to 100% inference
  accuracy are large\, requiring gigabytes of memory to load them. On
  the other end of the spectrum\, the tinyML community is pushing the
  limits of compressing DL models to embed them on memory-limited IoT
  devices. Performing local inference for data samples on the end
  devices reduces delay\, saves network bandwidth\, and improves the
  energy efficiency of the system\, but it suffers from low QoE as
  the small-size DL models have low inference accuracy. To reap the
  benefits of doing local inference while not compromising on the
  inference accuracy\, we explore the idea of Hierarchical Inference
  (HI)\, wherein the local inference is accepted only when it is
  correct; otherwise\, the data sample is offloaded. However\, it is
  generally impossible to know a priori whether the local inference
  is correct. In this talk\, for the prototypical image
  classification application\, I will present the HI online learning
  framework for identifying incorrect local inferences. The resulting
  problem turns out to be a novel partitioning experts problem with
  continuous action space. I will present algorithms with sub-linear
  regret analysis for both adversarial and stochastic arrivals of
  experts and use simulation to demonstrate the efficacy of HI on
  ImageNet and CIFAR-10 datasets.\n\nRoom: 3038\, Bldg: MacLeod
  Building\, 2356 Main Mall\, Vancouver\, British Columbia\, Canada
LOCATION:Room: 3038\, Bldg: MacLeod Building\, 2356 Main Mall\, Vancouver\,
  British Columbia\, Canada
ORGANIZER:mailto:lelewang@ece.ubc.ca
SEQUENCE:17
SUMMARY:Getting the Best of Both Worlds (End Devices and Edge/Cloud)
  Using Hierarchical Inference - IEEE DLT
URL;VALUE=URI:https://events.vtools.ieee.org/m/408207
X-ALT-DESC;FMTTYPE=text/html:<p>In the past decade\, Deep Learning
  (DL) has achieved unprecedented improvement in the inference
  accuracy for several hard-to-tackle applications such as natural
  language processing\, image classification\, object detection and
  identification\, etc. The state-of-the-art DL models that achieve
  close to 100% inference accuracy are large\, requiring gigabytes of
  memory to load them. On the other end of the spectrum\, the tinyML
  community is pushing the limits of compressing DL models to embed
  them on memory-limited IoT devices. Performing local inference for
  data samples on the end devices reduces delay\, saves network
  bandwidth\, and improves the energy efficiency of the system\, but
  it suffers from low QoE as the small-size DL models have low
  inference accuracy. To reap the benefits of doing local inference
  while not compromising on the inference accuracy\, we explore the
  idea of Hierarchical Inference (HI)\, wherein the local inference
  is accepted only when it is correct; otherwise\, the data sample is
  offloaded. However\, it is generally impossible to know a priori
  whether the local inference is correct. In this talk\, for the
  prototypical image classification application\, I will present the
  HI online learning framework for identifying incorrect local
  inferences. The resulting problem turns out to be a novel
  partitioning experts problem with continuous action space. I will
  present algorithms with sub-linear regret analysis for both
  adversarial and stochastic arrivals of experts and use simulation
  to demonstrate the efficacy of HI on ImageNet and CIFAR-10
  datasets.</p>
END:VEVENT
END:VCALENDAR

