BEGIN:VCALENDAR
VERSION:2.0
PRODID:IEEE vTools.Events//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
DTSTART:20250309T020000
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
RRULE:FREQ=YEARLY;BYDAY=2SU;BYMONTH=3
TZNAME:EDT
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:20251102T020000
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
RRULE:FREQ=YEARLY;BYDAY=1SU;BYMONTH=11
TZNAME:EST
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20250905T152241Z
UID:3FCC47B2-AEA9-4105-9D85-7397BB724B6A
DTSTART;TZID=America/New_York:20250904T181500
DTEND;TZID=America/New_York:20250904T200000
DESCRIPTION:Recent years have witnessed a huge proliferation of low-cost de
 vices connected to the Internet\nof Things. Given the large amounts of dat
 a generated by these devices near the edge of the\nnetwork\, there is an i
 ncreasing need to process them near the network edge to meet the strict\nl
 atency requirements for the applications. For instance\, Connected Autonom
 ous Vehicle\napplications such as collision warning\, autonomous driving a
 nd traffic efficiency applications\nhave a strict latency requirement rang
 ing from 10 to 100 milliseconds to produce timely actions.\nEdge computin
 g improves the quality of service for such applications by filling the laten
 cy gaps\nbetween the devices and the typical cloud infrastructures. While 
 Micro Data Centers provide\ncomputing resources that are geographically di
 stributed\, careful management of these resources\nnear the edge of the ne
 twork is vital for ensuring efficient\, cost-effective and resilient opera
 tion\nof the system while providing low-latency access for applications ex
 ecuting near the network\nedge. This talk will first introduce the notion 
 of Micro Data Centers and the edge computing\narchitecture. We will then d
 iscuss the algorithms\, techniques and design methodologies focusing\non e
 fficient and resilient resource allocation for latency-sensitive stream da
 ta processing in edge\ncomputing. Finally\, we will discuss some open rese
 arch problems in this area and outline\npotential directions for future wo
 rk.\n\nSpeaker(s): Dr Balaji Palanisamy\, \n\nRoom: Township Meeting Room 
 \, 101 Crawfords Corner Road \, Holmdel Library\, Holmdel\, New Jersey\, U
 nited States\, 07733\, Virtual: https://events.vtools.ieee.org/m/495663
LOCATION:Room: Township Meeting Room \, 101 Crawfords Corner Road \, Holmde
 l Library\, Holmdel\, New Jersey\, United States\, 07733\, Virtual: https:
 //events.vtools.ieee.org/m/495663
ORGANIZER:mailto:Sharmaneerja100@gmail.com
SEQUENCE:51
SUMMARY:Latency-aware and Resilient Stream Data Processing in Edge Computin
 g
URL;VALUE=URI:https://events.vtools.ieee.org/m/495663
X-ALT-DESC:Description: <br /><p>Recent years have witnessed a huge prolif
 eration of low-cost devices connected to the Internet<br>of Things. Giv
 en the large amounts of data generated by these devices near the edge o
 f the<br>network\, there is an increasing need to process them near the n
 etwork edge to meet the strict<br>latency requirements for the applicati
 ons. For instance\, Connected Autonomous Vehicle<br>applications such as c
 ollision warning\, autonomous driving and traffic efficiency application
 s<br>have a strict latency requirement ranging from 10 to 100 millisecond
 s to produce timely actions.<br>Edge computing improves the quality of ser
 vice for such applications by filling the latency gaps<br>between the devi
 ces and the typical cloud infrastructures. While Micro Data Centers provid
 e<br>computing resources that are geographically distributed\, careful man
 agement of these resources<br>near the edge of the network is vital for en
 suring efficient\, cost-effective and resilient operation<br>of the syste
 m while providing low-latency access for applications executing near the n
 etwork<br>edge. This talk will first introduce the notion of Micro Data Ce
 nters and the edge computing<br>architecture. We will then discuss the alg
 orithms\, techniques and design methodologies focusing<br>on efficient an
 d resilient resource allocation for latency-sensitive stream data processi
 ng in edge<br>computing. Finally\, we will discuss some open research prob
 lems in this area and outline<br>potential directions for future work.</p>
END:VEVENT
END:VCALENDAR