Codes for Speed: Large-Scale Computation, Signal Recovery and Learning


Prof. Kannan Ramchandran (UC Berkeley) will be giving a lecture on large scale computation, signal recovery and learning (Jack Keil Wolf Lecture Series). This lecture will be held in Room LC-400, 5 MetroTech Center (4th floor), Brooklyn, NY, on Thursday, Oct. 5, 2017, from 10 am until 11:30 am.

The Jack Keil Wolf Lecture Series is being organized by the Center for Advanced Technology in Telecommunications (CATT) and is co-sponsored by the IEEE New York/North Jersey Information Theory Society Chapter.

Everyone is welcome to attend this meeting.

Please register in advance for this meeting using the registration link below, so that the organizers have an accurate head count. If your plans change, you can cancel your registration using the same link.

For more information, please contact Prof. Shivendra Panwar (http://catt.poly.edu/~panwar), Prof. Elza Erkip (eeweb.poly.edu/~elza) and/or Dr. Adriaan van Wijngaarden (avw@ieee.org).



  Date and Time

  • Date: 05 Oct 2017
  • Time: 10:00 AM to 11:30 AM
  • All times are (GMT-05:00) US/Eastern

  Location

  • 5 MetroTech Center
  • Room LC-400 (4th floor)
  • Brooklyn, New York
  • United States 11201

  Hosts

  • Prof. Elza Erkip (eeweb.poly.edu/~elza; elza@nyu.edu) and Dr. Adriaan J. van Wijngaarden, IEEE New York/North Jersey Information Theory Society Chapter Chair (avw@ieee.org)
  • Co-sponsored by CATT and the IEEE NY/NJ Information Theory Society Chapter

  Registration

  • Starts 15 September 2017 12:00 PM
  • Ends 05 October 2017 11:00 AM
  • All times are (GMT-05:00) US/Eastern
  • No Admission Charge


  Speakers

Kannan Ramchandran of UC Berkeley

Topic:

Codes for Speed: Large-Scale Computation, Signal Recovery and Learning

Abstract - Seven decades after Claude Shannon's groundbreaking work, codes are now an indispensable part of modern communications and storage systems. But do they have a role in today's information age that is witness to an exponential data deluge? Can codes help address the challenge of scale in computation, inference, and learning by exploiting underlying structure such as sparsity? In this talk, we will explore how coding theory can go well beyond traditional communications applications, and can indeed offer an unconventional and valuable playground for some of these problems, with an emphasis on speed. Specifically, we will view a diverse class of problems through the lens of sparse-graph codes. They form the core of a unified architecture featuring a divide-and-conquer strategy built on simple guess-and-check primitives, and fast peeling-based decoding. This allows for real-time sparse-structure recovery involving large datasets, in contrast to popular convex-relaxation-based optimization methods that can be computationally difficult to scale. We will illustrate our approach to computational tasks such as massive-scale sparse Fourier and Walsh transforms, sparse polynomial learning, support recovery in compressed sensing, phase retrieval, and group testing, while unveiling insightful connections between sampling theory and coding theory. The application space is broad, encompassing MRI and optical imaging, hyper-graph sketching, fast neighbor discovery in IoT, spectrum sensing for cognitive radio, and learning mixtures of sparse linear regressions. Time permitting, we will also highlight how codes can speed up machine learning in today's distributed cloud computing systems by rendering them robust to system noise in the form of 'straggling' compute nodes.
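To give a flavor of the peeling-based decoding the abstract refers to, here is a minimal, self-contained sketch (an illustration of the general technique, not the speaker's implementation; the function name, hash layout, and tolerances are all illustrative assumptions). A sparse signal is hashed into bins by a few hash functions; each bin keeps two running sums (value sum and index-weighted value sum), so a bin holding a single element reveals its index by a ratio test (the "guess-and-check" primitive); recovered elements are then subtracted ("peeled") from their other bins, which exposes new singletons:

```python
def peel_decode(nonzeros, hashes, num_bins):
    """Recover a sparse signal from bin sketches by peeling.

    nonzeros : dict index -> value (the hidden sparse signal; used here
               only to simulate the measurement step).
    hashes   : list of d lists, each mapping every index to a bin
               (the edges of a sparse bipartite graph).
    """
    d = len(hashes)
    # Measurement: aggregate each nonzero into its d bins.
    # Each bin stores [sum of values, sum of value * index].
    bins = [[[0.0, 0.0] for _ in range(num_bins)] for _ in range(d)]
    for idx, val in nonzeros.items():
        for h in range(d):
            b = bins[h][hashes[h][idx]]
            b[0] += val
            b[1] += val * idx

    recovered = {}
    progress = True
    while progress:                 # iterate until no new singleton appears
        progress = False
        for h in range(d):
            for b in bins[h]:
                vs, ws = b
                if abs(vs) < 1e-9:  # empty (or fully peeled) bin
                    continue
                guess = ws / vs     # candidate index if this bin is a singleton
                idx = int(round(guess))
                # Check: the ratio must be an integer index that really
                # hashes into this bin and has not been recovered yet.
                if (abs(guess - idx) < 1e-9 and 0 <= idx < len(hashes[h])
                        and bins[h][hashes[h][idx]] is b
                        and idx not in recovered):
                    recovered[idx] = vs
                    # Peel: subtract the element from all d of its bins.
                    for h2 in range(d):
                        b2 = bins[h2][hashes[h2][idx]]
                        b2[0] -= vs
                        b2[1] -= vs * idx
                    progress = True
    return recovered


# Toy instance: n = 8, three nonzeros, d = 2 hash functions, 3 bins each.
# Hash 0 collides indices 1 and 4; hash 1 collides indices 1 and 6.
signal = {1: 2.0, 4: 3.0, 6: 5.0}
h0 = [2, 0, 2, 1, 0, 2, 1, 1]   # index -> bin under hash 0
h1 = [0, 1, 2, 2, 0, 2, 1, 0]   # index -> bin under hash 1
print(peel_decode(signal, [h0, h1], 3))   # → {6: 5.0, 4: 3.0, 1: 2.0}
```

In the toy run, each collision bin fails the ratio test at first; the decoder starts from the bins that happen to be singletons, and peeling those elements turns the collision bins into singletons in the next sweep, exactly the divide-and-conquer dynamic the abstract describes.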


 

 

Biography:

Kannan Ramchandran has been a Professor of Electrical Engineering and Computer Science at UC Berkeley since 1999. He was on the faculty at the University of Illinois from 1993 to 1999. Prof. Ramchandran is a recipient of the 2017 IEEE Kobayashi Computers and Communications Award for his contributions to the theory and practice of distributed storage coding and distributed compression. He is a Fellow of the IEEE, has published extensively in his field, and holds over a dozen patents. He has received several awards for his research and teaching, including an IEEE Information Theory Society and Communications Society Joint Best Paper Award for 2012, an IEEE Communications Society Data Storage Best Paper Award in 2010, two Best Paper Awards from the IEEE Signal Processing Society in 1993 and 1999, an Okawa Foundation Prize for outstanding research at Berkeley in 2001, and an Outstanding Teaching Award at Berkeley in 2009. His research interests lie at the broad intersection of signal processing, machine learning, coding and information theory, and peer-to-peer networking.

Address: Berkeley, California, United States