BEGIN:VCALENDAR
VERSION:2.0
PRODID:IEEE vTools.Events//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:US/Eastern
BEGIN:DAYLIGHT
DTSTART:20200308T030000
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
RRULE:FREQ=YEARLY;BYDAY=2SU;BYMONTH=3
TZNAME:EDT
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:20201101T010000
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
RRULE:FREQ=YEARLY;BYDAY=1SU;BYMONTH=11
TZNAME:EST
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20210121T012844Z
UID:12A49F40-26BF-4D25-90E7-C8CF4F0D1F58
DTSTART;TZID=US/Eastern:20201028T181500
DTEND;TZID=US/Eastern:20201028T201500
DESCRIPTION:Guessing Random Additive Noise Decoding (GRAND)\n\nWe
 introduce a new algorithm for a noise-centric\, rather than
 code-centric\, ML decoding. The algorithm is based on the principle
 that the receiver rank orders noise sequences from most likely to
 least likely\, and guesses noises accordingly.\n\nAbstract: Claude
 Shannon's 1948 "A Mathematical Theory of Communication" provided the
 basis for the digital communication revolution. As part of that
 ground-breaking work\, he identified the greatest rate (capacity) at
 which data can be communicated over a noisy channel. He also provided
 an algorithm for achieving it\, based on random codes and a
 code-centric Maximum Likelihood (ML) decoding\, where channel outputs
 are compared to all possible codewords to select the most likely
 candidate based on the observed output. Despite its mathematical
 elegance\, his algorithm is impractical from a complexity
 perspective\, and much work in the intervening 70 years has focused
 on co-designing codes and decoders that enable reliable communication
 at high rates.\n\nWe introduce a new algorithm for a noise-centric\,
 rather than code-centric\, ML decoding. The algorithm is based on the
 principle that the receiver rank orders noise sequences from most
 likely to least likely and guesses noises accordingly. Subtracting
 noise from the received signal in that order\, the first instance
 that results in an element of the code-book is the ML decoding. For
 common additive noise channels\, we establish that the algorithm is
 capacity-achieving for uniformly selected code-books\, providing an
 intuitive alternate approach to the channel coding theorem. We
 illustrate the practical usefulness of our approach and the fact that
 it renders the decoding of random codes feasible. The complexity of
 the decoding is\, for the sorts of channels generally used in
 commercial applications\, quite low\, unlike code-centric ML
 decoding.\n\nCo-sponsored by: Chamara Johnson\, Chair\, VTS chapter
 NY section\n\nSpeaker(s): Dr. Muriel Medard\n\nAgenda:\n6.15pm to
 6.20pm: Login and checks\n\n6.20pm: Introduce speaker\; 6.30pm -
 8.00pm: Talk "Guessing Random Additive Noise Decoding
 (GRAND)"\n\n8.00pm - 8.15pm: Q & A session\n8.15pm: Closing
 remarks\n\nHolmdel\, New Jersey\, United States\, 07733\, Virtual:
 https://events.vtools.ieee.org/m/241603
LOCATION:Holmdel\, New Jersey\, United States\, 07733\, Virtual: https://ev
 ents.vtools.ieee.org/m/241603
ORGANIZER:mailto:Raghunandan@ieee.org
SEQUENCE:8
SUMMARY:CAPACITY OF WIRELESS NETWORKS - A NEW APPROACH
URL;VALUE=URI:https://events.vtools.ieee.org/m/241603
X-ALT-DESC;FMTTYPE=text/html:Description: <br /><p><strong>Guessing
 Random Additive Noise Decoding (GRAND)</strong></p>\n<p>We introduce
 a new algorithm for a noise-centric\, rather than code-centric\, ML
 decoding. The algorithm is based on the principle that the receiver
 rank orders noise sequences from most likely to least likely\, and
 guesses noises accordingly.</p>\n<p><strong><u>Abstract:</u></strong>
 Claude Shannon's 1948 "A Mathematical Theory of Communication"
 provided the basis for the digital communication revolution. As part
 of that ground-breaking work\, he identified the greatest rate
 (capacity) at which data can be communicated over a noisy channel. He
 also provided an algorithm for achieving it\, based on random codes
 and a code-centric Maximum Likelihood (ML) decoding\, where channel
 outputs are compared to all possible codewords to select the most
 likely candidate based on the observed output. Despite its
 mathematical elegance\, his algorithm is impractical from a
 complexity perspective\, and much work in the intervening 70 years
 has focused on co-designing codes and decoders that enable reliable
 communication at high rates.</p>\n<p>&nbsp\;</p>\n<p>We introduce a
 new algorithm for a noise-centric\, rather than code-centric\, ML
 decoding. The algorithm is based on the principle that the receiver
 rank orders noise sequences from most likely to least likely and
 guesses noises accordingly. Subtracting noise from the received
 signal in that order\, the first instance that results in an element
 of the code-book is the ML decoding. For common additive noise
 channels\, we establish that the algorithm is capacity-achieving for
 uniformly selected code-books\, providing an intuitive alternate
 approach to the channel coding theorem. We illustrate the practical
 usefulness of our approach and the fact that it renders the decoding
 of random codes feasible. The complexity of the decoding is\, for the
 sorts of channels generally used in commercial applications\, quite
 low\, unlike code-centric ML decoding.</p><br /><br />Agenda:
 <br /><p>6.15pm to 6.20pm: Login and checks</p>\n<p>6.20pm: Introduce
 speaker\; 6.30pm - 8.00pm: Talk "Guessing Random Additive Noise
 Decoding (GRAND)"</p>\n<p>8.00pm - 8.15pm: Q &amp\; A session
 8.15pm: Closing remarks</p>
END:VEVENT
END:VCALENDAR