BEGIN:VCALENDAR
VERSION:2.0
PRODID:IEEE vTools.Events//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:US/Eastern
BEGIN:DAYLIGHT
DTSTART:20210314T020000
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
RRULE:FREQ=YEARLY;BYDAY=2SU;BYMONTH=3
TZNAME:EDT
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:20211107T020000
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
RRULE:FREQ=YEARLY;BYDAY=1SU;BYMONTH=11
TZNAME:EST
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20210524T142811Z
UID:BBDC3C0D-8170-4FCE-B998-BBA70EB5AE54
DTSTART;TZID=US/Eastern:20210320T090000
DTEND;TZID=US/Eastern:20210320T121500
DESCRIPTION:This course is confirmed to run on Saturday\, March 20th!\n\nCo
 urse Background and Content:\n\nThis is a live instructor-led introductory
  course on Neural Networks and Deep Learning. It is planned to be a two-pa
 rt series of courses. The first course is complete by itself. It will be a
  pre-requisite for the planned second course. The class material is mostly
  from the highly-regarded and free online book “Neural Networks and Deep
  Learning” by Michael Nielsen\, plus additional material such as some pr
 oofs of fundamental equations not provided in the book\, and (in planned P
 art 2) touching on more recent neural network types such as ResNet.\n\n3 h
 ours of instruction!\n\nFrom the book introduction: “Neural networks and
  deep learning currently provide the best solutions to many problems in i
 mage recognition\, speech recognition\, and natural language processing.
 ”\n\nThis Part 1 and the planned Part 2 (late spring/early summer 2021\,
  to be confirmed) series of courses will teach many of the core concepts b
 ehind neural networks and deep learning.\n\nReference book:\n\n“Neural N
 etworks and Deep Learning” by Michael Nielsen\, [http://neuralnetworksan
 ddeeplearning.com](http://neuralnetworksanddeeplearning.com/)\nMore from t
 he book introduction: “We’ll learn the core principles behind neural n
 etworks and deep learning by attacking a concrete problem: the problem of 
 teaching a computer to recognize handwritten digits. ...it can be solved p
 retty well using a simple neural network\, with just a few tens of lines o
 f code\, and no special libraries.” “But you don’t need to be a prof
 essional programmer.”\n\nThe code provided is in Python\, and even if y
 ou don’t program in Python\, it should be easy to understand with just
  a little effort.\n\nBenefits of attending the series:\n\n* Learn the co
 re pri
 nciples behind neural networks and deep learning.\n* See a simple Python
  program that solves a concrete problem: teaching a computer to recogniz
 e a handwritten digit.\n* Improve the result by incorporating more and m
 ore of the core ideas about neural networks and deep learning.\n* Princi
 ple-ori
 ented\, with worked-out proofs of fundamental equations of backpropagation
  for those interested.\n* Yet hands-on and practical\, with simple code
  examples.\n\nSpeaker(s): CL Kim\n\nAgenda:\n\nIntroduction to Prac
 tical Neural Networks and Deep Learning (Part 1)\nFeedforward Neural Netwo
 rks.\n* Simple (Python) Network to classify a handwritten digit\n* Learnin
 g with Gradient Descent\n* How the backpropagation algorithm works\n* Impr
 oving the way neural networks learn:\n** Cross-entropy cost function\n** S
 oftmax activation function and log-likelihood cost function\n** Rectified 
 Linear Unit\n** Overfitting and Regularization:\n*** L2 regularization\n**
 * Dropout\n*** Artificially expanding the data set\n*** Hyper-parameters\n\nPr
 e-requisites:\nThere is some heavier mathematics in proving the four fund
 amental equations behind backpropagation\, so a basic familiarity with mu
 ltivariable calculus and linear algebra is expected\, but nothing advance
 d is required. (The backpropagation equations can also simply be accepted
  without working through the proofs\, since the provided Python code for
  the simple network just makes use of them.)\n\nA live\, interactive webi
 nar\,
  Boston\, Massachusetts\, United States\, Virtual: https://events.vtools.i
 eee.org/m/253675
LOCATION:A live\, interactive webinar\, Boston\, Massachusetts\, United Sta
 tes\, Virtual: https://events.vtools.ieee.org/m/253675
ORGANIZER:mailto:ieeebostonsection@gmail.com
SEQUENCE:13
SUMMARY:Introduction to Practical Neural Networks and Deep Learning (Part 1
 )
URL;VALUE=URI:https://events.vtools.ieee.org/m/253675
X-ALT-DESC;FMTTYPE=text/html:Description: <br /><p>This course is conf
 irmed to run on Saturday\, March 20th!&nbsp\;</p>\n<p>Course Backgroun
 d and Content:</p>\n<p><br />This is a live instructor-led introductor
 y course on Neural Networks and Deep Learning. It is planned to be a t
 wo-part series of courses. The first course is complete by itself. It
  will be a pre-requisite for the planned second course. The class mate
 rial is mostly from the highly-regarded and free online book &ldquo\;N
 eural Networks and Deep Learning&rdquo\; by Michael Nielsen\, plus add
 itional material such as some proofs of fundamental equations not prov
 ided in the book\, and (in planned Part 2) touching on more recent neu
 ral network types such as ResNet.</p>\n<p>3 hours of instruction!</p>
 \n<p>From the book introduction: &ldquo\;Neural networks and deep lea
 rning currently provide the best solutions to many problems in image
  recognition\, speech recognition\, and natural language processing.
 &rdquo\;<br /><br />This Part 1 and the planned Part 2 (late spring/e
 arly summer 2021\, to be confirmed) series of courses will teach many
  of the core concepts behind neural networks and deep learning.<br /
 ><br />Reference book:</p>\n<p><br />&ldquo\;Neural Networks and Dee
 p Learning&rdquo\; by Michael Nielsen\,&nbsp\;<a href="http://neural
 networksanddeeplearning.com/" target="_blank" rel="noopener">http://
 neuralnetworksanddeeplearning.com</a><br />More from the book introd
 uction: &ldquo\;We&rsquo\;ll learn the core principles behind neural
  networks and deep learning by attacking a concrete problem: the pro
 blem of teaching a computer to recognize handwritten digits. ...it c
 an be solved pretty well using a simple neural network\, with just a
  few tens of lines of code\, and no special libraries.&rdquo\; &ldqu
 o\;But you don&rsquo\;t need to be a professional programmer.&rdquo
 \;</p>\n<p><br />The code provided is in Python\, and even if you do
 n&rsquo\;t program in Python\, it should be easy to understand with
  just a little effort.<br /><br />Benefits of attending the series:<
 /p>\n<p><br />* Learn the core principles behind neural networks and
  deep learning.<br />* See a simple Python program that solves a con
 crete problem: teaching a computer to recognize a handwritten digit.
 <br />* Improve the result by incorporating more and more of the cor
 e ideas about neural networks and deep learning.<br />* Principle-or
 iented\, with worked-out proofs of fundamental equations of backprop
 agation for those interested.<br />* Yet hands-on and practical\, wi
 th simple code examples.</p>\n<p>Agenda:</p>\n<p><br />Introduction
  to Practical Neural Networks and Deep Learning (Part 1)<br />Feedfo
 rward Neural Networks.<br />* Simple (Python) Network to classify a
  handwritten digit<br />* Learning with Gradient Descent<br />* How
  the backpropagation algorithm works<br />* Improving the way neural
  networks learn:<br />** Cross-entropy cost function<br />** Softmax
  activation function and log-likelihood cost function<br />** Recti
 fied Linear Unit<br />** Overfitting and Regularization:<br />*** L2
  regularization<br />*** Dropout<br />*** Artificially expanding the
  data set<br />*** Hyper-parameters</p>\n<p>Pre-requisites:<br />The
 re is some heavier mathematics in proving the four fundamental equat
 ions behind backpropagation\, so a basic familiarity with multivaria
 ble calculus and linear algebra is expected\, but nothing advanced i
 s required. (The backpropagation equations can also simply be accept
 ed without working through the proofs\, since the provided Python co
 de for the simple network just makes use of them.)</p>
END:VEVENT
END:VCALENDAR

