BEGIN:VCALENDAR
VERSION:2.0
PRODID:IEEE vTools.Events//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:US/Eastern
BEGIN:DAYLIGHT
DTSTART:20240310T030000
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
RRULE:FREQ=YEARLY;BYDAY=2SU;BYMONTH=3
TZNAME:EDT
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:20241103T010000
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
RRULE:FREQ=YEARLY;BYDAY=1SU;BYMONTH=11
TZNAME:EST
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20240319T134823Z
UID:1B8C60A5-B6A4-4D30-8877-CBD361F84942
DTSTART;TZID=US/Eastern:20240316T090000
DTEND;TZID=US/Eastern:20240316T123000
DESCRIPTION:Course Format: Live Webinar\, 3.5 hours of instruction!
  Series Overview: From the book introduction: “Neural networks and
  deep learning currently provide the best solutions to many problems
  in image recognition\, speech recognition\, and natural language
  processing.”\n\nThis Part 1 and planned Part 2 (to be confirmed)
  series of courses will teach many of the core concepts behind neural
  networks and deep learning.\n\nThis is a live\, instructor-led
  introductory course on Neural Networks and Deep Learning. It is
  planned as a two-part series of courses. The first course is complete
  by itself and covers a feedforward neural network (but not
  convolutional neural networks\, in Part 1). It will be a prerequisite
  for the planned Part 2 second course. The class material is mostly
  from the highly regarded and free online book “Neural Networks and
  Deep Learning” by Michael Nielsen\, plus additional material such as
  some proofs of fundamental equations not provided in the
  book.\n\nMore from the book introduction: Reference book: “Neural
  Networks and Deep Learning” by Michael Nielsen\,
  http://neuralnetworksanddeeplearning.com/ “We’ll learn the core
  principles behind neural networks and deep learning by attacking a
  concrete problem: the problem of teaching a computer to recognize
  handwritten digits. …it can be solved pretty well using a simple
  neural network\, with just a few tens of lines of code\, and no
  special libraries.”\n\n“But you don’t need to be a professional
  programmer.”\n\nThe code provided is in Python\; even if you don’t
  program in Python\, it should be easy to understand with just a
  little effort.\n\nBenefits of attending the series:\n\n* Learn the
  core principles behind neural networks and deep learning.\n* See a
  simple Python program that solves a concrete problem: teaching a
  computer to recognize a handwritten digit.\n* Improve the result by
  incorporating more and more core ideas about neural networks and deep
  learning.\n* Understand the theory\, with worked-out proofs of
  fundamental equations.\n\nThe demo Python program (updated from the
  version provided in the book) can be downloaded from the speaker’s
  GitHub account. The demo program runs in a Docker container on your
  Mac\, Windows\, or Linux personal computer\; we plan to provide
  instructions on doing that in advance of the class.\n\n(That is one
  good reason to register early if you plan to attend: you will receive
  the straightforward instructions and leave yourself plenty of time to
  set up the Git and Docker software that are widely used among
  software professionals.)\n\nOutline:\n\n- Feedforward Neural
  Networks\n- Simple (Python) Network to classify a handwritten
  digit\n- Learning with Stochastic Gradient Descent\n- How the
  backpropagation algorithm works\n- Improving the way neural networks
  learn:\n  - Cross-entropy cost function\n  - Softmax activation
  function and log-likelihood cost function\n  - Rectified Linear
  Unit\n- Overfitting and Regularization:\n  - L2 regularization\n  -
  Dropout\n  - Artificially expanding the data set\n\nPrerequisites:
  There is some heavier mathematics in learning the four fundamental
  equations behind backpropagation\, so a basic familiarity with
  multivariable calculus and matrix algebra is expected\, but nothing
  advanced is required. (The backpropagation equations can also simply
  be accepted without working through the proofs\, since the provided
  Python code for the simple network just makes use of the equations.)
  Basic familiarity with Python or a similar programming language is
  also expected.\n\nSpeaker(s): CL Kim\n\nBoston\, Massachusetts\,
  United States\, Virtual: https://events.vtools.ieee.org/m/401136
LOCATION:Boston\, Massachusetts\, United States\, Virtual: https://events.v
 tools.ieee.org/m/401136
ORGANIZER:mailto:k.safina@ieee.org
SEQUENCE:21
SUMMARY:Introduction to Neural Networks and Deep Learning (Part I)
URL;VALUE=URI:https://events.vtools.ieee.org/m/401136
X-ALT-DESC;FMTTYPE=text/html:<p>Course Format: Live Webinar\, 3.5 hours
  of instruction! Series Overview: From the book introduction:
  &ldquo\;Neural networks and deep learning currently provide the best
  solutions to many problems in image recognition\, speech
  recognition\, and natural language processing.&rdquo\;<br><br>This
  Part 1 and planned Part 2 (to be confirmed) series of courses will
  teach many of the core concepts behind neural networks and deep
  learning.</p>\n<p>This is a live\, instructor-led introductory course
  on Neural Networks and Deep Learning. It is planned as a two-part
  series of courses. The first course is complete by itself and covers
  a feedforward neural network (but not convolutional neural networks\,
  in Part 1). It will be a prerequisite for the planned Part 2 second
  course. The class material is mostly from the highly regarded and
  free online book &ldquo\;Neural Networks and Deep Learning&rdquo\; by
  Michael Nielsen\, plus additional material such as some proofs of
  fundamental equations not provided in the book.<br><br>More from the
  book introduction: Reference book: &ldquo\;Neural Networks and Deep
  Learning&rdquo\; by Michael Nielsen\, <a
  href="http://neuralnetworksanddeeplearning.com/" target="_blank"
  rel="noopener">http://neuralnetworksanddeeplearning.com/</a>
  &ldquo\;We&rsquo\;ll learn the core principles behind neural networks
  and deep learning by attacking a concrete problem: the problem of
  teaching a computer to recognize handwritten digits. &hellip\;it can
  be solved pretty well using a simple neural network\, with just a few
  tens of lines of code\, and no special
  libraries.&rdquo\;<br><br>&ldquo\;But you don&rsquo\;t need to be a
  professional programmer.&rdquo\;<br><br>The code provided is in
  Python\; even if you don&rsquo\;t program in Python\, it should be
  easy to understand with just a little effort.</p>\n<p>Benefits of
  attending the series:</p>\n<ul>\n<li>Learn the core principles behind
  neural networks and deep learning.</li>\n<li>See a simple Python
  program that solves a concrete problem: teaching a computer to
  recognize a handwritten digit.</li>\n<li>Improve the result by
  incorporating more and more core ideas about neural networks and deep
  learning.</li>\n<li>Understand the theory\, with worked-out proofs of
  fundamental equations.</li>\n</ul>\n<p>The demo Python program
  (updated from the version provided in the book) can be downloaded
  from the speaker&rsquo\;s GitHub account. The demo program runs in a
  Docker container on your Mac\, Windows\, or Linux personal computer\;
  we plan to provide instructions on doing that in advance of the
  class.</p>\n<p>(That is one good reason to register early if you plan
  to attend: you will receive the straightforward instructions and
  leave yourself plenty of time to set up the Git and Docker software
  that are widely used among software
  professionals.)</p>\n<p>Outline:</p>\n<ul>\n<li>Feedforward Neural
  Networks</li>\n<li>Simple (Python) Network to classify a handwritten
  digit</li>\n<li>Learning with Stochastic Gradient
  Descent</li>\n<li>How the backpropagation algorithm
  works</li>\n<li>Improving the way neural networks
  learn:\n<ul>\n<li>Cross-entropy cost function</li>\n<li>Softmax
  activation function and log-likelihood cost
  function</li>\n<li>Rectified Linear
  Unit</li>\n</ul>\n</li>\n<li>Overfitting and
  Regularization:\n<ul>\n<li>L2
  regularization</li>\n<li>Dropout</li>\n<li>Artificially expanding the
  data set</li>\n</ul>\n</li>\n</ul>\n<p>Prerequisites: There is some
  heavier mathematics in learning the four fundamental equations behind
  backpropagation\, so a basic familiarity with multivariable calculus
  and matrix algebra is expected\, but nothing advanced is required.
  (The backpropagation equations can also simply be accepted without
  working through the proofs\, since the provided Python code for the
  simple network just makes use of the equations.) Basic familiarity
  with Python or a similar programming language is also expected.</p>
END:VEVENT
END:VCALENDAR

