Rochester SPS Neural Networks From Scratch Lecture 1: Regression and Gradient Descent

#neural-networks #machine-learning #artificial-intelligence #tutorial

Neural networks are the dominant class of machine learning algorithms for computer vision and natural language processing today, and their rise over the past 10 years feels like a revolution. But the history of neural networks is not one of revolution, but of evolution: modern neural networks represent decades of fine-tuning of old ideas. In this first lecture, we discuss linear regression and logistic regression. These linear models, in addition to being useful in their own right, are examples that we can use to explain concepts critical to neural networks, such as loss functions, gradient descent, and activation functions.
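As a preview of the concepts the lecture covers, here is a minimal sketch (not the speaker's material) of linear regression trained by gradient descent on a mean-squared-error loss, using synthetic data with assumed true parameters w = 3.0 and b = 0.5:

```python
import numpy as np

# Synthetic data: y = 3.0 * x + 0.5 plus a little noise (assumed example values).
rng = np.random.default_rng(0)
x = rng.uniform(-1, 1, size=100)
y = 3.0 * x + 0.5 + rng.normal(scale=0.05, size=100)

w, b = 0.0, 0.0  # initial parameters
lr = 0.1         # learning rate

for _ in range(500):
    y_hat = w * x + b
    error = y_hat - y
    loss = np.mean(error ** 2)       # MSE loss function
    grad_w = 2 * np.mean(error * x)  # dLoss/dw
    grad_b = 2 * np.mean(error)      # dLoss/db
    w -= lr * grad_w                 # gradient descent step
    b -= lr * grad_b

# After training, (w, b) should be close to the true (3.0, 0.5).
```

Logistic regression follows the same recipe, with a sigmoid activation applied to `w * x + b` and a cross-entropy loss in place of MSE.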



  Date and Time

  • Date: 30 Jul 2021
  • Time: 12:00 PM to 01:00 PM
  • All times are (GMT-05:00) US/Eastern

  Registration

  • Starts 24 June 2021 09:07 PM
  • Ends 30 July 2021 11:59 AM
  • No Admission Charge


  Speakers

Dr. Miguel Dominguez

Biography:

Miguel Dominguez is a machine learning engineer at VisualDx, where he works on automatically diagnosing skin diseases with machine learning. He received his PhD in Engineering from Rochester Institute of Technology in 2021, his MS in Electrical Engineering from RIT in 2016, and a BS in Computer Science and Engineering from the University of Toledo in 2012. His research interests include deep learning, point cloud analysis, graph theory, speech processing, and biomedical imaging.