BEGIN:VCALENDAR
VERSION:2.0
PRODID:IEEE vTools.Events//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:US/Eastern
BEGIN:DAYLIGHT
DTSTART:20220313T030000
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
RRULE:FREQ=YEARLY;BYDAY=2SU;BYMONTH=3
TZNAME:EDT
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:20221106T010000
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
RRULE:FREQ=YEARLY;BYDAY=1SU;BYMONTH=11
TZNAME:EST
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20220912T200938Z
UID:6BA1187B-0F06-4CD6-85D5-F7198A3EDE08
DTSTART;TZID=US/Eastern:20220907T170000
DTEND;TZID=US/Eastern:20220907T200000
DESCRIPTION:Links for the movie and discussion can be found in the
  location area. See below for dates and time details. If you are
  registering or did not receive an email\, please contact Anne
  (anne.costolanski@gmail) or Chauncey (chaunceywie@gmail.com) for the
  links. Thanks!\n\nCoded BIAS Movie Event 1: August 31st 2022
  12:00pm-1:30pm EDT\n\nCoded BIAS Movie Event 2: September 7th 2022
  5:00pm-6:30pm EDT\n\nCoded Bias Discussion: September 7th 2022
  7:00pm-8:00pm EDT\n\nThe movie "Coded Bias" explores the fallout of
  MIT Media Lab researcher Joy Buolamwini’s discovery that facial
  recognition does not see dark-skinned faces accurately\, and her
  journey to push for the first-ever legislation in the U.S. to govern
  against bias in the algorithms that impact us all.\n\nModern society
  sits at the intersection of two crucial questions: What does it mean
  when artificial intelligence increasingly governs our liberties? And
  what are the consequences for the people AI is biased against? When
  MIT Media Lab researcher Joy Buolamwini discovers that many facial
  recognition technologies do not accurately detect darker-skinned
  faces or classify the faces of women\, she delves into an
  investigation of widespread bias in algorithms. As it turns out\,
  artificial intelligence is not neutral\, and women are leading the
  charge to ensure our civil rights are protected.\n\nJoin us for a
  discussion of the movie\, its topics\, and our own experiences on
  Wednesday\, Sept 7\, at 7pm. A link to the movie will be provided in
  advance so attendees can watch the movie on their own time\, prior to
  the discussion. Even if you can't make the discussion\, sign up to
  watch the movie.\n\nWatch the trailer here --> [About — CODED
  BIAS](https://www.codedbias.com/about)\n\nCo-sponsored by: WE30387 -
  Piedmont WIE\n\nVirtual: https://events.vtools.ieee.org/m/322405
LOCATION:Virtual: https://events.vtools.ieee.org/m/322405
ORGANIZER:mailto:chaunceywie@gmail.com
SEQUENCE:7
SUMMARY:"Coded Bias" movie viewing and discussion
URL;VALUE=URI:https://events.vtools.ieee.org/m/322405
X-ALT-DESC;FMTTYPE=text/html:Description: <br /><p>Links for the movie
  and discussion can be found in the location area. See below for dates
  and time details. &nbsp\;If you are registering or did not receive an
  email\, please contact Anne (anne.costolanski@gmail) or Chauncey
  (chaunceywie@gmail.com) for the links. &nbsp\;Thanks!</p>\n<p>Coded
  BIAS Movie Event 1: <strong>August 31st 2022 </strong>12:00pm-1:30pm
  EDT</p>\n<p>Coded BIAS Movie Event 2:&nbsp\; <strong>September 7th
  2022 </strong>5:00pm-6:30pm EDT&nbsp\;</p>\n<p>Coded Bias Discussion:
  <strong>September 7th 2022 </strong>7:00pm-8:00pm EDT</p>\n<p>The
  movie "Coded Bias" explores the fallout of MIT Media Lab researcher
  Joy Buolamwini&rsquo\;s discovery that facial recognition does not
  see dark-skinned faces accurately\, and her journey to push for the
  first-ever legislation in the U.S. to govern against bias in the
  algorithms that impact us all.</p>\n<p>Modern society sits at the
  intersection of two crucial questions: What does it mean when
  artificial intelligence increasingly governs our liberties? And what
  are the consequences for the people AI is biased against? When MIT
  Media Lab researcher Joy Buolamwini discovers that many facial
  recognition technologies do not accurately detect darker-skinned
  faces or classify the faces of women\, she delves into an
  investigation of widespread bias in algorithms. As it turns out\,
  artificial intelligence is not neutral\, and women are leading the
  charge to ensure our civil rights are protected.</p>\n<p>Join us for
  a discussion of the movie\, its topics\, and our own experiences on
  Wednesday\, Sept 7\, at 7pm. &nbsp\;A link to the movie will be
  provided in advance so attendees can watch the movie on their own
  time\, prior to the discussion. Even if you can't make the
  discussion\, sign up to watch the movie. &nbsp\;</p>\n<p>Watch the
  trailer here --&gt\; <a href="https://www.codedbias.com/about">About
  &mdash\; CODED BIAS</a></p>
END:VEVENT
END:VCALENDAR