BEGIN:VCALENDAR
VERSION:2.0
PRODID:IEEE vTools.Events//EN
CALSCALE:GREGORIAN
BEGIN:VTIMEZONE
TZID:America/New_York
BEGIN:DAYLIGHT
DTSTART:20220313T030000
TZOFFSETFROM:-0500
TZOFFSETTO:-0400
RRULE:FREQ=YEARLY;BYDAY=2SU;BYMONTH=3
TZNAME:EDT
END:DAYLIGHT
BEGIN:STANDARD
DTSTART:20221106T010000
TZOFFSETFROM:-0400
TZOFFSETTO:-0500
RRULE:FREQ=YEARLY;BYDAY=1SU;BYMONTH=11
TZNAME:EST
END:STANDARD
END:VTIMEZONE
BEGIN:VEVENT
DTSTAMP:20221023T131814Z
UID:AB220996-76DF-4BDE-ACC2-3EFD9567E980
DTSTART;TZID=America/New_York:20221019T170000
DTEND;TZID=America/New_York:20221019T190000
DESCRIPTION:Abstract: As reported in the Forbes article “The Role Of Bia
 s In Artificial Intelligence” (https://www.forbes.com/sites/forbestechco
 uncil/2021/02/04/the-role-of-bias-in-artificial-intelligence/?sh=421300f
 7579d)\, facial recognition systems are under scrutiny. Class imbalance i
 s a leading issue in facial recognition software. A dataset called "Face
 s in the Wild\," considered the benchmark for testing facial recognitio
 n software\, had data that was 70% male and 80% white. Although it migh
 t be good enough to be used on lower-quality pictures\, whether it refle
 cts conditions "in the wild" is highly debatable.\n\nApart from algorith
 ms and data\, researchers and engineers developing any system are also r
 esponsible for bias. According to VentureBeat\, a Columbia University st
 udy found that "the more homogenous the [engineering] team is\, the mor
 e likely it is that a given prediction error will appear." This can crea
 te a lack of empathy for the people who face problems of discrimination
 \, leading to an unconscious introduction of bias in these algorithm-dri
 ven systems.\n\nSo\, how can we eliminate the negative impact of bias i
 n the use or development of our technology? Come to this session to gai
 n insights into this persnickety challenge.\n\nRoom: 128\, Bldg: SETA ID
 E\, 2500 North River Rd.\, Manchester\, New Hampshire\, United States\, 0
 3106
LOCATION:Room: 128\, Bldg: SETA IDE\, 2500 North River Rd.\, Manchester\, N
 ew Hampshire\, United States\, 03106
ORGANIZER:mailto:bbancroft@ieee.org
SEQUENCE:2
SUMMARY:Eliminating Bias from Technology for Developers
URL;VALUE=URI:https://events.vtools.ieee.org/m/327273
X-ALT-DESC;FMTTYPE=text/html:<p>Abstract: As reported in the Forbes arti
 cle &ldquo\;<a href="https://www.forbes.com/sites/forbestechcouncil/202
 1/02/04/the-role-of-bias-in-artificial-intelligence/?sh=421300f7579d" ta
 rget="_blank" rel="noopener noreferrer">The Role Of Bias In Artificial I
 ntelligence</a>&rdquo\;\, facial recognition systems are under scrutin
 y. Class imbalance is a leading issue in facial recognition software. A d
 ataset called "Faces in the Wild\," considered the benchmark for testin
 g facial recognition software\, had data that was 70% male and 80% whit
 e. Although it might be good enough to be used on lower-quality picture
 s\, whether it reflects conditions "in the wild" is highly debatable.</
 p>\n<p>Apart from algorithms and data\, researchers and engineers devel
 oping any system are also responsible for bias. According to VentureBea
 t\, a Columbia University study found that "the more homogenous the [en
 gineering] team is\, the more likely it is that a given prediction erro
 r will appear." This can create a lack of empathy for the people who fa
 ce problems of discrimination\, leading to an unconscious introduction o
 f bias in these algorithm-driven systems.</p>\n<p>So\, how can we elimin
 ate the negative impact of bias in the use or development of our techno
 logy? Come to this session to gain insights into this persnickety chall
 enge.</p>
END:VEVENT
END:VCALENDAR