Joy Buolamwini

From Wikipedia, the free encyclopedia


== Early life and education ==
Buolamwini was born in [[Edmonton]], Alberta, grew up in [[Mississippi]], and attended [[Cordova High School (Tennessee)|Cordova High School]] in Cordova, Tennessee.<ref name=":0">{{Cite news|url=https://amysmartgirls.com/the-future-of-computer-science-and-tech-12-young-women-to-watch-part-2-334c2282025d|title=The Future of Computer Science and Tech: 12 Young Women to Watch — Part 2|date=2018-02-19|website=amysmartgirls.com|publisher=Amy Poehler’s Smart Girls|access-date=2018-03-24}}</ref> At age nine, she was inspired by [[Kismet (robot)|Kismet]], the [[Massachusetts Institute of Technology|MIT]] robot, and taught herself [[XHTML]], [[JavaScript]] and [[PHP]].<ref name=":1">{{Cite web|url=http://www.trycomputing.org/career-profiles/joy-buolamwini|title=Joy Buolamwini {{!}} TryComputing.org|website=trycomputing.org|language=en|access-date=2018-03-24|archive-date=March 25, 2018|archive-url=https://web.archive.org/web/20180325045604/http://www.trycomputing.org/career-profiles/joy-buolamwini|url-status=dead}}</ref><ref>{{Cite news|url=https://www.bloomberg.com/news/articles/2017-06-26/the-digital-activist-taking-human-prejudice-out-of-our-machines|title=Meet The Digital Activist That's Taking Human Prejudice Out of Our Machines|date=2017-06-26|work=Bloomberg.com|access-date=2018-03-24|language=en}}</ref> As a student-athlete, she was a competitive [[pole vault]]er<ref>{{Cite web|url=http://vault.awardspace.com/JoyBuolamwini.html|title=CHS Pole Vaulting - Joy Buolamwini|website=vault.awardspace.com|access-date=2018-03-24|archive-url=https://web.archive.org/web/20180325050435/http://vault.awardspace.com/JoyBuolamwini.html|archive-date=March 25, 2018|url-status=dead}}</ref> and played basketball. 
On an episode of Brené Brown's podcast ''Dare to Lead'', she recalled completing her AP Physics homework during basketball breaks.<ref name="open.spotify.com">{{cite web | url=https://open.spotify.com/episode/3eQoX2LMQkyASGHokzRMo9?si=9469d723072e4cee | title=Spotify | website=[[Spotify]] }}</ref>


As an undergraduate, Buolamwini studied computer science at the [[Georgia Institute of Technology]], where she researched [[health informatics]].<ref name=":3">{{Cite news|url=http://www.blackenterprise.com/tech-startup-of-the-week-techturized/|title=Tech Startup of The Week: Techturized Wins With Hair Care Company|date=2013-03-15|work=Black Enterprise|access-date=2018-03-24|language=en-US}}</ref> Buolamwini graduated as a Stamps President's Scholar<ref>{{Cite web|url=https://stampsps.gatech.edu/|title=Stamps President's Scholars Program|website=stampsps.gatech.edu}}</ref> from Georgia Tech in 2012,<ref name=":4">{{Cite news|url=https://news.mit.edu/2017/joy-buolamwini-wins-hidden-figures-contest-for-fighting-machine-learning-bias-0117|title=Joy Buolamwini wins national contest for her work fighting bias in machine learning|work=MIT News|access-date=2018-03-24}}</ref> and was the youngest finalist of the Georgia Tech InVenture Prize in 2009.<ref name=":2">{{Cite web|title=Admissions Conquered {{!}} InVenture Prize|url=https://inventureprize.gatech.edu/team/admissions-conquered|access-date=2021-09-25|website=inventureprize.gatech.edu}}</ref>


Buolamwini is a [[Rhodes Scholarship|Rhodes Scholar]], a [[Fulbright fellow]], a [[Stamps Scholarship|Stamps Scholar]], an Astronaut Scholar, and an [[Anita Borg Institute]] scholar.<ref>{{Cite web|title=Scholar Spotlight: Joy Buolamwini {{!}} Astronaut Scholarship Foundation|url=https://www.astronautscholarship.org/scholar-spotlight-joy-buolamwini/|access-date=2021-09-25|language=en-US}}</ref> As a Rhodes Scholar, she studied learning and technology at the [[University of Oxford]], where she was based at [[Jesus College, Oxford]].<ref name=msc>{{cite thesis|degree=MSc|year=2014|title=Increasing participation in graduate level computer science education : a case study of the Georgia Institute of Technology's master of computer science|publisher=University of Oxford|oclc=908967245|url=https://solo.bodleian.ox.ac.uk/permalink/f/89vilt/oxfaleph020304866|website=ox.ac.uk|first=Joy Adowaa|last=Buolamwini}}{{Dead link|date=December 2024 |bot=InternetArchiveBot |fix-attempted=yes }}</ref><ref name=":5">{{Cite web|url=http://rhodesproject.com/joy-buolamwini-profile/|title=Joy Buolamwini Profile|website=rhodesproject.com|publisher=The Rhodes Project|language=en-US|access-date=2018-03-24}}</ref> During her scholarship, she took part in the first formal Service Year, working on community-focused projects.<ref name=":5" /><ref>{{Cite web|url=https://www.eship.ox.ac.uk/oxford-launchpad-confessions-entrepreneur-joy-buolamwini|title=Oxford Launchpad: Confessions of an Entrepreneur: Joy Buolamwini {{!}} Enterprising Oxford|website=eship.ox.ac.uk|language=en|access-date=2018-03-24|archive-date=March 25, 2018|archive-url=https://web.archive.org/web/20180325051223/https://www.eship.ox.ac.uk/oxford-launchpad-confessions-entrepreneur-joy-buolamwini|url-status=dead}}</ref> She was awarded a master's degree in Media Arts & Sciences from MIT in 2017 for research supervised by [[Ethan Zuckerman]].<ref name=phd>{{cite thesis|degree=MS|year=2017|publisher=MIT|title=Gender shades : intersectional phenotypic and demographic evaluation of face datasets and gender classifiers|first=Joy Adowaa|last=Buolamwini|hdl=1721.1/114068|oclc=1026503582}} {{free access}}</ref> She was awarded a PhD in Media Arts & Sciences from the MIT Media Lab in 2022 with a thesis titled ''Facing the Coded Gaze with Evocative Audits and Algorithmic Audits''.<ref>{{cite thesis|degree=PhD|year=2022|title=Facing the Coded Gaze with Evocative Audits and Algorithmic Audits|first=Joy|last=Buolamwini|url=https://hdl.handle.net/1721.1/143396|hdl=1721.1/143396|website=mit.edu}}</ref>


== Career and research ==
In 2011, Buolamwini worked with the [[trachoma]] program at the [[Carter Center]] to develop an [[Android (operating system)|Android]]-based assessment system for use in Ethiopia.<ref>{{Cite web|url=https://astronautscholarship.org/scholar-spotlight-joy-buolamwini/|title=Scholar Spotlight: Joy Buolamwini {{!}} Astronaut Scholarship Foundation|website=astronautscholarship.org|language=en-US|access-date=2018-03-24}}</ref><ref name=":1" />
[[File:The Dangers of Supremely White Data and The Coded Gaze - YouTube.webm|thumb|Interface from Joy Buolamwini’s Gender Shades project evaluating biases in facial recognition systems]]
[[File:Joy Buolamwini - Wikimania 2018 02.jpg|thumb|right|Joy Buolamwini at Wikimania 2018 in Cape Town]] As a Fulbright fellow, in 2013 she worked with local computer scientists in Zambia to help Zambian youth become technology creators.<ref>{{Citation|last=ZamrizeMedia|title=Joy Buolamwini {{!}} Fulbright Fellow 2013 {{!}} Zambia|date=2013-04-28|url=https://www.youtube.com/watch?v=usWTUBFEJkk|accessdate=2018-03-24}}</ref> On September 14, 2016, Buolamwini appeared at the [[White House]] summit on Computer Science for All.<ref>{{Cite web |last=Buolamwini |first=Joy |date=2017-03-27 |title=#CSForAll Tribute to Seymour Papert |url=https://medium.com/mit-media-lab/csforall-tribute-to-seymour-papert-f30b2b43ea16 |access-date=2024-12-09 |website=MIT MEDIA LAB |language=en}}</ref>


Buolamwini was a researcher at the [[MIT Media Lab]], where she worked to identify bias in algorithms and to develop practices for accountability during their design;<ref>{{Cite web|url=https://www.media.mit.edu/projects/algorithmic-justice-league/overview/|title=Project Overview ‹ Algorithmic Justice League – MIT Media Lab|website=MIT Media Lab|access-date=2018-03-24}}</ref> at the lab, Buolamwini was a member of [[Ethan Zuckerman]]'s [[Center for Civic Media]] group.<ref>{{Cite web|url=http://mitadmissions.org/blogs/entry/interview-joy-buolamwini|title=interview: joy buolamwini {{!}} MIT Admissions|website=mitadmissions.org|date=March 5, 2017 |language=en|access-date=2018-03-24}}</ref><ref>{{Cite web|url=https://www.media.mit.edu/groups/civic-media/people/|title=Group People ‹ Civic Media – MIT Media Lab|website=MIT Media Lab|access-date=2018-03-24}}</ref> During her research, Buolamwini showed 1,000 faces to [[facial recognition system]]s, asked the systems to identify whether the faces were female or male, and found that the software struggled to identify dark-skinned women.<ref>{{Cite magazine|url=https://www.wired.com/story/photo-algorithms-id-white-men-fineblack-women-not-so-much/|title=Photo Algorithms ID White Men Fine—Black Women, Not So Much|website=wired.com|publisher=Wired magazine|access-date=2018-03-24|language=en-US}}</ref> Her project, ''Gender Shades'', became part of her MIT thesis.<ref name=phd/><ref>{{Cite news|url=https://www.bbc.co.uk/news/technology-39533308|title=Is artificial intelligence racist?|last=Kleinman|first=Zoe|date=2017-04-14|website=bbc.co.uk|publisher=[[BBC News]]|access-date=2018-03-24|language=en-GB}}</ref> Her 2018 paper ''Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification''<ref>{{Cite journal|last=Buolamwini|first=Joy|date=2018|title=Gender shades: Intersectional accuracy disparities in commercial gender classification|url=http://proceedings.mlr.press/v81/buolamwini18a.html?mod=article_inline|journal=Conference on Fairness, Accountability and Transparency|volume=81|pages=77–91|via=mlr.press}}</ref> prompted responses from [[IBM]] and [[Microsoft]], which took corrective action and swiftly improved the accuracy of their algorithms.<ref>{{Cite web|url=https://www.ibm.com/blogs/research/2018/02/mitigating-bias-ai-models/|title=Mitigating Bias in Artificial Intelligence (AI) Models -- IBM Research|date=2016-05-16|website=ibm.com|language=en-US|access-date=2018-03-24}}</ref><ref>{{Cite web|url=http://proceedings.mlr.press/v81/buolamwini18a/buolamwini18a.pdf|title=Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification|date=2018|website=Proceedings of Machine Learning Research|access-date=2018-03-24}}</ref> She also created the Aspire Mirror, a device that lets users see a reflection of themselves based on what inspires them.<ref>{{Cite web|url=http://www.aspiremirror.com/#idea|title=Aspire Mirror|website=Aspire Mirror|language=en|access-date=2018-03-24}}</ref> Her program, the Algorithmic Justice League, aims to highlight the bias in code that can lead to discrimination against underrepresented groups.<ref>{{Cite web|url=https://www.huffingtonpost.com/entry/a-search-for-hidden-figures-finds-joy_us_58b5f466e4b0e5fdf61977ef|title=A Search For 'Hidden Figures' Finds Joy|last=International|first=Youth Radio-- Youth Media|date=2017-02-28|website=huffingtonpost.com|publisher=[[Huffington Post]]|language=en-US|access-date=2018-03-24}}</ref> She has created two films, ''Code4Rights'' and ''Algorithmic Justice League: Unmasking Bias''.<ref>{{Cite web|url=https://filmmakerscollab.org/films/code4rights/|title=Filmmakers Collaborative {{!}} Code4Rights|website=filmmakerscollab.org|access-date=2018-03-24}}</ref><ref>{{Cite web|url=https://filmmakerscollab.org/films/algorithmic-justice-league-unmasking-bias/|title=Filmmakers Collaborative {{!}} Algorithmic Justice League: Unmasking Bias|website=filmmakerscollab.org|access-date=2018-03-24}}</ref> Code4Rights, an advocacy organization Buolamwini founded in 2012 and still directs, uses technology to spread awareness of human rights.<ref>{{Cite web |title=Joy Buolamwini - TECHHER |url=https://techherng.com/joy-buolamwini/ |access-date=2024-12-09 |language=en-US}}</ref> She served as [[Chief Technology Officer]] (CTO) for Techturized Inc., a hair-care technology company.<ref name=":3" />


Buolamwini's research was cited in 2020 as an influence for [[Google]] and [[Microsoft]] in addressing gender and race bias in their products and processes.<ref>{{Cite web|url=https://www.biometricupdate.com/202003/tech-giants-pressured-to-follow-google-in-removing-gender-labels-from-computer-vision-services|title=Tech giants pressured to follow Google in removing gender labels from computer vision services|last=Burt|first=Chris|date=2020-03-02|website=biometricupdate.com|publisher=Biometric Update|language=en-US|access-date=2020-03-09}}</ref>


In 2023, she published her first book, ''Unmasking AI: My Mission to Protect What Is Human in a World of Machines'', which chronicles her research.<ref>{{cite web |date=2023-11-03 |title=Column: She set out to build robots. She ended up exposing big tech |url=https://www.latimes.com/entertainment-arts/books/story/2023-11-02/joy-buolamwini-unmasking-ai |access-date=2024-01-22 |website=Los Angeles Times |language=en-US}}</ref>

== AI bias and gender equity ==
Buolamwini's research on AI bias has been pivotal in advancing [[Women in engineering|gender equity]] within engineering and technology. She found that AI-powered facial-recognition systems showed higher error rates when identifying darker-skinned women, with rates reaching 34.7%, compared with 0.8% for lighter-skinned men. These disparities pointed to biases in algorithmic design, in which skewed training data and incomplete evaluation processes led to unequal technological outcomes based on both gender and skin tone.<ref>{{Cite web |date=2018-02-12 |title=Study finds gender and skin-type bias in commercial artificial-intelligence systems |url=https://news.mit.edu/2018/study-finds-gender-skin-type-bias-artificial-intelligence-systems-0212 |access-date=2024-12-09 |website=MIT News {{!}} Massachusetts Institute of Technology |language=en}}</ref>

Buolamwini’s personal experience with AI performance limitations motivated her research into [[algorithmic bias]]. While working on a facial-recognition-based art project at the MIT Media Lab, she discovered that commercial AI systems could not consistently detect her face due to her darker skin. This frustration inspired her landmark research project ''Gender Shades'', which rigorously evaluated facial analysis systems from IBM, Microsoft, and Face++. Her study revealed that these systems were most accurate for lighter-skinned men, with error rates as low as 1%, while their accuracy plummeted for darker-skinned women, with misclassification rates as high as 47%.<ref name=":6">{{Cite web |title=Joy Buolamwini: examining racial and gender bias in facial analysis software |url=https://artsandculture.google.com/story/joy-buolamwini-examining-racial-and-gender-bias-in-facial-analysis-software-barbican-centre/BQWBaNKAVWQPJg?hl=en |access-date=2024-12-09 |website=Google Arts & Culture |language=en}}</ref>

Realizing that these failures stemmed from data imbalances, Buolamwini introduced the ''Pilot Parliaments Benchmark'', a [[Diversity, equity, and inclusion|diverse dataset]] designed to address the lack of representation in typical AI training sets, which were composed of over 75% male and 80% lighter-skinned faces. This new benchmark set a critical precedent for evaluating and improving AI performance by ensuring more equitable testing standards.<ref name=":6" />

Her findings contributed to significant changes in the tech industry. Following the publication of her research, companies such as IBM and Microsoft took steps to improve their algorithms, reducing bias and enhancing accuracy. However, Buolamwini has noted that improved technical accuracy alone does not eliminate risks of potential misuse in areas such as racial profiling, surveillance, and hiring decisions.<ref name=":6" />

To address these concerns, Buolamwini co-founded the Safe Face Pledge, encouraging tech companies to adopt ethical AI practices. The pledge prohibits weaponizing facial recognition, bans lawless police use, and demands transparency in government surveillance applications. Her advocacy emphasizes that achieving fairness in AI development requires a multi-faceted approach, including regulatory frameworks and collaborative efforts.<ref name=":6" />


=== Activism ===
[[File:Algorithmic Justice League logo.png|thumb|Logo of the Algorithmic Justice League]]
Buolamwini founded the [[Algorithmic Justice League]] (AJL) in 2016 to promote equitable and accountable [[artificial intelligence]] (AI).<ref>{{Cite web|title=Mission, Team and Story - The Algorithmic Justice League|url=https://www.ajl.org/about|access-date=2021-05-09|website=ajl.org}}</ref> AJL integrates art and research to examine the societal implications of AI and to reduce AI-related harms. The organization works to raise public awareness of AI's impact while advancing research on bias mitigation, and addresses issues at the intersection of equity and technology, promoting more inclusive and accessible engineering systems. AJL has also encouraged public engagement through interactive campaigns, exhibitions, and educational initiatives, ensuring a broad audience is informed about the impact of biased algorithms on [[Gender equality|gender equity]].


To broaden its outreach, AJL has partnered with organizations such as [[Black Girls Code]] to encourage African-American girls to pursue [[Science, technology, engineering, and mathematics|STEM]] careers, thereby fostering more diversity in the tech industry. AJL conducts workshops and provides resources aimed at educating the public and the tech community about AI biases, with a focus on empowering underrepresented genders to engage with and challenge these systems.

Key members of the Algorithmic Justice League include Rachel Fagen, the Chief of Staff, who focuses on organizational development and building connections to promote equitable and accountable AI, and Aurum Linh, the AI Harms Analyst, dedicated to identifying and mitigating the adverse effects of artificial intelligence. The Algorithmic Justice League works with various groups, including CORE funders, advisory committees, and research collaborators, to enhance transparency and accountability in AI systems.<ref>{{Cite web |title=Mission, Team and Story - The Algorithmic Justice League |url=https://www.ajl.org/about |access-date=2024-12-09 |website=www.ajl.org |language=en}}</ref>

The AJL website provides information and a live blog.<ref name="Spotlight - Coded Bias Documentary">{{Cite web|title=Spotlight - Coded Bias Documentary|url=https://www.ajl.org/spotlight-documentary-coded-bias|access-date=2021-05-09|website=ajl.org}}</ref> Several sections of the site let users share stories, donate, or write to US Congressional representatives. Buolamwini has influenced policy discussions to address [[gender discrimination]] in AI applications, advocating for regulations that ensure fairness in AI-powered decision-making systems. In 2019, she testified before the [[United States House Committee on Oversight and Reform]] about the risks of facial recognition technology.<ref>{{Cite web|last=Quach|first=Katyanna|title=We listened to more than 3 hours of US Congress testimony on facial recognition so you didn't have to go through it|year=2019|url=https://www.theregister.com/2019/05/22/congress_facial_recognition/|access-date=2022-01-13|website=theregister.com|language=en}}</ref> Her testimony emphasized the need for accountability in the deployment of facial recognition technologies, particularly in areas where these systems could exacerbate gender inequities.

Buolamwini believed President Biden's 2023 executive order on artificial intelligence fell short in terms of redress, or consequences, for AI systems that hurt minority communities.<ref>{{Cite web |last1=Field |first1=Hayden |last2=Feiner |first2=Lauren |date=2023-11-02 |title=Biden's AI order didn't go far enough to address fairness, but it's a good first step, advocates say |url=https://www.cnbc.com/2023/11/02/biden-ai-executive-order-industry-civil-rights-labor-groups-react.html |access-date=2024-12-09 |website=CNBC |language=en}}</ref> Her efforts supported the [[Diversity, equity, and inclusion|inclusion]] of measures to address discrimination in AI applications, particularly in areas like hiring, housing, and criminal justice. She described the executive order as part of a "long and ongoing process", one needed because the industry is not incentivized to address these harms on its own.<ref>{{Cite web |title=Biden's executive order aims to limit the harms of AI |url=https://www.marketplace.org/shows/marketplace-tech/bidens-executive-order-aims-to-limit-the-harms-of-ai/ |access-date=2024-12-09 |website=Marketplace |language=en-US}}</ref>

Joy Buolamwini, through the Algorithmic Justice League (AJL), has been instrumental in advocating for the inclusion and support of [[Women in engineering in the United States|women]], transgender, and non-binary individuals in the technology sector. Her initiatives focus on exposing and mitigating biases in artificial intelligence (AI) that disproportionately affect these underrepresented groups.

Buolamwini has led campaigns targeting gender equity in AI and technology. In 2021, she collaborated with [[Olay]] on the ''Decode the Bias'' campaign, which examined biases in beauty algorithms affecting women of color. This initiative evaluated Olay's Skin Advisor System to ensure equitable treatment across all skin tones.<ref>{{Citation |title=Algorithmic Justice League |date=2024-10-10 |work=Wikipedia |url=https://en.wikipedia.org/wiki/Algorithmic_Justice_League |access-date=2024-12-09 |language=en}}</ref>

Building on these initiatives, AJL launched the Community Reporting of Algorithmic System Harms (CRASH), which unites key stakeholders to develop tools that enable broader participation in creating accountable and equitable AI systems, directly addressing issues that affect underrepresented genders.<ref>{{Cite web |title=Algorithmic Vulnerability Bounty Project (AVBP) |url=https://www.ajl.org/crash-project |access-date=2024-12-09 |website=www.ajl.org |language=en}}</ref>


=== Voicing Erasure ===
The ''Voicing Erasure'' section on the AJL website hosts spoken pieces by Buolamwini, [[Allison Koenecke]], [[Safiya Noble]], [[Ruha Benjamin]], [[Kimberlé Crenshaw]], [[Megan Smith]], and [[Sasha Costanza-Chock]] about bias in voice systems.<ref>{{Citation|title=Voicing Erasure - A Spoken Word Piece Exploring Bias in Voice Recognition Technology| date=March 31, 2020 |url=https://www.youtube.com/watch?v=SdCPbyDJtK0|language=en|access-date=2021-05-09}}</ref><ref>{{Cite web|title=Voicing Erasure|url=https://www.ajl.org/voicing-erasure|access-date=2021-05-09|website=ajl.org}}</ref> Buolamwini and Koenecke are the lead researchers on the project, working to uncover biases in voice systems. They have written that [[speech recognition]] systems have the most trouble with [[African-American Vernacular English]] speakers, and that these systems are secretly listening to users' conversations.<ref>{{Cite web |title=MIT Center for Civic Media – Creating Technology for Social Change |url=https://civic.mit.edu/index.html?p=2402.html |access-date=2024-12-09 |language=en-US}}</ref> They have also written about what they regard as harmful [[gender stereotype]]s perpetuated by the voice recognition systems in [[Siri]], [[Amazon Alexa]], and [[Cortana (virtual assistant)|Microsoft Cortana]].<ref>{{Cite web |title=How AI bots and voice assistants reinforce gender bias |url=https://www.brookings.edu/articles/how-ai-bots-and-voice-assistants-reinforce-gender-bias/ |access-date=2024-12-09 |website=Brookings |language=en-US}}</ref>

While her [[methodology]] and results have faced criticism from companies such as Amazon, she explained in her TED talk how she addresses the "coded gaze", highlighting the field's neglect of the intersection of "social impact, technology, and inclusion".<ref>{{Cite AV media |url=https://www.ted.com/talks/joy_buolamwini_how_i_m_fighting_bias_in_algorithms?subtitle=en |title=How I'm fighting bias in algorithms |date=2017-03-09 |last=Buolamwini |first=Joy |language=en |access-date=2024-12-09 |via=www.ted.com}}</ref>

Her ''Voicing Erasure'' project highlights gender equity by exposing biases in voice recognition systems, particularly those that often fail to accurately process speech from women and [[Non-binary gender|non-binary]] individuals. This project advocates for more inclusive AI development by raising awareness of these limitations.<ref>{{Citation |title=Algorithmic Justice League |date=2024-10-10 |work=Wikipedia |url=https://en.wikipedia.org/wiki/Algorithmic_Justice_League |access-date=2024-12-09 |language=en}}</ref>


=== The Coded Gaze ===
''Coded Bias'' is a [[documentary film]] directed by [[Shalini Kantayya]] that features Buolamwini's research about AI inaccuracies in facial recognition technology and automated assessment software.<ref>{{Cite web|title=Coded Bias {{!}} Films {{!}} PBS|url=https://www.pbs.org/independentlens/films/coded-bias/|access-date=2021-05-09|website=Independent Lens|language=en-US}}</ref><ref name="Spotlight - Coded Bias Documentary"/> It focuses on what the film's creators regard as a lack of regulation of facial recognition tools sold by [[IBM]], [[Microsoft]], and [[Amazon (company)|Amazon]], and which they say perpetuates racial and gender bias. The film describes a dispute between [[Brooklyn]] tenants and a building management company that tried to use facial recognition to control entry to a building. The film featured ''[[Weapons of Math Destruction]]'' author [[Cathy O'Neil]] and members of [[Big Brother Watch]] in London, including [[Silkie Carlo]]. On April 5, 2021, the documentary was made available to stream on [[Netflix]].<ref>{{Cite web|url=https://www.cnet.com/culture/entertainment/coded-bias-review-eye-opening-netflix-documentary-faces-up-to-racist-tech/|title=Eye-opening documentary Coded Bias, streaming on Netflix April 5, faces racist technology|first=Richard|last=Trenholm|website=[[CNET]]|date= March 31, 2021}}</ref>


== Exhibitions ==
Projects conducted by Algorithmic Justice League have been exhibited at art institutions including the [[Barbican Centre]] in London, UK, and [[Ars Electronica]] in Linz, Austria.<ref>{{Cite web |title=Art and Film - The Algorithmic Justice League |url=https://www.ajl.org/library/art-film |access-date=2022-03-25 |website=ajl.org}}</ref>


* ''Big Bang Data'' (2018) Exhibition at [[MIT Museum]], Cambridge, MA, US<ref>{{cite web|url=https://mitmuseum.mit.edu/bigbangdata|title=Big Bang Data|website=MIT Museum}}</ref>


== Awards and honors ==
In 2017, Buolamwini was awarded the grand prize in the professional category in the ''Search for Hidden Figures'' contest, tied to the release of the film ''[[Hidden Figures]]'' in December 2016.<ref>{{Cite web|url=https://youthradio.org/journalism/science/hidden-no-more-stem-spotlight-shines-on-hidden-figures-like-mits-joy-buolamwini/|title=Hidden No More: STEM Spotlight Shines On 'Hidden Figures' Like MIT's Joy Buolamwini|date=2017-02-27|website=youthradio.org|publisher=Youth Radio|access-date=2018-03-24|language=en-US|archive-date=July 3, 2018|archive-url=https://web.archive.org/web/20180703051026/https://youthradio.org/journalism/science/hidden-no-more-stem-spotlight-shines-on-hidden-figures-like-mits-joy-buolamwini/|url-status=dead}}</ref> The contest, sponsored by [[PepsiCo]] and [[21st Century Fox]], was intended to "help uncover the next generation of female leaders in science, technology, engineering and math,"<ref>{{Cite news|url=https://www.fastcompany.com/3067297/hidden-figures-inspires-a-scholarship-contest-for-minority-stem-aspirants|title="Hidden Figures" Inspires A Scholarship Contest For Minority STEM Aspirants|date=2017-01-19|website=fastcompany.com|publisher=Fast Company|access-date=2018-03-24|language=en-US}}</ref> and attracted 7,300 submissions from young women across the United States.<ref name=":4" />


Buolamwini delivered a [[TEDx]] talk at [[Beacon Street]] entitled ''How I'm fighting bias in algorithms''.<ref>{{Cite web|url=https://scholar.harvard.edu/nlanter/events/2017/03/fighting-bias-algorithms|title=Speaker Joy Buolamwini: How I'm Fighting Bias in Algorithms|website=scholar.harvard.edu|language=en|access-date=2018-03-24}}</ref><ref>{{Cite web|url=https://www.media.mit.edu/posts/how-i-m-fighting-bias-in-algorithms/|title=How I'm fighting bias in algorithms – MIT Media Lab|last=Buolamwini|first=Joy|website=MIT Media Lab|access-date=2018-03-24}}</ref><ref>{{Citation|last=TED|title=How I'm fighting bias in algorithms {{!}} Joy Buolamwini|date=2017-03-29|url=https://www.youtube.com/watch?v=UG_X_7g63rY|accessdate=2018-03-24}}</ref> In 2018, she appeared on the [[TED Radio Hour]].<ref>{{Citation|title=Joy Buolamwini: How Does Facial Recognition Software See Skin Color?|url=https://www.wnyc.org/story/joy-buolamwini-how-does-facial-recognition-software-see-skin-color/|language=en|accessdate=2018-03-24}}</ref> She was featured on [[Amy Poehler's Smart Girls]] in 2018.<ref name=":0" /> ''[[Fast Company (magazine)|Fast Company]]'' magazine listed her as one of four "design heroes who are defending democracy online."<ref>{{Citation|first=Katharine|last=Schwab|title=Meet 4 design heroes who are defending democracy online|date=July 3, 2018|access-date=2018-07-21|url=https://www.fastcompany.com/90180572/meet-4-design-heroes-who-are-defending-democracy-online|website=fastcompany.com|publisher=Fast Company Magazine}}</ref> She was listed as one of [[100 Women (BBC)|BBC's 100 Women]] in 2018.<ref>{{Cite news|url=https://www.bbc.com/news/world-46225037|title=BBC 100 Women 2018: Who is on the list?|date=2018-11-19|work=BBC News|access-date=2018-11-21|language=en-GB}}</ref>


In 2019, Buolamwini was listed in [[Fortune (magazine)|''Fortune'']] magazine's 2019 list of the "World's 50 Greatest Leaders," where the magazine described her as "the conscience of the A.I. revolution."<ref>{{Cite web|url=https://fortune.com/worlds-greatest-leaders/2019/joy-buolamwini|title=Joy Buolamwini|website=Fortune|language=en|access-date=2019-11-26}}</ref> She also made the inaugural Time 100 Next list in 2019.<ref>{{Cite magazine|url=https://time.com/collection/time-100-next-2019/5718893/joy-buolamwini/|title=TIME 100 Next 2019: Joy Buolamwini|magazine=[[Time (magazine)|Time]]|publisher=[[Time magazine]]|language=en-us|access-date=2019-12-16}}</ref> In 2020, Buolamwini featured in a women's empowerment campaign by the clothing company [[Levi's]] for [[International Women's Day]].<ref>{{Cite web|url=https://www.levi.com/US/en_US/blog/article/shes-rewriting-the-code/|title=She's Rewriting the Code|website=Off The Cuff|language=en-US|access-date=2020-03-09}}</ref> She was also featured in the documentary ''Coded Bias''.<ref>{{Cite web|url=https://www.npr.org/sections/codeswitch/2020/02/08/770174171/when-bias-is-coded-into-our-technology|title=New Documentary 'Coded Bias' Explores How Tech Can Be Racist And Sexist : Code Switch|website=npr.org|publisher=[[NPR]]|date=February 8, 2020|language=en-us|access-date=2020-12-12|last1=Lee|first1=Jennifer 8.}}</ref> In 2020, she was named an honoree of the [[Great Immigrants Award]] by the [[Carnegie Corporation of New York]].<ref>{{Cite web |title=Joy Buolamwini |url=https://www.carnegie.org/awards/honoree/joy-buolamwini/ |access-date=June 26, 2024 |website=Carnegie Corporation of New York}}</ref>


In 2022, Buolamwini was named the [[American Society for Quality|ASQ]] Hutchens Medalist.<ref>{{Cite web |date=January 4, 2024 |title=Hutchens Medalists |url=https://asq.org/about-asq/asq-awards/honors/hutchens }}</ref> In 2023, she was listed in the Time 100 AI.<ref>{{Cite magazine |title=The 100 Most Influential People in AI 2023 |url=https://time.com/collection/time100-ai/ |access-date=2024-02-18 |magazine=Time |language=en}}</ref>


On June 9, 2024, Buolamwini was awarded an honorary Doctor of Science degree from [[Dartmouth College]] for her work in exposing biases in AI systems and preventing AI harms. She was also invited as the keynote speaker for Dartmouth's 2024 Social Justice Awards.<ref>{{Cite web |title=Announcing the 2024 Honorary Degree Recipients |url=https://home.dartmouth.edu/news/2024/04/announcing-2024-honorary-degree-recipients |access-date=June 10, 2024 |website=Dartmouth.edu |date=April 11, 2024 }}</ref>


==Personal life==
[[Category:MIT Media Lab people]]
[[Category:Scientists from Edmonton]]
[[Category:American people of Ghanaian descent]]
[[Category:American women academics]]
[[Category:21st-century African-American scientists]]
[[Category:Data activism]]
[[Category:1990 births]]

Latest revision as of 16:25, 5 January 2025

Joy Buolamwini
Buolamwini at Wikimania 2018
Born: Joy Adowaa Buolamwini, 23 January 1990 (age 34)
Education: Cordova High School
Alma mater: Georgia Institute of Technology (BS); Jesus College, Oxford (MS); Massachusetts Institute of Technology (MS, PhD)
Known for: Algorithmic Justice League
Fields: Media Arts & Sciences; Computer science; Algorithmic bias
Institutions: MIT Media Lab
Doctoral advisor: Ethan Zuckerman[1]
Website: www.poetofcode.com
Joy Adowaa Buolamwini is a Canadian-American computer scientist and digital activist formerly based at the MIT Media Lab.[2] She founded the Algorithmic Justice League (AJL), an organization that works to challenge bias in decision-making software, using art, advocacy, and research to highlight the social implications and harms of artificial intelligence (AI).[3]

Early life and education


Buolamwini was born in Edmonton, Alberta, grew up in Mississippi, and attended Cordova High School in Cordova, Tennessee.[4] At age nine, she was inspired by Kismet, the MIT robot, and taught herself XHTML, JavaScript and PHP.[5][6] As a student-athlete, she was a competitive pole vaulter[7] and played basketball. In an episode of Brené Brown's podcast "Dare to Lead", she recalled completing her AP Physics homework during breaks at basketball games.[8]

As an undergraduate, Buolamwini studied computer science at the Georgia Institute of Technology, where she researched health informatics.[9] Buolamwini graduated as a Stamps President's Scholar[10] from Georgia Tech in 2012,[11] and was the youngest finalist of the Georgia Tech InVenture Prize in 2009.[12]

Buolamwini is a Rhodes Scholar, a Fulbright fellow, a Stamps Scholar, an Astronaut Scholar, and an Anita Borg Institute scholar.[13] As a Rhodes Scholar, she studied learning and technology at the University of Oxford, where she was a student based at Jesus College, Oxford.[14][15] During her scholarship she took part in the first formal Service Year, working on community focused projects.[15][16] She was awarded a Master's Degree in Media Arts & Sciences from MIT in 2017 for research supervised by Ethan Zuckerman.[1] She was awarded a PhD degree in Media Arts & Sciences from the MIT Media Lab in 2022 with a thesis on Facing the Coded Gaze with Evocative Audits and Algorithmic Audits.[17]

Career and research


In 2011, Buolamwini worked with the trachoma program at the Carter Center to develop an Android-based assessment system for use in Ethiopia.[18][5]

Interface from Joy Buolamwini’s Gender Shades project evaluating biases in facial recognition systems
Joy Buolamwini at Wikimania 2018 in Cape Town

As a Fulbright fellow, in 2013 she worked with local computer scientists in Zambia to help Zambian youth become technology creators.[19] On September 14, 2016, Buolamwini appeared at the White House summit on Computer Science for All.[20]

Buolamwini was a researcher at the MIT Media Lab, where she worked to identify bias in algorithms and to develop practices for accountability during their design;[21] at the lab, Buolamwini was a member of Ethan Zuckerman's Center for Civic Media group.[22][23] During her research, Buolamwini showed 1,000 faces to facial recognition systems and asked the systems to identify whether faces were female or male, and found that the software had difficulty identifying dark-skinned women.[24] Her project, Gender Shades, became part of her MIT thesis.[1][25] Her 2018 paper Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification[26] prompted IBM and Microsoft to take corrective action; both companies swiftly improved their software, demonstrating her influence on the industry.[27][28] She also created the Aspire Mirror, a device that lets users see a reflection of themselves based on what inspires them.[29] Her program, Algorithmic Justice League, aims to highlight the bias in code that can lead to discrimination against underrepresented groups.[30] She has created two films, Code4Rights and Algorithmic Justice League: Unmasking Bias.[31][32] Code4Rights, which she still directs, is an advocacy organization started in 2012 that uses technology to spread awareness of human rights.[33] She served as Chief Technology Officer (CTO) for Techturized Inc., a hair-care technology company.[9]

Buolamwini's research was cited in 2020 as an influence for Google and Microsoft in addressing gender and race bias in their products and processes.[34]

She also served as an advisor to President Biden ahead of his administration's Executive Order 14110, released October 30, 2023. The order is also known as the Executive Order on Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence (sometimes referred to as "Executive Order on Artificial Intelligence").[35][36]

In 2023, she published her first book, Unmasking AI: My Mission to Protect What Is Human in a World of Machines, which chronicles her research.[37]

AI Bias and Gender Equity


Buolamwini's research on AI bias has been pivotal in advancing gender equity within engineering and technology. She found that AI-powered facial-recognition systems showed higher error rates when identifying darker-skinned women, with rates reaching 34.7%, compared to 0.8% for lighter-skinned men. These disparities indicated potential biases in algorithmic design, where biased training data and incomplete evaluation processes led to unequal technological outcomes based on both gender and skin tone.[38]

Buolamwini’s personal experience with AI performance limitations motivated her research into algorithmic bias. While working on a facial-recognition-based art project at the MIT Media Lab, she discovered that commercial AI systems could not consistently detect her face due to her darker skin. This frustration inspired her landmark research project Gender Shades, which rigorously evaluated facial analysis systems from IBM, Microsoft, and Face++. Her study revealed that these systems were most accurate for lighter-skinned men, with error rates as low as 1%, while their accuracy plummeted for darker-skinned women, with misclassification rates as high as 47%.[39]

Realizing that these failures stemmed from data imbalances, Buolamwini introduced the Pilot Parliaments Benchmark, a diverse dataset designed to address the lack of representation in typical AI training sets, which were composed of over 75% male and 80% lighter-skinned faces. This new benchmark set a critical precedent for evaluating and improving AI performance by ensuring more equitable testing standards.[39]

Her findings contributed to significant changes in the tech industry. Following the publication of her research, companies such as IBM and Microsoft took steps to improve their algorithms, reducing bias and enhancing accuracy. However, Buolamwini has noted that improved technical accuracy alone does not eliminate risks of potential misuse in areas such as racial profiling, surveillance, and hiring decisions.[39]

To address these concerns, Buolamwini co-founded the Safe Face Pledge, encouraging tech companies to adopt ethical AI practices. The pledge prohibits weaponizing facial recognition, bans lawless police use, and demands transparency in government surveillance applications. Her advocacy emphasizes that achieving fairness in AI development requires a multi-faceted approach, including regulatory frameworks and collaborative efforts.[39]

Activism

Logo of the Algorithmic Justice League

Buolamwini founded the Algorithmic Justice League (AJL) in 2016 to promote equitable and accountable artificial intelligence (AI).[40] The AJL organization integrates art and research to examine societal implications and reduce AI-related harms. The company works to raise public awareness of AI’s impact while advancing research on bias mitigation. It also addresses issues at the intersection of equity and technology, promoting more inclusive and accessible engineering systems. AJL has also encouraged public engagement through interactive campaigns, exhibitions, and educational initiatives, ensuring a broad audience is informed about the impact of biased algorithms on gender equity.

To broaden its outreach, AJL has partnered with organizations such as Black Girls Code to encourage African-American girls to pursue STEM careers, thereby fostering more diversity in the tech industry. AJL conducts workshops and provides resources aimed at educating the public and tech community about AI biases, with a focus on empowering underrepresented genders to engage with and challenge these systems.

The success of AJL reflects the collective efforts of its team. Key members of the Algorithmic Justice League include Rachel Fagen, the Chief of Staff, who focuses on organizational development and building connections to promote equitable and accountable AI, and Aurum Linh, the AI Harms Analyst, dedicated to identifying and mitigating the adverse effects of artificial intelligence. The Algorithmic Justice League works with various groups, including CORE funders, advisory committees, and research collaborators, to enhance transparency and accountability in AI systems, ensuring that its advocacy efforts remain impactful and inclusive.[41]

The AJL website provides information and a live blog.[42] The site includes sections where users can share stories, donate, or write to US Congressional representatives. Buolamwini has influenced policy discussions to address gender discrimination in AI applications, advocating for regulations that ensure fairness in AI-powered decision-making systems. In 2019, she testified before the United States House Committee on Oversight and Reform about the risks of facial recognition technology.[43] Her testimony emphasized the need for accountability in the deployment of facial recognition technologies, particularly in areas where these systems could exacerbate gender inequities.

She believed that President Biden's 2023 executive order on AI fell short in terms of redress, or consequences, for AI systems that harm minority communities.[44] Her efforts supported the inclusion of measures to address discrimination in AI applications, particularly in areas such as hiring, housing, and criminal justice. She described the executive order as part of a "long and ongoing process", one taking place because the industry is not incentivized to regulate itself.[45]

Joy Buolamwini, through the Algorithmic Justice League (AJL), has been instrumental in advocating for the inclusion and support of women, transgender, and non-binary individuals in the technology sector. Her initiatives focus on exposing and mitigating biases in artificial intelligence (AI) that disproportionately affect these underrepresented groups.

Buolamwini has led campaigns targeting gender equity in AI and technology. In 2021, she collaborated with Olay on the Decode the Bias campaign, which examined biases in beauty algorithms affecting women of color. This initiative evaluated Olay's Skin Advisor System to ensure equitable treatment across all skin tones.[46]

Building on these initiatives, AJL launched the Community Reporting of Algorithmic System Harms (CRASH), which unites key stakeholders to develop tools that enable broader participation in creating accountable and equitable AI systems, directly addressing issues that affect underrepresented genders.[47]

Voicing Erasure


The Voicing Erasure section of the AJL website hosts spoken-word pieces by Buolamwini, Allison Koenecke, Safiya Noble, Ruha Benjamin, Kimberlé Crenshaw, Megan Smith, and Sasha Costanza-Chock about bias in voice systems.[48][49] Buolamwini and Koenecke are the project's lead researchers, working to uncover biases in voice systems. They have written that speech recognition systems have the most trouble with African-American Vernacular English speakers, and that these systems secretly listen to users' conversations.[50] They have also written about what they regard as harmful gender stereotypes perpetuated by the voice recognition systems in Siri, Amazon Alexa, and Microsoft Cortana.[51]

While her methodology and results have faced criticism from companies such as Amazon, she explained in her TED talk how she confronts the "coded gaze", arguing that the field has neglected the intersection of "social impact, technology, and inclusion".[52]

Her Voicing Erasure project highlights gender equity by exposing biases in voice recognition systems, particularly those that often fail to accurately process speech from women and non-binary individuals. This project advocates for more inclusive AI development by raising awareness of these limitations.[53]

The Coded Gaze


The Coded Gaze is a mini-documentary that debuted at the Museum of Fine Arts, Boston, in 2016 and is currently available on YouTube. In it, Buolamwini discusses the bias she believes is embedded in artificial intelligence. The inspiration for the documentary and her research came while she was at MIT creating her art installation Aspire Mirror, which uses facial recognition to reflect the face of someone who inspires the user onto the user's own face.[54] Buolamwini anticipated having Serena Williams, another dark-skinned woman, reflected onto her face, but the technology did not recognize her face. Her investigation into why this happened led her to conclude that the exclusion of people who look like her resulted from a practice she called the "Coded Gaze".[55] The documentary explores how AI can be subject to racial and gender biases that reflect the views and cultural backgrounds of those who develop it.[56]

Coded Bias


Coded Bias is a documentary film directed by Shalini Kantayya that features Buolamwini's research about AI inaccuracies in facial recognition technology and automated assessment software.[57][42] It focuses on what the film's creators regard as a lack of regulation of facial recognition tools sold by IBM, Microsoft, and Amazon, which they say perpetuate racial and gender bias. The film describes a dispute between Brooklyn tenants and a building management company that tried to use facial recognition to control entry to a building. It also featured Weapons of Math Destruction author Cathy O'Neil and members of Big Brother Watch in London, including Silkie Carlo. On April 5, 2021, the documentary was made available to stream on Netflix.[58]

Exhibitions


Projects conducted by Algorithmic Justice League have been exhibited at art institutions including the Barbican Centre in London, UK, and Ars Electronica in Linz, Austria.[59]

Awards and honors


In 2017, Buolamwini was awarded the grand prize in the professional category in the Search for Hidden Figures contest, tied to the release of the film Hidden Figures in December 2016.[65] The contest, sponsored by PepsiCo and 21st Century Fox, was intended to "help uncover the next generation of female leaders in science, technology, engineering and math,"[66] and attracted 7,300 submissions from young women across the United States.[11]

Buolamwini delivered a TEDx talk at TEDxBeaconStreet titled How I'm fighting bias in algorithms.[67][68][69] In 2018, she appeared on the TED Radio Hour.[70] She was featured on Amy Poehler's Smart Girls in 2018.[4] Fast Company magazine listed her as one of four "design heroes who are defending democracy online."[71] She was listed as one of BBC's 100 Women in 2018.[72]

In 2019, Buolamwini was listed in Fortune magazine's list of the "World's 50 Greatest Leaders," where the magazine described her as "the conscience of the A.I. revolution."[73] She also made the inaugural Time 100 Next list in 2019.[74] In 2020, Buolamwini was featured in a women's empowerment campaign by the clothing company Levi's for International Women's Day.[75] She was also featured in the documentary Coded Bias.[76] In 2020, she was named an honoree of the Great Immigrants Award by the Carnegie Corporation of New York.[77]

In 2022, Buolamwini was named the ASQ Hutchens Medalist.[78] In 2023, she was listed in the Time 100 AI.[79]

On June 9, 2024, Buolamwini was awarded an honorary Doctor of Science degree from Dartmouth College for her work in exposing biases in AI systems and preventing AI harms. She was also invited as the keynote speaker for Dartmouth's 2024 Social Justice Awards.[80]

Personal life


Buolamwini has lived in Ghana; Barcelona, Spain; Oxford, United Kingdom; and, in the U.S., Memphis, Tennessee, and Atlanta, Georgia.[12] She describes herself as a "daughter of the science and of the arts", her father being an academic and her mother an artist, and calls herself a "Poet of Code".[8]

References

  1. ^ a b c Buolamwini, Joy Adowaa (2017). Gender shades: intersectional phenotypic and demographic evaluation of face datasets and gender classifiers (MS thesis). MIT. hdl:1721.1/114068. OCLC 1026503582.
  2. ^ "Joy Buolamwini". forbes.com. Retrieved March 19, 2022.
  3. ^ "Algorithmic Justice League - Unmasking AI harms and biases". Algorithmic Justice League - Unmasking AI harms and biases. Retrieved May 15, 2021.
  4. ^ a b "The Future of Computer Science and Tech: 12 Young Women to Watch — Part 2". amysmartgirls.com. Amy Poehler’s Smart Girls. February 19, 2018. Retrieved March 24, 2018.
  5. ^ a b "Joy Buolamwini | TryComputing.org". trycomputing.org. Archived from the original on March 25, 2018. Retrieved March 24, 2018.
  6. ^ "Meet The Digital Activist That's Taking Human Prejudice Out of Our Machines". Bloomberg.com. June 26, 2017. Retrieved March 24, 2018.
  7. ^ "CHS Pole Vaulting - Joy Buolamwini". vault.awardspace.com. Archived from the original on March 25, 2018. Retrieved March 24, 2018.
  8. ^ a b c "Spotify". Spotify.
  9. ^ a b "Tech Startup of The Week: Techturized Wins With Hair Care Company". Black Enterprise. March 15, 2013. Retrieved March 24, 2018.
  10. ^ "Stamps President's Scholars Program". stampsps.gatech.edu.
  11. ^ a b "Joy Buolamwini wins national contest for her work fighting bias in machine learning". MIT News. Retrieved March 24, 2018.
  12. ^ a b "Admissions Conquered | InVenture Prize". inventureprize.gatech.edu. Retrieved September 25, 2021.
  13. ^ "Scholar Spotlight: Joy Buolamwini | Astronaut Scholarship Foundation". Retrieved September 25, 2021.
  14. ^ Buolamwini, Joy Adowaa (2014). Increasing participation in graduate level computer science education : a case study of the Georgia Institute of Technology's master of computer science. ox.ac.uk (MSc thesis). University of Oxford. OCLC 908967245.[permanent dead link]
  15. ^ a b "Joy Buolamwini Profile". rhodesproject.com. The Rhodes Project. Retrieved March 24, 2018.
  16. ^ "Oxford Launchpad: Confessions of an Entrepreneur: Joy Buolamwini | Enterprising Oxford". eship.ox.ac.uk. Archived from the original on March 25, 2018. Retrieved March 24, 2018.
  17. ^ Buolamwini, Joy (2022). Facing the Coded Gaze with Evocative Audits and Algorithmic Audits. mit.edu (PhD thesis). hdl:1721.1/143396.
  18. ^ "Scholar Spotlight: Joy Buolamwini | Astronaut Scholarship Foundation". astronautscholarship.org. Retrieved March 24, 2018.
  19. ^ ZamrizeMedia (April 28, 2013), Joy Buolamwini | Fulbright Fellow 2013 | Zambia, retrieved March 24, 2018
  20. ^ Buolamwini, Joy (March 27, 2017). "#CSForAll Tribute to Seymour Papert". MIT MEDIA LAB. Retrieved December 9, 2024.
  21. ^ "Project Overview ‹ Algorithmic Justice League – MIT Media Lab". MIT Media Lab. Retrieved March 24, 2018.
  22. ^ "interview: joy buolamwini | MIT Admissions". mitadmissions.org. March 5, 2017. Retrieved March 24, 2018.
  23. ^ "Group People ‹ Civic Media – MIT Media Lab". MIT Media Lab. Retrieved March 24, 2018.
  24. ^ "Photo Algorithms ID White Men Fine—Black Women, Not So Much". wired.com. Wired magazine. Retrieved March 24, 2018.
  25. ^ Kleinman, Zoe (April 14, 2017). "Is artificial intelligence racist?". bbc.co.uk. BBC News. Retrieved March 24, 2018.
  26. ^ Buolamwini, Joy (2018). "Gender shades: Intersectional accuracy disparities in commercial gender classification". Conference on Fairness, Accountability and Transparency. 81: 77–91 – via mlr.press.
  27. ^ "Mitigating Bias in Artificial Intelligence (AI) Models -- IBM Research". ibm.com. May 16, 2016. Retrieved March 24, 2018.
  28. ^ "Gender Shades: Intersectional Accuracy Disparities in Commercial Gender Classification" (PDF). Proceedings of Machine Learning Research. 2018. Retrieved March 24, 2018.
  29. ^ "Aspire Mirror". Aspire Mirror. Retrieved March 24, 2018.
  30. ^ International, Youth Radio-- Youth Media (February 28, 2017). "A Search For 'Hidden Figures' Finds Joy". huffingtonpost.com. Huffington Post. Retrieved March 24, 2018.
  31. ^ "Filmmakers Collaborative | Code4Rights". filmmakerscollab.org. Retrieved March 24, 2018.
  32. ^ "Filmmakers Collaborative | Algorithmic Justice League: Unmasking Bias". filmmakerscollab.org. Retrieved March 24, 2018.
  33. ^ "Joy Buolamwini - TECHHER". Retrieved December 9, 2024.
  34. ^ Burt, Chris (March 2, 2020). "Tech giants pressured to follow Google in removing gender labels from computer vision services". biometricupdate.com. Biometric Update. Retrieved March 9, 2020.
  35. ^ Boak, Josh; Press, MATT O’BRIEN Associated (October 30, 2023). "Biden wants to move fast on AI safeguards and signs an executive order to address his concerns". Los Angeles Times. Retrieved January 22, 2024.
  36. ^ "Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence". Federal Register. October 30, 2023.
  37. ^ "Column: She set out to build robots. She ended up exposing big tech". Los Angeles Times. November 3, 2023. Retrieved January 22, 2024.
  38. ^ "Study finds gender and skin-type bias in commercial artificial-intelligence systems". MIT News | Massachusetts Institute of Technology. February 12, 2018. Retrieved December 9, 2024.
  39. ^ a b c d "Joy Buolamwini: examining racial and gender bias in facial analysis software". Google Arts & Culture. Retrieved December 9, 2024.
  40. ^ "Mission, Team and Story - The Algorithmic Justice League". ajl.org. Retrieved May 9, 2021.
  41. ^ "Mission, Team and Story - The Algorithmic Justice League". www.ajl.org. Retrieved December 9, 2024.
  42. ^ a b "Spotlight - Coded Bias Documentary". ajl.org. Retrieved May 9, 2021.
  43. ^ Quach, Katyanna (2019). "We listened to more than 3 hours of US Congress testimony on facial recognition so you didn't have to go through it". theregister.com. Retrieved January 13, 2022.
  44. ^ Field, Hayden; Feiner, Lauren (November 2, 2023). "Biden's AI order didn't go far enough to address fairness, but it's a good first step, advocates say". CNBC. Retrieved December 9, 2024.
  45. ^ "Biden's executive order aims to limit the harms of AI". Marketplace. Retrieved December 9, 2024.
  46. ^ "Algorithmic Justice League", Wikipedia, October 10, 2024, retrieved December 9, 2024
  47. ^ "Algorithmic Vulnerability Bounty Project (AVBP)". www.ajl.org. Retrieved December 9, 2024.
  48. ^ Voicing Erasure - A Spoken Word Piece Exploring Bias in Voice Recognition Technology, March 31, 2020, retrieved May 9, 2021
  49. ^ "Voicing Erasure". ajl.org. Retrieved May 9, 2021.
  50. ^ "MIT Center for Civic Media – Creating Technology for Social Change". Retrieved December 9, 2024.
  51. ^ "How AI bots and voice assistants reinforce gender bias". Brookings. Retrieved December 9, 2024.
  52. ^ Buolamwini, Joy (March 9, 2017). How I'm fighting bias in algorithms. Retrieved December 9, 2024 – via www.ted.com.
  53. ^ "Algorithmic Justice League", Wikipedia, October 10, 2024, retrieved December 9, 2024
  54. ^ "The Coded Gaze: Unpacking Biases in Algorithms That Perpetuate Inequity". rockefellerfoundation.org. Retrieved May 15, 2021.
  55. ^ "The Coded Gaze: Unpacking Biases in Algorithms That Perpetuate Inequity". The Rockefeller Foundation. Retrieved June 20, 2021.
  56. ^ "Here's AOC calling out the vicious circle of white men building biased face AI". fastcompany.com. May 22, 2019. Retrieved May 15, 2021.
  57. ^ "Coded Bias | Films | PBS". Independent Lens. Retrieved May 9, 2021.
  58. ^ Trenholm, Richard (March 31, 2021). "Eye-opening documentary Coded Bias, streaming on Netflix April 5, faces racist technology". CNET.
  59. ^ "Art and Film - The Algorithmic Justice League". ajl.org. Retrieved March 25, 2022.
  60. ^ "apexart Exhibition: The Criminal Type". apexart.org.
  61. ^ "Understanding AI". Ars Electronica Center.
  62. ^ "AI: More than Human | Barbican". www.barbican.org.uk.
  63. ^ "Nine Moments for Now". coopergallery.fas.harvard.edu.
  64. ^ "Big Bang Data". MIT Museum.
  65. ^ "Hidden No More: STEM Spotlight Shines On 'Hidden Figures' Like MIT's Joy Buolamwini". youthradio.org. Youth Radio. February 27, 2017. Archived from the original on July 3, 2018. Retrieved March 24, 2018.
  66. ^ ""Hidden Figures" Inspires A Scholarship Contest For Minority STEM Aspirants". fastcompany.com. Fast Company. January 19, 2017. Retrieved March 24, 2018.
  67. ^ "Speaker Joy Buolamwini: How I'm Fighting Bias in Algorithms". scholar.harvard.edu. Retrieved March 24, 2018.
  68. ^ Buolamwini, Joy. "How I'm fighting bias in algorithms – MIT Media Lab". MIT Media Lab. Retrieved March 24, 2018.
  69. ^ TED (March 29, 2017), How I'm fighting bias in algorithms | Joy Buolamwini, retrieved March 24, 2018
  70. ^ Joy Buolamwini: How Does Facial Recognition Software See Skin Color?, retrieved March 24, 2018
  71. ^ Schwab, Katharine (July 3, 2018), "Meet 4 design heroes who are defending democracy online", fastcompany.com, Fast Company Magazine, retrieved July 21, 2018
  72. ^ "BBC 100 Women 2018: Who is on the list?". BBC News. November 19, 2018. Retrieved November 21, 2018.
  73. ^ "Joy Buolamwini". Fortune. Retrieved November 26, 2019.
  74. ^ "TIME 100 Next 2019: Joy Buolamwini". Time. Time magazine. Retrieved December 16, 2019.
  75. ^ "She's Rewriting the Code". Off The Cuff. Retrieved March 9, 2020.
  76. ^ Lee, Jennifer 8. (February 8, 2020). "New Documentary 'Coded Bias' Explores How Tech Can Be Racist And Sexist : Code Switch". npr.org. NPR. Retrieved December 12, 2020.
  77. ^ "Joy Buolamwini". Carnegie Corporation of New York. Retrieved June 26, 2024.
  78. ^ "Hutchens Medalists". January 4, 2024.
  79. ^ "The 100 Most Influential People in AI 2023". Time. Retrieved February 18, 2024.
  80. ^ "Announcing the 2024 Honorary Degree Recipients". Dartmouth.edu. April 11, 2024. Retrieved June 10, 2024.