{"id":625828,"date":"2023-04-05T09:49:36","date_gmt":"2023-04-05T14:49:36","guid":{"rendered":"https:\/\/news.sellorbuyhomefast.com\/index.php\/2023\/04\/05\/this-student-is-taking-on-biased-exam-software\/"},"modified":"2023-04-05T09:49:36","modified_gmt":"2023-04-05T14:49:36","slug":"this-student-is-taking-on-biased-exam-software","status":"publish","type":"post","link":"https:\/\/newsycanuse.com\/index.php\/2023\/04\/05\/this-student-is-taking-on-biased-exam-software\/","title":{"rendered":"This Student Is Taking On \u2018Biased\u2019 Exam Software"},"content":{"rendered":"<div data-testid=\"ArticlePageChunks\">\n<div data-journey-hook=\"client-content\" data-testid=\"BodyWrapper\">\n<p><span>Robin Pocornie brought<\/span> a lamp with her to court. It\u2019s nothing special, just a basic Ikea floor lamp. But for the masters student, the lamp was a useful prop to help explain how she believes her university\u2019s exam supervision software discriminated against her based on the color of her skin.<\/p>\n<p>Pocornie and her lamp stood in front of the Netherlands Institute of Human Rights, a court focused on discrimination claims,\u00a0in October 2022. But the first time she encountered remote-monitoring software was two years earlier, during the pandemic, when her course at Dutch university VU Amsterdam was holding mandatory online exams. To prevent students from cheating, the university had bought software from the tech firm Proctorio, which\u00a0<a data-offer-url=\"https:\/\/proctorio.com\/faq\" href=\"https:\/\/proctorio.com\/faq\" rel=\"nofollow noopener\" target=\"_blank\">uses<\/a> face detection to verify the identity of the person taking the exam.\u00a0But when Pocornie, who is Black, tried to scan her face, the software kept saying it couldn\u2019t recognize her: stating \u201cno face found.\u201d\u00a0That\u2019s where the Ikea lamp came in.\u00a0<\/p>\n<p>For that first exam in September 2020, and the nine others that followed, the only way Pocornie could get Proctorio\u2019s software to recognize her was if she shone the lamp uncomfortably close to her face\u2014flooding her features with white light during the middle of the day. She imagined herself as a writer in a cartoon, huddled over her desk in a bright spotlight. Having a harsh light shining in her face as she tried to concentrate was uncomfortable, she says, but she persevered. \u201cI was afraid to be kicked out of the exam if I turned off the lamp.\u201d\u00a0\u00a0<\/p>\n<p>On Google, Pocornie soon found other Black students sharing similar experiences\u2014saying that they had to use lamps or flashlights or sit near windows to get exam software to recognize them. All over the world, students of color were being forced to create complicated light arrangements just to take their exams.\u00a0<\/p>\n<p>Pocornie\u2019s exams are over, but she decided to challenge the university\u2019s use of software that she says is discriminatory by taking her university to the Dutch human rights court. Exam software of this kind has already outlived the pandemic lockdowns that made it mainstream, and Pocornie says that all universities using face detection software need to reckon with the way this technology discriminates against people with darker skin tones.\u00a0<\/p>\n<p>\u201cIf education isn\u2019t working according to everybody\u2019s needs,\u201d she says, \u201cthen it\u2019s just not working.\u201d\u00a0<\/p>\n<p>VU Amsterdam and Proctorio declined to comment.\u00a0\u00a0<\/p>\n<p>There have been legal challenges to the use of anti-cheating software in the US and in the Netherlands, but so far they\u2019ve mostly focused on privacy, not race.\u00a0An August case in Cleveland, Ohio, found that the way Proctorio scans students\u2019 rooms during remote tests is\u00a0<a href=\"https:\/\/www.npr.org\/2022\/08\/25\/1119337956\/test-proctoring-room-scans-unconstitutional-cleveland-state-university\">unconstitutional<\/a>. Another Dutch case in 2020 tried\u00a0<a data-offer-url=\"https:\/\/www.dataguidance.com\/news\/netherlands-court-amsterdam-rules-uva-exam-surveillance\" href=\"https:\/\/www.dataguidance.com\/news\/netherlands-court-amsterdam-rules-uva-exam-surveillance\" rel=\"nofollow noopener\" target=\"_blank\">and failed<\/a> to prove that exam-monitoring software violated the European Union\u2019s privacy rules. Pocornie\u2019s case is different because it focuses on what she describes as bias embedded in the software. \u201cIf [these systems] do not function as well for Black people in comparison to white people, that feels to us discriminatory,\u201d says Naomi Appelman, a cofounder at the volunteer-run Racism and Technology Center who has helped Pocornie with her case.\u00a0<\/p>\n<\/div>\n<div data-journey-hook=\"client-content\" data-testid=\"BodyWrapper\">\n<p>Pocornie&#8217;s legal case is still ongoing. In December, the Dutch Institute of Human Rights\u00a0<a data-offer-url=\"https:\/\/oordelen.mensenrechten.nl\/oordeel\/2022-146\" href=\"https:\/\/oordelen.mensenrechten.nl\/oordeel\/2022-146\" rel=\"nofollow noopener\" target=\"_blank\">issued<\/a> an interim ruling saying it strongly suspected that the software used by VU Amsterdam was discriminatory and giving the university 10 weeks to file its defense. That defense has not yet been made public, but VU Amsterdam has previously argued that Pocornie\u2019s log data\u2014showing how long she took to log into her exam and how many times she had to restart the software\u2014imply her problems were due to an unstable internet connection, as opposed to issues with the face detection technology. A ruling is expected later this year.\u00a0<\/p>\n<p>Producers of anti-cheating software like Proctorio\u2019s were boosted by the pandemic, as exam halls were replaced by students\u2019 own homes. Digital monitoring was meant to help schools and universities maintain business as usual throughout lockdown\u2014without creating an opportunity for unsupervised students to cheat. But the pandemic is over and the software is still being used, even as students around the world return to in-person teaching. \u201cWe don\u2019t believe it is going away,\u201d\u00a0<a data-offer-url=\"https:\/\/www.eff.org\/deeplinks\/2022\/12\/schools-and-edtech-need-study-student-privacy-year-review-2022\" href=\"https:\/\/www.eff.org\/deeplinks\/2022\/12\/schools-and-edtech-need-study-student-privacy-year-review-2022\" rel=\"nofollow noopener\" target=\"_blank\">said<\/a> Jason Kelly, who focuses on student surveillance at the US-based Electronic Frontier Foundation, in a 2022 review of the state of student privacy in December.<\/p>\n<p>In the US, Amaya Ross says her college in Ohio still uses anti-cheating software. But every time she logs in, she feels anxious that her experience during the pandemic will repeat itself. Ross, who is Black, also says she couldn\u2019t access her test when she first encountered the software back in 2021. \u201cIt just kept saying: We can&#8217;t recogize your face,\u201d says Ross, who was 20 at the time. After receiving that message three or four times, she started playing around with nearby lamps and the window blinds. She even tried taking a test standing up, directly underneath her ceiling light.\u00a0<\/p>\n<p>Eventually she discovered that if she balanced an LED flashlight on a shelf near her desk and directed it straight at her face, she was able to take her science test\u2014even though the light was almost blinding. She compares the experience to driving at night with a car approaching from the other direction with its headlights on full-beam. \u201cYou just had to power through until it was done,\u201d she says.\u00a0<\/p>\n<p>Ross declines to name the company that made the software she still uses (Proctorio has\u00a0<a href=\"https:\/\/www.theguardian.com\/us-news\/2022\/aug\/26\/anti-cheating-technology-students-tests-proctorio\">sued<\/a> at least one of its critics). But after her mother, Janice Wyatt-Ross,\u00a0<a data-offer-url=\"https:\/\/twitter.com\/JaniceWyattRoss\/status\/1364032597484056577\" href=\"https:\/\/twitter.com\/JaniceWyattRoss\/status\/1364032597484056577\" rel=\"nofollow noopener\" target=\"_blank\">posted<\/a> about what happened on Twitter, Ross says a representative from the business reached out, advising her to stop taking tests in front of white walls. Now she takes tests with a multi-colored wall-hanging behind her, which so far seems to work. When Ross asked some of her Black or darker-skinned friends about the software, a lot of them had experienced similar problems. \u201cBut then I asked my white friends and they\u2019re like, \u2018I\u2019m taking tests in the dark,\u2019\u201d she says.\u00a0<\/p>\n<p>Typically, face-recognition and detection technology fails to recognize people with darker skin when companies use models that were not trained on diverse data sets, says Deborah Raji, a fellow with the Mozilla Foundation. In 2019, Raji\u00a0<a href=\"https:\/\/www.media.mit.edu\/publications\/actionable-auditing-investigating-the-impact-of-publicly-naming-biased-performance-results-of-commercial-ai-products\/\">copublished<\/a> an audit of commercially deployed face-recognition products, which found that some of them were up to 30 percent worse at recognizing darker-skinned women than they were white men. \u201cA lot of the data sets that were in mainstream use in the facial recognition space before [2019] contained 90-plus percent lighter skin subjects, 70-plus percent male subjects,\u201d she says, adding progress has been made since then, but this is not a problem that has been \u201csolved.\u201d\u00a0<\/p>\n<\/div>\n<div data-journey-hook=\"client-content\" data-testid=\"BodyWrapper\">\n<p>\u201cBefore we can get to an acceptable degree of performance across these different demographics \u2026 it doesn\u2019t make sense for us to deploy that technology in high-stakes scenarios such as proctoring,\u201d Raji says, especially when students are given no way to opt out or appeal.\u00a0<\/p>\n<p>After spending more than a year searching for a university official who would take her concerns about Proctorio seriously, Pocornie hopes this case will force VU Amsterdam to create a process that students can use to complain about discriminatory software.\u00a0<\/p>\n<p>For her, Pocornie says this legal case is about making sure other students don\u2019t suffer the same experience or have to deploy Ikea lamps just to take a test. She doesn\u2019t want to retake her exams, even though she believes the software impacted her grades. Because she was taking two full-time masters courses at the same time, she says she was able to compare the grades she received in both (only one course replaced in-person exams with proctoring software during the pandemic; the other used coursework). The stress of online proctoring, Pocornie says, means the grades she received for her VU Amsterdam course were significantly lower than the grades she received from the University of Maastricht. To protect her privacy, she asked WIRED not to share the exact difference.\u00a0<\/p>\n<p>\u201cSo that\u2019s the quantitative effect,\u201d she says. \u201cBut on top of that, there is obviously also the distress, and I think that should count for something.\u201d\u00a0<\/p>\n<\/div>\n<\/div>\n<p><a href=\"https:\/\/www.wired.com\/story\/student-exam-software-bias-proctorio\/\" class=\"button purchase\" rel=\"nofollow noopener\" target=\"_blank\">Read More<\/a><br \/>\n Morgan Meaker<\/p>\n","protected":false},"excerpt":{"rendered":"<p>Robin Pocornie brought a lamp with her to court. It\u2019s nothing special, just a basic Ikea floor lamp. But for the masters student, the lamp was a useful prop to help explain how she believes her university\u2019s exam supervision software discriminated against her based on the color of her skin.Pocornie and her lamp stood in<\/p>\n","protected":false},"author":1,"featured_media":625829,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1882,1318,46],"tags":[],"class_list":{"0":"post-625828","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-student","8":"category-taking","9":"category-technology"},"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts\/625828","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/comments?post=625828"}],"version-history":[{"count":0,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts\/625828\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/media\/625829"}],"wp:attachment":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/media?parent=625828"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/categories?post=625828"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/tags?post=625828"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}