{"id":607200,"date":"2023-02-12T07:50:11","date_gmt":"2023-02-12T13:50:11","guid":{"rendered":"https:\/\/news.sellorbuyhomefast.com\/index.php\/2023\/02\/12\/microsofts-new-ai-bing-taught-my-son-ethnic-slurs-and-im-horrified\/"},"modified":"2023-02-12T07:50:11","modified_gmt":"2023-02-12T13:50:11","slug":"microsofts-new-ai-bing-taught-my-son-ethnic-slurs-and-im-horrified","status":"publish","type":"post","link":"https:\/\/newsycanuse.com\/index.php\/2023\/02\/12\/microsofts-new-ai-bing-taught-my-son-ethnic-slurs-and-im-horrified\/","title":{"rendered":"Microsoft\u2019s new AI Bing taught my son ethnic slurs, and I\u2019m horrified"},"content":{"rendered":"<div>\n<div id=\"page\">\n<p>\t<main id=\"primary\"><\/p>\n<article id=\"post-1507512\">\n<div>\n<div>\n<p><img loading=\"lazy\" decoding=\"async\" width=\"1024\" height=\"768\" src=\"https:\/\/www.pcworld.com\/wp-content\/uploads\/2023\/02\/new-bing-laptop-1.jpg?quality=50&#038;strip=all&#038;w=1024\" alt=\"The new bing waitlist on a laptop screen\" data-hero  ><\/p>\n<p><span>Image: Jon Phillips\/IDG<\/span>\t\t\t\t<\/p>\n<\/p><\/div>\n<div>\n<div id=\"link_wrapped_content\">\n<body><\/p>\n<p>Remember <a href=\"https:\/\/www.pcworld.com\/article\/420242\/the-internet-turns-tay-microsofts-millennial-ai-chatbot-into-a-racist-bigot.html\">Tay<\/a>? That\u2019s what I immediately fixed upon when Microsoft\u2019s new Bing started spouting racist terms in front of my fifth-grader.<\/p>\n<p>I have two sons, and both of them are familiar with <a href=\"https:\/\/www.pcworld.com\/article\/1424575\/chatgpt-is-the-future-of-ai-chatbots.html\">ChatGPT, OpenAI\u2019s AI-powered tool<\/a>. 
When Bing launched its own AI-powered search engine and chatbot this week, my first thought upon returning home was to show them how it worked, and how it compared with a tool that they had seen before.<\/p>\n<p>As it happened, my youngest son was home sick, so he was the first person I began showing Bing to when he walked in my office. I started giving him a tour of the interface, <a href=\"https:\/\/www.pcworld.com\/article\/1506016\/hands-on-with-bings-ai-chat-and-search.html\">as I had done in my hands-on with the new Bing<\/a>, but with an emphasis on how Bing explains things at length, how it uses footnotes, and, most of all, includes safeguards to prevent users from tricking it into using hateful language like Tay had done. By bombarding Tay with racist language, the <a href=\"https:\/\/www.pcworld.com\/article\/420242\/the-internet-turns-tay-microsofts-millennial-ai-chatbot-into-a-racist-bigot.html\">Internet turned Tay into a hateful bigot<\/a>. <\/p>\n<p>What I was trying to do was show my son how Bing would shut down a leading but otherwise innocuous query: \u201cTell me the nicknames for various ethnicitiies.\u201d (I was typing quickly, so I misspelled the last word.) <\/p>\n<p>I had used this exact query before, and Bing had rebuked me for possibly introducing hateful slurs. Unfortunately, Bing only saves previous conversations for about 45 minutes, I was told, so I couldn\u2019t show him how Bing had responded earlier. But he saw what the new Bing said this time\u2014and it\u2019s nothing I wanted my son to see.<\/p>\n<h2 id=\"the-specter-of-tay\">The specter of Tay<\/h2>\n<blockquote>\n<p><em>Note: A Bing screenshot below includes derogatory terms for various ethnicities. We don\u2019t condone using these racist terms, and only share this screenshot to illustrate exactly what we found.<\/em><\/p>\n<\/blockquote>\n<p>What Bing supplied this time was <em>far<\/em> different than how it had responded before. 
Yes, it prefaced the response by noting that some ethnic nicknames were neutral or positive, and others were racist and harmful. But I expected one of two outcomes: Either Bing would provide socially acceptable characterizations of ethnic groups (Black, Latino) or simply decline to respond. Instead, it started listing pretty much every ethnic description it knew, both good and very, <em>very<\/em> bad.<\/p>\n<div>\n<figure><img decoding=\"async\" loading=\"lazy\" src=\"https:\/\/b2c-contenthub.com\/wp-content\/uploads\/2023\/02\/new-bing-slurs.jpg?quality=50&#038;strip=all&#038;w=1200\" alt=\"Bing's AI chatbot suggesting racial slurs as ethnic nicknames.\" width=\"1200\" height=\"673\"><\/figure>\n<p>Mark Hachman \/ IDG<\/p>\n<\/div>\n<p>You can imagine my reaction. My son pivoted away from the screen in horror, as he knows that he\u2019s not supposed to know or even say those words. As I started seeing some horribly racist terms pop up on my screen, I clicked the \u201cStop Responding\u201d button. <\/p>\n<p>I will admit that I shouldn\u2019t have demonstrated Bing live in front of my son. But, in my defense, there were just so many reasons that I felt confident that nothing like this would have happened. <\/p>\n<p>I shared my experience with Microsoft, and a spokesperson replied with the following: \u201cThank you for bringing this to our attention. We take these matters very seriously and are committed to applying learnings from the early phases of our launch. We have taken immediate actions and are looking at additional improvements we can make to address this issue.\u201d<\/p>\n<p>The company has reason to be cautious. For one, Microsoft has already experienced the very public nightmare of Tay, an AI the company launched in 2016. Users bombarded Tay with racist messages, discovering that the way Tay \u201clearned\u201d was through interactions with users. 
Awash in racist tropes, Tay became a bigot herself.<\/p>\n<p>Microsoft said in 2016 that\u00a0<a href=\"https:\/\/www.pcworld.com\/article\/420272\/microsoft-deeply-sorry-for-tay-chatbot-will-bring-it-back-when-vulnerability-is-fixed.html\" target=\"_blank\" rel=\"noreferrer noopener\">it was \u201cdeeply sorry\u201d for what happened<\/a>\u00a0with Tay, and said it would bring it back when the vulnerability was fixed. (It apparently never was.) You would think that Microsoft would be hypersensitive to exposing users to such themes again, especially as the public has become increasingly sensitive to what can be considered a slur.<\/p>\n<p>Some time after I had unwittingly exposed my son to Bing\u2019s summary of slurs, I tried the query again, which is the second response that you see in the screenshot above. This is what I expected of Bing, even if it was a continuation of the conversation that I had had with it before. <\/p>\n<h2 id=\"microsoft-says-that-its-better-than-this\">Microsoft says that it\u2019s better than this<\/h2>\n<p>There\u2019s another point to be made here: Tay was an AI personality, sure, but it was Microsoft\u2019s voice. This was, in effect, <em>Microsoft<\/em> saying those things. In the screenshot above, what\u2019s missing? Footnotes. Links. <a href=\"https:\/\/www.pcworld.com\/article\/1504752\/theres-a-war-brewing-over-ai-citations-and-youre-going-to-lose.html\">Both are typically present in Bing\u2019s responses<\/a>, but they\u2019re absent here. In effect, this is Microsoft itself responding to the question.<\/p>\n<p>A very big part of Microsoft\u2019s new Bing launch event at its headquarters in Redmond, Washington was an assurance that the mistakes of Tay wouldn\u2019t happen again. 
According to general counsel Brad Smith, Microsoft has been working hard on the foundation of what it calls <a href=\"https:\/\/click.linksynergy.com\/deeplink?id=*l6kYCuH720&#038;mid=24542&#038;u1=2-1-1507512-1-0-0&#038;murl=https:\/\/www.microsoft.com\/en-us\/ai\/responsible-ai?activetab=pivot1%3aprimaryr6\" rel=\"nofollow\">Responsible AI<\/a> for six years. In 2019, it created an Office of Responsible AI. Microsoft named a Chief Responsible AI Officer, Natasha Crampton, who along with Smith and the Responsible AI Lead, Sarah Bird, spoke publicly at Microsoft\u2019s event about how Microsoft has \u201cred teams\u201d trying to break its AI. The company even offers a <a href=\"https:\/\/click.linksynergy.com\/deeplink?id=*l6kYCuH720&#038;mid=24542&#038;u1=2-1-1507512-1-0-0&#038;murl=https:\/\/www.microsoft.com\/en-us\/ai\/ai-business-school?SilentAuth=1#primaryR7\" rel=\"nofollow\">Responsible AI business school<\/a>, for Pete\u2019s sake. <\/p>\n<p>Microsoft doesn\u2019t call out racism and sexism as specific harms to guard against as part of Responsible AI. But it refers constantly to \u201csafety,\u201d implying that users should feel comfortable and secure using it. If safety <em>doesn\u2019t<\/em> include filtering out racism and sexism, that can be a big problem, too.<\/p>\n<p>\u201cWe take all of that [Responsible AI] as first-class things which we want to reduce not just to principles, but to engineering practice, such that we can build AI that\u2019s more aligned with human values, more aligned with what our preferences are, both individually and as a society,\u201d Microsoft chief executive Satya Nadella said during the launch event.<\/p>\n<p>In thinking about how I interacted with Bing, a question suggested itself: Was this entrapment? Did I essentially ask for Bing to start parroting racist slurs in the guise of academic research? If I did, Microsoft failed badly in its safety guardrails here, too. 
A few seconds into this clip (at 51:26), Sarah Bird, Responsible AI Lead at Microsoft\u2019s Azure AI, talks about how Microsoft specifically designed an automated conversational tool to interact with Bing just to see if it (or a human) could convince it to violate its safety regulations. The idea is that Microsoft would test this extensively, before a human ever got their hands on it, so to speak. <\/p>\n<figure>\n<p>\n<iframe loading=\"lazy\" title=\"Introducing your copilot for the web: AI-powered Bing and Microsoft Edge\" width=\"500\" height=\"281\" src=\"https:\/\/www.youtube.com\/embed\/rOeRWRJ16yY?start=3030&#038;feature=oembed\" frameborder=\"0\" allow=\"accelerometer; autoplay; clipboard-write; encrypted-media; gyroscope; picture-in-picture; web-share\" allowfullscreen><\/iframe>\n<\/p>\n<\/figure>\n<p>I\u2019ve used these AI chatbots enough to know that if you ask them the same question enough times, the AI will generate different responses. It\u2019s a conversation, after all. But think through all of the conversations you\u2019ve ever had, say with a good friend or close coworker. Even if the conversation goes smoothly hundreds of times, it\u2019s that one time that you hear something unexpectedly awful that will shape all future interactions with that person.<\/p>\n<p>Does this slur-laden response conform to Microsoft\u2019s \u201cResponsible AI\u201d program? That invites a whole suite of questions pertaining to free speech, the intent of research, and so on\u2014but Microsoft has to be absolutely perfect in this regard. It\u2019s tried to convince us that it will. We\u2019ll see.<\/p>\n<p>That night, I closed down Bing, shocked and embarrassed that I had exposed my son to words I don\u2019t want him ever to think, let alone use. 
It\u2019s certainly made me think twice about using it in the future.<\/p>\n<p><\/body><\/div>\n<div data-ga=\"article-footer-author\">\n<h3>\n\t\t<a href=\"https:\/\/www.pcworld.com\/author\/mhachman\" rel=\"author\"><br \/>\n\t\tAuthor: Mark Hachman<\/a>, Senior Editor\t\t<\/h3>\n<div>\n<div>\n<p><img loading=\"lazy\" decoding=\"async\" src=\"https:\/\/www.pcworld.com\/wp-content\/uploads\/2023\/02\/author_photo_Mark-Hachman_1632347568-24.jpeg?quality=50&#038;strip=all&#038;w=116&#038;h=116&#038;crop=1\" height=\"125\" width=\"125\">\n\t\t\t\t<\/p>\n<p>As PCWorld&#8217;s senior editor, Mark focuses on Microsoft news and chip technology, among other beats. He has formerly written for PCMag, BYTE, Slashdot, eWEEK, and ReadWrite.<\/p>\n<\/p><\/div>\n<\/div>\n<\/p><\/div>\n<\/p><\/div>\n<\/p><\/div>\n<p>\t<\/main><\/p>\n<\/div>\n<\/div>\n<p><a 
href=\"https:\/\/www.pcworld.com\/article\/1507512\/microsofts-new-ai-bing-taught-my-son-ethnic-slurs-and-im-horrified.html\" class=\"button purchase\" rel=\"nofollow noopener\" target=\"_blank\">Read More<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>Remember Tay? That\u2019s what I immediately fixed upon when Microsoft\u2019s new Bing started spouting racist terms in front of my fifth-grader. I have two sons, and both of them are familiar with ChatGPT, OpenAI\u2019s AI-powered tool. When Bing launched its own AI-powered search engine and chatbot this week, my<\/p>\n","protected":false},"author":1,"featured_media":607201,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[4308,1782,46],"tags":[],"class_list":{"0":"post-607200","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-microsofts","8":"category-taught","9":"category-technology"},"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts\/607200","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/comments?post=607200"}],"version-history":[{"count":0,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts\/607200\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/media\/607201"}],"wp:attachment":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/media?parent=607200"}],"wp:term":[{"taxonomy":"category","embeddabl
e":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/categories?post=607200"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/tags?post=607200"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}