{"id":618815,"date":"2023-03-17T09:49:24","date_gmt":"2023-03-17T14:49:24","guid":{"rendered":"https:\/\/news.sellorbuyhomefast.com\/index.php\/2023\/03\/17\/tech-that-aims-to-read-your-mind-and-probe-your-memories-is-already-here\/"},"modified":"2023-03-17T09:49:24","modified_gmt":"2023-03-17T14:49:24","slug":"tech-that-aims-to-read-your-mind-and-probe-your-memories-is-already-here","status":"publish","type":"post","link":"https:\/\/newsycanuse.com\/index.php\/2023\/03\/17\/tech-that-aims-to-read-your-mind-and-probe-your-memories-is-already-here\/","title":{"rendered":"Tech that aims to read your mind and probe your memories is already here"},"content":{"rendered":"<div class>\n<div>\n<p><em>This article is from The Checkup, MIT Technology Review&#8217;s weekly biotech newsletter. To receive it in your inbox every Thursday, <a href=\"https:\/\/forms.technologyreview.com\/newsletters\/biotech-the-checkup\/?_ga=2.241810882.15113993.1664981064-43237434.1647441349\">sign up here<\/a>.<\/em><\/p>\n<p>Earlier this week, I had a fascinating call with Nita Farahany, a futurist and legal ethicist at Duke University in Durham, North Carolina. Farahany has spent much of her career exploring the impacts of new technologies\u2014in particular, those that attempt to understand or modify our brains.<\/p>\n<\/p><\/div>\n<div>\n<p>In recent years, we\u2019ve seen neurotechnologies move from research labs to real-world use. Schools have used some devices to <a href=\"https:\/\/www.wsj.com\/articles\/chinas-efforts-to-lead-the-way-in-ai-start-in-its-classrooms-11571958181\">monitor the brain activity of children to tell when they are paying attention<\/a>. Police forces are using others <a href=\"https:\/\/www.gulftoday.ae\/news\/2021\/01\/25\/dubai-police-crack-murder-case-using-brain-fingerprint-technology\">to work out whether someone is guilty of a crime<\/a>. 
And employers use them <a href=\"https:\/\/www.nytimes.com\/2020\/02\/06\/business\/drowsy-driving-truckers.html\">to keep workers awake<\/a> and productive.<\/p>\n<p>These technologies hold the remarkable promise of giving us all-new insight into our own minds. But our brain data is precious, and letting it fall into the wrong hands could be dangerous, Farahany argues in her new book, <em>The Battle for Your Brain<\/em>. I chatted with her about some of her concerns.<\/p>\n<p><em>The following interview has been edited for length and clarity.<\/em><\/p>\n<p><strong>Your book describes how technologies that collect and probe our brain data might be used\u2014for better or for worse. What can you tell from a person\u2019s brain data?<\/strong><\/p>\n<p>When I talk about brain data, I\u2019m referring to the use of EEG, fNIRS [functional near-infrared spectroscopy], fMRI [functional magnetic resonance imaging], EMG, and other modalities that collect biological, electrophysiological, and other signals from the human brain. These devices tend to collect data from across the brain, and you can then use software to try to pick out a particular signal.<\/p>\n<\/div>\n<div>\n<p>Brain data is not thought. But you can use it to make inferences about what\u2019s happening in a person\u2019s mind. There are brain states you can decode: tired, paying attention, mind-wandering, engagement, boredom, interest, happy, sad. You could work out how they are thinking or feeling, whether they are hungry, <a href=\"https:\/\/news.osu.edu\/brain-scans-remarkably-good-at-predicting-political-ideology\/\">whether they are a Democrat or Republican<\/a>.<\/p>\n<p>You can also pick up a person\u2019s reactions, and try to probe the brain for information and figure out what\u2019s in their memory or their thought patterns. You could show them numbers to try to figure out their PIN, or images of political candidates to find out if they have more positive or negative reactions. 
You can probe for biases, but also for substantive knowledge that a person holds, such as recognition of a crime scene or a password.<\/p>\n<p><strong>Until now, most people will only have learned about their brain data through medical exams. Our health records are protected. What about brain data collected by consumer products?<\/strong><\/p>\n<\/div>\n<div>\n<p>I feel like we\u2019re at an inflection point. [A lot of] consumer devices are hitting the market this year, and in the next two years. There have been huge advances in AI that allow us to decode brain activity, and in the miniaturization of electrodes, which [allows manufacturers] to put them into earbuds and headphones. And there has been significant investment from big tech companies. It is, I believe, about to become ubiquitous.<\/p>\n<p>The only person who has access to your brain data right now is you, and it is only analyzed in the internal software of your mind. But once you put a device on your head \u2026 you\u2019re immediately sharing that data with whoever the device manufacturer is, and whoever is offering the platform. It could also be shared with any government or employer that might have given you the device.<\/p>\n<p><strong>Is that always a bad thing?<\/strong><\/p>\n<p>It\u2019s transformational for individuals to have access to their own brain data, in a good way. The brain has always been this untouchable and inaccessible area of our bodies. And suddenly that\u2019s in the hands of individuals. The relationship we\u2019re going to have with ourselves is going to change.<\/p>\n<\/div>\n<div>\n<p>If scientists and researchers have access to that data, it could help them understand brain dysfunction, which could lead to the development of new treatments for neurological disease and mental illness.<\/p>\n<p>The collection or creation of the data isn\u2019t what\u2019s problematic\u2014it\u2019s when the data is used in ways that are harmful to individuals, collectives, or groups. 
And the problem is that that can happen very quickly.<\/p>\n<p>An authoritarian government having access to it could use it to try to identify people who don\u2019t show political adherence, for example. That\u2019s a pretty quick and serious misuse of the data. Or trying to identify people who are neuroatypical, and discriminate against or segregate them. In a workplace, it could be used for dehumanization of individuals by subjecting them to neurosurveillance. All of that simultaneously becomes possible.<\/p>\n<p><strong>Some consumer products, such as headbands and earbuds that purport to measure your brain activity and induce a sense of calm, have been dismissed as gimmicks by some scientists.<\/strong><\/p>\n<\/div>\n<div>\n<p>Very much so. The hardcore BCI [brain-computer interface] folks who are working on serious implanted [devices] to revolutionize and improve health will say \u2026 you\u2019re not picking up much real information. The signal is distorted by noise\u2014muscle twitches and hair, for example. But that doesn\u2019t mean that there\u2019s no signal. There are still meaningful things that you can pick up. I think people dismiss it at their peril. They don\u2019t know about what\u2019s happening in the field\u2014the advances and how rapidly they\u2019re coming.<\/p>\n<p><strong>In the book, you give a few examples of how these technologies are already being used by employers. Some devices are used to monitor how awake and alert truck drivers are, for example.<\/strong><\/p>\n<p>That\u2019s not such a terrible use, from my perspective. You can balance the interest of mental privacy of the individual against societal interest, and keeping others on the road safe, and keeping the driver safe.<\/p>\n<p>And giving employees the tools to have real-time neurofeedback [being able to monitor your own brain activity] to understand their own stress or attention levels is also starting to become widespread. 
If it\u2019s given to individuals to use for themselves as a tool of self-reflection and improvement, I don\u2019t find that to be problematic.<\/p>\n<p>The problem comes if it\u2019s used as a mandatory tool, and employers gather data to make decisions about hiring, firing, and promotions. They turn it into a kind of productivity score. Then I think it becomes really insidious and problematic. It undermines trust \u2026 and can make the workplace dehumanizing.<\/p>\n<p><strong>You also describe how corporations and governments might use our brain data. I was especially intrigued by the idea of targeted dream incubation \u2026<\/strong><\/p>\n<\/div>\n<div>\n<p>This is the stuff of the movie <em>Inception<\/em>! [Brewing company] <a href=\"https:\/\/www.youtube.com\/watch?v=tU_0jU0mMLw\">Coors teamed up with a dream researcher<\/a> to incubate volunteers\u2019 dreams with thoughts of mountains and fresh streams, and ultimately associate those thoughts with Coors beer. To do this, they played soundscapes to the volunteers when they were just waking up or falling asleep\u2014times when our brains are the most suggestible.<\/p>\n<p>It\u2019s icky for so many reasons. It is about literally looking for the moments when you\u2019re least able to protect your own mind, and then attempting to create associations in your mind. It starts to feel a lot like the kind of manipulation that should be off limits.<\/p>\n<\/div>\n<div>\n<p>They recruited consenting volunteers. But could this be done without people\u2019s consent? Apple has a patent on a sleep mask with EEG sensors embedded in it, and LG has showcased EEG earbuds for sleep, for example. Imagine if any of these sensors could pick up when you\u2019re at your most suggestible, and connect to a nearby cell phone or home device to play a soundscape to manipulate your thinking. Don\u2019t you think it\u2019s creepy?<\/p>\n<p><strong>Yes, I do! 
How can we prevent this from happening?<\/strong><\/p>\n<p>I\u2019m actively talking to a lot of companies, and telling them they need to have really robust privacy policies. I think people should be able to experiment with devices without worrying about what the implications might be.<\/p>\n<p><strong>Have those companies been receptive to the idea?<\/strong><\/p>\n<p>Most neurotech companies that I\u2019ve talked with recognize the issues, and are trying to come forward with solutions and be responsible. I\u2019ve been very encouraged by their sincerity. But I\u2019ve been less impressed with some of the big tech companies. As we\u2019ve seen with the recent major layoffs, <a href=\"https:\/\/www.theverge.com\/2023\/3\/13\/23638823\/microsoft-ethics-society-team-responsible-ai-layoffs\">the ethics people are some of the first to go<\/a> at those companies.<\/p>\n<p>Given that these smaller neuro companies are getting acquired by the big titans in tech, I\u2019m less confident that brain data collected by these small companies will remain under their privacy policies. The commodification of data is the business model of these big companies. I don\u2019t want to leave it to companies to self-govern.<\/p>\n<p><strong>What else can we do?<\/strong><\/p>\n<p>My hope is that we immediately move toward adopting a right to cognitive liberty\u2014a novel human right that in principle exists within existing human rights law.<\/p>\n<\/div>\n<div>\n<p>I think of cognitive liberty as an umbrella concept made up of three core principles: mental privacy, freedom of thought, and self-determination. 
That last principle covers the right to access our own brain information, to know our own brains, and to change our own brains.<\/p>\n<p>It\u2019s an update to our general conception of liberty to recognize what liberty needs to look like in the digital age.<\/p>\n<p><strong>How likely is it that we\u2019ll be able to implement something like this?<\/strong><\/p>\n<p>I think it\u2019s actually quite likely. The UN Human Rights Committee can, through a general comment or opinion, recognize the right to cognitive liberty. It doesn\u2019t require a political process at the UN.<\/p>\n<p><strong>But will it be implemented in time?<\/strong><\/p>\n<p>I hope so. That\u2019s why I wrote the book now. We don\u2019t have a lot of time. If we wait for some disaster to occur, it\u2019s going to be too late.<\/p>\n<p>But we can set neurotechnology on a course that can be empowering for humanity.<\/p>\n<p><strong>Farahany\u2019s book, <em>The Battle for Your Brain<\/em>, is out this week. There\u2019s also loads of neurotech content in Tech Review\u2019s archive:<\/strong><\/p>\n<\/div>\n<div>\n<p><strong>The US military has been working to develop mind-reading devices for years.<\/strong> The aim is to create technologies that allow us to help people with brain or nervous system damage, but also enable soldiers to direct drones and other devices by thought alone, as Paul Tullis <a href=\"https:\/\/www.technologyreview.com\/2019\/10\/16\/132269\/us-military-super-soldiers-control-drones-brain-computer-interfaces\/?utm_source=the_checkup&#038;utm_medium=email&#038;utm_campaign=the_checkup.unpaid.engagement&#038;utm_content=03-16-23\">reported<\/a> in 2019.<\/p>\n<p><strong>Several multi-millionaires who made their fortune in tech have launched projects to link human brains to computers, whether to read our minds, communicate, or supercharge our brainpower.<\/strong> <a 
href=\"https:\/\/www.technologyreview.com\/2017\/03\/16\/153211\/the-entrepreneur-with-the-100-million-plan-to-link-brains-to-computers\/?utm_source=the_checkup&#038;utm_medium=email&#038;utm_campaign=the_checkup.unpaid.engagement&#038;utm_content=03-16-23\">Antonio Regalado spoke to entrepreneur Bryan Johnson<\/a> in 2017 about his plans to build a neural prosthetic for human intelligence enhancement. (Since then, Johnson has embarked on a quest to keep his body as young as possible.)<\/p>\n<p><strong>We can deliver jolts of electricity to the brain via headbands and caps\u2014devices that are generally considered to be noninvasive.<\/strong> But given that they are probing our minds and potentially changing the way they work, perhaps we need to reconsider how invasive they really are, as <a href=\"https:\/\/www.technologyreview.com\/2022\/12\/23\/1065862\/brain-stimulation-invasive\/?utm_source=the_checkup&#038;utm_medium=email&#038;utm_campaign=the_checkup.unpaid.engagement&#038;utm_content=03-16-23\">I wrote<\/a> in an earlier edition of The Checkup.<\/p>\n<p><strong>Elon Musk\u2019s company Neuralink has stated it has an eventual goal of \u201ccreating a whole-brain interface capable of more closely connecting biological and artificial intelligence.\u201d <\/strong>Antonio described how much progress the company and its competitors have made in <a href=\"https:\/\/www.technologyreview.com\/2021\/10\/27\/1036821\/brain-computer-interface-implant-mouse\/?utm_source=the_checkup&#038;utm_medium=email&#038;utm_campaign=the_checkup.unpaid.engagement&#038;utm_content=03-16-23\">a feature<\/a> that ran in the <a href=\"https:\/\/www.technologyreview.com\/magazines\/the-computing-issue\/?utm_source=the_checkup&#038;utm_medium=email&#038;utm_campaign=the_checkup.unpaid.engagement&#038;utm_content=03-16-23\">Computing issue<\/a> of the magazine.\u00a0<\/p>\n<p><strong>When a person with an electrode implanted in their brain to treat epilepsy was accused of assaulting a 
police officer, law enforcement officials asked to see the brain data collected by the device. <\/strong>The data was exonerating; it turns out the person was having a seizure at the time. But brain data could just as easily be used to incriminate someone else, as <a href=\"https:\/\/www.technologyreview.com\/2023\/02\/24\/1069116\/how-your-brain-data-could-be-used-against-you\/?utm_source=the_checkup&#038;utm_medium=email&#038;utm_campaign=the_checkup.unpaid.engagement&#038;utm_content=03-16-23\">I wrote<\/a> in a recent edition of The Checkup.<\/p>\n<h3>From around the web<\/h3>\n<p><strong>How would you feel about getting letters from your doctor that had been written by an AI?<\/strong> A pilot study showed that \u201cit is possible to generate clinic letters with a high overall correctness and humanness score with ChatGPT.\u201d (<a href=\"https:\/\/www.thelancet.com\/journals\/landig\/article\/PIIS2589-7500(23)00048-1\/fulltext\">The Lancet Digital Health<\/a>)<\/p>\n<p><strong>When Meredith Broussard found out that her hospital had used AI to help diagnose her breast cancer, she explored how the technology fares against human doctors.<\/strong> Not great, it turned out. 
(<a href=\"https:\/\/www.wired.com\/story\/artificial-intelligence-cancer-detection\/\">Wired<\/a>)<\/p>\n<p><strong>A federal judge in Texas is being asked in a lawsuit to direct the US Food and Drug Administration to rescind its approval of mifepristone, one of two drugs used in medication abortions.<\/strong> A ruling against the FDA could diminish the authority of the organization and \u201cbe catastrophic for public health.\u201d (<a href=\"https:\/\/www.washingtonpost.com\/health\/2023\/03\/15\/abortion-pill-fda\/\">The Washington Post<\/a>)<\/p>\n<p><strong>The US Environmental Protection Agency has proposed regulation that would limit the levels of six \u201cforever chemicals\u201d in drinking water.<\/strong> Perfluoroalkyl and polyfluoroalkyl substances (PFAS) are synthetic chemicals that have been used to make products since the 1950s. They break down extremely slowly and have been found in the environment, and in the blood of people and animals, around the world. We still don\u2019t know how harmful they are. (<a href=\"https:\/\/www.epa.gov\/sdwa\/and-polyfluoroalkyl-substances-pfas\">EPA<\/a>)<\/p>\n<p><strong>Would you pay thousands of dollars to have your jaw broken and remodeled to resemble that of Batman?<\/strong> The surgery represents yet another disturbing cosmetic trend. 
(<a href=\"https:\/\/www.gq-magazine.co.uk\/lifestyle\/article\/jaw-surgery-men\">GQ<\/a>)<svg viewBox=\"0 0 1091.84 1091.84\"><polygon fill=\"#6d6e71\" points=\"363.95 0 363.95 1091.84 727.89 1091.84 727.89 363.95 363.95 0\" \/><polygon fill=\"#939598\" points=\"363.95 0 728.24 365.18 1091.84 364.13 1091.84 0 363.95 0\" \/><polygon fill=\"#414042\" points=\"0 0 0 0.03 0 363.95 363.95 363.95 363.95 0 0 0\" \/><\/svg> <\/p>\n<\/div>\n<\/div>\n<p><a href=\"https:\/\/www.technologyreview.com\/2023\/03\/17\/1069897\/tech-read-your-mind-probe-your-memories\/\" class=\"button purchase\" rel=\"nofollow noopener\" target=\"_blank\">Read More<\/a><br \/>\n Jessica Hamzelou<\/p>\n","protected":false},"excerpt":{"rendered":"<p>This article is from The Checkup, MIT Technology Review&#8217;s weekly biotech newsletter. To receive it in your inbox every Thursday, sign up here. Earlier this week, I had a fascinating call with Nita Farahany, a futurist and legal ethicist at Duke University in Durham, North Carolina. 
Farahany has spent much of her career exploring the<\/p>\n","protected":false},"author":1,"featured_media":618816,"comment_status":"closed","ping_status":"closed","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[31845,3151,46],"tags":[],"class_list":{"0":"post-618815","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-memories","8":"category-probe","9":"category-technology"},"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts\/618815","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/comments?post=618815"}],"version-history":[{"count":0,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts\/618815\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/media\/618816"}],"wp:attachment":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/media?parent=618815"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/categories?post=618815"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/tags?post=618815"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}