{"id":620672,"date":"2023-03-22T09:48:57","date_gmt":"2023-03-22T14:48:57","guid":{"rendered":"https:\/\/news.sellorbuyhomefast.com\/index.php\/2023\/03\/22\/can-society-adjust-at-the-speed-of-artificial-intelligence\/"},"modified":"2023-03-22T09:48:57","modified_gmt":"2023-03-22T14:48:57","slug":"can-society-adjust-at-the-speed-of-artificial-intelligence","status":"publish","type":"post","link":"https:\/\/newsycanuse.com\/index.php\/2023\/03\/22\/can-society-adjust-at-the-speed-of-artificial-intelligence\/","title":{"rendered":"Can society adjust at the speed of artificial intelligence?"},"content":{"rendered":"<div>\n<p id=\"mrlB5L\">On Tuesday, OpenAI <a href=\"https:\/\/openai.com\/product\/gpt-4\">announced<\/a> the release of GPT-4, its latest, biggest language model, only a few months after the splashy release of ChatGPT. GPT-4 was already in action \u2014 Microsoft <a href=\"https:\/\/www.theverge.com\/2023\/3\/14\/23639928\/microsoft-bing-chatbot-ai-gpt-4-llm\">has been using<\/a> it to power Bing\u2019s new assistant function. The people behind OpenAI <a href=\"https:\/\/www.vox.com\/future-perfect\/23619354\/openai-chatgpt-sam-altman-artificial-intelligence-regulation-sydney-microsoft-ai-safety\">have written<\/a> that they think the best way to handle powerful AI systems is to develop and release them as quickly as possible, and that\u2019s certainly what they\u2019re doing. <\/p>\n<p id=\"uwUk97\">Also on Tuesday, I sat down with Holden Karnofsky, the co-founder and co-CEO of Open Philanthropy, to talk about AI and where it\u2019s taking us. <\/p>\n<p id=\"INwekw\">Karnofsky, in my view, should get a lot of credit for his prescient views on AI. 
Since 2008, he\u2019s been <a href=\"https:\/\/www.openphilanthropy.org\/research\/potential-risks-from-advanced-artificial-intelligence\/\">engaging with what was then a small minority of researchers<\/a> who were saying that powerful AI systems were one of the most important social problems of our age \u2014 a view that I think has aged remarkably well. (Karnofsky was a board member of OpenAI but stepped down in 2021 when his wife \u2014 a former employee of OpenAI \u2014 helped launch the AI company Anthropic. Open Philanthropy <a href=\"https:\/\/www.openphilanthropy.org\/grants\/openai-general-support\/\">provided<\/a> a $30 million grant to OpenAI in 2017.)<\/p>\n<p id=\"WBpcNa\">Some of his early published work on the question, from 2011 and 2012, raises questions about what shape those models would take and how hard it would be to make their development go well \u2014 all of which only looks more important with a decade of hindsight. <\/p>\n<p id=\"uX2lis\">In the last few years, he\u2019s <a href=\"https:\/\/www.cold-takes.com\/ai-safety-seems-hard-to-measure\/\">started to write<\/a> <a href=\"https:\/\/www.cold-takes.com\/ai-could-defeat-all-of-us-combined\/\">about the case<\/a> that AI may be an unfathomably big deal \u2014 and about what we can and can\u2019t learn from the behavior of today\u2019s models. Over that same time period, Open Philanthropy has been <a href=\"https:\/\/www.openphilanthropy.org\/focus\/potential-risks-advanced-ai\/\">investing more<\/a> in making AI go well. 
And recently, Karnofsky <a href=\"https:\/\/forum.effectivealtruism.org\/posts\/aJwcgm2nqiZu6zq2S\/taking-a-leave-of-absence-from-open-philanthropy-to-work-on\">announced<\/a> a leave of absence from his work at Open Philanthropy to explore working directly on AI risk reduction.<\/p>\n<p id=\"v2FUKH\">The following interview has been edited for length and clarity.<\/p>\n<h4 id=\"C64WdQ\">Kelsey Piper<\/h4>\n<p id=\"JNyXLY\">You\u2019ve written about how AI could mean that things get really crazy in the near future.<\/p>\n<h4 id=\"yPKJ7r\">Holden Karnofsky<\/h4>\n<p id=\"txxUpM\">The basic idea would be: Imagine what the world would look like in the far future after a lot of scientific and technological development. Generally, I think most people would agree the world could look really, really strange and unfamiliar. There\u2019s a lot of science fiction about this. <\/p>\n<p id=\"Se899a\">What is most high stakes about AI, in my opinion, is the idea that AI could potentially serve as a way of automating all the things that humans do to advance science and technology, and so we could get to that wild future a lot faster than people tend to imagine. <\/p>\n<p id=\"nShYXd\">Today, we have a certain number of human scientists who try to push forward science and technology. The day that we\u2019re able to automate everything they do, that could be a massive increase in the amount of scientific and technological advancement that\u2019s getting done. And furthermore, it can create a kind of feedback loop that we don\u2019t have today where basically as you improve your science and technology that leads to a greater supply of hardware and more efficient software that runs a greater number of AIs.<\/p>\n<p id=\"IUK9lj\">And because AIs are the ones doing the science and technology research and advancement, that could go in a loop. If you get that loop, you get very explosive progress. 
<\/p>\n<p id=\"zohnm8\">The upshot of all this is that the world most people imagine thousands of years from now in some wild sci-fi future could be more like 10 years out or one year out or months out from the point when AI systems are doing all the things that humans typically do to advance science and technology. <\/p>\n<p id=\"77ufz3\">This all <a href=\"https:\/\/www.cold-takes.com\/the-duplicator\/\">follows straightforwardly<\/a> from standard economic growth models, and there are signs of this kind of feedback loop in parts of economic history.<\/p>\n<h4 id=\"QKgDVM\">Kelsey Piper<\/h4>\n<p id=\"GnU8xY\">That sounds great, right? <em>Star Trek<\/em> future overnight? What\u2019s the catch?<\/p>\n<h4 id=\"Xgi2YO\">Holden Karnofsky<\/h4>\n<p id=\"cwNx93\">I think there are big risks. I mean, it could be great. But as you know, I think that if all we do is we kind of sit back and relax and let scientists move as fast as they can, we\u2019ll get some chance of things going great and some chance of some things going terribly. <\/p>\n<p id=\"be3Foq\">I am most focused on standing up where normal market forces will not and trying to push against the probability of things going terribly. In terms of how things could go terribly, maybe I\u2019ll start with the broad intuition: When we talk about scientific progress and economic growth, we\u2019re talking about the few percent per year range. That\u2019s what we\u2019ve seen in the last couple hundred years. That\u2019s all any of us know. <\/p>\n<p id=\"CeJyoS\">But how you would feel about an economic growth rate of, let\u2019s say, 100 percent per year, 1,000 percent per year. Some of how I feel is that we just are not ready for what\u2019s coming. I think society has not really shown any ability to adapt to a rate of change that fast. The appropriate attitude towards the next sort of Industrial Revolution-sized transition is caution. 
<\/p>\n<p id=\"PFtcLd\">Another broad intuition is that these AI systems we\u2019re building, they might do all the things humans do to automate scientific and technological advancement, but they\u2019re not humans. If we get there, that would be the first time in all of history that we had anything <em>other than humans<\/em> capable of autonomously developing its own new technologies, autonomously advancing science and technology. No one has any idea what that\u2019s going to look like, and I think we shouldn\u2019t assume that the result is going to be good for humans. I think it really depends on how the AIs are designed. <\/p>\n<p id=\"D4NHip\">If you look at this current state of machine learning, it\u2019s just very clear that we have no idea what we\u2019re building. To a first approximation, the way these systems are designed is that someone takes a relatively simple learning algorithm and they pour in an enormous amount of data. They put in the whole internet and it sort of tries to predict one word at a time from the internet and learn from that. That\u2019s an oversimplification, but it\u2019s like they do that and out of that process pops some kind of thing that can talk to you and make jokes and write poetry, but no one really knows why. <\/p>\n<p id=\"QFC8M2\">You can think of it as analogous to human evolution, where there were lots of organisms and some survived and some didn\u2019t and at some point there were humans who have all kinds of things going on in their brains that we still don\u2019t really understand. Evolution is a simple process that resulted in complex beings that we still don\u2019t understand. <\/p>\n<p id=\"dzo3wB\">When Bing chat came out and it started <a href=\"https:\/\/www.nytimes.com\/2023\/02\/16\/technology\/bing-chatbot-transcript.html\">threatening users<\/a> and, you know, trying to seduce them and god knows what, people asked, why is it doing that? 
And I would say not only do I not know, but no one knows because the people who designed it don\u2019t know, the people who trained it don\u2019t know. <\/p>\n<h4 id=\"H7LAkp\">Kelsey Piper<\/h4>\n<p id=\"ox2R0k\">Some people <a href=\"https:\/\/openai.com\/blog\/planning-for-agi-and-beyond\">have argued<\/a> that yes, you\u2019re right, AI is going to be a huge deal and dramatically transform our world overnight, and that that\u2019s why we should be racing forwards as much as possible, because by releasing technology sooner we\u2019ll give society more time to adjust. <\/p>\n<h4 id=\"iXjBGP\">Holden Karnofsky<\/h4>\n<p id=\"ICkwts\">I think there\u2019s some pace at which that would make sense and I think the pace AI could advance may be too fast for that. I think society just takes a while to adjust to anything. <\/p>\n<p id=\"xX01lg\">Most technologies that come out, it takes a long time for them to be appropriately regulated, for them to be appropriately used in government, and for people who are not early adopters or tech lovers to learn how to use them, integrate them into their lives, learn how to avoid the pitfalls, and learn how to deal with the downsides. <\/p>\n<p id=\"pzbGlp\">So I think that if we may be on the cusp of a radical explosion in growth or in technological progress, I don\u2019t really see how rushing forward is supposed to help here. I don\u2019t see how it\u2019s supposed to get us to a rate of change that is slow enough for society to adapt, if we\u2019re pushing forward as fast as we can.<\/p>\n<p id=\"2C0bG8\">I think the better plan is to actually have a societal conversation about what pace we do want to move at, whether we want to slow things down on purpose, whether we want to move a bit more deliberately, and if not, how we can have this go in a way that avoids or reduces some of the key risks. 
<\/p>\n<h4 id=\"53mygj\">Kelsey Piper<\/h4>\n<p id=\"eLYYA6\">So, say you\u2019re interested in regulating AI, to make some of these changes go better, to reduce the risk of catastrophe. What should we be doing?<\/p>\n<h4 id=\"r6fqQo\">Holden Karnofsky<\/h4>\n<p id=\"IRVdgT\">I am quite worried about people feeling the need to do something just to do something. I think many plausible regulations have a lot of downsides and may not succeed. And I cannot currently articulate specific regulations that I really think are going to be like, definitely good. I think this needs more work. It\u2019s an unsatisfying answer, but I think it\u2019s urgent for people to start thinking through what a good regulatory regime could look like. That is something I\u2019ve been spending increasingly a large amount of my time just thinking through. <\/p>\n<p id=\"o3XIVH\">Is there a way to articulate how we\u2019ll know when the risk of some of these catastrophes is going up from the systems? Can we set triggers so that when we see the signs, we know that the signs are there, we can pre-commit to take action based on those signs to slow things down based on those signs. If we are going to hit a very risky period, I would be focusing on trying to design something that is going to catch that in time and it\u2019s going to recognize when that\u2019s happening and take appropriate action without doing harm. That\u2019s hard to do. And so the earlier you get started thinking about it, the more reflective you get to be.<\/p>\n<h4 id=\"aAJvzb\">Kelsey Piper<\/h4>\n<p id=\"wwKO9l\">What are the biggest things you see people missing or getting wrong about AI?<\/p>\n<h4 id=\"jhY7bN\">Holden Karnofsky<\/h4>\n<p id=\"kvnbSZ\">One, I think people will often get a little tripped up on questions about whether AI will be conscious and whether AI will have feelings and whether AI will have things that it wants.<\/p>\n<p id=\"HuIG4E\">I think this is basically entirely irrelevant. 
We could easily design systems that don\u2019t have consciousness and don\u2019t have desires, but do have \u201caims\u201d in the sense that a chess-playing AI aims for checkmate. And the way we design systems today, and especially the way I think that things could progress, is very prone to developing these kinds of systems that can act autonomously toward a goal.  <\/p>\n<p id=\"8xlOCt\">Regardless of whether they\u2019re conscious, they could act as if they\u2019re trying to do things that could be dangerous. They may be able to form relationships with humans, convince humans that they\u2019re friends, convince humans that they\u2019re in love. Whether or not they really are, that\u2019s going to be disruptive. <\/p>\n<p id=\"hnR3Jb\">The other misconception that will trip people up is that they will often make this distinction between wacky long-term risks and tangible near-term risks. And I don\u2019t always buy that distinction. I think in some ways the really wacky stuff that I talk about with automation, science, and technology, it\u2019s not really obvious why that will be upon us later than something like mass unemployment. <\/p>\n<p id=\"ezszZb\">I\u2019ve written <a href=\"https:\/\/www.google.com\/amp\/s\/www.cold-takes.com\/technological-unemployment-ai-vs-most-important-century-ai-how-far-apart\/amp\/\">one post<\/a> arguing that it would be quite hard for an AI system to take all the possible jobs that even a pretty low-skill human could have. It\u2019s one thing for it to cause a temporary transition period where some jobs disappear and others appear, like we\u2019ve had many times in the past. It\u2019s another thing for it to get to where there\u2019s absolutely nothing you can do as well as an AI, and I\u2019m not sure we\u2019re gonna see that before we see AI that can do science and technological advancement. It\u2019s really hard to predict what capabilities we\u2019ll see in what order. 
If we hit the science and technology one, things will move really fast. <\/p>\n<p id=\"AXUz9i\">So the idea that we should focus on \u201cnear term\u201d stuff that may or may not actually be nearer term and then wait to adapt to the wackier stuff as it happens? I don\u2019t know about that. I don\u2019t know that the wacky stuff is going to come later and I don\u2019t know that it\u2019s going to happen slow enough for us to adapt to it. <\/p>\n<p id=\"7eNXI3\">A third point where I think a lot of people get off the boat with my writing is just thinking this is all so wacky, we\u2019re talking about this giant transition for humanity where things will move really fast. That\u2019s just a crazy claim to make. And why would we think that we happen to be in this especially important time period? But it\u2019s actually \u2014 if you just zoom out and you look at basic <a href=\"https:\/\/www.cold-takes.com\/most-important-century\/#we-live-in-a-wild-time-and-should-be-ready-for-anything\">charts and timelines<\/a> of historical events and technological advancement in the history of humanity, there\u2019s just a lot of reasons to think that we\u2019re already on an accelerating trend and that we already live in a weird time. <\/p>\n<p id=\"MsS2fn\">I think we all need to be very open to the idea that the next big transition \u2014 something as big and accelerating as the Neolithic Revolution or Industrial Revolution or bigger \u2014 could kind of come any time. I don\u2019t think we should be sitting around thinking that we have a super strong default that nothing weird can happen. <\/p>\n<h4 id=\"L4zQcs\">Kelsey Piper<\/h4>\n<p id=\"1YXWCj\">I want to end on something of a hopeful note. 
What if humanity really gets our act together, if we spend the next decade, like <a href=\"https:\/\/www.cold-takes.com\/jobs-that-can-help-with-the-most-important-century\/\">working really hard on a good approach to this<\/a> and we succeed at some coordination and we succeed somewhat on the technical side? What would that look like? <\/p>\n<h4 id=\"nWyRhe\">Holden Karnofsky<\/h4>\n<p id=\"Jf3qMF\">I think in some ways it\u2019s important to contend with the incredible uncertainty ahead of us. And the fact that even if we do a great job and are very rational and come together as humanity and do all the right things, things might just move too fast and we might just still have a catastrophe. <\/p>\n<p id=\"2H7cW3\">On the flip side \u2014 I\u2019ve used the term \u201c<a href=\"https:\/\/www.lesswrong.com\/posts\/jwhcXmigv2LTrbBiB\/success-without-dignity-a-nearcasting-story-of-avoiding\">success without dignity<\/a>\u201d \u2014 maybe we could do basically nothing right and still be fine. <\/p>\n<p id=\"2wF1sb\">So I think both of those are true and I think all possibilities are open and it\u2019s important to keep that in mind. But if you want me to focus on the optimistic vision, I think there are a number of people today who work on alignment research, which is trying to kind of demystify these AI systems and make it less the case that we have these mysterious minds that we know nothing about and more the case that we understand where they\u2019re coming from. They can help us know what is going on inside them and to be able to design them so that they truly are things that help humans do what humans are trying to do, rather than things that have aims of their own and go off in random directions and steer the world in random ways.<\/p>\n<p id=\"QRsctn\">Then I am hopeful that in the future there will be a regime developed around standards and monitoring of AI. 
The idea being that there\u2019s a shared sense that systems demonstrating certain properties are dangerous and those systems need to be contained, stopped, not deployed, sometimes not trained in the first place. And that regime is enforced through a combination of maybe self-regulation, but also government regulation, also international action. <\/p>\n<p id=\"u4Z9l0\">If you get those things, then it\u2019s not too hard to imagine a world where AI is first developed by companies that are adhering to the standards, companies that have a good awareness of the risks, and that are being appropriately regulated and monitored and that therefore the first super powerful AIs that might be able to do all the things humans do to advance science and technology are in fact safe and are in fact used with a priority of making the overall situation safer. <\/p>\n<p id=\"pd8Nlo\">For example, they might be used to develop even better alignment methods to make other AI systems easier to make safe, or used to develop better methods of enforcing standards and monitoring. And so you could get a loop where you have early, very powerful systems being used to increase the safety factor of later very powerful systems. And then you end up in a world where we have a lot of powerful systems, but they\u2019re all basically doing what they\u2019re supposed to be doing. They\u2019re all secure, they\u2019re not being stolen by aggressive espionage programs. And that just becomes essentially a force multiplier on human progress as it\u2019s been to date. <\/p>\n<p id=\"FHkf5b\">And so, with a lot of bumps in the road and a lot of uncertainty and a lot of complexity, a world like that might just end us up in the future where health has greatly improved, where we have a huge supply of clean energy, where social science has advanced. I think we could just end up in a world that is a lot better than today in the same sense that I do believe today is a lot better than a couple hundred years ago. 
<\/p>\n<p id=\"3TxUV1\">So I think there is a potential very happy ending here. If we meet the challenge well, it will increase the odds, but I actually do think we could get catastrophe or a great ending regardless because I think everything is very uncertain.<\/p>\n<p id=\"WFqE23\"><em><strong>Clarification, March 20, 1:30 pm ET: <\/strong><\/em><em>This story has been updated to explain Holden Karnofsky\u2019s former status as a board member of OpenAI and to note Open Philanthropy\u2019s past grant to OpenAI.<\/em><\/p>\n<div data-cid=\"site\/article_footer-1679496854_7157_47807\" data-cdata=\"{\"base_type\":\"Entry\",\"id\":23409054,\"timestamp\":1679144400,\"published_timestamp\":1679144400,\"show_published_and_updated_timestamps\":false,\"title\":\"Can society adjust at the speed of artificial intelligence? \",\"type\":\"Article\",\"url\":\"https:\/\/www.vox.com\/future-perfect\/2023\/3\/18\/23645013\/openai-gpt4-holden-karnofsky-artificial-intelligence-ai-safety-existential-risk\",\"entry_layout\":{\"key\":\"unison_standard\",\"layout\":\"unison_main\",\"template\":\"standard\"},\"additional_byline\":null,\"authors\":[{\"id\":5296687,\"name\":\"Kelsey Piper\",\"url\":\"https:\/\/www.vox.com\/authors\/kelsey-piper\",\"twitter_handle\":\"\",\"profile_image_url\":\"https:\/\/cdn.vox-cdn.com\/thumbor\/LHe6jPR2UsTRjhjaRJg5wRJrEBw=\/512x512\/cdn.vox-cdn.com\/author_profile_images\/191475\/Screen_Shot_2018-09-25_at_11.18.29_AM.0.png\",\"title\":\"\",\"email\":\"\",\"short_author_bio\":\"is a senior writer at Future Perfect, Vox\u2019s effective altruism-inspired section on the world\u2019s biggest challenges. 
She explores wide-ranging topics like climate change, artificial intelligence, vaccine development, and factory farms, and also writes the Future Perfect newsletter.\"}],\"byline_enabled\":true,\"byline_credit_text\":\"By\",\"byline_serial_comma_enabled\":true,\"comment_count\":0,\"comments_enabled\":false,\"legacy_comments_enabled\":false,\"coral_comments_enabled\":false,\"coral_comment_counts_enabled\":false,\"commerce_disclosure\":null,\"community_name\":\"Vox\",\"community_url\":\"https:\/\/www.vox.com\/\",\"community_logo\":\"rn<svg width=\"386px\" height=\"385px\" viewBox=\"0 0 386 385\" version=\"1.1\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" xmlns:xlink=\"http:\/\/www.w3.org\/1999\/xlink\" >rn    rn    <title>vox-mark<\/title>rn    rn    <defs><\/defs>rn    <g id=\"Page-1\" stroke=\"none\" stroke-width=\"1\" fill=\"none\" fill-rule=\"evenodd\" >rn        <path d=\"M239.811,0 L238.424,6 L259.374,6 C278.011,6 292.908,17.38 292.908,43.002 C292.908,56.967 287.784,75.469 276.598,96.888 L182.689,305.687 L159.283,35.693 C159.283,13.809 168.134,6 191.88,6 L205.854,6 L207.247,0 L1.409,0 L0,6 L13.049,6 C28.88,6 35.863,15.885 37.264,34.514 L73.611,385 L160.221,385 L304.525,79.217 C328.749,31.719 349.237,6 372.525,6 L384.162,6 L385.557,0 L239.811,0\" id=\"vox-mark\" fill=\"#444745\" ><\/path>rn    <\/g>rn<\/svg>&#8220;,&#8221;cross_community&#8221;:false,&#8221;groups&#8221;:[{&#8220;base_type&#8221;:&#8221;EntryGroup&#8221;,&#8221;id&#8221;:76815,&#8221;timestamp&#8221;:1679492267,&#8221;title&#8221;:&#8221;Future Perfect&#8221;,&#8221;type&#8221;:&#8221;SiteGroup&#8221;,&#8221;url&#8221;:&#8221;https:\/\/www.vox.com\/future-perfect&#8221;,&#8221;slug&#8221;:&#8221;future-perfect&#8221;,&#8221;community_logo&#8221;:&#8221;rn<svg width=\"386px\" height=\"385px\" viewBox=\"0 0 386 385\" version=\"1.1\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" xmlns:xlink=\"http:\/\/www.w3.org\/1999\/xlink\" >rn    rn    <title>vox-mark<\/title>rn    rn    <defs><\/defs>rn    <g 
id=\"Page-1\" stroke=\"none\" stroke-width=\"1\" fill=\"none\" fill-rule=\"evenodd\" >rn        <path d=\"M239.811,0 L238.424,6 L259.374,6 C278.011,6 292.908,17.38 292.908,43.002 C292.908,56.967 287.784,75.469 276.598,96.888 L182.689,305.687 L159.283,35.693 C159.283,13.809 168.134,6 191.88,6 L205.854,6 L207.247,0 L1.409,0 L0,6 L13.049,6 C28.88,6 35.863,15.885 37.264,34.514 L73.611,385 L160.221,385 L304.525,79.217 C328.749,31.719 349.237,6 372.525,6 L384.162,6 L385.557,0 L239.811,0\" id=\"vox-mark\" fill=\"#444745\" ><\/path>rn    <\/g>rn<\/svg>&#8220;,&#8221;community_name&#8221;:&#8221;Vox&#8221;,&#8221;community_url&#8221;:&#8221;https:\/\/www.vox.com\/&#8221;,&#8221;cross_community&#8221;:false,&#8221;entry_count&#8221;:1518,&#8221;always_show&#8221;:false,&#8221;description&#8221;:&#8221;Finding the best ways to do good. &#8220;,&#8221;disclosure&#8221;:&#8221;&#8221;,&#8221;cover_image_url&#8221;:&#8221;&#8221;,&#8221;cover_image&#8221;:null,&#8221;title_image_url&#8221;:&#8221;https:\/\/cdn.vox-cdn.com\/uploads\/chorus_asset\/file\/16290809\/future_perfect_sized.0.jpg&#8221;,&#8221;intro_image&#8221;:null,&#8221;four_up_see_more_text&#8221;:&#8221;View All&#8221;,&#8221;primary&#8221;:true},{&#8220;base_type&#8221;:&#8221;EntryGroup&#8221;,&#8221;id&#8221;:27524,&#8221;timestamp&#8221;:1679485943,&#8221;title&#8221;:&#8221;Technology&#8221;,&#8221;type&#8221;:&#8221;SiteGroup&#8221;,&#8221;url&#8221;:&#8221;https:\/\/www.vox.com\/technology&#8221;,&#8221;slug&#8221;:&#8221;technology&#8221;,&#8221;community_logo&#8221;:&#8221;rn<svg width=\"386px\" height=\"385px\" viewBox=\"0 0 386 385\" version=\"1.1\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" xmlns:xlink=\"http:\/\/www.w3.org\/1999\/xlink\" >rn    rn    <title>vox-mark<\/title>rn    rn    <defs><\/defs>rn    <g id=\"Page-1\" stroke=\"none\" stroke-width=\"1\" fill=\"none\" fill-rule=\"evenodd\" >rn        <path d=\"M239.811,0 L238.424,6 L259.374,6 C278.011,6 292.908,17.38 292.908,43.002 C292.908,56.967 
287.784,75.469 276.598,96.888 L182.689,305.687 L159.283,35.693 C159.283,13.809 168.134,6 191.88,6 L205.854,6 L207.247,0 L1.409,0 L0,6 L13.049,6 C28.88,6 35.863,15.885 37.264,34.514 L73.611,385 L160.221,385 L304.525,79.217 C328.749,31.719 349.237,6 372.525,6 L384.162,6 L385.557,0 L239.811,0\" id=\"vox-mark\" fill=\"#444745\" ><\/path>rn    <\/g>rn<\/svg>&#8220;,&#8221;community_name&#8221;:&#8221;Vox&#8221;,&#8221;community_url&#8221;:&#8221;https:\/\/www.vox.com\/&#8221;,&#8221;cross_community&#8221;:false,&#8221;entry_count&#8221;:24324,&#8221;always_show&#8221;:false,&#8221;description&#8221;:&#8221;Uncovering and explaining how our digital world is changing \u2014 and changing us.&#8221;,&#8221;disclosure&#8221;:&#8221;&#8221;,&#8221;cover_image_url&#8221;:&#8221;&#8221;,&#8221;cover_image&#8221;:null,&#8221;title_image_url&#8221;:&#8221;&#8221;,&#8221;intro_image&#8221;:null,&#8221;four_up_see_more_text&#8221;:&#8221;View All&#8221;,&#8221;primary&#8221;:false},{&#8220;base_type&#8221;:&#8221;EntryGroup&#8221;,&#8221;id&#8221;:80311,&#8221;timestamp&#8221;:1679313494,&#8221;title&#8221;:&#8221;Artificial Intelligence&#8221;,&#8221;type&#8221;:&#8221;SiteGroup&#8221;,&#8221;url&#8221;:&#8221;https:\/\/www.vox.com\/artificial-intelligence&#8221;,&#8221;slug&#8221;:&#8221;artificial-intelligence&#8221;,&#8221;community_logo&#8221;:&#8221;rn<svg width=\"386px\" height=\"385px\" viewBox=\"0 0 386 385\" version=\"1.1\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" xmlns:xlink=\"http:\/\/www.w3.org\/1999\/xlink\" >rn    rn    <title>vox-mark<\/title>rn    rn    <defs><\/defs>rn    <g id=\"Page-1\" stroke=\"none\" stroke-width=\"1\" fill=\"none\" fill-rule=\"evenodd\" >rn        <path d=\"M239.811,0 L238.424,6 L259.374,6 C278.011,6 292.908,17.38 292.908,43.002 C292.908,56.967 287.784,75.469 276.598,96.888 L182.689,305.687 L159.283,35.693 C159.283,13.809 168.134,6 191.88,6 L205.854,6 L207.247,0 L1.409,0 L0,6 L13.049,6 C28.88,6 35.863,15.885 37.264,34.514 L73.611,385 
L160.221,385 L304.525,79.217 C328.749,31.719 349.237,6 372.525,6 L384.162,6 L385.557,0 L239.811,0\" id=\"vox-mark\" fill=\"#444745\" ><\/path>rn    <\/g>rn<\/svg>&#8220;,&#8221;community_name&#8221;:&#8221;Vox&#8221;,&#8221;community_url&#8221;:&#8221;https:\/\/www.vox.com\/&#8221;,&#8221;cross_community&#8221;:false,&#8221;entry_count&#8221;:338,&#8221;always_show&#8221;:false,&#8221;description&#8221;:&#8221;Vox&#8217;s coverage of artificial intelligence.&#8221;,&#8221;disclosure&#8221;:&#8221;&#8221;,&#8221;cover_image_url&#8221;:&#8221;&#8221;,&#8221;cover_image&#8221;:null,&#8221;title_image_url&#8221;:&#8221;&#8221;,&#8221;intro_image&#8221;:null,&#8221;four_up_see_more_text&#8221;:&#8221;View All&#8221;,&#8221;primary&#8221;:false},{&#8220;base_type&#8221;:&#8221;EntryGroup&#8221;,&#8221;id&#8221;:80357,&#8221;timestamp&#8221;:1679144423,&#8221;title&#8221;:&#8221;Emerging Tech&#8221;,&#8221;type&#8221;:&#8221;SiteGroup&#8221;,&#8221;url&#8221;:&#8221;https:\/\/www.vox.com\/emerging-tech&#8221;,&#8221;slug&#8221;:&#8221;emerging-tech&#8221;,&#8221;community_logo&#8221;:&#8221;rn<svg width=\"386px\" height=\"385px\" viewBox=\"0 0 386 385\" version=\"1.1\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" xmlns:xlink=\"http:\/\/www.w3.org\/1999\/xlink\" >rn    rn    <title>vox-mark<\/title>rn    rn    <defs><\/defs>rn    <g id=\"Page-1\" stroke=\"none\" stroke-width=\"1\" fill=\"none\" fill-rule=\"evenodd\" >rn        <path d=\"M239.811,0 L238.424,6 L259.374,6 C278.011,6 292.908,17.38 292.908,43.002 C292.908,56.967 287.784,75.469 276.598,96.888 L182.689,305.687 L159.283,35.693 C159.283,13.809 168.134,6 191.88,6 L205.854,6 L207.247,0 L1.409,0 L0,6 L13.049,6 C28.88,6 35.863,15.885 37.264,34.514 L73.611,385 L160.221,385 L304.525,79.217 C328.749,31.719 349.237,6 372.525,6 L384.162,6 L385.557,0 L239.811,0\" id=\"vox-mark\" fill=\"#444745\" ><\/path>rn    
<\/g>rn<\/svg>&#8220;,&#8221;community_name&#8221;:&#8221;Vox&#8221;,&#8221;community_url&#8221;:&#8221;https:\/\/www.vox.com\/&#8221;,&#8221;cross_community&#8221;:false,&#8221;entry_count&#8221;:154,&#8221;always_show&#8221;:false,&#8221;description&#8221;:&#8221;&#8221;,&#8221;disclosure&#8221;:&#8221;&#8221;,&#8221;cover_image_url&#8221;:&#8221;&#8221;,&#8221;cover_image&#8221;:null,&#8221;title_image_url&#8221;:&#8221;&#8221;,&#8221;intro_image&#8221;:null,&#8221;four_up_see_more_text&#8221;:&#8221;View All&#8221;,&#8221;primary&#8221;:false},{&#8220;base_type&#8221;:&#8221;EntryGroup&#8221;,&#8221;id&#8221;:102794,&#8221;timestamp&#8221;:1679313494,&#8221;title&#8221;:&#8221;Innovation&#8221;,&#8221;type&#8221;:&#8221;SiteGroup&#8221;,&#8221;url&#8221;:&#8221;https:\/\/www.vox.com\/innovation&#8221;,&#8221;slug&#8221;:&#8221;innovation&#8221;,&#8221;community_logo&#8221;:&#8221;rn<svg width=\"386px\" height=\"385px\" viewBox=\"0 0 386 385\" version=\"1.1\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" xmlns:xlink=\"http:\/\/www.w3.org\/1999\/xlink\" >rn    rn    <title>vox-mark<\/title>rn    rn    <defs><\/defs>rn    <g id=\"Page-1\" stroke=\"none\" stroke-width=\"1\" fill=\"none\" fill-rule=\"evenodd\" >rn        <path d=\"M239.811,0 L238.424,6 L259.374,6 C278.011,6 292.908,17.38 292.908,43.002 C292.908,56.967 287.784,75.469 276.598,96.888 L182.689,305.687 L159.283,35.693 C159.283,13.809 168.134,6 191.88,6 L205.854,6 L207.247,0 L1.409,0 L0,6 L13.049,6 C28.88,6 35.863,15.885 37.264,34.514 L73.611,385 L160.221,385 L304.525,79.217 C328.749,31.719 349.237,6 372.525,6 L384.162,6 L385.557,0 L239.811,0\" id=\"vox-mark\" fill=\"#444745\" ><\/path>rn    
<\/g>rn<\/svg>&#8220;,&#8221;community_name&#8221;:&#8221;Vox&#8221;,&#8221;community_url&#8221;:&#8221;https:\/\/www.vox.com\/&#8221;,&#8221;cross_community&#8221;:false,&#8221;entry_count&#8221;:135,&#8221;always_show&#8221;:false,&#8221;description&#8221;:&#8221;&#8221;,&#8221;disclosure&#8221;:&#8221;&#8221;,&#8221;cover_image_url&#8221;:&#8221;&#8221;,&#8221;cover_image&#8221;:null,&#8221;title_image_url&#8221;:&#8221;&#8221;,&#8221;intro_image&#8221;:null,&#8221;four_up_see_more_text&#8221;:&#8221;View All&#8221;,&#8221;primary&#8221;:false}],&#8221;internal_groups&#8221;:[{&#8220;base_type&#8221;:&#8221;EntryGroup&#8221;,&#8221;id&#8221;:112403,&#8221;timestamp&#8221;:1679486407,&#8221;title&#8221;:&#8221;Approach \u2014 Dissects something complicated&#8221;,&#8221;type&#8221;:&#8221;SiteGroup&#8221;,&#8221;url&#8221;:&#8221;&#8221;,&#8221;slug&#8221;:&#8221;approach-dissects-something-complicated&#8221;,&#8221;community_logo&#8221;:&#8221;rn<svg width=\"386px\" height=\"385px\" viewBox=\"0 0 386 385\" version=\"1.1\" xmlns=\"http:\/\/www.w3.org\/2000\/svg\" xmlns:xlink=\"http:\/\/www.w3.org\/1999\/xlink\" >rn    rn    <title>vox-mark<\/title>rn    rn    <defs><\/defs>rn    <g id=\"Page-1\" stroke=\"none\" stroke-width=\"1\" fill=\"none\" fill-rule=\"evenodd\" >rn        <path d=\"M239.811,0 L238.424,6 L259.374,6 C278.011,6 292.908,17.38 292.908,43.002 C292.908,56.967 287.784,75.469 276.598,96.888 L182.689,305.687 L159.283,35.693 C159.283,13.809 168.134,6 191.88,6 L205.854,6 L207.247,0 L1.409,0 L0,6 L13.049,6 C28.88,6 35.863,15.885 37.264,34.514 L73.611,385 L160.221,385 L304.525,79.217 C328.749,31.719 349.237,6 372.525,6 L384.162,6 L385.557,0 L239.811,0\" id=\"vox-mark\" fill=\"#444745\" ><\/path>rn    
[Lede image: A graphic with horizontal purple and green lines, over which "GPT-4" and a flower shape are imposed. Credit: NurPhoto via Getty Images]
Can society adjust at the speed of artificial intelligence? An AI safety expert on why GPT-4 is just the beginning.
Read more at the original article: https://www.vox.com/future-perfect/2023/3/18/23645013/openai-gpt4-holden-karnofsky-artificial-intelligence-ai-safety-existential-risk (posted by Stephania Block)