{"id":862192,"date":"2025-07-13T07:11:47","date_gmt":"2025-07-13T12:11:47","guid":{"rendered":"https:\/\/newsycanuse.com\/index.php\/2025\/07\/13\/uk-online-safety-regime-ineffective-on-misinformation-mps-say\/"},"modified":"2025-07-13T07:11:47","modified_gmt":"2025-07-13T12:11:47","slug":"uk-online-safety-regime-ineffective-on-misinformation-mps-say","status":"publish","type":"post","link":"https:\/\/newsycanuse.com\/index.php\/2025\/07\/13\/uk-online-safety-regime-ineffective-on-misinformation-mps-say\/","title":{"rendered":"UK online safety regime ineffective on misinformation, MPs say"},"content":{"rendered":"<div id=\"content-header\">\n<h2>A report from the Commons Science, Innovation and Technology Committee outlines how the Online Safety Act fails to deal with the algorithmic amplification of \u2018legal but harmful\u2019 misinformation<\/h2>\n<\/div>\n<div id=\"content-center\">\n<ul>\n<li><i data-icon=\"1\"><\/i><\/li>\n<li><i data-icon=\"2\"><\/i><\/li>\n<\/ul>\n<div id=\"contributors-block\">\n<p><img decoding=\"async\" src=\"https:\/\/cdn.ttgtmedia.com\/rms\/computerweekly\/Sebastian-Klovig-Skelton-CW-contributor.jpg\" alt=\"Sebastian Klovig Skelton\">\n\t\t\t\t\t<\/p>\n<p><span>By<\/span><\/p>\n<ul>\n<li>\n\t\t\t\t\t<a href=\"https:\/\/www.techtarget.com\/contributor\/Sebastian-Klovig-Skelton\">Sebastian Klovig Skelton,<\/a><br \/>\n\t\t\t\t\t\t<span>Data &#038; ethics editor<\/span>\n\t\t\t\t\t\t<\/li>\n<\/ul>\n<p>\n\tPublished: <span>11 Jul 2025 11:45<\/span>\n<\/p>\n<\/div>\n<section id=\"content-body\">\n<p>The UK\u2019s Online Safety Act (OSA) is failing to address \u201calgorithmically accelerated misinformation\u201d on social media platforms, leaving the public vulnerable to a repeat of the 2024 <a href=\"https:\/\/www.computerweekly.com\/news\/366605410\/Campaigners-criticise-Starmer-post-riot-public-surveillance-plans\">Southport riots<\/a>, MPs have warned.<\/p>\n<p>Following an inquiry into online <a 
href=\"https:\/\/www.computerweekly.com\/news\/366618096\/Davos-2025-Misinformation-and-disinformation-are-most-pressing-risk-says-World-Economic-Forum\">misinformation<\/a>\u00a0and\u00a0<a href=\"https:\/\/www.computerweekly.com\/feature\/Auditing-for-algorithmic-discrimination\">harmful algorithms<\/a>, the Commons Science, Innovation and Technology Committee (SITC) has identified \u201cmajor holes\u201d in the UK\u2019s online safety regime when it comes to dealing with the viral spread of false or harmful content.<\/p>\n<p>Highlighting the July 2024 Southport riots as an example of how \u201conline activity can contribute to real-world violence\u201d, the SITC warned in a <a href=\"https:\/\/publications.parliament.uk\/pa\/cm5901\/cmselect\/cmsctech\/441\/report.html\">report<\/a> published on 11 July 2025 that while many parts of the OSA <a href=\"https:\/\/www.computerweekly.com\/news\/366620919\/Online-Safety-Act-measures-come-into-effect\">were not fully in force at the time of the unrest<\/a>, \u201cwe found little evidence that they would have made a difference if they were\u201d.<\/p>\n<p>It said this was due to a mixture of factors, including <a href=\"https:\/\/www.computerweekly.com\/news\/366623592\/Government-and-Ofcom-disagree-about-scope-of-Online-Safety-Act#:~:text=Committee%20chair%20Chi%20Onwurah%2C%20however%2C%20said%20it%20would%20be%20difficult%20to%20prove%20this%20intent%2C%20and%20highlighted%20that%20there%20are%20no%20duties%20on%20Ofcom%20to%20take%20action%20over%20misinformation%2C%20even%20if%20there%20are%20codes%20about%20misinformation%20risks.\">weak misinformation-related measures<\/a> in the act itself, as well as the business models and opaque recommendation algorithms of social media firms.<\/p>\n<p>\u201cIt\u2019s clear that the Online Safety Act just isn\u2019t up to scratch,\u201d said SITC chair Chi Onwurah. 
\u201cThe government needs to go further to tackle the pervasive spread of misinformation that causes harm but doesn\u2019t cross the line into illegality. Social media companies are not just neutral platforms but actively curate what you see online, and they must be held accountable. To create a stronger online safety regime, we urge the government to adopt five principles as the foundation of future regulation.\u201d<\/p>\n<p>The five principles are public safety; free and safe expression; responsibility (for both end users and the platforms themselves); control of personal data; and transparency.<\/p>\n<p>The SITC also made specific recommendations, such as creating \u201cclear and enforceable standards\u201d for the digital advertising ecosystem that incentivises the amplification of false information, and introducing new duties for platforms to assess and deal with misinformation-related risks. \u201cIn order to tackle amplified disinformation \u2026 the government and Ofcom should collaborate with platforms to identify and track disinformation actors, and the techniques and behaviours they use to spread adversarial and deceptive narratives online,\u201d said MPs.<\/p>\n<section data-menu-title=\"Business models and opaque algorithms\">\n<h2><i data-icon=\"1\"><\/i>Business models and opaque algorithms<\/h2>\n<p>According to the SITC, social media companies have \u201coften enabled or even encouraged\u201d the viral spread of misinformation \u2013 and may have profited from it \u2013 as a result of their advertising and engagement-based business models.<\/p>\n<p>\u201cThe advertisement-based business models of most social media companies mean that they promote engaging content, often regardless of its safety or authenticity,\u201d MPs wrote. 
\u201cThis spills out across the entire internet, via the opaque, under-regulated digital advertising market, incentivising the creation of content that will perform well on social media.\u201d<\/p>\n<p>They added that while major tech companies told the committee there are no incentives to allow harmful content on their platforms, as it can damage the brand and repel advertisers, \u201cpolicymaking in this space has lacked a full evidence base\u201d because the inner workings of social media recommendation algorithms are not disclosed by the firms.<\/p>\n<p>\u201cWe asked several tech companies to provide high-level representations of their recommendation algorithms to the committee, but they did not,\u201d they said, adding that this \u201cshortfall in transparency\u201d makes it difficult to establish clear causal links between specific recommendations and harms.<\/p>\n<p>\u201cThe technology used by social media companies should be transparent, explainable and accessible to public authorities,\u201d they said.<\/p>\n<p>The SITC added that the government should create measures to compel social media platforms to embed tools in their systems that can identify and algorithmically deprioritise fact-checked misleading content, or content that cites unreliable sources, where it has the potential to cause significant harm.<\/p>\n<p>\u201cIt is vital that these measures do not censor legal free expression, but apply justified and proportionate restrictions to the spread of information to protect national security, public safety or health, or prevent disorder or crime,\u201d said MPs.<\/p>\n<p>On tackling the underlying business models that incentivise misinformation, MPs said there is a regulatory gap around digital advertising, as the focus is currently on harmful advertising content rather than \u201cthe monetisation of harmful content through advertising\u201d.<\/p>\n<p>\u201cThe government should create a new arms-length body \u2013 not funded by industry \u2013 to 
regulate and scrutinise the process of digital advertising, covering the complex and opaque automated supply chain that allows for the monetisation of harmful and misleading content,\u201d they added. \u201cOr, at the least, the government should extend Ofcom\u2019s powers to explicitly cover this form of harm, and regulate based on the principle of preventing the spread of harmful or misleading content through any digital means, rather than limiting itself to specific technologies or sectors.\u201d<\/p>\n<p>While generative artificial intelligence (GenAI) only played a marginal role in the spread of misinformation before the Southport riots, the SITC expressed concern about the role it could play in a \u201cfuture, similar crisis\u201d.<\/p>\n<p>They said GenAI\u2019s \u201clow cost, wide availability and rapid advances means that large volumes of convincing deceptive content can increasingly be created at scale\u201d.<\/p>\n<p>It said the government should therefore pass legislation that covers GenAI platforms, in line with other online services that pose a high risk of producing or spreading illegal or harmful content.<\/p>\n<p>\u201cThis legislation should require generative AI platforms to: provide risk assessments to Ofcom on the risks associated with different prompts and outputs, including how far they can create or spread illegal, harmful or misleading content; explain to Ofcom how the model curates content, responds to sensitive topics and what guardrails are in place to prevent content that is illegal or harmful to children; implement user safeguards such as feedback, complaints and output flagging; and prevent children from accessing inappropriate or harmful outputs.\u201d<\/p>\n<p>They added that all AI-generated content should be automatically labelled as such \u201cwith metadata and visible watermarks that cannot be removed\u201d.<\/p>\n<\/section>\n<\/section>\n<\/div>\n<p><a href=\"https:\/\/www.computerweekly.com\/news\/366627535\/UK-online-safety-regime-ineffective-on-misinformation-MPs-say\" class=\"button purchase\" rel=\"nofollow noopener\" target=\"_blank\">Read More<\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>A report from the Commons Science, Innovation and Technology Committee outlines how the Online Safety Act fails to deal with the algorithmic amplification of \u2018legal but harmful\u2019 misinformation By Sebastian Klovig Skelton, Data &amp; ethics editor Published: 11 Jul 2025 11:45 The UK\u2019s Online Safety Act (OSA) is failing to address \u201calgorithmically accelerated misinformation\u201d 
on<\/p>\n","protected":false},"author":1,"featured_media":862193,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"footnotes":""},"categories":[1538,3166,46],"tags":[],"class_list":{"0":"post-862192","1":"post","2":"type-post","3":"status-publish","4":"format-standard","5":"has-post-thumbnail","7":"category-online","8":"category-safety","9":"category-technology"},"aioseo_notices":[],"_links":{"self":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts\/862192","targetHints":{"allow":["GET"]}}],"collection":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/comments?post=862192"}],"version-history":[{"count":0,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/posts\/862192\/revisions"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/media\/862193"}],"wp:attachment":[{"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/media?parent=862192"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/categories?post=862192"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/newsycanuse.com\/index.php\/wp-json\/wp\/v2\/tags?post=862192"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}