
Limitations of the Online Safety Act 2025

Updated: Sep 17

The Online Safety Act has significant limitations: it fails to protect against misinformation and against illegal content shared in private communications, its age verification requirements raise concerns about invasive methods and threats to free speech, and it does not address the platform business models that incentivize problematic content. The legislation does not mandate the moderation of private messages, leaving a loophole through which illegal material can spread, and it relies on platforms themselves to define and police illegal content, raising concerns about censorship and errors of judgment.


Scope and Content Limitations


Misinformation


The Act was not designed to tackle the viral spread of misinformation, and further legislation is needed to address this issue effectively, according to some MPs.


Private Communications


The Act does not mandate the moderation of content in private communications, allowing illegal content to be shared in private online spaces.


Item-by-Item Approach


The legislation takes an item-by-item approach to content, meaning that different types of illegal or harmful content may be treated differently, leading to a patchy response.


Free Speech and Privacy Concerns


Age Verification


The use of digital age checks, while intended to protect children, raises concerns about privacy, security breaches, potential errors, digital exclusion, and invasive data collection.


Threat to Freedom of Expression


The Act's requirement for large platforms to remove or restrict access to "harmful" but legal content risks censoring protected speech.


Outsourcing of Decision-Making


The Act places the onus on companies to determine what is illegal, which, according to some organisations, could lead to over-censorship and potential human rights violations.


Platform Accountability and Loopholes


Business Model Issues


The Act fails to address the underlying business models of tech companies, which can create power imbalances and challenges in protecting human rights.


Lack of Proactive Measures


The rules-based approach may allow platforms to comply without actively and effectively addressing the identified harms in their risk assessments.


Private Communications as Safe Havens


The lack of regulation for private communications creates a "safe harbour" for criminals to share child sexual abuse material.




We are seeking answers as to how the Online Safety Act impacts all our users.

1 Comment


Mart Lee
Sep 16

It shows all very good work and shows how hard we are all working.

