There have been many valid criticisms of the Online Safety Act 2021 (‘the OSA’) since its inception, particularly concerning the eSafety Commissioner and the attention she has garnered over the past couple of years for her attempts to regulate online content.
In 2024, I co-authored critical submissions to the Parliamentary Statutory Review of the OSA, emphasising the need to strengthen its human rights considerations and advocating for greater judicial review protections, with specific reference to the eSafety Commissioner’s sweeping powers to issue notices while, in some instances, avoiding procedural fairness.
Procedural fairness in Australia is a fundamental right, underpinned by Section 75(v) of the Australian Constitution, which allows individuals or companies to seek judicial review of decisions where they believe procedural fairness has not been observed. In Saeed v Minister for Immigration and Citizenship (2010) 241 CLR 252, the High Court of Australia affirmed that procedural fairness is protected within the Australian legal system. Without this right, many Australians would be unable to challenge unjust or biased outcomes from decision-makers.
Our submissions were made at the height of the media attention surrounding the legal action against the social media platform ‘X’ and, by extension, Elon Musk. In that case, the eSafety Commissioner issued a removal notice under Section 109 of the OSA in response to videos depicting the stabbing of Bishop Mar Mari Emmanuel, in an attempt to remove access to content that supposedly promoted, incited, instructed in, or depicted abhorrent violence.
In commenting on the issuance of the notice in eSafety Commissioner v X Corp [2024] FCA 499, Justice Kennett acknowledged that ‘…there is widespread alarm at the prospect of a decision by an official of a national government restricting access to controversial material on the internet.’ In the Federal Court proceedings, Justice Kennett considered the impact of the notice and declined to enforce it, finding that requiring the global removal of the posts was not ‘reasonable’, and raising concerns about procedural fairness, including the limits on X’s ability to seek merits review of the Commissioner’s decision. The court ultimately sided with X and Musk, which, in my opinion, highlights that controversial material will inevitably exist online and that people expect to retain the ability to choose whether or not to engage with it.
Although there are many shortcomings in the OSA in its current form, it is hard to ignore the positive impact that the legislation can have when applied correctly. This was demonstrated recently in a decision handed down by the Federal Court of Australia on 26 September 2025.
This case was commenced by the eSafety Commissioner under Section 75 of the OSA, which prohibits the non-consensual posting of an intimate image online. This section provides vital safeguards that seek to protect Australian residents from abhorrent conduct such as ‘revenge porn’ posted by an ex-partner, or the newer trend of using artificial intelligence to superimpose real people onto generated bodies engaged in sexual activity.
In this case, the court considered a moving visual image that depicted, without their consent, several people’s genital areas and breasts while they were engaged in a sexual act ‘of a kind not ordinarily done in public in circumstances in which an ordinary reasonable person would reasonably expect to be afforded privacy’, as well as depicting them getting undressed.
The person who posted these images ignored several requests from the eSafety Commissioner to remove the content. The court found that they had contravened the OSA and ordered them to pay over $300,000 in penalties, reflecting the posting of 12 non-consensual deepfake images of six different Australian residents.
In coming to its decision, the court considered the deepfake sexual content and determined that it satisfied the definition of an ‘intimate image’ under the OSA. An ‘intimate image’ includes a still or moving visual image that depicts, or even appears to depict, a person in a state of undress or engaged in a sexual act of a kind not ordinarily done in public. It also includes an image that depicts a person’s genital area in circumstances in which an ordinary reasonable person would reasonably expect to be afforded privacy.
Although I have been critical of the OSA and the legislative overreach it grants the eSafety Commissioner, this case shines a bright light on the importance of the legislation in protecting Australian residents from despicable acts that can significantly harm the lives of people who have sexual content shared online without their consent.
Although the concept of ‘revenge porn’ has existed for many years, this case addresses the issue of AI-generated sexual content and provides a firm precedent for courts to rely on when dealing with a problem that I believe will only grow as AI technology continues its rapid development.
Regardless of the opinions you may hold about the OSA and the eSafety Commissioner, it is undeniable that this decision will have an overwhelmingly positive impact in protecting the rights of Australian residents against the increasing threat of AI-generated sexual content.
It is important that the Australian government takes a proactive approach to governing this issue before Australian courts find themselves overwhelmed with similar cases.