Australia's eSafety Commissioner accuses the social platform of a growing inability to effectively tackle online hate.
Australia has criticized X, formerly known as Twitter, for deep cuts to its trust and safety resources that have hampered its ability to address harmful content on its platform.
The Australian eSafety Commissioner has published a transparency report detailing the cuts the microblogging site has made to its trust and safety teams since its acquisition in October 2022. It is the first time specific figures have been disclosed about where the cuts were made, and it follows a legal notice issued to X Corp requiring it to report on its efforts to comply with Australia's online safety regulations.
eSafety's transparency report summarizes X's response, which shows significant reductions in the company's safety and public policy personnel.
Globally, X cut its trust and safety staff by 30%, with a steeper 45% reduction in the Asia-Pacific region. The number of engineers assigned to trust and safety was slashed by 80% worldwide, while the number of content moderators employed by X fell by 52%, according to eSafety.
Public policy personnel were cut by 78% globally and 73% in the Asia-Pacific region, with X's Australian public policy team eliminated entirely.
The government agency remarked, “Companies with a diminished number of trust and safety personnel may find themselves lacking the capacity to effectively address online hate and other forms of online harm.” It emphasized that this often shifts the burden of safety onto the users and groups experiencing abuse, rather than the platform taking responsibility for harmful content and behavior on its service.
The agency also noted that the median time X takes to respond to user reports has slowed by 20% since the acquisition, and that responses to reports about direct messages have slowed even further, by 75%.
“Prompt action on user reports is particularly crucial since X relies exclusively on user reports to identify hateful conduct in direct messages,” emphasized eSafety.
As of May 2023, X was not using automated tools to detect volumetric attacks that violate its targeted harassment policy, nor was it blocking hyperlinks to websites hosting harmful content.
Between November 2022 and May 2023, the social media platform reinstated 6,103 previously banned accounts, a figure eSafety believes relates to Australian accounts rather than a global total. Media reports estimated that more than 62,000 previously suspended accounts were reinstated worldwide during the same period.
Of the 6,103 reinstated accounts thought to relate to Australia, eSafety identified 194 that had previously been suspended for hateful conduct violations. The agency noted that X did not subject reinstated accounts to any additional scrutiny.
“It’s nearly unavoidable for any social media platform to become more toxic and less secure for users when significant reductions are made to safety and local public policy personnel, combined with the reinstatement of thousands of previously banned accounts,” stated eSafety Commissioner Julie Inman Grant. “It’s really creating a perfect storm.”
Inman Grant also highlighted X's delayed response to user reports of online hate on its platform, with some users waiting up to 28 hours for a response to reports about their direct messages.
eSafety confirmed that it had issued a notification to X Corp indicating the company's failure to comply with the notice under the Online Safety Act.
The agency has also recently initiated civil penalty proceedings against X Corp for allegedly failing to comply with an earlier reporting notice, issued in February 2023, concerning the company's adherence to Australia's online safety regulations on child sexual exploitation and abuse material and activity.
In September 2023, X Corp was issued an infringement notice of AU$610,500 for failing to comply with the February 2023 notice. X Corp has not paid the notice and has instead sought judicial review of eSafety's reliance on the transparency notice, the agency said, adding that it has asked for the judicial review to be heard alongside the civil penalty proceedings.
Upon sending inquiries to X, ZDNET received what appeared to be an automated email response: “Busy now, please check back later.”
In February, eSafety issued legal notices to several social media platforms, including Twitter, Google, TikTok, Twitch, and Discord, seeking information on the measures each platform was implementing to address issues such as child sexual exploitation and abuse, sexual extortion, and the promotion of harmful content by algorithms.
Last August, the Australian Broadcasting Corporation (ABC) reduced its presence on X, retaining only four accounts, citing trust concerns and a strategic decision to focus on platforms where its audience is more engaged.