X reinstated 6,103 banned accounts in Australia including 194 barred for hateful conduct

Elon Musk’s X social media platform let go 80% of its safety engineers and reinstated thousands of banned accounts in Australia, including nearly 200 previously barred for hateful conduct, in changes the country’s eSafety Commissioner labelled a “perfect storm” for online abuse issues.

The disclosures to the Australian internet safety office are the first specific details the company has shared about its online safety processes since the October 2022 takeover of Twitter by the world’s richest man. X Corp, as the company is now known, told the commissioner it had reduced its global trust and safety staff by a third and did not place previously banned accounts under additional scrutiny once they were reinstated.

Julie Inman Grant, the eSafety commissioner, called the figures “startling”. “You’re creating the perfect storm – reducing your defences, bringing back former users with a history of repeat abuse, and you’re not putting any mitigating factors in to remediate the harm,” she told Guardian Australia.

“That’s obviously of concern to us as an online regulator.”

The disclosures from X come in a report from the eSafety office, after it issued a legal notice in June 2023 under the Online Safety Act, requiring the company to explain how it was addressing online hate on its platform.

The act allows eSafety to compel online platforms to respond to such requests. The regulator deemed X’s initial responses non-compliant with the legal notice and pursued further regulatory processes to obtain more information.

Musk’s purchase of Twitter was soon followed by large layoffs of staff, prompting concerns about content moderation on the platform.

Inman Grant said the eSafety office had seen “an immediate increase in reports of online hate and abuse” shortly after that time.

In responses to eSafety, X disclosed that its number of engineers focused on trust and safety issues had decreased from 279 in October 2022, at the time of Musk’s acquisition of the company, to 55 in May 2023.

The total number of trust and safety staff globally fell from 4,062 to 2,849 over the same period, a drop of 30%. In the Australia Pacific region, trust and safety staff decreased from 111 to 61, a 45% drop.

Global content moderators, a combination of full-time and contractor staff, went from 2,720 in October 2022 (107 full-time and 2,613 contractors) to 2,356 in May 2023 (27 full-time and 2,305 contractors), according to the eSafety data.

Guardian Australia contacted X for comment. An auto-reply from its email contact for press inquiries responded: “Busy now, please check back later.”

X’s data showed median waiting times for the company to respond to user reports of hateful conduct had increased since October 2022.

“eSafety understands from this information that Twitter does not consider that the vast majority of user reports of hateful conduct breach its terms and policies. From X Corp’s response, there was no significant difference in how reports were treated before and after its acquisition,” eSafety said in its report.


After Musk’s acquisition, X reinstated large numbers of previously banned accounts: 6,103 in total, including 194 that had been suspended for hateful conduct violations. The eSafety office said X did not provide further detail about the reinstatements, but it believed the figures related specifically to Australian accounts.

“X Corp responded that Twitter did not place reinstated accounts under additional scrutiny,” the report said.

Inman Grant was concerned by the reduction in policy and moderation staff at X, likening such employees to “traffic enforcement and accident response” officers on roads.

She said platforms should be investing more in such staff, not less.

“We should expect more innovation, more investment, more improvement of people, processes and practices. It makes good business sense,” Inman Grant said.

“Public safety is closely tied to brand safety – a lot of these platforms rely on advertisers, we shouldn’t be surprised that not only [do] users walk, but advertisers do too.”

The eSafety commissioner has been engaged in ongoing disputes with X over its online safety obligations, including issuing it a $610,500 fine under the Online Safety Act after the company ignored questions about how it was cracking down on child sexual abuse material on the platform.

In November, X was removed from Australia’s voluntary misinformation and disinformation code after it failed to respond to a complaint that it had shut down channels for users to report misinformation during the voice to parliament referendum.
