Elon Musk Fires Twitter Team In Charge Of Handling Child Sexual Abuse

Elon Musk lies as he breathes, and his claim last week that addressing child exploitation content is Twitter’s “priority #1” was no exception. A new report in Wired this week shows that one team handling child sexual abuse content on the platform for the Asia Pacific region—which has a population of 4.3 billion and includes Japan, the country with the second most Twitter users in the world after the U.S.—is literally down to just one employee. While it’s unclear how many people were on this team before, Wired identified at least four employees who have publicly said they left Twitter this month.

Groups like the UK’s Internet Watch Foundation (IWF) and the U.S.-based National Center for Missing & Exploited Children (NCMEC) help tech companies monitor child sexual abuse content. But a platform like Twitter, which allows adult pornography but lacks the technology to reliably distinguish consenting adults from children without human staff, needs in-house help. Organizations like the IWF and NCMEC don’t have access to the internal data, detection code, and other tools Twitter can use to prevent child sexual abuse material from being shared in the first place.

The most recent data from Twitter about the prevalence of child sexual abuse content on the platform, collected between July and December 2021, shows the situation is urgent. Per the report, the company suspended over half a million accounts for disseminating child exploitation content during this period, marking a 31% increase from the previous six months.

Unchecked child sexual abuse content is already tangibly affecting Twitter’s business: both Dyson and Forbes have suspended advertising on the platform after their ads appeared directly next to child abuse content. Contrary to Musk’s pathetic rants accusing companies that refuse to advertise on Twitter of opposing free speech, not wanting to associate your brand with child sexual abuse actually sounds like free speech to me.

Even before Musk took over Twitter and gutted its staff (disproportionately in online safety, security, and content moderation teams), the company conceded back in April that it struggled to rein in child exploitation posts. The platform had considered starting an OnlyFans-esque program allowing users to monetize adult content, but nixed the plan because, according to an April 2022 internal report obtained by The Verge, “Twitter cannot accurately detect child sexual exploitation and nonconsensual nudity at scale.”

With virtually no staff left to manage the problem, the crisis will inevitably get worse. Last week, in a performative display of how much he supposedly cares about the issue, Musk tweeted that his followers should “reply in the comments” to flag any child abuse content they came across. “This question should not be a Twitter thread. That’s the very question he should be asking the child safety team he laid off. That’s the contradiction here,” Carolina Christofoletti, a child sexual abuse material researcher at the University of São Paulo in Brazil, told Wired about Musk’s approach.

Musk’s callous gutting of child safety professionals aside, essentially tweeting out, “Please share all the child porn here, thank you,” is… a choice. Recall that the noted businessman of the year wants us to pay him $8 per month to perform the labor of content creation and also, apparently, the labor of tracking down child abuse material.

Meanwhile, as some Twitter users say they’re reporting content promoting pedophilia only to be told no violations were found, Musk is busy cracking down on 15-year-olds who tweet, “maybe if musk didn’t buy avocado toast, he wouldn’t be asking us for $8.” He’s gone so far as to bolster his claims of caring about child safety by lying that he was the one who held his baby in his arms during the infant’s dying moments when, actually, it was his ex-wife, whom he called “emotionally manipulative” for openly mourning their child’s death.