- Three members of Twitter’s Trust and Safety Council resigned this week.
- Under Musk’s ownership, Twitter has seen a drastic increase in hate speech, two watchdog groups have found.
- “Contrary to Elon Musk’s claims, the safety and well-being of Twitter users is on the decline,” the three departing members said in a statement.
Twitter’s exodus continues, with three members of its Trust and Safety Council becoming the latest to go.
Twitter’s Trust and Safety Council, formed in 2016, is made up of dozens of independent individuals and organizations that Twitter said help “advocate for safety and advise us as we develop our products, programs and policies.” The outgoing members are Anne Collier, founder and executive director of The Net Safety Collaborative; Eirliani Abdul Rahman, co-founder of Youth, Adult Survivors & Kin In Need (YAKIN); and Lesley Podesta, advisor to the Young and Resilient Research Centre at Western Sydney University.
“We are announcing our resignation from the Twitter Trust and Safety Council as research clearly shows that, contrary to Elon Musk’s claims, the safety and wellbeing of Twitter users is in decline,” they wrote in a statement shared by Collier on Thursday.
They were referring to research by two watchdog organizations, the Center for Countering Digital Hate and the Anti-Defamation League, which recently reported a sharp rise in hate speech — including slurs against Black people and gay people, as well as antisemitic posts — on Twitter since Musk bought the platform.
“The question was running through our minds: Should Musk be allowed to define digital security because he has freedom of speech? Our answer is a resounding ‘no’,” the statement from the outgoing board members continued. “A Twitter ruled by diktat is no place for us.”
Musk has called himself a “free speech absolutist,” a stance that experts and advocates say could weaken Twitter’s ability to effectively combat hate speech, misinformation, and harassment.
Council members have been “mystified by the lack of communication” since Musk took charge, according to Collier. She added that several changes Musk has made since taking over have raised alarms about safety on Twitter, including his elimination of outsourced content moderator positions. Twitter’s new head of trust and safety, Ella Irwin, told Reuters earlier this week that the company got rid of some manual reviews for content moderation, instead relying heavily on automation.
“You really need human review on a lot of abuse reports because they can be very nuanced and very contextual to offline life, and the platforms don’t really have that context,” Collier said. “So it’s very difficult for machine learning algorithms to detect it all or make decisions about it all.”
Then there is Musk’s new Twitter Blue subscription model, which allows people to purchase verification on the platform for $8 per month. Previously, Twitter verified the identity of users before giving them a blue checkmark.
“Verification on Twitter was supposed to be about credibility and accountability, and it’s not supposed to be something you can buy. If you just let people buy verification or credibility, you have no credibility,” Collier said. “So if someone sees a little blue checkmark, or really any kind of approval badge, the user doesn’t know what it means and can’t count on it.”
Collier told Insider she would consider returning to the council if a Musk-owned Twitter renewed its commitment to safety on the platform. But for now, as Abdul Rahman and Podesta echoed in the statement, they believe that vow has been broken.
“I followed with, dare I say, trepidation, the negotiations over Elon Musk’s purchase of Twitter,” Abdul Rahman said in the statement. “I had written down some commitments to myself at the time. If Musk crossed those thresholds, I figured I’d quit. Those red lines were crossed.”
In a statement to Insider, Podesta said, “The safety and protection of all users has always been paramount. Having a policy in place has always been key — it meant everyone knew how moderation decisions were made. This careful process appears to have failed. I am deeply saddened to see the rise in racist, violent, and hateful speech in recent months.”