According to two people familiar with the matter, Twitter Inc in recent days removed, at the request of new owner Elon Musk, a feature that offered suicide prevention hotlines and other safety resources to users searching for certain content.
After this story was published, Twitter’s head of trust and safety, Ella Irwin, told Reuters by email that the company had been updating and improving the prompts, and that they had been removed only temporarily while that work was carried out.
“We expect to have them back up next week,” she said.
The removal of the feature, known as #ThereIsHelp, had not been previously reported. It had placed contacts for support organizations in many countries at the top of specific searches, covering mental health, HIV, vaccines, child sexual exploitation, COVID-19, gender-based violence, natural disasters, and freedom of expression.
Its removal has heightened concerns about the safety of Twitter’s most vulnerable users. Musk has claimed that impressions, or views, of harmful content have declined since he took over in October, and has tweeted graphs showing a downward trend, even though researchers and civil rights groups have documented a rise in tweets containing racial slurs and other hateful content.
Internet firms like Twitter, Google, and Facebook have worked for years to point users to well-known resource providers like government hotlines when they believe someone may be in danger, in part owing to pressure from consumer safety groups.
According to company postings, Twitter debuted certain prompts around five years ago, and some of them were accessible in over 30 countries. Twitter stated that it had a duty to make sure customers could “reach and receive support on our service when they need it most” in one of its blog entries about the feature.
Alex Goldenberg, lead intelligence analyst at the nonprofit Network Contagion Research Institute, said prompts that had appeared in search results just days earlier had disappeared by Thursday.
He and colleagues published research in August showing that monthly mentions on Twitter of certain phrases related to self-harm had surged by nearly 500% compared with a year earlier, and that younger users were particularly at risk when exposed to such content.
Musk has criticized the previous owners’ handling of the issue and has said he wants to stamp out child sexual abuse content on Twitter. However, he has eliminated a sizable portion of the teams responsible for handling potentially offensive content.
Marian Romaine is a content writer and UX designer. She is passionate about crafting content with words and designs.