Twitter laid out a detailed plan to curb hate and abuse on its platform Thursday, revealing that it will soon ban groups that promote violence and bar users from using hate symbols as their profile pictures. The company also plans to improve the abuse reporting process and implement steps that could lead to many more bans.
“We’re updating our approach to make Twitter a safer place. This won’t be a quick or easy fix, but we’re committed to getting it right,” the company’s safety team wrote in a blog post. “Far too often in the past we’ve said we’d do better and promised transparency but have fallen short in our efforts. Starting today, you can expect regular, real-time updates about our progress.”
As part of these updates, Twitter released a calendar of planned changes Thursday, detailing steps it wants to take over the next three months. These include suspending accounts that share non-consensual nudity, educating users about the existing rules, and banning abusive usernames.
Twitter also plans to update its existing rules in November, at which point it will start to suspend new classes of accounts, including those run by “organizations that use violence to advance their cause.” The changes will also make it a violation of Twitter’s rules to use swastikas and other hate symbols in profile pictures. “Tweets containing this type of content will be placed behind an interstitial,” according to the outline shared Thursday.
Perhaps the biggest change will come in December. At that point, the company plans to ban not only direct threats of violence but also tweets that condone violence. Many Twitter users have complained that their harassment reports haven’t resulted in action from the company. If the company follows through on these rules, it could crack down on far more abusive behavior on its platform.
However, some critics have contended that the company doesn’t need stricter rules so much as more consistent enforcement of its existing ones. Twitter said Thursday that it wants to address these complaints by being more transparent about the way it enforces its rules and by implementing technology that would help prioritize abuse reports.