
Twitter added labels with additional context to around 300,000 tweets containing potentially misleading or unconfirmed content over the two weeks spanning the 2020 US presidential election, according to a Thursday report by CNN. Additionally, Twitter reportedly said more than 450 of the labeled tweets were concealed by a warning message and subject to limited retweeting. The analysis looked at tweets about the election posted between Oct. 27 and Nov. 11.
The social media company, along with other tech giants like Facebook and Google, has been working to battle election misinformation leading up to, during and after the election. This comes as some social media users, including President Donald Trump, have challenged the results of the election after it was called for Democrat Joe Biden, who won the popular vote by more than 5 million votes. Trump has been using social networks to claim, without evidence, that the 2020 election was "stolen" from him. Twitter has added warning labels to several of Trump's tweets, including one in which he wrongly claimed he'd won the election.
Twitter didn’t immediately respond to a request for comment.
The company also reportedly examined which measures helped curb misinformation and which didn't. Removing user recommendations on who to follow, for example, apparently didn't have a meaningful impact on election misinformation, and Twitter will restore that feature on Thursday, according to CNN. It will also reportedly undo a change it rolled out during the election in which only topics with additional in-line context were shown on the "For You" tab featuring trending topics.