(Reuters) – YouTube said on Wednesday it would start removing content that falsely alleges widespread fraud changed the outcome of the U.S. presidential election, in a change to its more hands-off stance on videos making similar claims.
The update comes a day after “safe harbor” day, a deadline set by U.S. law for states to certify the results of the presidential election. YouTube said it would start enforcing the policy in line with its approach towards historical U.S. presidential elections.
Online platforms have been under pressure to police misinformation about the election on their sites.
YouTube, owned by Alphabet Inc’s Google, was widely seen as taking a more hands-off approach than Facebook Inc and Twitter Inc, which started labeling content with election misinformation, though YouTube labels all election-related videos.
Last month, a group of Democratic senators asked YouTube to commit to removing content containing false or misleading information about the 2020 election outcome and the upcoming Senate run-off elections in Georgia.
After the November election, Reuters identified several YouTube channels making money from ads and memberships that were amplifying debunked accusations about voting fraud.
YouTube said in a blog post on Wednesday that since September it has removed over 8,000 channels and thousands of misleading election-related videos for violating its existing policies. (https://bit.ly/37PCGy0)
The company said more than 70% of recommendations on election-related topics came from authoritative news sources. YouTube also said that since Election Day, fact-check information panels had triggered over 200,000 times on election-related search results.
(Reporting by Elizabeth Culliford and Ayanti Bera; Editing by Shounak Dasgupta)