Facebook Inc said on Wednesday (May 26) it would take "stronger" action against people who repeatedly share misinformation on the platform. Facebook will reduce the distribution of all posts in its news feed from a user account if it frequently shares content that has been flagged as false by one of the company's fact-checking partners, the social media giant said in a blog post.



It added that it was also launching ways to inform people when they are interacting with content that has been rated by a fact-checker. False claims and conspiracy theories have proliferated on social media platforms, including Facebook and Twitter, during the COVID-19 pandemic.


"Whether it`s false or misleading content about COVID-19 and vaccines, climate change, elections or other topics, we`re making sure fewer people see misinformation on our apps," the company said in a statement.


Earlier this year, Facebook said it took down 1.3 billion fake accounts between October and December, ahead of an inspection by the U.S. House Committee on Energy and Commerce into how technology platforms are tackling misinformation.

