Economy
EU gives ultimatum to Facebook, Twitter, Google on hate speech
The EU executive has presented guidelines and principles for online platforms “to increase the proactive prevention, detection and removal of illegal content inciting hatred, violence and terrorism online."
European regulators have been urging social media companies for years to remove violent, xenophobic and racist posts from their platforms in a timely manner, but their patience is running out.
The aim of the new guidelines and principles is to increase the proactive prevention, detection and removal of illegal content inciting hatred, violence and terrorism online.
“The increasing availability and spreading of terrorist material and content that incites violence and hatred online is a serious threat to the security and safety of EU citizens. It also undermines citizens' trust and confidence in the digital environment - a key engine of innovation, growth and jobs," the Commission said in a statement on Thursday.
“The Commission has decided to thoroughly tackle the problem of illegal content online. The situation is not sustainable: in more than 28% of cases, it takes more than one week for online platforms to take down illegal content," commented Mariya Gabriel, Commissioner for the Digital Economy and Society.
“The rule of law applies online just as much as offline. We cannot accept a digital Wild West, and we must act," added Vera Jourová, Commissioner for Justice, Consumers and Gender Equality.
She noted that the code of conduct she agreed with Facebook, Twitter, Google and Microsoft shows that a self-regulatory approach can serve as a good example and can lead to results. “However, if the tech companies don't deliver, we will do it."
Facebook, Google, Microsoft and Twitter promised in May 2016 to review a majority of hate speech flagged by users within 24 hours and to remove any illegal content.
The new guidance issued today calls on online platforms to further boost their efforts to prevent the spread of illegal content.
“Given their increasingly important role in providing access to information, the Commission expects online platforms to take swift action over the coming months, in particular in the area of terrorism and illegal hate speech - which is already illegal under EU law, both online and offline."
The German government approved a plan in April to start imposing fines of as much as EUR 50 million on social media companies if they fail to remove hate speech and fake news posts within 24 hours of being flagged; other illegal content must be deleted within seven days of reporting, CNN reported.
The Commission said it will carefully monitor progress made by the online platforms over the coming months and assess whether additional measures are needed to ensure the swift and proactive detection and removal of illegal content online, including possible legislative measures to complement the existing regulatory framework.
This work will be completed by May 2018.