EC Asks Google, Twitter, Facebook and YouTube to Remove Extremist Content
The European Commission’s president said on Wednesday that Google, Facebook, and Twitter must remove extremist content from their platforms within an hour or face hefty fines. In his annual State of the Union address, Jean-Claude Juncker said an hour was "a decisive time window."
The European Commission told these companies in March that they had three months to show they were removing extremist content, or face legislation forcing them to do so. The Commission wants content advocating extremist offences, or showing how to commit such acts, to be removed from their respective platforms within an hour of receiving a removal order from national authorities.
Under the proposal, which will need support from EU countries and the European Parliament, internet platforms will also be required to take proactive measures, such as developing new tools to weed out abuse and providing human oversight of content.
According to the Commission, service providers will also have to publish annual transparency reports detailing their efforts to tackle these problems. Those that fail to remove content promoting extremism could be fined up to four per cent of their annual global turnover.
Facebook responded: “There is no place for terrorism on Facebook, and we share the goal of the European Commission to fight it, and believe that it is only through a common effort across companies, civil society and institutions that results can be achieved.
“We’ve made significant strides finding and removing terrorist propaganda quickly and at scale, but we know we can do more,” the company added.
A spokesperson for YouTube said the site “shared the European Commission’s desire to react rapidly to terrorist content and keep violent extremism off our platforms. That’s why we’ve invested heavily in people, technology and collaboration with other tech companies on these efforts.”
The industry has also been working since December 2015 in a voluntary partnership to stop the misuse of the internet by international extremist groups, later creating a “database of hashes” to better detect extremist content.
The Commission will also retain the voluntary code of conduct on hate speech agreed with Facebook, Microsoft, Twitter and YouTube in 2016. Other companies have since announced plans to join it.