Following a wave of terrorist attacks in the UK in recent months, Google’s senior vice-president and general counsel, Kent Walker, took to one of the country’s leading publications to outline a plan to combat terrorists’ use of Google’s tools.
On Sunday, Walker posted an op-ed in the Financial Times that listed four distinct steps Google is taking to fight extremists who harness the power of tools like YouTube to spread their messages.
“We will now devote more engineering resources to apply our most advanced machine learning research to train new ‘content classifiers’ to help us more quickly identify and remove [extremist and terrorism-related videos],” said Walker.
“We will greatly increase the number of independent experts in YouTube’s Trusted Flagger programme,” Walker continued, offering detail on the changes behind the scenes at YouTube. “We will expand this programme by adding 50 expert NGOs to the 63 organisations who are already part of the programme, and we will support them with operational grants.”
These initiatives don’t just cover technical approaches. The company will now take a modified approach to judgment calls on content in general.
“We will be taking a tougher stance on videos that do not clearly violate our policies, for example, videos that contain inflammatory religious or supremacist content; in future these will appear behind a warning and will not be monetised, recommended, or eligible for comments or user endorsements.” While this particular measure is no guarantee that such content won’t still reach users, eliminating its monetization may help. “We think this strikes the right balance between free expression and access to information without promoting extremely offensive viewpoints.”
Lastly, Walker says the company will bolster existing efforts from its Creators for Change and Jigsaw projects to implement the “redirect method,” a program that uses AdWords and YouTube to debunk ISIS recruiting messages online.
Google’s public stance comes just weeks after UK Prime Minister Theresa May gave a speech in which she pushed for more internet regulation as a means to prevent terrorism. In that context, it appears that Google’s op-ed is (aside from its purely positive intentions) an attempt to steer the UK away from internet regulation that could take more control out of the hands of internet behemoths like Google.
To that end, Walker also mentioned that Google is working with Facebook, Microsoft, and Twitter to create an international forum devoted to combating terrorist activities online.
What this all means for freedom of speech is fascinating to consider: Do you trust curated censoring of some content to be in the hands of internet companies or the government? Both have shortcomings, but Google’s op-ed is a clear sign that this is no longer something the internet will “just work out” on its own; regulation is coming, from one side or the other.