Google gets tough on extremist YouTube content

Google Inc. is stepping up its fight against the spread of extremist content on its YouTube service, rolling out new artificial-intelligence tools alongside additional human oversight to stop it faster. The internet giant will train new “content classifiers”, drawing on its machine-learning research, to help it identify and remove extremist content more quickly.
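Google has not published details of these classifiers, but the general idea can be illustrated with a small, hypothetical sketch: a text model that scores an upload's title or description and queues high-scoring items for human review. The training snippets, labels and threshold below are invented for illustration and do not reflect Google's actual system.

```python
# Hypothetical illustration only: a tiny "content classifier" that flags
# text snippets for human review. Google's real models are not public.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Toy training data: snippets labelled 1 (policy-violating) or 0 (benign).
texts = [
    "recruitment video urging viewers to join fighters abroad",  # made up
    "clip glorifying a violent attack on civilians",              # made up
    "cooking tutorial: how to make flatbread",                    # benign
    "documentary about post-war reconstruction",                  # benign
]
labels = [1, 1, 0, 0]

# TF-IDF features feeding a logistic-regression classifier.
model = make_pipeline(TfidfVectorizer(ngram_range=(1, 2)), LogisticRegression())
model.fit(texts, labels)

# Score new uploads; anything above the (assumed) threshold is queued for
# human review, mirroring the "machines flag, humans decide" workflow.
new_items = ["video urging viewers to join the ranks of fighters"]
scores = model.predict_proba(new_items)[:, 1]
for item, score in zip(new_items, scores):
    if score > 0.5:
        print(f"FLAG FOR REVIEW ({score:.2f}): {item}")
```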

YouTube, the online video platform owned by Google, has issued a statement outlining the measures it will take to combat extremist content.

Google is already using image-matching technology to block re-uploads of known terrorist content, the company noted.
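The company has not said exactly how this matching works, but a common approach to catching re-uploads is perceptual hashing: fingerprint frames of previously removed videos and flag new frames whose fingerprints are close. The sketch below uses a simple average hash; the hash database, file name and distance threshold are assumptions for illustration, not Google's implementation.

```python
# Hypothetical sketch of image matching for re-upload detection using a
# simple average hash. Production systems are far more robust.
from PIL import Image

def average_hash(path: str, size: int = 8) -> int:
    """Downscale to size x size grayscale and hash pixels against their mean."""
    img = Image.open(path).convert("L").resize((size, size))
    pixels = list(img.getdata())
    avg = sum(pixels) / len(pixels)
    bits = 0
    for p in pixels:
        bits = (bits << 1) | (1 if p > avg else 0)
    return bits

def hamming(a: int, b: int) -> int:
    """Number of differing bits between two fingerprints."""
    return bin(a ^ b).count("1")

# Fingerprints of frames from previously removed videos (placeholder values).
known_bad_hashes = {0x3C3C7E7E7E3C1800}

def is_reupload(frame_path: str, threshold: int = 5) -> bool:
    """Flag a frame whose hash is within `threshold` bits of known content."""
    h = average_hash(frame_path)
    return any(hamming(h, bad) <= threshold for bad in known_bad_hashes)

if __name__ == "__main__":
    # "uploaded_frame.jpg" is a hypothetical frame extracted from a new upload.
    print(is_reupload("uploaded_frame.jpg"))
```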

To step up its policing efforts, Google will almost double the number of independent experts it uses to flag problematic content and expand its work with counter-extremist groups to help identify content that may be used to radicalize and recruit terrorists.

Like Facebook, Google pledges to increase the use of its technology to help identify terrorism-related content.

In the United Kingdom, a parliamentary committee report published in May accused the industry – including Google – of prioritizing profit over user safety by continuing to host unlawful content.

The first step will be to use automated systems to better identify terror-related content.

It will also expand its cooperation with counter-extremist groups to identify content used to radicalize and recruit extremists.

It will expand this programme by adding 50 expert NGOs, which it will support with operational grants; most of the accounts YouTube already removes for terrorism are surfaced through this kind of expert flagging. These initiatives could help Google woo back major advertisers who began pulling back from YouTube earlier this year after learning that their brands sometimes appeared next to unsavory videos.

Videos that don’t clearly violate Google’s policies, such as ones “that contain inflammatory religious or supremacist content”, will soon be placed behind a full-page interstitial warning. The aim is to make these videos harder to find while still respecting the right to free expression.

Finally, YouTube will look to build on its Creators for Change programme, which promotes YouTube voices against hate and radicalisation. Put simply, targeted adverts will steer people drawn to terrorist recruiting material towards content that debunks those messages.

The company will also reach potential Islamic State recruits through targeted online advertising and redirect them towards anti-terrorist videos, in a bid to change their minds about joining IS.
