Google doesn’t like to censor its results unless it feels it has to. That’s why the search engine has had so many run-ins over its autocomplete feature and other search results associated with individuals or groups of people.
The latest fiasco? Derogatory autocomplete results for UK cities when users typed a phrase like, “why is [city name] so full of …”
The Daily Mail described the results as “racist,” although it wouldn’t give details about exactly what they said or which groups were being targeted. And, according to reports, Google actually modified the results this time (“Google took immediate action against the Autocomplete suggestions and removed all of them from the site”), which is surprising given the many legal battles it has fought in countries all over the world to avoid modifying its search results so easily.
Maybe Google decided to pick its battles this time. To recap, some of the high-profile cases where its search results have come under fire include:
Does the latest compliance in the UK mean Google is softening its stance when it comes to censorship? Maybe in some instances.
Mostly, Google lets its algorithm serve up results, but in some cases it will manually block offensive material such as child pornography and, upon request, other material the company deems worth hiding.
As reported by the Daily Mail:
The company initially refused to sign up to an alert system that would block people from viewing vile images on the internet, saying it had its own methods of blocking such material. At the time, The Home Affairs Committee, which is chaired by Mr. Vaz, accused ‘complacent’ internet service providers and search engines of being ‘far too laid back about what takes place on their watch’. Speaking in July last year, he said: ‘The Prime Minister was right to highlight the responsibility of the internet service providers, search engines and social media sites.