Google is like those annoying people who finish your sentences. You start to type into the search bar and before you finish the first word or two, it’s suggesting the rest of what it thinks you want to find out.
It’s not mind-reading and it’s not suggestion. It’s called “auto-complete”: an algorithm that predicts and displays possible searches, so you can choose words that roughly match what you wanted to search — or switch to what everyone else wants to know.
UN Women used the auto-complete function in their new advertising campaign to expose sexism. They Googled phrases like “women should” and showed how they were auto-completed with “stay at home”, “be slaves” and “be in the kitchen.”
Such staggeringly sexist second-guessing is the result of aggregating an estimated 3-billion Google searches per day. It’s what people really think and want to know. Internet searches have apparently been linked to international trends on issues ranging from epidemics to unemployment. So that’s what’s trending in terms of online views of women.
My New Old Self decided to explore online views of old people and Googled the words “old people should”. (Warning to sensitive older readers: this gets ugly.) Google’s first auto-completion was “die”. Next was “be euthanized”.
Inspired by the anti-sexist posters from UN Women, My New Old Self offers a series of images that use Google’s auto-completion to expose ageism and discrimination.
Auto-completion can be used to detect any kind of prejudice. The UN Human Rights Office’s Free and Equal campaign Googled “gays should” and got “be killed”. The auto-completion of “gays shouldn’t” was “be allowed to marry”.
Now we’ve reached that proverbial slippery slope. You can carry on Googling in the name of exposing intolerance. Type “blacks should” and Google auto-completes “leave America”. Search “whites should” and get “go back to Europe”. (You also get “be washed in” – exposing the apartheid between whites and coloureds on laundry day.)
So racism, sexism, homophobia and ageism are rampant on the net. What is to be done?
Shouldn’t we rather ask why we’re looking to robots for insights into gender, race and mortality? And why we tolerate our searches being tracked and our personal data harvested?
The web is full of complaints about the inane things that people ask Google, from “the meaning of life” to “what is the only”. There’s even a site that presents mash-ups of auto-completed thoughts as found poetry, calling itself Google Poetics.
Google’s algorithm offers searches after just a few keystrokes when typing in the search box, in an attempt to predict what the user wants to type. The combination of these suggestions can be funny, absurd, Dadaistic — and sometimes even deeply moving.
– Sampsa Nuotio and Raisa Omaheimo, Google Poetics
Have I been moved by Goo-etry? Sure, after exiting Google I left the room. To get some fresh air and complete my own thoughts.