Computers don’t really make mistakes. To us, it looks like they do, but they are just following the code and logic their developers programmed into them. Still, when they act contrary to our commands, we’re inclined to blame the computer, as if it were supposed to give us the results we wanted regardless of how it was programmed to work.
One such “mistake” was recently discovered when Twitter user @__TheProphecy watched a YouTube video in which someone ranted that searching for “white American doctor” on Google’s image search returns a whole lot of pictures of African American doctors. So, she went to look for herself.
Turns out, Google’s image search has trouble giving proper results for “white American doctor”
Image credits: __TheProphecy
So, apparently, if you search for “white American doctor” on Google, you’ll be shown a bunch of images of African American doctors. Funnily enough, if you google “black American doctor”, you’ll get pretty much the same results. So, did some hacktivist truly pull off some computer wizardry to troll people, or is it by design?
Google’s Public Search Liaison Danny Sullivan shed light on this phenomenon when explaining why Google’s search engine used to return results for “Boston’s worst neighborhoods” when the search query was “Boston’s black neighborhoods”. The conversation is different, but the explanation is the same.
He said: “As it turns out, when people post images of white couples, they tend to say only ‘couples’ & not provide a race. But when there are mixed couples, then “white” gets mentioned. Our image search heavily depends on words—so when we don’t get the words, this can happen.”
Apparently, googling “white American doctor” resulted in African American doctors, for some reason
Image credits: __TheProphecy
This also worked with “white American nurse”
As soon as word got out, people started experimenting because, c’mon, it’s Google, it’s supposed to know everything perfectly
The Search Engine Journal ventured to elaborate on this by saying that Google is like a mirror in the sense that it reflects how people use search engines and how the web’s content is written: “This is at the heart of the keyword research, understanding what users mean when they type something and how often they type those words […]. The takeaway for these search results: text has a huge influence on image ranking.”
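To make that takeaway concrete, here’s a minimal toy sketch of text-driven image ranking. This is not Google’s actual algorithm, and all the captions are invented for illustration; it just shows how ranking images by the words around them reproduces the effect when white subjects are captioned simply “doctor” while race is spelled out for everyone else.

```python
# Toy illustration (NOT Google's actual algorithm): rank images purely by
# how many query words appear in their surrounding text, as the
# explanations above describe. All captions are invented for the example.

def rank_images(query, captions):
    """Return captions ordered by word overlap with the query, highest first."""
    query_words = set(query.lower().split())
    scored = []
    for caption in captions:
        caption_words = set(caption.lower().split())
        overlap = len(query_words & caption_words)
        scored.append((overlap, caption))
    # Sort by overlap, descending; Python's sort is stable, so ties keep order
    scored.sort(key=lambda pair: pair[0], reverse=True)
    return [caption for _, caption in scored]

# Photos of white doctors tend to be captioned just "doctor";
# race tends to be mentioned only when the subject isn't white.
captions = [
    "doctor smiling at camera",
    "portrait of a doctor",
    "black american doctor in clinic",
    "african american doctor with patient",
]

print(rank_images("white american doctor", captions))
# The captions that spell out "american doctor" win on word overlap,
# so the query for a white doctor surfaces the explicitly labeled images first.
```

Because nobody wrote “white” in the first two captions, those images can never match that query word, so the explicitly labeled photos outrank them despite being the opposite of what the searcher meant.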
So, that’s why Google gives you results you don’t expect to get. And, it turns out, this isn’t just for doctors: it happens with engineers too, and with pretty much anything Twitter user @plutopaula would google. Another user, @hyacinthgirl_, also pointed out that the same thing happens if you search for “portraits of European people”, as seen in this tweet.
And it’s not just that phrase, it’s also things like “portraits of European people”
Well, turns out there’s an explanation for this as tweeted by Danny Sullivan, Google’s public search expert
Image credits: dannysullivan
Needless to say, this phenomenon became a hit online, with people googling anything they could think of in hopes of understanding just how confused Google’s search algorithms seem to be. Why not give it a go yourself and see what you find before they patch it up?
But before you do that, let us know what your take on this is in the comments section below!
Here’s how the internet reacted to this new bit of internet knowledge
Image credits: Mrc0by
Image credits: ArieBomaye
Image credits: thisaintadriana
Image credits: dneely9000
Image credits: cmtrygates
Image credits: dashofcinnam0n
Image credits: HanahakiBlue
Image credits: ShernitaSOfly
Image credits: JimmieBoswell
Image credits: t_real21