Computers don’t really make mistakes. To us, it looks like they do, but they are just following the code and logic their developers programmed in. Still, when they act in a way that is contrary to our commands, we’re inclined to blame the computer, as if it were supposed to give us the results we wanted regardless of how it’s programmed to work.

One such “mistake” was recently discovered when Twitter user @__TheProphecy watched a YouTube video in which someone ranted about the fact that if you search for “white American doctor” on Google’s image search, you’ll see a whole lot of pictures of African American doctors. So, she had to go look for herself.

Turns out, Google’s image search is having trouble giving proper results for “white American doctor”

Image credits: __TheProphecy

So, apparently, if you search for “white American doctor” on Google, you’ll be given a bunch of images of African American doctors. Funnily enough, if you google “black American doctor”, you’ll be given pretty much the same results. So, did some hacktivist truly do some computer wizardry to troll people, or is it by design?

Google’s Public Search Liaison Danny Sullivan explained this phenomenon when addressing why Google’s search engine used to return results for “Boston’s worst neighborhoods” when the search query was “Boston’s black neighborhoods”. The query is different, but the explanation is the same.

He said: “As it turns out, when people post images of white couples, they tend to say only ‘couples’ & not provide a race. But when there are mixed couples, then “white” gets mentioned. Our image search heavily depends on words—so when we don’t get the words, this can happen.”

Apparently, googling “white American doctor” resulted in African American doctors, for some reason

Image credits: __TheProphecy

This also worked with “white American nurse”

As soon as word got out, people started experimenting because, c’mon, it’s Google, it’s supposed to know everything perfectly

The Search Engine Journal elaborated on this by saying that Google is like a mirror: it reflects how people use search engines and how the web’s content is written: “This is at the heart of the keyword research, understanding what users mean when they type something and how often they type those words […]. The takeaway for these search results: text has a huge influence on image ranking.”
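The mechanism both Sullivan and the Search Engine Journal describe can be sketched with a toy example. This is not Google’s actual algorithm, just a minimal word-matching ranker, and the filenames and captions below are hypothetical, but it shows how images whose surrounding text happens to mention “white” (often mixed-race photos, per Sullivan) can outrank photos of white doctors whose captions never state a race:

```python
# Toy illustration (NOT Google's actual ranking system): score images
# purely by how many query words appear in their surrounding text.
# Filenames and captions are hypothetical examples of how people
# tend to label photos online.
images = [
    {"file": "doc1.jpg", "caption": "american doctor smiling at camera"},
    {"file": "doc2.jpg", "caption": "portrait of an american doctor"},
    {"file": "doc3.jpg", "caption": "white american doctor with black colleague"},
    {"file": "doc4.jpg", "caption": "black american doctor in a hospital"},
]

def rank(query, images):
    """Return filenames ordered by how many query words their caption contains."""
    words = set(query.lower().split())
    scored = [(sum(w in img["caption"].split() for w in words), img["file"])
              for img in images]
    # Highest-scoring images first; drop images matching no query words.
    return [f for score, f in sorted(scored, reverse=True) if score > 0]

# doc3.jpg ranks first: its caption is the only one containing all three
# query words, because the poster only mentioned race in a mixed photo.
print(rank("white american doctor", images))
```

Because most captions of white doctors simply say “doctor” with no race attached, a text-driven ranker has nothing to match “white” against except the captions that do mention it, which is exactly the behavior people observed.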

So, that’s why Google gives you results that you don’t expect to get. And it turns out this isn’t just about doctors: it happens with engineers too, and with pretty much anything Twitter user @plutopaula googled. Another user, @hyacinthgirl_, also pointed out that the same thing happens if you search for “portraits of European people”, as seen in this tweet.

And it’s not just that phrase, it’s also things like “portraits of European people”