Robert Epstein, Senior Research Psychologist at the American Institute for Behavioral Research and Technology, is creeped out by Google. He finds the giant's ability to gather information on users and employ that information to deliver tailored online experiences and ads to be disturbing.
Google, he says in an article for tech-focused website Hackernoon, “is actually one of the most rapacious and deceptive companies ever created.”
With that less-than-objective perspective, Mr. Epstein set out in 2016 to test the search engine's ability to deliver unbiased content surrounding the year's most hot-button topic: the upcoming presidential election.
He carried out his test by launching a secretive project to build and employ a tool for watching over users' shoulders as they conducted Internet searches. Using a team of “field agents” who agreed to have this program run on their computers as they conducted searches using election-themed search terms, he tried to determine if Google, Yahoo, and Bing showed any bias toward one candidate or another.
Mr. Epstein claims that he did uncover signs of bias in the search results. What it appears he actually did, however, was make two fundamental errors in his approach.
Mr. Epstein's Findings
Mr. Epstein used covert methods to develop his project, including obtaining funding through someone he described as a “mysterious man from Central America” and recruiting field agents through a black hat group specializing in Facebook ads.
The project consisted of creating a computer program that would download and preserve the first page of the search results that appeared on the computers of the project's field agents. The program would also open and preserve the first 10 search results on these pages. The project ran for the six months leading up to the 2016 election.
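To make the mechanics a little more concrete, here is a rough sketch in Python of what a results-archiving script of that sort could look like. This is purely illustrative: Mr. Epstein has not published his code, his tool reportedly ran on the field agents' own machines, and the search URL, link-filtering logic, and file paths below are our own assumptions.

```python
# Illustrative sketch only -- not Mr. Epstein's actual tool. It shows the
# general idea of archiving a results page and collecting its first ten
# outbound result links. URL, filtering, and output paths are assumptions.
import datetime
import pathlib
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

ARCHIVE_DIR = pathlib.Path("search_archive")  # hypothetical output folder


def archive_results_page(query: str,
                         engine_url: str = "https://www.bing.com/search") -> list[str]:
    """Fetch a results page for `query`, save the raw HTML, and return up to
    ten links that point away from the search engine's own domain."""
    ARCHIVE_DIR.mkdir(exist_ok=True)
    resp = requests.get(engine_url, params={"q": query}, timeout=30,
                        headers={"User-Agent": "Mozilla/5.0"})
    resp.raise_for_status()

    # Preserve the first page of results exactly as received.
    stamp = datetime.datetime.now().strftime("%Y%m%d-%H%M%S")
    filename = f"{stamp}-{query.replace(' ', '_')[:40]}.html"
    (ARCHIVE_DIR / filename).write_text(resp.text, encoding="utf-8")

    # Collect the first ten links leading off the search engine's domain.
    soup = BeautifulSoup(resp.text, "html.parser")
    engine_host = urlparse(engine_url).netloc
    links = []
    for a in soup.find_all("a", href=True):
        host = urlparse(a["href"]).netloc
        if host and host != engine_host:
            links.append(a["href"])
        if len(links) == 10:
            break
    return links


if __name__ == "__main__":
    for url in archive_results_page("2016 election candidates"):
        print(url)
```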
Mr. Epstein claims to have found evidence of pro-Clinton bias in these results, enough bias to swing what he estimates to be 2 million votes in her direction. Biased results were defined as higher-ranked results that “connected to web pages that made one candidate look better than the other.” Here is a summary of his findings:
- Pro-Clinton search results even when slightly pro-Trump search terms were used
- Top 10 search results favoring Clinton in the month leading up to the election
- Twice as much pro-Clinton bias on Google as on Yahoo
- Zero biased results delivered to Gmail users
Mr. Epstein's First Mistake
These results may seem damning at first, but let's take a closer look at Mr. Epstein's approach. His first mistake was limiting his field agents. He was so concerned about secrecy that he restricted participants to those who were willing to use the Firefox browser and who did not have Gmail accounts (except for a few Gmail users whom he employed as controls).
His concern? That Google would catch wind of the project and change its search results accordingly. So obsessed was he with avoiding Google's attention that he discarded all of his data from the Bing search engine because everyone who used that search engine during the project had a Gmail account.
While this might seem to be a wise move if you are already convinced that your search engine is out to influence your search results, what probably happened is that Mr. Epstein skewed his own results.
It is well known that search results differ among browsers, in part because different demographics tend to use different browsers, and search engines return results optimized for each one. The result? Mr. Epstein's field agents received Firefox-specific results. It is possible that many Firefox users lean Democratic and that the search engines returned Clinton-friendly results to cater to that audience's preferences.
In addition, by excluding Gmail users and limiting his participants to those willing to respond to black hat advertisements for a secretive project, Mr. Epstein restricted his ability to see how search engines respond to a wide array of users. In fact, he took Google's lack of “biased” search results toward Gmail users as a sign that Google had snooped in their email, caught wind of the project, and delivered “unbiased” results to muddy the waters.
As a result, instead of counting those results in his analysis, he used his own preconceived ideas about Google as an excuse to throw out data that may have contradicted the conclusion he ultimately came to. In essence, he biased the results of his own project by refusing to account for all the data he received.
Mr. Epstein's Second Mistake
Mr. Epstein's second fundamental error was to assume that the search engines' job is to return unbiased search results, and that those results should be the same from user to user. In reality, when deciding which search results to show you, Google and other search engines take many factors into consideration. These factors include sites you have previously visited, past purchases, past reviews, your location, and so forth. Even the device you use (mobile vs. desktop) can influence which results search engines show you.
In fact, the search engines' goal is not to return unbiased or identical results. Their goal is to return the results that are most relevant and useful to each individual user. A user, for example, with a history of visiting gun rights websites or purchasing a gun might receive results that lean toward candidates who support gun rights. A user whose history shows opposite leanings might receive search results leaning toward political candidates who support greater gun control.
In addition, Google bases search results on prevailing trends and the leanings of certain demographics. For example, if most people in an area are searching for and clicking on links related to negative discussions about Trump, the search engine is more likely to return those kinds of results when it sees that the user is in that area.
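To illustrate the idea (and only the idea), here is a toy Python sketch of how personalization signals can nudge rankings. The signal names and weights are invented for demonstration; they are not Google's actual algorithm, which is proprietary and far more sophisticated.

```python
# Toy illustration only: real ranking systems are vastly more complex and
# their weights are proprietary. The signals and weights below are invented
# purely to show how personalization could reorder the same set of pages.
from dataclasses import dataclass, field


@dataclass
class UserContext:
    visited_domains: set[str] = field(default_factory=set)  # browsing history
    location: str = ""                                       # coarse region
    device: str = "desktop"                                  # "desktop" or "mobile"


def personalized_score(base_relevance: float, result_domain: str,
                       result_region: str, mobile_friendly: bool,
                       user: UserContext) -> float:
    """Combine a query-relevance score with hypothetical personalization boosts."""
    score = base_relevance
    if result_domain in user.visited_domains:   # prior visits to this site
        score += 0.2
    if result_region == user.location:          # locally relevant content
        score += 0.1
    if user.device == "mobile" and mobile_friendly:
        score += 0.1
    return score


# Two users issuing the same query can see the same page scored differently.
hunter = UserContext(visited_domains={"gunrights.example.org"}, location="TX")
urbanite = UserContext(visited_domains={"citynews.example.com"},
                       location="NY", device="mobile")

page = dict(base_relevance=0.7, result_domain="gunrights.example.org",
            result_region="TX", mobile_friendly=False)
print(personalized_score(user=hunter, **page))    # 1.0 -- history and location boosts
print(personalized_score(user=urbanite, **page))  # 0.7 -- no boosts apply
```

The point of the toy model is simply that identical queries from different users are not supposed to produce identical results, which is exactly the behavior Mr. Epstein interpreted as bias.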
Mr. Epstein noted demographic differences in his study. In particular, he saw that pro-Clinton search results appeared primarily to “decided voters, males, the young, and voters in Democratic states.” Some of these demographics, particularly the young and those in Democratic states, were naturally going to prefer Clinton-friendly results. The fact that the search engines gave them these results is a sign that the search engines were doing their job, not trying to swing voters one way or another.
Mr. Epstein's study is a fascinating look at how technology can be used to monitor search results. What it is not, however, is a look at how search engines are swaying votes. Instead, Mr. Epstein biased his own results by throwing out data, limiting project participants, and overlooking some fundamental principles of how search engines work.
Search engines want to deliver customized results to users, and since those results make it easier to find the information we want online, we want them to keep up the good work.
If you want to learn more about how search engines work, and how you can get them to work for you, reach out to Distinct Web Design. We can help you use SEO and digital marketing to advance your own business goals.