TaiShang
Research Proves Google Manipulates Millions to Favor Clinton
12.09.2016
In this exclusive report, distinguished research psychologist Robert Epstein explains the new study and reviews evidence that Google's search suggestions are biased in favor of Hillary Clinton. He estimates that biased search suggestions might be able to shift as many as 3 million votes in the upcoming presidential election in the US.
Biased search rankings can swing votes and alter opinions, and a new study shows that Google's autocomplete can too.
A scientific study I published last year showed that search rankings favoring one candidate can quickly convince undecided voters to vote for that candidate — as many as 80 percent of voters in some demographic groups. My latest research shows that a search engine could also shift votes and change opinions with another powerful tool: autocomplete. Because of recent claims that Google has been deliberately tinkering with search suggestions to make Hillary Clinton look good, this is probably a good time both to examine those claims and to look at my new research.
As you will see, there is some cause for concern here. In June of this year, Sourcefed released a video claiming that Google's search suggestions — often called "autocomplete" suggestions — were biased in favor of Mrs. Clinton. The video quickly went viral: the full 7-minute version has now been viewed more than a million times on YouTube, and an abridged 3-minute version has been viewed more than 25 million times on Facebook.
The video's narrator, Matt Lieberman, showed screenshot after screenshot that appeared to demonstrate that searching for just about anything related to Mrs. Clinton generated positive suggestions only. This occurred even though Bing and Yahoo searches produced both positive and negative suggestions and even though Google Trends data showed that searches on Google that characterize Mrs. Clinton negatively are quite common — far more common in some cases than the search terms Google was suggesting. Lieberman also showed that autocomplete did offer negative suggestions for Bernie Sanders and Donald Trump.
"The intention is clear," said Lieberman. "Google is burying potential searches for terms that could have hurt Hillary Clinton in the primary elections over the past several months by manipulating recommendations on their site." Google responded to the Sourcefed video in an email to the Washington Times, denying everything.
According to the company's spokesperson, "Google Autocomplete does not favor any candidate or cause." The company explained away the apparently damning findings by saying that "Our Autocomplete algorithm will not show a predicted query that is offensive or disparaging when displayed in conjunction with a person's name." Since then, my associates and I at the American Institute for Behavioral Research and Technology (AIBRT) — a nonprofit, nonpartisan organization based in the San Diego area — have been systematically investigating Lieberman's claims.
What we have learned has generally supported those claims, but we have also learned something new — something quite disturbing — about the power of Google's search suggestions to alter what people search for. Lieberman insisted that Google's search suggestions were biased, but he never explained why Google would introduce such bias. Our new research suggests why — and also why Google's lists of search suggestions are typically much shorter than the lists Bing and Yahoo show us. Our investigation is ongoing, but here is what we have learned so far:
Please read the detailed report here (with visuals):
https://sputniknews.com/us/20160912/1045214398/google-clinton-manipulation-election.html