Earlier this week, an investigative video created by the news and pop culture website SourceFed went viral, drawing millions of views on Facebook and YouTube. The seven-minute video accused the search giant Google of manipulating autocomplete suggestions to favor Democratic presidential candidate Hillary Clinton, citing examples of how Google's suggestions differed from those of its competitors Bing and Yahoo. It claimed, for instance, that typing "Hillary Clinton cri" would surface terms like "Hillary Clinton crime reform" rather than "crimes," and that "Hillary Clinton ind" would yield "Hillary Clinton Indiana" rather than "indictment." By comparison, the video noted that autocomplete results for "Donald Trump rac" included the word "racist," and that "Bernie Sanders soc" yielded "socialist."

"The intention is clear. Google is burying potential searches for terms that could have hurt Hillary Clinton in the primary elections over the past several months by manipulating recommendations on their site," SourceFed's Matt Lieberman said in the video, published Thursday.

Presumptive Republican nominee Donald Trump, who has repeatedly accused the media of biased coverage of "Crooked Hillary," even weighed in on the SourceFed video. "If this is true, it is a disgrace that Google would do that," Trump said in a statement sent to Business Insider. "Very, very dishonest." "They should let it float and allow people [to] see how crooked she really is," he added.

But Google denies that its search results are tipping the scales toward Clinton -- or any other political figure, for that matter. In a statement to the Washington Times, a Google spokesperson said that "Google Autocomplete does not favor any candidate or cause." And in a Friday blog post, Tamar Yehoshua, Google's vice president of product management in charge of the site's search feature, further explained that autocomplete will not offer up derogatory terms in relation to any person.
"The autocomplete algorithm is designed to avoid completing a search for a person's name with terms that are offensive or disparaging," Yehoshua wrote. "We made this change a while ago following feedback that Autocomplete too often predicted offensive, hurtful or inappropriate queries about people. This filter operates according to the same rules no matter who the person is." Yehoshua emphasized that autocomplete terminology "isn't an exact science" and the results of the algorithm "changes frequently." "Predictions are produced based on a number of factors including the popularity and freshness of search terms," he added. "Given that search activity varies, the terms that appears in Autocomplete for you may change over time." Further, the Google executive noted that while autocomplete predictions are a popular search feature, it does not actively limit the search results. |