Google’s Search Algorithm Could Steal the Presidency – wait … what … ?

From wired.com, by Adam Rogers, Aug 6, 2015; see the article HERE. Emphasis is by Garnet92.


IMAGINE AN ELECTION—A close one. You’re undecided. So you type the name of one of the candidates into your search engine of choice. (Actually, let’s not be coy here. In most of the world, one search engine dominates; in Europe and North America, it’s Google.) And Google coughs up, in fractions of a second, articles and facts about that candidate. Great! Now you are an informed voter, right? But a study published this week says that the order of those results, the ranking of positive or negative stories on the screen, can have an enormous influence on the way you vote. And if the election is close enough, the effect could be profound enough to change the outcome.

In other words: Google’s ranking algorithm for search results could accidentally steal the presidency. “We estimate, based on win margins in national elections around the world,” says Robert Epstein, a psychologist at the American Institute for Behavioral Research and Technology and one of the study’s authors, “that Google could determine the outcome of upwards of 25 percent of all national elections.”

Epstein’s paper combines a few years’ worth of experiments in which Epstein and his colleague Ronald Robertson gave people access to information about the race for prime minister in Australia in 2010, two years prior, and then let the mock-voters learn about the candidates via a simulated search engine that displayed real articles.

One group saw positive articles about one candidate first; the other saw positive articles about the other candidate. (A control group saw a random assortment.) The result: Whichever candidate people saw the positive results for, they were more likely to vote for—by more than 48 percent. The team calls that number the “vote manipulation power,” or VMP. The effect held—strengthened, even—when the researchers swapped a single negative story into the number-four and number-three spots. Apparently it made the results seem even more neutral and therefore more trustworthy.
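The paper’s exact formula isn’t given in the article, but a minimal sketch of how a preference shift like that might be tallied from mock-voter data looks something like the following. The group sizes, vote counts, and the formula itself are invented for illustration; they are not the researchers’ published definition of VMP.

```python
# A minimal sketch, with invented numbers, of tallying the preference shift
# between a biased-ranking group and a control group. This is an assumption
# for illustration, NOT the paper's published definition of VMP.

def preference_shift(biased_votes, biased_total, control_votes, control_total):
    """Relative increase in support for the favored candidate in the
    biased-ranking group compared with the control group."""
    biased_share = biased_votes / biased_total
    control_share = control_votes / control_total
    return (biased_share - control_share) / control_share

# Hypothetical groups of 100 mock voters each.
shift = preference_shift(biased_votes=74, biased_total=100,
                         control_votes=50, control_total=100)
print(f"Shift toward the favored candidate: {shift:.0%}")  # prints 48%
```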

But of course that was all artificial—in the lab. So the researchers packed up and went to India in advance of the 2014 Lok Sabha elections, a national campaign with 800 million eligible voters. (Eventually 430 million people voted over the weeks of the actual election.) “I thought this time we’d be lucky if we got 2 or 3 percent, and my gut said we’re gonna get nothing,” Epstein says, “because this is an intense, intense election environment.” Voters get exposed, heavily, to lots of other information besides a mock search engine result.

The team found 2,150 undecided voters and performed a version of the same experiment. And again, VMP was off the charts. Even taking into account some sloppiness in the data-gathering and a tougher time assessing articles for their positive or negative valence, they got an overall VMP of 24 percent. “In some demographic groups in India we had as high as about 72 percent.”

The fact that media, including whatever search and social deliver, can affect decision-making isn’t exactly news. The “Fox News Effect” says that towns that got the conservative-leaning cable channel tended to become more conservative in their voting in the 2000 election. A well-known effect called recency means that people make decisions based on the last thing they heard. Placement on a list also has a known effect. And all that stuff might be too transient to make it all the way to a voting booth, or get swamped by exposure to other media. So in real life VMP is probably much less pronounced.

But the effect doesn’t have to be enormous to have an enormous effect. The Australian election that Epstein and Robertson used in their experiments came down to a margin of less than 1 percent. Half the presidential elections in US history came down to a margin of less than 8 percent. And presidential elections are really 50 separate state-by-state knife fights, with the focus of campaigns not on poll-tested winners or losers but on purple “swing states” with razor-thin margins.

So even at an order of magnitude smaller than the experimental effect, VMP could have serious consequences. “Four to 8 percent would get any campaign manager excited,” says Brian Keegan, a computational social scientist at Harvard Business School. “At the end of the day, the fact is that in a lot of races it only takes a swing of 3 or 4 percent. If the search engine is one or two percent, that’s still really persuasive.”
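To make that arithmetic concrete, here is a back-of-the-envelope sketch; the turnout, shift, and margin figures below are invented for illustration and are not drawn from any actual race.

```python
# Back-of-the-envelope arithmetic with invented numbers: a small
# ranking-driven shift stacked against a razor-thin winning margin.

turnout = 3_000_000      # hypothetical votes cast in one swing state
ranking_shift = 0.01     # hypothetical 1% of voters nudged by result order
winning_margin = 0.005   # hypothetical half-percent gap between the candidates

print(f"Votes nudged by rankings:        {turnout * ranking_shift:,.0f}")   # 30,000
print(f"Votes separating the candidates: {turnout * winning_margin:,.0f}")  # 15,000
```

Even with those made-up figures, a one-point nudge dwarfs a half-point gap, which is Keegan’s point.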

The Rise of the Machines

It’d be easy to go all 1970s-political-thriller on this research, to assume that presidential campaigns, with their ever-increasing level of technological sophistication, might be able to search-engine-optimize their way to victory. But that’s probably not true. “It would cost a lot of money,” says David Shor, a data scientist at Civis Analytics, a Chicago-based consultancy that grew out of the first Obama campaign’s technology group. “Trying to get the media to present something that is favorable to you is a more favorable strategy.”

That’s called, in the parlance of political hackery, “free media,” and, yes, voters like it. “I think that generally people don’t trust campaigns because they tend to have a low opinion of politicians,” Shor says. “They are more receptive to information from institutions for which they have more respect.” Plus, in the presidential campaign high season, whoever the Republican and Democratic nominees are will already have high page ranks because they’ll have a huge number of inbound links, one of Google’s key metrics.

Search and social media companies can certainly have a new kind of influence, though. During the 2010 US congressional elections, researchers at Facebook exposed 61 million users to a message exhorting them to vote—it didn’t matter for whom—and found they were able to generate 340,000 extra votes across the board.

But what if—as Harvard Law professor Jonathan Zittrain has proposed—Facebook didn’t push the “vote” message to a random 61 million users? Instead, using the extensive information the social network maintains on all its subscribers, it could hypothetically push specific messaging to supporters or foes of specific legislation or candidates. Facebook could flip an election; Zittrain calls this “digital gerrymandering.” And if you think that companies like the social media giants would never do such a thing, consider the way that Google mobilized its users against the Stop Online Piracy Act and PROTECT IP Act, or “SOPA-PIPA.”

In their paper, Epstein and Robertson equate digital gerrymandering to what a political operative might call GOTV—Get Out the Vote, the mobilization of activated supporters. It’s a standard campaign move when your base agrees with your positions but isn’t highly motivated—because they feel disenfranchised, let’s say, or have problems getting to polling places. What they call the “search engine manipulation effect,” though, works on undecided voters, swing voters. It’s a method of persuasion.

Again, though, it doesn’t require a conspiracy. It’s possible that, as Epstein says, “if executives at Google had decided to study the things we’re studying, they could easily have been flipping elections to their liking with no one having any idea.” But simultaneously more likely and more science-fiction-y is the possibility that this—oh, let’s call it “googlemandering,” why don’t we?—is happening without any human intervention at all. “These numbers are so large that Google executives are irrelevant to the issue,” Epstein says. “If Google’s search algorithm, just through what they call ‘organic processes,’ ends up favoring one candidate over another, that’s enough. In a country like India, that could send millions of votes to one candidate.”

As you’d expect, Google doesn’t think it’s likely their algorithm is stealing elections. “Providing relevant answers has been the cornerstone of Google’s approach to search from the very beginning. It would undermine people’s trust in our results and company if we were to change course,” says a Google spokesperson, who would only comment on condition of anonymity. In short, the algorithms Google uses to rank search results are complicated, ever-changing, and bigger than any one person. A regulatory action that, let’s say, forced Google to change the first search result in a list on a given candidate would break the very thing that makes Google great: giving right answers very quickly all the time. (Plus, it might violate the First Amendment.)

The thing is, though, even though it’s tempting to think of algorithms as the very definition of objective, they’re not. “It’s not really possible to have a completely neutral algorithm,” says Jonathan Bright, a research fellow at the Oxford Internet Institute who studies elections. “I don’t think there’s anyone in Google or Facebook or anywhere else who’s trying to tweak an election. But it’s something these organizations have always struggled with.” Algorithms reflect the values and worldview of the programmers. That’s what an algorithm is, fundamentally. “Do they want to make a good effort to make sure they influence evenly across Democrats and Republicans? Or do they just let the algorithm take its course?” Bright asks.

That course might be scary, if Epstein is right. Add the possibility of search rank influence to the individualization Google can already do based on your Gmail, Google Docs, and every other way you’ve let the company hook into you…combine that with the feedback loop of popular things getting more inbound links and so getting higher search ranking…and the impact stretches way beyond politics. “You can push knowledge, beliefs, attitudes, and behavior among people who are vulnerable any way you want using search rankings,” Epstein says. “Now that we’ve discovered this big effect, how do you kill it?”
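The article never shows the ranking math itself, but the published idea behind counting inbound links is PageRank, and a toy version of that iteration gives a feel for the feedback loop Epstein describes. The three-page link graph and damping factor below are invented for illustration; Google’s production ranking is vastly more complicated.

```python
# Toy PageRank-style iteration: pages with more inbound links accumulate
# more rank, which is the feedback loop described above. The link graph
# and damping factor are invented for illustration only.

def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links out to."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            targets = outlinks or pages  # dangling pages spread rank evenly
            for target in targets:
                new_rank[target] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

# Hypothetical pages: the heavily linked candidate page floats to the top.
graph = {
    "candidate_a": ["news_site"],
    "news_site": ["candidate_a", "blog"],
    "blog": ["candidate_a"],
}
for page, score in sorted(pagerank(graph).items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```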

~~~~~~~~~~

It’s not like we conservatives weren’t already battling voter registration fraud, individuals voting multiple times, dead people voting, electronic voting machine manipulations, and more; now we may also be handicapped by search engine results affected by a (left) thumb on the scale. There is only one answer to the tactics employed by the Democrats, and that is to overcome their dastardly actions by overwhelming their votes – if the outcome ain’t close, they can’t steal it.

Garnet92

10 Responses to Google’s Search Algorithm Could Steal the Presidency – wait … what … ?

  1. Bullright says:

    Wait for the ad, Garnet: “this election is being brought to you by Google.” I’m sure it’s coming. Reminds me how Obama organizations always got the top listing in searches. Just happenstance.

    • Garnet92 says:

      I spent much of my adult life writing computer programs, and I can understand how software algorithms can influence the outcome of a search, but it’s only been in the past couple of years that I ever considered that a search engine might intentionally skew the results for a particular purpose. That’s what I meant by “having a thumb on the scale” in my commentary.

      I am not so naive as to believe that Google would NEVER do that; in fact, it wouldn’t surprise me. They’ve got the power and the ability to use it to influence the outcome of search results – in some respects, they already do that with Google AdWords. Is it completely outrageous to think that they would “tweak” the code to assist a candidate who paid them enough or promised them special privileges?

      I think not.

      • vonmesser says:

        They already have done this. There was a stink about a year ago about “googling” something (I think it was something about “rich democrats”) and the top articles that came up had to do with Republican fundraising.

  2. Hardnox says:

    This is creepy…

    Frankly, one needs to wonder how qualified a voter is to vote if they haven’t paid enough attention in the first place. The process for the lemmings should be simple: don’t vote for anything with a D behind the name and eventually things will turn out OK.

    • Garnet92 says:

      It IS creepy. How can we believe ANYTHING when we accept that Google might lie to us? I’m surprised that no one (as far as I know) has established a Googilism religion yet. There are probably more who have FAITH in Google search results than have faith in their religion. A sad state.

  3. vonmesser says:

    “……Google’s ranking algorithm for search results could accidentally steal the presidency. …..”
    I suggest that ACCIDENTALLY is not the correct word.

    • Garnet92 says:

      I had exactly the same thought, VM. In fact, I even considered deleting it, but I don’t make a practice of changing the text of an article. I did consider putting it in parentheses or italicizing it, but decided to let it go. But you’re right, the word “accidentally” has no place in that sentence.

  4. Uriel says:

    Very interesting! Subliminal concept using algorithms.

    • Garnet92 says:

      Well, it’s not exactly subliminal when they “adjust” an outcome to yield a result more desirable for a paying candidate or customer.

  5. Creepy and scary considering we have so many uninformed people in this country. This next election will have the most massive fraud ever. Now that the commies have half the country, they will do all they can to make sure they hang onto power.