Google Battling Nefarious Actors Trying to Manipulate Search Results

Company official says 2,000 changes to search engine algorithm made annually

May 1, 2018

Google search, the premier internet technology tool used for up to three billion online queries a day, is under constant attack by malicious actors seeking to manipulate its results, a company official says.

Richard Gingras, Google's vice president of news, said in recent remarks the company makes as many as 2,000 changes to its search algorithm annually in a bid to improve capabilities and thwart nefarious actors.

"When we talk about nefarious information and nefarious players, we often tell people we've been dealing with people trying to trick the search engine with spam since the day Google was founded," Gingras told a journalism conference last month.

"We have thousands of people working on these issues, so we have scale abilities to do things that others may not."

Search engine spam is the use of online manipulation for financial gain or political purposes.

Gingras commented when asked if Google is too big and whether those using the internet would be better served by utilizing several different search engines rather than Google's single dominant search tool.

A U.S. intelligence official disclosed to the Washington Free Beacon that Chinese cyber intelligence agencies had developed technology allowing the Chinese government to restrict the results of internet searches conducted on Google.

The control mechanism was discovered in the spring of 2016 and was assessed as allowing the Chinese government to prevent China's 500 million internet users from finding search results opposed by Beijing, such as information about the 1989 Tiananmen Square massacre of hundreds of unarmed pro-democracy protesters, or about dissidents.

Critics of very large tech companies such as Google and Amazon have suggested they be broken up under anti-monopoly laws.

Others have called for applying a fairness doctrine for search engine results.

Gingras said mandating a fairness doctrine for search engines would be a bad idea, and he voiced concerns that forcing people to use multiple search engines could lead to search engines becoming politicized into "silos," such as conservative and liberal search engines.

"I understand the emotional reactions to the technological impact to change in environments," he said. "But I get very, very nervous at those emotions leading to bad public policy."

Google's search engine combines word matching—pairing terms in online documents with search queries—with a core algorithm called PageRank, which ranks pages by the number and quality of links pointing to them. PageRank was developed by Google founders Larry Page and Sergey Brin in 1997.

Google's ad platforms are used by 2 million publishers around the world, and more than 70 percent of the $13 billion in ad revenue they generate goes to publishers.

Gingras spoke at a conference at the University of California, Berkeley, on April 14. He said Google's search engine is a tool designed for finding anything that is findable in the universe of legal expression.

It also can provide a look into the "dark corners of expression and understand what's there," he said.

"We are not an oracle of absolute truth or an oracle to ascertain what is acceptable or unacceptable speech," he said. "I don't think anyone wants us to be that."

Gingras said one error by Google was promoting to the top of its search results a false story misidentifying the shooter behind the Las Vegas mass murder last year. "Our systems aren't perfect and they never will be because the ecosystem is constantly evolving," he said.

But the company is changing how the search engine responds to breaking news by increasing the algorithm's reliance on authoritative news sources during major events while reducing the reliance on relevance.

Gingras was asked why Google search results often produce sharp swings in traffic for news websites from what appear to be tweaks in the search algorithm.

Myron Levin, head of the online news outlet Fair Warning, said the traffic shifts were a "complete mystery," and called Google's controls over searches a "black box."

As for search engine optimization experts paid to increase online traffic, Levin said "they're witch doctors."

"They're reading chicken bones and surmising what the gods of Google are thinking. But they don't know either," he said.

Gingras said the company is trying to be more transparent within limits.

"The main challenge we deal with is people trying to trick the algorithm," he said. "So complete transparency on the part of the algorithm would not make sense at all."

Instead, Google is trying to increase openness about how the company determines authority in picking search results.

"It'll be an ongoing challenge, and no question, no matter what we do, there'll be algorithmic changes that impact people in different ways. Obviously, we try to do the right thing," Gingras said.

Relevance in searches can appear magical, but the process basically involves matching words in online documents to search terms.

"People presume the algorithm is more intelligent than it is in terms of the substance, but it's basically word matching," he said.
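The word-matching core Gingras describes can be sketched as a toy relevance scorer. This is a hypothetical illustration with invented documents, not Google's implementation, which layers hundreds of additional signals on top of term matching:

```python
# Toy relevance scorer: rank documents by how often query words appear.
# Example data is invented; real search engines add many more signals.
from collections import Counter

docs = {
    "doc1": "google makes thousands of search algorithm changes a year",
    "doc2": "recipe for a simple chicken soup",
}

def score(query, text):
    # Count occurrences of each query word in the document text.
    counts = Counter(text.lower().split())
    return sum(counts[word] for word in query.lower().split())

query = "search algorithm"
ranked = sorted(docs, key=lambda name: score(query, docs[name]), reverse=True)
print(ranked)  # doc1 matches both query words; doc2 matches none
```

Anyone who knows which words the scorer counts can stuff a page with them, which is exactly the kind of trickery Google's extra signals are meant to blunt.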

On the use of PageRank, Gingras said its value has gone down over the years "because people figured it out and now we see robots creating fake links to try to drive their traffic."
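The link-based idea behind PageRank—and why fabricated links can distort it—can be sketched as a short power iteration. This follows the published 1998 formulation rather than Google's current system, and the three-page "web" is invented:

```python
# Minimal PageRank power iteration: a page's score is built from the
# scores of the pages linking to it. The 0.85 damping factor follows
# the original Page/Brin formulation; the example graph is invented.

def pagerank(links, damping=0.85, iterations=100):
    """links maps each page to the list of pages it links to."""
    pages = set(links) | {t for targets in links.values() for t in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1.0 - damping) / n for p in pages}
        for page in pages:
            targets = links.get(page, [])
            if targets:
                # Each outbound link passes along an equal share of rank.
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new[t] += share
            else:
                # Dangling page: spread its rank across all pages.
                for t in pages:
                    new[t] += damping * rank[page] / n
        rank = new
    return rank

web = {"a": ["b", "c"], "b": ["c"], "c": ["a"]}
ranks = pagerank(web)
```

Because every inbound link transfers a share of rank, a robot that fabricates links can inflate a page's score—which is why Gingras says PageRank's standalone value has declined.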

To counter such spoofing, Google adds more signals to the algorithm to limit manipulation.

Gingras said Google makes easily "a couple of thousand" algorithm changes a year.

"We're doing experiments all the time with algorithms," he said.

The changes are sent to 10,000 "raters" around the world who analyze the changes and provide feedback on whether the changes are improvements or not.

The company has produced a 160-page document of guidelines for what it calls the Search Quality Rating Program.

Ben Gomes, Google vice president for engineering, said in a blog post last year that people have tried to "game" the system to make their pages appear higher in search results, using "content farms," hidden text, and other deceptive practices.

"We've tackled these problems, and others over the years, by making regular updates to our algorithms and introducing other features that prevent people from gaming the system," he stated.

With tens of thousands of new documents coming online every minute of every day, newer manipulation methods such as fake news are being used, Gomes said.

The problem is that web content "has contributed to the spread of blatantly misleading, low quality, offensive, or downright false information," he said.

Gomes said good progress is being made in dealing with fake news and that has driven more structural changes in the search engine, such as better search ranking, easier feedback mechanisms and greater transparency on how the engine works.

On countering fake news—misinformation and disinformation—Gingras said the problem is no longer teenage hackers.

"We're seeing misrepresented information coming from all kinds of sources, including media institutions that are quite prominent," he said.

On Google's YouTube platform, Gingras said the algorithm for its video searches is geared to viewers.

"Yes indeed if it detects your interest it tends to feed it," he said.

The problem is that just as searches for human rights content will keep producing similar results, searches for extremist subjects will keep feeding users more results on those subjects.

"Now obviously that is something that we're looking to address, but as you can see it's an interesting challenge. It's not like the algorithm is determining to focus on extremists, and obviously in fact we've tried to tighten policies about defining what is extremist content in the first place and also making sure we're not providing our monetization tools, our ad platforms to support those kinds of efforts," he said.

Google spokeswoman Maggie Shiels said Gingras was referring to Project Owl, a program launched by the company a year ago designed to deal with fake news, rumors, conspiracy theories, and myths.
