Google's trying to push more good info to the top of search results — and keep the bad stuff out. (Image: Google)


Google has a new plan to keep junk out of search

Unless you go looking for it.

There are less than two months until Election Day. A pandemic is raging. The West Coast is on fire. For tech companies, the value of helping people find good and useful information — and keeping away the problematic and unhelpful — is more obvious than ever.

Social media companies like Facebook and Twitter dominate the conversation about information dissemination, but Google Search is every bit as important. And Google is thinking about the issue in a slightly unusual way.


There are two sides to Google Search. The first is the search results themselves, which Google says it's perpetually refining but doesn't want to treat with too heavy a hand. If you want to Google your way into some wacky corner of the internet, Google won't stop you. The other side is what Google calls "search features": the Answer Box at the top of results, query autocompletes and even the ads it shows. With those, Google told reporters Thursday, it's becoming more involved in what people see when they search, and more proactive about making sure it only shows the right stuff.

One of Google's newest policies is around autocomplete. Going forward, if you start to search for something like "voting by mail illegal," Google won't show any suggestions. Google will also "remove predictions that could be interpreted or perceived as claims for one candidate or against one candidate," Google's David Graff said.

Graff also said that some perfectly normal, benign queries will get caught in this filter and have autocomplete shut off for no good reason. And Google's fine with that. "There's a tension between precision and recall in algorithms," Google VP of Search Pandu Nayak said. High precision means the algorithm only returns correct results; high recall means it returns more results, including some incorrect ones. Google is intentionally leaning toward precision: It wants everything it does show to be right, even if that means suppressing some harmless predictions along the way. "Protecting you from bad autocompletes is more important" than showing predictions as often as possible, Nayak said.
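To pin down the tradeoff Nayak is describing, here is the standard definition, framed around whether a prediction Google shows is benign (this framing is an illustration, not Google's own formulation):

\[
\text{precision} = \frac{\text{benign predictions shown}}{\text{all predictions shown}}, \qquad
\text{recall} = \frac{\text{benign predictions shown}}{\text{all benign predictions that could have been shown}}
\]

Leaning toward precision means filtering aggressively: nearly everything that does appear is safe, at the cost of hiding some harmless predictions, which is exactly the overstepping Graff described.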

A lot of Google's new focus is on the immediate, huge stories in the world — particularly COVID and the election — but there's a bigger trend in play here as well. After so many years of leaning on PageRank to tell Google how the internet worked and what mattered, the company is realizing that's no longer a sufficient answer. Social media has proved that "people read it and shared it" has no correlation with expertise, relevance or truth.
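For context, classic PageRank scores a page almost entirely from the link graph around it. The textbook recurrence (not something Google presented this week) is

\[
PR(p) = \frac{1 - d}{N} + d \sum_{q \in B_p} \frac{PR(q)}{L(q)},
\]

where N is the total number of pages, d is a damping factor commonly set around 0.85, B_p is the set of pages linking to p, and L(q) is the number of outbound links on page q. Nothing in that recurrence knows whether a page is accurate or authoritative, which is the gap Google is describing here.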

Google is making a plan for Election Day, too. It's working with the AP and Democracy Works to make sure it's surfacing only true information — and not, Google engineering VP Cathy Edwards said, things like false claims of victory. Graff said Google won't allow "demonstrably false information" in political ads, either.

And while part of the problem can be solved algorithmically, humans need to be part of the equation. Google's already taking a more thorough look at search for what it calls "Your Money or Your Life" queries — related to finance, health, anything that feels existentially important — but Edwards acknowledged that "as the events of this year have shown, there are times when information quality is particularly important." There are more of those all the time, and they require something more than PageRank.

A version of this article will appear in Friday's edition of our daily newsletter, Source Code.