Google’s Project Owl and What it Means for Your Searches

You may have noticed Google’s autocomplete feature, which does some heavy lifting for users making common queries. If you type “wea”, for instance, you’re likely to see a suggestion to search for today’s weather. The system needs only a few letters to suggest a term, thanks to Google’s ability to track the popularity and frequency of each search term.
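In simplified terms, that popularity-based suggestion works like a lookup ranked by frequency. The sketch below is purely illustrative (the query log and counts are invented, and Google’s real system is far more sophisticated), but it shows the basic idea of matching a prefix and ordering candidates by how often they’ve been searched:

```python
from collections import Counter

# Hypothetical query log; counts stand in for how often each term is searched.
query_log = Counter({
    "weather today": 5000,
    "weather tomorrow": 1200,
    "weather radar": 800,
    "web design": 300,
})

def suggest(prefix, k=3):
    """Return up to k completions matching the prefix, most popular first."""
    matches = [(q, n) for q, n in query_log.items() if q.startswith(prefix)]
    matches.sort(key=lambda item: -item[1])  # highest search count first
    return [q for q, _ in matches[:k]]

print(suggest("wea"))  # ['weather today', 'weather tomorrow', 'weather radar']
```

Because ranking depends only on frequency, whatever people search for most (accurate or not) floats to the top, which is exactly the weakness discussed below.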

However, autocomplete has a dark side: it can help fake news spread. Celebrity death hoaxes, inaccurate reporting and anti-Semitism are just a few of the fake news problems autocomplete inadvertently contributes to. Now, Google’s new Project Owl update will give users more control over their suggested searches.

Unintended Results

Searches for quite a few celebrities’ names would suggest the words “death” or “dead”, spreading hoaxes and misinformation thanks to the popularity of those queries. These pranks start with convincingly written posts and fake news stories. Google tracks these results and effectively “verifies” them because it notices users searching the term more frequently. It tries to be helpful by suggesting the term, and ends up as part of the problem.

How Project Owl Helps

Project Owl is a new feedback system that lets users report suggested queries that are offensive or inaccurate. Google will use this feedback data to “teach” its algorithm to serve suggested queries that aren’t based on fake news.
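One way such a feedback loop might work (this is an assumption for illustration; Google hasn’t published the mechanics) is to demote a suggestion’s ranking score as user reports accumulate, so heavily reported suggestions drop out of the list:

```python
# Hypothetical scores and report counts -- all values are invented.
scores = {"celebrity x dead": 900, "celebrity x movies": 850, "celebrity x age": 700}
reports = {"celebrity x dead": 40}

REPORT_PENALTY = 25  # illustrative: score deducted per user report

def adjusted(query):
    """Popularity score minus a penalty for each user report."""
    return scores[query] - REPORT_PENALTY * reports.get(query, 0)

def top_suggestions(k=2):
    """Suggestions ranked by report-adjusted score, best first."""
    return sorted(scores, key=adjusted, reverse=True)[:k]

print(top_suggestions())  # ['celebrity x movies', 'celebrity x age']
```

The point of the sketch: the hoax query starts out as the most popular suggestion, but once enough users flag it, the penalty outweighs its raw popularity and it no longer surfaces.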

This system won’t stop people from deliberately searching for offensive terms, but it will help users who search innocently avoid running into offensive suggestions.


Bio: Fix Bad Reputation was founded by veteran search marketer Pierre Zarokian, and specializes in helping clients remove negative postings online.