Google is announcing changes to its search product today, tied in part to the 20th anniversary of the company. The biggest announcement is that Google is rebranding its news feed — the list of items that appears below the default search bar in the Google mobile app and when you swipe left from the home screen on Android — as “Discover.” The feed will now appear on the Google homepage in all mobile browsers, a significant shift as the company works to better organize information and give users more context. The news was unveiled at a press event in San Francisco this morning.
Discover will do more to show relevant content — not necessarily just recent news. It will have topic links for diving deeper into content you’re seeing in the feed, and a small slider in the lower-left corner of each card will let you increase or decrease how much of that kind of content appears in your feed. (At the moment, most of the feed is personalized, and it prioritizes entertainment and other news items based on your interactions with it.) Discover will also support multilingual items, so you can get both English and Spanish items in your feed, with more languages coming soon. Many of these additions feel inspired by the redesigned Google News app, which was first unveiled at the company’s I/O developer conference.
Google is also using computer vision to add more visuals to search. It has a new format called “Stories” that is based on its AMP standard. Google says that it’s “doubling down” on Stories in search, which presumably means we’ll be seeing more of them “in the coming months.” Stories will appear inline in search results, alongside a new “featured videos” format that surfaces “salient segments” from videos in an auto-advancing carousel, showing just the relevant section of each video.
Google Images is also getting an update, including a new ranking algorithm for image search. Additionally, it will show more “web content” alongside results, including specific page information on the search results page. The new UI will come to desktops on Thursday, September 27th. Stories are also being added to image search.
Google Lens is also coming to Google Images. On the mobile web, you can tap a new Lens button, which will detect what’s in an image and run a search on it. You can also crop a specific part of an image to search on just that portion. Google’s example was a bookshelf in the background of a photo; Lens linked to sites where you could buy it.
Google also talked about search becoming important “in times of crisis,” referring specifically to natural disasters. It has had “SOS Alerts” for a year now, but Google hopes to make those alerts more “fine grained,” so they are sent to the people most likely to be in danger — of flooding, for example. Google says it uses AI to predict more accurately where flooding will actually occur; the feature is launching in India to start.
Google also talked about its improved job search product, presenting it in a way that suggested that Google could help struggling Americans (complete with a feel-good video montage). The new feature is called “Pathways,” and it’s designed to help people find job training. Google says its “aspiration” is to show training links inside job searches, starting with a pilot program in Virginia. It’s also partnering with Goodwill for this program.
Additionally, Google is introducing changes to search aimed at making it easier to research a topic online over the course of multiple days. Nick Fox, Google’s vice president of product at its search division, detailed how it’s going to work by breaking down the changes into three categories: journeys, collections, and topics.
Little of the recent controversy surrounding the company was on display in Google’s presentation today, which focused on the positive instead. Ahead of the announcement, Google search chief Ben Gomes laid out some of the history of how search works and the principles that guide Google’s product development. Pointing to Google’s advances in word matching, synonyms, and the Knowledge Graph, Gomes emphasized how Google is constantly trying to improve search’s understanding of what words mean.
Gomes also discussed “neural matching,” a technique that uses documents online to help disambiguate imprecise language. For example, on a search for “TV that looks strange,” Google might know that, in that context, it’s referring to motion smoothing. This feature, launched in the last few months, affects over 30 percent of queries.
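Google hasn’t published the details of neural matching, but the core idea — matching a vague query to a concept by vector similarity rather than by keyword overlap — can be sketched in a few lines. Everything below is illustrative: the concept names, vectors, and the `best_concept` helper are made up for this example, and real systems use learned neural embeddings over web documents, not hand-written numbers.

```python
import math

# Hypothetical "embedding" vectors for two TV-related concepts.
# The dimensions and values are invented purely for illustration.
CONCEPTS = {
    "motion smoothing": [0.9, 0.8, 0.1],
    "screen burn-in":   [0.2, 0.1, 0.9],
}

def cosine(a, b):
    """Cosine similarity between two vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def best_concept(query_vec):
    # Pick the concept whose vector lies closest to the query's vector.
    return max(CONCEPTS, key=lambda c: cosine(query_vec, CONCEPTS[c]))

# A vague query like "TV that looks strange" might embed near the
# motion-smoothing concept even though it shares no keywords with it.
query_vec = [0.85, 0.75, 0.2]
print(best_concept(query_vec))  # → motion smoothing
```

The point of the sketch is only that similarity in an embedding space lets a system connect imprecise language to a precise concept — the matching happens on meaning, not on the literal words of the query.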
As for the principles of search, Gomes said that it’s “focused on the user,” strives for “relevant and high-quality information,” follows an “algorithmic approach” to ranking information, and is heavily tested with users. But in a nod to recent criticism, Gomes added, “Search is not perfect, and we’re under no illusions that it is.”