The future of Google Search is visual

The news: At its Search On livestream event, Google offered a peek at what its search product will look like in the years ahead, and that future is more visual and intuitive than today's experience.

The details: The most transformative technology discussed was the Multitask Unified Model (MUM), which the company first publicized in May and which it claims is 1,000 times more powerful than BERT, the predecessor model it introduced in 2018.

  • Through MUM, users will be able to search visually and ask Google follow-up questions about on-screen visual content.
  • Google senior vice president Prabhakar Raghavan gave the example of looking at a shirt on Google, then tapping the Lens icon to find, say, socks or another clothing item with a similar pattern.
  • “By combining images and text into a single query, we’re making it easier to search visually and express your questions in more natural ways,” Raghavan said in a company blog post.
  • This feature will roll out in the coming weeks, with further enhancements over the months that follow.

More on this: Raghavan explained how MUM is becoming fully integrated into Google Search, including through “Things to know,” which follows up broad searches with a list of subtopics and appears to be an enhanced version of the current “People also ask” feature.

  • Raghavan also announced that when users search for a video, MUM will be able to understand what that video is about, even when the topic isn't explicitly stated in the video, and to surface related topics as potential follow-up queries.

Even more: While MUM may have been the most interesting technology highlighted, Google shared other updates:

  • These included deeper integration of shopping into search results, the addition of Google Lens to iOS and the Chrome desktop browser, and improved tools for evaluating the credibility of news sources.

What this means: These upgrades to an already dominant search engine will make it increasingly difficult for other players to encroach on Google’s search business.

  • “MUM is multimodal, so it understands information across text and images and, in the future, can expand to more modalities like video and audio,” Google wrote in May.
  • Expect Google to make further announcements about how MUM can be used for video searches. If MUM is extended to audio queries, it could help Google Home compete with Amazon's Echo product line.

"Behind the Numbers" Podcast