Google has outlined a range of coming search updates designed to streamline product discovery and use multiple inputs to provide better contextual search results.
The main addition is the implementation of its Multitask Unified Model (MUM) in search, which will enable people to use variable inputs, including visuals, as search parameters, facilitating expanded discovery.
As explained by Google:
“In the coming months, we’ll introduce a new way to search visually, with the ability to ask questions about what you see.”
As shown in Google's example, its advanced search capacity will soon enable users to use a visual as a reference point – so if you wanted socks with a particular design pattern, you could use an image of that pattern as the trigger to search for the same design in a different product category.
The same capacity could also be used when you don't know what something is called, or when you just want to skip the usual steps (e.g. searching for the right term, reading about bike parts, identifying the element you need).
It's a significant advance in search capability, and it could open up new considerations for discovery, and for how people arrive at your web pages based on their behaviors and queries.
Google's MUM process will also facilitate broader contextual searches, based on advanced machine understanding, with Google also rolling out a new element called 'Things to know' to help guide searchers in the right direction.
“If you search for “acrylic painting,” Google understands how people typically explore this topic, and shows the aspects people are likely to look at first. For example, we can identify more than 350 topics related to acrylic painting, and help you find the right path to take.”
This, again, could become another SEO consideration, with more aspects of projects and related discovery being added into search results. It could make it very valuable to stay in touch with the latest trends, and to create website content around these elements, in order to maximize discovery.
Google's also adding a more visually-aligned, deep-dive search option for selected topics, as well as a new experience that identifies related topics in a video, with links to explore further.
“Using MUM, we can even show related topics that aren’t explicitly mentioned in the video, based on our advanced understanding of information in the video. In this example, while the video doesn’t say the words “macaroni penguin’s life story,” our systems understand that topics contained in the video relate to this topic, like how macaroni penguins find their family members and navigate predators.”
Google says that the first version of this feature will roll out in the coming weeks, with more updates to follow in the months ahead.
Visual search is also a key component of Google's advanced Lens search process, which will facilitate eCommerce discovery by enabling users of the Google app to search based on the images, video, and text content on a website.
“Starting soon, iOS users will see a new button in the Google app to make all the images on a page searchable through Google Lens. Now, finding this lamp or that shirt (and ones like it) is just a tap away.”
Google's also expanding its product listings in the main search feed, drawing on the 24 billion products now listed in Google Shopping, while also adding a new "in stock" filter to local store listings, so that it only displays the nearby stores that have what you want.
There are various considerations within each of these elements, with the MUM advances in particular set to significantly change how Google displays search results, which will have a big impact on discovery.
How, exactly, this might change your SEO approach is less clear, but as these new processes roll out, we'll get more insight into their impact on SERPs, and subsequently on user behavior, which could prompt a new approach to some aspects.
You can read more about Google’s search updates here.