This tool can be used for complex searches, such as those that combine images and textual phrases. Google Multisearch is an excellent way to learn more about the object in front of you when you don't know how to describe it.
Google Multisearch in Lens allows you to ask Google questions about what you see. Google's advancements in AI will help you get the right answer.
How To Use Google Multisearch
Download the most recent update to your Google App, and then follow these steps.
Open the Google App on Android or iOS.
Tap the Lens camera icon.
Take a photo of your surroundings or upload a saved image from your gallery.
Tap the "+" button at the top to add text.
Google allows you to ask a question about the object in front of you. You can also use text to refine your search by color, brand, or other visual attributes.
Google gives the following examples of Multisearch's intended use cases (a brief conceptual sketch follows the list):
Take a screenshot of a fashionable orange dress and add the query "green" to search for it in another color.
Take a picture of your dining room and use the query "coffee tables" to search for a matching table.
Take a photo of your rosemary plant and add the query "care instructions."
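To make the combination concrete, here is a minimal, purely illustrative Python sketch of how an image and a text refinement could travel together as a single query. The MultisearchQuery class, the build_query helper, and the "orange_dress.png" path are hypothetical placeholders; Google has not published a public API for Lens Multisearch, so this models only the concept, not the actual implementation.

```python
# Conceptual sketch only: MultisearchQuery and build_query are hypothetical
# illustrations of an image-plus-text query, not part of any Google API.
from dataclasses import dataclass
from pathlib import Path


@dataclass
class MultisearchQuery:
    image_bytes: bytes    # the photo or screenshot, e.g. the orange dress
    text_refinement: str  # the added text, e.g. "green"


def build_query(image_path: str, refinement: str) -> MultisearchQuery:
    """Pair a saved image with a text refinement, mirroring the
    screenshot-plus-"green" example above."""
    return MultisearchQuery(
        image_bytes=Path(image_path).read_bytes(),
        text_refinement=refinement,
    )


if __name__ == "__main__":
    # "orange_dress.png" is a placeholder path for a saved screenshot.
    query = build_query("orange_dress.png", "green")
    print(f"{len(query.image_bytes)} image bytes, refined by {query.text_refinement!r}")
```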
Google's AI advances make it easier for people to explore the world around them in natural and intuitive ways. The company is also exploring how MUM could enhance this feature and improve its results.
MUM to Improve Google Multisearch Further
Google said in its blog post, "Using text and images at the same time, with multi-search on Lens, you can go beyond the search box and ask questions about what you see." The new feature is made possible by Google's latest advancements in artificial intelligence, "making it simpler to thoroughly understand the world around you in more intuitive and natural ways." Google is also exploring ways in which this feature could be enhanced by MUM, its latest AI model in Search, an enhancement that could offer results for all the questions you could imagine asking. Google is focusing heavily on AI-driven models for advanced search capabilities, such as the Pathways Language Model (PaLM).
Though it appears primarily aimed at shopping, much more is possible. Google mentions, "You could imagine you have something broken in front of you, don't have the words to explain it, but you want to fix it… you can just type 'how to fix.'" The company hopes AI models can drive a new era of Search in which text alone is not enough.