
Google Search redesigned using MUM's multimodal understanding capability

30 September 2021

Google is once again leveraging the power of AI in Search to present better results for complex queries. At its Search On 2021 event, Google explained how it is using MUM (Multitask Unified Model), an AI milestone it announced at I/O earlier this year, to better understand what a searcher needs.

Before we share an early look at what will be possible with MUM, here's a quick brief on what MUM is. It is a model trained to handle and understand multiple queries together, and it develops a more comprehensive understanding of information and world knowledge than previous models. Google says MUM is 1,000 times more powerful than BERT (an earlier deep learning model for language understanding). MUM is also multimodal, meaning it understands information across both text and images.

At its Search On event today, Google explained what it has managed to make possible using MUM. Over time, the company says, it has unlocked several of the model's capabilities, and making search results more relevant is one of the applications that leverages MUM's true potential.

In the coming months, Google is introducing a new way to combine images and text into a single query and present better results, including helpful answers to questions you didn't explicitly ask.

Google Search On event

Google shared an example to make the new feature clearer. In the GIF, a user performs a regular visual search using Google Lens, and Google pulls up matching results as usual. From there, the user can do one of two things: tap on a result to check out and buy a similar product, or refine the search further. Here is where things take a twist.

What if the user wants socks with the same floral pattern? That is exactly what the example shows. After Google Lens pulls up the matching results, the user can tap "Add questions" at the top and type a query. In this case, the user writes "socks with this pattern," and Google Lens fetches the desired results.

Imagine running the same search with text alone. The query could get quite complex, and you might never find what you are looking for. Google is making such complex queries easier.

While introducing MUM back in May, the search giant said it had found that, on average, it takes users eight different searches to get all the answers they need for a complex task. This is where MUM shows its prowess and makes complex searches easier.

The model has been trained to better understand intent and present exact matches. With MUM, Search can handle multiple tasks at once and works across 75 different languages.

This is an early look at what is possible with MUM; there is, of course, more to come. Google said the first version of this feature will roll out in the coming weeks, with more visual enhancements to follow in the coming months.

MUM is also coming to Google Search to make looking up queries more natural and intuitive. Google is adding a new "Things to know" section to search results, again to help with complex searches. Sometimes you want to learn more about a subject, or even explore angles you hadn't thought of yourself. That is where "Things to know" comes in, suggesting more topics around a particular subject. For instance, if you search Google for "acrylic painting," Google will surface the things you might want to know first.

Google Search On event Things to know

Google explains: "If you search for 'acrylic painting,' Google understands how people typically explore this topic and shows the aspects people are likely to look at first. For example, we can identify more than 350 topics related to acrylic painting, and help you find the right path to take."

So, Google will surface deeper insights you might not have known to search for, like "how to make acrylic paintings with household items."

In May, Google announced that MUM, being multimodal, currently understands information across text and images, and could in the future expand to more modalities like video and audio. At its Search On event today, Google shared a peek into what it has achieved with the trained model so far. Google is now upgrading video search using MUM, and this looks like just the start. Search already identifies key moments inside a video; with MUM, Google is going further, launching a new feature that identifies related topics in a video, even when those topics are not explicitly mentioned in it.

Google Search On video search

Search will then provide links that let users dig in and learn more. These changes will be available in the coming weeks, in English, in the US.

Google said,

"Across all these MUM experiences, we look forward to helping people discover more web pages, videos, images and ideas that they may not have come across or otherwise searched for."



