22 March 2019
Google has announced changes to the way we search for images on its search engine. Among the important updates is the integration of Google Lens into image search. The first major change, however, is the creation of AMP stories using AI.
Google describes AMP stories as an open-source library that makes it easy for anyone to create a story on the open web, offering searchers a more visual format.
Google has announced that it will begin showing this content in Google Images and Discover as well, and these AI-constructed AMP stories will also surface in Search. As an example, Google pointed to celebrity chef Giada De Laurentiis. These AMP stories are now rolling out for notable people, such as celebrities and athletes.
Furthermore, Google is integrating Google Lens directly into mobile image search. The idea is simple: if you find an image within image search results and want to know more about it, you can tap a Lens icon to expand your search. The best part of this feature is that the new Lens integration won't require a Lens-enabled smartphone; it will work on any smartphone.
As an example, Google demonstrated a search for "nursery," where you might see a crib you want to buy. Encountering the perfect image is one thing; tracking down the product is another. Finding the exact model of the crib with nothing but that image could prove challenging, since the only keywords you could supply are the color and "crib."
Also, a new "featured videos" card with automatically playing results will appear for some queries. Google explains:
Imagine you’re planning a hiking trip to Zion National Park, and you want to check out videos of what to expect and ideas for sites to visit. Since you’ve never been there, you might not know which specific landmarks to look for when mapping out your trek.
With featured videos, we take our deep understanding of the topic space.