During Google’s Search On event last night, the company announced new features that it said would “reimagine Search through advancements in artificial intelligence”.
Google has been working over the years to make inputting queries more natural and results more personalised, and it is continuing that trend this year thanks to advancements in AI and computer imagery.
Multisearch with images, voice, and text
Multisearch makes searching with images easier: take a picture with your smartphone’s camera or pick an image from your photos, add it to the search bar, then add text or a voice query for extra context to call up relevant results.
For example, open the Google app on your smartphone, tap the Lens icon and snap a picture of a dress, then tap the "+ Add to your search" button to add text like “green”. Search will use both the image and the text to find relevant results, like the price of the dress in green from your favourite online shops.
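Google has not published how Multisearch combines the two signals, but the underlying idea of embedding the photo and the text refinement into a shared space and ranking results against both can be sketched with an open vision-language model. The snippet below is purely illustrative; the CLIP model, the toy catalogue and the naive averaging fusion are assumptions, not Google’s implementation.

```python
# Illustrative sketch of image + text search, NOT Google's Multisearch code.
# It embeds a photo and a short text refinement with an open CLIP model and
# ranks a toy catalogue by similarity to the combined query.
from PIL import Image
from sentence_transformers import SentenceTransformer, util

model = SentenceTransformer("clip-ViT-B-32")  # joint image/text encoder

# Hypothetical catalogue of product descriptions to search over.
catalogue = [
    "green floral summer dress",
    "red evening gown",
    "blue denim jacket",
    "green cocktail dress with lace trim",
]
catalogue_emb = model.encode(catalogue, convert_to_tensor=True)

# Embed the snapped photo and the text refinement, then fuse them.
photo_emb = model.encode(Image.open("dress_photo.jpg"), convert_to_tensor=True)
text_emb = model.encode("green", convert_to_tensor=True)
query_emb = (photo_emb + text_emb) / 2  # naive averaging, for illustration only

# Rank catalogue items by cosine similarity to the fused query.
scores = util.cos_sim(query_emb, catalogue_emb)[0]
for idx in scores.argsort(descending=True).tolist():
    print(f"{scores[idx].item():.3f}  {catalogue[idx]}")
```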
AI and Maps
The world is also getting closer thanks to AI improvements to Immersive View in Google Maps. Going somewhere new and want to get a feel for it first? Users can visually soar over an area or landmark to see what parking or street conditions are like at any given time.
Using predictive modelling, Immersive View automatically learns historical trends for a place to determine what an area will be like tomorrow, next week and even next month.
So, if you’re visiting San Francisco and want to visit Oracle Park, you’ll have all the information you need to plan your day: you’ll be able to see parking lots and entrances, and know whether you need an umbrella or jacket. Need to find a bite to eat after the game? Google can find whatever you feel like eating, with the ability to glide down to street level, peek inside and see how busy it’ll be.
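Google does not detail its predictive model, but a very rough way to picture “learning historical trends” is to average past observations of a place by weekday and hour, then read that profile off for a future time. The sketch below is a toy illustration with made-up busyness data, not Google’s method.

```python
# Toy sketch: build a (weekday, hour) busyness profile from historical
# observations and use it to estimate conditions for a future visit.
# Purely illustrative; the data and scoring are invented.
from collections import defaultdict
from datetime import datetime

# Hypothetical historical observations: (timestamp, busyness score 0-100).
history = [
    (datetime(2022, 9, 3, 18), 85),   # Saturday evening, game day
    (datetime(2022, 9, 10, 18), 78),
    (datetime(2022, 9, 17, 18), 90),
    (datetime(2022, 9, 6, 11), 25),   # Tuesday late morning
    (datetime(2022, 9, 13, 11), 30),
]

# Aggregate by (weekday, hour) to build the place's typical profile.
profile = defaultdict(list)
for ts, busyness in history:
    profile[(ts.weekday(), ts.hour)].append(busyness)

def predict_busyness(when):
    """Estimate busyness for a future datetime from the historical profile."""
    samples = profile.get((when.weekday(), when.hour))
    return sum(samples) / len(samples) if samples else None

# e.g. how busy will it be next Saturday at 18:00?
print(predict_busyness(datetime(2022, 10, 1, 18)))  # -> ~84.3
```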
Immersive View will roll out in the coming months in Los Angeles, London, New York, San Francisco and Tokyo on Android and iOS, with some 250 landmarks included.
Google is also expanding Live View with a feature called Search with Live View, which helps you find what’s around you more intuitively.
Need an ATM while shopping at an outdoor market? Lift your smartphone and Search with Live View will show you all the ATMs around you. It also works for other places like coffee shops, grocery stores and stations.
Search with Live View starts rolling out in London, Los Angeles, New York, San Francisco, Paris and Tokyo in the coming months on Android and iOS.
Translation made better
Google Translate is also more powerful and can deliver more relevant results. Google can now blend translated text into the background image thanks to a machine learning technique called generative adversarial networks (GANs).
Point your camera at a magazine in another language, for example, and you’ll see the translated text realistically overlaid onto the pictures underneath. That way, the image and the text both add meaning to the translation, giving you more context and a better understanding.
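Google only says that GANs are involved, so the sketch below is not its pipeline. It merely illustrates the general idea of erasing the original text and drawing the translation over a reconstructed background, using classical OpenCV inpainting instead of a GAN; the text box coordinates and the translated string are assumed inputs that would normally come from OCR and the translation model.

```python
# Illustrative sketch only, not Google's GAN-based pipeline: erase the original
# text region with inpainting, then draw the translated text over the
# reconstructed background.
import cv2
import numpy as np

def overlay_translation(image_path, text_box, translated_text):
    """Replace the text inside `text_box` (x, y, w, h) with `translated_text`."""
    img = cv2.imread(image_path)

    # Mask covering the original text region.
    mask = np.zeros(img.shape[:2], dtype=np.uint8)
    x, y, w, h = text_box
    mask[y:y + h, x:x + w] = 255

    # Fill the masked region from surrounding pixels so the background
    # (photo, texture, colours) is reconstructed where the old text was.
    clean = cv2.inpaint(img, mask, 3, cv2.INPAINT_TELEA)

    # Draw the translated text back onto the reconstructed background.
    cv2.putText(clean, translated_text, (x, y + h - 5),
                cv2.FONT_HERSHEY_SIMPLEX, 0.8, (20, 20, 20), 2, cv2.LINE_AA)
    return clean

# Hypothetical usage: box coordinates would normally come from an OCR step.
result = overlay_translation("magazine_page.jpg", (40, 60, 300, 40), "Fresh ideas")
cv2.imwrite("translated_page.jpg", result)
```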
This article was first published in HardwareZone.