At Google I/O 2017, Google took another step towards an AI-first future, showing off updates to the Assistant and previewing new technology such as neural networks and Google Lens.
WHAT IS GOOGLE LENS
Google Lens is going to be a feature of Google Assistant and Google Photos that can identify the things on your screen. With Google Lens, your phone won't just see things; it will understand them and be ready to help you. It's AI solving problems in the real world. Want a translation on the go? Just point the camera at the text, and Google Lens will translate it. Want information about places around you, like the opening and closing times of a restaurant? Simply point the camera at the shop, and Google Lens will pull the relevant information from the web and location data, then show the results on your phone's screen.
GOOGLE LENS IN GOOGLE PHOTOS
Google announced that Lens will be available in both Google Assistant and Google Photos. In the Photos app, it will be able to identify where an image was taken and recognize buildings and locations inside the picture. It will also be able to distinguish people, animals, and objects in your photos.
HOW TO GET GOOGLE LENS
Google Lens is not available in Google Assistant yet, but the latest beta version of Google Photos already contains Google Lens code.
To enable Google Lens follow these steps:
- Download the latest beta version of Google Photos from the Play Store.
- Download the Google Lens Launcher app.
- After installing both apps, you are good to go: just share any image and choose Google Lens Launcher from the share menu.
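If you prefer installing over USB rather than through the Play Store, the two APKs can also be sideloaded with `adb`. This is only a sketch: the APK filenames below are placeholders for whatever files you actually download, and USB debugging must already be enabled on the phone.

```shell
# Sideload both apps over USB with adb (Android SDK platform-tools).
# The .apk filenames are hypothetical placeholders for your downloaded files.

adb devices                          # confirm the phone is connected and authorized
adb install -r google-photos-beta.apk   # -r keeps existing app data if Photos is installed
adb install lens-launcher.apk

# Afterwards, share any image on the phone and pick the Lens launcher
# entry from Android's share sheet.
```

The `-r` flag reinstalls over the existing Google Photos app so you keep your library settings instead of starting clean.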