Google Lens was first shown off alongside the Pixel 2. The feature uses computer vision to recognise objects and places seen through the phone's camera, or within an image the user is viewing, and serves up relevant information about them. During the keynote, Google announced that Lens would arrive as a preview on Pixel devices later this year. Built to surface relevant information through visual analysis, it fits Google CEO Sundar Pichai's "AI first" vision for the company. Launching Lens from within Google Assistant is straightforward: tap the icon in the lower-right corner of the screen, which opens the viewfinder and lets you analyse whatever is in front of the camera. The feature can also identify objects and landmarks in existing photos.
You can also look up details on local businesses via the Knowledge Graph, pull contact information from business cards, translate text from other languages, and extract event information; the applications are practically endless. Google confirmed that Lens will come to its virtual assistant but didn't reveal when. You might be wondering what the point is of having it in Google Assistant when it is already available in Google Photos.
Users are now seeing a new Lens button in Google Assistant that opens the camera when pressed. Google Lens brings all of this together into a simple package that lives inside Google Assistant. For now, the feature remains a Pixel exclusive.