Google Lens new features hands-on

– So we’re here at day one of Google I/O checking out new features for Google Lens. It’s an AR and AI platform for the company, and it’s basically built into Google Assistant, and now it’s built right into the smartphone’s camera. Google first introduced Google Lens last year, and at the time it was basically a way to look through the camera’s viewfinder and identify objects in photos. Now Lens is much more sophisticated. It takes all of Google’s understanding of natural language processing, object recognition, and image recognition and combines it into one big platform, so that the smartphone can see and understand the world around it and can parse human language. Prior to today, Google Lens
was only available within Google Assistant. Now it works right from the smartphone’s camera, and it works on other devices. Right here we have an LG G7, and we have a whole wall of props behind us that we can use Google Lens to identify and get information about from Google Search. There are three ways
to access Google Lens. The first is to just open the camera and tap the Google Lens button. From there the phone starts looking and trying to identify objects it sees through the viewfinder. The second way to access Google Lens is by touching and holding the home button down here to launch Assistant and then tapping the Lens button. And as you can see right now, Lens already sees and identifies objects, marking them with little colored dots; that’s how it shows what it has recognized. Tapping on one of the dots will pull up Google Search results. So you see it understands
that this is Woman, an album by Justice, and conveniently Justice happens to be the artist performing at Google I/O tomorrow. And the third way to access Google Lens will be a double tap on the camera button, but that only works on the LG G7. If you look at some of the clothing here: whoop, it doesn’t quite identify the clothing, but it asks if I like the clothing. I guess it’s trying to build
a preference profile for me. Let’s try this one. Whoop, there it goes, it
pulled up shopping results from Macy’s and from QVC. So it understands what this item of clothing is and then prompts you to buy it online. Now as you scan Google
Lens over other objects, it’ll slowly start to
recognize everything else that you pan it over. So we have a piece of art right here. That is not correct, hold on. Looking for results. There we go. So it went from the album, but now it knows this is a painting by Pablo Picasso. Right here it sees a photo. And it knows that was
a Norwegian Lundehund. I don’t think I pronounced that right, but it is a dog breed
and Google identified it. So Google Lens isn’t just
for photos and objects. You can do a lot with text now. That includes text inside a book jacket, and it includes text on menus at restaurants. You can point the camera at
a whole list of food items and you can pull up images
of those food items. You can pull up YouTube
videos of how to make them. You can even translate
those food items if they’re in another language into
English or into Spanish or into any other language that you want that Google Translate supports. Now if you’re looking at a book, for instance, like the book
Swing Time by Zadie Smith, you can look at huge passages of text; you can even grab that text using Google Lens and pull it out as if you had just copied and pasted it from a document. From there you can translate that text into another language, and you can even do Google searches on it. Google Lens essentially takes text from anywhere out in the world, street signs, restaurant menus, even books, and makes that text searchable. Now the underlying technology
behind Google Lens isn’t just for looking through a smartphone viewfinder at products or trying to translate text. What powers Google Lens is a lot of the foundational AI work that lets Google do AR experiences. Because Google’s software, and the phones that run that software, can understand and see the world, you can create whole virtual 3D scenes. For instance, you can have
paintings come to life right out in front of you
and you can walk around them; you can even see the reflections of objects behind you in those 3D images, if developers design them the right way and know what environment you’re standing in. That’s pretty wild. You can also point your
camera lens at a podium and have an entire 3D image
come to life in front of you, grow up into the sky and encompass the entire
vertical area around you. Now these Google Lens features are all coming later this month, and as Google said on stage at the I/O keynote, they’re coming to more than just Pixel devices, within the Assistant. You’ll also be able to access them on iOS from within the Assistant itself. But you have to use the Assistant; you won’t be able to access it from the iPhone’s camera, of course. For all the news and
announcements from Google IO 2018, check out TheVerge.com and
subscribe to us on YouTube at youtube.com/theverge.

Comments

Nikka M

    This all looks very cool, but I'm starting to feel that my phone is going to be smarter than me soon.

Mazen Madkour

    Will all of this be featured on Android one even if it's an economic phone like the mi a1 ??

Cyrus James-Khan

    I am really looking forward to advanced on head mounted AR, wile it's cool with our phones it is still just a teaser of what as yet to come 🙂

siegfried greding

    this is cool. but to use it you have to give them so much of your info . so far its not worth it.

notnot jake

    is the computation happening on the cloud? if so, is the camera always sending a live feed to Google? cause that would be creepy…

Adrian Dimayacyac

    Google Lens is always work well when you connect to the Wifi, and here in the Philippines we don't have many wifi on public place, and if there's available, you can connect just only 1hr or even worst 30 mins, and the data we have here is very slow like the network Smart and Globe they are slow in connections. So Sad Philippines.

RivenDawn

    this is what we call a video who reads script and knows nothing. Any phone could do this with support of google.

hungrybearcub

    After a long time I am having the urge to move on to Android again from iOS. The stuff looks cool and usable.

v2Drake

    Question.

    What's to stop me from buying a book, using google lens to quickly copy it and then send it to friends for free?

Atticus Pinzon-Rodriguez

    The same dog identified at 4:41 is identified differently in the video from CNET during the same event https://youtu.be/oywp4Uerd-4?t=41s

Maged Samuel

    Is it going to be available as a separate app on the google playstore? or it's just for pixel phones?

coolyouification

    The text to phone input could be life changing for college students. I hated typing or using the surface pen to write notes because I still prefer my pen and paper.

no no

    Where Mr iphone Bohn how come he is not doing the review am sure he upset siri is not capable of what Google now can do, just wait a few years once Apple releases these features on the iPhone XXX and they are are going to said they are are new innovative features from Apple 😂😂😂

Anonymous Freak

    It's amazing how in their completely contrived, purposefully set up environment, with specifically-chosen items with good separation between items and perfect lighting, it STILL isn't very reliable.

    Hopefully it gets significant improvement by full release.

Hokgiarto Saliem

    I hope Google reboot it camera apps in version 6x. Add all the above feature but it can be use in all android phone starting Android 4.1 and above with 512 mem and any camera hardware 🙂

Chad Porter

    To any Apple manager… if it's not too late. Sell Siri to Google. Maybe for a half decent Starbucks card. Cost of lunches can add up fast.

Florin Bobiș

    Hmm.. what if…but what if..instead of SQL injection, we would have AI based Lens injection? I wonder.. (Lens on Lens) Lensception

Jack Paton

    This is cool but Duplex is absolutely amazing. I think a milestone in AI development, if it works well in the real world

Ree

    Sound mixing is horrible in this video. Couldn't continue watching it, so much noise AND an added music track too.

WATER.ORG

    GoogleLens newFeatures Hands-On | PersonalSecurity and #SmoothFlowSystems aLifeStreamApplication iJussSayin @WeCreateProjects..#InTheFLOW

idcaf

    This team couldnt even configure some things that would be easily identified… How do they expext this to work irl, come on. Good idea but we're definitely NOT there yet.

Isabel Song

    I used Google Lens on a (dark, poorly lit) photo my mom sent me of a wrinkled USPS receipt and was able to have it transcribe the entire receipt and then copy the tracking number for the package. That's when I really fell in love with Lens. I hadn't been sure if Lens could even do that but was too lazy to try and copy the number myself, so I was pleasantly surprised. Google Lens is so convenient and amazing. At first, it's like "okay, this is cool and fun to play with, but what's the point?" and then you get used to it and can't imagine life without it.

Colin Harter

    This is pretty awesome, but it is clear that the technology still has a long way to go. In another CNET video when they were testing it on the same dog image, it said that the picture was a corgi, not a Norwegian lundehund. I'm pretty sure Corgi is the correct answer.

bkdnsjck

    I just saw it on the news, and it looks good. I wanted to get it. But I see I cant get it on Apple 😔

xgallium

    didn't know that google lens was in my phone till i activated ok google and saw a lens icon at the bottom , cool .

ruzzell907

    Apple is just seating there silent. They're doing very well, of course. It's just that has Apple lost its excitement.

Wyatt Frank

    Honestly hate Google Lens. The object finder that came built into my phone was a million times better. I hold it over a simple 3 ounce pyramid sinker and it asks if it's a cat. This program is garage

ed0985587

    Cool tech, but sigh. More people walking around with their phones in their face, more data harvesting. What can ya do…

Freedom Cobra

    Doesn’t work very well. For example, taking a picture of a dog sculpture that I would like to buy, instead google lens pulls up dog breeds? I mean, seems kinda poorly done if Google can’t tell the difference between a statute and a real dog

