
Do you know how Google Lens works? A product that makes life easier

Google Lens is a cutting-edge product from Google that makes life easier by helping you understand, and get details about, whatever you scan with your smartphone through this app.

How does it work? Google Lens, a mind-blowing Google product

Google Lens uses a combination of image recognition and machine learning to analyze what you see through your phone’s camera or in an image you provide. Here’s a breakdown of the process:

1. Image Capture or Selection: You can either point your phone’s camera at an object or upload a photo from your gallery or the web.

2. Image Preprocessing: Once you capture or select an image, Google Lens performs initial processing to optimize it for analysis. This may involve adjustments like resizing, color correction, or noise reduction.

3. Feature Extraction: The system extracts key features from the image, such as shapes, edges, colors, and textures. These features are crucial for identifying the object or scene in the image.

4. Machine Learning Model Matching: The extracted features are compared against machine learning models trained on vast collections of images to find the closest visual matches.

5. Contextual Understanding: Lens refines the match using contextual signals, such as surrounding text, your location, or what else appears in the scene.

6. Result Generation: Based on the identification, Lens generates relevant results, such as search links, translations, shopping listings or visually similar images.

7. User Interface: Finally, the results are overlaid on your screen, where you can tap to learn more, copy text, translate or shop.
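
The steps above can be sketched in miniature. The snippet below is a toy illustration only: Google’s real pipeline relies on deep neural networks trained on enormous image collections, while this stand-in uses hand-picked features and a two-entry “database”, so every function and value here is an assumption made for demonstration.

```python
import numpy as np

# Hypothetical sketch of the Lens-style pipeline described above.
# Google's actual models are proprietary; this only illustrates the stages.

def preprocess(image, blocks=8):
    """Step 2: shrink the image by block averaging and normalize to [0, 1]."""
    h, w = image.shape
    bh, bw = h // blocks, w // blocks
    small = image.reshape(blocks, bh, blocks, bw).mean(axis=(1, 3))
    return small / 255.0

def extract_features(image):
    """Step 3: crude features -- overall brightness plus edge strength."""
    gx = np.abs(np.diff(image, axis=1)).mean()  # horizontal gradients (vertical edges)
    gy = np.abs(np.diff(image, axis=0)).mean()  # vertical gradients (horizontal edges)
    return np.array([image.mean(), gx, gy])

def match(features, database):
    """Step 4: nearest-neighbour lookup standing in for a trained model."""
    return min(database, key=lambda label: np.linalg.norm(features - database[label]))

# Toy "model": labelled feature vectors for two known scenes.
db = {
    "flat wall":      np.array([0.5, 0.0, 0.0]),
    "striped awning": np.array([0.5, 1.0, 0.0]),
}

# Step 1: a synthetic 64x64 image with vertical stripes 8 pixels wide.
img = np.tile(np.repeat(np.array([0.0, 255.0] * 4), 8), (64, 1))
print(match(extract_features(preprocess(img)), db))  # -> striped awning
```

The stripes produce strong horizontal gradients after downscaling, so the nearest neighbour is the “striped awning” entry; in the real product, steps 5–7 would then attach context and render the result in the app.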

Watch: How Google Lens helps you search what you see:

Also watch it at : https://youtu.be/BH_6BdgTdiw

Google Lens: 8 ways that Lens can make your life a bit easier

1. Learn about the things you see as you go about your day

If you see a building or landmark that you don’t recognize, Google Lens can tell you what you’re looking at and provide links to learn more. Similarly, whether on the road or in your own backyard, it’s not uncommon to come across plants and animals that you can’t quite identify or describe perfectly with words; Lens can put a name to those, too.

2. Search for skin conditions

Describing an odd mole or rash on your skin can be hard to do with words alone. Fortunately, there’s a new way Lens can help, with the ability to search for skin conditions that are visually similar to what you see on your skin. Just take a picture or upload a photo through Lens, and you’ll find visual matches to inform your search. This feature also works if you’re not sure how to describe something else on your body, like a bump on your lip, a line on your nails or hair loss on your head. This feature is currently available in select markets.

This is an image comparison feature that matches to images available publicly on the world wide web. This feature does not constitute a medical analysis of the image. Search results are informational only and not a diagnosis. Consult your medical authority for advice.

3. Translate street signs, menus and more into over 100 languages

If your summer plans involve travel, Lens can help you bridge the language barrier. Using the Translate filter in Lens, you can upload or take a picture, or even just point your camera at the text you want to translate, like a menu or a street sign. Lens will automatically detect the written language and overlay the translation on top of it, directly on your phone screen.

4. Get step-by-step help with homework problems

If you’re stuck on a homework problem in math, history or science, tap the “homework help” filter, then snap a picture, and Lens will share instructions to help you learn how to solve the problem. The homework help feature also enables you to tackle questions in multiple languages, and you can set your preferred language for search results.

5. Shop for products that catch your eye

If you’re browsing on your phone and notice a product that you’d love to get your hands on — maybe a snazzy pair of walking shoes or a sleek and functional backpack — you can use Lens to find and buy one of your own. Just take a screenshot and select it in Lens, and you’ll get a list of shoppable matches with links to where you can make a purchase. It works the same way if you see something you want to buy while you’re out and about: Point your camera with Lens, snap a pic and you’ll see options from online merchants.

6. Or find different versions of those eye-catching products

About those snazzy walking shoes — maybe they’d be even better in blue. Multisearch in Lens lets you combine both words and images to find exactly what you’re looking for. In this case, snap a picture of the shoes in Lens and then swipe up to add words to your search (like “blue”). Lens will then show you similar shoes in the color of your choice. This also works with patterns — say you see a fun shirt and would love that pattern for your curtains. Take a pic of the shirt in Lens, swipe up and type “curtains” — and there you have it.
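
Conceptually, multisearch combines a visual signal with a text refinement. The toy sketch below is an assumption about how that might work, not Google’s implementation: the product names, feature vectors and the `multisearch` helper are all invented, and the “catalog” stands in for a real product index.

```python
# Toy sketch of multisearch: rank by visual similarity, then filter by a text term.
products = [
    {"name": "trail shoe", "color": "red",  "features": [0.9, 0.1]},
    {"name": "trail shoe", "color": "blue", "features": [0.9, 0.1]},
    {"name": "backpack",   "color": "blue", "features": [0.1, 0.8]},
]

def multisearch(query_features, refinement, catalog):
    """Rank catalog items by closeness to the query image's features,
    then keep only those matching the added search word."""
    def dist(item):
        return sum((a - b) ** 2 for a, b in zip(item["features"], query_features))
    ranked = sorted(catalog, key=dist)
    return [item for item in ranked if refinement in item.values()]

# A photo of red shoes plus the word "blue": the blue version of the
# visually closest product comes back first.
print([p["name"] for p in multisearch([0.9, 0.1], "blue", products)])
# -> ['trail shoe', 'backpack']
```

The ordering matters: visual similarity decides the ranking, and the typed word narrows it, which mirrors the “snap a picture, then swipe up and add words” flow described above.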

7. Discover delicious food near you

Multisearch also works for finding things nearby, like food from local restaurants. Let’s say you stumbled across an image of a dish you’re dying to try, but you’re not sure what it’s called. Just pull up that image in Lens and add the words “near me” to your search; Lens will show you nearby restaurants that serve what you’re looking for.

8. Unleash your creativity with Lens + Bard

As Google shared at I/O, Lens is joining forces with Bard, an experiment that lets you collaborate with generative AI. Whether you want to learn more about something you saw, or explore completely new ideas in a more visual way, you can partner with Bard to start that journey. In the coming weeks, you’ll be able to include images in your Bard prompts, and Lens will work behind the scenes to help Bard make sense of what’s being shown.

For example, you can show Bard a photo of a new pair of shoes you’ve been eyeing for your vacation, and ask what they’re called. You can even ask Bard for ideas on how to style those gladiator sandals for a complete summer look, and then continue browsing on Search — using the “Google it” button — to explore a wide range of products from retailers across the web.
