You’ll Soon Be Able to Send Videos and Screenshots to Google’s Gemini AI

At Mobile World Congress this week in Barcelona, Google hinted that its Gemini AI is finally ready to see the world. Literally.

Originally unveiled last May as Project Astra, Google’s Gemini Live with Video and Gemini Live with Screenshare features have a new trailer showing how the search giant plans to catch up with ChatGPT’s advanced voice mode. Google One AI Premium subscribers will soon be able to share their phone screens and real-time video with Gemini, which the chatbot can then use to answer questions.

ChatGPT’s advanced voice mode gained these features late last year, but it also requires shelling out at least $20 a month for a ChatGPT Plus subscription. The Google One AI Premium plan costs the same, but it also includes 2TB of storage for your Drive and Gmail accounts, so depending on your needs, it may be the better deal.

In the trailers, Google shows users accessing Gemini Live through the Gemini phone app to start a real-time conversation with the AI, then tapping the video or screen share button at the bottom of the screen to get started. In the Gemini Live with Video demo, we see someone showing the chatbot freshly fired vases, as well as glaze samples, and asking for advice on which one to choose. In the Screenshare demo, the chatbot instead scans a list of stores for a pair of jeans and gives advice on what clothes to pair them with.

I would love to meet someone who would go to all the trouble of throwing and firing a vase without knowing what color glaze to use, but you get the point. Gemini will soon be able to use live video and screenshots as input when answering queries.

Unfortunately, Google hasn’t said anything more yet. When Project Astra was originally teased, it boasted such impressive (and creepy) abilities as “the ability to tell where you live just by looking out the window.” These trailers seem significantly scaled down by comparison, but they’re also clearly just short demos. I’m curious to see how this feature will actually work once users get their hands on it, which Google says will be “later this month.”
