Gemini Nest Camera Reports Aren’t That Useful Yet, but You Can Improve Them

I may be home all the time, but I love Nest’s Gemini-powered dashboards, even if they’re not always accurate. The new dashboards let me glance at a notification and quickly decide whether it’s worth a closer look. They’ve become the metronome of my day, as I watch the local feral cat hunt mice in my backyard. And the daily Home Brief neatly summarizes the relative chaos that’s reigned in my house.

I don’t even use the latest Nest hardware. I have first- and second-generation cameras, including two Nest indoor cameras from 2012 and several 2021 models. It’s been interesting to see how Gemini has breathed new life into older hardware, though it’s still far from the cutting-edge smart home experience Google promises, especially at this price point. There’s a long way to go before it becomes the robust contextual assistant Google wants it to be. Here’s what to expect when Gemini becomes an integral part of your Nest camera experience, and what you can do to make it work better.

Where Gemini Outperforms Nest

I’ve been actively testing Gemini in the new Google Home app since its launch last month, following a significant, multi-year redesign. However, most of the features discussed in this article are only available on the Advanced tier of the Google Home Premium subscription, which costs $200 per year. That tier unlocks Gemini’s Home Brief, Ask Home video history search, detailed event descriptions and notifications, and 60 days of swipeable event history. Alternatively, you can opt for the Google Home Premium Standard plan, which costs $100 per year and includes 30 days of event history, but not Home Brief or the other AI-powered features mentioned here.

Gemini’s smart home features are billed separately from the regular Gemini chatbot, unless you subscribe to Google’s AI Pro plan, which bundles several subscriptions. The summary and search features do save time when enabled, but they haven’t yet justified their price. Right now, the cost only makes sense if you want the richer AI-generated notifications and the 60-day event history.

You can ask Gemini to tell you more about the Home Brief and ask about specific events. Photo: Florence Ion/Lifehacker

On Android and iOS devices, the app is divided into three main tabs: Home, Activity, and Automation. Gemini dashboards are located in the Activity tab and are updated every morning, reflecting the previous day’s events. You can filter notifications by device if you’re more interested in what your doorbell camera captured than what your backyard camera captured, or you can view all dashboards at once.

The reports tend to be variations of the same message every time. They’re simple and concise, often reading as a status update rather than a motion alert. Reports look something like this: “A person with a child walked past the front door, followed by another person.” Package notifications specify whether the package was delivered by FedEx or UPS, and if the system can’t determine the carrier, it falls back to “courier.” In most cases, if your cameras are labeled correctly, Gemini will identify which one captured the action: “Two different people walked past the front door, and the camera at the side door captured a cat.” If the system identifies people using the Familiar Faces feature, Home Briefs can be even more dynamic, bordering on narrative: “Flo was seen interacting with the child, picking him up, and then sitting with him on the couch.”

A Gemini Home Brief can be as simple as it is colorful, like this one. Photo: Florence Ion/Lifehacker

While reading the same boilerplate over and over can be tedious, that familiarity makes it easy to skim for anything out of the ordinary. I usually check the summary before delving into the details. If something stands out—for example, “an unknown person approached the porch, looked into the camera, and shone a flashlight before leaving”—I tap the Gemini icon to start a conversation about the day’s events. The system pulls up camera footage from the devices selected for review, along with the day’s events, after which I can type or dictate my questions.

A detailed Gemini notification on the Pixel Watch 4. Photo: Florence Ion/Lifehacker

The current version of the Home Brief feature lacks detailed controls, and you can’t choose whether Gemini elaborates on events or keeps things brief. You can ask Gemini to focus on certain things in a general sense, such as ignoring vehicles or animals, but more specific settings, like focusing on cats instead of possums, aren’t available. Push notifications are managed the usual way, through the Google Home app and each individual camera.

Home Brief customization options are currently quite limited. Source: Florence Ion/Lifehacker

Gemini has problems with people

While Gemini excels at distinguishing a possum from a cat and a dog from a raccoon, it still stumbles on perhaps the most important recognition task, and a core part of Google Home Premium’s paid features: identifying the people who actually live there. Gemini doesn’t always understand who’s at the door or inside the house. Since I started testing, it’s only identified my husband and me by name twice. It’s been pretty good at identifying my child as a child, though it’s labeled her as “children” in the plural several times, which startled me the first time I saw the notification. It never calls her by name, even though she’s registered as a Familiar Face. Even when Gemini provides details—“a person leaves, followed shortly by a person and a child”—it doesn’t notice that it’s the same person entering and exiting the house one after the other.

Over the past month, I’ve been reading Reddit threads complaining about similar issues. A quick look at Google’s support pages reveals that the Familiar Faces feature hasn’t always been stable, even before Gemini. We know that AI tends to hallucinate , and there are even cases where the Nest camera feeds in Gemini seem to be making things up . Now that AI is part of the daily feed, the problem is even more obvious. And while I appreciate the AI’s cautious treatment of strangers (usually just “a person”), Gemini’s failure to signal when it’s seeing the same face makes the Nest cameras feel more like overactive motion sensors.

Another issue is that the descriptions in Gemini’s dashboards aren’t always accurate. The Google Home activity dashboard once showed a person with a flashlight looking at the doorbell camera, and naturally, the description startled me when I read it. After digging through the timeline and talking to my husband, it turned out we’d had an overnight delivery from Amazon—common now that daylight saving time has ended and it gets dark early—and the driver had used a flashlight to verify the address matched the house. While I don’t yet expect Gemini to handle that level of detail, omitting that it was a delivery person only added to the confusion.

Gemini noticed a person approaching the porch but neglected to mention it was a delivery driver, and my heart sank as if I’d missed something. Photo: Florence Ion/Lifehacker

Perhaps the most egregious incident of all was Gemini missing the moment someone stole all our candy on Halloween night. It didn’t even alert me that there were people in the frame. A little less than an hour before the incident, the first-generation Nest doorbell had detected my family and me returning from trick-or-treating, right down to the colors of our costumes. But when two adults and a child approached the edge of the walkway in front of my front door, it didn’t register the event at all. The Google Home app simply labeled it as a “sound.”

I reviewed the timeline many times before I saw the moment the candy was stolen. The lack of an event report linked to the video made it difficult to pinpoint the exact time. Eventually, I recorded a screen capture of several minutes of video around the time the “sound” was logged. That’s when I saw one of the adults in the group stand in front of the bowl of candy, presumably to conceal its contents being emptied into a bag. If not for that physical obstruction, the first-generation Nest doorbell camera would have captured the entire incident. Even so, it never alerted me that anyone was at my door, which is exactly the information a doorbell camera exists to provide.

How to properly set up a Nest camera

While Gemini still requires significant work on Google’s part, there are several things you can do to optimize your Nest cameras so the AI reports are more accurate. Gemini’s reports are only as good as the data the cameras collect, so improving your Familiar Faces library can be a huge help. This increases Gemini’s “confidence,” so to speak, so the system doesn’t simply default to labeling everyone as a “person.”

When I started curating Familiar Faces, I noticed that Nest was combining my face with our caregiver’s, and my husband’s face with our daughter’s. It’s nice that the AI picks up on subtle similarities, but it doesn’t help identify who’s actually at the door. If the camera captured the wrong photo, or even a blurry one, you can delete it from your Familiar Faces library. While you’re at it, check whether the camera has created multiple profiles for the same person. You can merge them to make the data more complete and reduce the chance of Gemini mixing people up.

The best way to fix Gemini’s mistakes is to edit your Familiar Faces library. Photo: Florence Ion/Lifehacker

If your Google Home is full of old Nest devices, keeping the camera lenses clean and properly mounted can work wonders. Google support recommends positioning doorbell cameras about 1.2 meters off the ground, while regular cameras can be mounted between 1.8 and 2.4 meters. Most people are captured within 3 meters of the camera. Light and shadows are also worth considering, so check what the camera sees at different times of day. It’s best to avoid placing the camera where the sun or bright outside light shines directly into the lens, but unfortunately, my front door faces west, and the sun sits there most of the afternoon.

Activity zones are also important for letting Gemini tell you what’s going on. If the camera points at a wide area with a lot of unimportant movement, like trees swaying in the wind, use activity zones to mark where packages are dropped in the camera preview. The AI will skip the plants and focus on the highlighted area. If I had extended my activity zone farther out, the doorbell camera might have caught the Halloween candy theft.

Gemini is still learning

While aging Nest hardware has become somewhat more useful now that the Gemini dashboards are an integral part of the user experience, the system still struggles with context. Nest cameras are good at general motion detection, but Gemini struggles to tell people apart and to decide when a human-related event warrants a response.

I contacted Google with the very specific example of my Halloween candy saga to find out what the AI needs in order to catch an event like that. I received a response with advice on how to improve Gemini’s chances of generating accurate summaries, but unsurprisingly, no more specific answer. The new version of Gemini is in its early stages, meaning this is just the beginning. Google says it wants user feedback to understand what needs to improve over time in the integrated Gemini experience.

Nest’s Gemini dashboards in the Google Home app are a valuable tool for reducing notification fatigue and quickly checking what’s happening at your door. But until the AI can reliably distinguish a family member from a well-meaning delivery driver, and routine activity from a high-priority event, you’ll have to keep doing your own checking. That’s not to say the foundation is lacking. Gemini is just still learning.
