Camera E1 “person detected” notifications

Throughout the workday, I keep getting notifications saying “person detected”. But it’s not actually a person - it’s my dog, who stays home while I’m at work.
Animal Detection became a paid feature, so I don’t use it. As for Motion Detection, it mistakes the light turning on and off for movement, making it completely unusable.
If Animal Detection is going to be a paid service, Aqara should improve Person Detection’s accuracy in identifying actual people. Turning Person Detection off means no alerts if someone actually breaks in, while turning it on means getting Person Detection alerts every time the dog moves; it’s ridiculous. Can’t they just keep it free and add something like “Non-person Detected”?

Is the reason notifications are sent for non-person objects because we’re not using the paid service? Or is determining whether something is a person technically difficult regardless of whether AI is used??

My dog walker comes to my home at a fixed time; receiving a “Person Detected” notification outside that window is just annoying.

I understand that Aqara wants to make more money, but Aqara should have known that users want to keep their expenses down, right?
Aqara made virtually every essential daily function a paid feature. But does paying for the service really completely eliminate all false alerts?

How are other users in similar situations handling this?

Hello. I have been using the E1 for more than two years, and it has never mistaken a dog for a person. Maybe your camera is placed too high. I have one G5 at a height of 8 meters; it does not see animals and sometimes misidentifies a person as an animal. The recommended height is 2.5–3 meters. Also, experiment with the human-detection sensitivity setting; it may help.

2 Likes

I’m 5’9" (175cm) tall, and the camera is at waist height. When I measured it, the height was 41 inches (104cm). Besides height, what other factors might cause the E1 camera to mistake a dog for a person?

This is my first time posting, and while writing this reply, I reread the Community Guidelines. Does my post content fall within the guidelines’ acceptable range? Or was it something like, “We’ll allow it this first time, but subsequent posts won’t be approved until you make the requested changes”? I also chose the category without fully understanding it—did I select the correct one?

Hi, @Aqara_PM_Donie, can you give some advice regarding the false camera trigger on the dog?
Thanks.

I’m sorry for the bad product experience you had, friend. Let me briefly explain why the G100 has many paid features:

  1. This product is positioned as the “most cost-effective HomeKit camera” in the Aqara camera lineup. As you can see, it is one of the cheapest HomeKit cameras you can buy, so in terms of hardware configuration, it is not a high-performance camera.

  2. Running the HomeKit service consumes a lot of its hardware resources, so locally on the camera we can only integrate motion detection and human detection. We want to add more local AI services, but its RAM is nearly exhausted.

  3. In order to fill the gap in AI recognition, we have moved some AI services to cloud servers. The working logic is that motion detection must first trigger an event; the event video is then uploaded to the cloud service for analysis to determine what triggered it (such as recognizing faces, vehicles, packages, or flames). Cloud-based analysis consumes server resources and data traffic, which is why these AI functions need to be paid for. In addition, there is some lag when events are analyzed on cloud servers. If this is used as an automation trigger, please understand that it is not suitable for use cases requiring real-time performance (such as using facial recognition to unlock a door lock).

  4. I think the problem you are facing is the low accuracy of human detection, which is something we need to improve. You can mark the misidentified videos through the feedback channel in the app, or you can share them with me directly here (if you think that is possible), and I will arrange for my colleagues to optimize the AI model. Based on my experience, when the camera’s overhead angle is adjusted too far (close to parallel to the ground), misidentification of pets as humans increases, because quadrupeds lose some of their distinguishing features at this angle, causing the AI model to classify them as bipedal (human). We are working hard to optimize detection at this angle, but we still need some time. Here is how to provide feedback in the app:
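To make point 3 above concrete, the two-stage logic could be sketched roughly like this. This is a minimal illustration only; the function names, threshold, and simulated latency are all invented for the example and are not Aqara’s actual implementation:

```python
# Illustrative sketch -- NOT Aqara's actual code -- of the two-stage flow:
# a cheap local motion gate runs first, and only then is the clip sent to
# a (simulated) cloud classifier whose answer arrives with some lag.
import time


def local_motion_detected(frame_delta: float, threshold: float = 0.2) -> bool:
    """Stage 1: on-device check that runs on every frame."""
    return frame_delta > threshold


def cloud_classify(clip_name: str) -> tuple[str, float]:
    """Stage 2: simulated cloud AI, only invoked after stage 1 fires.
    Returns (label, latency_seconds); the latency is why this path is
    unsuitable for real-time automation like face-unlock."""
    start = time.time()
    time.sleep(0.01)  # stand-in for upload + server-side analysis
    label = "person" if "person" in clip_name else "other"
    return label, time.time() - start


# Small frame change: the cloud stage is never reached, so no cost is incurred.
assert not local_motion_detected(0.05)

# Large frame change: the clip is uploaded and classified, with some lag.
if local_motion_detected(0.6):
    label, lag = cloud_classify("clip_with_person.mp4")
    print(label)    # person
    print(lag > 0)  # True: classification always trails the event
```

The point of the gate is cost: the cloud classifier only ever sees clips that already passed the cheap local check, which is also why every cloud verdict necessarily lags the event that triggered it.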



1 Like

or feedback from here


3 Likes

Attached is one of the video clips. I also sent feedback.

I’m on a business trip, so I haven’t read your entire message yet. I will go through it later tonight.

got it

Additionally, I would like to add a method to confirm the detected target, as shown in the screenshot below. When you enable this feature, the detected target is surrounded by a box, with different colors representing different object types:
Red - person detected
Green - package detected
Light blue - vehicle detected
Blue - pet detected
Yellow - face recognized
This lets you confirm which target triggered the detection.

1 Like

My colleague analyzed the video you provided and confirmed that the clothing on the door was mistakenly identified as a human.

3 Likes

I removed the clothing from the door and set up a Detection Box. Still, the camera thinks my dog is a person. Or does it think the blanket on the floor is a person?

Here’s another clip.
Looks like a red detection box is for my dog.

1 Like

@ken_k @Aqara_PM_Donie

I have the same problem; in my case it’s the G100 cameras. If there are animals at home, these cameras do not discriminate between an animal and a person. I very much doubt the cameras have even minimal AI, because otherwise they would not make those mistakes. I know it’s a cheap camera, but it’s sold as detecting people, and today it “detects people” without filtering out what isn’t a person. I have the impression the camera only detects volumes, or certain quadrants that move, and confuses them with people. I also have videos. I don’t know if this can be fixed; really, today they only serve as plain cameras, with everything related to AI and notifications turned off (if you have animals).