Earlier this year, Eufy, a brand owned by Chinese electronics company Anker and known for its popular home security cameras, launched an unusual campaign: it offered users cash in exchange for video footage of package and car thefts. The company said it was collecting these videos to train its AI to better recognize real-life theft scenarios. What made the campaign even more striking was that Eufy actively encouraged users to stage the crimes themselves. According to its website, customers could earn $2 per submitted video, and potentially more by using multiple cameras or acting out different types of theft; faking a car door pull, for example, could net someone up to $80.
At first glance, this might seem like a creative way to engage users: turning home security footage into a small side hustle. But it also opens a broader conversation about privacy, data usage, and how far companies should go in collecting information to train AI systems.
Eufy’s goal was to gather 20,000 videos each of package thefts and car door pulls. Participants were asked to upload videos through a Google Form and provide their PayPal information to receive payment. However, Eufy has not disclosed how many users participated or how many videos were collected. Even more concerning, the company did not clarify whether the footage would be deleted after its AI was trained, or what specific safeguards were in place to protect the data.
Since then, Eufy has continued to roll out similar video collection campaigns. In one such initiative, users can earn digital badges or prizes such as gift cards and new cameras in exchange for donating footage. The app even features an “Honor Wall” that ranks top contributors; one user has reportedly submitted over 200,000 videos.
While Eufy claims that donated videos are used strictly for AI development and not shared with third parties, its history raises some eyebrows. In 2023, The Verge uncovered that, despite advertising end-to-end encryption, Eufy’s web portal allowed access to unencrypted camera feeds. Anker eventually admitted to misleading users and pledged to fix the issue, but the damage to trust was already done.
All of this shows that companies are increasingly looking to everyday users as a source of valuable data for training AI, even if that means encouraging people to simulate crimes in their own driveways. For some users, it may seem like an easy way to make a few bucks or win prizes. But it also invites questions about data security, transparency, and whether consumers fully understand what they are giving up in exchange. As AI becomes more embedded in home technology, these are conversations that can no longer be ignored.
