In a recent report, Eileen Guo of MIT Technology Review revealed that a robot vacuum took photos of a woman on the toilet in her home. These private images were leaked and ended up on Facebook, in violation of iRobot’s strict non-disclosure agreement.

In 2020, iRobot distributed “special development robots with hardware and software modifications that are not and never were present on iRobot consumer products for purchase.”

These special devices were reportedly labeled with a green sticker that reminded users there was “video recording in progress.” It was the responsibility of the participants to “remove anything they deem sensitive from any space the robot operates in, including children.”


According to the company, the Roomba J7 series vacuums were given to “paid collectors and employees” who signed waivers stating they were aware data streams, including video, would be sent back to the company to use as training material.

The images taken by the Roomba were then transferred to Scale AI, a company that labels audio, photo, and video data in order to train artificial intelligence and build smarter robots.

Matt Beane, an assistant professor in the technology management programs at the University of California, Santa Barbara, briefly explained how individuals review robot data for the purpose of data annotation and AI improvements.

“There’s always a group of humans sitting somewhere – usually in a windowless room just doing a bunch of point-and-click: ‘Yes, that is an object or not an object,’” said Beane.

Some of the private, sensitive images ended up being posted to closed social media groups. The shared images included ones of a woman sitting on the toilet with her shorts pulled down to her mid-thigh.

After the publication of the photos, iRobot CEO Colin Angle confirmed that “iRobot is terminating its relationship with the service provider who leaked the images, is actively investigating the matter, and is taking measures to help prevent a similar leak by any service provider in the future.”

Justin Brookman, the director of tech policy at Consumer Reports and former policy director of the Federal Trade Commission’s Office of Technology Research and Investigation, suggested that the participants in iRobot’s data collection initiative may not have known that the recorded footage was going to be reviewed by humans.

Brookman argued, “It’s not expected that human beings are going to be reviewing the raw footage.”

Also weighing in on the matter is Jessica Vitak, an information scientist and professor at the University of Maryland. She pointed out that “we literally treat machines differently than we treat humans.”

“It’s much easier for me to accept a cute little vacuum, you know, moving around my space [than] somebody walking around my house with a camera,” said Vitak, warning of the trust we give to robots that could be recording moments we would not want to be recorded.
