iRobot's robot vacuum cleaners are now very well known, especially those of the Roomba series. Small, cylindrical and compact, these faithful domestic helpers have been populating many of our homes for a while now, and their technology has evolved over the years to offer ever better service.
iRobot, the largest maker of robot vacuum cleaners, recently acquired by Amazon, has confirmed the news: its Roomba robot vacuums took private pictures of some users, even in intimate circumstances. The MIT Technology Review managed to obtain 15 of these photos, which had also been posted in closed social media groups. In the following article, we delve into the story, trying to explain how it happened and shedding light on a largely hidden world.
The "victims" of the Roomba and the response of iRobot
The Roomba robots in question took photos of people from all over the world. One image in particular, of a woman in the bathroom, seems destined to become sadly famous. Others depict a boy who appears to be around 8 years old, face clearly visible, lying on the floor. All the images, however, show houses in general, some occupied by humans, one by a dog. The most striking feature is that every object is labeled with a rectangular bounding box, as in a Terminator-style overlay. Furniture, decorations and even things hanging on the wall bear labels such as "tv", "plant or flower" and "ceiling lamp", to name just a few.
For its part, iRobot claims to have supplied the Roombas in question only to employees and paid collectors. These people also allegedly signed an agreement acknowledging that the devices were sending streams of data, including video, back to the company. According to iRobot, it was their responsibility to prevent the Roomba from recording confidential information. Finally, the manufacturer specified that these were robots with hardware and software modifications not present on the Roombas currently on the market, nor on those sold previously.
Who is really behind the spread of the images
Today's iRobot Roomba devices have powerful sensors and, in general, high-performance hardware. Whether through advanced video cameras or other instruments, everything contributes to the collection of data to be processed by Artificial Intelligence. To make sense of this data, however, humans must first view, categorize and label every bit of information, giving it context.
There is always a group of people sitting somewhere doing data categorization work for Artificial Intelligence. It is the so-called gig workers who carry out this process, known as data annotation. iRobot has disclosed that it subcontracted this work to the Scale AI platform, among others. Scale AI, for its part, defends itself by saying that this leaked set of images represents something bigger than any single company's actions.
It is extraordinary to think how far data travels to "train" artificial intelligence algorithms. In this case, from homes in North America, Europe and Asia to iRobot's servers in Massachusetts; from there to Scale AI's servers in San Francisco; and finally to the workers we have been discussing, scattered around the world, including, in this case, the Venezuelan gig workers who spread the images in private social media groups.
The IO Project, iRobot's Roomba, and how our data ends up around the world
Among all the companies that have emerged in the last decade, Scale AI has become the leader in its sector. Founded in 2016, it has built an entire business on remote workers in less wealthy countries through its Remotasks platform. Then, in 2020, it assigned a new task: the IO Project. This involved images taken at an upward tilt of about 45 degrees, showing walls, ceilings, floors and houses, including, of course, people, whose faces were clearly visible.
The workers then discussed the task on Facebook, Discord and other groups created to exchange advice of all kinds: managing late payments, securing the best-paying tasks, or requesting assistance with tagging. Social media groups hosted all of this. Ultimately, while iRobot claimed that Scale AI had violated their agreements, the latter blamed the workers.
The basic problem is that our face is like a password that cannot be changed. Once someone has recorded the "signature" of our face, they can use it forever to find us in photos or videos. Worse still, this kind of illegal dissemination is nearly impossible to police on sharing platforms; in short, workers are difficult to monitor one by one.
So once again, the story of the iRobot Roomba shows that technology does indeed have enormous, and potentially harmful, power. But whether it turns against us, or, as in this case, is used to take advantage of others, always depends on the uses we make of it.
The article "iRobot Roomba violates user privacy, photos on the web" was originally published on Tech CuE | Close-up Engineering.