The Federal Trade Commission has prohibited Rite Aid from using facial recognition technology, saying the drugstore chain deployed the technology without reasonable safeguards.

“Rite Aid will be prohibited from using facial recognition technology for surveillance purposes for five years to settle Federal Trade Commission charges that the retailer failed to implement reasonable procedures and prevent harm to consumers in its use of facial recognition technology in hundreds of stores,” the FTC wrote Tuesday.

“Rite Aid’s reckless use of facial surveillance systems left its customers facing humiliation and other harms, and its order violations put consumers’ sensitive information at risk,” said Samuel Levine, Director of the FTC’s Bureau of Consumer Protection.

“Today’s groundbreaking order makes clear that the Commission will be vigilant in protecting the public from unfair biometric surveillance and unfair data security practices,” he added.

The FTC writes:

The proposed order will require Rite Aid to implement comprehensive safeguards to prevent these types of harm to consumers when deploying automated systems that use biometric information to track them or flag them as security risks. It also will require Rite Aid to discontinue using any such technology if it cannot control potential risks to consumers. To settle charges it violated a 2010 Commission data security order by failing to adequately oversee its service providers, Rite Aid will also be required to implement a robust information security program, which must be overseen by the company’s top executives.

In a complaint filed in federal court, the FTC says that from 2012 to 2020, Rite Aid deployed artificial intelligence-based facial recognition technology in order to identify customers who may have been engaged in shoplifting or other problematic behavior. The complaint, however, charges that the company failed to take reasonable measures to prevent harm to consumers, who, as a result, were erroneously accused by employees of wrongdoing because facial recognition technology falsely flagged the consumers as matching someone who had previously been identified as a shoplifter or other troublemaker.

Preventing the misuse of biometric information is a high priority for the FTC, which issued a warning earlier this year that the agency would be closely monitoring this sector. Rite Aid’s actions subjected consumers to embarrassment, harassment, and other harm, according to the complaint. The company did not inform consumers that it was using the technology in its stores, and employees were discouraged from revealing such information. Employees, acting on false positive alerts, followed consumers around Rite Aid’s stores, searched them, ordered them to leave, called the police to confront or remove consumers, and publicly accused them, sometimes in front of friends or family, of shoplifting or other wrongdoing, according to the complaint. In addition, the FTC says Rite Aid’s actions disproportionately impacted people of color.

According to the complaint, Rite Aid contracted with two companies to help create a database of images of individuals—considered to be “persons of interest” because Rite Aid believed they engaged in or attempted to engage in criminal activity at one of its retail locations—along with their names and other information such as any criminal background data. The company collected tens of thousands of images of individuals, many of which were low-quality and came from Rite Aid’s security cameras, employee phone cameras and even news stories, according to the complaint.

“Rite Aid said in a press release that it’s pleased to reach an agreement with the FTC but that it disagrees with the agency’s allegations,” CNBC reports.

“The allegations relate to a facial recognition technology pilot program the Company deployed in a limited number of stores,” the company said, according to the outlet.

According to CNBC, the company said it stopped using the technology before the FTC initiated its investigation.

Per CNBC:

The FTC action comes after a Reuters investigation in 2020 detailed Rite Aid’s use of facial recognition technology in primarily lower-income, non-white neighborhoods. Reuters identified facial recognition software providers DeepCam and FaceFirst as Rite Aid’s vendors. FaceFirst’s technology routinely misidentified Black individuals as shoplifters, the Reuters investigation found.

Privacy and civil liberties advocates continue to raise alarms around the use of facial recognition software and the need for further regulation. The technology has led to increased surveillance, and numerous studies have shown the artificial intelligence underpinning the technology is more likely to misidentify people of color, leading to wrongful arrests.

The proposed settlement is subject to approval by a court overseeing Rite Aid’s bankruptcy proceedings. The drugstore chain filed for Chapter 11 bankruptcy protection in October amid slowing sales, rising debt and lawsuits alleging it contributed to the U.S. opioid epidemic.
