Earlier this month, Facebook announced that it was shutting down its Face Recognition system and would delete more than a billion people’s individual facial-recognition templates. Facebook, now rebranded as Meta, continues to be in the news over public safety concerns.
The University of Pennsylvania Carey Law School’s Office of Communications asked Henry R. Silverman Professor of Law and Professor of Philosophy Anita L. Allen, an internationally renowned expert on privacy and data protection law, for her insights on Facebook’s decision and what may have prompted it.
Office of Communications: What were the privacy and/or safety issues concerning Facebook’s facial-recognition system?
Allen: This system was implemented without the informed consent of Facebook users. Users may not have been aware of the extent to which posting photographs entailed a loss of control over the use of the images of family and friends they shared on Facebook.
The system carried risks of erroneous identification and mistaken identity for Facebook users and persons appearing in uploaded images.
All around the world, privacy advocates have been raising alarms about biometric systems that facilitate social control through over-surveillance by government and private actors.
Office of Communications: Facebook said it will still work on facial-recognition technologies and sees them as a powerful tool in individual cases, such as for people needing to verify their identity or to prevent fraud. It noted that the technology could be valuable when used on one’s own private devices.
Do you think Facebook is going down the right path in terms of stopping its broad use of facial-recognition software and instead planning to use it for narrower purposes?
Allen: Facebook may have responded to the threat of legal action as well as to societal concerns. The tagging system was criticized by the Electronic Privacy Information Center (EPIC) and other privacy advocates as violating Facebook’s privacy policy and as constituting an unfair and deceptive trade practice under Section 5 of the Federal Trade Commission Act.
In 2011, EPIC, whose Board I now chair, filed a complaint with the FTC signed by the Center for Digital Democracy, Consumer Watchdog, and Privacy Rights Clearinghouse. In 2018, EPIC and other advocacy groups again asked the FTC to demand that Facebook stop routinely conducting nonconsensual biometric face-matching scans.
Office of Communications: Is there anything else about this topic readers and Facebook users should be aware of?
Allen: By finally peeling back its contribution to the panoptic society, Facebook demonstrates greater respect for dignity and security.