Facebook's massive settlement in a class-action case over its violation of a state law governing facial recognition is being hailed as a watershed moment for "biometric privacy."
The leading social network said Wednesday it agreed to the $550 million payout after failing to win dismissal of the case alleging it illegally collected biometric information for "face tagging" in violation of a 2008 Illinois privacy law.
The settlement could have wide-ranging implications for Facebook and other tech firms using facial recognition technology, and highlights the potential for state laws to force changes in privacy practices.
Plaintiff attorney Jay Edelson said the case helps establish the principle of biometric privacy, or the right of users of tech services and products to control access to their data used for facial recognition.
"Biometrics is one of the two primary battlegrounds, along with geolocation, that will define our privacy rights for the next generation," Edelson said in a statement.
"We hope and expect that other companies will follow Facebook's lead and pay significant attention to the importance of our biometric information."
Attorney Nathan Wessler of the American Civil Liberties Union, which backed the plaintiffs' legal arguments, said the settlement could mark a turning point for consumers and biometrics.
"Companies are going to have to take this seriously," Wessler said.
"Hopefully a settlement of this size will be a deterrent."
The deal is one of the largest settlements in a US privacy case, topped only by Facebook's $5 billion deal with the Federal Trade Commission on its data practices. Both are awaiting court approval.
Facial recognition growing
The legal case comes amid widening deployment of biometric technologies such as facial recognition, used for law enforcement and border control as well as for "tagging" in social networks, in retail applications, and for unlocking personal devices and cars.
Several US cities, including San Francisco, have banned the use of facial recognition technology amid concerns about the creation of large biometric databases and the potential for misidentifying some individuals.
"We have seen growing recognition in the courts and in the public for the last few years on the need for reasonable but strong limits on the collection and use of our most private information," Wessler said.
The Illinois law does not apply to government entities or contractors. At least two other states have similar laws, but Illinois is the only one that allows private lawsuits for damages when companies collect data without consent.
Alan Butler of the Electronic Privacy Information Center, which also supported the plaintiff arguments, called the case "hugely significant" with a potential impact for all Facebook users.
Butler noted that the courts ruled the case could proceed merely on the basis of showing a violation, without evidence of specific harms.
Unintended consequences?
But the Illinois law and similar restrictions may have negative consequences as well, according to Daniel Castro of the Information Technology and Innovation Foundation, a think tank often aligned with industry.
The ability to sue without showing damages has unleashed a flood of litigation, and some firms "are even blocking their services in Illinois to avoid the risk of penalties. That's not good for consumers," Castro said.
"At the same time, it does not do much to actually address many specific concerns, such as police use of facial recognition to track citizens."
Castro said the "patchwork" of state laws could make it difficult for tech firms to launch new products, leaving them at a disadvantage compared to their Chinese counterparts.
The settlement comes as US lawmakers are debating federal privacy legislation, with some proposals that could pre-empt laws such as those in Illinois.
Wessler argued that some states have been taking the lead in offering strong privacy rules, and that a federal law could weaken overall data protection.
"The worst outcome would be a weak federal law with no private right of action, and which pre-empts state law, even though that is what the industry is seeking," he said.