Thursday, January 3, 2019

No Expectation Of Privacy In Own Facial Biometrics


I’ve watched enough sci-fi movies to know that facial recognition technology should not be taken lightly. One minute, you have a secure method of accessing all of your data and physical spaces, the next minute you’re having back alley facial reconstruction surgery to try to escape a fascist law enforcement regime.

Our only hope is not to limit the technology, it’s to not elect totalitarian demagogues who will use the technology against us under the guise of “national security.” [Ed. note: Lemme just check Twitter to make sure that hasn’t happened yet… oh goddamn it.] Our only hope is not to limit the technology, it’s to write laws that smartly limit how the technology can be used.

On that scale, Illinois’s Biometric Information Privacy Act is an important piece of legislation. Passed in 2008, the first-of-its-kind state law seeks to regulate the collection of biometric information. It seems like the kind of legislation that is necessary to control the impulses of corporations that are able to collect — and will eventually sell — our biometric information.

And Google just blew a gaping hole in the statute. From Courthouse News:

A federal judge ruled Saturday that Google does not violate Illinois privacy laws by automatically creating a face template when Android users upload photos taken on their smartphone to the company’s cloud-based photo service…

U.S. District Judge Edmond Chang on Saturday found that the law cannot be used to protect users from the non-consensual collection of information about what many consider to be private – their face.

We already knew we had no expectation of privacy in our faces, and that makes sense. The lack of facial privacy flows from the concept that people have eyes and you know that before you leave your house.

Facial recognition software, however, picks up patterns that the naked eye can’t necessarily see and, more importantly, can’t necessarily reliably record and disseminate to whoever wants to sell you skin cream. There was some hope that a “Biometric Information Privacy Act” would protect the privacy of the biometric information you wear on your face.

But… no. Judge Chang’s decision indicated that Google was not invading anyone’s biometric privacy because Google was not selling the information it collected, even as Chang acknowledged that Google might sell such information in the future:

Creating a face template with public information does not qualify as a “highly offensive” intrusion into the plaintiffs’ privacy, the judge concluded, especially when there is no evidence Google used the face templates for commercial purposes.

Chang acknowledged that Google could, in the future, use its facial technology for targeted advertising and filtering content – indeed, plaintiffs’ evidence included a chain email among Google employees discussing similar likely uses.

But this evidence is not enough to show that Google “likely will do so in the future without consent, or that Google used plaintiffs’ data in this way,” Chang said.

It’s like I said to the lobsters I cooked over the holidays: “Don’t worry guys, the water’s not boiling… yet.”

Google Wins Lawsuit Over Face-Scanning Technology [Courthouse News]


No Expectation Of Privacy In Own Facial Biometrics curated from Above the Law
