Avatars, Facial Scans & Virtual Basketball: Second Circuit Tosses Biometric Privacy Case
A recent federal appellate ruling dealt a significant blow to invasion of privacy claims based on facial recognition technology that scans users’ faces and maps them onto personalized in-game players, allowing gamers to play side-by-side with basketball stars in a popular video game.
In Santana v. Take-Two Interactive Software, No. 17-303, 2017 U.S. App. LEXIS 23446 (2d Cir. Nov. 21, 2017), the U.S. Court of Appeals for the Second Circuit rejected privacy claims made under the Illinois Biometric Information Privacy Act (“BIPA”), which governs the collection, storage, and dissemination of an individual’s “biometric identifiers” and “biometric information” by private entities. The statute defines a “biometric identifier” as a “retina or iris scan, fingerprint, voiceprint, or scan of hand or face geometry.”
The video games – NBA 2K15 and NBA 2K16 – allow users to create a basketball player featuring a 3-D rendition of the gamer’s face, referred to as an “avatar.” To create the avatar, the gamer must allow a camera to photograph his or her face, “slowly turn[ing] their head 30 degrees to the left and to the right during the scanning process,” which takes about 15 minutes.
The video game manufacturer requires the gamer to first agree to onscreen conditions which state: “Your face scan will be visible to you and others you play with and may be recorded or screen captured during gameplay. By proceeding you agree and consent to such uses ….”
The Second Circuit affirmed dismissal of the complaint for lack of standing. In so doing, the Court adopted a two-step approach: first, whether in adopting the Illinois law the legislature conferred a “concrete right” to protect “a plaintiff’s concrete interests” as to the harm in question; and second, if so, whether the alleged procedural violation “presents a risk of real harm to that concrete interest.”
While the Court assumed that BIPA’s purpose was to “prevent the unauthorized use, collection, or disclosure of an individual’s biometric data,” the panel went on to conclude that plaintiffs failed to present any “material risk that Take-Two’s procedural violations have resulted in plaintiffs’ biometric data being used or disclosed without their consent” or that the alleged violations “raised a material risk that their biometric data will be improperly accessed by third parties.” Simply put, the plaintiffs failed to show any injury, or even a risk of one.
The Court first held that “no reasonable person” could have believed the scanning feature did anything other than what a gamer was told it would do – perform a face scan for purposes of use with the video game.
A “thornier issue,” however, was the manufacturer’s “alleged violations of BIPA’s data security provisions.” BIPA requires businesses to use a “reasonable standard of care within [their] industry” to protect stored biometric data and to store and transmit such data in “a manner that is the same as or more protective than the manner in which the private entity stores, transmits, and protects other confidential and sensitive information.” The manufacturer allegedly transmitted “unencrypted scans of face geometry via the open, commercial Internet” and stored users’ “face templates in a manner that associates their identity with their biometric data.” Nonetheless, even if true, these alleged procedural violations did not present a risk of real harm and therefore were insufficient to show injury-in-fact.
“We … find unpersuasive plaintiffs’ attempt to manufacture an injury,” wrote the Court. “Plaintiffs’ fear, without more, is insufficient to confer an Article III injury-in-fact. … Because plaintiffs have failed to establish that Take-Two’s [alleged] procedural violations have created a material risk that this will occur, they cannot now leapfrog this obligation by imposing an injury upon themselves.”
BIPA is the first privacy law to regulate the private sector’s use of biometric identifiers and the retention, collection, disclosure, and destruction of biometric data. It remains the only such legislation creating a private right of action. Washington and Texas have enacted similar laws but without a private right of action, leaving enforcement to their respective attorneys general.
We will continue to monitor and report on any developments in this area.