The Privacy Problem Presented by Clearview AI

Photo via https://www.flickr.com/photos/182229932@N07/48227343236

In early 2020, the New York Times reported on a previously unheard-of company, Clearview AI, and its unusually far-reaching facial recognition tool.[1] The company’s mission was to build a facial recognition database from images posted to public sites like Facebook and Twitter.[2] Clearview AI accomplishes this through a practice called “scraping”: users post information, photos for example, to social media sites, and companies such as Clearview AI extract that information and use it for their own purposes.[3]
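To make the practice concrete, the following is a minimal sketch, in Python, of what scraping can look like at its simplest: a program downloads a publicly available page and extracts the image URLs embedded in its HTML. The target URL is a placeholder and the code is purely illustrative; it is not Clearview AI’s actual system, which operates at a vastly larger scale and couples collection with facial recognition.

# Illustrative sketch of "scraping": fetch a public web page and pull out
# the image URLs it references. The target URL below is a placeholder.
from html.parser import HTMLParser
from urllib.request import urlopen


class ImageCollector(HTMLParser):
    """Collects the src attribute of every <img> tag on a page."""

    def __init__(self):
        super().__init__()
        self.image_urls = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            for name, value in attrs:
                if name == "src" and value:
                    self.image_urls.append(value)


def scrape_image_urls(page_url: str) -> list[str]:
    """Download a page and return the image URLs found in its HTML."""
    with urlopen(page_url) as response:
        html = response.read().decode("utf-8", errors="replace")
    collector = ImageCollector()
    collector.feed(html)
    return collector.image_urls


if __name__ == "__main__":
    # Hypothetical public page used only for illustration.
    for url in scrape_image_urls("https://example.com/public-profile"):
        print(url)

The point of the sketch is simply that nothing about the collection requires the poster’s participation or awareness: whatever is publicly reachable can be copied and repurposed.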

To put the invasiveness of facial recognition technology in perspective: even Google, a company itself frequently accused of privacy violations, has expressed hesitancy about embracing facial recognition.[4] This article asks, and attempts to answer, whether existing privacy laws, such as the GDPR, are adequate to prevent the invasiveness that comes with scraping.

Under the GDPR, consent would be one way of reining in Clearview AI’s scraping. In its defense, Clearview AI might argue that individuals who post information on the internet implicitly consent to sharing that information with the public at large. Under the GDPR, however, consent requires a clear affirmative act.[5] Consent cannot be implied.[6] Furthermore, the GDPR treats biometric data as a special category of data, which requires explicit consent for a specified purpose.[7] In other words, special category data cannot be processed outside the purpose for which consent was given. While Clearview AI might believe that implied consent is enough, that argument likely will not hold up under the GDPR.

Regulators have already begun to act. In May 2021, digital rights organizations filed complaints against Clearview AI with five European data privacy regulators.[8] One of them, the UK’s Information Commissioner’s Office, found Clearview AI’s practices to be in breach of UK data protection law and announced its “provisional intent to impose a potential fine of just over £17 million.”[9] The UK, whose data protection regime retains the GDPR, has thus found Clearview AI in breach of its privacy laws.[10] This recent action should be read as a signal that regulators are beginning to catch on to the invasive practices of companies such as Clearview AI. This writer therefore believes that the GDPR is adequate to handle Clearview AI; the question is simply whether data privacy regulators can keep up.

Does the United States afford adequate protection? Without a federal law addressing such practices, we look to the states. Many states have no privacy laws at all, but California’s Consumer Privacy Act (CCPA) and Illinois’s Biometric Information Privacy Act (BIPA) represent a growing trend of states addressing privacy. We will look at both. Both give state residents the right to receive notice of data collection, similar to the GDPR.[11] Both contain broad definitions of data collection that likely encompass Clearview AI’s practices.[12]

Under both laws, the primary way consumers could be protected from scraping is the right to receive notice. The CCPA grants those within its jurisdiction the right to receive notice at or before the point of data collection.[13] Recently, however, a CCPA regulation was adopted that exempts some scrapers from the notice requirement, stating that “[a] business that does not collect personal information directly from the consumer does not need to provide notice . . . if it does not sell the consumer’s personal information.”[14] While the scope of this regulation remains unclear, Clearview AI might be able to argue that the exception applies, making the CCPA less effective. The BIPA, by contrast, does not differentiate between direct and indirect collection, meaning that all scrapers must provide notice to consumers before collection.[15]

While the GDPR and the BIPA seem adequate to address scraping, the CCPA could be considered inadequate.

[1] Kashmir Hill, The Secretive Company That Might End Privacy as We Know It, N.Y. TIMES (Jan. 18, 2020), https://www.nytimes.com/2020/01/18/technology/clearview-privacy-facial-recognition.html.

[2] Geoffrey Xiao, Bad Bots: Regulating the Scraping of Public Personal Information, 34 Harv. J.L. & Tech. 701, 702 (2021).

[3] Id.

[4] Alphabet CEO Backs Temporary Ban on Facial-Recognition Technology, AL JAZEERA (Jan. 20, 2020), https://www.aljazeera.com/economy/2020/1/20/alphabet-ceo-backs-temporary-ban-on-facial-recognition-technology; Jonathan Stempel, Google Faces $5 Billion Lawsuit in U.S. for Tracking ‘Private’ Internet Use, REUTERS (June 2, 2020), https://www.reuters.com/article/us-alphabet-google-privacy-lawsuit/google-faces-5-billion-lawsuit-in-u-s-for-tracking-private-internet-use-idUSKBN23933H.

[5] Regulation 2016/679 of Apr. 27, 2016, on the Protection of Natural Persons with Regard to the Processing of Personal Data and on the Free Movement of Such Data, and Repealing Directive 95/46/EC (General Data Protection Regulation), 2016 O.J. (L 119) 1, Recital 32 (EU) [hereinafter GDPR].

[6] Id.

[7] GDPR, supra note 5, art. 9.

[8] The ICO’s Announcement About Clearview AI Is a Lot More Than Just a £17 Million Fine, PRIVACY INTERNATIONAL (last visited Feb. 3, 2022), https://privacyinternational.org/news-analysis/4714/icos-announcement-about-clearview-ai-lot-more-just-ps17-million-fine.

[9] Id.

[10] Id.

[11] CAL. CIV. CODE § 1798.100(b) (West 2018); accord 740 ILL. COMP. STAT. 14/15(b)(1)-(2) (2008).

[12] Id.

[13] CAL. CIV. CODE § 1798.100(b) (West 2018); 740 ILL. COMP. STAT. 14/15(b)(1)-(2) (2008).

[14] CAL. CODE REGS. tit. 11, § 999.305(d) (2020).

[15] 740 ILL. COMP. STAT. 14/15(b) (2008) (“No private entity may collect, capture, purchase, receive through trade, or otherwise obtain a person’s or a customer’s biometric identifier or biometric information, unless [notice is given and consent is received].”).