Clearview AI Inc. is facing a class action lawsuit filed by two individuals claiming that the company illegally scraped more than 3 billion images from websites such as Facebook, Twitter, and Google without users’ consent. Plaintiffs say that Clearview AI stores these scraped images of faces in its database. The company then allegedly uses its facial recognition software to extract biometric information and sells that information to third-party entities. Ring of Fire’s Farron Cousins discusses this with Scott Hardy, President of Top Class Actions.

Transcript:

*This transcript was generated by a third-party transcription software company, so please excuse any typos.

Farron Cousins:                  By now, everybody who is online needs to understand that there are dangers to having your personal information out there. We have talked about it repeatedly on this program, and I think everybody’s got a good handle on that. Your name, your address, your phone number, all of that is very vulnerable. But one thing most people don’t seem to understand is that your face is too. All of your biometric data that is online is also at risk, and companies are now trying to exploit that without your knowledge or consent. I have Scott Hardy from Top Class Actions with me now to tell us about the latest example of this happening, and Scott, this is with a company called Clearview AI. Tell us what the lawsuit says Clearview AI was up to.

Scott Hardy:                          Well, this is all about consent to the use of your data. So Clearview AI, what they did is they created an app, and it was a very intelligent way to do it. The founder saw a white paper on how to use artificial intelligence to identify facial characteristics on individuals, got some developers, and said, hey, build this, and the developers did. And along with the developers who were coding this software to identify individual facial characteristics from pictures, he had another team of developers whose only job was to scrape photos, all the public photos they could get that were connected to individuals. Facebook, Instagram, LinkedIn, anything that had a public profile that they could connect to a name. This bot that they built went out, scraped all of those pictures, and put them into the Clearview AI system.

And so then you have law enforcement, because at first they weren’t really sure how to market it. They said, you know, this is an interesting idea. How are we going to make money on it? After some iterations, they latched onto law enforcement, who said, hey, we’ve got this picture of this crime that happened. We’re going to use this picture, this video, this still shot from the video. Can your Clearview AI system use it to identify the perpetrator? And it did, and so that’s been very successful. Now they’ve got different police groups from all over the country and sheriff’s associations that are signing up for this software platform. But this class action is saying, you know what? While that may be great for law enforcement, it’s not great from a privacy perspective, and I didn’t give you my consent to use my picture to make money.

And that’s what the class action is about: you have these users whose faces were used, and are still used, to make money by Clearview. And of course the people behind the 3 billion plus photos that were scraped aren’t getting paid a licensing fee; they’re not getting paid anything. If I have a picture on my profile with some guy who robbed a bank, and you use that picture to catch that crook, and you paid Clearview AI $10,000 a year for their system, well, my picture is what got that person caught, along with your use of Clearview AI. I should have to give you my consent to use my picture for you to make money, and that’s what this class action is all about. Is consent necessary, and, you know, what’s the proper use of these public pictures?

Farron Cousins:                  See, this is a case that is absolutely going to have widespread ramifications for everyone, depending on which way it goes, and to me this actually seems like one that could eventually make its way all the way up to the top of the system here, because it really has a huge impact. If the judges, whoever this case goes in front of along the line, rule that yes, consumers have to give consent and you have to pay them if you’re using this, then that’s huge. That’s a great, wonderful, big win for consumers. Plus it means they can’t do it without our consent. Awesome.

If it goes the other way, then every piece of our data out there, every picture you upload to Instagram, every photo you have on Facebook, is suddenly subject to becoming part of a police database, and that’s terrifying. I mean, we’ve all read 1984, and we know from that book what it’s like to be under constant surveillance. So this could be wonderful or this could be horrible, and obviously we always hope for the wonderful part, but, you know, this scares me because we don’t know which way it could go.

Scott Hardy:                          Well, Clearview AI’s perspective is, they said, hey, yeah, Facebook says it’s technically against the rules to scrape all the pictures and use them for my own business. But Peter Thiel, who sits on the Facebook board, put $300,000 into Clearview AI, and he said, hey, I just helped a scrappy startup, and that $300,000 was converted to equity for me, but, you know, I’m not on their board. They’re doing their thing. So Clearview AI’s perspective is, they said, listen, we’re just doing something that someone else is going to do eventually. All of these pictures are out there. Somebody else is going to run this same type of business model, so, you know, why can’t it be us? Facebook has said, no, we’re not going to do it. Amazon is doing their own thing with the Ring systems. That’s a whole other issue.

But Clearview AI said, listen, you put these pictures out there. You don’t have an expectation of privacy, because all of these pictures are public. So why can’t we use them and help law enforcement officers solve crimes? I mean, you had some crimes that were completely unsolved, dead cases, until they went through the Clearview AI platform, and all of a sudden they had leads and were able to make arrests. So it’s a powerful platform. And then the question is, as you said, going to the courts: if my picture was used to solve a crime, can I get some kind of piece of that? Can I get part of the reward? What’s out there? If Clearview AI is making money on this, you know, maybe I should get paid too.

It’ll be very interesting to see how this one moves forward. And, you know, from an entrepreneurial perspective, what Peter did in funding this, and what Hoan Ton-That and Richard Schwartz, the founders of Clearview AI, did to build this business, is what any successful entrepreneur out there does: they built a product they thought was valuable and kept iterating until they found the audience that would pay for it. And now here we are. So we’ll have to see what happens from the consumer’s perspective, you know, going forward.

Farron Cousins:                  Yes, and in the meantime, once again, everybody, please understand that whatever you put online, it doesn’t matter if you’ve checked, yes, I want my privacy, yes, nobody else can see this unless I approve it. It’s never a hundred percent safe. It’s never a hundred percent private if you put it online. So keep that in mind every time you send out a new TikTok or put a new tweet out there; this can and will be public eventually. So please, everyone, protect yourself. Be careful. Watch what you say. Watch what you post, because we have no idea what’s out there scraping it right now.

And for more information about this issue, follow the link in the description of this video. Head over to Top Class Actions, and if you haven’t already done so, please subscribe to their weekly newsletter. Scott Hardy with Top Class Actions, thank you very much for talking with us.

Scott Hardy:                          You’re welcome. Thanks for your time, Farron.

Farron Cousins is the executive editor of The Trial Lawyer magazine and a contributing writer at DeSmogBlog.com. He is the co-host / guest host for Ring of Fire Radio. His writings have appeared on Alternet, Truthout, and The Huffington Post. Farron received his bachelor's degree in Political Science from the University of West Florida in 2005 and became a member of American MENSA in 2009. Follow him on Twitter @farronbalanced