A TikTok class action filed in Illinois federal court claims that the popular video app collects data from children under the age of 13 without consent. Plaintiffs Sherri LeShore of Illinois and Laura Lopez of California recently filed the TikTok class action on behalf of their children, claiming that the video social media app violates state laws by gathering data from young users. The app launched in 2014 under the name Musical.ly before becoming TikTok; since then, the app has allegedly failed to protect children. Ring of Fire’s Farron Cousins discusses this with Scott Hardy, President of Top Class Actions.

Transcript:

*This transcript was generated by a third-party transcription software company, so please excuse any typos.

Farron Cousins:                  The TikTok app has been around in some form or fashion for at least the last five years, but in recent months it has once again surged in popularity. And unfortunately, as we’ve seen all too often with these apps, especially those geared towards younger children, it appears that they are taking users’ data without consent and not protecting that data. Joining me now to explain what’s happening is Scott Hardy with Top Class Actions. And Scott, again, we’ve seen this huge resurgence of TikTok in the last month or so. People love posting these things on other social media networks, and kind of like all the other app stories we’ve had to talk about, they’re not protecting consumer data and they may even be selling the data of children. Lay this one out for us.

Scott Hardy:                          Right. So it’s illegal for children under the age of 13 to give their consent to have their likenesses and their pictures broadcast on the internet. That is a huge privacy no-no. But of course you have kids that love to lip-sync to these videos, use all the filters, post these on TikTok and share them with their friends. But the problem is that even if kids under 13 made it a private account, TikTok was still allowing these things to be searched and viewed. And, you know, I actually had an issue with one of my daughters who, unbeknownst to me, had a TikTok account and was posting all of these videos. It was a private account, but she was still getting likes from people that weren’t her friends. And, you know, now all of that has been wiped out. But it really shows a larger issue that TikTok has of underage kids using this app and possibly getting harassed and stalked by people around the world.

Farron Cousins:                  And that, you know, opens the children up, as this lawsuit claims, to predators. You know, your child can have an account here. But if they’ve said, I want to be private, I only want, you know, my close friends that I accept into my circle to look at my things, I am doing this for them. But by not protecting this, by failing to live up to the standards that TikTok had set for this app, they allow these predators to come in and send messages to underage people. And that just opens up a whole new can of, you know, terrifying worms here, in addition to another lawsuit that says they’re selling the data.

And it’s bad enough that we as adults, we understand that look, if I’m going to get on my Amazon app and I’m going to buy something, there goes my data, it’s gone. I log into Facebook, my data’s gone. They know where I’m at. They’ve got all my stuff, it sucks and it’s horrible, but we’ve almost become addicted to these apps. These kids on the other hand, they didn’t give consent. They don’t understand what’s happening when they do this. Yet, just like us, their data’s being sold as well.

Scott Hardy:                          It is. I mean, and then we have a second TikTok class action that’s not included in the settlement, filed by a plaintiff over the age of 18 who alleges that TikTok accumulates data and transfers that information to servers in China. And that information can be used to identify and track the location of users within the United States, among other things. So there are a lot of privacy concerns that have been raised over the TikTok app, how it’s based in China and what exactly they’re doing with that data.

Farron Cousins:                  We have to be very careful, not just as consumers but especially as parents. You know, as you said, you discovered that one of your children had this app and you didn’t even know about it. And at the same time we’ve got to respect their privacy. We don’t want to be like these corporations and swoop in and take their data away from them either. But we’ve also got to be a little more vigilant about this. And unfortunately, it means we probably also have to start having talks with them about data privacy and things like that. Things that kids probably can’t even understand. But that’s the position that corporations have put us in. We shouldn’t be in this position. Those are things we shouldn’t have to do.

But unfortunately we do, almost like when schools have to go through active shooter drills. It’s not something they should have to do, but unfortunately it’s something we have to do because of the way the world is now. And so it puts more on us. That’s unfair. We shouldn’t have to snoop on our kids’ phones. We shouldn’t have to see what kind of apps they’re using. We shouldn’t have to have a talk about data privacy with them. But it kind of looks like that’s what we have to start doing to protect them from these corporations.

Scott Hardy:                          We are, and we had the conversation just last night with our daughter, saying, don’t take any pictures on your phone that are at all inappropriate. Don’t send pictures to your friends. All of these things will live on the internet forever. And, you know, with this lack of privacy and these privacy breaches, it’s just so unsafe for kids right now. It’s really scary, and now we have to lock down all of their devices and all their access to these devices to make sure that they, you know, aren’t doing anything that they’re not even aware could harm them.

Farron Cousins:                  Yeah. We have to basically become the police officers in our own home because of the negligence of these corporations. For more information about this issue, follow the link in the description of this video, head on over to topclassactions.com, and while you’re there, make sure you sign up for their weekly newsletter. Scott Hardy, Top Class Actions, always a pleasure talking to you. Thank you.

Scott Hardy:                          You’re welcome. Thanks for your time, Farron.

Farron Cousins is the executive editor of The Trial Lawyer magazine and a contributing writer at DeSmogBlog.com. He is the co-host / guest host for Ring of Fire Radio. His writings have appeared on Alternet, Truthout, and The Huffington Post. Farron received his bachelor's degree in Political Science from the University of West Florida in 2005 and became a member of American MENSA in 2009. Follow him on Twitter @farronbalanced