Via America’s Lawyer: Legal researcher Sarah Merced and Mike Papantonio walk us through billionaire George Soros’s “Open Society Foundation,” which seeks to legalize prostitution worldwide. Plus, RT correspondent Brigida Santos joins Mike Papantonio to walk us through Amazon’s rollout of its Rekognition facial recognition software and Ring home security system.
*This transcript was generated by a third-party transcription software company, so please excuse any typos.
Mike Papantonio: Billionaire George Soros has funded a lot of projects and campaigns for Democrats over the years, but one of his projects has remained largely secret. That project is the Open Society. It’s an organization that Soros founded more than a decade ago, and one of the initiatives this group works on is legalizing prostitution around the globe. Joining me to talk about why that is a really bad idea is researcher and analyst Sarah Merced. Sarah, as I looked at this story, here’s the problem. Soros has had kind of a free ride. He’s out there on progressive issues, so the assumption is he must be right. On this story, he’s not. If you take a look at what Open Society has done, and it’s not just us talking, there’s been plenty of research showing how dangerous this latest Soros idea is. He has a list of crazy ideas, and I’ll talk about that in a minute, but tell us what Open Society is.
Sarah Merced: Open Society is a massive foundation. It has over 1,600 employees. It’s chaired by George Soros, and its main mission is to support civil society groups across the globe.
Mike Papantonio: Okay. At the end of the day, their goal is to legalize prostitution, to decriminalize it totally, for the pimp, for the john, and for the woman. Nobody goes to jail. Nobody faces any kind of consequence. Tell us, how are they working to legalize it?
Sarah Merced: They target groups that are working toward that same goal and fund them. That helps them get into communities and up into governments, and then change legislation, just like what happened in Ireland.
Mike Papantonio: Okay. The part of this story that bothers me the most is what Soros does: he takes his billions of dollars and puts a thousand people out on the street so it looks like a legitimate protest, but he’s paying these people to be there. He’s done it in the United States many times, and he has a long history of doing it in Europe. He’s trying to make it sound like there’s this loud cry that we want to legalize prostitution, and that if we do, it’s going to be good for women. That’s nonsense. How does this hurt women? Let’s go down the list. You’re working on one of the biggest cases in the country. The cases have been filed, we filed up in Ohio, and those are some of the first cases that are going to be tried in this country. You have a pretty good picture of how it’s hurting women. What’s your take on it?
Sarah Merced: Yeah, I think you’re just getting half the story. There are some ideas going around about legalizing prostitution and all the good it could do. But the downside is that it doesn’t take into consideration that power and coercion are a big part of sex trafficking and human trafficking. By legalizing it, you can really expose some of the most vulnerable members of our society. You can have people that are still using coercion to target younger women, potentially telling them that it’s legal and that there’s nothing wrong with it. The dynamics don’t change. And there have been studies showing that the violence doesn’t change and safe sex practices don’t change.
Mike Papantonio: Well, it increases. Look, here are the numbers. You did some research on this in preparation for the lawsuit, and I thought it was really good work. Studies have shown that over 90% of the money, for example, ends up not with the woman who is working, but with the pimp or the trafficker. So that woman doesn’t benefit. And if you look at how it brings younger people into this process, look at New Zealand and Australia. What does that tell us? Let’s talk about the history there, where they said, we’re going to make an effort to decriminalize. It didn’t do anything but create a bad situation, did it?
Sarah Merced: Yeah. In New Zealand, which was actually the study I mentioned a little while ago, they saw no decrease in violence from partners or customers, however you want to classify them now that it has been legalized. There hasn’t been any movement toward safer sex. In Australia, they actually saw an increase in brothels and other illegal activity. So it’s kind of hard to judge. I think there’s a lot of missing information.
Mike Papantonio: Well, one thing about it, Sarah, that I think is important: when I looked at the results that came out of New Zealand that you were talking about, they said, okay, we’ve created these new parameters, we’ve declared that prostitution is going to be legal here. Now you would think there certainly would not be any illegal effort to take something that’s legal and then add illegal prostitution on top of it. Just the opposite happened: an 80% increase, 80%, in illegal prostitution that fell outside the protection of New Zealand’s law. Here’s what happened. They moved the ball. Traffickers said, we’re now going to traffic people. They started bringing in people from other countries. They started pushing the limits on age. They made it so easy for the pimp and the trafficker to operate that, you know what, we still have an illegal process. What is your take on that?
Sarah Merced: Yeah, I think it’s really tricky. One thing that I don’t think anybody’s really talking about is that the statistics suggest about 70% of women who go into prostitution are under the age of 16, which is below the legal age of consent. So it’s hard to monitor if you’re going to legalize everything. What happens to these young people who fall through the cracks? No one’s getting penalized for that, and that, I think, opens the door for a potential increase in human trafficking.
Mike Papantonio: Okay. So there’s another part of Open Society that really bothered me. It’s not just that George Soros is creating this fraud, and I could go through fraud after fraud with George Soros. First of all, he starts out as a money man. He has mezzanine funds, he’s in the money business, he becomes a billionaire, and then overnight, like he’s bored, he decides, okay, I’m going to become a political and social intellectual. Here you have this guy who seems like he doesn’t have enough to do. He says, well, my latest thing is Open Society. We want to legalize prostitution. Look, when the public is presented with the idea of legalizing prostitution, when all of his fraudulent smokescreen disappears, what’s the result? What are the points?
Sarah Merced: I think what you’re going to be left with is a society that is still going to struggle with what that means. We still have a system where women have not been given the same work opportunities as their male counterparts, and I think that plays into it historically. So you have a lot of women who are going to have a lot of stigma attached to them and no way for society to embrace them. It’s a little sad, because it leaves them with no right answer and no right turn. At the same time, decriminalizing does remove some protection for these women who fall through the cracks. And there’s an argument to be made that women have autonomy over their own bodies, and I’m not trying to dissuade anybody from that, but I think the most vulnerable members of society, the ones human trafficking and power and coercion apply to, are going to get left behind.
Mike Papantonio: All right, Sarah Merced, thank you for joining us. Good luck with the lawsuit. I’m working on that lawsuit up in Ohio, but good luck on your end, because you’re the researcher. You have to bring us the information, and I think we’re going to prevail on this case. Thank you for joining me. Okay.
Sarah Merced: Yeah, sure.
Mike Papantonio: Amazon claims its facial recognition software can now accurately detect different emotions, including fear. The company says the software may help police with criminal investigations, but critics worry the technology is going to be used by law enforcement to conduct totally warrantless mass surveillance. RT’s Brigida Santos joins me now from Los Angeles with the story. Brigida, what emotions can this software identify? I find it very interesting that they can identify fear and anger, sadness, joy. Tell me about it.
Brigida Santos: Mike, the program, known as Rekognition, spelled with a K, can reportedly detect eight emotions: happy, sad, angry, surprised, disgusted, calm, confused, and now fear. In a statement, Amazon said face analysis also generates metadata in the form of gender, age range and attributes. While the company claims identifying emotions like fear could help police with criminal investigations, especially those involving human trafficking or children, civil liberties groups worry that the software could be used by police departments to conduct warrantless mass surveillance or commit other abuses, especially because these emotional expressions aren’t necessarily shared across all cultures.
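[Editor’s note: for readers curious what the “metadata” described above actually looks like, here is a minimal Python sketch. The sample response below is hand-written for illustration, shaped after Amazon’s publicly documented DetectFaces output (per-face `Emotions` list with confidence scores, plus `Gender` and `AgeRange`); no AWS call is made here, and a real deployment would obtain this data from the Rekognition API itself.]

```python
# A hand-written stand-in for a Rekognition DetectFaces response.
# Field names follow Amazon's documented response shape; the values
# are invented for illustration only.
sample_response = {
    "FaceDetails": [
        {
            "AgeRange": {"Low": 25, "High": 35},
            "Gender": {"Value": "Female", "Confidence": 99.1},
            "Emotions": [
                {"Type": "FEAR", "Confidence": 81.5},
                {"Type": "CALM", "Confidence": 10.2},
                {"Type": "SAD", "Confidence": 4.7},
            ],
        }
    ]
}

def top_emotion(face_detail):
    """Return the (type, confidence) pair of the highest-confidence emotion."""
    best = max(face_detail["Emotions"], key=lambda e: e["Confidence"])
    return best["Type"], best["Confidence"]

for face in sample_response["FaceDetails"]:
    emotion, conf = top_emotion(face)
    print(f"age {face['AgeRange']['Low']}-{face['AgeRange']['High']}, "
          f"gender {face['Gender']['Value']}, top emotion {emotion} ({conf:.1f}%)")
```

Note that the service reports only a confidence score per emotion label, which is why critics argue an officer acting on “FEAR: 81.5%” is acting on a statistical guess, not a fact.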
Mike Papantonio: Well, so they have to make a giant leap. They see a picture of somebody who looks sad, who looks fearful. I mean, what’s the next leap? They look fearful, therefore let me pull them out of line and do a body search. Who knows.
Brigida Santos: Right.
Mike Papantonio: Look, police departments are already using this program. Explain that a little bit. People are going to be surprised to learn this.
Brigida Santos: Well, last year the American Civil Liberties Union published documents revealing for the first time that police departments in Oregon and Orlando were using Rekognition. In Oregon, I’ll give you an example of how police were using it: in one case, they used it to investigate a $12 theft at a local hardware store. After running the store’s surveillance footage through Amazon’s facial recognition program, they found a possible suspect, flagged that suspect’s license plate, and later arrested her. Police departments are also licensing the program to compare surveillance footage to mug shots. Rekognition has also been pitched to Immigration and Customs Enforcement, and a new, troubling report reveals that Amazon’s smart home surveillance company, Ring, has partnered with at least 225 police departments across the United States. While Ring doorbell cameras don’t currently have facial recognition capabilities, the company has filed patents to incorporate Amazon’s facial recognition into its products. So this is a slippery slope.
Mike Papantonio: The partnership between Ring and law enforcement agencies is a little troubling, isn’t it? Under the deal, if I get this right, police are contractually required to market Ring products to residents. What else can you tell me about this peculiar partnership between the police and this company?
Brigida Santos: Police across the country are encouraging residents to adopt Amazon’s doorbell surveillance cameras and to participate in a digital neighborhood watch. In exchange for that marketing, participating police departments receive free Ring products and a portal where they can view a map of all active Ring cameras in town. Through the portal, police can also ask consumers to hand over their personal surveillance footage without a warrant. Now, I do want to stress that consumers must give police permission to obtain their personal footage for use in criminal investigations, but police are using fear-based marketing to get consumers to opt in, and some of those consumers may not know they also have the option to opt out. The documents also show that for every local resident who downloads the Neighbors app, certain police departments get a credit toward more free Ring cameras for residents. The program has nearly zero oversight, because there are no federal facial recognition laws and because it shifts government mass surveillance efforts to the private sector. This is a very bizarre loophole. Police already have access to public street cameras to help them track down criminals. This gives them even more coverage while allowing them to skirt legal boundaries.
Mike Papantonio: Here you’ve got Ring saying, hey, put it on your house. Start a neighborhood watch club. Put up the cameras right on your street and we’re all going to watch everybody. It doesn’t take a genius to understand where this is probably going, which leads me to the next question. How accurate is Amazon’s facial recognition tech? Is this recognition technology really working, or is it fraught with mistakes?
Brigida Santos: There are definitely flaws. Amazon even says that its determinations can’t be made with precise accuracy. In one test, Rekognition misidentified 28 members of Congress as criminals, and disproportionately they were people of color. This is common: the software disproportionately misidentifies women and people of color in other arenas as well. We know that gender and ethnic bias in machine learning is common; we’ve covered it on this show previously. When you add that to policing, which also has its own problems, it’s very dangerous. People need to realize that police are selling fear to get consumers to help the government conduct mass surveillance on their friends, families, neighbors, and strangers without a judge, without due process, and without a warrant.
Mike Papantonio: Brigida, thank you for joining me. It sounds like the wild west of technology is upon us. Thank you for covering this story. I’m sure it’s only going to get stranger as time goes on. Thank you.
Brigida Santos: You know it, thanks.