Are the benefits of facial recognition technology worth their societal cost? Your Face Belongs to Us author Kashmir Hill weighs the pros and cons here!
What We Discuss with Kashmir Hill:
- What is facial recognition technology, and how does it work?
- What are the positive use cases for facial recognition technology?
- How accurate is facial recognition technology, and what happens when it confidently misidentifies someone (or is used unethically by those with access to the keys)?
- Who is pushing for the proliferation of facial recognition technology, and what do they stand to gain from it?
- Is there a middle ground between the outright banning of facial recognition technology and completely allowing it to keep tabs on us in a world where privacy is a quaint relic of a bygone era?
- And much more…
Like this show? Please leave us a review here — even one sentence helps! Consider including your Twitter handle so we can thank you personally!
Big Brother is watching you. Big Brother has always been watching you. But now Big Brother has technology on his side to watch you with eyes that see you unfailingly — except for when it does fail and rats you out for being someone you’re not to a government, law enforcement agency, or other entity that holds your life in its hands. What are the consequences of being an innocent doppelganger to someone who has reasons for not wanting to be found?
On this episode, we’re joined by Kashmir Hill, journalist and author of Your Face Belongs to Us: A Secretive Startup’s Quest to End Privacy as We Know It. Here, we discuss how facial recognition technology can be used to make the world better, while sorting through the myriad ways in which it can be abused or misdirected to make the world worse. Listen, learn, and enjoy!
Please Scroll Down for Featured Resources and Transcript!
Please note that some links on this page (books, movies, music, etc.) lead to affiliate programs for which The Jordan Harbinger Show receives compensation. It’s just one of the ways we keep the lights on around here. We appreciate your support!
Sign up for Six-Minute Networking — our free networking and relationship development mini-course — at jordanharbinger.com/course!
Subscribe to our once-a-week Wee Bit Wiser newsletter today and start filling your Wednesdays with wisdom!
This Episode Is Sponsored By:
- Nissan: Find out more at nissanusa.com or your local Nissan dealer
- US Bank: Apply for the US Bank Cash Plus Visa Signature Card at usbank.com/cashpluscard
- Progressive: Get a free online quote at progressive.com
- BetterHelp: Get 10% off your first month at betterhelp.com/jordan
- The Adam and Dr. Drew Show: Listen here or wherever you find fine podcasts!
Miss the conversation we had with Tristan Harris, a former Google design ethicist, the primary subject of the acclaimed Netflix documentary The Social Dilemma, co-founder of The Center for Humane Technology, and co-host of the podcast Your Undivided Attention? Catch up with episode 533: Tristan Harris | Reclaiming Our Future with Humane Technology here!
Thanks, Kashmir Hill!
If you enjoyed this session with Kashmir Hill, let her know by clicking on the link below and sending her a quick shout-out on Twitter:
Click here to thank Kashmir Hill on Twitter!
Click here to let Jordan know about your number one takeaway from this episode!
And if you want us to answer your questions on one of our upcoming weekly Feedback Friday episodes, drop us a line at friday@jordanharbinger.com.
Resources from This Episode:
- Your Face Belongs to Us: A Secretive Startup’s Quest to End Privacy as We Know It by Kashmir Hill | Amazon
- Kashmir Hill | Website
- Kashmir Hill | The New York Times
- Kashmir Hill | Threads
- Kashmir Hill | LinkedIn
- Kashmir Hill | Twitter
- What is Facial Recognition and How Does It Work? | Kaspersky
- Facial Recognition Technology: Current and Planned Uses by Federal Agencies | US GAO
- Face Recognition Technology | American Civil Liberties Union
- Facial Recognition | Clearview AI
- Face Recognition Search Engine and Reverse Image Search | PimEyes
- Reverse Image Search Face Recognition Search Engine | FaceCheck
- The Secretive Company That Might End Privacy as We Know It | The New York Times
- What We Learned About Clearview AI’s Hidden ‘Co-Founder’ | The New York Times
- National Institute of Standards and Technology
- Mark Zuckerberg Hides His Kids’ Faces on Social Media. Should You? | Business Insider
- Madison Square Garden Uses Facial Recognition to Ban Its Owner’s Enemies | The New York Times
- Racial Discrimination in Face Recognition Technology | Harvard University
- LOVEINT: How NSA Spies Snooped on Girlfriends, Lovers, and First Dates | Slate
- A Face Search Engine Anyone Can Use Is Alarmingly Accurate | The New York Times
- How a ‘Digital Peeping Tom’ Unmasked Porn Actors | Wired
- Facial Recognition Used to Strip Adult Industry Workers of Anonymity | Sophos News
- She Thought a Dark Moment in Her Past Was Forgotten. Then She Scanned Her Face Online | KSBW
- Facial Recognition And Beyond: Venturing Inside China’s ‘Surveillance State’ | NPR
- Russia Illegally Used Facial Recognition to Arrest Protestor, Human Rights Court Rules | Politico
- Why an Illinois Law Is at the Center of Congress’ Debate on New Data Privacy Legislation | The Record
- The Birth of Spy Tech: From the ‘Detectifone’ to a Bugged Martini | Wired
- Documentary Exposes How the FBI Tried to Destroy MLK with Wiretaps, Blackmail | NPR
- Federal Bureau of Investigation (FBI) | The Martin Luther King, Jr. Research and Education Institute
- Do People Caught on Ring Cameras Have Privacy Rights? | Wired
- How Iran Is Pioneering a New Era of Oppression | The Christian Post
- China Drafts Rules for Facial Recognition Tech Amid Privacy Complaints | Al Jazeera
- Paul Mozur | The New York Times
- Laowhy86 | How the Chinese Social Credit Score System Works Part One | Jordan Harbinger
- Laowhy86 | How the Chinese Social Credit Score System Works Part Two | Jordan Harbinger
- Facial Recognition Spreads as Tool to Fight Shoplifting | The New York Times
- Which Stores Are Scanning Your Face? No One Knows. | The New York Times
- Facial Recognition Software Is Everywhere, with Few Legal Limits | Bloomberg Law
- How Facial Recognition Is Being Used in the Ukraine War | The New York Times
- 10 Pieces of Fashion You Can Wear to Confuse Facial Recognition | Gizmodo
- Can You Hide a Child’s Face From AI? | The New York Times
- Nina Schick | Deepfakes and the Coming Infocalypse | Jordan Harbinger
948: Kashmir Hill | Is Privacy Dead in the Age of Facial Recognition?
This transcript is yet untouched by human hands. Please proceed with caution as we sort through what the robots have given us. We appreciate your patience!
[00:00:00] Jordan Harbinger: This episode of The Jordan Harbinger Show is brought to you by Nissan. Nissan SUVs have the capabilities to take your adventure to the next level. Learn more at nissanusa.com. Special thanks to US Bank for sponsoring this episode of The Jordan Harbinger Show.
[00:00:13] Coming up next on the Jordan Harbinger Show.
[00:00:16] Kashmir Hill: Ultimately, any technology like this, it's about power. And so what worries me about facial recognition technology is that, you know, there's a certain creepiness that we feel when we're online, right? That kind of feeling of being watched, being tracked. I just think it could all move into the real world with facial recognition technology: that our face would be this way of unlocking this whole online dossier about us.
[00:00:44] Jordan Harbinger: Welcome to the show, I'm Jordan Harbinger. On The Jordan Harbinger Show, we decode the stories, secrets, and skills of the world's most fascinating people and turn their wisdom into practical advice that you can use to impact your own life and those around you. Our mission is to help you become a better informed, more critical thinker through long-form conversations with a variety of amazing folks, from spies to CEOs, athletes, authors, thinkers, performers, even the occasional Russian spy, gold smuggler, economic hitman, former jihadi, astronaut, or tech luminary.
[00:01:13] And if you're new to the show, or you wanna tell your friends about the show, I suggest our episode starter packs. These are collections of our favorite episodes on persuasion and negotiation, China, psychology, geopolitics, disinformation and cyber warfare, crime and cults, and more. That'll help new listeners get a taste of everything we do here on the show.
[00:01:29] Just visit jordanharbinger.com/start or search for us in your Spotify app to get started. Today, Kashmir Hill is with me. We'll be doing a deep dive on facial recognition. Turns out this surveillance technology has broad implications not only for privacy, which as we know it might actually be dead entirely, but also for security, dating, law enforcement, and even warfare.
[00:01:50] This is a fascinating conversation with one of the few true experts on this emerging field, and we get into both the negatives and the positives of a future in which your face might not belong to you. Here we go with Kashmir Hill.
[00:02:08] Jordan Harbinger: The facial recognition stuff is scary, not just because of how it can be misused, but actually because of how useful it is. So it's kind of like AI, right? It's so useful that I think we're gonna run headlong into using it for everything with very little consideration of how this could go horribly wrong.
[00:02:25] What do you think about that?
[00:02:26] Kashmir Hill: Yeah, I mean, I think it's really complicated, and there are some activists who say, hey, this is, you know, potentially so dangerous that we have to ban it entirely. But then you see all the positive use cases and you realize it's more nuanced.
[00:02:40] Jordan Harbinger: Than you thought, right? This is not just like, hey, maybe people shouldn't have explosives in their house that they can buy, unless they have tons of permits and they're farmers and they need them for that.
[00:02:51] And then even then, right, this is like, do you want your phone to know when you're looking at it so that it unlocks? Yeah, I kind of do want that. I don't wanna go back to the thumbprint thing. What about your computer? Yeah, that'd be great. What about instead of using passwords, we use facial recognition?
[00:03:05] Yeah, that sounds really convenient. And then it's like, well, if we ban that, we can't do it. Of course the legal system's gonna have to adapt, and we'll talk about that in a little bit. But tell me about the companies that are doing this, primarily Clearview AI. What are they doing day to day?
[00:03:22] Kashmir Hill: So there are hundreds of companies that are selling facial recognition technology.
[00:03:26] What sets Clearview AI apart is that the company, a small little New York-based startup, went out and scraped billions of photos from the public web, from social media sites like Facebook and Instagram and LinkedIn, and, you know, did it without anyone's consent. Their database now has 30 billion photos.
[00:03:45] And what happens is you can upload a photo of somebody you don't know to Clearview's app, and it will pull up all the photos of them that appear on the internet. So you can find out their name, you know, find their social media profiles, and maybe even find photos of them that they don't even know are on the internet.
[00:04:01] Jordan Harbinger: I wanna try this for myself. Will they let me do that? Probably not, right?
[00:04:05] Kashmir Hill: Clearview limits use of its app to police and law enforcement agencies. But you know, as I write in the book, what they accomplished is something other companies can do. So now there are public face search engines. You can go to a site called PimEyes, or a site called FaceCheck ID, and you can do that.
[00:04:24] You can upload your face and see where else it appears on the internet. Those sites have smaller databases than Clearview AI; they're not quite as powerful a tool. But yeah, I mean, the dams are breaking, you know, the cat is scratching its way out of the bag. We're really at a kind of scary moment right now where we could all lose our anonymity.
[00:04:43] Jordan Harbinger: So many law enforcement professionals listen to this show, I probably can't get away with this, but I'm going to try anyway. If somebody out there has access to Clearview, can you run me through that and just email me what you saw? Or I will call you and talk about this if you don't wanna put it in writing.
[00:04:59] I'm so curious if it comes up with like, oh, someone scanned your high school yearbook and put it on a website, and you don't even know that it's there, and here's your photo from when you were 16. Or like, here's a driver's license database that got leaked and your face is in there. I'm so curious where it pops up.
[00:05:17] 'Cause I already have thousands of photos online, but that's not normal, and we'll get to the distinction between public figures and not as well.
[00:05:27] Kashmir Hill: I've had Clearview searches run on me. I mean, at first the company didn't want me to report on them, and they put an alert on my face, so that when police officers I was interviewing did run my face, the company would get an alert, basically, and call the police officers and tell them not to talk to me.
[00:05:42] But eventually the company came around. I talked to the founder many times for the book, and he's run these searches on me, and it brings up, you know, sometimes what you would expect: headshots of me on the web that I know about. But then it's also brought up Flickr photos from 15 years ago, photos of me at a concert in a crowd in the background of someone else's photo.
[00:06:04] Me talking with a source at a public event in a photo I didn't realize was on the internet. There was this one photo with somebody, you know, standing in the foreground and somebody walking by in the background. And at first I couldn't see myself in the photo, until I recognized the jacket of the person walking by in the background.
[00:06:22] And I realized, wow, that's me. I bought that jacket in Tokyo; it's very unique. And it was this case where the computer was able to recognize me where I couldn't even recognize myself.
[00:06:31] Jordan Harbinger: That's really something. I was not expecting that; I was expecting human eyes to do slightly better than the computer, but it doesn't seem like that's the case.
[00:06:40] And of course, the computer can do it a million times a second, and we can do it, you know, in a 30-second batch if we squint. But I still expected humans to go, no, no, no, that is me, it's just blurry, but that's my purse, and I still have those shoes. But it seems like the computer can just be like, no, here's your blurry face from a rave that you thought you were gonna go to, yeah, at age 40.
[00:06:59] Kashmir Hill: I mean, computers are better at this than us. They can remember billions of faces. At the same time, they do make mistakes. Mistakes have happened. People have been arrested for the crime of looking like someone else because they were misidentified by facial recognition technology. So I wouldn't wanna give anyone the impression that this works a hundred percent, you know, accurately all the time.
[00:07:19] It can make mistakes. But wow, it has come so far. It can be very powerful.
[00:07:24] Jordan Harbinger: Can it also find people that just really look a lot like me, but aren't me? Did you see any photos where you're like, that's definitely not me, but wow, that woman looks like me? That has to happen.
[00:07:33] Kashmir Hill: I have not seen doppelgangers that I can recall in a Clearview search, but it really depends. There are all these settings that you can choose when you're running a facial recognition program.
[00:07:44] And so in procedurals, like on TV, whenever somebody runs a facial recognition match, it's like, here's a photo of somebody, and it just tells you who the person is, right? But in reality, you get this whole page of results with lots of different photos, and the computer ranks them in terms of how confident it is that it's the same person.
[00:08:02] So if you, like, turn the confidence score down, you'll get more doppelgangers. You'll get more people where the computer thinks, oh, this is only 80 percent likely to be her, et cetera.
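The ranked results page Hill describes behaves like a similarity search with an adjustable cutoff. A minimal sketch of that filter-and-rank step, with made-up names and confidence scores for illustration (not Clearview's actual API or data):

```python
# Rank candidate matches by confidence and filter by a threshold.
# Lowering the threshold admits more look-alikes ("doppelgangers").

def rank_matches(candidates, threshold):
    """candidates: list of (name, confidence) pairs, confidence in [0, 1]."""
    hits = [(name, score) for name, score in candidates if score >= threshold]
    return sorted(hits, key=lambda pair: pair[1], reverse=True)

candidates = [
    ("exact match", 0.97),
    ("likely match", 0.88),
    ("doppelganger", 0.80),
    ("stranger", 0.42),
]

strict = rank_matches(candidates, threshold=0.90)  # only the near-certain hit
loose = rank_matches(candidates, threshold=0.75)   # doppelgangers appear too

print([name for name, _ in strict])  # ['exact match']
print([name for name, _ in loose])   # ['exact match', 'likely match', 'doppelganger']
```

Raising the threshold trades recall for precision: a high cutoff shows only near-certain matches, while a low cutoff surfaces look-alikes, which is exactly how a confident-looking misidentification can creep into an investigation.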
[00:08:12] Jordan Harbinger: I would love to see somebody who looks 80 to 90 percent like me. I've had it happen once in my life. When I was living in Israel, there was a guy who was like a VJ.
[00:08:20] He and I looked so much alike that I showed my mom a photo and she goes, "When did you get your eyebrow pierced?" And I'm like, it's not me, it's a different guy. And it was really cool, 'cause he was kind of a famous guy. He was on whatever the MTV equivalent was in Israel. And so I'd go to the mall, and, like, 13-year-old girls would be like, oh my gosh.
[00:08:39] And then they'd find out I spoke Hebrew like a toddler, from, uh, three months of lessons, and, uh, yeah, they were disappointed. But it would be such an interesting test to run, to see who looks like you all around the world. Although it is amazing that it can still tell, 'cause I guess even if somebody looks like you, the eyes are a little bit further apart, the ears are a little bit higher or lower on the head or tucked back more, the hairline's different, or something like that.
[00:09:05] I don't know. Going back to the company, they're quite secretive or at least they were in the beginning, right? Didn't you try to walk there and the building didn't exist or something like that?
[00:09:15] Kashmir Hill: Yeah. When I first heard about Clearview AI, they had kind of shown up in a public records request as a tool that law enforcement might be using.
[00:09:23] And I went to their website, clearview.ai, and at the time it didn't say anything about facial recognition technology. It just said "artificial intelligence for a better world." And there was an address in Manhattan, and it was three blocks away from The New York Times, where I work. And so I decided to walk over there.
[00:09:39] But when I got to where the building was supposed to be, it just wasn't there. And I kept going back and forth. I compare it in the book to Harry Potter; I was like, is there a platform I'm not seeing here? And, you know, the company had kind of hidden who was behind it.
[00:09:54] People weren't responding to me. I saw that Peter Thiel might be one of the investors; he was listed on this kind of startup-tracking website as having invested in Clearview AI. When I reached out to a spokesperson, the spokesperson was like, oh, I don't think I've heard of that company before, I'll look into it.
[00:10:10] And then I never heard from him again, and that kept happening with me. Every time I reached out to somebody, they just didn't wanna talk about Clearview AI or having any ties to the company. And so I had to go about it a different way, which was finding police officers who would use the app.
[00:10:25] Jordan Harbinger: I get why they'd wanna be secretive, if you're a startup in the security industry. But it still sounds really dystopian when their slogan is "artificial intelligence for a better world" and then the building's not there, and it's all about taking people's personal information without their consent. I mean, that's a little on the nose for being a villain when you do that.
[00:10:46] Kashmir Hill: Yeah. I mean, the way that one of their investors put it to me was that they were in stealth mode, but this was a little unlike any stealth mode I've seen before. And I think they realized that what they did was very controversial, that they collected all these photos of us, you know, without our consent, and that they were operating in kind of a legal gray zone, and that people were gonna be upset when they found out about it.
[00:11:09] Jordan Harbinger: Did they buy these photos from Facebook, or were they just like, here's a bug where we can crawl every Facebook photo slowly over time and they don't care or notice?
[00:11:17] Kashmir Hill: Yeah, they were scraping photos. And so that's creating these automated programs that go out there and just look for photos of people and download them en masse.
[00:11:26] The founder, Hoan Ton-That, he's kind of the technological mastermind behind the company. He described Venmo as being one of the first places that he was able to get these kinds of images, because on venmo.com they were showing, in real time, transactions that were happening on their network for anybody who had their Venmo set to public.
[00:11:45] Which you really should not do, right? If your Venmo is set to public, change it right now. Like, make that private. Why are you broadcasting it?
[00:11:51] Jordan Harbinger: We know what you're doing when you put a snowman emoji next to an $80 deposit to one of your friends. We know what you're doing.
[00:11:58] Kashmir Hill: And so Venmo would show all these transactions in real time, like "Jordan paid Kashmir."
[00:12:02] And Hoan told me that he would just send a scraper to the site every few seconds, and it would download people's profile photos and a link to their profile. He just sent it there every few seconds, and he got a million faces this way. It was like a slot machine where every time he pulled the lever, faces spilled out.
[00:12:22] And so he essentially did that all around the internet and hired people to scrape faces for him. And yeah, it was just like a great big face hunt on the internet, and there are a lot of fish to catch there.
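The "slot machine" loop Ton-That describes, hitting a public feed every few seconds and keeping whatever new faces fall out, boils down to a poll-and-deduplicate pattern. A minimal sketch; the feed format and field names here are invented for illustration and are not Venmo's real API:

```python
import time

def poll_feed(fetch, seen, interval=3.0, rounds=3):
    """Call fetch() every `interval` seconds; it returns a batch of
    {"user": ..., "photo_url": ...} records from some hypothetical
    public feed. Collect photo URLs we haven't already seen."""
    new_photos = []
    for _ in range(rounds):
        for record in fetch():
            if record["photo_url"] not in seen:
                seen.add(record["photo_url"])
                new_photos.append(record["photo_url"])
        time.sleep(interval)
    return new_photos

# Simulated feed: the same profile shows up twice but is only kept once.
batches = iter([
    [{"user": "alice", "photo_url": "alice.jpg"}],
    [{"user": "alice", "photo_url": "alice.jpg"},  # duplicate, skipped
     {"user": "bob", "photo_url": "bob.jpg"}],
    [],
])
print(poll_feed(lambda: next(batches), seen=set(), interval=0))
# ['alice.jpg', 'bob.jpg']
```

Each pull adds only the faces not seen before, so the take per pull shrinks over time, but against a feed that refreshes every few seconds it compounds into millions of photos.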
[00:12:33] Jordan Harbinger: That must've been kind of fun, though. I know it's sort of not a great thing to do, but it seems like a really fun thing to do, to compile all these photos and make this product. It's interesting. It's exciting.
[00:12:42] Kashmir Hill: Yeah, they were doing something that hadn't been done before, you know, that other companies hadn't been willing to do. And I think it was really exciting for them at first. Though they were collecting all these faces, building this big facial recognition app, they had a product in search of a customer. They didn't know, initially, who they would sell this to. Like, who would buy this? Who would pay for this?
[00:13:02] Jordan Harbinger: It seems obvious, though, right? Because of course, the first thing, if you're really naive, you could say, oh, people are gonna want a little Facebook app where they find other people that look like them. That's so fun.
[00:13:12] That's something I thought of probably in college. Pretty soon you're gonna be able to search for people that look like you. Isn't that gonna be funny? That was back when they had all those little games or like quizzes, you know, which Harry Potter character are you? And you would answer stuff. I was like, they're gonna do this with pictures soon.
[00:13:27] But then of course, anybody with a security background at all is gonna go, ah, well, I wanna know who's walking into and out of my store. And if you've been reading the news, which you have, for sure, about China, they can just aim this at a giant outdoor crowd of people at a concert and be like, there are three wanted fugitives here.
[00:13:44] Here's the guy that jaywalked on the way to the concert, right? Go and issue these people citations, and then go arrest these other three triad gangsters who've been on the run. I mean, that stuff is exciting too. Let's talk about how it works and how well it works, because, I mean, it seems like it works super well given your anecdotal example. But how does the computer actually do it? What is it looking for, do you know?
[00:14:08] Kashmir Hill: Yeah. So this is one of those technologies that was supercharged by machine learning, or neural network technology. It's the same kind of tech that has made ChatGPT so powerful. Essentially, the simple version of this is you can give a computer a bunch of data and it learns how to analyze it.
[00:14:29] And so that's what happened with facial recognition technology. Facebook, for example, had all these photos of people, and we did the work for them. We tagged ourselves, you know, in a dark room at a party, looking down, looking up, looking to the side, when we were young, when we were old. And they were able to give all those photos of a person to the computer and say, this is the same person. Learn this face.
[00:14:51] And that is how facial recognition technology now works. These computers know what to look for in a face at the pixel level, to kind of see what makes a face unique, and they'll put it in a database with this, you know, long numerical code, a biometric identifier. And when you upload a photo of an unknown person,
[00:15:11] it generates that biometric identifier and then looks for anybody in its database that has that same identifier, or something similar to it. So essentially they've, you know, encoded our faces, and now it can look through this database for a face with the same code.
[00:15:25] Jordan Harbinger: That's so interesting that the machine, so I think that's the key difference, right?
[00:15:30] The machine decides how it's gonna search or sort and what to look for. 'Cause I'm thinking, okay, what am I doing if I'm programming this? Look at the nose, and the distance from the eyes, and how far apart the eyes are, and the mouth and how wide it is. And the computer's like, I've got this, step back, old man, let me handle this.
[00:15:46] And it figures out, aha, there are people that look a lot alike, but their chin size is always different, so I can use that as the defining thing when these are really close together. But when they're really far apart, I'm gonna use the distance from their eye to their eyebrow, or something that nobody, you know, it's stuff we would just not think about.
[00:16:02] The correlations the computer can come up with through machine learning are always more and more bizarre, and I think when they use this type of AI for medical research, people I know are really excited about this, right? 'Cause they can put in every study that comes out over the last, like, 50 years, and they'll say, hey, did you know that people whose big toe is shorter than their second toe are more likely to get dementia later?
[00:16:24] And it's like, well, no human would've found that. And it's gonna be even more subtle than that. It's gonna be like some gene allele that nobody even sees or tests for: those people get dementia earlier unless they eat a lot of apples. And they're gonna be able to find that, 'cause the machine will figure that out, whereas humanity never would on its own.
[00:16:43] Kashmir Hill: The only problem with these kinds of technologies is that the scientists and engineers I talk to describe it as a black box. Like, you don't actually know exactly what the machine is learning. And so with really early facial recognition systems, they were actually looking at the backgrounds of the photos, and they were identifying, oh, this person I see over and over again with the same background.
[00:17:06] And so they were learning that that person was associated with that background. And we've seen that happen with some of the medical applications of AI, where it kind of recognizes the font from a certain hospital as being distinct, and it's learning something about the font. So that's the problem: when these systems go wrong or make incorrect decisions, we can't always figure out why they did that, because we don't know exactly how they work.
[00:17:32] Jordan Harbinger: Which is disturbing. Yeah, that could go wrong, because then if there's a problem, you don't even know how to fix it. Yes, exactly. How did they train the machines, other than us tagging ourselves in photos? How does it learn now? Because I haven't tagged myself in a photo in a zillion years, and I don't post on Instagram, but I know there are tons of photos out there with me.
[00:17:53] Is it just the genie out of the bottle, so now it's like, ah, we already have a thousand photos of Jordan; any new one that comes up, we can just automatically tag him?
[00:18:00] Kashmir Hill: Yeah, I mean, at this point, the algorithms have been trained on lots of data. They're very powerful, and now they don't need to know who you are.
[00:18:11] They now know how to encode faces, and so they don't need to train on your face specifically. They know how to identify a face. And so you can be new to them and they can still find other photos of you. The systems are very good at that. Again, they still make mistakes. Sometimes this works better for some people than for other people.
[00:18:28] There have been bias problems with facial recognition technology for certain groups of people in the past, but at this point, the algorithms are trained. They don't really need new training data.
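The pipeline Hill lays out, encoding each face as a long numerical code and then hunting the database for the closest code, can be sketched with toy vectors. A real system derives embeddings hundreds of numbers long from a trained neural network; the four-number "faceprints" and names below are invented purely for illustration:

```python
import math

def cosine_similarity(a, b):
    """Similarity of two embeddings: 1.0 means identical direction."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

# Toy "biometric identifiers" for enrolled faces.
database = {
    "kashmir": [0.9, 0.1, 0.3, 0.7],
    "jordan": [0.2, 0.8, 0.6, 0.1],
    "stranger": [0.5, 0.5, 0.5, 0.5],
}

def best_match(probe, database):
    """Return the enrolled identity whose code is closest to the probe."""
    return max(database, key=lambda name: cosine_similarity(probe, database[name]))

# A new photo of the same person yields a slightly different code,
# but it is still nearer to "jordan" than to anyone else.
probe = [0.25, 0.75, 0.55, 0.15]
print(best_match(probe, database))  # jordan
```

Nothing in the search step needs to have trained on the probe face specifically, which is why, as Hill says, the system can find you even if you are new to it: it only needs to encode you consistently.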
[00:18:40] Jordan Harbinger: You are listening to The Jordan Harbinger Show with our guest Kashmir Hill. We'll be right back. This episode is sponsored in part by Progressive.
[00:18:47] Most of you listening right now are probably multitasking. So while you're listening to me talk, you're probably also driving, cleaning, exercising, maybe doing some grocery shopping. But if you're not in some kind of moving vehicle, there's something else you could be doing right now. Getting an auto quote from Progressive Insurance.
[00:19:00] It's easy, and you could save money by doing it right from your phone. Drivers who save by switching to Progressive save nearly $750 on average, and auto customers qualify for an average of seven discounts: discounts for having multiple vehicles on your policy, being a homeowner, and more. So just like your favorite podcast, Progressive will be with you 24/7, 365 days a year.
[00:19:18] So you're protected no matter what. Multitask right now. Quote your car insurance at progressive.com to join the over 28 million drivers who trust Progressive. Progressive Casualty Insurance Company and affiliates. National average 12-month savings of $744 by new customers surveyed who saved with Progressive between June 2022 and May 2023.
[00:19:34] Potential savings will vary. Discounts not available in all states and situations. This episode is also sponsored by BetterHelp. We're often encouraged to reinvent ourselves with the new year, but what if we take a moment to appreciate what we are already doing right? I'm sticking to parts of my routine that work well for me.
[00:19:49] My exercise routine, for instance: that is non-negotiable, four days a week. There's my Mandarin Chinese learning journey, a rewarding 11-year adventure where I should be way further ahead than I am. It's been a slow but steady climb, and this year I'm committed to ramping up my learning pace.
[00:20:03] Now, how about some therapy? It's more than a tool for navigating life's challenges; it's a way to acknowledge your strengths and enact real positive change. Therapy offers strategies for coping effectively and setting healthy boundaries. It's beneficial for anyone looking to hone their mental health. If therapy has been on your mind but you find excuses to delay, consider this a gentle push to explore BetterHelp.
[00:20:21] It's an entirely online service that fits your schedule. You start with a questionnaire, get matched with a licensed therapist, and the flexibility to change therapists without extra cost is always there.
[00:20:30] Celebrate the progress you've already made. Visit betterhelp.com/jordan today to get 10% off your first month.
[00:20:36] That's betterhelp.com/jordan.
[00:20:39] Jordan Harbinger: If you're wondering how I manage to book all these amazing thinkers and creators every week, it is because of my network. And now I'm teaching you how to build your network for free over at sixminutenetworking.com. This course is all about improving your relationship-building skills.
[00:20:52] It's not cringey, it's down to earth. It's not gonna make you or other people look or feel bad when you do it. It's not salesy; it's not gonna be one of those things. You need this skill. If you're a teacher, you need this skill. If you're in sales, you need this skill. If you're a student looking for a job, you need this skill. I can't think of anybody who doesn't need to network.
[00:21:07] There's a lot of people who tell themselves they don't, but there's a lot of people who just leave that proverbial money on the table and remain ignorant of the secret game that is being played around them. And look, I get it. It's, uh, time consuming. But this takes a few minutes a day. Really, that's all it takes, and many of the guests on the show subscribe and contribute to the course.
[00:21:26] So come join us. You'll be in smart company where you belong. You can find the course, again for free, at sixminutenetworking.com. Now back to Kashmir Hill. What if it's a picture of me when I'm 20, and then it gets a picture of me when I'm 40? I look quite different. You can tell it's the same person, but when I show people photos of me when I was younger, I mean, they're often like, oh wow, okay, that's you, but you looked way different.
[00:21:48] And I just looked a lot different, not just a different haircut. I really looked, well, I looked younger and better, that's for sure. And there was a whole time 10 years ago where I was much bigger than I am now, and I looked very different. Can it tell that this is the young Jordan Harbinger or this is the older version of me?
[00:22:08] Can it tell, or is it like, oh, this is two people?
[00:22:11] Kashmir Hill: There's this big federal lab that tests all the algorithms. It's called the National Institute of Standards and Technology. And they have found that, you know, the accuracy of an algorithm can be affected by age, by face coverings, and such. But it is the case that a lot of these algorithms do still work over time.
[00:22:31] I mentioned before, like, Clearview AI is able to pull up photos of me from 15 years ago on Flickr, where I weighed more. I actually hate those photos. I didn't know they were on the internet. My sister's friend took photos at a New Year's Eve party in 2005 or 2006, and she put them publicly on the internet.
[00:22:50] I didn't know, and when I found out they were there, I was like, oh, I hate these pictures of myself. I asked my sister to tell her to take them down or make them private. But yeah, I mean, it's probably gonna be pretty hard for a facial recognition system to match, you know, baby Jordan to Jordan now.
[00:23:06] Yeah. But it might not be that hard to match you to yourself 5, 10, 15, even 20 years ago.
[00:23:11] Jordan Harbinger: Yeah, that's matching a baby. I mean, the babies I know, from having two of my own, when I look at their older photos, they're just blobby, kinda look like old Chinese men. And now they look completely different.
[00:23:26] I mean, my son looks almost exactly like me, and before he just looked like this little Buddha figure. When he was, you know, one or less, his face was very round and babyish, and he's almost just stretched into a totally different person. So that's understandable. But most law enforcement agencies are probably not out there trying to figure out who babies are.
[00:23:47] Although I can see, in a missing person case, why it would be really useful to see an aged version of a face that's more accurate than what we have now, right? And then find that person, like, hey, this person got kidnapped 10 years ago, and they're showing up in Madison Square Garden.
[00:24:03] Kashmir Hill: Yeah, sorry, on the baby thing, it just made me think about Facebook founder Mark Zuckerberg.
[00:24:05] He posted on the 4th of July this photo of his family with his wife and his three daughters, and he put these emoji stickers on the faces of his two older daughters 'cause he clearly wanted to protect their privacy. But then the baby's face he just left exposed. And so there were people saying, oh, he doesn't care about the privacy of the baby.
[00:24:25] And I think it was just because, yeah, most babies look alike and you know, facial recognition technology for that reason doesn't tend to work as well on babies.
[00:24:33] Jordan Harbinger: Yeah, interesting. He might regret that when Facebook comes out with technology that can do that later, and he's screwed over the youngest kid. But, oh well.
[00:24:43] Kashmir Hill: Uh, but yes, it is getting used. Madison Square Garden is one of my favorite examples of what people call surveillance creep. Mm-Hmm. Madison Square Garden originally installed facial recognition technology in 2018 for security threats. You know, they're on top of Penn Station, a major transit hub, and they have these huge crowds that come to Madison Square Garden to, you know, see the Knicks.
[00:25:05] See the Rangers. Mm-Hmm. See big concerts. And so they started using facial recognition technology. But then in the last year, the owner, James Dolan, realized that he could use the system to keep out his enemies, namely lawyers who work at law firms that have suits against him, who he doesn't like, 'cause they cost him a lot of money.
[00:25:26] And so they went and scraped the lawyers' photos from their own websites, you know, from their bio pages, and created this banned list of thousands of lawyers. And when those people try to get into a game, or a Mariah Carey concert, or a Rockettes show at Radio City Music Hall, they get turned away. And I've actually seen this happen.
[00:25:43] I went there with a banned lawyer. I bought tickets for a Rangers game, and, you know, with all these thousands of people streaming in, we walked through the doors and put our bags on the security belt. By the time we picked them up, a security guard had approached us. He asked her for ID, and she had to leave. And she said, hey, I'm not working on any cases against Madison Square Garden.
[00:26:01] He says, it doesn't matter, your whole firm is banned. It was wild to see really how powerful it is and what businesses could do with it if they wanted to, in terms of either tracking us as we walk in, knowing who we are, knowing everything that's known about us from the internet, how much we might have to spend, or trying to keep out people they don't like for whatever reason, whether it's your political views or because you wrote a bad review of them on Google or Yelp.
[00:26:24] Jordan Harbinger: It seems like she could have just said, oh, I don't work there anymore.
[00:26:27] Kashmir Hill: Maybe she would have if I hadn't been with her, you know, writing about it for the New York Times. But I don't think it would matter. They could probably go to the website right then and see that she had a bio there.
[00:26:39] Jordan Harbinger: Like, oh, they didn't remove my bio photo. I don't work there anymore. No, I work in-house at IKEA.
[00:26:44] Kashmir Hill: I asked some of the law firm partners, I was like, have you guys thought about not putting your photos on your bio pages in case more businesses start doing something like this?
[00:26:51] Jordan Harbinger: Yeah. It seems like they maybe thought about that and then they were like, nah, we'll just sue for some sort of discrimination and cost him way more money because two can play at this petty bullshit game that we're now playing, because that's really what this is.
[00:27:04] Right? Like, who cares if somebody goes to see Taylor Swift with their kids? This guy sounds like a prick.
[00:27:10] Kashmir Hill: Yeah, I mean, if the idea was ban the lawyers to dissuade litigation, that didn't exactly work out. A lot of the lawyers did sue, just as you said.
[00:27:18] Jordan Harbinger: Yeah. Like, even if you lose the suit, fine. I cost the guy $250,000, and now my other partner's gonna do it over a different thing. Ah, man, some of these people just can't help themselves. Are the algorithms just as good at detecting faces of all ethnicities? You mentioned there were bias issues, but is that what you mean by this?
[00:27:38] Kashmir Hill: Yeah, I mean, for a long time, a very troubling long time, back to 2000, 2001, these algorithms basically worked best on mostly white men.
[00:27:48] Mm-hmm. And less well on everyone else. And the reason for that is that the people working on the technology were primarily white men, and they were making sure that it worked on them, worked on their friends. It was different with algorithms from Asia, for example, which tend to work better on Asian people.
[00:28:02] You know, obviously, it was about the training data.
[00:28:04] Jordan Harbinger: Yeah, I was gonna say the training data, and also the QA people are probably better at recognizing members of their own race versus other races.
[00:28:12] Kashmir Hill: Yeah. There is a same-race effect. And so with a lot of these facial recognition systems, it's a system working in concert with a human.
[00:28:19] Like usually the system says, here's some possibilities of who it might be. Then there's a human being who says, okay, I think this person is the most likely to be the match. So on the training data side, the vendors took the criticism and they said, okay, yeah, we need to make sure this works on everybody.
[00:28:34] And so they got more diverse training sets, and they reduced that kind of occurrence of bias, and a lot of the algorithms have improved a lot there. But you still have a human being who ultimately has to look at this list of potential doppelgangers and choose who the right person is. So you still have the possibility for bias there.
[00:28:52] Then, even if it's perfect, you don't know when you're running a search like this whether the person you're looking for is even in the database you're searching. There are many ways in which it can still go wrong.
[00:29:03] Jordan Harbinger: It seems like you mentioned only law enforcement has access to this, among other, I guess, small groups of people, but there are still bad cops who stalk people or harass people.
[00:29:14] It seems like that could go horribly wrong. Especially like imagine you're in a domestic abuse situation and your husband, your ex-husband, whatever, is a police officer, and now he can find you everywhere. That's really scary.
[00:29:26] Kashmir Hill: Yeah. There's actually a term for this because it does happen so often with surveillance technologies.
[00:29:32] LOVEINT, or love intelligence. Oh, I see. Yeah. And this is when police or intelligence officials misuse surveillance tools. Mm-hmm. Where they're searching databases for partners and loved ones. And it happens a lot. You know, police officers I've talked to, they think this is a very powerful technology.
[00:29:52] They want to use it, and they wanna use it responsibly so that they can keep access to it. And so they say they have controls in place to make sure that officers aren't just, you know, willy-nilly searching whoever they want. They're supposed to tag it with a case number, et cetera, and they review the logs.
[00:30:07] But again, that's just with Clearview AI, right? There are other face search engines out there that anyone can use. There's one called PimEyes, and I write in the book about this guy who actually came to me and wanted to confess how he was using PimEyes, because he thought it was wrong. He wanted lawmakers to know so that they would hopefully regulate it out of existence.
[00:30:27] But he essentially had a porn addiction and a privacy kink. When he would see women in pornographic films, you know, they're using pseudonyms, trying to keep their identities hidden because there's so much stigma around that kind of sex work, he would search them on PimEyes, find out their real names, find their high school photos, and basically compile this big dossier of who they really are.
[00:30:52] And eventually he got sick of doing that, and he went through his Facebook friends list and would just look for his friends' faces to see if they had any risque photos online. And he found them. He found them on revenge porn sites. Oh man. Somebody had been in a naked bike ride, and all these photos had been safely obscure 'cause they weren't attached to these women's names.
[00:31:13] But once the internet was reorganized around our faces, all of a sudden he could find it.
[00:31:18] Jordan Harbinger: I went to the University of Michigan, and we had this thing called the Naked Mile. Basically, you would just run through campus naked, and tons of people did it. And every year they were like, stop doing this.
[00:31:27] And we're like, ah, what a bunch of prudes. And they kept saying, photos of this are gonna end up places. And I remember being like, come on, man. So somebody takes a photo of me wearing sunglasses and a hat, biking or running naked through campus. Who cares? No one's gonna find it. And now it's like, oh.
[00:31:46] Actually, technology evolved over the last two decades. Now there's the guy we're interviewing for this job wearing a fluorescent orange hat and sunglasses, and literally nothing else, running through the quad. And it's like, maybe people just go, oh well, we all did stupid stuff in college, but maybe they also do something really nasty with it.
[00:32:06] And the revenge porn stuff is terrifying. For people who don't know what revenge porn is, how would you explain what it is?
[00:32:12] Kashmir Hill: So they're non-consensual intimate images. It's basically, you were in a relationship, you shared intimate photos of yourself, selfies, and then the relationship goes south, and the person you shared the images with puts them on the public internet to punish you and embarrass you.
[00:32:28] Jordan Harbinger: And it can be really bad. Sometimes people film each other without their consent. And then there was a website that somebody eventually, I won't even give the name, but somebody bought it and shut it down. They spent millions of dollars on it because it was this huge revenge porn site, and people's lives were essentially ruined by this.
[00:32:43] You get some 16-year-old girl on there who now wants to kill herself because her stupid ex-boyfriend put all these videos up online. So this makes it even worse, because at least then it was like that guy had to share it with her friends, and that was the worst of it. And then a few years later, she could be free of this, maybe move away or something.
[00:33:02] Now this stuff follows you. You can move to the jungles of Cambodia, and somebody can find you and your photos on a site like this using this technology. It's really creepy that this guy searched for adult film stars and then went to find them elsewhere. Is that stalking if you don't go after the person?
[00:33:18] It's still so invasive and weird to me to do that.
[00:33:21] Kashmir Hill: Yeah. I mean, he considered it. He said, you know, I'm just a digital peeping Tom, I'm not acting on this. But you could certainly imagine just how nefarious it could be. And just this idea, I mean, what you're saying about the naked run, there are so many times in which we rely on being anonymous in a crowd, being surrounded by strangers, not having all these moments just follow us forever.
[00:33:47] And that's my great fear with facial recognition technology, that you couldn't be in a restaurant having a sensitive conversation without worrying that somebody around you who's a stranger to you might overhear something juicy. Mm-Hmm. And then they take your photo and all of a sudden they know who you are.
[00:34:02] Right. And they can understand the context of the conversation. Just all these different moments. Buying something sensitive in a pharmacy, you know, walking out of a Planned Parenthood. Just all these moments where somebody could take a photo of you having a bad day on the subway and you're rude to somebody.
[00:34:16] They take your photo, they know who you are. Mm-Hmm. Just so many things could haunt you in this new world if we are just identifiable all the time by each other, by companies, by governments.
[00:34:25] Jordan Harbinger: Yeah. People shouldn't have to take Moscow-1984 CIA precautions to meet with somebody and have a sensitive conversation.
[00:34:34] It sounds like I'm exaggerating. But even now, of course, or especially now, you can't meet in public, because then they see this person who's supposed to be an NGO worker at the US Embassy meeting with this other person, and it's like, oh, well, what are you meeting with them for? And you met him in another place earlier, and we just asked you, and you said you didn't know them, or you said you knew them under these circumstances, but that's not true.
[00:34:54] And so now they have to, and they probably always have had to, meet in these very dead-drop-type ways. I mean, dropping a note in a tree stump that's in code. They still do some equivalent of this, even with digital communication, because of stuff like this. And as I said before, with domestic violence or people who are stalked, this is just an absolute nightmare for them especially.
[00:35:16] Kashmir Hill: Yeah, it makes you have to really rethink, you know, anytime you're photographed, anytime you're on camera. Right. And certainly anything you post publicly
[00:35:23] Jordan Harbinger: on the web. Authoritarian governments could do the even worse with this, right? We're talking about users now that have limited resources, but we mentioned before China.
[00:35:32] I'm thinking North Korea, Iran, Canada. Maybe not Canada. Well, who knows? I mean, the United States... Governments are gonna misuse this, I think, is where I'm going with this. And with the governments that are gonna misuse it, it's either gonna happen sort of by mission creep slash accident, the US, the West, Canada, whatever.
[00:35:50] But there are countries that are gonna do this because it controls the population like Iran and North Korea.
[00:35:56] Kashmir Hill: Yeah. I mean, some countries have deployed facial recognition technology far more widely than we have in the us. So in Moscow, they've deployed facial recognition algorithms on surveillance cameras.
[00:36:07] So they get real-time alerts when they're looking for wanted criminals or missing persons. I've heard about people who get stopped because they have that doppelganger problem, where the system keeps saying that they're this wanted criminal. They get stopped, they have to show ID, I'm not that person.
[00:36:23] Oh man. In China, during the protests in Hong Kong, the protestors would scale the camera poles and try to paint over the cameras, because facial recognition was being used to identify them. It's used there to automatically ticket people for jaywalking, and to name and shame people who wear pajamas in public in this one city.
[00:36:44] Okay. And in a public restroom in Beijing, they were having problems with toilet paper.
[00:36:49] Jordan Harbinger: Toilet paper. Yeah, I heard about this.
[00:36:52] Kashmir Hill: So they installed facial recognition technology. You have to look at the camera, and it dispenses a certain amount of toilet paper. And if you want more, you have to wait like seven minutes and look into the camera again.
[00:37:01] This is the problem of once you start putting the infrastructure in place. Maybe the intention originally is for safety and security purposes, but then you realize, oh, it's also good for all these other purposes, and suddenly you have this very controlled society where you're afraid to kind of poop in public.
[00:37:19] Jordan Harbinger: Yeah. Like, I'm willing to use my face to unlock my phone, but when you're asking me to unlock the TP roll, I'm already in dire straits. I'm on a street in Beijing using a public restroom. I obviously need more than eight squares of toilet paper. Bring your own TP if you ever go to Beijing. That's the takeaway from this podcast.
[00:37:37] Buy TP. Yeah. In the US, we have to look for corruption and other issues where this could go horribly wrong, right? Our legal system's gonna have to change to accommodate this. Do you have any thoughts on what might be necessary as far as changes in the law? I know you're not an attorney, but I'm curious if you have thought about this.
[00:37:54] Kashmir Hill: Yeah. I'm a journalist, so I try not to give policy recommendations, but I can tell you what I have seen happening. Sure. And the reaction has been really different in Europe and Australia and Canada, as you mentioned, than in the US. Mm-Hmm. Those countries, when they found out about Clearview AI, said, hey, this company violates our privacy laws.
[00:38:15] You can't just collect photos of people and their sensitive biometric information and put them in a database without their consent. You can't do that here. And many jurisdictions fined Clearview AI, and they basically kicked the company out of their countries. So that's a really different reaction to this, right?
[00:38:31] Whereas in the US, we just don't have a national law like that, that gives us that kind of control over our personal information. You know, there are some rare exceptions. Illinois happens to have a state law that says you can't use people's voice print, face print, or fingerprints without their consent.
[00:38:47] Wow. Or you have to pay a big fine if you're a company. And so if you live in Illinois, you have more protections over your face. But for most of the rest of us, that doesn't exist. So I think that's one thing we could think about: should these companies be allowed to make these databases or not? Should we have control over these sensitive pieces of information? And if we are gonna have it out there,
[00:39:09] how should it be used? Should Madison Square Garden be able to ban lawyers? Yeah, well, they can't everywhere. Madison Square Garden actually owns a theater in Chicago, and they can't use facial recognition technology to keep the lawyers out there, because it would violate this law that Illinois has. Wow. They'd have to get the lawyers' consent to ban them.
[00:39:24] Jordan Harbinger: It seems like surveillance technology is always misused, even when we start off with the best of intentions. Didn't the FBI, or I might be getting this wrong, but somebody, try to get Martin Luther King to sort of back off his stuff, or even want him to kill himself, through blackmail?
[00:39:42] Am I imagining this, or is this a thing that happened?
[00:39:44] Kashmir Hill: Yeah, I mean, there was this time in the 1950s and 1960s when they called it the electronic listening invasion. It was a time when there were all these little bugs and wiretap equipment, and yeah, it was widely used to crack down on crime, but also to monitor dissidents.
[00:40:01] And so I think they bugged Martin Luther King Jr.'s hotel rooms, his office. They just had these listening devices everywhere. They recorded evidence of an extramarital affair, sent a tape to him and his wife, and, yeah, I mean, they were encouraging him to commit suicide, to kind of step down from the movement.
[00:40:21] Hmm. So yeah, that was a time in which we were seeing surveillance technologies misused. And that was actually a time when people were freaking out in general that they wouldn't be able to have private conversations anymore, 'cause they were so worried about these new technologies. And we did react back then. We passed laws that said you can't just be recording people secretly without consent.
[00:40:44] That the government needed to go through certain steps to be able to use that kind of technology. And it's the reason why the surveillance cameras that are all over the United States are only recording our images and not our conversations. So that's where I'm hopeful. I mean, I think there have been moments in time before where we said we want a certain kind of privacy to exist in the world,
[00:41:02] and we're gonna pass laws to, you know, create the future we want and not just let the technology dictate it for us.
[00:41:11] Jordan Harbinger: Huh, that's interesting. Then I wonder why my Ring doorbell can do audio and video. Is it just that people consent to being recorded 'cause they're on my property at that point? I wonder how that works.
[00:41:19] Kashmir Hill: Yeah, it is a little legally complicated. I've wondered about that, 'cause it is recording audio. I don't know how much it catches or how far away they are. But I know that people who do have Ring doorbells are encouraged to have signage that says, hey, you're being recorded here, and audio's being recorded.
[00:41:38] I think you're actually supposed to notify people.
[00:41:39] Jordan Harbinger: Yeah, 'cause otherwise... Surveillance cameras, I mean, you're right, they don't have audio. I never really noticed that. And yet some of the stuff you buy for your house absolutely does, like all my cameras here.
[00:41:55] But yeah, they have audio and video and that's one reason why they're not in rooms where people might change, right? Or something.
[00:42:04] Kashmir Hill: And I think it's a little different when it's your house, it's your property. Whereas most surveillance cameras are kind of in public spaces. But yes, you know, when it's outside, there might, you might wanna put up a sign saying You're being recorded here.
[00:42:17] Yeah. If nothing else, it will probably deter bad behavior.
[00:42:23] Jordan Harbinger: This is the Jordan Harbinger show with our guest Kir Hill. We'll be right back. This episode of the Jordan Harbinger Show is brought to you by Nissan. Ever wondered what's around that next corner, or what happens when you push further? Nissan SUVs?
[00:42:34] Have the capabilities to take your adventure to the next level. As my listeners know, I get a lot of joy on this show talking about what's next. Dreaming big, pushing yourself further. That's why I'm excited once again to partner with Nissan because Nissan celebrates adventurers everywhere. Whether that next adventure for you is a cross country road trip or just driving yourself 10 minutes down the road to try that local rock climbing gym, Nissan is there to support you as you chase your dreams.
[00:42:56] So take a Nissan Rogue, Nissan Pathfinder, or Nissan Armada and go find your next big adventure. With the 2024 Nissan Rogue, the class-exclusive Google built-in is your always-updating assistant to call on for almost anything. No need to connect your phone, as Google Assistant, Google Maps, and the Google Play Store are built right into the 12.3-inch HD touchscreen infotainment system of the 2024 Nissan Rogue.
[00:43:16] So thanks again to Nissan for sponsoring this episode of the Jordan Harbinger Show and for the reminder to find your next big adventure and enjoy the ride along the way. Learn more at nissanusa.com. This episode is brought to you in part by US Bank. Seems like there's a credit card for everything these days, right?
[00:43:31] Food cards, cards for travel, cards for rare stamp collecting. For me, I don't know what I'm gonna be spending money on from one minute to the next, but wouldn't you know it, US Bank has a card for people like me. Check out the US Bank Cash Plus Visa Signature Card. With this card, you get up to 5% cash back on two categories that you choose every quarter.
[00:43:49] The great thing is, the earning doesn't stop there. Even after you choose your first two earning categories, you also earn 2% back on one everyday category you choose each quarter, like gas stations and EV charging stations, or grocery stores, or restaurants, and you still earn 1% on everything else. Apply today at usbank.com/cash+card.
[00:44:07] All that already sounds good, but this card just keeps earning, with a $200 rewards bonus after spending a thousand dollars in eligible purchases within the first 120 days of account opening. If you like choosing how your card earns, apply at usbank.com/cash+card. Limited time offer. The creditor and issuer of this card is US Bank National Association, pursuant to a license from Visa USA Inc.
[00:44:26] Some restrictions may apply. If you like this episode of the show, I invite you to do what other smart and considerate listeners do, which is take a moment and support our sponsors. All the deals, discount codes, and ways to support the show are at jordanharbinger.com/deals. And if you can't remember the name of a sponsor or you can't find the code, just email me.
[00:44:43] I'm jordan@jordanharbinger.com. I'd be more than happy to surface that code for you. Yes, it is that important. Thank you so much for supporting those who support the show. Now, for the rest of my conversation with Kashmir Hill. In repressive regimes, I'm thinking like Iran, what if you're secretly a Christian, right?
[00:45:01] They could figure out who you are by your movement pattern, and if they ever caught you, or caught a glimpse of you going into a secret church or something like that, you could be in real trouble. And I bring up this weirdly specific example because I had an Uber driver who was from Iran, and I was like, oh, tell me how you ended up here.
[00:45:17] And it turns out he was basically a secret Christian. His church was literally underground in a basement somewhere. And the Christian community in the United States helped him escape because they were being prosecuted and persecuted by the government. It was apparently illegal to do what they were doing to go and be Christian in a basement, I guess.
[00:45:36] I don't know. And he was still so afraid of the Iranian regime, he wouldn't even tell me more details. He changed the subject when I asked another question. He's like, hey, I don't want to talk about this. And so regimes like that having even more power over their population in real time, the ability to identify people like that, that's really terrifying.
[00:45:55] Especially, what if I'm an Iranian secret police officer and I want information on Iranians in the US? I could use Clearview and find out where these people are hiding or where they're living, and I could use it to harass them. And it wouldn't be hard for me to get an account by saying I'm a police officer in Boise, Idaho.
[00:46:13] I mean, this is an intelligence agency.
[00:46:14] Kashmir Hill: Yeah, I mean, Clearview has said that they do not want to sell their technology to authoritarian regimes, but the problem is, it is becoming easier and easier to create technologies like this. You know, there probably is some company, or some home-brewed effort, or the government itself that could create something like this.
[00:46:35] I mean, it really is a potentially powerful weapon for control.
[00:46:40] Jordan Harbinger: Like you said, the scary part of this is even if we end up, let's say we get Clearview and they're great actors, and maybe they are, and then it's like, okay, fine. Well an authoritarian regime just builds their own version of this. It's probably not that hard to scrape photos from social media websites or buy them in bulk from somewhere, or even have your state run Hacking intelligence agency get millions or billions of them for specific groups, it might be slow, it might be less efficient, it might be slightly less accurate, but if you're just trying to get everybody in Turk, Stan or North Korea into a database and make it searchable, it just, it can't be that hard with modern technology to do that if you have the resources.
[00:47:21] Kashmir Hill: Yeah, I mean, I think we're gonna see a world where your face has different amounts of privacy depending on where you live. One interesting thing we've seen happen in China is that they now have red lists. They have blacklists, these are wanted people, these are people to monitor. And then there are red lists for people who don't want to be seen by the cameras, who are authorities, you know, who know about this surveillance infrastructure and wanna be able to move through the world and not be tracked.
[00:47:52] Wow. And so I think that's very interesting, to see this privacy becoming a luxury good, that your benefit for being in power is that you're not seen by the cameras.
[00:48:03] Jordan Harbinger: So basically, how does this work in practice? The camera sees you, says, oh, that's a guy that's on the red list. Suddenly your name doesn't show up.
[00:48:10] And in the record of people who came and went, suddenly your photo's not available. Maybe it auto-blurs you from the security agents that are looking at this later on, and you're just anonymous. But I assume that's not something that you get unless you are a CCP official at a decent level.
[00:48:25] Kashmir Hill: Yeah, I think so. And one of my colleagues has done a lot of reporting on surveillance in China, Paul Mozur at the New York Times.
[00:48:34] And he said that, you know, we think of China as kind of this monolith, but the systems of surveillance are very localized, and so it's happening city by city; it's very different how the local authorities are using it. And so, yeah, there's basically a bunch of little sheriffs that have their own kind of surveillance systems, and so it looks different depending on where you live.
[00:48:55] Geez. For now, I don't know at what point that all gets locked in and intertwined.
[00:48:59] Jordan Harbinger: Sure, yeah. If something tests well. Like, the social credit score system is not all over China, but I have some teachers that have it in their town and other teachers that have never seen it. And yeah, it's quite interesting, 'cause of course, yeah, you're right.
[00:49:12] You think, it's in China, but what does that mean? Does that mean it's in Shanghai? Does that mean it's in rural Xinjiang, or does that mean it's only in Xinjiang and not anywhere else in the country? Or is it being tested in a small province that you've never heard of or been to, to see if it actually works, or if it causes more trouble than it helps solve? Trying to ban this kind of tech, it's impossible.
[00:49:31] It's like trying to ban alcohol, right? The technology exists. You can't put the genie back in the bottle. I don't think people are ready for this. This metaphor might not hold up, but I'll give it a shot. This tech turns everyone into a public figure. Now, I'm not really famous. I'm only barely scratching the qualifications of public figure by the loosest of definitions, right?
[00:49:51] But I know enough from my own experience of what might be called internet fame, if you can call it that. Most people don't want this at all, and many people won't be able to handle the consequences of it. What I mean is, and you said this earlier: most people like being anonymous at some level, not so they can shoplift and get away with it, but because, for most people, it just creeps them out to know that somebody knows all the places they've been in a whole week, how long they spent there, what they did when they were inside.
[00:50:18] For someone like me, it's fine in many ways. I like it when somebody recognizes me in public and says hi, and I've never met them before, and they know a ton about my life because of the parasocial relationship that we have on this podcast. But this is worse. This is a system tracking your every move, knowing all about you.
[00:50:34] And the only time it comes up to say hi is when you get a citation in the mail for jaywalking, or you get some coupons from the lingerie store that you thought nobody knew you even shopped at, or whatever. And that's just the commercial use of this system, right? Not the national security implications of this stuff.
[00:50:51] Kashmir Hill: Yeah, I mean, I think ultimately any technology like this, it's about power, and the more that you know about someone else, the more power you potentially have over them. And so what worries me about facial recognition technology is that, you know, there's a certain creepiness that we feel when we're online, right?
[00:51:11] When you're going to a website, and you know that there's, like, all these cookies on your computer, and basically you're transmitting who you are. They're tracking that you've been on their website; they know you've been on other websites. You can tell from the ads. That kind of feeling of being watched, being tracked,
[00:51:25] I just think it could all move into the real world with facial recognition technology, that our face would be this way of unlocking this whole online dossier about us. And yeah, there might be benefits to it, it could be great in some ways, but I think in other ways it'll be very chilling and kind of make you
[00:51:44] paranoid, for a reason, that all the time, you know, somebody might be looking at you, because they know that you're Jordan, and they're wondering what you're gonna pick up in the grocery store, or they saw that you, you know, bumped into somebody on the subway and you didn't say you're sorry.
[00:51:57] Just all these little tiny ways, and in much larger ways, that we could be surveilled and controlled, is alarming.
[00:52:05] Jordan Harbinger: I went out to dinner with a very sort of well-known fitness influencer, and the waiter was like, whoa, are you so-and-so? And he's like, yeah. And he's like, what are you gonna order?
[00:52:14] And I'm like, oh, this is hilarious. You have to order something healthy now. We were gonna get a pizza. You can't get this pizza. Now I'm eating a pizza; you are getting Brussels sprouts or something. And he's like, yeah, I kind of want wine and the cheesy flatbread. And the guy was like, really? And he's like, dude, I'm celebrating something.
[00:52:33] I'm here with my friend. It was so funny, 'cause I just thought, you are so painted into a corner with your brand. That guy's gonna take a snapshot of you eating cheesy flatbread and be like, he's a fraud. You can't eat this in public. You're screwed. The price of fame, the price of fame. It seems like this will be tremendously convenient, right?
[00:52:51] You can go shopping and walk out with stuff, and it'll be like, oh, we just billed this to your Amazon account. I think they're already working on how that works.
[00:52:57] Kashmir Hill: They've got the palm print, right? Like, I went to a Whole Foods outside of San Francisco, and they're trying to collect the palm so you could just, like, pay with your palm.
[00:53:05] I asked the clerk, I was like, how many people have actually signed up for this? He's like, very few. Yeah. It's not actually that appealing.
[00:53:13] Jordan Harbinger: It's invasive, and also it's kind of gross, 'cause you're touching something that everybody else is touching with their hand, which, in a post-COVID era, is not super advisable, I guess, for a lot of folks.
[00:53:22] And also, the convenience of that versus just paying with your card, or paying with your phone, is marginal. Facial recognition, where you just push the cart straight out the door to your car and it's like, here's everything that was in the cart, because it has RFID tags or something, that is convenient. And the security stuff is great.
[00:53:39] The stores could ban shoplifters. Hopefully not the wrong people, but the right people.
[00:53:44] Kashmir Hill: Yeah, that's certainly happening. That's happening already. Here in New York, grocery stores, Macy's, have facial recognition technology to keep people out. That's happening.
[00:53:55] And hopefully, again, you don't have a bad doppelganger who shoplifts.
[00:53:58] Jordan Harbinger: Yeah. Geez. I guess then you'd have to handle that with corporate, and they'd have to figure something else out. I mean, they wanna keep you as a customer somehow. I'm sure there's a policy for that. Hotels could greet guests by name. I think that's something you wrote about in the book.
[00:54:09] That's great. Or even, maybe I don't need a hotel key. Maybe when I get to my hotel room door, it just opens 'cause it sees me. That kind of stuff would be cool.
[00:54:17] Kashmir Hill: So this actually happened to me. I was on book tour. My publisher booked me at, uh, the Four Seasons in St. Louis, and I had a super early morning flight, like I had to leave at 5:30 AM, and so I arrived.
[00:54:28] I just felt really bedraggled, and I just didn't look good, and I kind of walk up to the hotel and they greeted me. They're like, oh, Kashmir Hill of the New York Times. And, you know, it was good customer service. They weren't using facial recognition technology. I don't know how they did it, but I was like, I don't wanna be recognized right now.
[00:54:45] I look, yeah, bad. Like, I just wanna be anonymous. And so that's really funny. It's not always good to be recognized.
[00:54:51] Jordan Harbinger: That's just like the fitness thing. It's like, I thought being famous would be fun, until I went outside in pajamas to get something from 7-Eleven. Yeah, that's how that works. You gotta take the good with the bad.
[00:55:04] So we can't stop this, but what can we do? What sort of world are we headed for? Because this sort of tech, and this sounds hyperbolic, but it basically ends privacy as we know it, doesn't it?
[00:55:14] Kashmir Hill: I think it does definitely end anonymity as we know it, and I do think that we can stop it. I just don't know exactly what that looks like.
[00:55:23] Maybe we decide police using this is a good thing, within reason, if they're doing proper investigation, not just arresting people 'cause a facial recognition system says maybe it's the same person. But yeah, maybe we decide we want the police to use this. Maybe we decide, okay, we're okay with companies using it to keep out shoplifters, but we don't want them using it for other things, like discriminating against people based on where they work,
[00:55:46] or because they're an investigative journalist or a government official. And maybe we don't want individuals to have it like that, where you can just identify another person without their consent, you know, search their face. I do think that these are regulatable, and that we could pass laws. We just need to choose to. Like, we need to act to protect anonymity if we think it's important.
[00:56:08] I could also imagine a version of this that people opt into. Like, a company that knows our social graph, like a Facebook or a Meta. You know, they're working on augmented reality glasses, and their chief technology officer has said, I'd love to put facial recognition capabilities in this. It would be great if you're at a cocktail party and there's this person that you've met five times before,
[00:56:27] and you should know their name, and you just look at them and our glasses tell you who they are. I could imagine a consent model for face recognition where you say, yeah, I'm gonna opt in. I'm gonna set the, you know, privacy settings for my face the same way I do with my Facebook profile, and set it to public, or set it to recognizable by friends, you know, people you're connected to, or you make your face private.
[00:56:45] I could kind of imagine that. I feel like that's the American style of embracing facial recognition technology. It's like, yeah, I'm okay with this. Like, I wanna be recognized, because of whatever benefits you get from being known.
[00:57:00] Jordan Harbinger: Casinos use this, don't they, to keep out card counters and things like that?
[00:57:04] I feel like that was my original introduction to this: casinos can actually use this. And it was amazing. And that was probably like 10 years ago.
[00:57:11] Kashmir Hill: Absolutely, they were very early adopters. Also using it: there are people who have problems with gambling who will put themselves on lists, like, I don't wanna be let into the casino.
[00:57:19] And so they'll use facial recognition technology to keep those people out as well. And what kind of scares me is that 10 years ago, the technology didn't work that well, especially in the real world. So I do wonder, you know, who was flagged who shouldn't have been, or who the technology missed.
[00:57:35] But yeah, casinos are definitely users. We're starting to see it in airports. Airlines, the TSA, are using face recognition, you know, as you're going through. So it's spreading quickly, which is why I think we need to assess it right now, before it's just so widely deployed that we have lost our anonymity.
[00:57:53] Jordan Harbinger: Look, TSA is the one place where I'm like, okay, maybe airports really need this, because you don't want somebody to flee who's got a warrant out for their arrest. You don't want somebody who is a known affiliate of a terrorist organization boarding a commercial airliner. That's like the one use case where I think most of us can agree is probably where we want this technology in use.
[00:58:18] Kashmir Hill: Well, this has been a point of friction in the US. We've really resisted the idea of putting facial recognition cameras out there, like putting facial recognition in surveillance cameras for real-time surveillance. There's been pushback here, and it was funny, 'cause I was talking to a vendor from the UK,
[00:58:33] where they're more open to that use case, and he's like, what's the matter with Americans? Like, why won't you guys just let us, you know, put this everywhere? It would be such an easy way to find wanted criminals, you know, missing persons, et cetera, fugitives. And we don't seem to like that. Right now, the way we're using it is you knowingly look into a camera, or, yeah, using it to solve crimes where you have surveillance camera footage that was recorded previously.
[00:58:57] Jordan Harbinger: I read in your book that Ukraine is using Clearview to figure out who the dead Russian soldiers are and then send proof of death to the families in Russia, in order to, probably in part, make sure they know the total cost of the war in human terms. And that's interesting, right? I'm not sure I'm totally against that.
[00:59:14] Transparency, even if that transparency is weaponized. If it causes people to wanna stop a war, or it tells a mother that her son is dead, instead of the Russian government saying, oh yeah, I don't know, maybe he ran away and lives in Belgium, it seems like that's a fair use of this. I find that really interesting, and very macabre,
[00:59:31] 'cause somebody's job is to take photos of all these dead soldiers' faces and run them through, and then reach out to their loved ones and be like, by the way... That's gotta be tough, even if you're fighting a war against those people.
[00:59:42] Kashmir Hill: Yeah. I think the reaction in the US was that it was a pretty creepy use case.
[00:59:47] That's what we often see in war, right? In war zones is the most extreme use of these technologies, and then they do tend to trickle down into society after that.
[00:59:57] Jordan Harbinger: What about glasses and clothing that destroy facial recognition capabilities? I feel like I've seen this online. I've seen protesters wearing weird stuff, like sequins and glitter glued on their face, and weird designs painted on, that supposedly help deflect facial recognition technology somehow.
[01:00:14] So is that real? And do I have to just walk around with war paint all the time in the future? It seems very cyberpunk to do that, by the way.
[01:00:22] Kashmir Hill: Yeah, I mean, the surest bet is a ski mask, but that's going to get you attention for other reasons. That'll flag. But yeah, there are always researchers that are playing this cat-and-mouse game, right?
[01:00:31] Like, how do you use technology to disrupt the technology? And so they'll find, like, an algorithm that they can break with either a weird pattern in makeup or, like, a sweater that has a whole bunch of little people on it. And they work, but they usually only work for a certain amount of time, and then the facial recognition vendor or the AI vendor will fix the problem.
[01:00:53] And so I don't think it's the most effective way to fight this long-term, or to solve this long-term. I do think it's more that we have to figure out what we want as a society, and then create those rules, create those laws.
[01:01:04] Jordan Harbinger: Yeah. The ski mask, or the sort of COVID face shield that's tinted, might work, I guess.
[01:01:10] Yeah. I mean, if you don't wanna get outed, you're gonna have to wear your gay pride balaclava to the next festival. I mean, I don't know if I'm gonna do that.
[01:01:18] Kashmir Hill: I did this story about PimEyes, one of those public face search engines I was talking to you about, and I asked my fellow reporters here at the New York Times to volunteer their faces so I could test the technology. And yeah, one of my colleagues, you know, she had a COVID mask on, and it was still able to pull up other photos of her.
[01:01:36] Another person had on sunglasses and a hat; it still worked. Yeah, it was kind of amazing how far these algorithms have come, and, you know, during COVID, basically all the facial recognition vendors trained their algorithms to work even when you're wearing that mask, so they can be hard to evade.
[01:01:54] Jordan Harbinger: Well, on the one hand, I love the convenience factor of being able to pay for something by simply walking outta the store and having it register
[01:02:00] me as the person who paid. And I know they do this in China for loans. You can use, like, a webcam thing and apply. Of course, people are tricking the loan companies with AI, and I had a Chinese teacher who's like, I have to call the police because somebody used a fake version of me to get a loan. Oh my gosh.
[01:02:17] Using a webcam. And I'm like, oh gosh, that's gotta be really hard if you're using it for banking. And I don't know how they do this. They make an AI version of you, or they take a video of you, and they somehow play it and it makes the camera think that it's you. I really don't know. But apparently this is a very broadly run scam in China.
[01:02:36] I mean, every payment system's imperfect at first, but this one's particularly scary, because if somebody steals your credit card, you can say, I didn't have the card on me at that time, or somebody got the number. But if somebody gets your face, it's a harder argument to be like, yeah, I wasn't there. Well, they used your face.
[01:02:51] Kashmir Hill: That's actually something I've worried about with deepfakes, that this will be a way that you can destroy someone's reputation, where you create a whole bunch of, like, deepfake versions of them doing something embarrassing, you know? Or create deepfake revenge porn, and you just kind of seed it on the internet like landmines,
[01:03:09] so that when somebody does a face search for them, all these images come up, and then you'll have to be like, wait, those aren't real. But you know, there's always gonna be that lag time where people think it's real. But sure, that is a scenario I'm definitely worried about.
[01:03:23] Jordan Harbinger: Yeah. I did an episode, episode 486, with Nina Schick, if you know her, about deepfakes, and that was years ago.
[01:03:30] Now they're actually happening. Before, it was just like, this is gonna be a thing. Now that technology has come and is here to stay. Well, thank you very much. It proves that I'm not just being a worrywart, which is kind of what I thought in the beginning, like, oh, it's not gonna be that bad.
[01:03:47] I don't know, I'm excited for the future of this, but yeah, we need regulation to step in. I know you won't make policy recommendations, but there's gotta be regulation on how this is used, or it's gonna get misused. It's gonna get misused anyway by the NSA and things like that, so ideally it's not being misused by
[01:04:03] everyone who can pay for a membership to the app. That's what I'm really worried about.
[01:04:08] Kashmir Hill: Yeah, I mean, I think there's justified paranoia about this. I think that there are good use cases, and that ultimately we wanna harness the good about technology and avoid the bad. And it is time to act on AI, on everything
[01:04:21] from facial recognition technology to deepfakes. I mean, we have got to get on top of this.
[01:04:28] Jordan Harbinger: Kashmir Hill, thank you so much for coming on the show. Really appreciate it.
[01:04:31] Kashmir Hill: Thanks so much, Jordan. This is great.
[01:04:34] Jordan Harbinger: You're about to hear a preview of the Jordan Harbinger show with former Google Design ethicist, Tristan Harris, who helped build social media and is now sounding the alarm on its issues.
[01:04:44] YouTube is an engagement platform. TikTok is an engagement platform. Snapchat is an engagement platform because what they have in common is predating on human behavior and human attention as a commodity. It's an extractive business model that's like the Exxon of human anxiety. It pumps human anxiety and drives a profit from the turning of human beings into predictable behavior.
[01:05:06] And predictable behavior means the seven deadly sins, the worst of us. We're worth more when we're the product as dead slabs of human behavior than we are as freethinking individuals who are living our lives. When you are scrolling a newsfeed, you have a super computer that's pointed at your brain. They know everything about your psychological weaknesses that you don't even know about yourself.
[01:05:25] If I had TikTok open on my phone, and I watched one video and I said, oh, that's kind of funny, and I'll scroll to the next one, who's really the author of that choice? TikTok and Instagram both have programs to actively cultivate the influencer lifestyle and make that as attractive as possible, because we are worth more when we are addicted,
[01:05:43] outraged, polarized, anxious, misinformed, validation-seeking, and not knowing what's true. I think it's pretty easy to see that a society in which it's more profitable for each person to be addicted, narcissistic, distracted, confused about reality, not knowing what's true, that is not a society that can solve its problems.
[01:06:01] That is not a society that can solve climate change. That is not a society that can escape pandemics or agree on anything, and that is incompatible with the future that we wanna live in. We need a society that is consciously using tech to make a stronger, healthier, better, 21st century open society. And we either do that or we call the American experiment over, I think.
[01:06:23] To hear how technology is hacking human brains and attention spans, check out episode 533 of the Jordan Harbinger Show. Really interesting conversation. This book, by the way, goes over the history of some of this type of facial recognition and other surveillance technology that's related to it. Knowing your face is gonna get scanned at a protest,
[01:06:43] yeah, I think that might chill free speech and freedom of assembly. And I know China has already started using this against its own citizens, especially the Uyghurs. So there's a dystopian reality here that seems almost unavoidable when this is widespread. I do love the idea of being able to catch dangerous criminals by using facial recognition.
[01:07:03] People who are dangerous and eluding justice won't be able to do so for very long, and I feel like that would make our country a lot safer. However, as we've covered before, governments, well, they misuse technology like this something like a hundred percent of the time. So that's what gives me the biggest pause here.
[01:07:18] All things Kashmir Hill will be in the show notes at jordanharbinger.com. You can also ask the AI chatbot on the website. Transcripts are in the show notes. Advertiser deals, discount codes, and ways to support the show are all at jordanharbinger.com/deals. Please consider supporting those who support the show.
[01:07:35] We've also got our newsletter. Each week, the team and I dig into an older episode of the show and dissect the lessons from it. So hey, if you're a fan of the show, you want a recap of important highlights and takeaways, or you just wanna know what to listen to next, the newsletter is a great place to do just that.
[01:07:47] jordanharbinger.com/news is where you can find it. We're gonna be doing some giveaways there as well. We did finish the flashcards; those should be available now. If you are in Six-Minute Networking, they will pop up right there in the course. We'll do a formal announcement at some point, and Six-Minute Networking, by the way, is over at sixminutenetworking.com.
[01:08:04] I'm @JordanHarbinger on both Twitter and Instagram. You can also connect with me on LinkedIn. This show is created in association with PodcastOne. My team is Jen Harbinger, Jase Sanderson, Robert Fogarty, Millie Ocampo, Ian Baird, and Gabriel Mizrahi. Remember, we rise by lifting others. The fee for this show is that you share it with friends
[01:08:21] when you find something useful or interesting. The greatest compliment you can give us is to share the show with those you care about. If you know somebody who's interested in this kind of privacy stuff or emerging technology, definitely share this episode with 'em. In the meantime, I hope you apply what you hear on the show so you can live what you learn, and we'll see you next time.
[01:08:41] Thanks again to Nissan for sponsoring this episode of the Jordan Harbinger Show. Learn more at nissanusa.com.
[01:08:47] Adam Carolla: Hey buddy. Hey buddy. What's going on, man? Hi guy.
[01:08:51] Yeah, yeah, the team Loveline, man. You guys remember us from back in the day?
[01:08:56] Dr. Drew: Well, we're doing a pod, and we're doing it every day, and we've been doing it for a while.
[01:09:00] And if I hear one more time people say, God, I loved you and Adam together on Loveline, I'm like, yeah, yeah, we're doing a podcast. Will you please just join us at the Adam and Dr. Drew Show, at adamanddrdrewshow.com. What a great show. Come on now. Only on PodcastOne. That's us, the Adam and Dr. Drew Show, just like the old days. Doctor's orders. Oh, oh man. You're funny. Yep. All right. Let's go save some babies.
[01:09:23] Adam Carolla: Let's do it.