Companies harvest 6GB of your data hourly. Psychologist Sandra Matz explains how they predict everything from depression to politics—and how to fight back.
What We Discuss with Sandra Matz:
- Companies collect ~6GB of data per hour on individuals through social media, credit cards, smartphones, and location tracking, enabling predictions about personality, politics, and mental health.
- Facebook identified depressed teenagers in 2015 and sold this information to advertisers rather than providing support, prioritizing profit over well-being.
- Algorithms need just 300 likes to know someone better than their spouse, while facial recognition can determine sexual orientation with 81% accuracy from facial features alone.
- “Anonymized” data isn’t truly anonymous — three credit card transactions can uniquely identify a person, revealing unintentional information beyond our curated online personas.
- Data co-ops offer a practical solution for regaining control. MS patients in Europe and Uber drivers in the US have formed co-ops to collectively manage their data, allowing them to benefit from data aggregation while maintaining ownership and directing outcomes toward their shared interests rather than corporate profit.
- And much more…
Like this show? Please leave us a review here — even one sentence helps! Consider including your Twitter handle so we can thank you personally!
Ever wonder why that gorilla suit Halloween costume ad seems to follow you around the internet after your battery hits 15 percent? In today’s world, we’ve unwittingly traded the nosy neighbor from a small town for an infinitely more observant digital equivalent. The village gossip who once noticed you running late for the bus has been replaced by algorithms that not only track your tardiness but predict your personality traits, political leanings, and even mental health status with unsettling accuracy. It’s a peculiar paradox — while we carefully curate our online personas with flattering vacation photos and witty status updates, our unintentional digital exhaust (those six gigabytes of data we generate hourly) tells a far more intimate story. And unlike the village gossip whose memory might fade, these digital observers never forget, creating a permanent record of behaviors we ourselves might not even recognize as revealing.
On this episode with data scientist and Mindmasters: The Data-Driven Science of Predicting and Changing Human Behavior author Sandra Matz, we discover just how deep this digital rabbit hole goes. Sandra reveals how after just 300 likes, Facebook knows you better than your spouse does, and facial recognition can determine sexual orientation with 81 percent accuracy just by analyzing facial features. Even more alarming, Sandra explains how Facebook identified depressed teenagers and, rather than offering support, sold this vulnerability to advertisers — a sobering example of how our digital footprints can be weaponized. But Sandra isn’t just sounding alarms — she offers hopeful alternatives like data co-ops, where MS patients in Europe and Uber drivers in America have created systems to collectively own and harness their information for mutual benefit rather than corporate profit. “We need these positive visions to even get us started,” Sandra insists, offering a refreshingly actionable counterpoint to digital doom and gloom. Whether you’re a privacy advocate, a tech enthusiast, or simply someone who’s ever wondered why your phone seems to read your mind, this conversation illuminates both the unseen architecture of our digital lives and the blueprints for reclaiming some sovereignty within it. Listen, learn, and enjoy!
Please Scroll Down for Featured Resources and Transcript!
Please note that some links on this page (books, movies, music, etc.) lead to affiliate programs for which The Jordan Harbinger Show receives compensation. It’s just one of the ways we keep the lights on around here. We appreciate your support!
- Sign up for Six-Minute Networking — our free networking and relationship development mini-course — at jordanharbinger.com/course!
- Subscribe to our once-a-week Wee Bit Wiser newsletter today and start filling your Wednesdays with wisdom!
- Do you even Reddit, bro? Join us at r/JordanHarbinger!
This Episode Is Sponsored By:
- The Moonshot Podcast: Listen here or wherever you find fine podcasts!
- Airbnb: Find out how much your space is worth at airbnb.com/host
- Tonal: Go to tonal.com and use promo code JORDAN for $200 off
- Progressive: Get a free online quote at progressive.com
- BetterHelp: Get 10% off your first month at betterhelp.com/jordan
How does James Patterson craft stories you just can’t put down? Find out on episode 1100: James Patterson | Building the Architecture of Addictive Fiction here!
Thanks, Sandra Matz!
Click here to let Jordan know about your number one takeaway from this episode!
And if you want us to answer your questions on one of our upcoming weekly Feedback Friday episodes, drop us a line at friday@jordanharbinger.com.
Resources from This Episode:
- Mindmasters: The Data-Driven Science of Predicting and Changing Human Behavior by Sandra Matz | Amazon
- Sandra Matz | Website
- Sandra Matz | LinkedIn
- Sandra Matz: Psychological Targeting: What Your Digital Footprints Reveal About You | TED Talk
- Psychological Targeting: The New Frontier in Ad Targeting | Ad Council
- What Psychological Targeting Can Do | Harvard Business Review
- The Nosy Neighbor is Your Web Browser: How Psychology Went Digital | Next Big Idea Club
- Chronicles of a Nosy Neighbor: Mrs. Kravitz’ Funniest Moments | Bewitched
- Pay Attention to “Identity Claims” to Read People More Effectively | Lifehacker
- How to Turn ‘Data Exhaust’ into a Competitive Edge | Knowledge at Wharton
- Facebook Told Advertisers It Can Identify Teens Feeling ‘Insecure’ and ‘Worthless’ | The Guardian
- Data Is Permanent but Leadership Isn’t by Sandra Matz | LinkedIn
- 20 Years of Facebook, but Trust in Social Media Remains Rock Bottom | Infosecurity Magazine
- Digital Footprint Statistics in 2024 | Digital Risk Inc.
- 28 Ways Companies and Governments Are Collecting Your Data Every Day | Business Insider
- Who Can See Your Internet Search History: Full Guide | NordVPN
- Language Patterns Discriminate Mild Depression from Normal Sadness and Euthymic State | Frontiers in Psychiatry
- Here’s Proof That Facebook Knows You Better Than Your Friends | Time
- How Chick Sexing Is Like Machine Learning | Triangulation
- I Made an AI Clone of Myself…and It’s Freakishly Good | Sara Dietschy
- I Took a ‘Decision Holiday’ and Put AI in Charge of My Life | The New York Times
- New AI Can Guess Whether You’re Gay or Straight from a Photograph | The Guardian
- A Picture to Die For | Slate
- Iron Maiden: ‘Fame Is the Excrement of Creativity’ | The Guardian
- Creating Word Clouds from Facebook Page Comments | Igor Rinkovec
- From Words to Wealth: How Rich People Talk Compared to Poor People | Practical Wisdom
- Are Extroverts Better Looking? | Psychology Today
- A Comprehensive Guide To Facial Recognition Algorithms | Hyperverge
- People Can Be Identified Through Their Credit-Card Transactions | Nature
- Moran Cerf | Hacking into Our Thoughts and Dreams | Jordan Harbinger
- History of the Cambridge Analytica Controversy | Bipartisan Policy Center
- An Attacker Killed a Judge’s Son. Now She Wants to Protect Other Families | NPR
- Scott Galloway: Tech Companies Should Be Broken Up | Literary Hub
- Data Co-Ops Empowering Individuals | The Workplace Podcast
- Radical Proposal: Data Cooperatives Could Give Us More Power Over Our Data | Stanford HAI
- Co-Op Helps Uber, Lyft Drivers Use Data to Maximize Earnings | TechCrunch
1135: Sandra Matz | How Algorithms Read and Reveal the Real You
This transcript is yet untouched by human hands. Please proceed with caution as we sort through what the robots have given us. We appreciate your patience!
[00:00:00] Jordan Harbinger: Coming up next on The Jordan Harbinger Show.
[00:00:02] Sandra Matz: Facebook in 2015 was actually accused of predicting whether teenagers on their platform were struggling with anxiety, depression, low self-esteem, and then they were selling them out to advertisers. So this is someone at their most vulnerable state. Not only are they suffering from anxiety, they're also teenagers.
They're still figuring out their identity. So the moment that you tap into this vulnerability, the damage that you can do, I mean, it's very obvious.
[00:00:30] Jordan Harbinger: Welcome to the show. I'm Jordan Harbinger. On The Jordan Harbinger Show, we decode the stories, secrets, and skills of the world's most fascinating people and turn their wisdom into practical advice that you can use to impact your own life and those around you. Our mission is to help you become a better informed, more critical thinker through long form conversations with a variety of amazing folks.
From spies to CEOs, athletes, authors, thinkers, and performers, even the occasional cold-case homicide investigator, hostage negotiator, gold smuggler, or Russian spy. And if you're new to the show or you wanna tell your friends about the show, I suggest our episode starter packs. These are collections of our favorite episodes on topics like persuasion and negotiation, psychology and geopolitics, disinformation, China, North Korea, crime and cults, and more.
That'll help new listeners get a taste of everything we do here on the show. Just visit jordanharbinger.com/start or search for us in your Spotify app to get started. Today, my friend Sandra Matz shows us how companies steal our data and use it to target us, not only to sell us things, but how they can essentially read our moods, almost read our minds; how AI and computers get to know us on such an intimate level; what they can do to predict things like depression, which unfortunately they monetize instead of helping us solve; and how we leave millions of digital footprints each day.
This is a bit of a deep episode on the data we leave and how it is used. A lot of fascinating details in here. Algorithms can tell if someone is gay 81% of the time just using their face. That's pretty interesting. They might have better gaydar than humans. Also, how computers actually test their assumptions about us, and what having a low phone battery means about you.
And yes, we already know you're one of those. Now here we go with Sandra Matz.
I think everybody knows that companies collect a lot of data on us, but I guess I didn't personally realize how granular they could get with psychological targeting. Can you explain, first of all, what that is?
[00:02:17] Sandra Matz: Yeah, so psychological targeting is, in a way, taking all of the digital traces that you leave.
So that ranges from what you post on social media, to you swiping your credit card, to your smartphone capturing all of these very intimate things, like where you go based on GPS, or you making and taking calls, and then translating those footprints into meaningful psychological characteristics. So anywhere from your personality, your values, your political ideology, to your sexual orientation.
So really painting a picture of the person behind the data.
[00:02:45] Jordan Harbinger: You know, it just occurred to me, I wonder if they collect data in the same way when I pay with my phone. And the answer is now I'm just telling Apple and the credit card company everything that I buy, and I know that they know this because it'll pop up.
Like, your Amex has been charged $47 for eating at this restaurant. So of course they're logging that. Right. They're eventually gonna use that against me.
[00:03:05] Sandra Matz: Exactly. It must go through a bank.
[00:03:07] Jordan Harbinger: Yeah, it goes through a bank. But it's also, I'm telling Apple, and what business do they have knowing what I'm buying?
And the answer is, they're in the business of data, just like everybody else, probably.
[00:03:16] Sandra Matz: Yeah. And it's funny 'cause Apple is one of these cases where they shut down the third party tracking, but they still collect all of the data. So at the end of the day they benefit. 'cause now they're holding the monopoly on the data that they capture.
[00:03:28] Jordan Harbinger: Yeah, I noticed whenever you install something new, it's like, this app wants to track you, and it's Ask App Not to Track. It doesn't say we won't track you, it says we're not gonna let them track you. Don't worry though. We're still tracking you, obviously. Yeah. Mm-hmm. We're still tracking everything that you're doing.
It's like location data. Somebody told me the other day, oh, I turned off location data, so they don't know where I'm at. And I'm like, they know. They're just not sharing it with you and your friends and your mom. It's not like the FBI can't find you. Come on.
[00:03:51] Sandra Matz: Yeah. And also it depends on what you turn off, right?
You might be able to turn off the GPS, you still need to be connected to a cell tower, otherwise your phone doesn't work. The fact that you turn off GPS doesn't mean that you're not trackable.
[00:04:03] Jordan Harbinger: So I was doing training for some journalists a couple of months ago in another country, and one of the things that they did was they put all of our phones in like a safe box.
And I said, oh, I'll just turn it on airplane mode. And they were like, oh, sweet summer child, that's not gonna do anything to stop intelligence services or whatever from finding it or turning on the microphone to hear what we're doing. And I thought, oh, that makes sense. Especially if they wanna get the location data, they can do that.
If the phone is in a lead box, then it just sort of vanishes. And I guess they had us put it in the box in one place and then we moved to another place, so they just couldn't track us that way. But that was the only method they had, and you had to put your watch in there. Everything.
[00:04:46] Sandra Matz: Oftentimes people take it even a step further, right?
Like, well, I'm not using social media, so nobody can really track me across the internet. It's so shortsighted, 'cause obviously you use your credit card, your smartphone, and there's CCTV on pretty much every corner, so people will find you. It's hard to escape.
[00:05:03] Jordan Harbinger: I do wanna address those counterarguments in a little bit, but first I wanna scare the crap out of everybody a little bit more.
The way that computers and AI get to know us at an intimate level is really hard to describe to people who have lived with some level of privacy their whole life. Growing up in the eighties, in a medium-sized town, some people were all up in my business, but they didn't really know that much about me.
I could hide stuff from parents pretty easily; at least you had the ability to do that. You, on the other hand, grew up in a really small town and everybody knew everything about you. This analogy was actually really good to illustrate the idea.
[00:05:36] Sandra Matz: It is a tiny, tiny town, so 500 people. My parents keep reminding me that it's grown to a thousand now.
[00:05:43] Jordan Harbinger: Oh, big time. It's a hundred percent bigger.
[00:05:46] Sandra Matz: It doesn't make any difference, 'cause it still meant that everybody knew everything about me, right? Who I was dating, what I was doing on the weekend, which music I was into. And what village neighbors do best is then make inferences about who you are. They saw me running to the bus every morning.
They probably figured out that I wasn't the most organized, and it doesn't stop there. Village neighbors are not just there to poke around in your life and into your psychology. They then try to meddle with your life. They are not really trying to figure out who you're dating, they wanna influence who you're dating. And sometimes that's really helpful, 'cause they know you, and you get this feeling of, there's someone who truly understands me, and when they have my best interest at heart, they're gonna give the best advice that I can possibly get. But also, oftentimes it felt a lot more manipulative, behind my back, without me necessarily having control or appreciating the support that I was getting in any way.
[00:06:39] Jordan Harbinger: Oh man, that would be really irritating because you know, it's like your parents theoretically have your best interest in mind, but the woman who lives five doors down, who babysat you twice when you were little, they don't really know you. They think they know you because they've made all these assumptions about you.
But they could be wildly off base. And also, oh, I think she should date that boy. Why? Because they both have dark hair. What the hell does that have to do with anything? You're not exactly using your genius matchmaking skills. It's like, the boy she's dating now, I don't like him because one day he dropped something on my lawn by accident.
Okay? 20 years ago he dropped something on your lawn by accident, his dog peed in your yard, so he's a bad kid now.
[00:07:15] Sandra Matz: And it's so idiosyncratic. That's actually what I find fascinating about the shift to the online world: we're doing it a lot more systematically. So your neighbors, they had their own biases, they had their own perspective on the world, and they were filtering all of the data that came in through their own lens and their own incentives.
Algorithms don't have the same incentives. They essentially do whatever you tell them to do. They optimize for the goal that you set for them. So the way that I've been thinking about it is essentially that we live in this digital village, where algorithms now replace our neighbor with essentially a digital neighbor who takes all of the data traces and makes the same predictions.
And for me, the important part, and this is coming back to something you said, is it used to be the case that what happened in the village stayed in the village, right? So maybe it travels to the next town, but if I wanted to escape, I just moved to Berlin, or I moved to a bigger place, New York, and that's it.
But it's no longer true for the digital space. Once your data is out there, everybody has access to it.
[00:08:11] Jordan Harbinger: It's funny 'cause people will ask me something like, wow, your life is really not that private because you have a podcast and you have this online brand. But the difference is I have been for the last 18 years-ish thinking about what goes online because I realize, oh, if I post this, it's part of my brand.
If I post that, it's part of my brand. Not that everything I do is branding, but it's just, I'm consciously aware that I can't start talking about, I don't know, buying a gun, without people being like, okay, so now you're this political this, or maybe you're this guy, or maybe you're that guy, or maybe you're having a midlife crisis or whatever, right?
You're doing something there. Whereas normal people, they post on Facebook and they go, I'm just telling my friends that I got a new rifle for hunting. But what you're doing is telling the entire world, for all of eternity, that you are now a certain type of person, or with 50% probability maybe this person. You forget you posted that last Monday, but the algorithm never forgets that you own a Browning with a scope that you can use to shoot a hundred yards.
They'll never forget that for the rest of your life.
[00:09:13] Sandra Matz: The thing is that even if you don't explicitly put this out there, right? As psychologists, we think of data in two categories. One is these explicit identity claims, which is like posting on social media, right? That's you telling the world: here's the person who I am.
Here's how I wanna see myself, and here's how I want other people to see me. Now, that's a very intentional signal that you're sending. Then there's also all of the other traces. You don't have to post about you buying a gun. I can, just by tracking your GPS records, figure out that you probably went to a shop where most people buy guns, and if you do this repeatedly, or you go to a shooting range, then my assumption is, with a high percentage accuracy, that you probably own a gun.
And for me that's, in a way, the very intrusive part that we oftentimes forget: it's not just this explicit signaling, it's all of this behavioral residue that we create without really intentionally thinking about it.
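To make Sandra's point concrete, here is a minimal sketch of how repeated place visits could be turned into an attribute estimate, framed as a naive Bayes update. Every category, base rate, and likelihood below is invented for illustration; this is not any particular company's system, just the general shape of the inference.

```python
# Toy naive Bayes update: repeated visits to telling place categories
# shift the estimated probability of some attribute (here, gun ownership).
PRIOR = 0.3  # assumed population base rate (invented)

# (P(visit | has attribute), P(visit | lacks attribute)) -- all invented
LIKELIHOOD = {
    "shooting_range": (0.60, 0.05),
    "gun_shop":       (0.40, 0.02),
    "grocery_store":  (0.90, 0.90),  # uninformative: everyone goes
}

def posterior(visits: list[str], prior: float = PRIOR) -> float:
    """Update the odds once per observed visit, then convert back to a probability."""
    odds = prior / (1 - prior)
    for place in visits:
        p_yes, p_no = LIKELIHOOD.get(place, (1.0, 1.0))
        odds *= p_yes / p_no
    return odds / (1 + odds)

print(posterior(["grocery_store"]))                      # stays near the base rate
print(posterior(["shooting_range"]))                     # one visit: a guess
print(posterior(["shooting_range"] * 4 + ["gun_shop"]))  # repeat visits: near certainty
```

The point of the sketch is Sandra's "if you do this repeatedly" clause: one visit is weak evidence, but the same behavioral residue observed over and over compounds quickly.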
[00:10:06] Jordan Harbinger: That's interesting. I hadn't even thought about the fact that you don't have to post it.
It just comes out of the data exhaust or whatever they call it.
[00:10:12] Sandra Matz: Yeah. I think it's data exhaust
[00:10:13] Jordan Harbinger: where it's like location data. Oh, we don't need that. And it's actually now we can tell where this guy goes to lunch every day. Oh, that's useful. And we can advertise similar restaurants in the area. Like that all becomes useful as soon as they figure out what to do with the mess of data that they have, which is what AI is getting better at doing and these kinds of things.
One thing I felt was super interesting in the book was that these clues or data can predict depression and lots of people are depressed. I don't think that's a big surprise. There's like a million suicides a year. Is that global? I assume that's global. 'cause that's enormous.
[00:10:45] Sandra Matz: I think it's global. Yeah.
It's an insane amount.
[00:10:48] Jordan Harbinger: That's a huge number of people, no matter which way you slice it. 280 million people, give or take, are suffering from depression. I don't know how they figured that out or if it's an underestimate. So if we can predict depression, I have to assume that these companies are doing everything in their power to help users who are at risk, as soon as they get word.
[00:11:06] Sandra Matz: Of course. What else would they be doing?
[00:11:09] Jordan Harbinger: Not selling me a leather jacket and telling me it's gonna make me feel better.
[00:11:12] Sandra Matz: It's not a hypothetical. So Facebook in 2015 was actually accused of predicting whether teenagers on their platform were struggling with anxiety, depression, low self-esteem, and then they were selling them out to advertisers.
So this was a slide that was actually circulated. And you can imagine, right? So this is someone at their most vulnerable state. Not only are they suffering from anxiety, they're also teenagers. They're still figuring out their identity. So the moment that you tap into this vulnerability, the damage that you can do, I mean, it's very obvious.
Then there's also potentially beneficial use cases of that kind of tracking.
[00:11:45] Jordan Harbinger: It seems like, if you're not a complete psychopath profiteer (I get it, advertising to people is profitable), if you find out that teenagers are depressed, one of the best ways to get every parent on your side as a tech company would be like, hey, imagine this headline:
Facebook saves 10,000 teenage lives per year with depression tracking, notifying teachers or caring adults, parents, doctors, healthcare people, authorities, whatever it is. By tracking them online, parents would be like, here's your new phone so that you can use Instagram, because this is the only insight we really have into your life, and they're keeping you safe.
As opposed to the current narrative, which is the complete opposite.
[00:12:24] Sandra Matz: A hundred percent. And I was recently talking to a mother whose son attempted suicide, and it's traumatizing. And for me, we actually have this opportunity to catch it early. 'Cause as you said, typically how it works is you enter a full-on depression, which, first of all, even for an adult, once you're in it, is really difficult to get out of.
'Cause that's the point when you're turning inward. You're not necessarily seeking out help, and it's really hard to work your way out. What you would ideally do, and this is where the tracking actually comes in handy, is you catch it early. And what you can do with your phone, for example, is just look at your smartphone sensing data.
Maybe you're not leaving the house as much as you used to. Maybe there's much less physical activity. You're not taking as many calls anymore. So there's this deviation from your typical baseline. And again, it might be nothing. Maybe you're just on vacation and you're having a great time. But I can send this early warning signal to, say, two people you nominate.
Maybe if I know that I have a history of suffering from depression, which is oftentimes the case, and I see this coming in the future, I could say, I'm gonna nominate my spouse, and when this happens, I want you to notify me and I want you to notify my spouse. It's not a diagnosis, right? It doesn't replace a clinician coming in and going through all the questions, but it's at least one way of saying:
Just look into it. Maybe it's nothing, but to be on the safe side, why don't you try and get some support? And I think that's a total game changer.
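The mechanism Sandra describes, comparing passively sensed signals against a person's own baseline and nudging nominated contacts, fits in a few lines. A minimal sketch under assumed daily aggregates; the signals, the week-long window, and the z-score cutoff are arbitrary illustrative choices, and nothing in it requires sending raw data off the device.

```python
import statistics

def below_baseline(history: list[float], today: float, z_cutoff: float = -2.0) -> bool:
    """True when today's value falls well below this person's own baseline.

    `history` holds recent daily values of one passively sensed signal
    (steps, hours out of the house, calls made). Nothing here needs a server.
    """
    mean = statistics.fmean(history)
    spread = statistics.stdev(history) or 1.0  # guard against zero variance
    return (today - mean) / spread < z_cutoff

# Invented week of data: (last 7 days, today) for two signals.
signals = {
    "steps":      ([8200, 7900, 8500, 8100, 7600, 8300, 8000], 2100),
    "calls_made": ([5, 4, 6, 5, 4, 5, 6], 0),
}

# Only alert when several signals deviate together -- one odd day is noise.
if all(below_baseline(hist, today) for hist, today in signals.values()):
    print("Sustained deviation from baseline: notify user and nominated contact.")
```

Note that this flags deviation from your own routine, not a clinical threshold, which is exactly why Sandra frames it as a nudge to check in rather than a diagnosis.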
[00:13:49] Jordan Harbinger: It seems like this would be so easy to implement, because obviously they can trigger advertising if you're depressed. They could easily trigger an email, a phone call, a notification in your spouse's use of that app or whatever.
It could literally just call the police, or there could be some sort of central way to handle this. It surprises me that they haven't done it. Now, maybe that's naive. You'd say, that's not profitable, why would they do that? But it seems like it would, long term, be such a massive PR win and create almost an incentive for parents to get these products in the hands of their children that it would be ROI positive.
I don't know. What do I know?
[00:14:26] Sandra Matz: I'm not sure. If Facebook were to offer this tomorrow, I'm still not sure if I would want that. So I would rather have a dedicated entity that's not Facebook. Facebook has all of these market incentives it's committed to, and you don't know what the leadership looks like tomorrow.
There's this saying in the book: data's permanent and leadership isn't. So even if you had a CEO who today thinks, we're gonna use it to help people, who knows? The data's gonna be out there, and they could use it in very different ways tomorrow. So I'd much rather have a dedicated entity that doesn't even have to collect my data.
There are now ways in which you can track locally on the phone. You just send the intelligence that says, if these patterns show up, alert locally on the phone, and I never have to collect the data in the first place. Now, that's not Facebook's business model. Facebook's business model is grab as much data as you can and then see how you can commercialize it.
So even if Facebook offered that,
[00:15:19] Jordan Harbinger: I personally would not trust them. I don't think I trust them. But this is because of what they do with the data that they're already getting. If, years ago, they had said, hey, we can use this to target advertising, oh, and by the way, this looks like you might be starting down the path of having an eating disorder,
we're gonna notify somebody who you told us to notify, or your school authorities, then maybe we would have a totally different opinion of Facebook. Instead, remember early on, it was like, wow, I can keep in touch with all my friends from school. This is incredible. I know what my aunt is up to. I only talk to her like once a year; now I see her photos every week.
This is such a glorious product. I don't see what could possibly go wrong with this. And then, like two or three years later, it was, how the hell do they know so much about me, and why are they trying to influence what I purchase or the elections that we have? It was such a rapid downfall.
[00:16:08] Sandra Matz: It also didn't turn out to be connecting the world. So there's that.
[00:16:12] Jordan Harbinger: Yeah, exactly. How many digital footprints do we leave each day? Can you take us through a typical morning? 'Cause a lot of people, again, they're gonna say, I don't post updates on Facebook, and I don't let the phone track my location, or I don't even take my mobile phone with me when I go to the gym. Whatever it is, they're gonna have some sort of reason that this doesn't apply to them. But the amount of data we create is insane.
[00:16:33] Sandra Matz: It's just mind-blowing. So to start with, the average person generates about six gigabytes of data every hour. That's just the sheer volume, and when you break it down, it really taps into all of these different parts of your daily routines in life. So if you wake up in the morning, probably what most people do is they grab their phone, which means that just you unlocking the screen means that someone knows you probably woke up: the phone was stationary, maybe it was dark, so no ambient light, and you hadn't opened it.
The moment that you unlock it, someone knows that you're up. Then you're checking websites, you're sending messages, so someone knows exactly who's connected to whom and what you're interested in. My morning routine is essentially just going to the deli, getting a coffee, which means that I swipe my credit card. Again, someone knows that I've been out buying something in a specific location.
If you have a Fitbit or some kind of tracking device that counts your physical activity, it also sees when you deviate from your routine. So if you have a typical routine, and sometimes you don't do the same thing, people might have a sense that something is up.
Even if you don't have a Fitbit, you take your phone with you on the walk or on the way to work. There's cameras, and with facial recognition, someone again knows what you do, where you go, and so on. So there's all of these traces. Your car now has sensors in it that track anything from your speed (maybe you're going over the speed limit, maybe you're not a great driver) to you going from A to B.
So this idea that it's just social media that is really tracking us, coming back to Facebook, is wrong. There are just so many data traces.
[00:18:01] Jordan Harbinger: It's really impressive when you think about it that way. To give people who aren't tech nerds an idea of how much data this is, this is a MacBook Pro's storage every week, or maybe two.
So if you bought a laptop, it'd be full at the end of the week or halfway through the week, depending on how much storage you get in that thing. And companies are storing this, and there's so many people. So are they really storing a terabyte of information on me, personally, every week, 52 terabytes of data every year?
Because where is all of that data?
[00:18:31] Sandra Matz: It is a great question. Some of it is, you don't even need to store everything. If you think about GPS records, oftentimes what you want is to extract the insights, and you don't necessarily need to store the longitude and latitude. What you want is, yeah, I kind of get the places that you visit.
Maybe I can map it against Google and see what happened in these places when you were there. So a lot of the companies that extract insights that can then be used to tap into your psychology don't require the storage of the raw data. But then there's also other companies who have these massive servers.
Still, I think even with that amount of data, storage is so cheap that it pays off at the end of the day. And you might be deleting it at some point, but the longer you can keep it, the more of these behavioral trajectories you can actually generate and create about people.
[00:19:15] Jordan Harbinger: Yeah, I suppose you don't really need every shred of data, right?
If you have, say, someone's path that they walk every day to go get breakfast, and then they go to the gym and they come home, you can just say, it's basically this times a thousand. They don't have to get every time you cross the street
[00:19:28] Sandra Matz: And you can even store when there was a deviation, right? So if you know here's the typical pattern, and now there's something that seems off, you can trigger more data collection.
So there's also ways in which you can say, we see that it's a repeat pattern, and we're gonna just kick off the data collection at the point that we see that it deviates.
[00:19:45] Jordan Harbinger: So yeah, I would say it's this times seven, but on the eighth day she went across the street real quick and came back. But then it was this again, 10 times in a row.
I mean, that's not that much data, right? It's not every step, every longitude and latitude repeated over and over again. And you're right, storage is super cheap. If you've got some sort of data farm or whatever in Iceland that's buried underneath the snow for cooling purposes, that just gets cheaper by the day.
So it doesn't really matter that much. It's just a crazy amount of data. Given that it's one terabyte per week times hundreds of millions of people using these things, the amount is just bananas.
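Jordan's "this times seven, then a deviation, then this again ten times" is essentially run-length encoding. A toy sketch of that deviation-aware storage idea, with invented day labels:

```python
def compress_days(days: list[str]) -> list[tuple[str, int]]:
    """Run-length encode a day-by-day trace: long stretches of the usual
    routine collapse into one entry, so only deviations keep any detail."""
    runs: list[tuple[str, int]] = []
    for day in days:
        if runs and runs[-1][0] == day:
            runs[-1] = (day, runs[-1][1] + 1)
        else:
            runs.append((day, 1))
    return runs

# "R" = matched the usual route, "D" = deviated (only those days would
# trigger storing a full GPS trace).
trace = ["R"] * 7 + ["D"] + ["R"] * 10
print(compress_days(trace))  # [('R', 7), ('D', 1), ('R', 10)]
```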
[00:20:20] Sandra Matz: I think by now there's this estimate that there are more data points in the universe than stars. To me, it's just insane.
[00:20:27] Jordan Harbinger: It's quite impressive.
[00:20:28] Sandra Matz: How did they count that? That's already intriguing, right? Yeah.
[00:20:31] Jordan Harbinger: I suppose it's all just math. It's way above my pay grade. Facebook knows us better than our closest friends and families. You wrote this in the book, and that is not terribly surprising. A little scary because I'd like to think my family knows me pretty well, but Facebook does know me better.
They'll show me some clothes where I'm like, oh, I have to buy that, even though I know objectively that I'm gonna be disappointed as soon as it arrives and I find out that it was made by small children in Bangladesh or whatever.
[00:20:57] Sandra Matz: Yeah, that sounds like me. My husband keeps making fun of me for that. Total impulse buyer.
Yeah. But if you think about it, it's Facebook, but also Google. You type questions into Google that you don't feel comfortable asking your closest friends, or sometimes even your spouse. It's not so surprising to me that all of the digital traces that we create can paint this picture of who we are in a more accurate way than the people around us.
[00:21:19] Jordan Harbinger: I was doing a show yesterday with my producer, we are doing some work, and I was like, isn't there a different word for pedophiles who are attracted to people in different stages of puberty? And he is like, one, how do you know that? And I was like, oh, it was a bit from a comedian. Let me just look it up.
And he goes, please tell me you didn't just Google different types of pedophiles. And I was like, oh yeah, shoot. I did. I did.
[00:21:39] Sandra Matz: Yeah. I still remember when I was doing my PhD, we had this one guy who was doing research on porn websites. Oh man. And I remember his seminar talk, where he went to open a website, and what pulls up is all porn websites.
So yeah, you gotta be careful what you type into that search bar. And now generative AI, right? People ask these large language models the most obscure, absurd questions that are super intimate.
[00:22:03] Jordan Harbinger: There's stuff that I've asked ChatGPT where I feel the need to tell it, a friend of mine has asked the following question,
[00:22:11] Sandra Matz: asking for a friend.
Of a friend. Yeah,
[00:22:12] Jordan Harbinger: exactly. Because in 20 years, when it's like, here's what you searched this day in 2025, I'm like, I do not want it to pop up. I just don't want that to show up anywhere. And if it does, I want it to be like, you asked on behalf of a friend. And I'm like, see, it wasn't me. Sure. Sure, pal.
[00:22:28] Sandra Matz: ChatGPT. It's amazing, 'cause people don't even ask questions, it's just statements. I think there's some research that people just say random stuff to ChatGPT, 'cause they wanna get it out of their system. Right. They just need to tell it to someone. And they don't wanna tell it to the people who they think might be judging them.
[00:22:44] Jordan Harbinger: We have an AI chatbot on our website where people can use it to search for things that are inside episodes, and we get an occasional report of what people have searched for, just so we can make it more useful. And it's not my team that's behind it, it's an AI company, and they'll say, hey, FYI, this was a pretty funny month for searches.
Here's our top five favorites. And some of the stuff that people are searching: they're trying to get the AI version of me to tell them how to commit a crime, because they think maybe Jordan knows how to get away with this. It's interesting, because obviously I'm not liable for that, because it's not really me telling them how to do something.
But it's a little scary that somebody can get inside my brain, to something I would never tell them. And if my AI version is like, sure, I'll tell you exactly how I would hide a dead body, it's like, why are you letting my AI brain tell them this?
[00:23:32] Sandra Matz: To me, it's fascinating. 'Cause if you don't feel comfortable asking another human being, it's one person who has to keep a secret.
But you are asking a server, you are asking essentially OpenAI. Now your question sits there for all eternity on a server. It might be passed around. And I think that's something that people don't realize. Somehow this intimacy of the screen makes it feel like there's just not a person on the other side. If anything, it's probably more intimate and more dangerous to ask your question there.
[00:23:59] Jordan Harbinger: Yeah. It's not a person on the other side. It's theoretically an infinite number of people that can look at this at any time, with no context and zero ability to defend yourself in the moment, because you've been dead for 20 years. Yeah. If you're lucky. How do companies like Meta and other social media companies get to know our preferences?
Is it just me telling them that I like something when somebody posts something? Because I don't always like the things that I like. Does that make sense?
[00:24:23] Sandra Matz: Makes a lot of sense. You're probably just being nice to people, or cynical.
[00:24:28] Jordan Harbinger: Great vacation. I couldn't care less, but you're gonna feel bad if I don't click like on this.
And we're friends. I'm supporting you. But no, you look like an ass.
[00:24:35] Sandra Matz: Now you told them on the podcast,
[00:24:38] Jordan Harbinger: This is 87 selfies of your vacation. These are completely uninteresting. You're clearly a narcissist. Here's my like, voila. Yeah,
[00:24:45] Sandra Matz: No, but you're absolutely right. And it's coming back to this distinction between identity claims, right?
So you liking something, you posting something about your vacation, you following a certain page that you want other people to see that you follow: those are all these explicit identity claims. But then there's all of this other stuff that they capture, all the way from how much time you scrolled through a specific ad that they're showing you, or a specific piece of content, to some of the more subtle nuances in the way that you use language.
So coming back to this topic of depression, for example, it's not just you talking about symptoms and feeling down and maybe having these physical symptoms. Even the use of first-person pronouns is a sign of depression. And that's not something that you put out there intentionally, right? It's hidden in some of the cues that you generate, either by you posting or by you just browsing the website. Now, Facebook goes a step further, 'cause they also, first of all, buy third-party data. So they buy extra data to know you even better. And they even have data on people who are not using Facebook, to contrast and see how they could potentially bring them in.
So Facebook really goes far beyond you liking or not liking the vacations of your friends.
[00:25:54] Jordan Harbinger: What was that about first-person pronouns indicating depression? What does that mean? Because I'm not sure I understood what you just said.
[00:26:01] Sandra Matz: Yeah, it's actually one of my favorite examples in that space. It's essentially the use of first-person pronouns, which is I, me, myself. What we know is that it's empirically related to depression.
So, emotional distress. And I remember when I first heard about this, I was like, I don't understand why this makes sense. I would've assumed it's narcissism, as you mentioned, right? If you talk about yourself and your vacation and what you've been up to, it's probably self-focused, and maybe you're a narcissist. But what we know is that it's a signal that you're currently very focused on:
why am I feeling so bad? How am I gonna get better? Am I ever going to get better? And because we have this inner monologue with ourselves and we can't constantly control it, that just creeps into the language. So people who are suffering from any type of emotional distress, they're just much more focused on the self.
And that leaks into the language. And again, in your posts about vacation and everything that's going on, you don't explicitly, intentionally use first-person pronouns more when you're not feeling great. It's just something that leaks to the other side and leaks into your language.
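The signal Sandra is describing is a simple count, the kind of LIWC-style word-category feature this research uses. A minimal sketch with a deliberately crude tokenizer and invented example sentences:

```python
import re

FIRST_PERSON = {"i", "me", "my", "mine", "myself"}

def first_person_rate(text: str) -> float:
    """Share of tokens that are first-person singular pronouns."""
    tokens = re.findall(r"[a-z']+", text.lower())
    if not tokens:
        return 0.0
    return sum(t in FIRST_PERSON for t in tokens) / len(tokens)

print(first_person_rate("I keep wondering why I feel like this and whether I will get better."))  # noticeably higher
print(first_person_rate("The weather was great and the beach was packed all weekend."))           # zero
```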
[00:27:01] Jordan Harbinger: That's interesting. So in theory, that happens when we're talking to people in real life also, or is this mostly online communication?
[00:27:08] Sandra Matz: No, it's also talking in real life. It's a pretty substantial effect. I think 40 times more than when you're not feeling depressed or emotionally distressed.
[00:27:16] Jordan Harbinger: 40 times more.
[00:27:17] Sandra Matz: It's 40%. It's not 40 times.
[00:27:19] Jordan Harbinger: That's interesting. That's unmistakable. Yeah, it's like pretty substantial. Wow. So in theory, even a smart TV or my phone, which is listening, even if I don't want it to be, or my Amazon Alexa thing, that could tell me if I'm depressed just by hearing what I'm talking about in the house or overhearing a phone conversation.
[00:27:36] Sandra Matz: Yeah. It's just like this passive listening into, not just what you're saying, but how you're saying it.
[00:27:42] Jordan Harbinger: You mentioned in the book that within 300 likes of me liking things on photos or whatever, the platform knows me better than my spouse. With 65 likes, it knows me better than my friends. That doesn't seem like that much.
[00:27:55] Sandra Matz: It's so little, right? So I remember when my colleagues published this study, I think the average, and this is 10 years ago now, the average number of likes was 230. So back in the day, the computer was already better than everybody except for the spouse. And you can very easily project into the here and now where you have a lot more data, you have a lot more sophisticated models.
So by now the computer is probably better than the spouse. And again, it sounds so intimate, but then if you think about the fact that a computer has access to the entirety of your digital life and some of the aspects that you're potentially trying to hide from other people, or you don't necessarily intend to signal, it's not as surprising.
[00:28:32] Jordan Harbinger: Yeah, that's not super surprising, I guess, if you think about it this way, and I'm sure this isn't a one-to-one kind of comparison. If I had to remember 300 things that my spouse likes at the same time, I don't know if I could do that. That's a lot of different things. How do they measure that? Is it like the Newlywed Game, where you get asked questions and the computer just gets it right more than the spouse?
[00:28:53] Sandra Matz: So in this case, you actually complete a questionnaire. You tell us, here's how I think of myself when it comes to personality, and it's all asking you about behavior. So, how often do you enjoy socializing? To what extent are you making a mess of your environment? And then the spouse completes the same questionnaire on your behalf.
So I think Jordan would answer strongly agree to the question, I make a mess of things. Not sure, hypothetically.
[00:29:19] Jordan Harbinger: Wow. Yeah, that's interesting. And the computer gets it right more than the spouse. Again, though, I think trying to remember 300 different things about your spouse at one time, it's a lot. It's a superhuman feat of memory alone, let alone knowing someone that well.
[00:29:32] Sandra Matz: You also have a certain bias. There's like certain ways in which you want to see your spouse. So once you have a certain way of seeing them, the way that you integrate new information is just almost aligned with the perception that you have anyway. So it's much harder for humans to update just because it's in a way functional to stick with the impressions that we have.
[00:29:51] Jordan Harbinger: Now it's time for us to hopefully monetize you. We'll be right back. This episode is sponsored in part by the Moonshot Podcast. If you've ever wondered how big world changing ideas go from crazy concept to real life innovation, you've gotta check out The Moonshot Podcast. This 10 episode limited series takes you inside Alphabet's Moonshot Factory, AKA X, where they dream up and build things like self-driving cars, AI that codes itself, wildfire prediction tech, even laser powered internet.
I'm not sure how that works, but I wanna find out, get an inside look at the people, the challenges and the wild process of turning impossible ideas into reality. You're basically sitting down with the minds behind these projects to answer big questions like how we can make clean energy work, and if we can get tech to actually help save this planet.
What I like about the show is it's not just about success, it's also about failure, risk, and relentless optimism. So if you're into bold ideas, breakthrough tech, and the untold stories behind innovation, the Moonshot Podcast is your kind of show. It premiered March 10th, with new episodes dropping every week.
Find it wherever you get your podcasts. This episode is sponsored in part by Airbnb. As Anthony Bourdain once said, travel isn't always pretty. It's not always comfortable, but that's okay. The journey changes you. It should change you. Seeing the world opens your mind, introduces you to new perspectives, and gives you experiences that no classroom or office ever could.
And with remote work being more common, there's never been a better time to take advantage of that freedom. Immerse yourself in new cultures, make lifelong memories, keep a home base to return to. Airbnb can make that a reality. Hosting on Airbnb is an easy way to earn extra income without taking on a second job.
And now with Airbnb's co-host network, it's even simpler. Got a spare room or guest house that sits empty while you're away? Instead of letting it just collect dust, let it help fund your next adventure. A co-host can take care of everything: they list your space, they manage reservations, they communicate with guests.
They keep it all in tip top shape, and that unused space could cover your flights to Asia. Book you a cool hotel in Spain, even help with everyday expenses, giving you the freedom to explore more. Whether you're saving for your dream getaway or you're just looking for financial flexibility, Airbnb makes it easy to turn your space into extra income.
Find a co-host at airbnb.com/host. If you're wondering how I managed to book all these amazing thinkers, authors, creators every week, it's because of my network, the circle of people I know, like, and trust. And I'm teaching you how to build a network for yourself for free, whether for personal or professional reasons, whether you're retired or just entering the job market.
I have a course over at sixminutenetworking.com. The course requires no payment. I don't want your money. I don't even need to know who you are. You can be totally anonymous. It's all super easy, down-to-earth, non-cringey. It's about your relationship-building skills. In just a few minutes a day, you can binge this course and practice a few things from it.
It will change the way that you relate to others, and that's the whole idea. And many of the guests on the show subscribe and contribute to the course, so come on and join us. You'll be in smart company where you belong. You can find the course, again for free, at sixminutenetworking.com. Alright, now back to Sandra Matz.
How do computers test their assumptions about what they know about us? I know I'm anthropomorphizing computers a little bit, but whatever. How do they test and see if they're right? Because they have to do that somehow, right?
[00:32:49] Sandra Matz: Uh-huh, that's actually how they learn, right? So machine learning is called that because the models learn by trial and error.
So the way that we train a model, for example, to predict your personality from, say, Facebook likes: we give it a lot of data where people completed a questionnaire, giving us answers of, here's how I think about myself in terms of personality. And then it has access to all of the likes, and it just plays a trial-and-error game.
So maybe if you like the fan page of Lady Gaga, maybe that makes you more extroverted. Did I get it right or wrong? Got it. Okay, I'm gonna update my belief of what Lady Gaga actually means. Same for the fan page of CNN. Maybe that makes you more conscientious and organized and reliable. So essentially you just throw a lot of data at them.
In the beginning they're just randomly guessing, and over time they become a lot better, 'cause you give them feedback. You tell them, yep, that was a good guess; no, this was a terrible guess.
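Sandra's guess-check-update loop is, concretely, online learning of a simple model. Here is a minimal sketch in that spirit; the pages, trait scores, and learning rate are all invented, and a real system would use far richer features and models:

```python
# Trial-and-error learning: guess a trait score from likes, compare with the
# questionnaire answer, nudge each liked page's weight to shrink the error.
PAGES = ["lady_gaga", "cnn", "chess_club"]

def predict(weights: dict[str, float], likes: list[str]) -> float:
    return sum(weights[page] for page in likes)

def train(examples, epochs: int = 50, lr: float = 0.1) -> dict[str, float]:
    weights = {page: 0.0 for page in PAGES}   # start out guessing blindly
    for _ in range(epochs):
        for likes, extraversion in examples:
            error = extraversion - predict(weights, likes)
            for page in likes:                # feedback updates what each page "means"
                weights[page] += lr * error
    return weights

# Invented training data: (liked pages, self-reported extraversion on -1..1).
examples = [
    (["lady_gaga"], 0.9),
    (["lady_gaga", "cnn"], 0.7),
    (["chess_club"], -0.5),
    (["cnn"], 0.1),
]
weights = train(examples)
print(weights)  # lady_gaga ends up strongly positive, chess_club negative
print(predict(weights, ["lady_gaga", "chess_club"]))  # score for a new person
```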
[00:33:38] Jordan Harbinger: I see. So it's just tons of trial and error. You have a really good analogy in the book about chick sexing. Don't worry, it's still safe for work, folks. This is not the chick sexing that I tried in vain to accomplish in my twenties. This is the kind of chick sexing that happens on a farm. Yeah. Tell us about this, 'cause this is a very good metaphor.
[00:33:55] Sandra Matz: I love it as an example, just to explain machine learning. So there's a profession
that is essentially called chick sexer. That's their name, which is amazing, right? I imagine you going to the conference, and they ask for your title, and you just say, I'm a chick sexer. I mean, that's a life goal for your bucket list. But anyway, the point is that in hatcheries, you very quickly wanna determine whether a chick is male or female.
'Cause, for all the vegetarians out there, you're onto something: the males get shredded pretty much right away, 'cause they don't produce eggs. So they mostly keep the females, and it's really difficult to tell whether a chick is male or female, 'cause they're tiny, right? It's a tiny baby chick.
So what chick sexers actually do is essentially learn over time by having someone supervise their actions. They just pick up a chick, they look at the little vent and say, oh, I think this one is a female. And then the supervisor says, correct. They put it in the one basket and they move on to the next. So they go through this trial-and-error game many, many times.
And it's not that it comes with an instruction manual. It's not that they sit down for a two-week training course to see how to distinguish males and females. They just start with a 50 percent baseline, where they might get it right or they might get it wrong. And then over time they develop this intuition, and they see these patterns that sometimes they might not even be able to explain.
And I think of like machine learning the same way. Instead of looking at whether a chick is male or female, you might try to predict personality, you might try to predict gender, sexual orientation, political ideology. And the input is essentially people's digital footprints. And then over time they just learn how to distinguish.
Now, the interesting part is actually that this is how it used to be. So we used to train these models specifically by supervising them. Large language models were never trained explicitly to make some of these predictions. They were just trained to predict the next word in a sentence, using the entire internet to do that.
And they can still make similar predictions. So if I now give ChatGPT access to, let's say, your social media posts or your credit card spending, and I ask, what do you think is the personality profile of the person who generated this data? It's almost as accurate as these supervised models that we trained specifically for the task.
So that's a totally different game, 'cause now anyone can use it. You don't even need that data. You don't need the training process.
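The zero-shot version Sandra describes needs no training loop at all, just a prompt. A sketch of its shape only: call_llm is a deliberate placeholder, not a real API, so you would wire in whatever client you actually use.

```python
def call_llm(prompt: str) -> str:
    # Placeholder: swap in your LLM client of choice (hosted API or local model).
    raise NotImplementedError("wire up an LLM client here")

def profile_from_posts(posts: list[str]) -> str:
    """Ask a general-purpose model to infer a personality profile, zero-shot."""
    prompt = (
        "Based only on the social media posts below, estimate the author's "
        "Big Five personality profile and briefly justify each score.\n\n"
        + "\n".join(f"- {post}" for post in posts)
    )
    return call_llm(prompt)
```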
[00:36:11] Jordan Harbinger: That's really interesting. And for someone like me with a thousand or 2000 hours of audio content out there, yeah, that's just a bonus for these companies. They'll eventually be, I mean, I'm sure they already are ingesting all of that.
There's a company right now that's making an AI clone of me, whatever that means, and they're using all of the data from the show. They're only gonna use about a thousand hours, but apparently that's more than enough. I'm curious to see it, 'cause it will supposedly talk like me, have the same reactions as me.
It's basically as close as you can get to some sort of printout of my brain. And it's funny, 'cause it seems like such a waste to do it for me, because the other people who have this are like Nobel Prize winners, and I think I'm just low-hanging fruit because I have so much data out there.
[00:36:52] Sandra Matz: Interesting.
And I was gonna get there, 'cause I think to me that's the next level, right? So far we are talking about, we can make these predictions about your personality. But personality is helpful if you meet a stranger on the street and you know nothing about them. Knowing whether they're extroverted or introverted helps you understand how they might be thinking and how they might behave.
But the people that you're really close with, like your spouse, for example, you don't necessarily think of them as, she's the extroverted, open-minded one. You know them a lot more intimately. And I think that's what we're getting to with these digital doppelgangers. And then you can imagine, once I have a second Jordan,
I can ask, well, how do I best persuade you? Would you buy this? Do you like this? Maybe not. So how do I get you to buy into this vision that I have, or buy this product that I want to sell you? So I think it's just becoming more and more intimate for us. I was talking to this New York Times columnist, and she wrote this article about how she outsourced her decisions for a week to an AI digital twin version of herself.
And she's like, it was pretty good. It got it right maybe 90% of the time, but it just turned me into this basic bitch, where it was always the same. And that I can totally see: what it's trained to do is predict the most likely thing that Jordan is gonna say next. But what makes you unique is, yeah, you don't always say the most likely thing.
You still have this unique element of, depending on what's on the other side, you come up with something new. So I think that's the world that I am worried about. I don't wanna be boring as a digital doppelganger.
[00:38:24] Jordan Harbinger: I found it really interesting that the algorithms can tell gay men by their faces 81% of the time.
I feel like a lot of humans are good at detecting this too. Isn't it called gaydar? Isn't that the whole idea?
[00:38:36] Sandra Matz: So this is like research by one of my colleagues, and I remember that he sat me down and he is like, well, computer can tell this like accuracy look of 80%. I was like, I can do it too. And then I tried and I was barely better than chance.
So I think we oftentimes think that we know, and maybe I'm just particularly bad, but at least in their research they show that it's not just me. Most people are actually much worse than they think.
[00:38:57] Jordan Harbinger: That actually makes perfect sense. I think the reason people think they're good at it is because when the really obvious examples come through, they're like, oh, I got that one.
And it's like, the guy with the midriff shirt, walking a poodle, in a sailor uniform. Okay, yeah, that guy's gay. Good job, Columbo. But the quote-unquote normal-looking person, when you find out they're gay, you're like, oh. You just didn't know that about Tom, for example. It doesn't strike you that you would've gotten that wrong.
You just weren't thinking about it at all. So yeah, it makes sense that the algorithm would be better, but 81% of the time is really incredible.
[00:39:29] Sandra Matz: Yeah, and the scary part is, I think we rely a lot on grooming signals, right? So the poodle, or like the hairstyle, the shirt that you mentioned, this is all stuff that you can change.
I think the core of the research, and I should say that this is quite controversial, so there are a lot of people questioning it, is the claim that you can tell based on facial features alone, both for sexual orientation but also personality. That would really take the creepiness to the next level,
'cause you can leave your phone at home; you can't leave your face at home. So for me, this is really one of these "if this is true" findings, and there are potential reasons for it to be true. We know, for example, that hormones influence this. Take testosterone: it kind of influences what your face looks like,
because it makes you more masculine, but it also influences how aggressive you are. So the fact that there are hormones that shape both is not completely delusional, I would say. And for me, that's really extremely creepy.
[00:40:27] Jordan Harbinger: It is. Although we're tiptoeing on the line of something that's gonna get me canceled, but whatever.
There are plenty of aggressive, manly dudes who are also gay, so it's not just like, oh, well, look at this guy's cheekbones. That's the whole "humans can do this too" thing. Okay, the guy who looks like something out of a fashion show runway, yeah, maybe that guy has a higher chance in your mind of being gay.
You don't really know. But the boxer whose photo is in there, who has a big old beard? It probably wouldn't be my first guess, and I probably wouldn't ask, because I don't wanna get punched in the face.
[00:41:00] Sandra Matz: Actually, beard. I think beard was one of the higher-likelihood signals, which I didn't pick up on.
[00:41:05] Jordan Harbinger: I wouldn't see that coming now, especially, uh, the trend is, oh, you gotta have a bushy special forces beard and a trucker hat, and it's like, this is the pinnacle of manliness along with your tattoo sleeve. And now it's like, well, according to our algorithm, there's a higher likelihood that you actually like men.
[00:41:19] Sandra Matz: For me, an interesting part of this entire prediction space is that there are certain signals that are somewhat universal and pretty stable over time. One of my favorite examples: if your phone is constantly running out of battery, that signals you're less organized, and that's probably gonna be true for the next 20 years for sure.
[00:41:36] Jordan Harbinger: So, okay, I love that, because when I see somebody whose battery is at 30 or 40 percent, especially if it's before lunch, I'm just thinking, I don't know if I can work with you. Clearly your life is a mess. How did you wake up with a phone that's not charged? What is wrong with you?
[00:41:53] Sandra Matz: That's what my husband tells me every single day.
I think I'm without battery more often than not, 'cause he clearly is much more organized than me. But yeah, so that's a signal that's not gonna be different tomorrow. But then there are all these other signals that shift with culture. Maybe something was niche: Game of Thrones used to be like this.
It's fantasy, so maybe at first it was just these very nerdy, open-minded people, and then suddenly everybody was watching it. So that's an interesting part for prediction.
[00:42:21] Jordan Harbinger: That's true. When people told me it was a show about dragons set in ancient times, I was like, don't even finish your sentence.
I'm never watching this. And then enough people were like, you have to watch it. You have to watch it, you have to watch it. And I started watching it and I was like, oh, this is really good. And then I remember telling other people, I was like, do you watch Game of Thrones? And they were like, I just can't with the dragons and the stuff.
And I was like, no, no, no, I get it. I know exactly what you mean, but I'm telling you that it's really compelling. It's so fascinating how the window shifts. The battery thing makes me feel a lot better, though, because I was kind of like, am I psycho-level unreasonable because I judge people based on their battery status?
At least it's not just me,
[00:42:55] Sandra Matz: maybe. Yes.
[00:42:56] Jordan Harbinger: Geez. One great example of people prioritizing, I guess you'd say, online clout over real life is the number of people who die taking selfies.
[00:43:06] Sandra Matz: I can't remember the exact numbers, but it's insane. It's like more than from shark attacks and other stuff. So again, it comes back to this question of what are we doing this for?
And I think there's this fundamental need of humans to just talk about themselves. This is like why we see so many people posting on social media all the time. You mentioned that it's so annoying if your friend posts like a hundred pictures that you don't wanna see, but there's something that's inherently rewarding.
There's research that I think is fascinating that shows that talking about yourself activates the same areas of the brain as having sex or taking drugs. It essentially gives you this dopamine boost. If that's the case, it's not so surprising that a lot of people are like, I'm just gonna use those five minutes that I have and post something on social media.
So people are willing to give up money in experiments to just be able to talk about themselves.
[00:43:54] Jordan Harbinger: Good lord, that really sheds a lot of light on why I started a podcast. I had no outlet for anything anywhere else. It's just funny. It makes me think, oh, okay, maybe if I'd done more drugs, this show wouldn't exist.
Certainly if I'd been able to have more sex, this show wouldn't exist. I'd still be a lawyer, so thank my bad luck for that. There's this interesting phenomenon where, say, you're out with friends and you order some food. I don't know if it's a conscious rule, but I basically give people 10 seconds to take photos before I start eating.
'cause if I start eating right away, I ruin it. But I'm also not going to wait five minutes while they get the right angle and the right lighting and they rearranged the food on the table. That's like the end of my tolerance for this. And I won't travel with people who take more than a couple of selfies at each place.
I get, you want one or two, you went to a castle. It is impressive. It's really cool. But if you're trying different poses and different angles, I'm just leaving you behind. You can take an Uber.
[00:44:50] Sandra Matz: The food thing I've never understood. You're never gonna go back to these pictures. You post them on social.
You're never gonna go back and say, oh, I wish I could find this picture of the pizza that I had on 72nd Street. And we even know that the moment you take pictures, you actually reduce the likelihood that you're gonna remember that moment, because you're outsourcing your memory to your phone. It's like, okay, this is on my phone.
I took a picture, so I don't need to remember it. So there's even something that's getting taken away by us taking all these pictures all the time.
[00:45:19] Jordan Harbinger: That's like how you get worse at math if you only use a calculator and never try to add, subtract, or divide on your own. Yeah. Oh, that's interesting. I didn't realize that you would remember something less because you took a photo.
It's almost counterintuitive. I get the logic behind it. Your brain says, I don't need to remember this. I have a photo. But you would think that focusing on it for an extra few seconds, trying to frame it in your phone camera, looking at it longer, that would make you remember it more. But actually it's the opposite.
[00:45:43] Sandra Matz: I think that's old school. You're still coming from the generation where you had like 24 pictures on a roll, and you're like, okay, that's something that's worth photographing. Whereas now it's click, click, click, click, click. No single photo is something worthwhile anymore.
[00:45:56] Jordan Harbinger: It used to cost like a dollar or two per photo by the time you bought the film, took it to Walgreens to have it developed, and waited however long you needed, you know, a week or three days.
It was expensive. You're right. Now I've got a Sony over here that I got to film my kids and film events and stuff like that, and when I put the memory card in, it showed at the top something like, you can film 16 hours of 4K video or 9,999 photos. And I was like, oh, it just stops counting, because there's probably room for 30,000 photos on here.
[00:46:30] Sandra Matz: back in the day.
You're like, man, this is like maybe one of these moments where I should take a second one just in case. Now you have 10 by default,
[00:46:37] Jordan Harbinger: the new cameras that are out there for sports photography. I think when you hold the button, first of all, it sounds like some kind of machine gun from Terminator, but it'll take, I wanna say a hundred photos in a second or something like that at maximum speed, which is great if you're trying to catch a jumper at the peak of their jump at the Olympics, and you wanna get the perfect moment of them going over a bar or something like that.
That's when it makes sense. But when you're taking a picture of your kids' deuce in the potty at home,
[00:47:06] Sandra Matz: it's like before they're smashing their head into the wall. Yes, exactly. That's what you capture.
[00:47:11] Jordan Harbinger: Yes. I want the wave in the skin of the forehead when it smashes into the drywall.
[00:47:16] Sandra Matz: Yeah.
[00:47:16] Jordan Harbinger: Yeah.
[00:47:17] Sandra Matz: Just as a memory, the good old days.
[00:47:19] Jordan Harbinger: Exactly. Oh my gosh. Tell me about Facebook status updates and word clouds. I miss Facebook status updates. I mean, maybe they still exist, but it used to be a thing. And, I know I sound old AF when I say this: AOL Instant Messenger. Do you remember that? Did you use that?
[00:47:32] Sandra Matz: Yeah, I do. ICQ and yeah,
[00:47:34] Jordan Harbinger: Right. You had your away message, and you're like, okay, I gotta think of something creative and fun.
On off days, you'd just pick a quote from an author you like or something, but on days where you thought of something really funny, you'd put that in there, and everyone was checking everyone else's away message all the time to see if there was a new one. And if you could do that day in and day out and make it fun, people were like, this guy's smart.
And Facebook, the original Facebook status updates where you just typed in the box what you were doing, that was almost like a status game for good writers in college.
[00:48:04] Sandra Matz: And it was also informative. I think right now we're just posting anything, like pictures of food, so it's lost its appeal. And it's not just Facebook updates, right?
You can think of Facebook status updates the same as posting on Twitter; even the Instagram pictures that we take tell the same story in a way, right? You can write about going on vacation, or you can post a picture of you on vacation. There's a lot that we can learn. We already talked about emotional distress, depression, all of the personality traits, and some of them are really obvious.
Oftentimes when people talk about machine learning and AI, it's like magic in a black box and we don't really know what it's doing. But if you talk a lot about going out, parties, and weekends, you're probably more extroverted than the person who talks about sitting at home, reading, gardening, and fantasy novels.
So those are the obvious ones. Sometimes there are ones that are a little less obvious and maybe more interesting for psychologists. For example, income. This is one of the topics that I study: can we predict someone's income, someone's socioeconomic standing, based on what they post? And again, you see the obvious signals, like rich people posting about luxury vacations and brands.
Yeah, that makes sense. But you also see that people who have lower socioeconomic status or lower levels of income are, first of all, much more focused on the present, and they're also much more focused on the self. And it's not that they're narcissists who can only focus on the here and now.
It's just freaking damn hard to think about the future in anything other than how to make ends meet if you don't have that much money. So there are these subtle cues we can parse out of what people talk about that are actually interesting beyond just prediction.
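For the technically curious: the kind of "obvious" signal Sandra describes can be picked up by even the simplest text models. Here is a minimal, purely illustrative sketch in Python; the toy posts, labels, and test sentence are invented for the example, and this is not Matz's actual research pipeline.

    # Illustrative sketch only -- not the actual research pipeline.
    # Idea: simple word-usage features carry the "obvious" signal described
    # above (party/going-out words ~ extroverted, stay-at-home words ~ introverted).
    from sklearn.feature_extraction.text import CountVectorizer
    from sklearn.linear_model import LogisticRegression
    from sklearn.pipeline import make_pipeline

    # Hypothetical toy data: status updates labeled extroverted (1) / introverted (0).
    posts = [
        "great party this weekend, going out with friends again tonight",
        "quiet night in with a fantasy novel and some gardening",
        "concert then drinks, who's coming out?",
        "reading at home, finally finished my book",
    ]
    labels = [1, 0, 1, 0]

    # Bag-of-words counts feeding a logistic regression classifier.
    model = make_pipeline(CountVectorizer(), LogisticRegression())
    model.fit(posts, labels)

    # Words like "going", "out", and "weekend" overlap with the extroverted
    # examples, so this should lean toward 1; with four posts, it's a demo only.
    print(model.predict(["going out with everyone this weekend"]))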
[00:49:43] Jordan Harbinger: How the rich and poor talk online is actually quite fascinating.
The idea that people who have lower socioeconomic status, or people who are really having trouble making ends meet... can we just say poor? If you can't make ends meet, you're not doing so well.
[00:49:54] Sandra Matz: I mean, I actually feel like we can. Labels matter, and I know why people don't like them, but most of the time, I think you don't like them because they make you feel uncomfortable.
And no, you should feel uncomfortable, 'cause there are people who are poor, and it's just a freaking hard life to live.
[00:50:10] Jordan Harbinger: It's tough, and I never thought about that, 'cause of course, if I saw somebody who only talked about themselves and things they were doing that day, it would seem to me that they were not thinking long-term because of some character defect, or they're not smart enough, or something like that.
But now, of course, it makes total sense: if you can't think far enough in advance because you're literally just trying to feed your kids, or you don't have gas to get to work, and you're that poor, it's not necessarily a character defect or you having screwed up your life. What about rich people? Is it really that obvious that they just talk about luxury brands and vacations, or are there some more subtle cues that out people as high socioeconomic status?
Because I can't name one single time where I've been like, just getting back from my business class flight to Turkey and staying at a five-star hotel. Here's my dinner. I just don't do that.
[00:50:59] Sandra Matz: Yeah, some of them are more subtle, right? It's oftentimes the opposite: if poor people talk about the present, you might be more future-focused. It's always a contrast; that's the way these models work.
Even the fact that you talk about going to the Seychelles or some exotic place is a signal. You don't have to brag about the five-star hotel on your next vacation. The fact that you can afford to fly outside of the country, which most people haven't done in a lifetime, that alone is an indication that you're doing pretty well.
[00:51:28] Jordan Harbinger: That is a good point. I hadn't even thought about that. There's a stat that something like less than half of Americans have a passport. You don't have to say, I'm going to a five-star hotel in the Seychelles. You just have to say, oh, immigration is so slow in, I don't know, India.
Okay, well, you went to India, even though you're complaining about something,
[00:51:47] Sandra Matz: and if you're complaining about that, it probably just gives you an extra boost in predicted socioeconomic status.
[00:51:52] Jordan Harbinger: Good point. Yeah. I suppose if you're just excited that it's the first time you've ever left the country, you're not complaining about immigration; you're like, I can't wait to eat after I get outta this six-hour line.
You're just excited to be there. I thought it was quite fascinating how these algorithms can tell if you're extroverted or introverted. You mentioned it's based on likes: if you like fantasy novels, or if you like going to music festivals, that makes sense. There was a hypothesis in the book that attractive people become more extroverted and outgoing because of the attention they receive as kids. That makes a lot of sense. Or is that just pure speculation?
[00:52:28] Sandra Matz: That's a real research finding, and it comes back to what we talked about earlier, with the face potentially signaling parts of your identity on a psychological level. So we talked about testosterone being related to aggression.
There's this idea that your environment responds to you in a certain way, right? If you are this beautiful kid, perfectly symmetric face, blue eyes, constantly smiling, people around you are probably gonna be a lot more appreciative, and they're gonna talk to you and approach you a lot more often.
And the fact that those kids then grow up to be somewhat more social and extroverted, craving social affirmation and social stimulation, is not super surprising. So it's one of these ways in which who we are interacts with our environment, and that in turn influences who we are.
[00:53:13] Jordan Harbinger: This makes so much sense, and it seems like it might be something that you could encourage in kids, regardless of how attractive they are, just by interacting with them a lot, putting them in environments where they're interacting with other people, adults and kids. You're right, there's still that spontaneous element.
My daughter, she's three, she loves to sing and dance, and she'll be like, turn your chair around. The show's gonna start. And I'm just like, where did you learn this crazy, extroverted behavior? But then she's in music class and then the teacher's paying attention to her and the other kids are paying attention to her.
So it is a reinforcing cycle. My son, who looks exactly like me, so he's very cute, has it in almost a negative way, where he's like, whenever I do bad things, people pay attention to me, at school or otherwise. And I'm just like, oh no, this is not the reinforcement
[00:54:02] Sandra Matz: you were hoping for.
Yeah.
[00:54:03] Jordan Harbinger: This is not what we want. We want him to be reinforced. He's good, but he is not shy at all. It's crazy. He talks to the cops when they're here. He just has no fear at all.
[00:54:12] Sandra Matz: The interesting part is also, I think, that we oftentimes think of personality as static: you are either extroverted or you are introverted. But it's actually a lot more dynamic than even personality psychologists assumed a couple of years ago.
It's not just that you can develop over the lifespan, where most of us become nicer and less neurotic, these trends we see as people get older; we also very much fluctuate across situations. So your son, depending on what the feedback is, might be more reserved or more extroverted.
So there's also something in how we interact with our kids. I just had a kid, he's one year old, and I constantly think about how to expose him to these different situations. Sometimes I tell him, look, it's totally okay to be quiet and sit in the corner and just think for yourself for a second.
But then I also want him to have these other situations where he can be a lot more outgoing. So I almost think of it as a repertoire: you have a certain tendency, right? There's a pretty substantial genetic component to personality, but then there's also you being able to adjust to different contexts. And I think that's something we can teach kids, and even tell them: look, if you behave differently across situations, that doesn't make you hypocritical.
That can still be like this authentic version of yourself. It just means that you're adjusting to whoever is on the other side or what the context requires.
[00:55:33] Jordan Harbinger: You mentioned though, facial recognition can take faces from photos or videos taken by other people, or just the CCTV that's present in whatever stadium you're in or on a street corner if you live in China.
So it doesn't really matter if you don't use social media, you're still a part of this surveillance capitalism system or whatever we wanna call it.
[00:55:54] Sandra Matz: Yeah, absolutely. And for me, the really intrusive part is that it's not just the ability to make inferences about who you are. You mentioned China: the reason the Chinese social scoring system is creepy is that it also influences what you can do and what you cannot do.
So it doesn't stop at "I wanna try and understand who you are." It's also "I'm going to influence the path that your life can take," maybe the choices that you're making. In China, if the government predicts, based on your data, that you might have a higher likelihood of voicing dissent or protesting, you are not allowed into Beijing.
I teach a class on the ethics of data, but that's what's happening in China. And what do you think happens here? Companies decide whether you might get a loan or not, whether you might get credit or not, what your insurance premium is. It's very similar: we try to understand how you might behave, and then we shift the offerings that we have.
We might try to sell you something that you don't need. So I think this notion that it's not just about privacy, it's really about the second step of people then interfering with your ability to make your own choices. For me, that part is almost creepier.
[00:56:58] Jordan Harbinger: It is creepy, and there's not much we can do about it,
'cause if an AI is making the decision to give someone a loan or not, and by the way, you know damn well it's gonna be like, this person's battery was at 25 percent at lunchtime when they applied, we're not giving them a loan, they're totally irresponsible. It's not gonna say that's the reason. It's gonna say, oh, based on 20,000 factors that weighed a little bit to the left on whatever line, you're just short of making it.
We're not gonna be able to weigh all of those factors ourselves: the fact that you applied and didn't finish the application all at once, your battery status at the time, your location kept changing, your jobs have changed so much. There are gonna be 10,000 of those. It's not gonna be like, we didn't give you a loan because you're brown.
That would be an obviously not-okay thing. But when it's 10, 20, 30,000 different little factors, and they don't interrogate the AI as to why, they just blindly accept it because it's accurate 99 percent of the time, that's where we start to run into these problems, I would imagine.
[00:57:55] Sandra Matz: Yeah. And they might all be related to some of the protected categories, right?
If we know that some of the behaviors we show are related to having low socioeconomic status, or to ethnicity, or to sexual orientation, the model doesn't need to capture that category explicitly, 'cause it's embedded somewhere in the traces that you leave. To some extent, though, at the global level, when we try to understand what these models are doing and whether they're potentially discriminating against people, I still think there's something we can do to probe.
Oftentimes people say, well, we don't know what the models are doing, 'cause it's these complicated neural nets and we just can't open the black box. But we can still look at the output. If you're deciding whether to give people a loan or not, and you see that none of the women are getting any loans, or none of the women are getting hired into technical roles, maybe that's something the model is picking up on, right?
So even if you don't fully understand what the model is doing, you can always look at the predictions and see: among the categories or social demographics that we want to protect, is there anything that seems off in how often we give the thumbs up that a person gets the loan or gets the job?
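To make the output audit Sandra describes concrete: even with a black-box model, you can compare its decisions across groups you want to protect. A minimal sketch follows; the decision data is invented, and the 80 percent threshold borrows the "four-fifths rule" from US employment-discrimination screening as one illustrative convention, not something she prescribes here.

    # Minimal sketch of a black-box output audit: ignore the model's internals
    # and compare its decisions across groups. All data here is invented.
    from collections import defaultdict

    decisions = [  # (group, approved) pairs from a hypothetical loan model
        ("women", True), ("women", False), ("women", False),
        ("men", True), ("men", True), ("men", False),
    ]

    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += ok  # True counts as 1

    rates = {g: approved[g] / totals[g] for g in totals}
    print(rates)  # here: women ~0.33, men ~0.67

    # Illustrative screen: the "four-fifths rule" used in US hiring audits
    # flags a group whose approval rate is under 80% of the highest group's.
    if min(rates.values()) / max(rates.values()) < 0.8:
        print("Potential disparate impact: interrogate the model.")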
[00:58:58] Jordan Harbinger: You untrustworthy, good-for-nothing deadbeat. We'll be right back. This episode is sponsored in part by Tonal. Getting in shape is not easy, and the older I get, the more I realize how important it is. You hit a certain age, and suddenly it's not about looking jacked. It's about, hey, how do I get outta this chair without grunting like my dad?
I still grunt, though. I think you have to. I think you're mandated to do that after age 40. Anyway, that's why Tonal is great. It's basically like having a full gym and a personal trainer mounted to your wall. It uses digital weights, so it adjusts in real time as you lift, meaning it knows when you're struggling and backs off just enough, or pushes you when it knows you've got more in the tank, which frankly is smarter than most of us when we're training solo.
It's a great idea. Gym memberships and personal trainer fees add up fast. With Tonal, you make the investment once and get way more out of it. Plus, no commute: you finish a workout, you're already home, which, if you've got kids or a busy schedule, is huge. You could put this thing in your office, which would be awesome.
Another thing I love is that Tonal doesn't just throw random workouts at you. There are structured programs designed by legit coaches. It keeps track of your progress, adjusts your weights automatically, and they make it addictive in the best way. It's gamified. You can actually see yourself getting stronger, which keeps you coming back.
[01:00:07] Jen Harbinger: And right now, Tonal is offering our listeners $200 off your Tonal purchase with promo code JORDAN. Go to tonal.com and use promo code JORDAN for $200 off your purchase. That's tonal.com, promo code JORDAN, for $200 off.
[01:00:21] Jordan Harbinger: This episode is also sponsored by Progressive. You chose to hit play on this podcast today.
Smart choice. Progressive loves to help people make smart choices. That's why they offer a tool called AutoQuote Explorer that allows you to compare your Progressive car insurance quote with rates from other companies. So you save time on the research, and you can enjoy savings when you choose the best rate for you.
Give it a try after this episode, of course, at progressive.com. Progressive Casualty Insurance Company and affiliates. Not available in all states or situations. Prices vary based on how you buy. This episode is also sponsored in part by BetterHelp. You know what's wild? We'll spend hundreds of dollars on a new phone and supplements to optimize ourselves.
But when it comes to our mental health, it's like, eh, I'm good, I'm just gonna power through the existential dread. Here's the thing, though: therapy is not just for when everything's falling apart. It's maintenance. It's like changing the oil in the car; you don't wait until the engine explodes.
And therapy has helped me through all the curveballs thrown my way personally. Traditional therapy can be expensive; I get the hesitation. That's why BetterHelp is actually pretty brilliant. It's online, it's flexible, and you pay a flat weekly fee, so no surprise bills that make you need more therapy.
They've got over 30,000 therapists, like a small army of people ready to help you get your head on straight. And if you don't vibe with your therapist, there's no awkward breakup text; you just switch. Easy. No driving across town, no weird waiting room chairs. You just click a button and boom, you're in. Bottom line: your mental health is worth it.
[01:01:39] Jen Harbinger: Visit betterhelp.com/jordan to get 10% off your first month. That's BetterHelp, H-E-L-P, dot com slash Jordan.
[01:01:46] Jordan Harbinger: If you like this episode of the show, I invite you to do what other smart and considerate listeners do, which is take a moment and support our amazing sponsors who make this show possible. All of the deals, discount codes, and ways to support the show are searchable and clickable over at jordanharbinger.com/deals.
You can always surface codes using the AI chatbot on the website as well. And if you can't remember the code, or you're not sure if there is a code, go ahead and email us at jordan@jordanharbinger.com. We are more than happy to surface that code for you. It is that important that you support those who support the show.
Now, for the rest of my conversation with Sandra Matz. I also found it shocking how easy it is to identify somebody personally based on, what was it, three credit card transactions. It seems like if it's that easy, you could also extrapolate a lot of info about people from those transactions once you identify them.
So if you find me based on three transactions, say some software I bought, a haircut, and where I ate lunch, now you've got a zillion other transactions you can identify me with: here's every bit of clothing he bought. Forget Facebook statuses. How I spend my money is at least as identifying and intimate as that data.
[01:02:55] Sandra Matz: Yeah, totally. And it identifies you at different levels, right? The example that you gave, the three data points, that's coming from this notion that even if we anonymize data, say I got all of the credit card spending from everybody in Manhattan, and we say it's anonymized 'cause we're not using any names, date of birth, or address,
your spending signature is so unique, almost like a fingerprint, that if I know three things about you, I can easily identify you in there. And then you're absolutely right: if you think of identity at the next level, it's not just that I know, well, it's Jordan. Now I can also make inferences about you. Maybe you're the impulsive person, 'cause you're constantly paying late fees, and maybe you're not the most organized one.
That's something that might not show up in my curated posts, but it does show up in my spending record. It's one of these things where oftentimes people say, well, your online selves, they're so curated, right? If you want to come across as a more organized and reliable person online, you can do that, 'cause you control it.
That's true for some traces. But my phone's still running out of battery, and I'm still paying these late fees. If I wanted to be someone completely different across all of my digital traces, I would probably actually have to become that person at some point, changing my lifestyle entirely.
[01:04:11] Jordan Harbinger: Those people are just looking at the photo where it's, look, I just woke up and I'm in full makeup and my yoga gear sponsored by Alo. That's what people mean by curated, but you can't fake the rest of it: the fact that your battery's low, your late fees are half your credit card transactions, you have a massive balance that keeps rolling over from month to month because you can't pay it off.
That stuff you can't really hide. You can put on a brave face, shellacking a veneer over what it is, but you can't hide from the company. They know you're full of crap. And I have to admit, I patted myself on the back a little bit when you said in the book that the person who buys gym equipment and then donates to charity is an example of somebody who has their shit together.
I looked at my credit card statement and I was like, what have I done? Okay, I spent 500 bucks on gym equipment. Oh, there's my Amnesty International donation. And I was like, I'm a good person, according to the data. Science doesn't lie.
[01:05:01] Sandra Matz: As a personality psychologist would say, there are no good or bad traits, just some that are more socially desirable, right? Take it to the extreme: if you are super, extremely organized, you're turning into my husband, who is super sweet but also borderline OCD. We just moved, and there are a gazillion boxes in the apartment. Everything is completely disorganized, and all I want is to be able to walk from the bathroom to the bedroom, and then I open the cutlery drawer and it's perfectly, meticulously organized.
I'm sure he spent two hours sorting the cutlery while there were still 100,000 boxes in the apartment. Whereas I might go through the boxes a little more quickly, maybe a little less thoroughly, but maybe a bit more efficiently. So, no inherently good or bad traits.
[01:05:49] Jordan Harbinger: That's really funny.
It's like, we don't have underwear unpacked, but all of our cereals are alphabetized in the cabinet.
[01:05:55] Sandra Matz: Oh my god.
[01:05:55] Jordan Harbinger: Yeah, this is probably your doing. That's really funny. Moran, your husband, is a really interesting character, by the way. He was on episode 265 of this show. And you had your first impression of him
when you went out on a date or something. Was that accurate? How much of your initial read on him turned out to be right later on?
[01:06:15] Sandra Matz: Yeah, very accurate. So I met him when we were both giving talks at a conference on digital happiness, but he showed up late. I was about to go on stage, and the organizer comes and says, hey, the person who was supposed to speak after you, he's not here yet.
We've no idea where he is. We can't reach him. Could you just take the entire hour? I'm like, fine. And then midway through, he shows up and they usher me off the stage. Fast forward: it doesn't take that long for me to realize that he's smart and hot. So we go out after the session, and we actually end up at his place, and he has these huge bookshelves, perfectly sorted: here's the topic, here's the height of the books.
All perfectly aligned. Cutlery perfectly sorted. I remember trying to put my glass down on the table, and he freaked out and made me put a coaster under it. So: intellectually curious, somewhat borderline OCD, and late. And that first impression is still spot on today.
[01:07:11] Jordan Harbinger: That sounds about right. I mean, he's French and he's Jewish.
So you're lucky he didn't miss your wedding, actually. That's really funny. And also, you ended up at his place after the talk. The location data says you're a little bit easy there, Sandra.
[01:07:24] Sandra Matz: And that's actually funny, 'cause that was a lot of the inspiration for the work on digital footprints: the physical space.
There's all of this work on what happens if you snoop around the bedroom or the office of a stranger and just pick up on all of these cues. Some of them, the same way that we post on social media, are curated: you have a poster out there and certain books on the shelf that you want other people to see. But a lot of them are also very subtle. What is in your bin?
Are your glasses sorted a certain way? Do they have water marks? So I think a lot of the work that we've been doing in the digital space was actually inspired by the physical space and the way that we make these inferences about strangers all the time as humans.
[01:08:04] Jordan Harbinger: I remember vaguely, in the nineties when I started dating, there was a cliche that a woman would come into your house and look in your medicine cabinet to see what sort of drugs you had in there.
That was the original snooping around, before digital stuff existed, 'cause you would open that cabinet and you'd be like, oh, okay, here's the real stuff that they're not gonna tell me about for months.
[01:08:26] Sandra Matz: What were they hoping to find?
[01:08:27] Jordan Harbinger: That's a good point, because this was sort of before all the personality pills and Adderall and everything.
So I don't know, are you looking to see if they're diabetic? Back then, what would've mattered? I don't know. But it was a thing for sure. Hemorrhoid cream. Who doesn't have a couple of Tucks laying around? I remember I was interviewing a very important, very distinguished man. He's like, can you come to my hotel and do it?
So we did and we set up in his hotel and I was like, I need to use the restroom real quick. And I remember going into the restroom and he had all this hemorrhoid stuff laid out on the counter. And I was like, good to know that the CEO of this giant, massive multinational company has serious hemorrhoids.
Poor guy,
[01:09:03] Sandra Matz: Humbling experience. Yeah.
[01:09:05] Jordan Harbinger: 'cause you can't be like, hold on, I'll be right back. Nope. He's just gonna find out about my hemorrhoids.
[01:09:09] Sandra Matz: I'm sure he played it cool.
[01:09:10] Jordan Harbinger: He did. He played it very cool. But I guess at that point, when you're a billionaire, it's like, yeah, I got hemorrhoids. What are you gonna do about it,
podcaster?
[01:09:16] Sandra Matz: Whatever. You can
[01:09:18] Jordan Harbinger: leave if you have a problem with that. What is the limit of psychological targeting? Can I change someone's mind entirely or do I just influence people who are straddling the fence?
[01:09:29] Sandra Matz: It's a great question, 'cause if you look at the media, it's totally black and white, right?
Either it's this warfare tool that changes your mind and your core identity, or it's nothing. That's probably not the case. I always think about whether it's something that you could do in the offline world. If you think about your hardcore, diehard Republican uncle: if you can't convince him to take on a certain view of the world by having long conversations with him,
you're probably also not able to do it online with algorithms, even though you can target him repeatedly and maybe send him down a rabbit hole. Changing someone's core identity takes a lot more than a couple of ads, even repeated ones. But the thing is, it usually doesn't even need that, right?
Oftentimes our choices are small ones we're not even aware of. What's the cereal that you choose? What are you deciding to wear today? What news are you reading, and where does that take you in terms of how you think about the world? So oftentimes, when I think about influencing behavior, it's these small changes.
And it's the same way that we do this in the offline world, right? Coming back to kids: humans are born to do this. Kids know exactly how to talk to their mom to get the candy, as opposed to their dad. And it's not that, by doing so, they change who the other side is. It just makes it more likely that the other side behaves in a certain way.
And for me, this is taking what we've been doing for centuries in the offline world and applying it at scale, in a way that's no longer bidirectional. It used to be the case that I do this to you and you do this to me. Now, it's mostly big companies influencing your behavior.
[01:11:04] Jordan Harbinger: Yeah, that's interesting, 'cause it does seem like the most basic, mediocre use of all this psychological targeting is selling me shirts. It just seems like, can't you do more with this? And I remember Cambridge Analytica, right? It was like, oh, they totally influenced the election. Did they? Or was it not much?
[01:11:22] Sandra Matz: So, did you swing the election by convincing the diehard Hillary supporter to suddenly stay at home and not vote? Probably not. But could you have influenced some of the people who were not sure if they wanted to go out and vote, catching them at a moment when they were really scared about immigration, and changed them from a Democrat to a Republican? Probably.
I think the point about Cambridge Analytica is that it wasn't necessarily even something new. Political campaigns have been using data for a long, long time, and Obama was celebrated for it: here's someone who's trying to understand their constituents and see what they're interested in, what they care about, how to talk to them.
What made Cambridge Analytica distinct from previous attempts, at least in the public mind, was that people could suddenly make sense of it. We don't think about ourselves in separate data points. I don't think of myself as here's my browsing history, and here's my social media, and here's my credit card spending.
I think of myself as this holistic person that's impulsive and maybe a little bit neurotic and curious. And once you told the public that there's a company that can predict whether you're emotionally volatile, or whether you're introverted or outgoing, I think that's what resonated with people.
So, do I think they won the election by doing magical brainwashing? Probably not. Could something like psychological targeting swing an election when the margins are thin? Probably, yes. Do we need psychology for that? Again, not entirely sure, 'cause you can make very similar predictions while skipping that step.
[01:12:57] Jordan Harbinger: Can we use this technology to decrease political polarization? These companies, they know what we're all like, they can file us into echo chambers. Can we reverse that process?
[01:13:08] Sandra Matz: It's something that I've been super intrigued by, and who knows if it's gonna pan out, but I always think of it as a technology, right?
The technology at its core is trying to say: can I understand where you're coming from? Here's your point of view, here's your view of the world, here's your values, here's your personality. And you could imagine using that to explain to you how the other side sees the world. I could help a Democrat understand why a Republican might be more opposed to immigration or more opposed to abortion.
Not in the way that a Republican would try to convince you, because they're coming from their own perspective, but in the way that Democrats think about the world. That's oftentimes a much more promising way of convincing the other side, or at least making you a bit more receptive to their arguments.
And this is backed by research, by the way. It's essentially this idea: can I tap into your own moral compass to make you think about the world in a slightly different way?
[01:14:05] Jordan Harbinger: That would be an interesting experiment. Again, though, it has to be profitable or these companies won't actually wanna do it. Back to the privacy idea, what about people that think they don't need privacy because they have nothing to hide?
I hear that argument all the time. There's so many people that say, look, I have nothing to hide. I don't care if they're collecting data on me.
[01:14:23] Sandra Matz: Yeah, and now you have to stop me, 'cause it's one of those topics that I could talk about forever. I hear this question all the time. In the classroom, when I talk about what we can do with your data, there's always at least one person who says that.
And in a way, I can even partially relate, right? It feels like, well, I tried everything, and it just feels like an uphill battle that I can't win, so I might as well give up. But it's a very privileged position to be in, first of all. The fact that you don't have to worry about your data being out there just means that you're currently in a really good spot.
If I can predict your sexual orientation or your mental health from all the traces that you leave, in many parts of the world that's not just preventing you from going to Beijing; that could still mean the death penalty in a lot of countries. So it just means that you're currently in a good spot. And what I think is even more true: even if you don't have to worry about your data right now, you don't know what tomorrow is gonna look like.
That might change entirely. In the US, I think the Roe versus Wade Supreme Court decision made that painfully real for many, many women. Suddenly, overnight, you had to worry about your Google searches, 'cause maybe you were looking for pregnancy-related or abortion-related advice. Maybe you were traveling across states with your phone on you.
So I can see you were traveling to another state; based on your GPS records, here's exactly the location. Maybe you went to a certain clinic, maybe you came back, and suddenly you were no longer looking for certain things on Google and Amazon. So this notion that data is permanent and leadership isn't should make all of us worried.
And maybe that's the government changing, but it could also be just the leadership of companies, from one day to the next.
[01:15:57] Jordan Harbinger: That's a good point. Being gay is illegal in more than half the world.
[01:16:02] Sandra Matz: I think it's a little bit less, but it's still many, many more countries than you would imagine.
[01:16:06] Jordan Harbinger: So it's illegal, or at least could be illegal in a large number of places.
And
[01:16:11] Sandra Matz: religious affiliation
[01:16:13] Jordan Harbinger: as a Jewish person, we don't like lists. We don't like tracking.
[01:16:15] Sandra Matz: It's one of the most compelling examples of why data can become extremely dangerous. What we know from Nazi Germany in the Second World War is that religious affiliation in parts of Europe was part of the census, which made it extremely easy for the Nazis to come in and say, well, we're just gonna go to City Hall quickly, check the records, and see: here's person A, B, and C.
They live in this place; now let's go and find them. Now, fast forward to today, and we know that the atrocities varied vastly based on whether the data was available. And today, it doesn't need to be part of the census, 'cause you can just passively predict it from all of the traces that you generate and all of the data that you create.
So for me, this notion that we just don't know what tomorrow is going to look like is a good reminder that you probably should care about your privacy now. And most people actually do, right? If you show them the offline equivalents and you say, okay, look, your smartphone tracking your whereabouts twenty-four seven is like a person walking behind you observing your every move,
that's the stalker that goes to jail. A company like Google reading your messages, that's the mailman opening your mail: again, a person that goes to jail. So when you give them these comparisons to the offline world, I think most people actually wake up to, oh, maybe I do care about my privacy, and maybe I just haven't figured out how to protect it better.
[01:17:34] Jordan Harbinger: Speaking of stalkers, surely it's happened that somebody has used online data to find and hurt someone. I'd be shocked if that hasn't happened multiple times already.
[01:17:42] Sandra Matz: Yeah, and it's actually led to legislation. It's a very interesting example. There's this case of a judge in New Jersey whose son was tragically murdered
by someone with a grudge from a case that had come before her. They found her data online, got it for, I think, a couple of dollars from a data broker, found her home, had this entire dossier on her and her family, and murdered her son 'cause she wasn't there. And that led to legislation that now protects judges from having their data out there and sold by data brokers.
And to me, it really raises this question: if we think that judges should be protected, why not protect everybody else? There are many other people who would be worried about someone getting their hands on their data and tracking them down.
[01:18:24] Jordan Harbinger: No kidding. Yeah. Hey, we gotta protect judges.
Okay, but what about these bazillion other people that don't necessarily have political power, that can't get their stuff taken offline? I remember that case; it was particularly disgusting, and I get that judges are in a more vulnerable place than a lot of other folks. But what about all the other people that are in a similar place?
You've got prosecutors, sure, okay, part of the legal system. What about police officers? Okay. What about teachers, or disciplinarians in the school environment? Or just, like, maybe I don't want a stalker either. How's that?
[01:18:55] Sandra Matz: Yeah.
[01:18:56] Jordan Harbinger: You know
[01:18:56] Sandra Matz: Anyone, right? You're a surgeon, you make a mistake. There's always the worry that someone, at some point, has beef with you and is trying to track you down.
So anything that we apply to one part of the population because we worry about their data, I think should apply to everybody.
[01:19:11] Jordan Harbinger: How do we do this? The book goes into detail, so we don't need to get into the weeds too much, but one of the ideas you had was preventing companies from getting too many data points.
Why is that a good idea?
[01:19:21] Sandra Matz: I think of it as a puzzle, right? If we think about what I can learn about a person based on their data: we talked about social media being this curated piece, and then your smartphone sensing gives us a different angle. You can imagine that once you put all of these pieces together, you get a much more accurate reading of who that person is.
So if you're a company that can fill every single letter of the alphabet with a subsidiary, you can imagine that you hold pretty much this entire picture of who someone is. So one example, and I'm certainly not the first one to suggest it (Scott Galloway and Tim Wu have been saying this for years), is breaking up the tech monopolies. That would at least be a way of not having any one company capture this entire picture of who you are.
I think that's probably a hard sell. I think there are easier ones: there are now technologies that allow you to provide the same convenience, services, and personalization, but without having to collect the data in the first place. And for me, that's something that you can implement from today to tomorrow.
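Sandra doesn't name these technologies in the conversation. One commonly cited family that fits her description is federated learning, where a shared model is trained on each person's device and only parameter updates, never the raw data, go back to a server. The sketch below is a bare-bones federated-averaging demo with invented data, offered as an illustration rather than a statement of what she has in mind.

    # Bare-bones federated averaging: the model visits the data, the data
    # never leaves the "device". Everything here is synthetic.
    import numpy as np

    def local_update(weights, X, y, lr=0.1):
        # One gradient-descent step of linear regression on private local data.
        grad = 2 * X.T @ (X @ weights - y) / len(y)
        return weights - lr * grad

    rng = np.random.default_rng(0)
    true_w = np.array([2.0, -1.0])
    weights = np.zeros(2)

    # Three "devices", each holding data that stays on the device.
    devices = []
    for _ in range(3):
        X = rng.normal(size=(20, 2))
        y = X @ true_w + rng.normal(scale=0.1, size=20)
        devices.append((X, y))

    for _ in range(50):
        # Each device improves the shared model locally...
        updates = [local_update(weights.copy(), X, y) for X, y in devices]
        # ...and only the averaged weights travel back to the server.
        weights = np.mean(updates, axis=0)

    print(weights)  # approaches [2.0, -1.0] without centralizing anyone's data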
[01:20:16] Jordan Harbinger: I know in the book you talk about taxing data brokers. I liked your idea, and this is very European and I cannot see how it would happen here, but God bless you: data co-ops, where essentially we all own our data. How would this work? Because that's crazy talk to us Americans, that we would own our own data.
[01:20:36] Sandra Matz: It's both owning the data and then collectively managing it. The idea of a data co-op is that there are people who have a shared interest in using their data. My favorite one in Europe is a co-op of patients suffering from MS, multiple sclerosis. It's one of these diseases that is so poorly understood.
It's determined by genetics, your medical history, your lifestyle, so you need quite a lot of data from patients to understand what might be driving symptoms and how to get better. Oftentimes, what happens in the medical space is you send your data to pharma companies, and in the best case, it takes years for them to develop a drug, and then you pay thousands, if not millions, of dollars for it.
The co-op, by contrast, is essentially owned by its members. It's people who suffer from MS coming together under the legal entity of a data co-op, so it's member-owned, and it's legally obligated to act in the best interest of its patients. And what it can do is not only help us better understand, through research, how the disease works; it can also communicate directly with your doctors, almost in an Amazon-recommendation style,
and say, we've seen patients with similar symptoms on a similar trajectory respond really positively to these kinds of treatments, why don't you try this as well? And then the doctor can give feedback and make the system even better. But you'd be surprised. This is one example, but in the US, there's a data co-op for Uber drivers.
They're essentially pooling data to see, how do we optimize the routes? How do we make sure that we're not getting overly tired and exhausted? So there are many instances where you can see this playing out in the US as well. It's becoming more popular.
[01:22:06] Jordan Harbinger: It's really encouraging, because it seems like something that would be almost impossible, right?
Oh, you're giving the data to Facebook, so we're not gonna share it. And I don't love the idea of always giving stuff to the government, but it almost seems like you need federal regulation of how our data gets used in order for it not to get misused. But it really is disappointing to me that these companies are not hitting the low-hanging fruit: finding out who has PTSD after coming back from war or after some traumatic event, finding out who's depressed, finding out who's going down the path of an eating disorder.
Because I've heard about this with young women especially: they search for something, the algorithm feeds them more of it, and then they search for more of it. It's clearly really obvious, and not that hard, to predict who's developing an eating disorder in real time. And then they just don't do anything about it.
[01:22:53] Sandra Matz: And it's not even just that they don't do anything. Oftentimes the reinforcing algorithms become more and more extreme in the recommendations that they make.
[01:23:01] Jordan Harbinger: Are you hopeful for the future of how this looks? Because I hate ending on a sour note of like, and now your kids are all gonna be depressed and have eating disorders and no one's gonna care except for us on this podcast.
[01:23:13] Sandra Matz: I think you have to be. I constantly oscillate between being totally depressed and thinking about it in a more optimistic way. And I've been called naive so many times by people who say, we're all doomed, why are we talking about these positive use cases? My take is that we need this positive counter-narrative.
And I actually think about it in the context of kids, right? If your kid misbehaves and throws food on the floor and takes down all of the books, which is not hypothetical, I'm going through this as we speak.
[01:23:43] Jordan Harbinger: I get it, I feel it.
[01:23:43] Sandra Matz: So you can tell them, don't do this because it's bad and because it's nothing you should be doing, but you're not gonna be really successful.
What's much more successful is to show them something that they can do instead. If you tell them, instead of throwing food on the floor, why don't you do this, the chances that you're gonna change that behavior are much, much higher. And I think about it the same way in the context of technology.
Yeah, we can say, here are all of the challenges. Do we need more regulation? Probably. Should we get rid of some of the abuses? Absolutely. But if we think about the overall trajectory of technology, without these positive here's-what-you-should-be-doing-instead visions, I think we're never gonna get there.
So maybe we're not gonna end up with this utopian future that I sometimes have in mind, but I do think we need these positive visions to even get us started.
[01:24:30] Jordan Harbinger: Sandra, thank you so much. Really interesting episode of the show.
[01:24:32] Sandra Matz: This was very enjoyable, and I liked it. Thank you so much.
[01:24:38] Jordan Harbinger: You're about to hear a preview with James Patterson. What would make the bestselling author walk away at the top of his game?
[01:24:44] JHS Clip: It's rare that I don't write. What I discovered was that I loved doing it. And then I started writing stories, and I just loved it. I didn't know whether I was any good, but I loved doing it. And I would just write, write, write, write, write. So when the first book, The Thomas Berryman Number, came out, someone gave Little, Brown a blurb and said, I'm quite sure that James Patterson wrote a million words before he started this book.
It was a great compliment. And then I decided I'd try a novel. I'm really happy with the way that turned out. One of the things you always like to do at the end of a chapter is make sure they must turn that next page. That's a strength. The weakness is, I sometimes don't go as deep as I should. Here's the secret: hit 'em in the face with a cream pie, and while you have their attention, say something smart.
That's it. No cream pie? They didn't even notice it, so forget about it. You're just talking to yourself. And if you don't say something smart once you get their attention, it's irrelevant. You surprise people, which I think is important for my kind of book. We need heroes. And one of the things about the military, and it's very true in this book about American heroes, but also in Walk in My Combat Boots: the military is about we, not me.
And one of the things I think we need to get back to a bit more is we, and it is hard to come by now. Duty, honor, sacrifice. It just has to be more we rather than just me.
[01:26:11] Jordan Harbinger: To hear more from James Patterson, as he reveals the moment that changed his life and the unconventional process that's helped him sell over 400 million books,
check out episode 1100 of The Jordan Harbinger Show. All things Sandra Matz will be in the show notes at jordanharbinger.com. Advertisers, deals, discounts, and ways to support the show, all at jordanharbinger.com/deals. Please consider supporting those who make the show possible. Also, check out our newsletter, Wee Bit Wiser.
It will make you smarter, it'll make you more practical, and something should sink in. It's only a two-minute read, almost every Wednesday. We talk about psychology, relationships, and decision making. If you haven't signed up yet, I invite you to come check it out. It is a great companion to the show. jordanharbinger.com/news is where you can find it.
Don't forget about Six-Minute Networking as well. That's over at sixminutenetworking.com. I'm @JordanHarbinger on Twitter and Instagram. You can also connect with me on LinkedIn. This show is created in association with PodcastOne. My team is Jen Harbinger, Jase Sanderson, Robert Fogarty, Ian Baird, and Gabriel Mizrahi.
Remember, we rise by lifting others. The fee for the show is you share it with friends. When you find something useful or interesting, the greatest compliment you can give us is to share the show with those you care about. So if you know somebody who's interested in data, data privacy, how AI and the internet are monetizing what they know about us, definitely share this episode with them.
In the meantime, I hope you apply what you hear on the show so you can live what you learn, and we'll see you next time.