🎥 Video Link
Links referenced for video
- https://youtu.be/PArFP7ZJrtg?t=509 - Edward Snowden clip in intro
- https://www.europarl.europa.eu/doceo/document/E-10-2025-003250_EN.html - Chat Control article
- https://www.aclu.org/news/national-security/surveillance-company-flock-now-using-ai-to-report-us-to-police-if-it-thinks-our-movement-patterns-are-suspicious - Flock article
- https://theconversation.com/tech-giant-palantir-helps-the-us-government-monitor-its-citizens-its-ceo-wants-silicon-valley-to-find-its-moral-compass-260824 - Palantir Article
- https://privacy.anthropic.com/en/articles/10023548-how-long-do-you-store-my-data - Anthropic data retention policy
- https://intheshellpodcast.com - In the Shell Podcast
- https://yellowball.fm - 🟡 Yellowball, don’t just host your podcast, own it
Transcript
Please excuse any grammatical errors. I used a tool to generate the transcript and haven’t had a chance to read through it yet.
And everything we do now lasts forever. Not because we want to remember it, but because we're no longer allowed to forget. So, I saw this clip a few days ago, and I know it's about five years old at this point, but the way Snowden phrased it really stuck with me. I typically think about privacy in terms of protecting my data from big companies or from governments. But he frames it differently: everything we do today is permanent, not because we want to remember it, but because we are no longer allowed to forget it. When I heard that and started thinking about it, it crystallized why I do the things I do to try and preserve my privacy.
And the reason I'm making this video is that just yesterday I got an email from Anthropic, the company behind Claude. By the way, I'm not a privacy purist when it comes to using AI. I try to use it responsibly and avoid sharing personal data with it. It is helpful, and I'm in the tech field, so I have to use it. But in the email they sent, under "Updates to data retention: your choices and controls," they state that if you choose to allow them to use your data for model training, they will retain that data for five years.
In the back of my mind, I already knew that companies keep your data for an indeterminate amount of time, whether for legal purposes or for their own analytics. But seeing it spelled out so clearly, and sent to every user, reinforced that we are not allowed to forget anything anymore. Do you remember what you were doing last week, or last month, or what you were going through three years ago? Maybe you used AI because you had a question about some health condition. Or maybe you were going through something, like a breakup, and had no one else to turn to, so you turned to your chat.
Whether or not you agree with using AI for that purpose, people do, people have, and people are using it for that. But not everyone is aware that that data is then being stored and retained, and that there is a permanent record of it. Whether or not that little opt-out toggle actually works is beside the point. It really got me thinking about how little privacy and ephemerality are left in daily life.
I mean, as you're watching this video, wherever you are right now, look around the room. Is there an Alexa sitting in the corner? Do you have an iPhone with "Hey Siri" turned on so it can be there whenever you need it? When you walk around your neighborhood, are there Ring doorbells watching every step you take: where you go, when you were there, who you were with, what you were doing? These recordings are all being uploaded to the cloud. They're being transcribed, stored, and associated with you, whether for marketing purposes or just for general surveillance.
And if that's not enough, we even have governments at the highest levels doing whatever they can to surveil their citizens' private communications. In the EU, there's Chat Control, framed as an attempt to protect children. In the US, police are using Flock: small cameras on the backs of their cars that record license plates and give them real-time locations of vehicles. The government wasn't able to conduct this type of surveillance itself due to regulation, so instead it uses a private company to do it. And now there's a national database of Americans' movements in their vehicles. Couple that with AI that reports people to police when it detects "suspicious" movement patterns, all in an attempt to predict crime and stop it before it happens.
And then we have officials at the highest levels spreading FUD, fear, uncertainty, and doubt, to convince citizens that things are so dangerous that surveillance needs to be weaponized against them, with companies like Palantir getting access to all of our data in order to keep us safe and prevent violent crime.
I think we've become conditioned to a certain level of surveillance and tend to assume that all of our communication is recorded and monitored, which is not a great way to live. And just because we've become accustomed to it does not mean it's right. So one suggestion I'd give, and something I've done myself, is to at least make your home a place where you can have private, unfiltered conversations and share ideas without something listening in the background. Unplug the Alexa. Turn off Siri. These are a couple of small things you can do to restore your right to ephemeral communication, so you don't have to worry about it being stored and used against you someday.
I have a few more videos I've just finished recording; I need to finish editing them before I publish. This one was more timely, which is why I'm releasing it first, but stay tuned for those coming up soon. If you want to hear more from me in the meantime, I have a podcast called In the Shell. Head over to intheshellpodcast.com to find out where to listen. I also send out a monthly newsletter, which you can sign up for at sidebretoritos.com. And if you have any questions or comments, feel free to leave them down below, and I'll see you next time.