Not the Jetsons
Plus: More Bounce in California
Dear Citizens,
Good news from a courtroom in California, where a judge just ruled that the State’s Age-Appropriate Design Code could move forward. It’s a win for young people and our friends at Design It for Us, who have been driving the suit. We’re also waiting for a ruling in the landmark social media addiction trial, where in many ways, the platforms have already lost.
Here’s what else to read, listen to and do this week.
✨👦🛸 ONE READ: This Show is a Rerun ✨👦🛸
How are AI summaries fuelling disinformation? Friend of the Citizens Jessica Gonzalez has been playing with chatbots, and is not encouraged.
Here’s where I confess that I skim AI summaries when making low-stakes decisions about which fiction series to read next. But I’ve noticed a worrisome trend when I’ve researched more serious topics, like trying to understand my friend’s cancer diagnosis. Not only did Gemini feed me a series of bullet points with conflicting assertions of fact, but the sources for the top answers weren’t leading cancer research institutes - they were junk websites I’d never heard of.
Not exactly the Jetsons. In her piece, Jessica sums up the latest research on “LLM biases,” and reminds us that we’ve seen this show before - whatever the tech, whatever the year, social media algorithms are built for bigotry and division.
👉 Bonus read: From November but worth revisiting - Marietje Schaake looks at chatbots and the risks they pose to democracy. AI companies vowed their tools wouldn’t influence voting choices, so why are they recommending political parties?
The Citizens is a reader-supported publication. To support our work, consider becoming a free or paid subscriber.
🎧 ONE LISTEN: Droning On 🎧
How is AI enabling even more horrific warfare in the Middle East? In The Intercept’s latest podcast, technology reporter Sam Biddle explains how America is using AI in the growing war:
“Airstrikes, air war generally is already so prone to killing innocent people even when you take your time. But whenever you try to hurry for the sake of hurrying — and AI is great at enabling that — you just increase over and over again the chance of killing someone that you didn’t intend to or didn’t care enough to avoid killing. So I think that is an immense risk of just accelerating the metabolism of killing from the air by drone, by airplane — with the stamp of ‘intelligence’ that these AI companies are really pushing.”
👉 Listen to the full conversation here.
ICYMI: We wrote an explainer on how chatbots are being folded into the machinery of war, and why we’re not ready. Watch for our video explainer on socials this weekend 👀
👉 Bonus resource: This terrifying study found that in simulated war games, AI systems given autonomous control over weapons opted to use nuclear weapons in 95% of cases. Strikingly, none of the models ever chose surrender. Alarmed yet? Us too.
😮💨 ONE ACTION: More Mandelson!? 😮💨
Did Peter Mandelson, when he wasn’t doing all the other horrible things he was doing, help steer Palantir’s contract with the UK? We’ll never know! Or will we… Foxglove is on top of it, and calling for action at a critical moment.
👉 Join their call for transparency.
Every week we surface practical ways to turn concern into action. Subscribe to get it in your inbox.
👉 Bonus action: What does a US spy-tech firm have to do with Britain’s health service? Quite a lot, it turns out. Palantir signed a £330m contract with the NHS in 2023 to build its new data platform.
A new briefing from human rights groups and health organisations warns the software could enable data-driven abuses of power, including US-style ICE immigration raids.
The report launches Monday at 7pm, with speakers including Palantir whistleblower Juan Sebastián Pinto.
👉 RSVP here.
That’s all for this week,
Team Citizens
PS - Join our Signal Chat if you haven’t already! There are over 350 people in there now.