Redson Dev brief · PODCAST

Can the U.S. Rein in Prediction Markets? + Joanna Stern on Her Year of A.I. Experiments + Our Producer Goes to Attention School

Hard Fork · May 8, 2026

The boundaries between speculation, information, and influence continue to blur online, raising critical questions about how to regulate fast-growing digital platforms. This week's Hard Fork episode probes those complexities, diving into the contentious world of prediction markets and the broader implications of human-AI interaction. The discussion goes beyond reporting, examining how technology shapes not just how we consume information but our identity and attention.

First, the hosts unpack the escalating scrutiny of prediction markets after recent scandals caught the attention of the U.S. Congress, including a case in which a soldier reportedly used classified information to bet on geopolitical outcomes, among them Maduro's ouster. The segment interrogates the ethical and legal frameworks struggling to keep pace with these platforms, and highlights the real-world potential for misuse when financial incentives meet sensitive information.

Next, journalist Joanna Stern shares insights from her year spent living guided by an AI chatbot, the subject of her new book, "I Am Not A Robot." Her account offers a firsthand perspective on the profound, and sometimes unexpected, ways AI can integrate into daily life, and on its psychological effects.

Finally, Hard Fork producer Rachel Cohn reports on her experience at the Strother School of Radical Attention, an initiative pushing back against the commodification of focus in a tech-saturated world. Her segment provides a counterpoint to the rest of the episode, advocating a mindful detachment from digital overdrive.
For builders in software, AI, and product development, the episode raises several prompts: consider the ethical guardrails required when designing platforms that facilitate information-driven speculation; reflect on how deeply embedded AI reshapes user behavior and autonomy; and, most critically, ask whether your creations help or hinder users' ability to maintain focused attention in an increasingly fragmented digital existence.

Source / further reading

Learn more at Hard Fork