Bryan Cassady and I explore how to move from “using AI” to truly “working with AI” by setting clear objectives, building simple systems, and treating AI as a teammate rather than a vending machine. Bryan shares the SPARKS framework, real collaboration examples, and a bold experiment that made his book more useful and more widely read.
• Objectives first over tool-first adoption
• Using AI vs working with AI as partner
• Systems thinking and plan–do–study loops
• Context, stories and better prompting
• SPARKS framework for practical workflows
• AI as feedback and revision engine
• Custom instructions, red teaming and confidence levels
• Myths of innovation and human–AI handoffs
• Making books interactive with AI tools
• Building a searchable second brain from content
This episode is sponsored by Metricool. Use the special code JOERI to get 30 days of Metricool Premium for free!
Looking for the structured conversation and key takeaways for CMOs and AI marketing leaders? Read the cleaned and structured reference version here:
How to Work With AI, Not Just Use It – with Bryan Cassady
How Objectives-First Teams Win With AI
Many teams rush to adopt AI yet struggle to show real gains because they bolt tools onto old habits. The conversation with Bryan Cassady reframes the challenge: stop treating AI like a vending machine and start treating it like a teammate. That shift starts with objectives. If a team cannot say what decision will change, what workflow will speed up, or what quality bar will rise, the tool will drift toward novelty. Objectives clarify what success looks like and anchor prompt design, context sharing, and evaluation. The mindset move sounds simple, but it unlocks better collaboration, since you now brief AI like you would brief a colleague—sharing constraints, target audience, and trade-offs—so responses map to business reality.
Bryan's definition of a generative organization is practical: create new value from what already exists by designing systems where AI augments people. That means mapping key loops—plan, do, study—and placing AI where it reduces friction or raises insight. The hard part is resisting the lure of one-shot prompts and instead building small, reliable exchanges. Humans bring taste, intuition, and risk judgment; AI brings speed, breadth, and pattern recall. The win comes from the order of handoffs. For ideation, you might start human, then iterate with AI to widen options, then return to human review. For analysis, you might let AI surface anomalies first, then ask focused human questions. Sequence beats heroics.
Context emerged as the secret ingredient. When teams complain that AI gives generic fluff, they often have not supplied the story behind the task. Stories carry goals and stakes, which sharpen the model’s predictions. Treat the model like a smart intern on day one: it learns fast but needs your playbook. Ask it to ask you questions before generating output, so it pulls missing details rather than guessing. Use regeneration and feedback to explore options, compare angles, and avoid first-draft bias. And pause to think. A short break prevents you from accepting reasonable-sounding answers that miss the mark. These small habits compound into better outcomes.

To make the habits stick, Bryan offers SPARKS:
• Speak your thoughts to clarify messy context
• Pivot the interaction so AI questions you first
• Ask for more by regenerating and giving feedback
• Reframe the brief to view the problem from new roles
• Keep going past the obvious to reach nontrivial ideas
• Stop and think to review quality and logic
SPARKS works because it blends human intention with machine exploration. It also reframes AI as a revision engine. Instead of expecting perfect output, expect critique, blind-spot checks, and structured improvement. Assign personas to review your draft and stage a debate that surfaces risks and counterarguments you missed.
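The "Pivot" step is easy to operationalize as a reusable prompt template. Here is a minimal sketch in Python; the function name and wording are illustrative assumptions, not something from the episode:

```python
def pivot_brief(task, known_context, num_questions=5):
    """Build a 'pivot' prompt: ask the model to interview you
    before it writes anything (the P in SPARKS)."""
    context = "\n".join(f"- {line}" for line in known_context)
    return (
        f"Task: {task}\n"
        f"What I already know:\n{context}\n\n"
        f"Before you produce anything, ask me up to {num_questions} "
        "clarifying questions about audience, constraints, and success "
        "criteria. Wait for my answers, then draft."
    )

# Example brief; details are invented for illustration
print(pivot_brief(
    "Write a launch email for our analytics feature",
    ["Audience: existing customers", "Tone: practical, no hype"],
))
```

Pasting the result into any chat model turns the exchange into an interview rather than a one-shot guess, which is exactly the habit the Pivot step encourages.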
The same systems view guided Bryan's book experiment: remove the paywall and add AI tools so readers can search, quiz, and apply ideas. Usage rose and, counterintuitively, so did sales. The lesson is broader than publishing. When you lower friction to understanding and practice, adoption climbs. For leaders, this means documenting processes, capturing tacit knowledge, and making it queryable. Upload talks, notes, and guides into a secure knowledge base so teams can ask, “What would we do here and why?” Then embed quality checks in custom instructions: push for source validation, red-team assumptions, and report confidence levels. In a faster market, systems that learn beat tools that shine.
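To make the "searchable second brain" concrete, here is a minimal sketch of a queryable knowledge base using plain TF-IDF scoring in standard-library Python. The document names and contents are made up for illustration; a production setup would more likely use embeddings and a vector store:

```python
import math
from collections import Counter

def tokenize(text):
    # Lowercase and strip common punctuation from each word
    return [w.strip(".,:;?!\"'").lower() for w in text.split()]

class KnowledgeBase:
    """Tiny TF-IDF index over short documents (talks, notes, guides)."""

    def __init__(self):
        self.docs = {}       # name -> token counts for that document
        self.df = Counter()  # token -> number of documents containing it

    def add(self, name, text):
        counts = Counter(tokenize(text))
        self.docs[name] = counts
        for token in counts:
            self.df[token] += 1

    def _weight(self, token, counts):
        # Term frequency scaled by smoothed inverse document frequency
        idf = math.log((1 + len(self.docs)) / (1 + self.df[token])) + 1
        return counts[token] * idf

    def search(self, query, top_n=3):
        # Score each document by overlap with the query, length-normalized
        q = Counter(tokenize(query))
        scores = {}
        for name, counts in self.docs.items():
            score = sum(self._weight(t, counts) * qc
                        for t, qc in q.items() if t in counts)
            norm = math.sqrt(sum(self._weight(t, counts) ** 2
                                 for t in counts)) or 1.0
            scores[name] = score / norm
        return sorted(scores, key=scores.get, reverse=True)[:top_n]

# Hypothetical documents standing in for uploaded talks and guides
kb = KnowledgeBase()
kb.add("pricing-guide",
       "How we set pricing tiers and discount policy for enterprise deals")
kb.add("onboarding-notes",
       "Checklist for onboarding new marketing hires in week one")
kb.add("ai-workflow-talk",
       "Using AI as a teammate: brief it with context, objectives and constraints")

print(kb.search("how do we brief AI with objectives and context"))
```

The point of the sketch is the loop, not the scoring: once institutional knowledge is indexed and queryable, "What would we do here and why?" becomes a lookup instead of a hallway conversation.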






