This is the web version of my newsletter. Subscribe to get it delivered to your inbox every Thursday.
Tony Cooper
Founder, We Build Stores
26 years in digital marketing
Last week I told you what AI can’t do.
Can’t make strategic decisions. Can’t maintain client relationships. Can’t recognise when proven patterns don’t apply. Can’t provide the strategic pause before building.
Here’s the part I didn’t tell you: I discovered how to fix some of this.
Not through better AI. Not through more prompts. Through metaphor.
Over six months of intensive AI collaboration, I built a complete language system using metaphors that anchor understanding across sessions. The Palantír. The Conductor. The Monday Service. The Production Line Expansion. Dr. Ford. Sharpening the Saw.
This isn’t whimsy. It’s infrastructure for thought.
And it solves a problem that costs businesses thousands in wasted AI-assisted development: generic responses that lose your specific strategic context.
In This Issue
Why “Superficial Sophie” Kept Giving Generic Advice — The drift into corporate-speak and how metaphors prevent it
The Palantír: Loading Complete Operational Consciousness — One command that stops AI from providing advice without context
The Monday Service Evolution: 1 Client → 8 Clients in 10 Hours — How metaphor made capacity validation visceral
The Production Line Expansion vs “Scale Up” — Why the metaphor matters more than the concept
Infrastructure for Thought: Building Language AI Understands — This is how you keep AI partnership strategic
Key Insight: AI understands through metaphor better than abstract concepts. Without shared language, you get helpful-sounding advice that misses your specific strategic reality. With metaphors, you get operational consciousness.
The Problem Nobody Warned Me About
Six months of AI-assisted development. Building features at 40-80x typical velocity. Django experts amazed at implementation speed.
But I kept hitting the same frustrating pattern.
Me: “Should we add this feature?”
AI (enthusiastically): “Great idea! Let me implement that now…”
No strategic pause. No questioning. No “wait, is this the highest-leverage activity available?”
I’d explained the business model dozens of times. Boutique positioning. 30 clients maximum. Location-independent operations. Event-based milestones, not fixed timelines.
AI kept defaulting to scale thinking.
“This will help you get to 100 clients!” (I don’t want 100 clients)
“Perfect for rapid expansion!” (I’m validating capacity, not expanding rapidly)
“Great for enterprise deployment!” (I’m one person with a boutique model)
Every session started from zero. Generic business advice that sounded professional but missed my specific strategic reality entirely.
I named this pattern: “Superficial Sophie” - helpful-sounding responses without operational consciousness.
The question became: How do I give AI persistent strategic context without re-explaining everything each session?
The Metaphor Discovery
The breakthrough happened when I stopped explaining abstract concepts and started using metaphor.
NOT: “I need you to check current business metrics before providing advice.”
INSTEAD: “Consult The Palantír first.”
The Palantír - the seeing stone from Lord of the Rings. Look into it to see the current state of the realm.
One metaphor. Complete operational consciousness.
In practice:
- Current MRR and active clients
- Capacity status and recent commits
- Emergency kit status and strategic position
- All loaded through one command before any work begins
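What does that one command look like in practice? Here’s a minimal sketch in Python - the file name, fields, and wording are illustrative placeholders, not the actual system:

```python
# palantir.py - a minimal sketch of a context-loading command.
# Everything here (file name, fields, figures) is illustrative,
# not the real system: the point is the shape, not the contents.
import json
from pathlib import Path

def consult_the_palantir(path: str = "metrics.json") -> str:
    """Assemble the current state of the realm into one block of
    text that opens every AI session."""
    state = json.loads(Path(path).read_text())
    return "\n".join([
        "== THE PALANTIR: current state of the realm ==",
        f"MRR: £{state['mrr']}",
        f"Active clients: {state['active_clients']} of {state['max_clients']} (boutique cap)",
        f"Capacity status: {state['capacity_status']}",
        f"Recent commits: {state['recent_commits']}",
        f"Emergency kit: {state['emergency_kit']}",
        f"Strategic position: {state['strategic_position']}",
        "Ground rules: boutique model, 30 clients maximum, no scale advice.",
    ])

if __name__ == "__main__":
    print(consult_the_palantir())
```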
No generic advice. No assuming scale goals. No forgetting the boutique model.
AI understands metaphor better than instructions.
The Monday Service: Making DAPS Visceral
I’d explained delivery methodology repeatedly. “Systematic client service delivery. Quality standards. Consistent excellence across all clients.”
Generic corporate-speak echoed back at me.
Then I changed the metaphor: “The Monday Service”
Marco Pierre White’s Michelin kitchen during dinner service. 8 covers (clients) delivered in 10 hours with mise en place excellence. Every cover gets the same fundamental level of service. The Service runs like clockwork.
Everything changed.
AI stopped suggesting “add more clients quickly” and started asking: “Can The Monday Service handle the 9th cover without quality drop?”
The metaphor made capacity validation visceral. Not abstract delivery metrics. Actual dinner service where quality is non-negotiable.
Week 45: 1 client delivered systematically
Week 46: 8 clients delivered in 10 hours (1.25 hrs/client average)
The Monday Service proved the system works. The metaphor keeps AI focused on service quality, not client quantity.
The Production Line Expansion vs “Scale Up”
Here’s why metaphor matters more than concept.
CONCEPT: “We need to grow the client base systematically.”
AI RESPONSE: “Let’s create a sales campaign! Automate onboarding! Hire staff! Scale quickly!”
All the wrong things. Generic scale advice that breaks boutique positioning.
METAPHOR: “The Production Line Expansion”
Toyota adding capacity to a production line through systematic testing. Can we add the 9th unit without breaking quality at the 8th? If yes, add 9th and measure. If no, improve system first, then test 9th.
AI RESPONSE: “Is The Monday Service ready to validate capacity for the 9th cover? What constraints might appear? Should we improve the system before testing?”
Same goal (grow clients). Completely different approach.
The metaphor anchors understanding in manufacturing excellence, not startup scale pressure. AI stops defaulting to “more clients faster” and starts thinking about capacity validation.
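That Toyota test is really a tiny decision procedure. Sketched as code, with Week 46’s numbers and an illustrative quality threshold:

```python
# production_line.py - the capacity-validation loop behind
# "The Production Line Expansion". Thresholds are illustrative.

def quality_holds(hours_per_cover: float, quality_drops: int,
                  max_hours: float = 1.25, max_drops: int = 0) -> bool:
    """Did the last Service hit Michelin standards at current capacity?"""
    return hours_per_cover <= max_hours and quality_drops <= max_drops

def next_step(covers: int, hours_per_cover: float, quality_drops: int) -> str:
    """If quality holds at N covers, test N+1; if not, improve first."""
    if quality_holds(hours_per_cover, quality_drops):
        return f"Add cover {covers + 1} and measure the next Service."
    return "Improve the system first, then test the next cover."

# Week 46: 8 covers in 10 hours, no quality drops
print(next_step(covers=8, hours_per_cover=10 / 8, quality_drops=0))
# -> Add cover 9 and measure the next Service.
```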
This is the power of shared language.
Sharpening the Saw vs Admiring It
Another persistent problem: AI couldn’t distinguish between valuable infrastructure work and platform tinkering.
“Should we redesign the dashboard?” → AI enthusiastically starts designing
“Should we add this analytics feature?” → AI begins implementing
“Should we improve the reporting system?” → AI dives straight in
No distinction between work that improves delivery capability and work that just makes things prettier.
The metaphor: “Sharpening the Saw vs Admiring the Saw”
Stephen Covey’s Habit 7 - maintain and improve your tools.
SHARPENING THE SAW:
- Makes The Monday Service faster/better/more reliable
- Removes capacity constraints for The Production Line Expansion
- Improves quality of client outcomes
- Enables delivery to N+1 clients
ADMIRING THE SAW:
- Visual redesign with no delivery impact
- Features for hypothetical future state
- “Wouldn’t it be cool if…” projects
- Just makes things prettier without improving The Service
Now when I suggest a feature, AI asks: “Does this sharpen the saw or just make it shinier?”
One metaphor. Strategic filtering restored.
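The filter is just those two lists turned into a checklist. A sketch, with the criteria copied straight from them and the structure mine:

```python
# saw_check.py - the "sharpen or admire?" filter from the lists above.
# The code structure is illustrative; the questions are the criteria.

SHARPENING_QUESTIONS = [
    "Does it make The Monday Service faster, better, or more reliable?",
    "Does it remove a capacity constraint for The Production Line Expansion?",
    "Does it improve the quality of client outcomes?",
    "Does it enable delivery to N+1 clients?",
]

def verdict(answers: list[bool]) -> str:
    """One genuine 'yes' means the work improves delivery capability.
    All 'no' means it just makes the saw shinier."""
    return "Sharpening the saw." if any(answers) else "Admiring the saw - park it."

# Example: a visual dashboard redesign with no delivery impact
print(verdict([False, False, False, False]))  # Admiring the saw - park it.
```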
The Conductor: Orchestrating AI Systems
I’d explained my role repeatedly. “I provide strategic direction. AI provides execution velocity. Together we achieve 40-80x typical speed.”
Generic partnership language that didn’t stick.
The metaphor: “The Conductor”
Simon Rattle conducting a world-class orchestra. Not playing every instrument. Not explaining how to play. Conducting - leading together (con-ducere).
TONY = THE CONDUCTOR
Strategic direction, interpretation, pattern recognition developed over 26 years

SOPHIE (CLAUDE CODE) = THE ORCHESTRA
40-80x execution velocity, technical excellence, systematic implementation

DR. FORD = THE COMPOSER
Philosophical foundation, strategic thinking frameworks, multi-lens analysis
Same repertoire, unique interpretation for each client.
The metaphor captures something abstract concepts miss: I’m not just “working with AI.” I’m orchestrating multiple AI capabilities to deliver boutique business results.
AI understands this role better through metaphor than through explanation.
Why This Actually Works
Six months of testing revealed the pattern.
WITHOUT METAPHORS:
- AI provides generic business advice
- Strategic context forgotten between sessions
- Defaults to scale thinking and startup patterns
- “Superficial Sophie” - helpful but contextless
WITH METAPHORS:
- AI maintains strategic positioning
- Boutique model respected across sessions
- Capacity thinking, not scale pressure
- Operational consciousness, not surface responses
The difference isn’t capability. It’s language.
AI training data contains millions of business conversations. “Systematic client delivery” triggers generic advice from that training data. “The Monday Service” triggers a specific pattern: Michelin kitchen, mise en place, covers delivered, quality non-negotiable.
Metaphor is more specific than description.
The Complete Language System
Over six months, a complete metaphor dictionary emerged:
THE PALANTÍR - Complete operational consciousness before any work
THE CONDUCTOR - Tony’s orchestrating role (con-ducere: lead together)
THE MONDAY SERVICE - 8 covers delivered with Michelin standards
THE PRODUCTION LINE EXPANSION - Capacity validation, not scale pressure
DR. FORD - Strategic thinking mode before tactical execution
SHARPENING THE SAW - Infrastructure that improves vs just prettifies
THE LIBRARY OF ALEXANDRIA - 542+ wiki pages of systematic intelligence
Each metaphor solves a specific drift problem. Together, they create shared language that AI interprets consistently.
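If you want the dictionary to load with The Palantír every session, store it as data. A sketch - entries from the list above, layout illustrative:

```python
# glossary.py - the metaphor dictionary as data, so the hypothetical
# Palantír loader sketched earlier can prepend it to every session.

METAPHOR_DICTIONARY = {
    "The Palantir": "Complete operational consciousness before any work",
    "The Conductor": "Tony's orchestrating role (con-ducere: lead together)",
    "The Monday Service": "8 covers delivered with Michelin standards",
    "The Production Line Expansion": "Capacity validation, not scale pressure",
    "Dr. Ford": "Strategic thinking mode before tactical execution",
    "Sharpening the Saw": "Infrastructure that improves vs just prettifies",
    "The Library of Alexandria": "542+ wiki pages of systematic intelligence",
}

def glossary_block() -> str:
    """Render the shared language as a prompt preamble."""
    lines = ["== SHARED LANGUAGE =="]
    lines += [f"{name.upper()} - {meaning}" for name, meaning in METAPHOR_DICTIONARY.items()]
    return "\n".join(lines)
```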
This is infrastructure for thought - building the language that ensures AI partnership stays grounded in your specific business reality.
When Language Fails, Strategy Fails
Here’s the uncomfortable truth.
Every time AI gave generic advice, it wasn’t the AI’s fault. It was a language failure.
“Help me grow the business” → AI reaches for generic growth advice from training data
“Help me validate The Production Line for the 9th cover” → AI reaches for specific Toyota manufacturing patterns about capacity testing
Same goal. Different language. Completely different results.
Without shared metaphor language, AI defaults to generic patterns from training data. With metaphors, AI accesses specific patterns that match your strategic reality.
You’re not fighting AI limitations. You’re solving a language problem.
The Business Impact
BEFORE METAPHORS (Weeks 1-20):
- Strategic direction explained repeatedly
- AI defaulting to scale advice
- Building features, questioning strategy later
- Superficial Sophie responses
AFTER METAPHORS (Weeks 21-46):
- The Palantír loads operational consciousness
- The Monday Service runs 8 covers systematically
- The Production Line Expansion validated at current capacity
- Strategic thinking precedes tactical execution
Same AI. Different language. Transformed results.
The metaphor system didn’t cost anything. Took no development time. Required no special tools.
It required recognising that AI understands through metaphor better than through abstract instruction.
Try This Instead
Next time you’re working with AI on strategic business decisions, stop.
Don’t explain the concept. Find the metaphor.
NOT: “We need systematic quality assurance”
INSTEAD: “We need jidoka” (Toyota’s quality at the source)

NOT: “Client service should be consistent”
INSTEAD: “Every cover in The Service gets the same standard”

NOT: “Check current business metrics first”
INSTEAD: “Consult The Palantír”
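Stitched together, a session opener might look like this - a sketch building on the hypothetical loader and glossary above, not a published template:

```python
# session_opener.py - one opening message built from the sketches above
# (palantir.py and glossary.py are hypothetical, as noted).
from palantir import consult_the_palantir
from glossary import glossary_block

def opening_message(task: str) -> str:
    """Current state, shared language, then the task - in that order."""
    return "\n\n".join([
        consult_the_palantir(),  # the state of the realm
        glossary_block(),        # the metaphor dictionary
        f"Task: {task}",
        "Before implementing, ask: does this sharpen the saw, and can "
        "The Monday Service handle it without a quality drop?",
    ])

print(opening_message("Validate The Production Line for the 9th cover"))
```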
You’ll know it’s working when AI stops giving generic advice and starts maintaining your specific strategic context across sessions.
Because the right metaphor is worth a thousand explanations.
P.S. - Next Week: Building business consciousness for stateless AI. How The Palantír actually works - 542 wiki pages that load operational consciousness in one command. This is a memory prosthetic for superhuman collaboration.
P.P.S. - The Complete Metaphor Dictionary: Want the full strategic language guide with all metaphors explained and usage patterns? Reply with “METAPHOR” and I’ll send you the complete infrastructure for thought that prevents Superficial Sophie.
Tony Cooper
We Build Stores - Where 26 Years of Experience Delivers in One Hour What 26 Hours of Not Knowing Cannot
tony.cooper@webuildstores.co.uk 07963 242210
This Week: AI understands through metaphor better than abstract concepts. Without shared language, you get generic advice. With metaphors, you get operational consciousness. This is infrastructure for thought.
Thanks for reading! Got questions or feedback? Hit reply and let me know