What I Actually Learned Building a Boutique Agency with AI in 2025

Tuesday 23 December 2025

Everyone's talking about AI replacing jobs. Here's what happened when I spent a year building a boutique digital agency alongside an AI partner - the wins, the failures, and what's genuinely different now.

In This Issue:

  • The year AI stopped being a tool and became a colleague
  • What 40x velocity really means (and doesn't mean)
  • The three things that genuinely changed
  • What's different for 2026
  • The honest answer to 'will AI replace you?'

Last week I showed you the dashboard that tells the truth - capacity metrics over vanity metrics, operational consciousness over emotional comfort.

This week, the Christmas edition I promised: What I learned building a boutique agency with AI in 2025.

Not the LinkedIn hype. Not the apocalyptic doom. Just what happened when I spent a year treating AI as a genuine business partner.

Some of it worked brilliantly. Some of it failed completely. All of it taught me something.


The Year AI Stopped Being a Tool

I’m going to tell you something that sounds like hype but isn’t.

In January 2025, I used AI like everyone else - generate some copy, summarise some research, maybe draft an email. Tool usage. Like a faster search engine that could write.

By December 2025, I was having strategic conversations with an AI partner about client positioning, capacity constraints, and business model evolution. Not “generate this for me” - actual dialogue about what I should do and why.

That’s not the same thing.

Here’s the shift that happened, and I can pinpoint exactly when it occurred.

June 2025. Claude Code.

Before June, I was using ChatGPT like everyone else - copy in, copy out. Useful, but limited. The AI couldn’t see my codebase, couldn’t understand my systems, couldn’t help me build anything that lasted.

Then I discovered Claude Code - an AI that could actually work inside my projects. Read my files. Understand my architecture. Build solutions that integrated with what I’d already built.

That changed everything.

Within weeks, I wasn’t asking “can you write X for me” - I was saying “let’s build this together.” The AI could see what existed, understand how it worked, and extend it intelligently. Not generating isolated snippets. Implementing real solutions.

By December, I’d built a platform worth over £100,000 in development costs - replacing every subscription SEO tool I used to pay for monthly. Rank tracking, keyword research, competitor analysis, technical audits - all built into one system that works exactly how I need it to work. That doesn’t happen with copy-in, copy-out AI.

The tool became a colleague.

I know how that sounds. I spent twenty-six years in this industry being sceptical of every “this changes everything” claim. Most of them were rubbish. This one isn’t.

But it’s also not what the hype suggests. AI didn’t replace my thinking. It extended it. Like having a developer partner who’s read everything, forgets nothing, and can implement at 2am when you’ve finally figured out what needs building.

The distinction matters: A tool does what you tell it. A colleague challenges what you’re telling it. By December, I had the latter.


What 40x Velocity Really Means

You’ve probably seen the claims. “AI makes you 10x more productive!” “100x faster with AI!”

Here’s my honest number after a year: 40-80x on specific tasks.

That’s real. I can document it. But it doesn’t mean what you think it means.

What 40x looks like:

Migrating my entire website from Wix to Astro - a complete platform rebuild that would have been a three-month project. Done in weeks. And now I rank for content. Wix was holding me back in ways I couldn’t even measure until I escaped it. That’s real.

Building a client reporting system that would have taken me two weeks of development time - completed in an afternoon. That’s real.

Writing and refining 51 weekly newsletters this year - with research, drafting, editing, and formatting that would have been a full day each, compressed to 2-3 hours. That’s real.

Creating operational documentation that captures how the business works - pages that simply wouldn’t exist otherwise. That’s real.

What 40x doesn’t mean:

It doesn’t mean I work 40x less. I work roughly the same hours. The output is 40x more.

It doesn’t mean everything is 40x faster. Client conversations take the same time. Thinking takes the same time. Relationship building takes the same time.

It doesn’t mean quality is 40x better. Speed without judgment produces fast garbage. The quality comes from knowing what to build, not from building it faster.

The velocity paradox:

When you can build things 40x faster, you don’t build 40x more things. You build the right things and iterate until they’re good.

Before AI, I’d build something once because rebuilding was expensive. Now I’ll rebuild something five times until it’s right. The total time is similar. The outcome is dramatically better.

40x velocity is a capability, not a result. What you do with that capability determines everything.


The Three Things That Genuinely Changed

Not everything changed. Most things didn’t. But three things genuinely shifted in ways I didn’t expect.

1. DOCUMENTATION BECAME POSSIBLE (AND RUTHLESS)

I now have a wiki explaining how my business works. Systems, processes, client frameworks, operational procedures - written, maintained, and searchable.

But here’s what took me six months to learn: More documentation isn’t better. Relevant documentation is better.

Early in the year, I documented everything. Every process, every decision, every edge case. The wiki grew. And grew. And became useless.

Because when you work with AI, documentation isn’t for humans reading at leisure. It’s for AI loading context before work. Every irrelevant paragraph competes with relevant information for attention. Every outdated process guide dilutes the signal with noise.

The shift: I stopped asking “should I document this?” and started asking “will AI need this to do good work?”

If the answer is no, it doesn’t exist. I produce relevant documentation or nothing.

Now the wiki gets pruned as aggressively as it gets written. Stale docs get thrown away. Redundant explanations get consolidated. The goal isn’t comprehensive coverage - it’s focused context.

This changes what’s possible. A business with focused documentation can hand work to AI and get good results. A business with sprawling documentation gets AI that’s read everything and understood nothing.

Before AI, documentation didn’t exist - the time cost was prohibitive. Now documentation is a byproduct of work. But the discipline is curation, not creation. What stays in the wiki earns its place by being useful, not just accurate.
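The curation discipline can be sketched in a few lines of code. This is a hypothetical illustration, not my actual system - the function name, the doc fields, and the file names are all invented - but it captures the two rules: stale or off-topic docs never reach the model, and a hard size budget forces something to be cut.

```python
# Hypothetical sketch: only docs flagged as relevant to the task
# make it into the AI's context, newest first, under a hard budget.
# Names and fields here are invented for illustration.

def build_context(docs, budget_chars):
    """Concatenate relevant docs, newest first, under a size budget."""
    context = []
    used = 0
    for doc in sorted(docs, key=lambda d: d["updated"], reverse=True):
        if not doc["relevant"]:
            continue  # stale or off-topic docs never reach the model
        if used + len(doc["text"]) > budget_chars:
            break  # the budget is the discipline: something must go
        context.append(f"## {doc['name']}\n{doc['text']}")
        used += len(doc["text"])
    return "\n\n".join(context)

docs = [
    {"name": "client-framework.md", "text": "How we scope work.",
     "relevant": True, "updated": 5},
    {"name": "old-wix-process.md", "text": "Legacy steps.",
     "relevant": False, "updated": 1},
]
print(build_context(docs, budget_chars=500))
```

The point isn't the code - it's that "will AI need this to do good work?" becomes a filter you can actually enforce, rather than a good intention.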

2. STRATEGIC THINKING GOT A SPARRING PARTNER

I spent years making decisions alone. Not because I wanted to - because the alternative was expensive. Consultants charge. Advisors have agendas. Employees need managing.

Now I have a thinking partner available whenever I need one. Not to make decisions for me - to pressure-test the decisions I’m considering.

“What am I missing here?” “What’s the counterargument to this approach?” “If this fails, what’s the most likely reason?”

These conversations happen at 6am or 11pm. They happen mid-thought. They happen when I’m stuck and need to think out loud with someone who’ll push back.

The quality of my strategic decisions improved not because AI is smarter than me, but because I stopped making decisions in isolation.

3. THE CAPACITY MODEL BECAME VISIBLE

This is the big one.

Before this year, I knew roughly how busy I was. I had a general sense of capacity. I could feel when things were getting tight.

Now I have precise metrics. Time per client: 1.25 hours. Capacity utilisation: 30%. Constraint proximity: monitored weekly. The Palantir shows me everything.

This isn’t because AI is good at measurement - it’s because AI made it possible to build the measurement systems. The dashboards, the tracking, the analysis - all of it built in days rather than months.

Visibility changes behaviour. When you can see exactly where you are, you make different decisions than when you’re navigating by feel.
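The arithmetic behind the dashboard is almost embarrassingly simple - the hard part was never the maths, it was building the tracking that feeds it. A minimal sketch, where the client count and available hours below are illustrative numbers, not the real book:

```python
# Minimal sketch of the capacity arithmetic behind the dashboard.
# The inputs below are illustrative, not actual figures.

def capacity_utilisation(clients, hours_per_client, available_hours):
    """Fraction of available hours consumed by the current client load."""
    return (clients * hours_per_client) / available_hours

# e.g. 12 clients at 1.25 hours each against a 50-hour week:
load = capacity_utilisation(12, 1.25, 50)
print(f"{load:.0%}")
```

Tracked weekly, a number like this is what turns "I can feel things getting tight" into "I'm at 30% and I know exactly how far the ceiling is."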


What Didn’t Change

For balance, here’s what remained exactly the same:

Client relationships still require human judgment.

AI can draft emails, but knowing when to call instead of email - that’s human. Reading between the lines of what a client says - that’s human. Building trust over time - that’s human.

I tried to shortcut this. It doesn’t work.

Quality still requires taste.

AI can produce volume. Producing volume that’s good requires knowing what good looks like. That’s twenty-six years of experience, not a prompt.

Fast garbage is still garbage. The speed just means you can make garbage faster.

Strategy still requires understanding context.

AI knows everything written down. It doesn’t know the unwritten things - the politics, the history, the personalities, the timing.

Strategic decisions require context that can’t be fully articulated. AI is a brilliant research assistant for strategy. It’s not a strategist.

Business fundamentals still apply.

Revenue minus costs equals profit. Deliver value or lose clients. Build reputation or stay invisible.

AI accelerates execution. It doesn’t change the physics of business.


What Failed

Not everything worked. Some things failed completely. Here’s what I learned from the failures:

The “automate everything” trap:

Early in the year, I tried to automate client communications. AI-drafted emails, AI-scheduled check-ins, AI-generated reports.

It felt efficient. Clients felt processed.

The emails were good. The relationships suffered. People can tell when they’re interacting with a system instead of a person. Maybe not consciously - but they feel it.

Lesson: Automate the preparation, not the connection.

The “more is better” fallacy:

When you can produce content 40x faster, the temptation is to produce 40x more content. I tried this. Blog posts, social content, email sequences - volume, volume, volume.

Results: More noise. Same signal. The metrics went up. The impact didn’t.

Lesson: Velocity is for iteration, not multiplication.

The “AI knows best” delegation:

For about two months, I over-delegated to AI. Let the system make recommendations. Follow the suggestions. Trust the analysis.

I stopped thinking critically. The output got worse.

AI is brilliant at analysis and generation. It’s not brilliant at judgment. When I stopped applying my own judgment, quality dropped.

Lesson: AI extends thinking. It doesn’t replace it.


The Honest Answer

“Will AI replace you?”

I’ve been asked this dozens of times this year. By clients, by colleagues, by people at conferences.

Here’s the honest answer:

AI will replace people who use AI as a replacement for thinking.

If your job is to produce generic content, AI does that faster and cheaper. If your job is to do research that could be automated, AI does that faster and cheaper. If your job is to process information without adding judgment, AI does that faster and cheaper.

AI won’t replace people who use AI as an extension of thinking.

If your job is to understand context that can’t be articulated, you’re safe. If your job is to build relationships that require trust, you’re safe. If your job is to exercise judgment that requires experience, you’re safe.

The question isn’t “will AI take my job?” The question is “what part of my job is judgment, and what part is processing?”

The processing part is going away. For everyone. Including me.

The judgment part is more valuable than ever. Because everyone else’s processing is going away too, and they still need judgment.

My answer to clients:

“I use AI extensively. It makes me 40x faster at building things. But the decisions about what to build, and whether it’s working - that’s twenty-six years of experience. The AI doesn’t have that. Neither does your competitor who just discovered ChatGPT.”

That’s the honest position. Some clients want it. Some don’t. The ones who want it get dramatically better results at the same price. The ones who don’t can find someone else.


What’s Different for 2026

Based on a year of evidence, here’s what’s changing:

Ruthless documentation curation.

The wiki works - but only when it’s focused. Sprawling documentation is worse than none.

2026: Every document earns its place or gets deleted. The question isn’t “is this accurate?” - it’s “does AI need this to do good work?” If no, it goes. The goal is a wiki that gives AI focus, not a library that gives AI confusion.

Fewer clients, deeper relationships.

The Ritz model - 30 covers maximum. Not because I can’t handle more. Because I shouldn’t.

2026: The focus is relationship depth, not client volume. Each client gets more attention, not more clients get divided attention.

More strategic work, less tactical execution.

AI handles tactical execution brilliantly. I don’t need to do it anymore.

2026: My time goes into strategy, relationships, and judgment calls. The stuff AI can’t do. The stuff that differentiates the business.

Transparent about the partnership.

I stopped hiding that I use AI. It’s in the newsletter. It’s in client conversations. It’s part of how I work.

2026: Full transparency. “We” means me and my AI partner. Clients who want pure-human can find it elsewhere. Clients who want results work with me.


The 2025 Summary

If I had to summarise what I learned in one paragraph:

AI is a force multiplier for capability, not a replacement for judgment. The people who treat it as a magic button get magic-button results - impressive at first, generic quickly. The people who treat it as a thinking partner get thinking-partner results - compounding improvement over time.

Here’s the thing nobody talks about: “prompting” as everyone understands it is not the way to work with AI.

The LinkedIn crowd obsesses over “killer prompts” - the perfect incantation that unlocks AI’s power. They’re missing the point entirely.

The way to work with AI is to give it as much relevant context as possible, then ask natural language questions. Not clever prompts. Context. Understanding. The real situation.

In Le Guin’s Earthsea, wizards who know the true name of something can work with it. Those who only know surface words are just shouting into the void. Same principle. Give AI the true context - your codebase, your constraints, your actual situation - and it becomes a partner. Feed it clever prompts without context, and you get clever-sounding nonsense.

The people still crafting “killer prompts” will be standing at the roadside wondering why everyone else ran past.

That’s not hype. That’s what happened when I spent a year figuring it out.


Looking Forward

2026 is going to be interesting.

Not because AI will be dramatically different - the fundamentals are established. But because the gap between “using AI well” and “using AI badly” is going to become obvious.

Everyone has access to the same tools now. The differentiation is what you do with them.

Some agencies will use AI to produce more generic work faster. Race to the bottom.

Some will use AI to do deeper work for fewer clients. Race to the top.

I know which race I’m running.

This is newsletter #52. A full year of weekly writing. Thank you for reading. Thank you for replying. Thank you for the conversations that started with “I read your newsletter and…”

Some of you have been here since week 1. Some joined recently. All of you made it worth writing. The newsletter continues next week, in 2026. Same time, same place.

Rest well. The work will be there when you get back. It always is. But for now - close the laptop, pour something decent, and remember why you started doing this in the first place.

Merry Christmas.


Tony Cooper
We Build Stores - Where 26 Years of Experience Delivers in One Hour What 26 Hours of Not Knowing Cannot

tony.cooper@webuildstores.co.uk
07963 242210


This Year: AI went from tool to partner. 40x velocity means iteration, not multiplication. Documentation became possible - but only ruthlessly curated documentation that gives AI focus, not sprawl that creates confusion. Strategic thinking got a sparring partner. The capacity model became visible. What didn’t change: relationships, quality, judgment, and business fundamentals. What failed: automating connection, producing more instead of better, over-delegating judgment. 2026: ruthless curation, deeper relationships, fewer clients, transparent partnership, more strategy, less tactics.