Why Your Firm's AI Use Isn't Compounding (And What to Do About It)
The difference between ad hoc AI use and a systematic approach — and why it matters more than the tools you pick.

Most professional services firms are using AI, and using it heavily. People draft documents with it, summarize research, answer questions, and finish tasks in half the time they used to take. And yet, six months in, the firm isn't meaningfully better at anything. The AI use is real, but nothing is accumulating.
This is the ad hoc trap. Individual people are getting value from AI, but none of it is becoming institutional. When someone figures out a great prompt, it lives in their browser history. When they leave, or just switch computers, it's gone. The next person starts from scratch.
What compounding actually looks like
The difference between using AI and building with it
Compounding AI use has a specific shape. Someone figures out the right way to prompt for a client deliverable. That prompt gets saved and shared. Someone else uses it, improves it, and puts the better version back. Over time, the firm's output quality on that task improves - not because everyone got smarter, but because the learning didn't evaporate.
That's infrastructure. And it's the thing that separates firms that are genuinely ahead on AI from firms that are just busy with it.
Ad hoc AI use feels like progress because it produces better individual outputs. But it doesn't build anything. The firm runs on whatever the median person happens to know, in whatever AI tool they happen to prefer, using whatever prompt they happened to try first.
Why it stays ad hoc
The three reasons firms don't systematize
Nobody owns it
Systematizing AI use isn't anyone's job. The managing partner is busy. The person who uses AI most isn't thinking about it structurally. "We should share prompts" becomes an item on a list that never gets prioritized.
There's no obvious place to start
When you're trying to systematize something as broad as "how we use AI," the scope is paralyzing. Where do you even begin? What counts as a good prompt to save? What tool should the library live in?
The immediate ROI is already there
Individual AI use is already producing value. It's easy to stop at "this is working" without asking whether it could compound. The gap between good-enough and actually-building-something is invisible until you step back from it.
What to do about it
The three pieces of infrastructure that compound
Systematizing AI use doesn't require a technology project or a six-month rollout. It requires three things, done in order.
A shared standard for what a good prompt looks like
If everyone on your team has a different mental model of how to prompt, you can't share prompts effectively - they won't know how to read or adapt them. Getting everyone to the same baseline understanding of prompt structure is the prerequisite for everything else. This is what a session like Apparatus 101 is designed to do.
A prompt library
A place where prompts that work actually live, organized by workflow, maintained by someone who owns it. Not a folder. A system people can find and trust. A library with ten good prompts that get used beats a folder with fifty that nobody opens.
A norm around contribution
The library only stays useful if it grows. The simplest norm: if you use a prompt three times and it works, it goes in. That threshold keeps out experiments without requiring formal evaluation. The library gets better as the team uses it.
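If the library ever graduated from a shared document into light tooling, one entry might be modeled roughly as below. This is a sketch under assumptions, not a prescribed schema: the class name, field names, and the three-uses threshold check are all illustrative, with the threshold taken from the contribution norm described above.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class PromptEntry:
    """One hypothetical record in a firm's prompt library."""
    name: str            # e.g. "Client engagement letter, first draft"
    workflow: str        # which firm workflow the prompt supports
    owner: str           # who maintains and curates this entry
    prompt: str          # the prompt text itself
    times_used: int = 0  # tracked to support the contribution norm
    last_updated: date = field(default_factory=date.today)

    def record_use(self) -> None:
        # Called each time someone uses the prompt successfully.
        self.times_used += 1

    def meets_contribution_bar(self) -> bool:
        # The norm from the article: three successful uses earns
        # a permanent place in the library.
        return self.times_used >= 3
```

The point of the sketch is the shape, not the code: each prompt has an owner, lives under a workflow, and crosses an explicit usage threshold before it counts as library material.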
The timeline question
Why this matters more now than it did a year ago
A year ago, the gap between firms using AI and firms not using AI was meaningful. Today, almost everyone is using it. The gap that's opening up now is between firms with infrastructure and firms without it.
Firms with a shared prompt library, a common standard for output quality, and a norm around sharing - those firms are improving continuously. Their AI use gets better every month because every good prompt someone writes makes the next person better too.
Firms still in the ad hoc phase are roughly the same as they were a year ago, just with more individual users. The technology isn't the differentiator anymore. The infrastructure around it is.
