
Glossary: AI Infrastructure Terms Every Firm Leader Should Know

Skills, Cowork, MCP, agents, orchestration — defined plainly for firm leaders who need the concepts without the jargon.

February 2026 · 8 min read

These are the terms that come up regularly in conversations about firm-level AI infrastructure. Most have technical definitions that are not particularly useful for firm leaders. The definitions below are built for practitioners who need to understand the concepts well enough to make decisions — not to implement them.

Skill

A reusable unit of AI capability, built from a prompt but designed for others to use. A Skill specifies what input it needs, what it does with that input, what the output looks like, and what situation it is meant for. Unlike a one-off prompt, a Skill can be handed to someone who did not write it and produce reliable results.

Skills Library

An organized collection of Skills that a firm's team can access and run. A good Skills library has naming conventions, use-case descriptions, ownership assignments, and governance for how Skills are added, updated, and retired. A prompt folder is not a Skills library unless it has those things.

Cowork

An extended AI session in which AI handles a multi-step task while you provide direction and judgment at key points — rather than a single prompt-and-response exchange. Cowork is appropriate for longer, more complex tasks: research synthesis, document review, multi-section reports. The quality of a Cowork session depends on the quality of the briefing.

MCP (Model Context Protocol)

A standard that allows AI to connect to and interact with external systems: your Google Drive, your Notion workspace, legal databases, financial data feeds, CRM platforms. MCP is the underlying technology that makes data connections possible. You do not need to understand how it works; you need to understand what it enables.

MCP Connector

A specific integration that links an AI tool to a particular data source or software system via MCP. A Google Drive connector lets AI search your Drive. A legal database connector lets AI query case law. Connectors are typically published by the data provider or by Anthropic; most do not require developer involvement to set up.

Data Connection

The live link between an AI environment and an external data source. When a connection is in place, AI can retrieve information from that source on demand rather than requiring you to manually find and paste the information into context. A connection to your deliverables archive means AI can search past work; a connection to a market data feed means AI can query current prices.

Context Window

The amount of text an AI model can hold in active memory at once — the conversation, any documents you have loaded, the instructions in the Project, and whatever the model has retrieved from connected sources. Context windows are large but not infinite. Knowing the limit matters when you are deciding whether to load documents into context directly or access them through a data connection.

Claude Project

A persistent AI environment set up with your firm's context: instructions, documents, tone guidelines, and any Skills or data connections you have configured. Everyone who uses the Project starts each session with the same foundation. Unlike a fresh chat, a Project carries your firm's context forward automatically.

Orchestration

The coordination of multiple AI actions or agents to complete a task that is too complex for a single model call. Orchestration is what happens in advanced AI systems where one AI instructs others, checks their work, and assembles the results. This is 303-level work — it requires the infrastructure that 202 builds.

Agent

An AI system that can take actions — not just generate text. An agent can search the web, query a database, write and execute code, send a message, or take steps in a software interface. Building reliable agents requires documented workflows, working data connections, and clear success criteria. You do not need agents until those things are in place.

Firm OS

An informal term for the integrated set of AI infrastructure a firm builds: Skills library, data connections, documented workflows, and governance. A firm OS is not a product you buy; it is what you end up with after doing the work in 202. It is the foundation on which custom tools are built in 303.

Workflow Documentation

A written description of how a task gets done, specific enough that someone who has never done it before could follow it. In the context of AI infrastructure, workflow documentation includes which Skills to use, in what order, with what inputs, and at what points human judgment is required. The goal is institutional knowledge that survives when a person changes roles.

For foundational AI terms (LLMs, prompts, Projects, hallucination, tokens), see the 101-level glossary. The terms above build on that foundation.

Next step

Ready to turn what your team knows into something that lasts?

Apparatus 202 gives your team a Skills library, connected data sources, and workflows the firm owns. One session — no ongoing subscription.