Progressive discovery is a client-side pattern for managing tool context in MCP-connected agents. Instead of loading every tool schema from every connected server into the model's context window at once, the client provides a single meta-tool (like tool_search) that lets the model look up and load tool schemas on demand.
The problem it solves is straightforward: as agents connect to more MCP servers, the combined tool schemas consume significant context. A dozen servers with 10 tools each means 120 tool definitions competing for attention before the model has even seen the user's message. Most of those tools are irrelevant to any given turn.
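The arithmetic is worth making concrete. A back-of-envelope sketch (the per-schema token figure is an assumption for illustration; real schemas vary widely in size):

```python
# Rough context cost of loading every tool schema upfront.
# tokens_per_schema is an assumed average, not a measured number.
servers = 12
tools_per_server = 10
tokens_per_schema = 400

total_tools = servers * tools_per_server
upfront_tokens = total_tools * tokens_per_schema
print(total_tools, upfront_tokens)  # 120 48000
```

At these assumed sizes, tens of thousands of tokens are spent on tool definitions before the conversation begins, and that cost is paid on every turn.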
How It Works
- At connection time, the client downloads tool schemas from all connected MCP servers but does not inject them into the model's context
- The client provides one meta-tool — a tool search/lookup function — that the model can call to discover available tools by keyword or category
- When the model determines it needs a capability, it calls the search tool, receives matching schemas, and those schemas are loaded into context for that turn
- The model then calls the actual tool with full schema awareness
This adds one extra inference step on turns where new tools are needed, but dramatically reduces baseline context usage on every turn.
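The steps above can be sketched as a client-side loop. Everything here is illustrative: `REGISTRY`, `tool_search`, and `tools_for_turn` are hypothetical names, not part of any MCP SDK, and the model's decision to search is simulated by a direct call.

```python
# Schemas downloaded from MCP servers at connect time, held client-side
# and NOT injected into model context (a simplified stand-in registry).
REGISTRY = {
    "calendar.create_event": {"description": "Create a calendar event"},
    "calendar.list_events": {"description": "List upcoming calendar events"},
    "billing.refund": {"description": "Issue a refund for an order"},
}

# The single always-loaded meta-tool.
SEARCH_TOOL = {"name": "tool_search", "description": "Find tools by keyword"}

def tools_for_turn(loaded: set[str]) -> list[dict]:
    """Context starts with only the meta-tool; discovered schemas join it."""
    return [SEARCH_TOOL] + [{"name": n, **REGISTRY[n]} for n in sorted(loaded)]

def tool_search(query: str) -> list[str]:
    """Match tool names/descriptions against a keyword query."""
    q = query.lower()
    return [n for n, meta in REGISTRY.items()
            if q in n or q in meta["description"].lower()]

loaded: set[str] = set()
print(len(tools_for_turn(loaded)))  # 1 -- just the meta-tool in context

# The model decides it needs calendar capability and calls the meta-tool;
# matching schemas are loaded for the rest of the turn.
loaded.update(tool_search("calendar"))
print(len(tools_for_turn(loaded)))  # 3 -- meta-tool + two calendar schemas
```

The `billing.refund` schema never enters context on this turn, which is the whole point: irrelevant tools cost nothing.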
Implementation Examples
Claude Code ships this pattern natively. Before progressive discovery, all MCP tool schemas were loaded upfront. After adoption, Claude Code showed a "massive reduction" in tool context usage per the MCP team's own measurements.
Anthropic API supports this via the standard tool-use flow — clients can implement the meta-tool pattern themselves without special API features.
Self-built clients can implement this with any model that supports tool use: maintain a local tool registry, expose a search function as the only always-loaded tool, and inject matched schemas dynamically.
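A minimal sketch of that self-built approach. The `tool_search` definition, `handle_tool_call` helper, and registry shape are all hypothetical; the tool-definition fields (`name`, `description`, `input_schema`) follow the Anthropic Messages API `tools` format, but any tool-use-capable model API could substitute its own shape.

```python
# The one always-loaded tool, in Anthropic-style tool-use format.
SEARCH_TOOL_DEF = {
    "name": "tool_search",
    "description": "Search the registry of available tools by keyword. "
                   "Returns matching tool names; their schemas are then "
                   "loaded so you can call them directly.",
    "input_schema": {
        "type": "object",
        "properties": {"query": {"type": "string"}},
        "required": ["query"],
    },
}

def handle_tool_call(name: str, args: dict, registry: dict,
                     active_tools: list[dict]) -> dict:
    """Dispatch one tool call. A tool_search call mutates the active
    tool list, injecting matched schemas for subsequent inference steps."""
    if name == "tool_search":
        q = args["query"].lower()
        matches = [d for d in registry.values()
                   if q in d["name"] or q in d["description"].lower()]
        active_tools.extend(m for m in matches if m not in active_tools)
        return {"matches": [m["name"] for m in matches]}
    # Real tools would dispatch to their owning MCP server here.
    raise KeyError(f"unknown tool: {name}")

# Usage: registry holds full schemas; context starts with only the meta-tool.
registry = {
    "notes.search": {"name": "notes.search",
                     "description": "Full-text search over saved notes",
                     "input_schema": {"type": "object"}},
}
active = [SEARCH_TOOL_DEF]
result = handle_tool_call("tool_search", {"query": "notes"}, registry, active)
print(result["matches"], len(active))  # ['notes.search'] 2
```

The key design choice is that `active_tools` is per-conversation state owned by the client: the model never sees the registry, only the narrow slice its own searches have pulled in.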
When to Use It
Progressive discovery matters when:
- An agent connects to 3+ MCP servers or has 20+ total tools
- Context window pressure is a concern (cost, performance, or attention dilution)
- Tools are specialized enough that most are irrelevant on any given turn
For small tool sets (under 10-15 tools), the overhead of the extra lookup step likely isn't worth it — just load everything.
Relevance to MythOS MCP
📝 MythOS MCP currently ships ~20 tools. As the tool surface grows (graph traversal, community features, template operations), progressive discovery becomes the pattern that prevents context bloat on the client side. MythOS already practices a form of this conceptually — get_context loads augmentation context on demand rather than embedding it in every tool response — but the formal pattern applies at the tool schema level.
This is one of those patterns that seems obvious in retrospect but required the ecosystem to hit a wall first. The MCP team presented it as the No. 1 thing client developers need to build in 2026. It's also a good reminder that protocol design and client design are separate problems — MCP puts information on the wire, but the client decides how much of it the model sees.
