A central agent reads a complex task, decomposes it into subtasks at runtime, dispatches each to a specialist worker, and synthesizes the results. Unlike a Pipeline, the decomposition is generated by the model — not fixed at design time.
A venture fund runs an AI agent for investment due diligence (DD). Each new deal requires a different set of investigation workstreams: a fintech deal needs regulatory compliance analysis; a deeptech deal needs IP landscape research; a consumer deal needs cohort retention analysis. No two deals require exactly the same DD structure.
A fixed Pipeline cannot handle this — the stages differ by deal type. A human analyst reads the pitch deck and decides which workstreams to run. The Orchestrator-Workers pattern replaces this analyst: the Orchestrator reads the deck, generates a DD plan (which workstreams, in what order, what questions each must answer), dispatches specialist workers to each stream, and synthesizes the results into a memo ready for the investment committee (IC). The workers — market researcher, team profiler, financial modeler, technical assessor — are fixed. What changes each time is which workers are activated and what questions they receive.
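A minimal sketch of the flow, assuming a fixed worker set and a stubbed plan generator (in a real system the DD plan would come from a model call; the worker names, plan structure, and deal-type rules below are illustrative assumptions):

```python
from dataclasses import dataclass

# Hypothetical fixed worker registry; each worker would wrap a specialist model.
WORKERS = {"market_researcher", "team_profiler",
           "financial_modeler", "technical_assessor"}

@dataclass
class Workstream:
    worker: str            # which specialist executes this stream
    questions: list[str]   # the questions the worker must answer
    context: str           # the slice of the deck the worker needs

def generate_plan(deal_type: str) -> list[Workstream]:
    """Stand-in for the model-generated DD plan: which workers are
    activated depends on the deal, not on a fixed pipeline."""
    plan = [Workstream("market_researcher", ["TAM? Competitors?"], "deck:market")]
    if deal_type == "fintech":
        plan.append(Workstream("technical_assessor",
                               ["Regulatory exposure?"], "deck:compliance"))
    plan.append(Workstream("financial_modeler", ["Unit economics?"], "deck:financials"))
    return plan

def orchestrate(deal_type: str) -> dict[str, str]:
    sections = {}
    for ws in generate_plan(deal_type):
        assert ws.worker in WORKERS  # only dispatch to registered workers
        sections[ws.worker] = f"{ws.worker} answered {ws.questions} using {ws.context}"
    return sections  # the Synthesizer would combine these into the memo
```

The key property to notice is that `generate_plan` runs per deal, so the set of activated workers varies while the registry stays fixed.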
| Metric | Signal |
|---|---|
| Decomposition coverage | Do generated workstreams cover the full DD surface? Benchmark against expert-reviewed checklists by deal type. |
| Worker match accuracy | Are workstreams dispatched to the correct specialist? Mismatch rate from registry validation. |
| Synthesis coherence score | Cross-section consistency — how often does the IC flag contradictions in the memo? |
| Cost vs. fixed Pipeline | Does dynamic decomposition generate enough quality improvement to justify the Orchestrator overhead? |
| Node | What it does | Input | Output |
|---|---|---|---|
| DD Orchestrator | Reads pitch deck + fund thesis. Generates DD plan: which workstreams are required, what specific questions each worker must answer, what output format is expected. Dispatches workers in parallel or sequence based on dependencies. | Pitch deck + fund thesis + worker registry | DD plan: list of (worker, questions, context slice) tuples |
| Market Researcher | TAM/SAM analysis, competitive landscape, market timing. Web search + database access. | Market questions + company description | Market section: size, competitors, timing signal |
| Team Profiler | Founder backgrounds, prior exits, domain expertise, team gaps. LinkedIn, Crunchbase, news search. | Founder names + roles | Team section: credentials, red flags, network signal |
| Financial Modeler | Unit economics, growth projections, burn rate, runway. Works from financials if provided. | Financial data or estimates from pitch | Financials section: unit econ, 3-year projection, key assumptions |
| Memo Synthesizer | Combines worker outputs into IC-ready memo. Resolves contradictions across sections. Generates summary recommendation with confidence and key open questions. | All worker outputs + original pitch deck | Full IC memo: executive summary, sections, recommendation, open questions |
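The "parallel or sequence based on dependencies" dispatch in the DD Orchestrator row can be sketched as a topological sort over a dependency map; the dependencies shown here are illustrative assumptions, not prescribed by the pattern:

```python
from graphlib import TopologicalSorter

# Hypothetical dependency map from a DD plan: each workstream lists the
# workstreams whose output it needs before it can run.
deps = {
    "market_researcher": set(),
    "team_profiler": set(),
    "financial_modeler": {"team_profiler"},  # e.g. founder track record feeds the model
    "memo_synthesizer": {"market_researcher", "team_profiler", "financial_modeler"},
}

ts = TopologicalSorter(deps)
ts.prepare()
order = []
while ts.is_active():
    batch = list(ts.get_ready())  # everything in one batch can run in parallel
    order.append(sorted(batch))
    ts.done(*batch)
# order: independent workers first, then dependents, synthesizer last
```

This keeps the dependency logic in plain data (the `deps` map), which the Orchestrator can emit as part of the plan rather than hard-coding an execution order.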
| Origin of Value | Where it appears | How it is captured |
|---|---|---|
| Governance | DD Orchestrator node | The Orchestrator decides which workstreams exist, which questions each worker receives, and what counts as sufficient coverage. This is the governance act — it determines where value is generated across the entire DD process. |
| Future Cashflow | Quality of decomposition | A bad DD plan — wrong workstreams, wrong questions — loses value at the plan step, before any worker executes. The Orchestrator's judgment quality is the primary driver of overall output quality. |
| Value Transfer | Orchestrator → Worker edges | Workers capture value by executing assigned work and returning it to the Orchestrator. Their contribution is scoped — they transfer execution value back, not directly to the IC. |
| Representation | Memo Synthesizer node | The synthesized memo represents all worker contributions in a unified, coherent form. Its quality determines whether the sum of parts exceeds individual sections — synthesis is where representation value is created or destroyed. |
VCM analog: Governance Token. The Orchestrator is the DAO allocating compute resources (workers) across tasks. Workers are contributors executing allocated work — their value is proportional to task completion, not orchestration. Adding a worker to the registry is analogous to a token holder joining a protocol.
For a healthcare deal, the Orchestrator generates a DD plan that includes "regulatory pathway analysis" as a workstream — but dispatches it to the Market Researcher, who has no regulatory expertise or data access. The Market Researcher produces a plausible-looking but fabricated regulatory analysis. The memo looks complete. Fix: Orchestrator must validate each workstream assignment against the worker registry — only dispatch to workers that explicitly support the requested workstream type. Worker registry entries must include capability declarations, not just names.
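One way to implement this fix, assuming registry entries carry explicit capability sets (the worker and capability names here are hypothetical):

```python
# Hypothetical registry with capability declarations, not just names.
REGISTRY = {
    "market_researcher": {"market_sizing", "competitive_landscape"},
    "technical_assessor": {"ip_landscape", "regulatory_pathway"},
}

def validate_dispatch(worker: str, workstream_type: str) -> None:
    """Refuse any assignment the registry does not explicitly support."""
    caps = REGISTRY.get(worker)
    if caps is None:
        raise ValueError(f"unknown worker: {worker}")
    if workstream_type not in caps:
        raise ValueError(
            f"{worker} does not declare capability {workstream_type!r}; "
            "re-plan instead of dispatching")
```

Failing loudly at validation forces the Orchestrator to re-plan, rather than letting an unqualified worker fabricate a plausible-looking section.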
The Financial Modeler receives: "Analyze unit economics. Company: Acme." It has no access to the actual financial data from the pitch deck — the Orchestrator scoped the context too narrowly. The modeler fabricates unit economics from industry benchmarks. The memo presents invented numbers as analysis. Fix: Orchestrator context scoping must be explicit. For each worker, trace which facts from the source document are required and include them in the dispatch. A context audit step before dispatch prevents this.
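A minimal context-audit sketch, assuming each workstream type declares the source facts it requires (the fact names are illustrative):

```python
# Hypothetical declaration of the source facts each workstream needs.
REQUIRED_FACTS = {
    "unit_economics": {"revenue", "cac", "churn"},
}

def audit_context(workstream: str, context: dict) -> dict:
    """Block dispatch until every required fact is present, then return
    only the scoped slice the worker actually needs."""
    required = REQUIRED_FACTS.get(workstream, set())
    missing = required - context.keys()
    if missing:
        raise ValueError(
            f"cannot dispatch {workstream}: missing facts {sorted(missing)}")
    return {k: context[k] for k in required}
```

The audit cuts both ways: it refuses dispatch when facts are missing, and it strips facts the worker does not need, keeping context slices minimal.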
Market Researcher says TAM is $8B. Financial Modeler's revenue projection assumes a $1.2B addressable market. The Memo Synthesizer combines both sections without flagging the contradiction — it produces a coherent-looking memo with an internal inconsistency that a careful reader would catch immediately. Fix: Synthesizer must run a consistency check across sections before producing the final memo. Contradictions between sections must be flagged as open questions, not silently included.
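A sketch of the consistency check, assuming each worker reports its key numeric claims in a structured form; the 10% tolerance is an arbitrary illustrative threshold:

```python
def check_consistency(sections: dict[str, dict[str, float]]) -> list[str]:
    """Flag numeric claims about the same fact that disagree across sections.
    Disagreements become open questions rather than silently merged memo text."""
    open_questions = []
    claims: dict[str, tuple[str, float]] = {}
    for section, facts in sections.items():
        for fact, value in facts.items():
            if fact in claims:
                prev_section, prev_value = claims[fact]
                if abs(value - prev_value) > 0.1 * max(abs(value), abs(prev_value)):
                    open_questions.append(
                        f"{fact}: {prev_section} says {prev_value}, "
                        f"{section} says {value}")
            else:
                claims[fact] = (section, value)
    return open_questions
```

Running this before final memo assembly means the $8B-vs-$1.2B conflict surfaces as an open question for the IC instead of an embarrassment a careful reader catches first.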
| Variant | Modification | When to use |
|---|---|---|
| Sequential Orchestration | Workers execute in dependency order rather than in parallel; the Orchestrator gates each dispatch | Subtasks have hard dependencies — e.g., Team Profiler output is needed before the Financial Modeler can factor a founder track-record premium into projections |
| Adaptive Orchestration | Orchestrator revises the DD plan after each worker returns results — adds or removes workstreams based on findings | Task structure is only partially known upfront — e.g., a technical red flag triggers additional IP landscape workstream |
| Recursive Orchestration | Workers are themselves Orchestrators for deeper subtasks | Very complex tasks requiring multi-level decomposition — e.g., Market Researcher sub-orchestrates geographic market analysis |
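Adaptive Orchestration, the middle variant above, can be sketched as a work queue the Orchestrator may extend mid-run; the red-flag trigger rule here is an illustrative assumption:

```python
def run_adaptive(initial_plan: list[str], execute) -> dict:
    """Run workstreams from a queue; a finding can append new workstreams,
    revising the DD plan while it executes."""
    queue, results = list(initial_plan), {}
    while queue:
        workstream = queue.pop(0)
        finding = execute(workstream)  # worker call, stubbed by the caller
        results[workstream] = finding
        # Hypothetical trigger: a technical red flag spawns an IP workstream.
        if (finding.get("red_flag") == "technical"
                and "ip_landscape" not in results
                and "ip_landscape" not in queue):
            queue.append("ip_landscape")  # plan revised mid-run
    return results
```

The `results`/`queue` membership checks keep the revision idempotent, so a repeated red flag cannot enqueue the same workstream twice.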
| Pattern | Relationship |
|---|---|
| 10.11 Pipeline | Static alternative — if workstreams are always the same across deals, Pipeline is simpler, cheaper, and more auditable |
| 10.12 Router | Simpler routing — Router handles single-dispatch to one specialist; Orchestrator handles multi-dispatch to N specialists with a generated plan |
| 30.31 Feedback Loop | Close the loop — IC decisions on memo quality feed back to improve the Orchestrator's decomposition logic and Synthesizer's coherence |
The Orchestrator is the organizational IP. Workers are commodities — they can be swapped out for better API-served models as those appear. The Orchestrator encodes how the organization structures work: its decomposition logic, workstream taxonomy, worker registry, and context scoping rules represent accumulated institutional knowledge.
Integration = registering new worker capabilities in the Orchestrator's worker registry. Two firms using this pattern can merge at the Orchestrator layer by expanding the worker registry to include both firms' specialists — no architectural change required.
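If registry entries are capability sets, the merge is a union (a sketch; real registries would also need to reconcile auth, data access, and naming collisions):

```python
def merge_registries(a: dict[str, set], b: dict[str, set]) -> dict[str, set]:
    """Merge two firms' worker registries at the Orchestrator layer:
    same-named workers pool their declared capabilities."""
    merged = {worker: set(caps) for worker, caps in a.items()}
    for worker, caps in b.items():
        merged[worker] = merged.get(worker, set()) | caps
    return merged
```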
Red flag: an Orchestrator with no formal worker registry and no documented workstream taxonomy is opaque — it decomposes differently on every run with no reproducibility. Due diligence cannot determine what it actually does without extensive trace-log review. This is a valuation risk, not a technical curiosity.