# Detect → Profile → Act
Every high-quality docs change starts the same way: identify the framework, resolve how the relevant primitives behave there, then write or validate with that context. This is the core workflow that keeps mcp-zen-of-docs predictable.
## The problem
- `!!! note`, `:::note`, and framework-specific component syntax can all mean the same thing.
- AI models train on all of those forms at once, so generic output is often a blend.
- Prompting harder does not solve that. Reading the project first does.
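For context, the first two forms are real, incompatible dialects: `!!! note` is the Python-Markdown/MkDocs admonition style, while `:::note` is the Docusaurus directive style. The same callout looks like this in each:

```markdown
<!-- MkDocs / Material-style admonition -->
!!! note

    Indented body text belongs to the admonition.

<!-- Docusaurus-style admonition -->
:::note

Body text sits between the opening and closing markers.

:::
```

A model that has seen both in training will happily emit either one into either framework, which is exactly the blend this workflow prevents.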
## The pattern
```mermaid
flowchart LR
    D["① Detect\n\nInspect project signals.\nReturn framework, confidence, support level, and detected primitives."]
    P["② Profile\n\nResolve one of 22 canonical primitives.\nReturn support data or a rendered snippet."]
    A["③ Act\n\nScaffold, generate, validate, or onboard using the detected context."]
    D -->|framework + signals| P
    P -->|support or snippet| A
```
## Step 1 — Detect
What it does: scans `project_root` and returns the most likely documentation framework, plus supporting evidence.
What matters most in the response:

- `framework`
- `support_level`
- `confidence`
- `matched_signals`
- `authoring_primitives`
Example shape from `detect(mode="context")`:

```json
{
  "status": "success",
  "tool": "detect_docs_context",
  "project_root": ".",
  "framework_detection": {
    "best_match": {
      "framework": "zensical",
      "support_level": "full",
      "confidence": 1.0,
      "matched_signals": ["zensical.toml", "pyproject:zensical"],
      "authoring_primitives": ["frontmatter", "heading-h1", "admonition", "code-fence", "..."]
    }
  }
}
```
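A consumer of this response typically gates on the confidence score before trusting the match. Here is a minimal sketch; the field names come from the example above, but the `0.8` threshold is an assumption, not a toolchain default:

```python
def pick_framework(response: dict, min_confidence: float = 0.8):
    """Return the detected framework name, or None if confidence is too low."""
    best = response["framework_detection"]["best_match"]
    if best["confidence"] >= min_confidence:
        return best["framework"]
    return None

# Shape mirrors the detect(mode="context") example above.
detection = {
    "status": "success",
    "framework_detection": {
        "best_match": {"framework": "zensical", "confidence": 1.0},
    },
}
print(pick_framework(detection))  # zensical
```

Below the threshold, the right move is to fall back to asking the user rather than guessing a framework.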
## Step 2 — Profile
What it does: resolves how a specific primitive behaves in a specific framework.
The profile tool has three useful modes:
- `show` — see the catalog of primitives and framework capability information
- `resolve` — inspect one primitive in one framework
- `translate` — compare a primitive across two frameworks
If you want support data only:

```json
{
  "tool": "profile",
  "arguments": {
    "mode": "resolve",
    "framework": "docusaurus",
    "primitive": "tabs"
  }
}
```
Example response shape:

```json
{
  "status": "success",
  "tool": "resolve_primitive",
  "framework": "docusaurus",
  "primitive": "tabs",
  "mode": "support",
  "support_lookup": {
    "support_level": "partial"
  },
  "render_result": null
}
```
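The `support_level` field is what drives the authoring decision. A minimal sketch of that branch, using the field names from the response above (the strategy strings are illustrative, not tool output):

```python
def choose_strategy(profile: dict) -> str:
    """Map a resolve_primitive support level to an authoring strategy."""
    level = profile["support_lookup"]["support_level"]
    if level == "full":
        return "use the native syntax"
    if level == "partial":
        return "use the native syntax, but note the limitations"
    return "fall back to plain Markdown"

response = {"support_lookup": {"support_level": "partial"}}
print(choose_strategy(response))  # use the native syntax, but note the limitations
```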
If you want a framework-native snippet, switch to `resolution_mode="render"`:

```json
{
  "tool": "profile",
  "arguments": {
    "mode": "resolve",
    "framework": "zensical",
    "primitive": "admonition",
    "resolution_mode": "render",
    "topic": "Prerequisites"
  }
}
```
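For illustration only, a rendered snippet for that request would look something like this (consistent with the `!!! note ...` snippet in the end-to-end example; the exact generated body is up to the tool):

```markdown
!!! note "Prerequisites"

    Body text for the Prerequisites admonition goes here.
```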
## Step 3 — Act
What it does: applies the resolved context to a real job.
Typical next actions:
- `scaffold` to create or enrich a page
- `generate` to produce diagrams, changelogs, or reference material
- `validate` to check structure and quality
- `onboard` to initialize a full docs workflow in one pass
Once a primitive is resolved, the assistant no longer needs to guess whether tabs are native, partial, or unsupported in that framework.
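The three steps compose naturally. A sketch of the whole chain, where `call_tool` is a stand-in for whatever MCP client you use and the argument shapes follow the examples on this page:

```python
def improve_page(call_tool, doc_path):
    """Detect the framework, profile the primitive, then act on the page."""
    # ① Detect: read the project's own signals.
    detection = call_tool("detect", {"mode": "context", "project_root": "."})
    framework = detection["framework_detection"]["best_match"]["framework"]

    # ② Profile: resolve the primitive we intend to use before touching the page.
    call_tool("profile", {
        "mode": "resolve",
        "framework": framework,
        "primitive": "admonition",
        "resolution_mode": "render",
    })

    # ③ Act: enrich the page with the resolved context.
    return call_tool("scaffold", {"mode": "enrich", "doc_path": doc_path})
```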
## End-to-end example
```mermaid
sequenceDiagram
    participant U as User
    participant A as AI agent
    participant D as detect
    participant P as profile
    participant S as scaffold
    U->>A: "Improve the quickstart page"
    A->>D: detect(project_root=".")
    D-->>A: {framework: "zensical", matched_signals: ["zensical.toml", ...]}
    A->>P: profile(mode="resolve", framework="zensical", primitive="admonition", resolution_mode="render")
    P-->>A: {support_level: "full", snippet: "!!! note ..."}
    A->>S: scaffold(mode="enrich", doc_path="docs/quickstart.md", ...)
    S-->>A: page updated
    A-->>U: "Done — the page now uses the native syntax for this framework."
```
## Why this beats generic prompting
The important gain is not eloquence. It is constraint. The workflow anchors every docs edit to the repository's actual framework signals before content is generated.
## What to read next
- **Authoring Primitives**: see the full 22-primitive vocabulary the toolchain works with.
- **profile tool reference**: learn the exact mode and parameter shapes for support lookups and rendered snippets.
- **Quickstart**: follow the same pattern in a practical setup flow.