Project Analysis

Single-file analysis is useful for quick checks, but real insights come from analyzing entire projects — where dependency patterns, cross-file violations, and aggregate metrics reveal the true health of a codebase.

Running a project report

zen reports path/to/project

This scans all supported files in the directory (recursively), runs the detection pipeline for each language, and produces:

  • Per-file violation tables — grouped by language
  • Severity breakdown — how many violations at each level
  • Project summary — total violations, top offenders, and dominant issue categories

Export options

zen reports path/to/project --out report.md
Human-readable report with Rich formatting preserved.

zen reports path/to/project --export-json out/report.json
Structured output for CI artifacts, dashboards, or custom tooling.

zen reports path/to/project \
  --out report.md \
  --export-json report.json \
  --export-markdown report-export.md \
  --export-log report.log
All formats simultaneously.
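The JSON export is the natural entry point for custom tooling. As a minimal sketch, the snippet below tallies violations by severity from a parsed report; note that the field names used here ("files", "violations", "severity") are assumptions about the export layout, not a documented schema, so check them against a real report.json first.

```python
from collections import Counter


def summarize(report: dict) -> Counter:
    """Tally violations by severity from a parsed report.

    The keys "files", "violations", and "severity" are assumed,
    not taken from a documented schema.
    """
    tally = Counter()
    for file_entry in report.get("files", []):
        for violation in file_entry.get("violations", []):
            tally[violation.get("severity", "unknown")] += 1
    return tally


# Hypothetical fragment of an exported report:
sample = {
    "files": [
        {"path": "app/models.py",
         "violations": [{"severity": "high"}, {"severity": "low"}]},
        {"path": "app/views.py",
         "violations": [{"severity": "high"}]},
    ]
}
print(summarize(sample))  # Counter({'high': 2, 'low': 1})
```

In practice you would replace the hand-built sample with `json.load(open("report.json"))` after running the export command above.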

CI integration

zen reports . --export-json report.json --quiet
  • --quiet suppresses Rich panels and banners — clean output for CI logs
  • Exit code 1 if violations exceed the configured severity_threshold
  • Attach report.json as a build artifact for review
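The built-in exit code already gates on the configured severity_threshold; if you need a stricter or different policy in CI, you can apply one to the exported artifact instead. The sketch below is one way to do that; the severity labels and JSON field names are assumptions, not part of a documented contract.

```python
# Assumed severity labels, lowest to highest -- verify against real output.
SEVERITY_RANK = {"low": 0, "medium": 1, "high": 2}


def gate(report: dict, min_severity: str = "high", max_allowed: int = 0) -> int:
    """Return a CI exit code: 1 if too many violations at or above min_severity.

    The keys "files", "violations", and "severity" are assumptions
    about the exported JSON, not a documented schema.
    """
    threshold = SEVERITY_RANK[min_severity]
    count = sum(
        1
        for file_entry in report.get("files", [])
        for violation in file_entry.get("violations", [])
        if SEVERITY_RANK.get(violation.get("severity"), 0) >= threshold
    )
    return 1 if count > max_allowed else 0
```

A CI step would then end with something like `sys.exit(gate(json.load(open("report.json"))))` after the report run.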

Including prompts in reports

zen reports path/to/project --include-prompts --out full-report.md

Adds remediation prompts inline with each violation — useful for review-ready documents.

Language filtering

zen reports path/to/project --language python

Analyze only files matching a specific language. Useful for focused reviews or when only one language is configured.

Perspective-filtered reports

The report command supports the same public perspectives as the rest of the runtime:

  • all — full rule-first result with every surfaced violation
  • zen — rule-level result without dogma-analysis payloads
  • dogma — dogma-focused result with universal dogma analysis preserved
  • testing — violations linked to the detected testing family for matching test files
  • projection — violations linked to an explicit projection-family target supplied with --as

zen reports tests --perspective testing --out testing-only.md
zen reports src/frontend --language react --perspective projection --as nextjs

Dogma perspective is available

--perspective dogma keeps universal dogma analysis attached and filters the projected result down to dogma-relevant violations.

What project analysis reveals

Unlike single-file linting, project-level analysis can detect:

  • Cross-file dependency patterns (Python only, with AST analysis)
  • Aggregate complexity trends — which modules are the most complex
  • Systemic issues — patterns that repeat across many files
  • Coverage gaps — files or directories with no analysis results

Perspective filtering is especially useful once a repository spans many languages and framework analyzers: you can keep one broad baseline report for the whole worktree, then generate narrower testing or projection views for the subsets that matter to a given rollout.

MCP + CLI remediation loop

  1. Run zen reports path/to/project --export-json report.json to create a baseline artifact.
  2. In your MCP client, call generate_agent_tasks for the project path (optionally with a min_severity filter) so the agent can focus on top offenders.
  3. Apply fixes in focused batches (high severity first), then re-run zen reports.
  4. Track score movement over time using the exported JSON in CI artifacts.
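To track movement between runs (step 4), you can diff two exported artifacts. The helper below counts total violations per report and reports the change against a baseline; as before, the "files"/"violations" keys are an assumed layout, not a documented schema.

```python
def total_violations(report: dict) -> int:
    # "files" and "violations" are assumed keys in the exported JSON.
    return sum(len(f.get("violations", [])) for f in report.get("files", []))


def delta(baseline: dict, current: dict) -> int:
    """Change in violation count; negative means the codebase improved."""
    return total_violations(current) - total_violations(baseline)


# Hypothetical artifacts from two CI runs:
baseline = {"files": [{"violations": [{"severity": "high"}, {"severity": "low"}]}]}
current = {"files": [{"violations": [{"severity": "low"}]}]}
print(delta(baseline, current))  # -1
```

Run this over the report.json artifacts attached to successive builds to get a simple trend line without any extra infrastructure.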

See Also