Tools

batch_analyze

High-performance bulk file analysis with parallel processing

The batch_analyze tool analyzes multiple files at once, producing complexity, quality, and structure metrics for each. With parallel processing it can assess large numbers of files quickly and aggregate the results into codebase-wide insights.

How It Works

Stellarion uses a Rayon-powered parallel processing engine with a configurable pool of worker threads (4 by default, up to 15) to analyze files concurrently, achieving 40-60% faster performance than sequential analysis. Files are matched using glob patterns and analyzed with real complexity metrics from the ComplexityAnalyzer, and the results are aggregated, with output automatically truncated to stay within token limits.
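
The sketch below illustrates the general shape of that pipeline. It is a conceptual TypeScript rendering only (the actual engine is Rust/Rayon), and analyzeFile is a hypothetical stand-in for the ComplexityAnalyzer:

import { readFile } from "node:fs/promises";

type FileMetrics = { file: string; cyclomatic: number; loc: number };

// Hypothetical stand-in for the ComplexityAnalyzer; here it only counts lines.
async function analyzeFile(path: string): Promise<FileMetrics> {
  const text = await readFile(path, "utf8");
  return { file: path, cyclomatic: 0, loc: text.split("\n").length };
}

async function batchAnalyze(files: string[], workers = 4, limit = 50) {
  const results: FileMetrics[] = [];
  let next = 0;
  // Worker pool: `workers` analyses in flight at once (the workers parameter).
  await Promise.all(
    Array.from({ length: workers }, async () => {
      while (next < files.length) {
        const file = files[next++];
        try {
          results.push(await analyzeFile(file));
        } catch {
          // With continueOnError, a failing file is skipped instead of aborting the run.
        }
      }
    }),
  );
  // Aggregation (aggregate: true) and truncation to the result limit.
  const avg =
    results.reduce((sum, r) => sum + r.cyclomatic, 0) / (results.length || 1);
  return { results: results.slice(0, limit), averageComplexity: avg };
}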

When to Use

  • Analyzing many files at once: Get metrics for an entire directory or module
  • Getting aggregated metrics: Understand overall complexity across a codebase
  • Comparing files by quality: Find the best and worst files in a set
  • Quality gate enforcement: Check if files exceed complexity thresholds
  • CI/CD integration: Automated quality checks in pipelines

Parameters

Parameter | Type | Required | Default | Description
pattern | string | No | - | Single glob pattern (e.g., "src/**/*.ts")
patterns | array | No | - | Multiple glob patterns
files | array | No | - | Explicit list of file paths
operations | array | No | ["complexity"] | Operations to run: "complexity", "quality", "structure"
parallel | boolean | No | false | Enable parallel processing
aggregate | boolean | No | false | Compute aggregated metrics
thresholds | object | No | - | Violation thresholds for quality gates
limit | number | No | 50 | Maximum results to return
workers | number | No | 4 | Number of parallel workers (1-15)
outputFormat | string | No | json | json or csv
compare | boolean | No | false | Compare results across files
continueOnError | boolean | No | false | Continue if individual files fail

MCP Command Syntax

mcp__stellarion__batch_analyze pattern:"src/**/*.ts" operations:["complexity","quality"] parallel:true aggregate:true
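
When calling the server programmatically rather than through the shorthand above, the same invocation maps onto a standard MCP tools/call request, roughly as follows (the exact envelope depends on your client):

{
  "method": "tools/call",
  "params": {
    "name": "batch_analyze",
    "arguments": {
      "pattern": "src/**/*.ts",
      "operations": ["complexity", "quality"],
      "parallel": true,
      "aggregate": true
    }
  }
}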

Operations

Complexity Analysis

Calculate metrics for each file:

  • Cyclomatic complexity
  • Cognitive complexity
  • Lines of code (LOC/SLOC)
  • Maintainability index
  • Nesting depth
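
As a point of reference, cyclomatic complexity roughly counts the independent paths through a function (decision points plus one). This small TypeScript example is illustrative rather than Stellarion's exact counting rules; it has a cyclomatic complexity of 4:

function sumValidScores(scores: number[]): number {
  let total = 0;
  for (const s of scores) {   // +1 (loop)
    if (s < 0) {              // +1 (branch)
      continue;
    }
    if (s > 100) {            // +1 (branch)
      continue;
    }
    total += s;
  }
  return total;               // +1 (base path) => 4
}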

Quality Assessment

Score each file:

  • Maintainability score (0-100)
  • Issue detection
  • Quality classification (excellent, good, needs attention, poor)

Structure Extraction

Count elements in each file:

  • Function count
  • Class count
  • Interface count
  • Method count
  • Language detection

Examples

Analyze All TypeScript Files

Natural Language:

Analyze complexity of all TypeScript files in src/

Direct MCP Call:

mcp__stellarion__batch_analyze pattern:"src/**/*.ts" operations:["complexity"] parallel:true

Multiple Operations with Aggregation

Natural Language:

Run complexity and quality analysis on src/ and give me summary statistics

Direct MCP Call:

mcp__stellarion__batch_analyze pattern:"src/**/*.ts" operations:["complexity","quality"] parallel:true aggregate:true

Returns: Per-file results plus aggregated metrics (average complexity, total issues, etc.)


Quality Gate with Thresholds

Natural Language:

Check if any files in src/ have complexity over 20

Direct MCP Call:

mcp__stellarion__batch_analyze pattern:"src/**/*.ts" operations:["complexity"] thresholds:{"complexity":20} parallel:true

Returns: List of files exceeding the threshold (violations)


Analyze Specific Files

Natural Language:

Compare the complexity of these service files

Direct MCP Call:

mcp__stellarion__batch_analyze files:["src/services/userService.ts","src/services/orderService.ts"] operations:["complexity","quality"] compare:true

Multiple Glob Patterns

Natural Language:

Analyze all TypeScript and JavaScript files in the project

Direct MCP Call:

mcp__stellarion__batch_analyze patterns:["src/**/*.ts","src/**/*.js"] operations:["complexity","structure"] parallel:true

CSV Export

Natural Language:

Export complexity metrics for all files to CSV format

Direct MCP Call:

mcp__stellarion__batch_analyze pattern:"src/**/*.ts" operations:["complexity"] outputFormat:csv

Returns:

file,cyclomatic,cognitive,maintainability,loc
src/a.ts,12,8,75,145
src/b.ts,25,18,62,320

Full Structure Analysis

Natural Language:

Count all functions, classes, and methods in the project

Direct MCP Call:

mcp__stellarion__batch_analyze pattern:"src/**/*.ts" operations:["structure"] parallel:true aggregate:true

High Performance with Many Workers

Natural Language:

Analyze this large codebase using maximum parallelism

Direct MCP Call:

mcp__stellarion__batch_analyze pattern:"**/*.ts" operations:["complexity"] parallel:true workers:10 limit:100

Output Format

Per-File Results

Each file includes:

Field | Description
File path | Location of the file
Cyclomatic complexity | Total complexity score
Cognitive complexity | Understanding difficulty
Maintainability index | Score from 0-100
Lines of code | LOC and SLOC counts
Quality rating | excellent, good, needs attention, poor
Function count | Number of functions (structure operation)
Class count | Number of classes (structure operation)
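
A single per-file entry looks roughly like the following; the field names mirror the CSV columns shown in the export examples, though the JSON output may name or nest them differently:

{
  "file": "src/auth.ts",
  "cyclomatic": 12,
  "cognitive": 8,
  "maintainability": 75,
  "loc": 145,
  "quality": "good"
}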

Aggregated Metrics

When aggregate: true:

Metric | Description
Total files | Number of files analyzed
Average complexity | Mean cyclomatic complexity
Max complexity | Highest complexity found
Total issues | Sum of quality issues
Files by quality | Count per quality level
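
As an illustration, an aggregate block covering those metrics might look like this (the field names and values below are hypothetical; check the actual response for the exact keys):

{
  "total_files": 54,
  "average_complexity": 9.8,
  "max_complexity": 35,
  "total_issues": 27,
  "files_by_quality": { "excellent": 12, "good": 30, "needs_attention": 9, "poor": 3 }
}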

Threshold Violations

When thresholds are set:

Field | Description
File path | File exceeding threshold
Actual value | The measured value
Threshold | The configured limit
Violation type | What threshold was exceeded

Quality Gates for CI/CD

Use thresholds for automated quality checks:

mcp__stellarion__batch_analyze pattern:"src/**/*.ts" thresholds:{"complexity":20} operations:["complexity"] parallel:true

Returns:

{
  "violations": [
    {
      "file": "src/complex.ts",
      "complexity": 35,
      "threshold": 20
    }
  ],
  "passed": false
}

Use the violations array and passed field for CI/CD integration.
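
One way to wire this into a pipeline is a small gate script. The TypeScript sketch below assumes the tool's JSON response has been saved to batch-analyze.json (a path chosen here for illustration) and fails the build on any violation:

import { readFileSync } from "node:fs";

// Read the saved batch_analyze response and enforce the quality gate.
const report = JSON.parse(readFileSync("batch-analyze.json", "utf8"));

if (!report.passed) {
  for (const v of report.violations ?? []) {
    console.error(`${v.file}: complexity ${v.complexity} exceeds threshold ${v.threshold}`);
  }
  process.exit(1); // non-zero exit fails the CI job
}
console.log("Quality gate passed.");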

Performance Tips

  1. Enable parallel processing: Use parallel: true for 40-60% faster analysis
  2. Adjust worker count: Use workers: 8 for large codebases
  3. Use glob patterns efficiently: More specific patterns run faster
  4. Set appropriate limits: Cap results with limit (default 50) for very large analyses
  5. Use aggregation for summaries: aggregate: true gives an overview without per-file detail
  6. Continue on errors: continueOnError: true for resilient batch processing
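
Putting several of these tips together, a resilient run over a large codebase might look like:

mcp__stellarion__batch_analyze pattern:"src/**/*.ts" operations:["complexity"] parallel:true workers:8 limit:50 aggregate:true continueOnError:true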

Performance Comparison

Approach | Time for 54 files
Sequential (file by file) | 30-60 seconds
batch_analyze (4 workers) | 5-10 seconds
batch_analyze (8 workers) | 3-6 seconds

Output Formats

JSON

Full structured data with all metrics:

{
  "results": [...],
  "aggregate": {...},
  "violations": [...],
  "metadata": {
    "files_analyzed": 54,
    "processing_time_ms": 4500
  }
}

CSV

Spreadsheet-compatible format:

file,cyclomatic,cognitive,maintainability,loc,quality
src/auth.ts,12,8,75,145,good
src/complex.ts,35,25,52,420,needs_attention
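
If you want to post-process the export outside a spreadsheet, a few lines of TypeScript are enough. The sketch below assumes the CSV was saved to metrics.csv (an illustrative path) and prints the five most complex files:

import { readFileSync } from "node:fs";

const [header, ...rows] = readFileSync("metrics.csv", "utf8").trim().split("\n");
const cols = header.split(",");
const fileIdx = cols.indexOf("file");
const cyclomaticIdx = cols.indexOf("cyclomatic");

// Sort rows by cyclomatic complexity, highest first, and keep the top five.
const worst = rows
  .map((row) => row.split(","))
  .sort((a, b) => Number(b[cyclomaticIdx]) - Number(a[cyclomaticIdx]))
  .slice(0, 5);

for (const row of worst) {
  console.log(`${row[fileIdx]}: cyclomatic ${row[cyclomaticIdx]}`);
}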

Tips for Effective Batch Analysis

  1. Use specific glob patterns: src/services/*.ts is faster than **/*.ts
  2. Enable parallel for large sets: Always use parallel: true for 10+ files
  3. Set thresholds for quality gates: Catch issues before they're merged
  4. Use CSV for spreadsheet analysis: Easy import to Excel/Sheets
  5. Use aggregate for quick health checks: Get the big picture fast

Combining with Other Tools

  1. Batch + Impact: Find high-complexity files, then check their impact
  2. Batch + Refactoring: Identify files needing work, then get specific suggestions
  3. Batch + Reports: Use batch results to generate quality reports