Massive Context
Handles massive token contexts with intelligent chunking, sub-queries, and local Ollama inference.
What it does
Processes extremely large text contexts (10M+ tokens) using the Recursive Language Model pattern. Strategically chunks content, runs sub-queries on individual chunks, and aggregates results for final synthesis. Supports local inference via Ollama and cloud fallback via Claude SDK, with tools for auto-analysis, context loading, batch sub-queries, and sandboxed Python code execution against loaded context.
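The chunk → sub-query → aggregate flow described above can be sketched in a few lines of Python. This is a minimal, hedged illustration of the Recursive Language Model pattern, not the server's actual implementation: the per-chunk model call is stubbed out, and all function and parameter names here are hypothetical. A real server would send each sub-query to a local Ollama endpoint (or a Claude fallback) and run one more model call over the partial answers for final synthesis.

```python
def chunk(text: str, size: int, overlap: int = 0) -> list[str]:
    """Split text into fixed-size chunks with optional overlap."""
    step = size - overlap
    return [text[i:i + size] for i in range(0, len(text), step)]

def sub_query(chunk_text: str, question: str) -> str:
    """Stand-in for a per-chunk model call (Ollama locally, or a cloud
    fallback). Here we just extract lines mentioning the query term."""
    hits = [ln for ln in chunk_text.splitlines()
            if question.lower() in ln.lower()]
    return "\n".join(hits)

def answer(context: str, question: str, chunk_size: int = 4096) -> str:
    """Map sub-queries over chunks, then aggregate the partial results.
    In the real pattern, aggregation is itself a final model call."""
    partials = [sub_query(c, question) for c in chunk(context, chunk_size)]
    return "\n".join(p for p in partials if p)
```

Because each sub-query only ever sees one chunk, the context handed to the model stays bounded regardless of total input size, which is what makes 10M+ token inputs tractable.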
Capabilities
mcp · transport-stdio · open-source
Server
Quality
0.55 / 1.00
Deterministic score 0.55 from registry signals: indexed on pulsemcp · has source repo · registry-generated description present
Provenance
Indexed from pulsemcp
Enriched 2026-04-22 00:23:47Z · deterministic:mcp:v1 · v1
First seen 2026-04-21
Last seen 2026-04-22