OpenAI MCP
What it does
Provides a high-performance bridge between OpenAI and Anthropic models with prompt templating, response streaming, and efficient caching for applications requiring customizable LLM access.
This OpenAI-compatible MCP server implementation provides a bridge between AI assistants and large language models, supporting both OpenAI and Anthropic models. It features a robust architecture with prompt templating, streaming responses, efficient caching, and comprehensive error handling. The server exposes endpoints for health checks, context generation, and prompt management, and offers advanced features such as token usage tracking and Prometheus metrics integration. It is ideal for applications that require reliable, high-performance access to LLMs, with the flexibility to customize prompts and manage response caching.
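The response caching mentioned above can be sketched as a small TTL cache keyed by model and prompt. This is an illustrative sketch only — the class name, method names, and TTL parameter are assumptions, not the server's actual API:

```python
import hashlib
import time


class ResponseCache:
    """Hypothetical TTL cache for LLM responses, keyed by (model, prompt).

    A minimal sketch of the caching idea described above,
    not the server's actual implementation.
    """

    def __init__(self, ttl_seconds=300):
        self.ttl = ttl_seconds
        self._store = {}  # cache key -> (expiry timestamp, cached response)

    def _key(self, model, prompt):
        # Hash model + prompt so keys stay small regardless of prompt length.
        return hashlib.sha256(f"{model}\x00{prompt}".encode()).hexdigest()

    def get(self, model, prompt):
        entry = self._store.get(self._key(model, prompt))
        if entry is None:
            return None  # never cached
        expires_at, response = entry
        if time.monotonic() > expires_at:
            del self._store[self._key(model, prompt)]  # expired: evict
            return None
        return response

    def put(self, model, prompt, response):
        self._store[self._key(model, prompt)] = (
            time.monotonic() + self.ttl,
            response,
        )


cache = ResponseCache(ttl_seconds=60)
cache.put("gpt-4o", "hello", "Hi there!")
print(cache.get("gpt-4o", "hello"))   # cache hit within TTL
print(cache.get("gpt-4o", "goodbye"))  # cache miss
```

Caching by a hash of the full prompt avoids unbounded key sizes; a production server would also bound the number of entries (e.g. LRU eviction) and skip the cache for streaming responses.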
Capabilities
Server
Quality
Deterministic score 0.62 from registry signals: indexed on PulseMCP · has source repo · 35 GitHub stars · registry-generated description present