AI Hub
What it does
Provides unified access to 100+ AI providers through LiteLLM integration, enabling seamless switching between OpenAI, Anthropic, Google, Azure, AWS Bedrock, and other services. Configuration is YAML-based, and the server exposes tools for chatting with models, listing available models, and retrieving model information.
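The exact configuration schema isn't shown in this listing, so the following is a hypothetical sketch of what a LiteLLM-style YAML configuration for such a server might look like. The key names (`model_list`, `litellm_params`) and the `os.environ/...` environment-variable syntax follow LiteLLM's proxy-config conventions; whether this server uses the same schema is an assumption.

```yaml
# Hypothetical config sketch -- key names follow LiteLLM proxy
# conventions and are assumptions, not this server's documented schema.
model_list:
  - model_name: gpt-4o                 # alias used when chatting
    litellm_params:
      model: openai/gpt-4o             # LiteLLM provider/model string
      api_key: os.environ/OPENAI_API_KEY      # read from environment
  - model_name: claude
    litellm_params:
      model: anthropic/claude-3-5-sonnet-20241022
      api_key: os.environ/ANTHROPIC_API_KEY
```

Switching providers then means editing this file, not application code.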
This MCP AI Hub server by Pengfei Ni provides unified access to 100+ AI providers through LiteLLM integration, enabling seamless switching between OpenAI, Anthropic, Google, Azure, AWS Bedrock, and other AI services through a single configuration file. Built with Python using FastMCP, it features comprehensive model management with YAML-based configuration, environment variable support, and robust error handling. It offers three core tools: chatting with models, listing available models, and retrieving model information, with support for both string and OpenAI message-format inputs. The implementation includes extensive testing coverage, multiple transport options (stdio, SSE, HTTP), and flexible deployment configurations. This makes it well suited to developers building AI applications that need provider flexibility, organizations wanting to avoid vendor lock-in, and teams requiring centralized AI model management across different services without code changes.
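The "string and OpenAI message format" input support mentioned above can be illustrated with a small normalization helper. This is not the server's actual code, just a minimal sketch of the pattern: a plain string is wrapped as a single user message, while an already-structured OpenAI-style message list is passed through unchanged, so either form can be handed to LiteLLM's `completion()` call.

```python
def normalize_messages(prompt):
    """Accept either a plain string or a list of OpenAI-style message
    dicts, and return a message list suitable for litellm.completion().

    Hypothetical helper illustrating the dual-input pattern; the real
    server's implementation may differ.
    """
    if isinstance(prompt, str):
        # Wrap a bare string as a single user turn.
        return [{"role": "user", "content": prompt}]
    if isinstance(prompt, list) and all(
        isinstance(m, dict) and "role" in m and "content" in m for m in prompt
    ):
        # Already in OpenAI message format; pass through as-is.
        return prompt
    raise TypeError("prompt must be a string or a list of message dicts")
```

A caller could then do `litellm.completion(model="gpt-4o", messages=normalize_messages("Hello"))` regardless of which input form it received.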
Capabilities
Server
Quality
Deterministic score 0.61 from registry signals: indexed on PulseMCP; has source repo; 7 GitHub stars; registry-generated description present.