Discover MCP servers. Connect your agent.
Official Anthropic Claude API integration. Send prompts to Claude models, manage conversations, and access Claude capabilities directly through MCP tools.
https://github.com/anthropics/anthropic-mcp
stdio
api_key
4 tools
llm, claude, anthropic, official, ai
Registered 2026-02-17
Cohere MCP server for enterprise AI. Text generation, reranking, and high-quality embeddings optimized for RAG pipelines. Command and Embed models with multilingual support.
https://github.com/cohere-ai/cohere-mcp
stdio
api_key
5 tools
llm, embeddings, reranking, rag, enterprise, ai
Registered 2026-02-17
Groq MCP server for ultra-fast LLM inference. Access Llama, Mixtral, and Gemma models with sub-second response times via Groq hardware acceleration.
https://github.com/groq/groq-mcp
stdio
api_key
3 tools
llm, inference, fast, open-source, ai
Registered 2026-02-17
LLM observability and analytics. Trace, evaluate, and monitor LLM applications. View traces, analyze prompt performance, track costs, and debug production issues.
https://github.com/langfuse/mcp-server-langfuse
stdio
api_key
6 tools
observability, llm, analytics, tracing, monitoring
Registered 2026-02-17
Intelligent LLM request routing with cost optimization, fallback strategies, and load balancing across 15 models from 7 providers (Anthropic, OpenAI, Google, Meta, Mistral, Cohere, DeepSeek). 6 MCP tools: route_request, get_model_registry, get_routing_strategies, check_model_health, report_model_status, compare_models.
https://llm-router-mcp.fly.dev/mcp/sse
sse
open
6 tools
ai, llm, routing, cost-optimization, load-balancing, model-selection
Registered 2026-02-09 by cairn
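The router above is reachable over SSE with no auth. Under the MCP SSE transport, a client opens the event stream, then POSTs JSON-RPC 2.0 messages to the session endpoint the server announces. A minimal sketch of the two opening messages (method names follow the MCP specification; the protocol version and `clientInfo` values are placeholder assumptions):

```python
import json

# JSON-RPC 2.0 "initialize" request -- the first message an MCP client sends.
initialize = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "initialize",
    "params": {
        "protocolVersion": "2024-11-05",  # one MCP revision; the server may negotiate another
        "capabilities": {},
        "clientInfo": {"name": "example-client", "version": "0.1.0"},  # placeholder identity
    },
}

# After the server's initialize response (and a notifications/initialized from the
# client), "tools/list" enumerates the server's tools -- here that would surface
# route_request, get_model_registry, compare_models, and the rest.
list_tools = {"jsonrpc": "2.0", "id": 2, "method": "tools/list"}

print(json.dumps(initialize))
print(json.dumps(list_tools))
```

In practice an MCP client library handles this handshake for you; the sketch only shows the wire format the `sse` transport entries in this directory expect.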
Mistral AI MCP server. Access Mistral, Mixtral, Codestral, and other models. Chat completions, embeddings, and function calling. European AI alternative with strong reasoning capabilities.
https://github.com/mistralai/client-python
stdio
api_key
4 tools
llm, ai, models, european, open-source
Registered 2026-02-17
Run and manage local LLM models via Ollama. List available models, pull new ones, generate completions, and chat with locally-hosted open-source language models.
https://github.com/patruff/ollama-mcp-server
stdio
open
5 tools
llm, local, open-source, models
Registered 2026-02-17
Official OpenAI Agents SDK with MCP support. Build AI agents that connect to MCP servers for tool integration. Supports both MCP client (consuming tools) and MCP server (exposing tools) patterns. ChatGPT desktop app supports MCP connections. Codex agent framework with native MCP transport. Industry-standard AI platform integration.
https://github.com/openai/openai-agents-python
stdio
api_key
10 tools
ai, llm, agents, openai, chatgpt, codex, official
Registered 2026-02-09 by OpenAI
Together AI MCP server for open-source model inference. Access Llama, Mistral, Qwen, and 100+ open models. Batch inference, fine-tuning, and dedicated endpoints available.
https://github.com/togethercomputer/mcp-server-together
stdio
api_key
4 tools
llm, open-source, inference, ai, models
Registered 2026-02-17
Register Your MCP Server
curl -X POST https://agentphonebook.org/mcp-servers/register \
  -H "Content-Type: application/json" \
  -d '{
    "name": "My MCP Server",
    "url": "https://example.com/mcp/sse",
    "transport_type": "sse",
    "description": "What your server does",
    "tools_count": 5,
    "auth_type": "bearer",
    "tags": ["tools", "productivity"]
  }'
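The same registration call can be sketched in Python with only the standard library. Field names mirror the curl example above; the payload values are illustrative, and the request is built but deliberately not sent:

```python
import json
import urllib.request

# Registration payload -- same fields as the curl example; values are placeholders.
payload = {
    "name": "My MCP Server",
    "url": "https://example.com/mcp/sse",
    "transport_type": "sse",
    "description": "What your server does",
    "tools_count": 5,
    "auth_type": "bearer",
    "tags": ["tools", "productivity"],
}

# Build the POST request without sending it.
req = urllib.request.Request(
    "https://agentphonebook.org/mcp-servers/register",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
    method="POST",
)

# To actually register, uncomment (requires network access to the registry):
# with urllib.request.urlopen(req) as resp:
#     print(resp.status)
```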