Discover MCP servers. Connect your agent.
Groq MCP server for ultra-fast LLM inference. Access Llama, Mixtral, and Gemma models with sub-second response times via Groq hardware acceleration.
https://github.com/groq/groq-mcp
stdio
api_key
3 tools
llm, inference, fast, open-source, ai
Registered 2026-02-17
Official Hugging Face MCP server connecting AI assistants to the Hugging Face Hub. Search models, datasets, and papers. Dynamically connect to Gradio-based tools hosted on Spaces for extended ML capabilities. Supports STDIO, SSE, and Streamable HTTP transports. Open source with 164+ unique weekly MCP clients. Install via npm: @huggingface/mcp-server
https://github.com/huggingface/huggingface.js/tree/main/packages/mcp-server
stdio
api_key
12 tools
ai, ml, models, datasets, huggingface, inference, official
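The entry above names the npm package; a hedged sketch of wiring it into an MCP client config over stdio. The `mcpServers` layout follows the common Claude Desktop convention, and the `HF_TOKEN` variable name is an assumption — check the package README for the exact settings your client expects:

```shell
# Write a minimal client config that launches the server via npx over stdio.
# "mcpServers" key and HF_TOKEN env var are conventions/assumptions, not
# taken from this listing — verify against the package documentation.
cat > mcp-config.json <<'EOF'
{
  "mcpServers": {
    "huggingface": {
      "command": "npx",
      "args": ["-y", "@huggingface/mcp-server"],
      "env": { "HF_TOKEN": "<your-token>" }
    }
  }
}
EOF
python3 -m json.tool < mcp-config.json > /dev/null && echo "config OK"
```

The final line only checks that the file is valid JSON; the server itself is started by your MCP client when it reads the config.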
Official Replicate remote MCP server for AI model inference. Run thousands of open-source models including image generation, language models, audio, and video. Hosted at mcp.replicate.com with automatic API updates. Supports model search, prediction creation, and result retrieval. No local install needed.
https://mcp.replicate.com
sse
bearer
8 tools
ai, ml, inference, models, image-generation, replicate, remote, official
Together AI MCP server for open-source model inference. Access Llama, Mistral, Qwen, and 100+ open models. Batch inference, fine-tuning, and dedicated endpoints available.
https://github.com/togethercomputer/mcp-server-together
stdio
api_key
4 tools
llm, open-source, inference, ai, models
Registered 2026-02-17
Register Your MCP Server
curl -X POST https://agentphonebook.org/mcp-servers/register \
  -H "Content-Type: application/json" \
  -d '{
    "name": "My MCP Server",
    "url": "https://example.com/mcp/sse",
    "transport_type": "sse",
    "description": "What your server does",
    "tools_count": 5,
    "auth_type": "bearer",
    "tags": ["tools", "productivity"]
  }'
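Since a malformed body will be rejected, it can help to sanity-check the payload as JSON locally before submitting. A minimal sketch; the local validation step is an addition for convenience, not part of the registry API:

```shell
# Build the registration payload and validate it as JSON before POSTing.
payload='{
  "name": "My MCP Server",
  "url": "https://example.com/mcp/sse",
  "transport_type": "sse",
  "description": "What your server does",
  "tools_count": 5,
  "auth_type": "bearer",
  "tags": ["tools", "productivity"]
}'
echo "$payload" | python3 -m json.tool > /dev/null && echo "payload OK"

# Then submit it to the same endpoint as above:
# curl -X POST https://agentphonebook.org/mcp-servers/register \
#   -H "Content-Type: application/json" \
#   -d "$payload"
```

Keeping the body in a shell variable avoids quoting mistakes when editing the fields inline on the curl command.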