
Ollama

Run and manage local LLMs via Ollama. List available models, pull new ones, generate completions, and chat with locally hosted open-source language models.
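The operations above wrap Ollama's local HTTP API, which listens on port 11434 by default. As a rough sketch of what the server does under the hood (the exact tool-to-endpoint mapping is an assumption, and `llama3.2` is just an example model name):

```shell
# List locally available models
curl http://localhost:11434/api/tags

# Pull a new model from the Ollama registry
curl http://localhost:11434/api/pull -d '{"name": "llama3.2"}'

# Generate a one-shot completion
curl http://localhost:11434/api/generate -d '{
  "model": "llama3.2",
  "prompt": "Why is the sky blue?",
  "stream": false
}'

# Chat with a model
curl http://localhost:11434/api/chat -d '{
  "model": "llama3.2",
  "messages": [{"role": "user", "content": "Hello!"}],
  "stream": false
}'
```

These commands require a running Ollama instance (`ollama serve`) on the same machine.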

Server Details

URL: https://github.com/patruff/ollama-mcp-server

Transport: stdio

Auth: none

Tools: 5

Homepage: https://ollama.com

Tags: llm, local, open-source, models

Claude Desktop Config

Add this to your claude_desktop_config.json:

{
  "mcpServers": {
    "ollama": {
      "command": "npx",
      "args": [
        "https://github.com/patruff/ollama-mcp-server"
      ]
    }
  }
}
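Before wiring this into Claude Desktop, you can sanity-check the same command the config will run (this assumes Node.js/npm is installed and Ollama is running locally; Claude Desktop launches it identically and talks to it over stdio):

```shell
# Run the MCP server manually; npx resolves the GitHub URL as an npm package spec.
# A healthy server starts and waits for MCP messages on stdin/stdout.
npx https://github.com/patruff/ollama-mcp-server
```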

API Access

curl https://agentphonebook.org/mcp-servers/93
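If the endpoint returns JSON (an assumption about the directory's response format), piping through jq makes the entry easier to read:

```shell
# Fetch and pretty-print this server's registry entry
curl -s https://agentphonebook.org/mcp-servers/93 | jq .
```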
Registered: 2026-02-17 | Updated: 2026-02-17