Deephaven MCP: AI agents meet real-time data

Query tables, execute scripts, and search docs through natural language

November 26, 2025

Elijah Petty, Technical Writer @Deephaven
AI assistant interacting with Deephaven data through MCP protocol

You're juggling multiple Deephaven sessions, switching between documentation tabs, and copying table schemas into ChatGPT to formulate queries. What if your AI assistant could directly query your tables, execute scripts, and search documentation — all in natural language?

Deephaven now supports the Model Context Protocol (MCP), an open-source standard that connects AI applications to external services. Connect your preferred AI agent to Deephaven and let it handle routine data operations while you focus on analysis and insights.

MCP enables AI agents to work with Deephaven workers, writing queries, executing code, and accessing documentation.

This integration consists of two MCP servers. The systems server lets AI agents interact with Deephaven workers to write queries, execute code, and manage sessions. The docs server provides access to Deephaven's documentation, acting as a knowledge repository when queried.

What is MCP?

Model Context Protocol (MCP) is an open-source standard for connecting AI applications to external services like Deephaven. Similar to how REST APIs enable web services to communicate, MCP provides a standardized way for AI models to interact with databases, workflows, and filesystems.

Instead of building custom integrations for every AI tool and service combination, MCP creates a common protocol. Your AI assistant can interact with Deephaven the same way it would with any other MCP-compatible service.
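
Under the hood, MCP clients and servers exchange JSON-RPC 2.0 messages. As a rough sketch of what a tool invocation looks like on the wire (the tool name and arguments below are hypothetical, not the actual Deephaven MCP tool schema), a client asking a server to run a tool sends something like:

{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "run_script",
    "arguments": {
      "session": "production",
      "script": "result = trades.where(\"Volume > 100000\")"
    }
  }
}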

Deephaven MCP architecture

Deephaven MCP consists of two servers:

Systems server

The systems server manages connections to Deephaven Community Core and Enterprise Core+ deployments. It provides:

Session management:

  • List and monitor configured sessions.
  • Start and stop Core+ sessions.
  • Connect to multiple deployments.
  • Reload configurations dynamically.

Data operations:

  • Retrieve table schemas and metadata.
  • Execute Python and Groovy scripts.
  • Query installed packages and environment details.
  • Analyze and retrieve table data.
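
In practice, these operations boil down to ordinary Deephaven Python running on the worker. A minimal sketch of the kind of code involved, assuming a table named trades already exists in the session:

# Inspect the schema: column names, types, and attributes.
trades_schema = trades.meta_table

# Peek at the most recent rows of the (possibly ticking) table.
recent_trades = trades.tail(10)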

Docs server

The docs server connects AI agents to Deephaven's documentation knowledge base. Query it in natural language to get specific answers with code examples and explanations pulled directly from official documentation.

Use cases

Deephaven MCP enables AI-driven workflows for common data operations:

Morning analysis workflow

Imagine starting your day with a simple prompt: "Show me any high-severity alerts from overnight and check if they correlate with unusual trading volume."

Your AI agent:

  1. Lists available sessions and connects to production
  2. Retrieves schemas for the alerts and trades tables
  3. Executes a query filtering high-severity alerts
  4. Joins with trading data to identify correlations
  5. Returns the results for your review

All while you're on your first coffee.
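
As a rough sketch, the query behind steps 3 and 4 might look like this (the alerts and trades tables and their columns are assumptions):

# Step 3: keep only the high-severity alerts.
high_alerts = alerts.where("Severity == `high`")

# Step 4: as-of join each alert with the most recent trade for the same symbol,
# so unusual volume around the alert time stands out.
alerts_with_volume = high_alerts.aj(trades, on=["Symbol", "Timestamp"], joins=["Price", "Volume"])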

Additional workflows

Exploratory analysis:

  • "List all tables in my production session and show me the schema for the trades table."
  • "Show me the last 50 rows of the alerts table where severity equals 'high'."
  • "What packages are installed in my development environment? Is pandas version 2.0 or later?"

Code execution:

  • "Execute this Python script to create a new aggregated table in my staging environment."
  • "Create a new Core+ session named 'analysis-01' and load the customer dataset."

Documentation queries:

  • "How do I use update-by operations? Show me an example with real-time data."
  • "What's the difference between as-of joins and reverse as-of joins? Give me code examples."

AI agents handle the routine tasks, such as checking schemas, listing tables, and searching documentation, so you can focus on analysis and insights.

See it in action

Here's an actual conversation with an AI agent connected to Deephaven MCP:

You: List all available sessions and show me the tables in production.

Agent: I found 3 configured sessions. Connecting to production...
Found 12 tables. Here are the key ones:
- trades: 2.4M rows, schema: Symbol, Price, Volume, Timestamp
- orders: 890K rows, schema: OrderID, Symbol, Quantity, Status
- positions: 156 rows, schema: Account, Symbol, Shares, Value

You: Show me the last 10 high-volume trades.

Agent: Executing query on trades table...
[Returns filtered results with volume > 100000]

You: How do I calculate a rolling average on this data?

Agent: Searching Deephaven docs... The update_by operation with
rolling_avg_time handles this. Here's an example:
[Returns code snippet with explanation]

The agent seamlessly moves between querying data, executing operations, and retrieving documentation — all from natural language prompts.
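
An answer like the one above would typically be built on Deephaven's update_by API. A minimal sketch, assuming a trades table with Timestamp, Symbol, and Price columns:

from deephaven.updateby import rolling_avg_time

# 5-minute rolling average of Price, computed per symbol and updated in real time.
result = trades.update_by(
    ops=[rolling_avg_time(ts_col="Timestamp", cols=["AvgPrice=Price"], rev_time="PT5M")],
    by=["Symbol"],
)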

Setup

Deephaven MCP requires Python 3.11 or later. Connecting to Enterprise Core+ workers also requires access to a Deephaven Enterprise deployment.

Install

Install using uv (recommended) or pip:

# Using uv (recommended)
pip install uv
uv venv .venv -p 3.11
source .venv/bin/activate  # On macOS/Linux
uv pip install "deephaven-mcp[coreplus]"

# Or using pip
python -m venv .venv
source .venv/bin/activate  # On macOS/Linux
pip install "deephaven-mcp[coreplus]"

Configure

Create a deephaven_mcp.json file defining your Deephaven deployments:

{
  "coreplus": {
    "deployments": {
      "my_enterprise": {
        "host": "enterprise.example.com",
        "port": 443,
        "use_tls": true,
        "auth_type": "io.deephaven.authentication.psk.PskAuthenticationHandler",
        "auth_token_env_var": "DH_ENTERPRISE_TOKEN"
      }
    }
  }
}

The configuration file supports multiple deployments, mixing Community Core and Enterprise Core+ environments.

Connect your AI tool

Deephaven MCP works with Claude Desktop, Cursor, Visual Studio Code, and Windsurf. Add this configuration to your AI tool's MCP settings:

{
  "mcpServers": {
    "deephaven-systems": {
      "command": "/full/path/to/your/.venv/bin/dh-mcp-systems-server",
      "args": [],
      "env": {
        "DH_MCP_CONFIG_FILE": "/full/path/to/your/deephaven_mcp.json",
        "PYTHONLOGLEVEL": "INFO"
      }
    },
    "deephaven-docs": {
      "command": "/full/path/to/your/.venv/bin/mcp-proxy",
      "args": [
        "--transport=streamablehttp",
        "https://deephaven-mcp-docs-prod.dhc-demo.deephaven.io/mcp"
      ]
    }
  }
}

Restart your AI tool and test the connection:

Are the Deephaven MCP servers working? Can you list available sessions?

Get started

Deephaven MCP transforms how you interact with your data — replacing context switching and manual lookups with natural conversation. Whether you're exploring production data, executing complex queries, or learning new APIs, your AI assistant becomes a knowledgeable pair programmer that understands Deephaven.

The integration works with both Community Core and Enterprise Core+, so you can configure mixed environments in a single setup — local development instances alongside production and staging deployments.

Install the package, configure your deployments, and start experimenting.
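
As a condensed recap of the steps above (paths are placeholders):

# 1. Install into a virtual environment.
uv venv .venv -p 3.11 && source .venv/bin/activate
uv pip install "deephaven-mcp[coreplus]"

# 2. Describe your deployments in deephaven_mcp.json (see the example above).

# 3. Point your AI tool's MCP settings at dh-mcp-systems-server, then restart the tool.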

Questions or want to share what you've built? Join our Slack community — we'd love to hear how you're using AI agents with Deephaven.