The Delora MCP Server integrates with the Delora API to provide structured access to cross-chain swap and routing functionality via the Model Context Protocol (MCP). It lets AI tools interact with chains, tokens, and routing logic through defined tools instead of manually constructed HTTP requests. MCP is a standardized protocol for connecting AI systems to external tools and services, so compatible clients can discover and invoke Delora tools directly through the protocol interface.
The server exposes read-only tools. It does not sign or broadcast transactions. Quote responses include unsigned transaction data that must be signed and submitted using your own wallet or backend infrastructure.

Quickstart

You can connect the hosted Delora MCP server directly in your MCP-compatible client. No local installation is required. Add the following configuration to your MCP client:
{
  "mcpServers": {
    "delora": {
      "type": "http",
      "url": "https://mcp.delora.build/mcp"
    }
  }
}
After adding the configuration, your AI tool can immediately begin discovering available tools and requesting routing data. See Installation for tool-specific setup instructions and configuration examples.
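Under the hood, an MCP client speaks JSON-RPC 2.0 over HTTP to the endpoint configured above. As a rough sketch of what tool discovery looks like on the wire (your client builds and sends this for you, after an initial `initialize` handshake; `MCP_URL` and `tools_list_request` are illustrative names, not part of the Delora API):

```python
import json

# The endpoint from the configuration above.
MCP_URL = "https://mcp.delora.build/mcp"

def tools_list_request(request_id: int = 1) -> str:
    """Build the JSON-RPC 2.0 body for MCP's `tools/list` method,
    which a client POSTs to the server to discover available tools."""
    return json.dumps({
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/list",
    })

body = tools_list_request()
```

In practice you never construct this by hand; the point is that tool discovery is a standard protocol call, so any MCP-compatible client can find the Delora tools without custom integration code.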

How It Works

The Delora MCP server wraps the REST API into MCP-compatible tools that AI agents can invoke directly. Instead of building raw HTTP requests, your AI tool works with structured methods such as get_chains, get_token, and get_quote.
AI Tool (Claude, Cursor, Copilot, etc.)
  ↓  MCP Protocol
Delora MCP Server (https://mcp.delora.build/mcp)
  ↓
Delora API (https://api.delora.build/v1/)
  ↓
Cross-chain routing engine
This approach ensures consistent tool discovery and predictable integration behavior across supported AI environments.
The Delora API returns an execution-ready transaction payload (calldata). The user's or integrator's wallet signs and broadcasts the blockchain transaction. Internally, the payload encodes the selected route, constraints, fee logic, deadlines, and adapter-specific execution data.
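This split of responsibilities can be sketched in a few lines. The field names below (`tx`, `to`, `data`, `value`) are hypothetical placeholders, not the actual Delora response schema; the point is only that the server hands back an unsigned payload and signing stays on your side:

```python
def extract_unsigned_tx(quote: dict) -> dict:
    """Return the unsigned transaction payload from a quote response.

    The MCP server is read-only: signing and broadcasting happen in
    your own wallet or backend, never on the server.
    """
    tx = quote["tx"]  # illustrative key for the unsigned payload
    if "data" not in tx:  # calldata encodes route, fees, deadlines
        raise ValueError("quote payload missing calldata")
    return tx

# Illustrative payload shape only.
sample = {"tx": {"to": "0x...", "data": "0x...", "value": "0"}}
unsigned = extract_unsigned_tx(sample)
```

Consult the actual `get_quote` response for the real field names before wiring this into a signer.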

Example Workflow

A typical cross-chain routing flow using the Delora MCP server:
1. get_instructions: read how to use the server (chains, tokens, quote).
2. get_chains: get chain IDs and filter by chainTypes (EVM, SVM) if necessary.
3. get_token (chain, symbol) or get_tokens (chains): get token addresses on the source and destination networks.
4. get_quote (originChainId, destinationChainId, amount, originCurrency, destinationCurrency, …): get the best route, output amount, calldata, and gas estimate.
After receiving the quote:
  • If required, approve tokens using your wallet.
  • Sign and broadcast the returned transaction data externally.
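The workflow above can be sketched as the sequence of MCP `tools/call` request bodies a client would issue. Argument names mirror the tool signatures listed in the steps; the values (chain IDs, amounts, `"0x..."` addresses) are purely illustrative:

```python
def tool_call(request_id: int, name: str, arguments: dict) -> dict:
    """Build a JSON-RPC 2.0 `tools/call` request for an MCP tool."""
    return {
        "jsonrpc": "2.0",
        "id": request_id,
        "method": "tools/call",
        "params": {"name": name, "arguments": arguments},
    }

# One call per workflow step; values are placeholders.
calls = [
    tool_call(1, "get_instructions", {}),
    tool_call(2, "get_chains", {"chainTypes": ["EVM"]}),
    tool_call(3, "get_token", {"chain": 1, "symbol": "USDC"}),
    tool_call(4, "get_quote", {
        "originChainId": 1,
        "destinationChainId": 8453,
        "amount": "1000000",
        "originCurrency": "0x...",       # from the get_token step
        "destinationCurrency": "0x...",  # from the get_token step
    }),
]
```

Your AI client issues these calls for you from natural-language intent; the sketch only shows how each step maps onto a single structured tool invocation.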

API Key Configuration

Delora MCP does not take an API key as a tool argument. For HTTP MCP clients that support custom headers, you can pass your Delora API key directly to the MCP endpoint using either:
  • x-api-key: YOUR_API_KEY
  • Authorization: Bearer YOUR_API_KEY
The MCP server resolves the key and forwards it to Delora API as x-api-key. Example hosted MCP configuration:
{
  "mcpServers": {
    "delora": {
      "type": "http",
      "url": "https://mcp.delora.build/mcp",
      "headers": {
        "x-api-key": "YOUR_API_KEY"
      }
    }
  }
}
If your client does not support custom HTTP headers, or if you run Delora MCP in stdio mode, configure the MCP server process with DELORA_API_KEY instead:
# macOS / Linux
DELORA_API_KEY=YOUR_API_KEY npx -y @deloraprotocol/mcp@latest
# Windows PowerShell
$env:DELORA_API_KEY="YOUR_API_KEY"
npx -y @deloraprotocol/mcp@latest
For HTTP requests, incoming x-api-key / Authorization headers take priority over DELORA_API_KEY. For stdio, DELORA_API_KEY is the supported way to provide the key.

Integrator String & Fees

get_quote supports the optional integrator and fee parameters:
  • integrator should be the same integrator string configured for your Delora integration.
  • fee is optional, but it is only valid when integrator is also provided.
  • If you only need attribution or tracking, you can send integrator without fee.
For fee wallet setup and payout behavior, see Fee Configuration.

Next Steps

Installation

Tool-specific setup instructions for supported AI clients.

GitHub Repository

Source code and self-hosting guide.