# MCP DefiLlama

An MCP server for interacting with DefiLlama, a comprehensive DeFi data aggregator.

- Package: `@iqai/defillama-mcp`
- Purpose: Interacting with DefiLlama for protocol analytics and TVL data.
## Usage with ADK TypeScript

Using the convenience wrapper:

```typescript
import { McpDefillama } from "@iqai/adk";

const toolset = McpDefillama();
const tools = await toolset.getTools();
```

Or configuring the toolset manually:

```typescript
import { McpToolset } from "@iqai/adk";

const toolset = new McpToolset({
  name: "Defillama MCP Client",
  description: "Client for DefiLlama DeFi data aggregator",
  transport: {
    mode: "stdio",
    command: "pnpm",
    args: ["dlx", "@iqai/defillama-mcp"],
    env: {
      PATH: process.env.PATH || "",
    },
  },
});

const tools = await toolset.getTools();
```

For other MCP-compatible clients, the server can be configured directly:

```json
{
  "mcpServers": {
    "defillama-mcp-server": {
      "command": "pnpm",
      "args": ["dlx", "@iqai/defillama-mcp"]
    }
  }
}
```

## Available Tools
### Remote MCP Endpoint

This MCP server is hosted remotely, and its tools are discovered dynamically at runtime. For the full list of available tools and endpoints, see the official DefiLlama MCP documentation.
## Environment Variables

No environment variables are required for basic usage.

Optional environment variables:

- `DEFILLAMA_API_KEY`: Your DefiLlama API key for enhanced access.
- `IQ_GATEWAY_URL`: Custom IQ Gateway URL for enhanced resolution.
- `IQ_GATEWAY_KEY`: API key for IQ Gateway access.
- `OPENROUTER_API_KEY`: API key for OpenRouter LLM integration.
- `GOOGLE_GENERATIVE_AI_API_KEY`: Google Generative AI API key (alternative to OpenRouter).
- `LLM_MODEL`: Model name for LLM integration (default: `openai/gpt-4.1-mini`).
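Since all of these variables are optional, one pattern is to forward only the ones that are actually set when spawning the server. A minimal sketch — the `buildMcpEnv` helper is hypothetical, not part of `@iqai/adk`:

```typescript
// Hypothetical helper (not part of @iqai/adk): collect only the optional
// DefiLlama MCP variables that are actually set, so the spawned server
// receives a minimal environment.
const MCP_ENV_KEYS = [
  "PATH",
  "DEFILLAMA_API_KEY",
  "IQ_GATEWAY_URL",
  "IQ_GATEWAY_KEY",
  "OPENROUTER_API_KEY",
  "GOOGLE_GENERATIVE_AI_API_KEY",
  "LLM_MODEL",
];

function buildMcpEnv(
  source: Record<string, string | undefined>
): Record<string, string> {
  const env: Record<string, string> = {};
  for (const key of MCP_ENV_KEYS) {
    const value = source[key];
    if (value) env[key] = value; // skip unset or empty values
  }
  return env;
}
```

The result can be passed as the `env` field of the stdio transport configuration, e.g. `env: buildMcpEnv(process.env)`.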
## Usage Examples

Here's a complete example of using MCP DefiLlama with an ADK agent:

```typescript
import { McpDefillama, AgentBuilder } from "@iqai/adk";
import dotenv from "dotenv";

dotenv.config();

async function main() {
  // Initialize DefiLlama MCP toolset
  const toolset = McpDefillama({
    debug: false,
    retryOptions: {
      maxRetries: 3,
      initialDelay: 500,
    },
  });

  // Get available tools
  const defillamaTools = await toolset.getTools();

  // Create agent with DefiLlama tools
  const { runner } = await AgentBuilder.create("defillama_agent")
    .withModel("gemini-2.5-flash")
    .withDescription("An agent that queries DeFi protocol data from DefiLlama")
    .withTools(...defillamaTools)
    .build();

  try {
    // Example query
    const response = await runner.ask(
      "What are the top DeFi protocols by TVL?",
    );
    console.log(response);
  } finally {
    // Clean up resources
    await toolset.close();
  }
}

main().catch(console.error);
```

## Best Practices
- Resource Cleanup: Always call `await toolset.close()` when done to properly close connections.
- Error Handling: Implement proper error handling for data requests.
- Rate Limits: Be aware of API rate limits and implement appropriate retry logic.
- LLM Integration: For enhanced resolution, configure an LLM provider via `OPENROUTER_API_KEY` or `GOOGLE_GENERATIVE_AI_API_KEY`.
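The retry guidance can be sketched as a small backoff wrapper around any async call. This generic helper is illustrative only (it mirrors the `maxRetries` / `initialDelay` shape used in `retryOptions`, but is not an `@iqai/adk` API):

```typescript
// Illustrative sketch: retry an async operation with exponential backoff,
// mirroring the maxRetries / initialDelay shape used in retryOptions above.
async function withRetry<T>(
  fn: () => Promise<T>,
  maxRetries = 3,
  initialDelay = 500
): Promise<T> {
  let delay = initialDelay;
  for (let attempt = 0; ; attempt++) {
    try {
      return await fn();
    } catch (err) {
      if (attempt >= maxRetries) throw err; // out of retries: surface the error
      await new Promise((resolve) => setTimeout(resolve, delay));
      delay *= 2; // back off exponentially between attempts
    }
  }
}
```

Usage might look like `await withRetry(() => runner.ask("What is the TVL of Aave?"))`.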
## Error Handling

### Error Scenarios

The server handles various error scenarios gracefully:

- Invalid protocol identifiers: Verify protocol names or IDs are correct.
- Network issues: Check your internet connection and DefiLlama API status.
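For the first scenario, queries tend to be less error-prone when protocol names are normalized into the lowercase, hyphen-separated slug style DefiLlama uses before being sent. A minimal sketch — `toProtocolSlug` is a hypothetical helper, not part of the server:

```typescript
// Hypothetical helper: normalize a human-readable protocol name into a
// lowercase, hyphen-separated slug (e.g. "Uniswap V3" -> "uniswap-v3").
function toProtocolSlug(name: string): string {
  return name
    .trim()
    .toLowerCase()
    .replace(/\s+/g, "-");
}
```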
## Resources
- DefiLlama - DeFi data aggregator platform
- Model Context Protocol - MCP specification and standards