🌐 MCP-Use is the open source way to connect any LLM to any MCP server and build custom agents with tool access, without using closed-source or application-specific clients.
💬 Get started quickly - chat with your servers on our hosted version! Try mcp-use chat (beta).
`mcp-use` is an open source library that enables developers to connect any LLM to any MCP server, allowing the creation of custom agents with tool access without relying on closed-source or application-specific clients.
Here's a quick example of how you can use `mcp-use`:
```python
import asyncio

from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient  # the package is imported as mcp_use

async def main():
    # Configure the Playwright MCP server to be launched over stdio
    client = MCPClient(config={"mcpServers": {
        "playwright": {
            "command": "npx",
            "args": ["@playwright/mcp@latest"],
            "env": {"DISPLAY": ":1"}
        }
    }})

    # Create LLM
    llm = ChatOpenAI(model="gpt-4o", api_key=...)  # replace ... with your OpenAI API key

    # Create agent with tools
    agent = MCPAgent(llm=llm, client=client, max_steps=30)

    # Run the query
    result = await agent.run("Find the best restaurant in San Francisco")
    print(result)

if __name__ == "__main__":
    asyncio.run(main())
```
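Running this script spawns the configured Playwright server as a subprocess, exposes its tools to the LLM, and loops over tool calls until the query is answered; `max_steps=30` caps how many iterations the agent will take.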
➡️ Create your own with our Builder
- 💻 Open Source: Connect any LLM to any MCP server without vendor lock-in.
- ⚙️ Flexible Configuration: Support for any MCP server through a simple configuration system.
- ⚙️ Easy Setup: Simple JSON-based configuration for MCP server integration.
- 🤖 Universal LLM Support: Compatible with any LangChain-supported LLM provider.
- 🔌 HTTP Connection: Connect to MCP servers running on specific HTTP ports for web-based integrations.
- 🔀 Dynamic Server Selection: Agents can dynamically choose the most appropriate MCP server for the task. Both are shown in the sketch after this list.
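For instance, an HTTP-based server is referenced by a `url` entry in the config instead of a `command`, and the server manager lets the agent choose among the configured servers. A minimal sketch, assuming the `url` config key and the `use_server_manager` flag described in the mcp-use docs; the port and query are placeholders:

```python
import asyncio

from langchain_openai import ChatOpenAI
from mcp_use import MCPAgent, MCPClient

async def main():
    # Point the client at an MCP server already running over HTTP
    # (port 8931 is a placeholder for wherever your server listens)
    client = MCPClient(config={"mcpServers": {
        "playwright": {"url": "http://localhost:8931/sse"}
    }})

    # use_server_manager lets the agent pick the right server per task
    agent = MCPAgent(
        llm=ChatOpenAI(model="gpt-4o"),
        client=client,
        use_server_manager=True,
    )
    print(await agent.run("Open example.com and summarize the page"))

if __name__ == "__main__":
    asyncio.run(main())
```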
- 📥 Installation: Install mcp-use and set up your environment
- 📖 Configuration: Learn how to configure mcp-use with your MCP server (a config-file sketch follows below).
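The same server definition can live in a JSON file rather than inline code. A minimal sketch, assuming `MCPClient.from_config_file` as described in the docs; the file name is arbitrary:

```python
# browser_mcp.json contains the same JSON shown in the example above:
# {"mcpServers": {"playwright": {"command": "npx",
#   "args": ["@playwright/mcp@latest"], "env": {"DISPLAY": ":1"}}}}

from mcp_use import MCPClient

# Load the server configuration from the JSON file
client = MCPClient.from_config_file("browser_mcp.json")
```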
We are constantly working to improve `mcp-use`. Check out what we're planning and suggest new features!