LLM Integration
Integrate any LLM with easy-mcp-use through LangChain
LLM Integration Guide
easy-mcp-use supports integration with any Large Language Model (LLM) that is compatible with LangChain. This guide covers how to use different LLM providers with easy-mcp-use and highlights the flexibility to plug in any LangChain-supported model.
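As a starting point, the sketch below shows what a basic integration can look like. It is illustrative only: it assumes an API surface similar to the original mcp-use library (an `MCPClient` loaded from a config file and an `MCPAgent` that accepts a LangChain chat model), so check the easy-mcp-use reference for the exact class and option names.

```typescript
// Illustrative sketch only. MCPClient, MCPAgent, fromConfigFile, and maxSteps
// are assumed names modeled on the original mcp-use library and may differ
// in easy-mcp-use; consult the library's own docs for the current API.
import { ChatOpenAI } from "@langchain/openai";
import { MCPAgent, MCPClient } from "easy-mcp-use";

async function main() {
  // Any LangChain chat model can be used here.
  // ChatOpenAI reads OPENAI_API_KEY from the environment by default.
  const llm = new ChatOpenAI({ model: "gpt-4o", temperature: 0 });

  // Load MCP server definitions from a configuration file (assumed helper).
  const client = MCPClient.fromConfigFile("mcp-config.json");

  // The agent drives the LLM and exposes the configured MCP tools to it.
  const agent = new MCPAgent({ llm, client, maxSteps: 30 });

  const result = await agent.run("Find the best restaurant in San Francisco");
  console.log(result);
}

main().catch(console.error);
```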
Universal LLM Support
easy-mcp-use leverages LangChain’s architecture to support any LLM that implements the LangChain interface. This means you can use virtually any model from any provider, including:
- OpenAI models (GPT-4, GPT-3.5, etc.)
- Anthropic models (Claude)
- Google models (Gemini)
- Mistral models
- Groq models
- Llama models
- Cohere models
- Open source models (via LlamaCpp, HuggingFace, etc.)
- Custom or self-hosted models
- Any other model with a LangChain integration
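Because the agent only talks to the model through LangChain's chat-model interface, switching providers should normally come down to constructing a different model object and passing it in. The snippet below sketches this, reusing the hypothetical agent setup from the earlier example; the package names and model identifiers shown are examples, not a fixed list.

```typescript
// Swap the model, keep the rest of the agent setup unchanged.
import { ChatAnthropic } from "@langchain/anthropic";

// Example model identifier; use whichever Claude model you have access to.
const llm = new ChatAnthropic({ model: "claude-3-5-sonnet-20240620" });

// Same hypothetical agent construction as in the first example:
// const agent = new MCPAgent({ llm, client, maxSteps: 30 });
```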
For the full list of available chat model integrations, see https://js.langchain.com/docs/integrations/chat/