haive.core.engine.aug_llm.mcp_config¶
MCP-enhanced AugLLMConfig with full type checking.
This module provides MCPAugLLMConfig, which extends AugLLMConfig with Model Context Protocol (MCP) support through proper mixin composition. It includes full type checking and seamless integration with the existing Haive configuration system.
The configuration automatically discovers MCP tools, manages resources, and enhances prompts while maintaining compatibility with all existing AugLLMConfig features.
Classes¶

MCPAugLLMConfig | AugLLMConfig enhanced with MCP (Model Context Protocol) support.

Functions¶

create_mcp_aug_llm_config | Factory function to create and initialize MCPAugLLMConfig.
Module Contents¶
- class haive.core.engine.aug_llm.mcp_config.MCPAugLLMConfig(/, **data)[source]¶
Bases: _get_mcp_mixin(), haive.core.engine.aug_llm.config.AugLLMConfig

AugLLMConfig enhanced with MCP (Model Context Protocol) support.
This configuration class extends AugLLMConfig with MCP capabilities through the MCPMixin, providing:

- Automatic MCP tool discovery and integration
- Resource management from MCP servers
- Prompt template enhancement
- Full type safety with MCP configurations
The class properly integrates with ToolRouteMixin (inherited from AugLLMConfig) to ensure MCP tools are correctly routed and managed alongside regular tools.
- Parameters:
data (Any)
- All attributes from AugLLMConfig, plus:
  - mcp_config – Optional MCP configuration
  - mcp_resources – List of discovered MCP resources
  - mcp_prompts – Dictionary of MCP prompt templates
  - auto_discover_mcp_tools – Whether to auto-discover tools
  - inject_mcp_resources – Whether to inject resources
  - use_mcp_prompts – Whether to use MCP prompts
Example
Basic usage with MCP server:
```python
from haive.mcp.config import MCPConfig, MCPServerConfig

config = MCPAugLLMConfig(
    name="mcp_agent",
    llm_config=LLMConfig(provider="openai", model="gpt-4"),
    mcp_config=MCPConfig(
        enabled=True,
        servers={
            "filesystem": MCPServerConfig(
                transport="stdio",
                command="npx",
                args=["-y", "@modelcontextprotocol/server-filesystem"]
            )
        }
    ),
    system_message="AI assistant with filesystem access"
)

# Initialize MCP
await config.setup()

# MCP tools are now available alongside regular tools
print(f"Total tools: {len(config.tools)}")
print(f"MCP tools: {len(config.get_mcp_tools())}")
```
Multiple MCP servers:
```python
config = MCPAugLLMConfig(
    name="multi_mcp_agent",
    llm_config={"provider": "openai", "model": "gpt-4"},
    mcp_config=MCPConfig(
        enabled=True,
        servers={
            "filesystem": MCPServerConfig(...),
            "github": MCPServerConfig(...),
            "postgres": MCPServerConfig(...)
        }
    ),
    tools=["calculator"],  # Regular tools work too
    auto_discover_mcp_tools=True
)
```
Create a new model by parsing and validating input data from keyword arguments.
Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.
- cleanup()[source]¶
Clean up resources including MCP connections.
Should be called when the configuration is no longer needed.
- Return type:
None
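The intended lifecycle (setup() before use, cleanup() when finished, ideally in a finally block) can be sketched with a stand-in class. FakeMCPConfig and its connected flag are illustrative assumptions, not part of the Haive API; the real methods open and close live MCP server connections:

```python
import asyncio

class FakeMCPConfig:
    """Hypothetical stand-in mirroring MCPAugLLMConfig's lifecycle contract."""

    def __init__(self):
        self.connected = False

    async def setup(self):
        # The real setup() would connect to MCP servers and discover tools.
        self.connected = True

    def cleanup(self):
        # The real cleanup() would close MCP server connections.
        self.connected = False

async def main():
    config = FakeMCPConfig()
    await config.setup()
    try:
        assert config.connected  # safe to use MCP tools here
    finally:
        config.cleanup()  # always release connections, even on error
    return config

config = asyncio.run(main())
print(config.connected)  # False: connections released
```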
- debug_mcp_state()[source]¶
Print debug information about MCP integration state.
Useful for troubleshooting MCP configuration and tool discovery.
- Return type:
None
- get_tool_by_name(name)[source]¶
Get a tool instance by name, checking both regular and MCP tools.
- Parameters:
name (str) – Tool name to retrieve
- Returns:
Tool instance or None if not found
- Return type:
Any | None
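The lookup order described above (regular tools first, then MCP-discovered tools, None if neither matches) can be illustrated with a minimal self-contained sketch; the dict-based registries and the standalone function are assumptions for illustration, not Haive internals:

```python
# Two-registry lookup: regular tools take precedence over MCP tools.
def get_tool_by_name(name, regular_tools, mcp_tools):
    if name in regular_tools:
        return regular_tools[name]
    return mcp_tools.get(name)  # None if not found in either registry

regular = {"calculator": "calc-tool"}
mcp = {"read_file": "fs-tool"}

print(get_tool_by_name("calculator", regular, mcp))  # calc-tool
print(get_tool_by_name("read_file", regular, mcp))   # fs-tool
print(get_tool_by_name("missing", regular, mcp))     # None
```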
- async setup()[source]¶
Initialize both AugLLMConfig and MCP integration.
This method:

1. Sets up MCP servers and discovers tools/resources/prompts
2. Enhances the system message with MCP information
3. Integrates MCP tools with the existing tool routing system
Should be called after creating the configuration but before using it with an agent.
- Return type:
None
- async haive.core.engine.aug_llm.mcp_config.create_mcp_aug_llm_config(name, model='gpt-4', mcp_servers=None, **kwargs)[source]¶
Factory function to create and initialize MCPAugLLMConfig.
- Parameters:
  - name (str) – Name for the configuration
  - model (str) – Model name (default: "gpt-4")
  - mcp_servers (dict | None) – Mapping of server names to MCP server configurations
  - kwargs – Additional keyword arguments forwarded to MCPAugLLMConfig
- Returns:
Initialized MCPAugLLMConfig instance
- Return type:
MCPAugLLMConfig
Example
Creating a configuration with factory:
```python
config = await create_mcp_aug_llm_config(
    name="assistant",
    model="gpt-4",
    mcp_servers={
        "filesystem": {
            "transport": "stdio",
            "command": "npx",
            "args": ["-y", "@modelcontextprotocol/server-filesystem"]
        }
    },
    system_message="Helpful AI assistant",
    temperature=0.7
)
```