haive.core.models.llm.providers.mistral¶
Mistral AI Provider Module.
This module implements the Mistral AI language model provider for the Haive framework, supporting Mistral’s family of high-performance open and commercial language models.
The provider handles API key management, model configuration, and safe imports of the langchain-mistralai package dependencies.
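The safe-import behavior can be illustrated with a small availability check. This is a hedged sketch rather than the framework's actual helper: the name `mistral_available` is hypothetical, but the pattern (probing for the optional `langchain_mistralai` dependency before use) matches what the module description implies.

```python
import importlib.util

def mistral_available() -> bool:
    # Hypothetical helper: True when the optional langchain-mistralai
    # package is importable, without actually importing it.
    return importlib.util.find_spec("langchain_mistralai") is not None

print(mistral_available())
```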
Examples
Basic usage:
    from haive.core.models.llm.providers.mistral import MistralProvider

    provider = MistralProvider(
        model="mistral-large-latest",
        temperature=0.7,
        max_tokens=1000,
    )
    llm = provider.instantiate()
With function calling:
    provider = MistralProvider(
        model="mistral-large-latest",
        temperature=0.1,
        max_tokens=2000,
    )
Classes¶
MistralProvider | Mistral AI language model provider configuration.
Module Contents¶
- class haive.core.models.llm.providers.mistral.MistralProvider(/, **data)¶
Bases: haive.core.models.llm.providers.base.BaseLLMProvider

Mistral AI language model provider configuration.
This provider supports Mistral’s family of models including Mistral Large, Mistral Medium, Mistral Small, and the open Mixtral models.
- Parameters:
data (Any)
- provider¶
Always LLMProvider.MISTRALAI
- Type:
LLMProvider
Examples
Large model for complex tasks:
    provider = MistralProvider(
        model="mistral-large-latest",
        temperature=0.7,
        max_tokens=2000,
    )
Small model for fast inference:
    provider = MistralProvider(
        model="mistral-small-latest",
        temperature=0.1,
        max_tokens=500,
    )
Create a new model by parsing and validating input data from keyword arguments.
Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.
self is explicitly positional-only to allow self as a field name.