haive.core.models.llm.providers.mistral¶

Mistral AI Provider Module.

This module implements the Mistral AI language model provider for the Haive framework, supporting Mistral’s family of high-performance open and commercial language models.

The provider handles API key management, model configuration, and safe imports of the langchain-mistralai package dependencies.
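The safe-import behaviour mentioned above typically follows the optional-dependency pattern sketched below. This is an illustrative helper, not the provider's actual code: the heavy dependency is only required when it is first used, and a missing package produces an actionable error message.

```python
import importlib

# Illustrative sketch of a safe optional-import helper (not the
# provider's actual implementation): the dependency is resolved
# lazily, and a missing package raises a helpful ImportError.
def safe_import(module_name, install_hint):
    """Import module_name or raise an ImportError with install advice."""
    try:
        return importlib.import_module(module_name)
    except ImportError as exc:
        raise ImportError(
            f"{module_name} is required for this provider; "
            f"install it with: {install_hint}"
        ) from exc
```

With this pattern, `safe_import("langchain_mistralai", "pip install langchain-mistralai")` would defer the dependency check until the provider is actually instantiated.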

Examples

Basic usage:

from haive.core.models.llm.providers.mistral import MistralProvider

provider = MistralProvider(
    model="mistral-large-latest",
    temperature=0.7,
    max_tokens=1000
)
llm = provider.instantiate()

Configuration suited to function calling (a low temperature makes tool selection more deterministic):

provider = MistralProvider(
    model="mistral-large-latest",
    temperature=0.1,
    max_tokens=2000
)
llm = provider.instantiate()

Classes¶

MistralProvider

Mistral AI language model provider configuration.

Module Contents¶

class haive.core.models.llm.providers.mistral.MistralProvider(/, **data)¶

Bases: haive.core.models.llm.providers.base.BaseLLMProvider

Mistral AI language model provider configuration.

This provider supports Mistral’s family of models including Mistral Large, Mistral Medium, Mistral Small, and the open Mixtral models.

Parameters:

data (Any)

provider¶

Always LLMProvider.MISTRALAI

Type:

LLMProvider

model¶

The Mistral model to use

Type:

str

temperature¶

Sampling temperature (0.0-1.0)

Type:

float

max_tokens¶

Maximum tokens in response

Type:

int

top_p¶

Nucleus sampling parameter

Type:

float

random_seed¶

Seed for reproducible generation

Type:

int

safe_mode¶

Enable content filtering

Type:

bool

Examples

Large model for complex tasks:

provider = MistralProvider(
    model="mistral-large-latest",
    temperature=0.7,
    max_tokens=2000
)

Small model for fast inference:

provider = MistralProvider(
    model="mistral-small-latest",
    temperature=0.1,
    max_tokens=500
)

Create a new model by parsing and validating input data from keyword arguments.

Raises pydantic_core.ValidationError if the input data cannot be validated to form a valid model.

self is explicitly positional-only to allow self as a field name.

classmethod get_models()¶

Get available Mistral models.

Return type:

list[str]
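A typical use of get_models() is to validate a requested model name before constructing the provider. The helper below sketches that check using hypothetical model names taken from the examples above; the actual list returned by get_models() may differ.

```python
# Hypothetical model list for illustration; in practice the names
# would come from MistralProvider.get_models().
AVAILABLE_MODELS = ["mistral-large-latest", "mistral-small-latest"]

def validate_model(name, available=AVAILABLE_MODELS):
    """Return name if it is an available model, else raise ValueError."""
    if name not in available:
        raise ValueError(
            f"Unknown model {name!r}; choose one of {available}"
        )
    return name
```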

max_tokens: int | None = None¶

Maximum number of tokens in the response; defaults to None.