Mistral Small
Efficient open-weights mid-sized model from Mistral.
Overview
Mistral Small is an open-weights model suitable for self-hosting and cost-sensitive commercial use. Apache-2.0 licensed.
History
Mistral Small was released on 2024-09-17.
Training & availability
Weights are publicly available under the Apache-2.0 license, making this an open-weight model suitable for on-prem deployment and fine-tuning.
Capabilities
- Context window: 32K tokens.
- Max output: 8K tokens.
- Input modalities: text.
Recommended for: agentic workloads, open-source deployments.
Limitations
- The context window (32K tokens) is modest by 2026 standards, making it unsuitable for processing long documents in a single request.
- Text-only: cannot process images, audio, or video inputs.
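One common workaround for the 32K context limit is to split long documents into chunks and process them request by request. A minimal sketch, using the rough heuristic of ~4 characters per token (a real tokenizer gives more accurate counts; the function name and budget are illustrative):

```python
def chunk_text(text: str, max_tokens: int = 30_000, chars_per_token: int = 4):
    """Split text into pieces that fit under the 32K context window,
    leaving headroom for the system prompt and the response.
    Uses a crude ~4 chars/token estimate rather than a real tokenizer."""
    max_chars = max_tokens * chars_per_token
    return [text[i:i + max_chars] for i in range(0, len(text), max_chars)]

doc = "x" * 300_000  # roughly 75K tokens: too large for one request
chunks = chunk_text(doc)
print(len(chunks))  # 3
```

Each chunk can then be sent as a separate request, with results merged afterwards.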
Quick start
Minimal example using the OpenRouter API. Copy, paste, replace the key.
```python
from openai import OpenAI

client = OpenAI(
    base_url="https://openrouter.ai/api/v1",
    api_key="sk-or-...",
)
resp = client.chat.completions.create(
    model="mistral/mistral-small",
    messages=[{"role": "user", "content": "Explain quantum computing in one sentence."}],
)
print(resp.choices[0].message.content)
```
Cost calculator
Estimate your monthly bill from your expected token volumes and the per-token prices listed under Pricing breakdown.
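The estimate is simple arithmetic over the listed prices ($0.05 per 1M input tokens, $0.08 per 1M output tokens). A sketch, with the workload numbers chosen purely as an example:

```python
# Listed prices, converted to dollars per token.
PRICE_IN = 0.05 / 1_000_000   # $0.05 per 1M input tokens
PRICE_OUT = 0.08 / 1_000_000  # $0.08 per 1M output tokens

def monthly_cost(requests_per_day: int, in_tokens: int, out_tokens: int, days: int = 30) -> float:
    """Dollar cost for a month of traffic at a steady daily request rate."""
    total_in = requests_per_day * in_tokens * days
    total_out = requests_per_day * out_tokens * days
    return total_in * PRICE_IN + total_out * PRICE_OUT

# Example workload: 10,000 requests/day, 1,000 tokens in and 250 out per request.
print(f"${monthly_cost(10_000, 1_000, 250):.2f}/month")  # $21.00/month
```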
Integrations & tooling support
- Tool calling: supported.
- Structured outputs: not supported.
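Since tool calling is supported, the usual pattern is to declare a JSON schema for each tool via the API's `tools` parameter and dispatch the model's tool calls to local functions. A minimal sketch of the local side; the tool name, schema, and weather stub are illustrative, not anything from this page:

```python
import json

def get_weather(city: str) -> str:
    """Stub tool; a real implementation would call a weather API."""
    return f"Sunny in {city}"

# Schema passed to the chat completions API via the `tools` parameter.
TOOLS = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get current weather for a city.",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

def dispatch(tool_call: dict) -> str:
    """Route a model-issued tool call to the matching local function."""
    args = json.loads(tool_call["function"]["arguments"])
    if tool_call["function"]["name"] == "get_weather":
        return get_weather(**args)
    raise ValueError("unknown tool")

# Simulated tool call in the shape the chat completions API returns:
fake_call = {"function": {"name": "get_weather", "arguments": '{"city": "Paris"}'}}
print(dispatch(fake_call))  # Sunny in Paris
```

The result string is then sent back to the model in a follow-up message so it can compose its final answer.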
Price vs quality
Priced low, making it a good fit for high-volume tasks. Quality tier pending broader benchmark coverage.
- Quality percentile: —
- Effective price: $0.0725/1M
- Pricing breakdown: $0.05/1M input, $0.08/1M output
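The effective price appears to be a weighted blend of the input and output rates. The page does not state the blend, but $0.0725/1M is consistent with assuming 25% input and 75% output tokens (an assumption, shown here only as a consistency check):

```python
# Assumed blend: 25% input tokens at $0.05/1M, 75% output at $0.08/1M.
effective = 0.25 * 0.05 + 0.75 * 0.08
print(f"${effective:.4f}/1M")  # $0.0725/1M
```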