SLM-Works

Mistral (7B Instruct v0.3)

Mistral · ~7B parameters · size band 4B–12B (mid-size) · Last reviewed 2026-03-20

Why we use it

Predictable Apache-2.0 lineage for teams that want permissive licensing defaults and straightforward VPC packaging.

License summary

Apache 2.0 for many Mistral open-weight releases; always read the specific checkpoint card, since some variants ship under different terms.

Typical deployment profiles

  • VPC, single GPU class
  • VPC, multi-GPU
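A back-of-envelope sizing check can tell you which profile a ~7B model needs. The sketch below is a rough illustration only: it counts weight memory at common precisions (2 bytes/parameter for fp16/bf16, and assumed averages of 1 byte for int8 and 0.5 bytes for int4 quantization), ignoring KV cache and activations, which add real overhead on top.

```python
def weight_memory_gb(n_params_b: float, bytes_per_param: float) -> float:
    """Approximate weight memory in GiB for a dense model.

    n_params_b is the parameter count in billions; bytes_per_param
    depends on the precision the weights are stored in.
    """
    return n_params_b * 1e9 * bytes_per_param / 2**30

# ~7B parameters at common precisions (weights only; leave headroom
# for KV cache and activations when picking a GPU class).
for bytes_per_param, label in [(2.0, "fp16/bf16"), (1.0, "int8"), (0.5, "int4")]:
    print(f"{label}: {weight_memory_gb(7.0, bytes_per_param):.1f} GiB")
```

At fp16 this lands around 13 GiB of weights, which is why a 7B model typically fits a single 24 GB-class GPU, while longer contexts or batching can push it toward the multi-GPU profile.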

Focus tags

  • General
  • Multilingual

Typical use cases

  • General instruct tuning
  • Routing baselines
  • Cost-sensitive serving


Contact us to validate a model choice for your environment.