Starting at $0
What is Mixtral LLM?
Mixtral 8x7B is a sparse mixture-of-experts (SMoE) model with open weights, released under the permissive Apache 2.0 license. It is designed to match or outperform GPT-3.5 on most standard benchmarks.
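Because the weights are openly available, the model can be run locally rather than only through a hosted API. Below is a minimal sketch using the Hugging Face transformers library; the checkpoint name mistralai/Mixtral-8x7B-Instruct-v0.1, the prompt format, and the generation settings are assumptions for illustration, not part of this listing.

```python
# Minimal sketch: running Mixtral 8x7B locally via Hugging Face transformers.
# Assumes the `transformers` and `torch` packages and the (assumed)
# `mistralai/Mixtral-8x7B-Instruct-v0.1` checkpoint; the full weights are large,
# so substantial GPU memory (or offloading) is required.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to reduce memory use
    device_map="auto",          # spread layers across available devices
)

# Mistral-style instruction formatting (an assumption for this sketch).
prompt = "[INST] Explain what a sparse mixture-of-experts model is. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```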
Pricing
Entry-level setup fee?
- No setup fee
Offerings
- Free Trial
- Free/Freemium Version
- Premium Consulting/Integration Services
Product Demos
- Mixtral 8x7B DESTROYS Other Models (MoE = AGI?) (YouTube)
- Mistral 8x7B Part 1 - So What is a Mixture of Experts Model? (YouTube)
- Mixtral 8x22B MoE - The New Best Open LLM? Fully-Tested (YouTube)
- Mixtral of Experts (Paper Explained) (YouTube)
- NEW Mixtral 8x22b Tested - Mistral's New Flagship MoE Open-Source Model (YouTube)
Product Details
- About
- Competitors
- Tech Details
- FAQs
Mixtral 8x22B
Released in April 2024, Mixtral 8x22B is presented by Mistral AI as its most performant open model: a sparse mixture-of-experts (SMoE) model that uses only 39B active parameters out of its 141B total. The LLM features:
- Fluency in English, French, Italian, German, and Spanish, with strong performance on code
- 64k context window
- Native function-calling capabilities
- Function calling and JSON mode available on Mistral AI's API endpoint (see the sketch after this list)
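As a rough illustration of the API-side features above, here is a sketch of a JSON-mode chat completion request sent to Mistral AI's endpoint over plain HTTP. The endpoint URL, the open-mixtral-8x22b model name, and the response_format parameter reflect Mistral's public API documentation around the 8x22B release; treat them as assumptions and verify against the current API reference.

```python
# Sketch: JSON-mode chat completion against Mistral AI's API (assumed
# endpoint, model name, and parameters; check the current API reference).
import os
import requests

resp = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json={
        "model": "open-mixtral-8x22b",
        "messages": [
            {"role": "user", "content": "List three open-weight LLMs as JSON."}
        ],
        # JSON mode: constrains the model to emit valid JSON output.
        "response_format": {"type": "json_object"},
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

Function calling works through the same endpoint by passing tool definitions alongside the messages, with the model returning structured tool-call arguments instead of free text.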
Mixtral LLM Competitors
Mixtral LLM Technical Details
| Operating Systems | Unspecified |
| --- | --- |
| Mobile Application | No |
Frequently Asked Questions
What does Mixtral LLM cost?
Mixtral LLM starts at $0.