
What is Mixtral LLM?

Mixtral 8x7B is a sparse Mixture-of-Experts (SMoE) model with open weights, released under the Apache 2.0 license. As an open-weight model with a permissive license, it is designed to match or outperform GPT-3.5 on most standard benchmarks.
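
To make "sparse mixture of experts" concrete, here is a minimal sketch of the top-2 routing that Mixtral's feed-forward layers use: a router scores every expert for each token, only the two highest-scoring experts actually run, and their outputs are blended by the normalized router weights. All names and dimensions below are illustrative toys, not Mixtral's real sizes.

```python
import numpy as np

def top2_moe_layer(x, router_w, experts):
    """Sparse MoE feed-forward: route one token to its top-2 experts.

    x:        (d_model,) hidden state for a single token
    router_w: (n_experts, d_model) router/gating weights
    experts:  list of callables, one feed-forward network per expert
    """
    logits = router_w @ x               # one routing score per expert
    top2 = np.argsort(logits)[-2:]      # indices of the 2 best-scoring experts
    gate = np.exp(logits[top2])
    gate /= gate.sum()                  # softmax over the selected 2 only
    # Only the chosen experts execute; the rest of the parameters sit idle,
    # which is why active parameters are far fewer than total parameters.
    return sum(g * experts[i](x) for g, i in zip(gate, top2))

# Toy usage: 8 experts, as in Mixtral, with made-up tiny dimensions.
rng = np.random.default_rng(0)
d = 16
experts = [lambda x, W=rng.standard_normal((d, d)) / d: W @ x for _ in range(8)]
router_w = rng.standard_normal((8, d))
print(top2_moe_layer(rng.standard_normal(d), router_w, experts).shape)  # (16,)
```

Because only 2 of the 8 experts run per token, roughly a quarter of the expert parameters (plus the shared attention weights) are active at inference time; this is the same mechanism that lets the larger 8x22B model hold 141B total parameters while using only 39B per token.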

Mixtral 8x22B
Released in April 2024, Mixtral 8x22B is presented by Mistral AI as its most performant open model. It is a sparse Mixture-of-Experts (SMoE) model that activates only 39B of its 141B parameters per token. The LLM features:

  • Fluency in English, French, Italian, German, and Spanish, plus strong performance in code
  • 64k-token context window
  • Native function-calling capabilities
  • Function calling and JSON mode available on Mistral AI's API endpoint (a sketch follows this list)
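
As a rough illustration of the JSON mode listed above, the request below targets Mistral AI's chat completions endpoint over plain HTTP. The endpoint path and the `response_format` field follow Mistral's public API documentation, but treat the model identifier `open-mixtral-8x22b` and the exact payload shape as assumptions to verify against the current docs.

```python
import os
import requests

# Sketch of a JSON-mode request to Mistral AI's hosted API (needs a valid key).
resp = requests.post(
    "https://api.mistral.ai/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json={
        "model": "open-mixtral-8x22b",  # assumed id for the hosted Mixtral 8x22B
        "messages": [
            {"role": "user",
             "content": "List three facts about Mixtral as a JSON object."},
        ],
        # JSON mode: constrains the completion to valid JSON.
        "response_format": {"type": "json_object"},
    },
    timeout=30,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```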

Technical Details

Mobile Application: No

FAQs

What is Mixtral LLM?
Mixtral 8x7B is a sparse Mixture-of-Experts (SMoE) model with open weights, released under the Apache 2.0 license. As an open-weight model with a permissive license, it is designed to match or outperform GPT-3.5 on most standard benchmarks.
How much does Mixtral LLM cost?
Mixtral LLM starts at $0; the open-weight models themselves are free to download under the Apache 2.0 license.
What are Mixtral LLM's top competitors?
OpenAI API Platform, Anthropic Claude, and Llama by Meta are common alternatives to Mixtral LLM.