Mixtral LLM
Overview
What is Mixtral LLM?
Mixtral 8x7B is a sparse mixture-of-experts (SMoE) model with open weights, released under the permissive Apache 2.0 license. It is designed to match or outperform GPT-3.5 on most standard benchmarks.
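To illustrate the sparse mixture-of-experts idea behind Mixtral, here is a minimal toy sketch of top-2 expert routing for a single token. This is an illustrative simplification, not Mixtral's actual implementation: the dimensions, the router, and the toy linear "experts" are all made up for the example.

```python
import numpy as np

def moe_forward(x, w_router, experts, top_k=2):
    """Sparse MoE forward pass for one token (toy sketch).

    x: (d,) hidden state; w_router: (d, n_experts) router weights;
    experts: list of callables, each mapping (d,) -> (d,).
    Only the top_k highest-scoring experts are evaluated,
    which is what makes the model "sparse".
    """
    logits = x @ w_router                      # (n_experts,) router scores
    top = np.argsort(logits)[-top_k:]          # indices of the top_k experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                   # softmax over selected experts only
    return sum(w * experts[i](x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d, n_experts = 16, 8                           # toy sizes, not Mixtral's
w_router = rng.normal(size=(d, n_experts))
# Each "expert" is a toy linear map standing in for a feed-forward block.
experts = [lambda x, w=rng.normal(size=(d, d)): x @ w for _ in range(n_experts)]
out = moe_forward(rng.normal(size=d), w_router, experts)
print(out.shape)  # (16,)
```

In Mixtral 8x7B this routing happens per token per layer with 8 feed-forward experts, of which 2 are active, so only a fraction of the total parameters is used for any given token.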
Categories & Use Cases
Enterprise Generative AI
Related Products
Products similar to Mixtral LLM that may also meet your needs.
OpenAI API Platform
Anthropic Claude
Llama by Meta