What is Mixtral LLM?
Mixtral 8x7B is a sparse mixture-of-experts (SMoE) model with open weights, released under the permissive Apache 2.0 license. It is designed to match or outperform GPT-3.5 on most standard benchmarks. Because the weights are openly available, the model can be downloaded and run locally, as sketched below.
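The following is a minimal sketch of loading the open weights with the Hugging Face transformers library; the model id shown is the public Mixtral 8x7B Instruct checkpoint, while the dtype and device settings are assumptions to adjust for your hardware.

```python
# Minimal sketch: load Mixtral 8x7B via Hugging Face transformers and
# generate a short completion. Assumes sufficient GPU memory and that the
# checkpoint below is accessible from your environment.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "mistralai/Mixtral-8x7B-Instruct-v0.1"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to reduce memory use (assumption)
    device_map="auto",          # spread layers across available devices (assumption)
)

# Tokenize a prompt and generate a short response.
prompt = "Explain what a sparse mixture-of-experts model is in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

In practice, the full 8x7B checkpoint is large, so quantized variants or a hosted inference endpoint are common alternatives to loading the raw weights directly.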