TrustRadius: an HG Insights Company
Mixtral LLM

Overview

What is Mixtral LLM?

Mixtral 8x7B is a sparse mixture-of-experts (SMoE) model with open weights, released under the permissive Apache 2.0 license. It is designed to match or outperform GPT-3.5 on most standard benchmarks.
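To illustrate what "sparse mixture of experts" means, here is a minimal sketch of the idea: a router scores a set of expert networks per token, and only the top-k experts (top-2 in Mixtral's case) are actually run, with their outputs mixed by the renormalized router scores. All names, dimensions, and the use of plain linear maps as "experts" are illustrative assumptions, not Mixtral's actual implementation.

```python
import numpy as np

rng = np.random.default_rng(0)

def softmax(x):
    e = np.exp(x - np.max(x))
    return e / e.sum()

class SparseMoE:
    """Toy sparse mixture-of-experts layer (illustrative only).

    A router scores every expert for the input token, but only the
    top_k highest-scoring experts are evaluated -- that is the
    "sparse" part: compute scales with top_k, not with n_experts.
    """

    def __init__(self, dim, n_experts=8, top_k=2):
        self.top_k = top_k
        # Each "expert" is a single linear map here; in a real model
        # each expert would be a feed-forward sub-network.
        self.experts = [rng.standard_normal((dim, dim)) / np.sqrt(dim)
                        for _ in range(n_experts)]
        self.router = rng.standard_normal((dim, n_experts)) / np.sqrt(dim)

    def forward(self, x):
        logits = x @ self.router                  # score all experts
        top = np.argsort(logits)[-self.top_k:]    # pick the top-k experts
        gates = softmax(logits[top])              # renormalize their scores
        # Only the selected experts are evaluated (sparse compute).
        return sum(g * (x @ self.experts[i]) for g, i in zip(gates, top))

moe = SparseMoE(dim=16)
y = moe.forward(rng.standard_normal(16))
```

With 8 experts and top-2 routing, a layer like this carries the parameters of all eight experts but pays the runtime cost of only two per token, which is how an SMoE model can have a large total parameter count while keeping inference cost comparable to a much smaller dense model.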
