open-mixtral-8x7b
Modality: text-to-text
A sparse Mixture-of-Experts (SMoE) model with eight 7B-scale experts (the "8x7B" in the name). Only 12.9B of its 45B total parameters are active per token.
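The "12.9B active out of 45B total" split comes from sparse routing: in each MoE layer a router scores the eight experts and runs only the top-k of them (top-2, per the Mixtral paper) for each token, so most parameters sit idle on any given forward pass. Below is a minimal NumPy sketch of top-k routing; the function and variable names are illustrative, not the actual model code.

```python
import numpy as np

def moe_layer(x, gate_w, experts, k=2):
    """Route one token to its top-k experts and mix their outputs.

    x:       (d,) token hidden state
    gate_w:  (n_experts, d) router weight matrix
    experts: list of n_experts callables, each mapping (d,) -> (d,)
    """
    logits = gate_w @ x                    # one router score per expert
    top = np.argsort(logits)[-k:]          # indices of the k highest scores
    weights = np.exp(logits[top])
    weights /= weights.sum()               # softmax over the chosen experts only
    # Only these k experts execute, so only their parameters are "active".
    return sum(w * experts[i](x) for w, i in zip(weights, top))

# Toy usage: 8 experts, 2 active per token, matching Mixtral's routing scheme.
rng = np.random.default_rng(0)
d, n_experts = 16, 8
gate_w = rng.normal(size=(n_experts, d))
experts = [
    (lambda W: (lambda h: np.tanh(W @ h)))(0.1 * rng.normal(size=(d, d)))
    for _ in range(n_experts)
]
print(moe_layer(rng.normal(size=d), gate_w, experts).shape)  # (16,)
```

With only 2 of 8 expert feed-forward blocks running per token (plus the shared attention and embedding weights), the per-token parameter count lands well below the 45B total, which is what the 12.9B active figure measures.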
Model Parameters
Advanced settings that control the model's behavior; a request sketch using these settings follows the list.
System prompt: prepended to all chat messages to guide the model's behavior. Saved automatically to your playground repository.
Temperature: controls randomness. 0 = deterministic, 2 = very creative.
Max tokens: maximum length of the response (1-128000 tokens).
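These playground settings correspond one-to-one to chat-completion request fields. Here is a hedged sketch using the mistralai Python SDK (the v1 `Mistral` client; verify method names against your installed version), with the API key assumed to be set in the MISTRAL_API_KEY environment variable and the prompt text purely illustrative.

```python
import os
from mistralai import Mistral

client = Mistral(api_key=os.environ["MISTRAL_API_KEY"])

response = client.chat.complete(
    model="open-mixtral-8x7b",
    messages=[
        # The system prompt is prepended to guide the model's behavior.
        {"role": "system", "content": "You are a concise technical assistant."},
        {"role": "user", "content": "Summarize sparse Mixture-of-Experts in two sentences."},
    ],
    temperature=0.3,  # 0 = deterministic, 2 = very creative
    max_tokens=512,   # cap on response length
)
print(response.choices[0].message.content)
```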