mistral / open-mixtral-8x22b (text-to-text)
Mixtral 8x22B is currently the most performant open model. It is a sparse Mixture-of-Experts (SMoE) model that uses only 39B active parameters out of 141B.
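In an SMoE layer, a router picks a small subset of expert feed-forward networks for each token, so only that subset's weights participate in the forward pass; this is why the active parameter count (39B) is far below the total (141B). The sketch below is a minimal, toy illustration of top-2 routing over 8 experts (Mixtral's routing scheme); the dimensions and weights are made up for demonstration and are not the model's real shapes.

```python
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # Mixtral-style: 8 expert FFNs per MoE layer
TOP_K = 2         # only 2 experts are evaluated per token
DIM = 16          # toy hidden size, not the real model dimension

# Toy router and expert weights (real experts are full FFN blocks).
router_w = rng.standard_normal((DIM, NUM_EXPERTS))
experts = [rng.standard_normal((DIM, DIM)) for _ in range(NUM_EXPERTS)]

def moe_forward(x: np.ndarray) -> np.ndarray:
    """Route a single token vector to its top-2 experts."""
    logits = x @ router_w              # score every expert for this token
    top = np.argsort(logits)[-TOP_K:]  # indices of the 2 highest-scoring experts
    gates = np.exp(logits[top])
    gates /= gates.sum()               # softmax over the selected experts only
    # Only the selected experts run; the other 6 contribute no compute.
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top))

token = rng.standard_normal(DIM)
print(moe_forward(token).shape)  # (16,)
```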
Model Parameters

Advanced settings to control the behavior of the model.

System prompt: Prepended to all chat messages to guide the model's behavior. It is saved automatically to your playground repository.

Temperature: Controls randomness; 0 = deterministic, 2 = very creative.

Max tokens: Maximum length of the response (1-128,000 tokens).
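As a concrete illustration, the sketch below sends one chat request that sets all three parameters. It assumes Mistral's public chat-completions endpoint and an API key in the MISTRAL_API_KEY environment variable; a playground backend may proxy requests elsewhere, so treat the URL as an assumption.

```python
import os
import requests

# Assumed endpoint: Mistral's hosted, OpenAI-compatible chat completions API.
URL = "https://api.mistral.ai/v1/chat/completions"

payload = {
    "model": "open-mixtral-8x22b",
    "messages": [
        # The system prompt is prepended to guide the model's behavior.
        {"role": "system", "content": "You are a concise technical assistant."},
        {"role": "user", "content": "Explain sparse Mixture-of-Experts in two sentences."},
    ],
    "temperature": 0.2,  # 0 = deterministic, 2 = very creative
    "max_tokens": 256,   # response length cap (1-128,000 tokens)
}

resp = requests.post(
    URL,
    headers={"Authorization": f"Bearer {os.environ['MISTRAL_API_KEY']}"},
    json=payload,
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```

A low temperature like 0.2 keeps answers near-deterministic; raise it toward 2 for more varied completions.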
