DeepSeek / DeepSeek V3
Released: 12/26/2024
Modality — Input: text / Output: text
Price (per 1M tokens) — Input: $0.90 / Output: $0.90
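At a flat $0.90 per 1M tokens for both input and output, estimating the cost of a request is simple arithmetic. A minimal sketch (the function name and token counts are illustrative, not part of any API):

```python
# Estimate request cost at the listed rate of $0.90 per 1M tokens.
# This listing uses the same rate for input and output tokens.
def cost_usd(input_tokens, output_tokens, rate_per_million=0.90):
    return (input_tokens + output_tokens) / 1_000_000 * rate_per_million

# e.g. a request with 10,000 input tokens and 2,000 output tokens:
total = cost_usd(10_000, 2_000)  # 12,000 tokens -> $0.0108
```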

DeepSeek V3 is a Mixture-of-Experts (MoE) LLM that excels in efficient processing of text-based tasks such as coding, translation, and writing. It leverages an innovative architecture that activates only a fraction of its total parameters for each token, enhancing performance while reducing computational demands. Other noteworthy features of DeepSeek V3 include its open-source availability and its ability to handle complex reasoning tasks.

| Metric | Value |
|------------------------|-----------------|
| Parameter Count | 671 billion |
| Mixture of Experts | Yes |
| Active Parameter Count | 37 billion |
| Context Length | 128,000 tokens |
| Multilingual | Yes |
| Quantized* | Unknown |

*Quantization is specific to the inference provider, and the model may be offered with different quantization levels by other providers.
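The sparse-activation idea behind MoE can be sketched in a few lines: a gating function scores every expert for each token, but only the top-k experts actually run, so most of the 671B parameters stay idle for any given token. The sketch below is illustrative only; the expert count, gating scheme, and sizes are toy values, not DeepSeek V3's actual configuration.

```python
# Minimal Mixture-of-Experts routing sketch: score all experts,
# run only the top-k, and mix their outputs by renormalized gate weights.
import math

def softmax(xs):
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_forward(token, experts, gate_weights, k=2):
    """Route one token (a scalar stand-in for a hidden vector)
    through the top-k of len(experts) experts."""
    # Gating: every expert gets a score, but only k are executed.
    scores = [w * token for w in gate_weights]
    probs = softmax(scores)
    top_k = sorted(range(len(experts)), key=lambda i: probs[i], reverse=True)[:k]
    # Renormalize the selected experts' probabilities to sum to 1.
    total = sum(probs[i] for i in top_k)
    # Sparse computation: k expert forward passes instead of all of them.
    return sum((probs[i] / total) * experts[i](token) for i in top_k)

# Four toy "experts", each a different linear function of the input.
experts = [lambda x, a=a: a * x for a in (1.0, 2.0, 3.0, 4.0)]
gate_weights = [0.1, 0.5, 0.9, 0.2]
out = moe_forward(1.0, experts, gate_weights, k=2)
```

With k=2, only two of the four experts are evaluated per token; this is the same trade-off that lets DeepSeek V3 activate ~37B of its 671B parameters per token.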

DeepSeek models available on Oxen.ai

| Model | Inference provider | Modality (Input / Output) | Price per 1M tokens (Input / Output) |
|-------------|--------------------|---------------------------|--------------------------------------|
| DeepSeek V3 | Fireworks AI | text / text | $0.90 / $0.90 |

See all models available on Oxen.ai