
    Mistral: Mixtral 8x22B (base)

    mistralai/mixtral-8x22b

    Created Apr 10, 2024 · 65,536 token context

    Mixtral 8x22B is a large-scale sparse mixture-of-experts language model from Mistral AI. Each MoE layer has 8 experts of 22 billion parameters each, and each token is routed through 2 experts at a time (roughly 141B total parameters with about 39B active per token, since the non-expert weights are shared).

    It was released via a torrent link posted on X.

    #moe
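The #moe tag refers to this mixture-of-experts routing. As a rough illustration of how a top-2 router selects experts per token, here is a toy sketch; all names and shapes are hypothetical and this is not the actual Mixtral implementation:

```python
# Toy sketch of Mixtral-style top-2 expert routing for a single token.
import numpy as np

rng = np.random.default_rng(0)

NUM_EXPERTS = 8   # Mixtral 8x22B has 8 experts per MoE layer
TOP_K = 2         # 2 experts are active per token
HIDDEN = 16       # toy hidden size, for illustration only

# The router is a linear layer producing one logit per expert.
router_w = rng.standard_normal((HIDDEN, NUM_EXPERTS))

# Toy "experts": each is a linear map over the hidden state.
expert_w = rng.standard_normal((NUM_EXPERTS, HIDDEN, HIDDEN))

def moe_layer(x: np.ndarray) -> np.ndarray:
    """Route one token's hidden state through its top-2 experts."""
    logits = x @ router_w                      # (NUM_EXPERTS,)
    top = np.argsort(logits)[-TOP_K:]          # indices of the 2 highest logits
    weights = np.exp(logits[top])
    weights /= weights.sum()                   # softmax over the chosen 2 only
    # Output is the weighted sum of the selected experts' outputs.
    return sum(w * (x @ expert_w[i]) for w, i in zip(weights, top))

token = rng.standard_normal(HIDDEN)
print(moe_layer(token).shape)  # (16,)
```

Only the 2 selected experts run per token, which is how the model keeps inference cost well below that of a dense model of the same total size.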

    Recent activity on Mixtral 8x22B (base): total usage per day on OpenRouter (not enough data to display yet).
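
For reference, a minimal sketch of querying this model through OpenRouter's OpenAI-compatible API. It assumes an API key in the OPENROUTER_API_KEY environment variable, and since this is a base (non-instruct) model, the prompt is simply continued rather than answered:

```python
# Minimal sketch: call mistralai/mixtral-8x22b via OpenRouter's
# chat completions endpoint. Error handling is omitted for brevity.
import os
import requests

resp = requests.post(
    "https://openrouter.ai/api/v1/chat/completions",
    headers={"Authorization": f"Bearer {os.environ['OPENROUTER_API_KEY']}"},
    json={
        "model": "mistralai/mixtral-8x22b",
        "messages": [{"role": "user", "content": "The capital of France is"}],
        "max_tokens": 32,
    },
    timeout=60,
)
resp.raise_for_status()
print(resp.json()["choices"][0]["message"]["content"])
```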