Mistral: Mistral Nemo

A 12B parameter model with a 128k token context length, built by Mistral in collaboration with NVIDIA. The model is multilingual, supporting English, French, German, Spanish, Italian, Portuguese, Chinese, Japanese, Korean, Arabic, and Hindi. It supports function calling and is released under the Apache 2.0 license.
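Since the listing highlights function-calling support, the snippet below is a minimal sketch of how that capability is typically exercised against Mistral's chat-completions HTTP API. The endpoint, payload shape, the "open-mistral-nemo" model identifier, and the `get_weather` tool are assumptions for illustration; consult Mistral's current API reference before relying on them.

```python
# Minimal sketch: sending a function-calling request to Mistral Nemo over
# Mistral's chat-completions HTTP API. Endpoint, payload shape, and the
# "open-mistral-nemo" model id are assumptions based on Mistral's public
# API conventions, not verbatim documentation.
import json
import os

import requests

API_URL = "https://api.mistral.ai/v1/chat/completions"  # assumed endpoint
API_KEY = os.environ["MISTRAL_API_KEY"]

# One hypothetical tool the model may choose to call.
tools = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical helper, not provided by the model
            "description": "Look up current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

payload = {
    "model": "open-mistral-nemo",  # assumed API identifier for Mistral Nemo
    "messages": [{"role": "user", "content": "What's the weather in Paris?"}],
    "tools": tools,
    "tool_choice": "auto",  # let the model decide whether to call the tool
}

resp = requests.post(
    API_URL,
    headers={"Authorization": f"Bearer {API_KEY}", "Content-Type": "application/json"},
    data=json.dumps(payload),
    timeout=30,
)
resp.raise_for_status()
message = resp.json()["choices"][0]["message"]

# If the model chose to call the tool, its arguments arrive as a JSON string.
for call in message.get("tool_calls") or []:
    print(call["function"]["name"], json.loads(call["function"]["arguments"]))
```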
Information
- Website: https://mistral.ai/
- Published date: 2024-12-03
Data
- Monthly Visitors: 100
- Domain Rating: 100
- Authority Score: 100