Today marks a significant milestone in India’s AI journey. We are officially open-sourcing The Average Model 30B and 105B parameter models. Trained natively on 22 Indian languages, these models represent the most robust and culturally aware foundational models currently available to the open-source community.

Why we built this natively

Conventional models simply translate output from English back into regional languages, losing a great deal of cultural nuance, idiomatic meaning, and context along the way. Our models were instead trained on tokenized, localized datasets curated by linguistics experts across the subcontinent.

By bypassing the translation layer entirely, the models generate text and reason directly in Hindi, Tamil, Telugu, Marathi, and Bengali, among others, which also yields remarkably fast inference.

"Sovereign AI is not just about where the servers are located. It's about who controls the cultural and linguistic representation within the latent space."

Benchmarks to be proud of

Both models significantly outperform comparable open-source models on the IndicNLP benchmark suite. Across the standard evaluation frameworks:

  • Indic-MMLU: Achieved an average accuracy of 78.4%, beating the closest open competitor by 14 points.
  • Cultural Awareness Evaluation: Scored highest in localized idiomatic completions, demonstrating deep contextual understanding of regional subtleties.
  • Safety & Alignment: Evaluated by red-teaming experts specifically for Indian legal, social, and cultural safety guidelines.

The weights are available today

We are making the base models and the instruct-tuned models available via HuggingFace under an open, commercially permissive license. For enterprises looking to deploy these models securely on-premise or within managed Indian VPCs, our Sovereign Cloud Platform provides 1-click deployment APIs today.
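For a sense of what getting started might look like, here is a minimal sketch of loading an instruct-tuned checkpoint with the Hugging Face Transformers library. The repository id, helper names, and generation parameters below are illustrative assumptions, not the official ones; consult the model card on HuggingFace for the actual repo id and recommended settings.

```python
# Sketch: querying an instruct-tuned checkpoint via Hugging Face Transformers.
# MODEL_ID is a hypothetical placeholder -- check the official model card.
MODEL_ID = "the-average-model/TAM-30B-Instruct"


def build_chat(prompt: str) -> list[dict]:
    """Wrap a user prompt in the chat-message format instruct models expect."""
    return [{"role": "user", "content": prompt}]


def generate(prompt: str, max_new_tokens: int = 256) -> str:
    """Load the model lazily and generate a completion for a single prompt."""
    # Imports are deferred so the helpers above stay usable without the
    # (large) transformers/torch dependencies installed.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, device_map="auto")

    # Apply the model's own chat template, then generate and decode only
    # the newly produced tokens.
    inputs = tokenizer.apply_chat_template(
        build_chat(prompt), add_generation_prompt=True, return_tensors="pt"
    ).to(model.device)
    output = model.generate(inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True)
```

Since the models reason natively in Indic languages, a prompt such as `generate("मुझे दिवाली के बारे में बताइए")` would be answered directly in Hindi rather than translated from an English intermediate.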

We believe that giving developers access to world-class foundational tools is the only way to accelerate India's AI ecosystem. We can't wait to see what you build.
