Feb 17

Cohere Debuts Open Multilingual AI Family

Originally reported by TechCrunch

Cohere, a leading enterprise AI company, has unveiled a new suite of multilingual models during the ongoing India AI Summit. These models, branded "Tiny Aya," are open-weight, meaning their trained model weights are publicly available for developers to use and adapt. They support over 70 languages and are designed to run efficiently on common devices like laptops, entirely offline.

Developed by Cohere Labs, the company's dedicated research division, these models offer robust support for a diverse array of South Asian languages, including Bengali, Hindi, Punjabi, Urdu, Gujarati, Tamil, Telugu, and Marathi, among others.

The core Tiny Aya model has 3.35 billion parameters, compact enough to run on consumer hardware. Cohere has also introduced TinyAya-Global, a variant fine-tuned for instruction following and suited to applications demanding broad language coverage. Further expanding the family are regional iterations: TinyAya-Earth caters to African languages, TinyAya-Fire focuses on South Asian languages, and TinyAya-Water is optimized for the Asia Pacific, West Asia, and European regions.

In a recent statement, Cohere articulated the strategic advantage of this modular approach: “This strategy enables each model to cultivate deeper linguistic understanding and cultural sensitivity, resulting in systems that feel more intuitive and dependable for the communities they are designed to serve. Concurrently, all Tiny Aya models preserve broad multilingual capabilities, making them versatile foundations for subsequent customization and research.”

Cohere highlighted that these models, trained on relatively modest computing resources (a single cluster of 64 Nvidia H100 GPUs), are well suited for researchers and developers building applications for native-language speakers. Because the models run on-device, developers can implement offline translation features. The company further emphasized that its underlying software was engineered specifically for on-device deployment, requiring substantially less computing power than many comparable models.

For nations with rich linguistic diversity, such as India, this offline-enabled functionality presents a vast spectrum of potential applications and use cases, eliminating the need for continuous internet connectivity.

The Tiny Aya models are now accessible on Hugging Face, a prominent platform for AI model sharing and testing, as well as the Cohere Platform. Developers can download them for local deployment from Hugging Face, Kaggle, and Ollama. Cohere is also making its training and evaluation datasets available on Hugging Face, with plans to release a comprehensive technical report detailing its training methodology.

Last year, Cohere's CEO, Aidan Gomez, indicated the company's intention to go public “soon.” According to CNBC, Cohere concluded 2025 with strong financial performance, reporting $240 million in annual recurring revenue and achieving a consistent 50% quarter-over-quarter growth throughout the year.

Editorial Staff, Editor

The Editorial Staff at AIChief is a team of professional content writers with extensive experience in AI and marketing. Founded in 2025, AIChief has quickly grown into the largest free AI resource hub in the industry.
