Aya Vision Pushes AI Beyond Language Barriers with Multimodal Breakthroughs

Aya Vision is designed to operate fluently in 23 languages spoken by more than half the world’s population.
While AI models are becoming more sophisticated, one major difficulty remains: ensuring AI understands and responds intuitively across languages and modalities. Cohere For AI, Cohere's open research division, has launched Aya Vision to push the boundaries of multilingual and multimodal AI, offering a glimpse of a more inclusive and context-aware future for AI.

A New Standard for AI Evaluation

Traditionally, AI models were evaluated using rigid accuracy metrics. However, these methods frequently penalise models for minor deviations, such as punctuation or language choice, even when the intended meaning remains identical. This strict evaluation approach fails to capture what people genuinely want: AI that is natural, intuitive, and contextually aware. Recognising this limitation, Cohere created a

Anshika Mathews
Anshika is the Senior Content Strategist for AIM Research. She holds a keen interest in technology, related policy-making, and their impact on society. She can be reached at anshika.mathews@aimresearch.co
