Alibaba DAMO Academy announces the launch of SeaLLMs, a family of pioneering large language models (LLMs) available in 13-billion-parameter and 7-billion-parameter versions. The LLMs are designed specifically to cater to the linguistic diversity of Southeast Asia.
The models represent a technological leap forward in terms of inclusivity, offering optimized support for local languages in the region including Tagalog, Vietnamese, Indonesian, Thai, Malay, Khmer, Lao, and Burmese. The conversational models, SeaLLM-chat, exhibit great adaptability to the unique cultural fabric of each market, aligning with local customs, styles, and legal frameworks, and emerging as an invaluable chatbot assistant for businesses engaging with SEA markets.
SeaLLMs are now open-sourced on Hugging Face, with released checkpoints available for both research and commercial use.
“In our ongoing effort to bridge the technological divide, we are thrilled to introduce SeaLLMs, a series of AI models that not only understand local languages but also embrace the cultural richness of Southeast Asia,” said Lidong Bing, Director of the Language Technology Lab at Alibaba DAMO Academy. “This innovation is set to hasten the democratization of AI, empowering communities historically underrepresented in the digital realm.”
Echoing this sentiment, Luu Anh Tuan, Assistant Professor in the School of Computer Science and Engineering (SCSE) at Nanyang Technological University, a long-term partner of Alibaba in multi-language AI study, said, “Alibaba’s strides in creating a multi-lingual LLM are impressive. This initiative has the potential to unlock new opportunities for millions who speak languages beyond English and Chinese. Alibaba’s efforts in championing inclusive technology have now reached a milestone with SeaLLMs’ launch.”
The SeaLLM-base models underwent pre-training on a diverse, high-quality dataset inclusive of the SEA languages, ensuring a nuanced understanding of local contexts and native communication. This foundation underpins the chat models, SeaLLM-chat, which benefit from advanced fine-tuning techniques and a custom-built multilingual dataset. As a result, chatbot assistants based on these models can not only comprehend but also respect and accurately reflect the cultural context of the region's languages, including social norms and customs, stylistic preferences, and legal considerations.
A notable technical advantage of SeaLLMs is their efficiency, particularly with non-Latin languages. For non-Latin languages such as Burmese, Khmer, Lao, and Thai, they can interpret and process up to nine times more text (or, equivalently, use far fewer tokens for the same text) than models like ChatGPT. That translates into more complex task execution capabilities, reduced operational and computational costs, and a lower environmental footprint.
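To see why non-Latin scripts can be so costly for tokenizers that lack dedicated vocabulary for them, consider a minimal sketch. This is not SeaLLMs' actual tokenizer: it only illustrates the byte-level fallback effect, where a tokenizer without merges for a script falls back to raw UTF-8 bytes, and each Thai character then costs three bytes (roughly three tokens) while a common English word is often a single token. The example strings are illustrative, not drawn from any benchmark.

```python
# Illustrative sketch: byte-level fallback inflates token counts
# for non-Latin scripts. Each Thai character is 3 bytes in UTF-8,
# so a tokenizer with no Thai merges pays ~3 tokens per character.

english = "hello"    # 5 characters, 5 UTF-8 bytes
thai = "สวัสดี"       # 6 characters ("hello" in Thai), 18 UTF-8 bytes

eng_bytes = len(english.encode("utf-8"))
thai_bytes = len(thai.encode("utf-8"))

# Bytes per character: 1.0 for the English word, 3.0 for the Thai word.
print(eng_bytes / len(english), thai_bytes / len(thai))
```

A tokenizer whose vocabulary includes Thai, Khmer, Lao, and Burmese merges can collapse these byte runs into far fewer tokens, which is the efficiency gain the paragraph above describes.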
Furthermore, SeaLLM-13B, with 13 billion parameters, outshines comparable open-source models in a broad range of linguistic, knowledge-related, and safety tasks, setting a new standard for performance. When evaluated on the M3Exam benchmark (a benchmark consisting of exam papers from primary school through university entrance examinations), SeaLLMs display a profound understanding of a spectrum of subjects, from science, chemistry, and physics to economics, in SEA languages, outperforming their contemporaries.
In the FLORES benchmark, which assesses machine translation capabilities between English and low-resource languages—those with limited data for training conversational AI systems, such as Lao and Khmer—SeaLLMs excel. They surpass existing models in these low-resource languages and deliver performances on par with state-of-the-art (SOTA) models in most high-resource languages, such as Vietnamese and Indonesian.
Alibaba DAMO Academy’s SeaLLMs series is not just an advancement in AI but a step towards a more inclusive digital future. For an in-depth look at SeaLLMs’ capabilities and impact, visit the project page on Hugging Face: SeaLLMs – Language Models for Southeast Asian Languages and the technical report: https://arxiv.org/abs/2312.00738.