Kazakh company Gen2B develops the first large language model for Armenian

Kazakh tech company Gen2B, founded by venture investor Bakht Niyazov and tech entrepreneur Armen Atayan, has unveiled HyGPT, the first large language model (LLM) for Eastern Armenian. The model was developed in collaboration with Armenia’s National Center for AI and Technology (NCCAIT).

HyGPT is built on the Gemma 2 architecture and trained on 10 billion Armenian tokens. It shows strong performance on tasks such as translation, question answering, mathematical reasoning, and instruction following. On several Armenian-language benchmarks, HyGPT-10b-it outperforms international models with significantly larger parameter counts.

HyGPT is now available for download on the Hugging Face platform and can be used in educational projects, digital products, public services, and business applications.

This marks Gen2B’s second success with a regional language model, following the release of Irbis-GPT, a Kazakh-language model launched at the end of 2024. Irbis-GPT has already been downloaded more than 8,500 times. The Gen2B team’s next focus is developing a model for the Uzbek language.

While Gen2B’s Kazakh, Armenian, and other low-resource language models and AI assistants are already used by clients in sales, customer support, and process automation, they represent only a small part of a much broader system. At the core of Gen2B’s solutions is a platform that functions as an autonomous AI: rather than merely supporting teams or clients, it analyzes, learns, and generates managerial recommendations independently, based on data from all communication channels. The company is building a new kind of ecosystem for AI-first companies, in which AI can serve as a fully fledged manager of business units.

All models are available free of charge for both personal and commercial use in Kazakh and Armenian. The only exceptions are banks, telecommunications providers, insurance companies, and other financial institutions, which must obtain permission from Gen2B before using the models.