On a drizzly and windswept afternoon this summer, I visited the headquarters of Rokid, a startup developing smart glasses in Hangzhou, China. As I chatted with engineers, their words were swiftly translated from Mandarin to English and then transcribed onto a tiny translucent display just above my right eye by one of the company's new prototype devices.
Rokid's high-tech spectacles use Qwen, an open-weight large language model developed by the Chinese ecommerce giant Alibaba.
Qwen, whose full name in Chinese is 通义千问, or Tōngyì Qiānwèn, is not the best AI model around. OpenAI's GPT-5, Google's Gemini 3, and Anthropic's Claude typically score higher on benchmarks designed to gauge different dimensions of machine cleverness. Nor is Qwen the first truly cutting-edge open-weight model; that was Meta's Llama, which the social media giant released in 2023.
Yet Qwen, along with other Chinese models from DeepSeek, Moonshot AI, Z.ai, and MiniMax, is increasingly popular because these models are both very good and very easy to tinker with. According to Hugging Face, a company that provides access to AI models and code, downloads of open Chinese models on its platform surpassed downloads of US ones in July of this year. DeepSeek shook the world by releasing a cutting-edge large language model built with far less compute than its US rivals used, but OpenRouter, a platform that routes queries to different AI models, says Qwen has risen rapidly in popularity through the year to become the second-most-popular open model in the world.
Qwen can do most things you'd want from an advanced AI model. For Rokid's users, this might include identifying products snapped by a built-in camera, getting directions from a map, drafting messages, searching the web, and so on. Since Qwen can easily be downloaded and modified, Rokid hosts a version of the model fine-tuned to suit its applications. It is also possible to run a teensy version of Qwen on smartphones or other devices in case the internet connection goes down.
Before going to China I installed a small version of Qwen on my MacBook Air and used it to practice some basic Mandarin. For many applications, modestly sized open source models like Qwen are just about as good as the behemoths that live inside massive data centers.
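For the curious, here is a minimal sketch of what running a small Qwen model on a laptop can look like, using Hugging Face's transformers library. The specific checkpoint and prompt below are illustrative assumptions, not the exact setup I used.

```python
# Minimal sketch: run a small open-weight Qwen checkpoint locally with Hugging Face transformers.
# The model name and prompt are illustrative assumptions, not details from the article.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_name = "Qwen/Qwen2.5-0.5B-Instruct"  # a small Qwen variant that fits on a laptop
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForCausalLM.from_pretrained(model_name, torch_dtype="auto", device_map="auto")

# Ask for a short Mandarin practice exchange.
messages = [{"role": "user", "content": "Teach me how to ask for directions in Mandarin."}]
prompt = tokenizer.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Generate a reply and strip the prompt tokens before decoding.
output_ids = model.generate(**inputs, max_new_tokens=200)
reply = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(reply)
```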
The rise of Qwen and other Chinese open-weight models has coincided with stumbles for some well-known American AI models in the last year. When Meta unveiled Llama 4 in April 2025, the model's performance was a disappointment, failing to reach the heights of popular benchmarks like LM Arena. The slip left many developers looking for other open models to play with.
