Alibaba's Qwen AI team was hit by a sudden leadership upheaval as technical lead Junyang Lin and several key researchers departed immediately after the launch of Qwen 3.5. Lin announced his exit on X with a brief post: "me stepping down. bye my beloved qwen." Colleagues indicated the departure was not voluntary. The timing is particularly striking: Qwen has become the most-forked open-source model family on Hugging Face, with over 700 million cumulative downloads and nearly 400 open-sourced models.
Multiple Key Researchers Exit During Peak Success
The departures extended beyond Lin to include Binyuan Hui (code development lead, who reportedly joined Meta in January 2026), Yu Bowen (post-training lead), and Chen Cheng (model contributions). Reports from LatePost suggest the exits followed a reorganization at Alibaba's Tongyi AI Lab that fragmented the unit, splitting pre-training, post-training, text, and multimodality into separate teams. Chen Cheng posted on social media: "I know leaving wasn't your choice. Just last night, we were side by side launching the Qwen3.5 small model. I honestly can't imagine Qwen without you."
Qwen 3.5 Launch Showcases Technical Achievements
The exodus coincided with a major technical milestone: Qwen 3.5 launched on March 2 with four new small models—Qwen3.5-0.8B/2B/4B/9B—targeting edge devices, lightweight agents, and memory-constrained server deployments. All four models are natively multimodal, handling both text and images in a single model. The 2B variant runs in just 7GB of RAM as a full reasoning and multimodal model, and users report the 0.8B model running directly on phones.
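The 7GB figure for the 2B variant is plausible on a back-of-envelope basis. A minimal sketch of the arithmetic, assuming bf16 weights (2 bytes per parameter) and a flat overhead allowance for KV cache, activations, and runtime—these assumptions are illustrative, not official Qwen numbers:

```python
# Back-of-envelope RAM estimate for a 2B-parameter model at inference time.
# Assumptions (not from the Qwen release notes): bf16 weights at 2 bytes
# per parameter, plus a flat ~3 GB allowance for KV cache, activations,
# and runtime overhead.

def estimate_ram_gb(params_billions: float,
                    bytes_per_param: float = 2.0,  # bf16
                    overhead_gb: float = 3.0) -> float:
    """Return rough RAM need in GiB: weights plus a flat overhead allowance."""
    weights_gb = params_billions * 1e9 * bytes_per_param / 1024**3
    return weights_gb + overhead_gb

print(round(estimate_ram_gb(2.0), 1))  # ≈ 6.7 (3.7 GiB weights + 3 GiB overhead)
```

Under these assumptions a 2B model lands just under 7GB, consistent with the reported footprint; a 4-bit quantized build would shrink the weights term roughly fourfold.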
Developer Community Expresses Concern
The developer community has voiced significant concern about the potential disbanding of a team with a strong track record of building effective smaller models. Analysis from Recode China AI noted that Lin's "defining achievement was turning Qwen from an unknown side project into the world's most influential open-source LLM series." Lin joined Alibaba fresh out of Peking University in July 2019 and was a key contributor to Alibaba's early AI work, including the M6 and OFA models, before building Qwen. The uncertainty comes at a critical moment, just as Qwen has established itself as a leading force in open-source AI development.
Key Takeaways
- Qwen technical lead Junyang Lin and multiple key researchers departed following a reorganization that fragmented the Tongyi AI Lab into separate teams
- Qwen has become the most-forked open-source model family on Hugging Face with over 700 million downloads and nearly 400 open-sourced models
- Qwen 3.5 launched with four new models ranging from 0.8B to 9B parameters, including a 2B variant that runs on just 7GB RAM
- Code development lead Binyuan Hui reportedly left in January 2026 and joined Meta
- The developer community has expressed concern about the future of a team known for developing highly effective smaller models