Alibaba released Qwen3-Coder, a 480B-parameter mixture-of-experts (MoE) model that activates 35B parameters per token. Built to code, debug, and automate agentic workflows, it natively handles 256K tokens of context and can stretch to 1 million. The weights ship as Qwen3-Coder-480B-A35B-Instruct on Hugging Face and GitHub, and the model plugs into the Qwen Code CLI or Claude Code.
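The "activates 35B of 480B parameters per token" claim reflects standard MoE top-k routing: each token is sent to a small subset of expert networks, so only a fraction of the model's weights fire per token. A toy sketch of that routing idea (illustrative only; the expert counts, dimensions, and gating here are hypothetical, not Qwen3-Coder's actual architecture):

```python
import numpy as np

# Toy mixture-of-experts routing sketch. An MoE layer holds many
# expert FFNs but routes each token to a small top-k subset, so only
# a fraction of the layer's parameters is "active" for that token.

rng = np.random.default_rng(0)

n_experts = 8   # total experts in the layer (hypothetical)
top_k = 2       # experts activated per token (hypothetical)
d_model = 16    # hidden size (hypothetical)

# Each expert is a single weight matrix here for simplicity.
experts = [rng.standard_normal((d_model, d_model)) for _ in range(n_experts)]
router = rng.standard_normal((d_model, n_experts))  # gating weights

def moe_forward(x):
    """Route one token vector x through its top-k experts."""
    logits = x @ router
    top = np.argsort(logits)[-top_k:]   # indices of the top-k experts
    gates = np.exp(logits[top])
    gates /= gates.sum()                # softmax over the selected experts
    return sum(g * (x @ experts[i]) for g, i in zip(gates, top))

token = rng.standard_normal(d_model)
out = moe_forward(token)
print(f"active experts per token: {top_k}/{n_experts}")
```

Sparse activation is why a 480B-parameter model can serve tokens at roughly the compute cost of a 35B dense model.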
Trend to watch: agentic coding models are pairing ever-larger context windows with CLI tools to drive next-generation coding workflows.