On March 6, Alibaba released and open-sourced its new reasoning model, QwQ-32B, featuring 32 billion parameters. Despite being significantly smaller than DeepSeek-R1, which has 671 billion parameters (with 37 billion active), QwQ-32B matches its performance on various benchmarks. QwQ-32B excelled in math and coding tests, outperforming OpenAI's o1-mini and the distilled versions of DeepSeek-R1, and scored higher than DeepSeek-R1 itself on some evaluations, including LiveBench and IFEval. The model leverages reinforcement learning and integrates agent capabilities for critical thinking and adaptive reasoning. Notably, QwQ-32B requires far less computational power, making it deployable on consumer-grade hardware. The release aligns with Alibaba's AI strategy, which includes significant investments in cloud and AI infrastructure. Following the announcement, Alibaba's US stock rose 8.61% to $141.03, with its Hong Kong shares up over 7%. [Jiemian, in Chinese]