Meituan has quietly launched its new-generation large model "LongCat-2.0-Preview"! It sets a record for the largest training run on domestic Chinese computing power and specializes in AI Agent.
動區 BlockTempo2026-04-28 11:30:05

Original title: 美團低調上線新一代大模型「LongCat-2.0-Preview」!創中國算力最大訓練紀錄,專攻 AI Agent
A major move made in silence: Meituan has quietly launched its new-generation large model "LongCat-2.0-Preview" without any official announcement or open-source release. The model, built specifically for AI Agent workloads, not only offers a million-token context window but is also reported by industry insiders to have been trained entirely on 50,000 to 60,000 domestic Chinese accelerator cards, which would make it the largest-scale model training run completed on domestic computing power to date. Access is currently invitation-only, with a free quota of 10 million tokens per day.

(Previous coverage: JPMorgan: AI is not a job-stealer but a productivity multiplier; demand expansion is the key to employment) (Background: Are on-chain trading cards a viable business? Renaiss platform hits over $10 million in trading volume in 5 months as the secondary market heats up)

As the global AI arms race intensifies, Chinese tech giant Meituan has chosen to flex its muscles in an extremely low-key yet striking manner. Recently, multiple industry insiders and developers noticed that a new model named LongCat-2.0-Preview had quietly appeared on Meituan's LongCat API platform. The changelog is dated April 20, yet Meituan has so far released no official press release, technical report, or open-source code; access is currently invitation-only via the API. Looking back at the LongCat series, every previous release since 2025, including Flash-Chat (a 560B MoE model), Flash-Thinking, and the Omni version, was accompanied by detailed official blog posts, technical white papers, and simultaneous open-source releases on Hugging Face and GitHub. For this cross-generational 2.0 version, however, Meituan has opted to quietly make its fortune in silence.
According to the brief official changelog, LongCat-2.0-Preview is built for productivity, with three core capabilities listed explicitly:

- Agent-oriented development: native support for Tool Calling, multi-step reasoning, and long-context tasks.
- Automated productivity: strong performance in code generation, automated workflows, and execution of complex instructions.
- Deep ecosystem integration: tight integration with agent-oriented coding tools such as Claude Code, OpenClaw, OpenCode, and Kilo Code.

With further media reports and insider leaks on April 24, details of the mysterious model finally surfaced. LongCat-2.0-Preview reportedly adopts a Mixture-of-Experts (MoE) architecture with total parameters exceeding the trillion level; both its total and activated parameter counts are said to be on the same order of magnitude as those of DeepSeek V4, which was released the same day. The model also supports a context window of up to 1M (one million) tokens, allowing it to process inputs of millions of words at once, with overall capability benchmarked against the most advanced international models such as GPT-5.5. Beyond the model's sheer parameter count, its strategic significance lies in the underlying compute: insiders claim the training and inference of LongCat-2.0-Preview were completed "entirely" on domestic Chinese computing clusters, reportedly using as many as 50,000 to 60,000 domestic accelerator cards. If accurate, this not only eliminates reliance on high-end overseas GPUs such as NVIDIA's but also sets the record for the largest large-model training run completed on domestic computing power to date. The model is currently open for invitation-only testing on the official website (longcat.ai), with a generous free quota of 10 million tokens per day for developers to try out.
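To make the "native Tool Calling" capability above concrete, here is a minimal sketch of how a developer might assemble a tool-calling request for such a model. This assumes an OpenAI-style chat-completions request schema, which many model APIs adopt; the actual LongCat API schema has not been published, and the `get_weather` tool, its parameters, and the payload shape are all hypothetical illustrations.

```python
import json

def build_tool_call_request(user_prompt: str) -> dict:
    """Assemble a chat request that advertises one callable tool
    (OpenAI-style schema assumed for illustration only)."""
    weather_tool = {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool name
            "description": "Look up the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
    return {
        "model": "LongCat-2.0-Preview",  # model id as reported in the article
        "messages": [{"role": "user", "content": user_prompt}],
        "tools": [weather_tool],
        "tool_choice": "auto",  # let the model decide whether to call the tool
    }

# Inspect the payload that would be POSTed to the (invitation-only) API.
request = build_tool_call_request("What's the weather in Beijing?")
print(json.dumps(request, indent=2))
```

In this pattern the model does not execute the tool itself; it returns a structured tool-call message, the client runs the function locally, and the result is sent back in a follow-up message for the model to reason over, which is what makes multi-step agent workflows possible.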