Google upgrades Gemini Deep Research Max: Integrates MCP for enterprise database connectivity and native charts, enabling analyst-level due diligence
動區 BlockTempo2026-04-22 02:36:41


Google announced a major upgrade to Gemini Deep Research, launching two new agents: Deep Research (speed-focused) and Deep Research Max (quality-focused). Both are built on the latest Gemini 3.1 Pro model and use the MCP protocol to connect to financial data platforms and enterprise internal data sources such as FactSet, S&P Global, and PitchBook.

(Context: OpenAI unlocks Deep Research: Paid users can perform 10 queries per month; Microsoft releases multimodal AI agent Magma)
(Background: OpenAI launches "ChatGPT Agent"! Integrating Operator and Deep Research: Capable of booking tickets, ordering food, and creating presentations)

Google announced the upgrade last night (the 21st), and it is entering public testing for the first time via the Gemini API paid plan. The core difference between Deep Research and Deep Research Max lies in "extended test-time compute." The Max agent does not simply run once and submit: it iteratively reasons, searches, and self-corrects, like a tireless research assistant, until it judges that the report meets its quality bar. Google states that Max delivers a step-change in industry-standard retrieval and reasoning benchmarks. Compared with the preview version from last December, the number of sources consulted has increased significantly, allowing it to capture key nuances the previous version missed and to proactively cite authoritative sources such as SEC filings and peer-reviewed journals when weighing conflicting evidence. Users can schedule a run overnight, and by the time the analyst team arrives at the office in the morning, a complete due diligence report is waiting in their inbox. Speed is not the point; depth is.
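The "extended test-time compute" loop described above can be sketched in plain Python. Everything here is an illustrative stub, not Google's actual implementation: the search, drafting, and self-scoring functions stand in for model calls, and the quality threshold and round budget are invented parameters.

```python
def run_deep_research(question, search, draft, score,
                      quality_bar=0.9, max_rounds=8):
    """Iterate reason -> search -> revise until the report passes the bar."""
    evidence, report, rounds = [], "", 0
    for rounds in range(1, max_rounds + 1):
        evidence.extend(search(question, report))  # targeted follow-up search
        report = draft(question, evidence)         # revise with new evidence
        if score(report) >= quality_bar:           # self-critique quality gate
            break
    return report, rounds

# Toy stubs: each round discovers one new source, and self-assessed
# quality grows with the number of sources cited.
def toy_search(question, report):
    return [f"src-{report.count('src-')}"]

def toy_draft(question, evidence):
    return f"Report on {question} citing {len(evidence)} sources: " + " ".join(evidence)

def toy_score(report):
    return min(1.0, report.count("src-") / 5)

report, rounds = run_deep_research("EV battery market",
                                   toy_search, toy_draft, toy_score)
print(rounds)  # 5 rounds until the toy quality bar is met
```

The point of the sketch is the stopping condition: the agent decides for itself when the evidence base is deep enough, rather than returning after a fixed single pass.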
In contrast, the standard Deep Research focuses on sharply reduced latency and cost, replacing the December preview as the default option for interactive scenarios where users need real-time Q&A and do not require Max's time-consuming deep mining.

MCP support: from searching the web to searching any database

This upgrade also adds native support for MCP (Model Context Protocol). Previously, the agent could only retrieve public web information; through MCP, it can now connect seamlessly to enterprise-customized data sources and professional data feeds. The practical significance: a finance department can expose its internal ERP system and private market-data APIs through an MCP server, and Deep Research can then search the public web, Bloomberg terminal data, and internal databases within a single research workflow, with no manual tool switching.

Google also announced partnerships with FactSet, S&P Global, and PitchBook. The three institutions co-designed MCP servers that let clients pull financial and market data from these platforms directly into the Deep Research workflow. For investment banks, private equity firms, and research institutions, the value of this bridge is self-evident.

In terms of tool combinations, users can enable Google Search, remote MCP, URL Context, Code Execution, and File Search simultaneously; they can also disable the web entirely, so the agent operates only within custom databases, which is particularly critical for enterprise clients with data-leakage concerns.

Beyond MCP, the upgrade adds several new capabilities. First is native charts and infographics. This is a first for the Gemini API: Deep Research no longer outputs only text but can directly generate HTML charts or Nano Banana infographics, upgrading research reports from plain text to visualized analysis files. Second is collaborative planning.
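The idea of routing web search, vendor feeds, and internal databases through one uniform protocol can be sketched in plain Python. This is illustrative only: the server names, record shapes, and `search(query)` interface below are assumptions, not the actual MCP servers Google and its partners ship.

```python
# Each stub "MCP server" exposes the same search(query) shape, so the agent
# can fan one query out to any mix of sources -- or drop the web entirely
# for an offline, internal-data-only run.

class StubMCPServer:
    def __init__(self, name, records):
        self.name = name
        self.records = records

    def search(self, query):
        # Trivial substring match standing in for a real data-source query.
        return [(self.name, r) for r in self.records
                if query.lower() in r.lower()]

web     = StubMCPServer("google_search", ["Acme Corp announces Q3 earnings"])
factset = StubMCPServer("factset",      ["Acme Corp revenue: $1.2B"])
erp     = StubMCPServer("internal_erp", ["Acme Corp contract #4411, renewal due"])

def research(query, servers):
    """Fan one query out across every enabled data source."""
    hits = []
    for server in servers:
        hits.extend(server.search(query))
    return hits

# Full run: web + vendor feed + internal database, one workflow.
full_hits = research("Acme", [web, factset, erp])
# Web disabled for data-sensitive clients: internal sources only.
offline_hits = research("Acme", [factset, erp])
print(len(full_hits), len(offline_hits))  # 3 2
```

The design point mirrors the article: because every source speaks the same protocol, disabling the web is just removing one server from the list, not rewiring the agent.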
Before executing research, the agent now generates a research plan that users can review, guide, and modify before letting it proceed. This allows more granular control over the scope of the investigation, moving from the "throw in a question, wait for a report" black box toward human-AI co-definition of the research framework.

Third is real-time streaming. The system surfaces the agent's intermediate reasoning steps, and a live thought summary lets users see what the agent is doing while they wait. Text and images are generated and returned in real time, significantly reducing the uncertainty of long waits.

On multimodal grounding, Deep Research can now ingest PDF, CSV, image, audio, and video inputs, eliminating manual pre-processing for cross-format data integration.

The arrival of Deep Research Max marks, to some extent, a new stage of maturity for AI agents in enterprise research workflows. In the past, AI-assisted research mostly stopped at "summarize this file for me" or "search a few articles for me": essentially an automated search assistant. But when an agent can iteratively reason, autonomously weigh conflicting evidence, cite SEC filings, and reach private financial databases via MCP, its work comes closer to the due diligence performed by a junior analyst. Of course, "closer to" does not mean "replaces." How to verify the agent's reasoning logic, how to manage its access permissions to private data, and how to use AI-
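The collaborative-planning flow described above (propose, review, modify, then execute) can be sketched in a few lines. The step names and plan structure here are invented for illustration; the source does not specify how the actual plan object looks.

```python
# Sketch of human-AI collaborative planning: the agent proposes a plan,
# the user edits it, and only the approved plan is executed.

def propose_plan(question):
    """Agent side: draft a research plan for user review (illustrative steps)."""
    return [f"Define scope of {question}",
            f"Search filings and journals on {question}",
            f"Draft report on {question}"]

def execute(plan):
    """Agent side: run only the steps the user approved."""
    return [f"done: {step}" for step in plan]

plan = propose_plan("lithium supply chain")
# Human side: narrow the investigation before execution starts.
plan.insert(1, "Restrict sources to SEC filings only")
results = execute(plan)
print(len(results))  # 4 steps executed, including the user's constraint
```

The contrast with the old black box is that the user's constraint is injected between planning and execution, not bolted on after the report arrives.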