As agentic workflows begin to disrupt traditional business models, "AI + cloud data platform" software stocks ride the wave

Zhitongcaijing·12/11/2025 09:33:02

The Zhitong Finance App learned that, after attending Amazon's AWS re:Invent 2025, senior analysts at Wall Street banking giant Citigroup (Citi) believe that, at least through the end of 2026, intelligent cloud data platform software companies focused on "AI-driven, usage-based billing tied to data and core cloud resource consumption" will be the main track lifting both performance growth and valuations in the global software sector.

Citigroup's analysts expect AI agent workflows to become the dominant force in enterprise software and cloud computing spending. The most direct beneficiaries will undoubtedly be the leading cloud vendors (hyperscalers), while "AI + cloud data platform" software companies that bill on actual data and usage consumption, industry leaders such as Oracle, Snowflake, MongoDB, and Elastic, will be the biggest winners of this wave of software and cloud spending. For the software sector, Citi's overall positioning is therefore to add consumption-based "AI + cloud data / cloud observability" intelligent cloud data platforms and to treat closed, increasingly expensive one-stop AI application platforms with caution.

Agentic workflows refer to systems in which AI agents independently complete multi-step tasks, such as data extraction, analysis, and the final decision or action. Citi believes global enterprises are shifting from "AI experimentation" to "the early stage of large-scale AI agent deployment," a trend it expects to keep enterprise software and cloud computing spending growing 20-35% in 2026, following roughly 50% growth in 2025.
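For readers unfamiliar with the term, the basic shape of such a workflow can be sketched in a few lines of Python. This is only an illustrative outline under our own assumptions: the step names (`extract_data`, `analyze`, `decide_action`) are hypothetical placeholders rather than any vendor's API, and real agent frameworks layer planning, tool calling, and error handling on top of this loop.

```python
# Illustrative sketch of an agentic workflow: the agent chains multiple
# steps (extraction -> analysis -> decision) end to end, without a human
# driving each step. All function names are hypothetical placeholders.

def extract_data(source: str) -> list[dict]:
    # Step 1: pull raw records from a data source (database, API, files, ...).
    return [{"customer": "acme", "monthly_spend": 120_000}]

def analyze(records: list[dict]) -> dict:
    # Step 2: reason over the extracted data and summarize it.
    total = sum(r["monthly_spend"] for r in records)
    return {"total_spend": total, "over_budget": total > 100_000}

def decide_action(analysis: dict) -> str:
    # Step 3: choose a concrete action based on the analysis.
    return "alert_finance_team" if analysis["over_budget"] else "no_action"

def run_agent(source: str) -> str:
    # The "agentic" part: each step's output feeds the next, autonomously.
    records = extract_data(source)
    analysis = analyze(records)
    return decide_action(analysis)

if __name__ == "__main__":
    print(run_agent("billing_db"))  # -> "alert_finance_team"
```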

Citibank analysts said that customers are adopting AI data engineering tools in their own environments ever more intensively, reflected in a clear acceleration in the use of Snowflake Intelligence/Cortex (Snowflake's AI function modules). Front-line anecdotes suggest a single AI use case can generate roughly US$200,000 of incremental consumption, and some large customers run more than ten use cases, which implies an even more impressive increase in usage and spending. A major US airline said it is testing Elastic for AI observability and plans to replace Splunk; Citi also sees strong customer demand for Elastic's "AI + cloud security" capabilities, with customers looking to consolidate endpoint security through its security products.

Previously, cloud computing and search engine leader Google, "AI + digital advertising" leader AppLovin, "AI + data analytics" leader Palantir, Facebook and Instagram parent Meta, and US cloud software giant Salesforce, which focuses on customer relationship management (CRM) software, have all published extremely strong results and guidance this year. This means demand is extremely strong not only for AI computing infrastructure, represented by Nvidia's AI GPUs, but also for AI software applications, especially enterprise-grade AI applications that can comprehensively improve B-side operating efficiency, which are rapidly penetrating every industry.

Judging from the current technological trajectory, AI application software is developing along the line of generative AI applications (such as DeepSeek, ChatGPT, Sora, and Anthropic's Claude), and on that generative foundation, AI functionality is shifting from chatbot-style question-and-answer toward AI agents that independently perform complicated, multi-step tasks. Companies' urgent need to improve efficiency and reduce operating costs is vigorously driving widespread adoption of the two core categories of AI application software: generative AI applications and AI agents. Of the two, AI agents are most likely to be the dominant trend in AI applications before 2030; their emergence means artificial intelligence is beginning to evolve from an information support tool into a highly intelligent productivity tool.

As far as the bullish narrative for the global AI application sector is concerned, Gemini 3, the major launch from cloud computing and search engine leader Google, instantly became popular worldwide, while the strong results and outlooks recently announced by Cloudflare, a cloud services company positioned around the "connectivity cloud," and by software leaders such as Salesforce and MongoDB, signal that the latest AI applications are actively expanding into the enterprise and consumer markets, verifying the viability of the AI application story and warming up a potential acceleration in growth after 2026.

Google Gemini 3 sparks a new wave of AI applications! GPT-5.2 is poised to come from behind

According to Citigroup's analyst team, the "AI + cloud data intelligence platforms" mentioned above, such as Snowflake and MongoDB, are moving from the "grand narrative stage" to the "performance and cash flow delivery stage," and the outline of a super bull market in these stocks is taking shape at an accelerating pace. In addition, after Google launched the Gemini 3 AI application ecosystem in late November, this cutting-edge AI application became popular worldwide and drove an instantaneous surge in demand for Google's AI computing power; together with the strong results announced earlier, this highlights that "AI + cloud data platform" software companies such as MongoDB are entering a strong, Nvidia-style performance growth trajectory.

Google's Gemini 3 has undoubtedly set off a new wave of AI applications around the world. Immediately after release, the Gemini 3 series drove such enormous AI token-processing volumes that Google drastically reduced free access quotas for Gemini 3 Pro and Nano Banana Pro and imposed temporary limits even on Pro subscribers. Combined with South Korea's recent export data, which show continued strong demand for HBM memory and enterprise-grade SSDs, this further confirms that the AI boom is still in an early build-out stage in which computing infrastructure remains in short supply.

Notably, OpenAI, the developer of ChatGPT, is determined not to concede. With competition between Google and OpenAI in full swing, this wave of AI applications is expected to spread rapidly across industries, a major positive for the long-term growth prospects of "AI + cloud data platform" software companies such as MongoDB.

Sharp-eyed netizens have already spotted telltale signs of GPT-5.2: screenshots circulating in the developer community show GPT-5.2 and GPT-5.2-Thinking options suddenly appearing in Cursor's model drop-down menu. This suggests OpenAI's developers understand that AI-assisted programming is not only a killer application for AI models but also the field that best showcases model reasoning ability. All in all, a spark-filled "AI application superwar" between Google and OpenAI appears about to begin.

Many clues suggest that GPT-5.2 has significantly surpassed Gemini 3. Comparison charts circulating on social media show GPT-5.2 almost completely "crushing" Gemini 3 and Claude 4.5 on core large-model benchmarks, and OpenAI chief Sam Altman has also claimed, citing an internal evaluation, that the new model will lead Google's rival in reasoning ability.


MongoDB's high-quality earnings growth, together with the strong overall revenue growth of Google Cloud shown in Google's recent earnings reports, the surge in tokens processed by the Gemini product line, and Google's huge AI capex (AI-related capital expenditure), all indicate that demand for cloud data platforms across the AI development-to-deployment stack, and for cloud AI inference computing on platforms such as Google Cloud, remains at a very high level of activity.

More importantly, the blockbuster launch of Google's Gemini 3 series of AI products, including Nano Banana Pro, which quickly became popular with enterprise and consumer users worldwide, highlights that the entire "Google AI ecosystem," MongoDB included, is still on a strong growth trajectory.

Relying on the long-standing deep integration of Atlas with Google Cloud and Vertex AI, MongoDB can be described as a core beneficiary at the cloud database and vector search layer of this unprecedented AI superwave. The AI application stack represented by Atlas + Vertex AI is rapidly entering the practical, at-scale stage, pointing to surging demand among both B-side and C-side users for real AI applications such as generative AI software, AI search, AI recommendations, and AI agents, with AI workflows rapidly penetrating businesses and consumers alike.

According to the results MongoDB released in early December, revenue across its core businesses exceeded Wall Street's consensus expectations, and the company issued extremely strong guidance for the next fiscal quarter and the full year. MongoDB not only develops the database software itself but also provides a complete set of cloud services and commercialization support around it. The extremely strong results driven by the long-standing deep integration of its cloud database platform service, MongoDB Atlas, with the Google Cloud platform underscore that MongoDB is one of the biggest winners within the "Google AI ecosystem," which is benefiting fully from the surge in AI computing demand around Google's Gemini product portfolio.

The core logic of AI investing is changing: gradually moving from a "computing power story" toward "agentic workflows and data usage that generate real revenue"

After the "first phase of AI investment" in 2023-2024, dominated by computing power and infrastructure, the market is clearly shifting toward the main line of "AI applications and cloud data intelligence platforms." Software companies such as Salesforce, MongoDB, AppLovin, Meta, and Google have recently begun to consistently deliver better-than-expected revenue and profit growth in advertising efficiency, marketing automation, cloud data intelligence platforms, and native AI applications within developer and enterprise workflows. Some have even shown increases in per-customer value (ARPU), usage time, and cloud resource consumption driven by the popularity of AI agents.

As Citi noted in its re:Invent feedback, cloud vendors and ISVs are rebuilding their business models around "consumption-based billing plus cloud data intelligence and actual data usage for AI workloads," which in effect means AI is no longer just a capex story but is turning into measurable revenue elasticity and a longer-term growth curve.

AI-driven cloud databases, cloud observability and monitoring, intelligent marketing and advertising delivery platforms, and large internet/cloud platform companies with their own AI models and complete development-to-deployment AI ecosystems may become the core assets of this new wave of AI applications: they can share in the computing consumption and data-call dividends of the AI inference blue-ocean era, and they can lock in customers over the long term through platform ecosystems, continuously improving margins and cash flow quality.

For example, MongoDB's growth logic has shifted from that of a "general-purpose document database" to a cloud database route clearly tied to Google Cloud Platform and the Gemini series across AI application development and deployment; long-term, deep cooperation with Google is an essential part of its strong results.

MongoDB can be described as a "super-smooth outlet" for the surge in AI inference computing demand on the platforms of large cloud vendors such as Google. As a managed database, MongoDB Atlas runs natively on Google Cloud and is deeply integrated with core data and AI developer services such as Vertex AI and BigQuery. MongoDB's own case studies also mention that the vast majority of AI startups use Google Cloud plus MongoDB Atlas as their primary database and cloud AI infrastructure, running LLM (large language model) based AI applications and deploying complete AI platforms in the cloud.
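To make the Atlas + Vertex AI integration described above a bit more concrete, the sketch below shows the typical pattern: embed a query with a Vertex AI embedding model, then run an Atlas Vector Search aggregation against a MongoDB collection. It is only a minimal sketch under stated assumptions: the project ID, connection string, collection, embedding-model name, and the vector index named "vector_index" on the "embedding" field are placeholders, and a matching vector search index must already exist on the Atlas cluster.

```python
# Sketch: semantic search over documents in MongoDB Atlas, using Vertex AI
# embeddings plus the Atlas Vector Search ($vectorSearch) aggregation stage.
# Project ID, connection string, and index/field names are placeholders.
import vertexai
from vertexai.language_models import TextEmbeddingModel
from pymongo import MongoClient

vertexai.init(project="my-gcp-project", location="us-central1")          # placeholder project
embed_model = TextEmbeddingModel.from_pretrained("text-embedding-004")   # assumed model name

client = MongoClient("mongodb+srv://user:pass@cluster.example.mongodb.net")  # placeholder URI
collection = client["appdb"]["articles"]

def semantic_search(query: str, k: int = 5) -> list[dict]:
    # 1) Turn the query text into a vector with Vertex AI.
    query_vector = embed_model.get_embeddings([query])[0].values

    # 2) Ask Atlas Vector Search for the k nearest documents.
    #    Assumes a vector index named "vector_index" on the "embedding" field.
    pipeline = [
        {
            "$vectorSearch": {
                "index": "vector_index",
                "path": "embedding",
                "queryVector": query_vector,
                "numCandidates": 100,
                "limit": k,
            }
        },
        {"$project": {"_id": 0, "title": 1, "score": {"$meta": "vectorSearchScore"}}},
    ]
    return list(collection.aggregate(pipeline))

if __name__ == "__main__":
    for doc in semantic_search("usage-based billing for AI workloads"):
        print(doc)
```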

Citigroup analysts said front-line feedback from AWS re:Invent 2025 shows the real winners of AI are shifting from "storytelling model companies" to "cloud data and observability platforms that carry agentic workflows." Citi's positioning on the software sector is very clear: add cloud data, security, and observability platforms, along with the leading cloud vendors, whose usage-based models make them the software infrastructure of the AI application era, while remaining cautious on closed, expensive one-stop AI platforms such as Palantir, emphasizing performance and contract renewals rather than sentiment and narrative.

MongoDB's latest revenue and profit comprehensively exceeded expectations, and the company raised its full-year guidance, echoing Google's latest large-scale AI capex and even stronger cloud revenue growth. This indicates that public cloud vendors such as Google, Microsoft, and Amazon remain at a very high level of activity thanks to the build-out of AI computing infrastructure and the accelerating penetration of AI applications.

Today, Gemini has more than 650 million monthly active users, and the total number of tokens it processes each month has increased more than 20-fold within a year. With the explosive expansion of revenue on Google's cloud platform, surging demand for cloud-based AI inference computing, and a sharp rise in the penetration of Google's AI application ecosystem, Google's entire AI ecosystem can be described as thriving.