AI server orders are booming: Foxconn reaps the AI infrastructure dividend as Q4 revenue surges 22% to a record high

Zhitongcaijing·01/05/2026 09:57:05

The Zhitong Finance App learned that Foxconn, the world's largest electronics contract manufacturer, headquartered in Taiwan Province of China, reported record quarterly revenue on Monday. The company is benefiting fully from the frenzied demand for artificial intelligence products (AI server clusters, AI smartphones, AI smart glasses, and more). In particular, the current unprecedented wave of AI infrastructure building continues to deliver exponential growth in Foxconn's AI server orders, driving its quarterly revenue to a record high.

As the largest contract manufacturer of AI server clusters for "AI chip superpower" Nvidia (NVDA.US), and the principal iPhone assembler for global consumer electronics leader Apple (AAPL.US), Foxconn said in its newly released revenue statement that overall fourth-quarter revenue jumped 22.07% from the same period in 2024 to a record NT$2.6028 trillion (approximately US$82.73 billion). The result comfortably beat the LSEG SmartEstimate consensus of NT$2.418 trillion; the SmartEstimate gives greater weight to forecasts from Wall Street analysts with a consistently stronger track record of accuracy.

Foxconn said revenue for the quarter (the fourth calendar quarter) posted significant sequential and year-over-year growth, exceeding market expectations and setting a high base of comparison heading into the first quarter of 2026. In US-dollar terms, management said fourth-quarter revenue rose a sharp 26.4%. Even against that very high fourth-quarter base, the market broadly agrees that strong global demand for AI server rack products will keep driving the company's performance toward its fastest growth of the past five years.

Overall, the strong growth was driven mainly by Foxconn's cloud computing and networking products division, where demand from enterprises and government agencies worldwide for the AI server compute clusters it builds has surged. Monthly and quarterly revenue at the smart consumer electronics division (which includes iPhones), by contrast, declined slightly on unfavorable exchange rates.

In December alone, Foxconn's overall revenue reached approximately NT$862.86 billion, up 31.77% year-on-year and a record for the month of December. The October-December stretch is typically when Foxconn's server and smartphone OEM orders grow the most.

The launch of Google's Gemini 3 series of models has driven enormous AI token-processing volumes, further confirming that "this unparalleled global AI buildout is still in the early stage of accelerated construction, with computing infrastructure in short supply." Bank of America said in a research report that the global AI arms race is still in its "early-to-middle stage," and that despite recent sharp pullbacks in popular chip stocks such as Nvidia and Broadcom, investors should keep their focus on the industry leaders. Vanguard, one of the world's largest asset managers, noted in a recent research report that the AI investment cycle may have reached only 30% to 40% of its eventual peak.

Foxconn — one of the biggest winners in the AI infrastructure boom

Foxconn is best known for assembling iPhones for Apple, but its business has expanded deep into AI computing infrastructure and automobile manufacturing. It is the world's largest AI server maker, one of the few able to mass-produce customized AI server rack systems for extremely large AI training and inference workloads, and one of the most critical suppliers of Nvidia's AI GPU compute-cluster equipment. It is no less important than TSMC, the "king of chip foundries."

Foxconn has become a direct beneficiary of this "unprecedented wave of AI infrastructure" because the trillion-dollar surge in AI capital expenditure is being spent not only on AI chips but also on the data center hardware layer: server and rack integration, large-scale data center interconnects and cabling, and cutting-edge cooling and power equipment.

Foxconn is both the largest manufacturer of AI server clusters built on Nvidia's Hopper and Blackwell architectures and the main assembler of Apple's iPhone lineup, and its 2025 results and order book are clearly tilted toward AI computing infrastructure: according to the company's disclosures and financial media reports, the revenue share of "cloud computing and networking products" (chiefly AI servers) has risen sharply to about 41% to 42% and has at times surpassed consumer electronics. That makes AI servers the company's core growth engine: as Nvidia ships AI GPUs at scale, Foxconn also captures AI training/inference system-integration share and large-scale AI computing dividends along the chain "from AI GPUs/AI ASICs to complete machines, and from complete servers to AI racks."

As AI training and inference infrastructure expands rapidly, market demand for rapid, large-scale delivery is intense. Foxconn's strengths lie in its global manufacturing footprint, supply-chain orchestration, and rack-level integrated delivery capability, and it is extending its reach into key data center equipment (such as cabling, networking, and power systems), for example through arrangements with OpenAI to design and manufacture key AI data center equipment in the US, further cementing its strategic position in the AI infrastructure hardware stack.

According to Wall Street houses including Morgan Stanley, Citi, Loop Capital, and Wedbush, the global wave of AI infrastructure investment centered on AI chip computing hardware is far from over and is only just beginning. Driven by an unprecedented "AI inference compute demand storm," this round of AI infrastructure investment, expected to run through 2030, could reach US$3 trillion to US$4 trillion.

Foxconn is set to receive huge orders from OpenAI

In November, OpenAI, the global leader in AI applications, announced a major partnership with Foxconn under which the two companies will jointly design and manufacture critical components of AI data center core facilities in the US. The tie-up is one of the latest deep-cooperation announcements aimed at strengthening local manufacturing capacity to meet the needs of what can fairly be called heavyweight AI computing infrastructure.

The collaboration centers on designing and manufacturing critical AI data center components in the US. Its focus is not on expanding production capacity for AI GPU or AI ASIC compute hardware itself, but on supplying the framework and utilities of AI data centers: server cabinets/racks, power supply systems, networking, base cabling, and liquid cooling. The aim is twofold: to make OpenAI's hyperscale AI computing infrastructure projects faster to complete and easier to control, and to localize the data center infrastructure supply chain (that is, make it in the US).

OpenAI has become a staunch supporter of Trump's ambition to "bring manufacturing back to America," and this latest collaboration with Foxconn clearly echoes the series of measures Trump has announced since returning to the White House to strengthen the US AI computing supply chain and reshore manufacturing.

According to media reports, the collaboration between OpenAI and Foxconn will focus on AI data center server cabinets and racks, power supply systems, high-performance network architecture and cabling, and liquid cooling systems. In other words, Foxconn will be responsible for integrating the discrete components that sit upstream and downstream of the core compute hardware (AI GPUs/AI ASICs) into standardized AI data center machine-room building blocks that can be mass-produced in the US.