For the past year, the conversation has been “whose model is better.” Today’s two headlines drag the spotlight back to the substrate: in AI, the ceiling is increasingly defined by power, datacenters, advanced nodes, and delivery—i.e., infrastructure.

1. Alphabet beats expectations: 2026 capex guided to $175B–$185B; Gemini reaches 750M monthly active users
Alphabet reported Q4 results with revenue of $113.8B and net income of $34.46B. EPS came in at $2.82 (vs. $2.63 expected). The real headline is the 2026 capex guide of $175B–$185B—far above the $119.5B market expectation. Gemini now has 750M monthly active users.
Commentary:
A $175B–$185B capex guide is Alphabet effectively telling the market: AI is now an infrastructure game where "power + datacenters + compute supply" sets the upper bound. Whoever is willing (and able) to build, and can convert that compute into revenue, captures the next round of platform rents.
On the business side, Google Cloud revenue of $17.66B is the bright spot. Search revenue growing 17% with AI Overviews in the mix suggests generative AI is improving engagement and ad performance. Meanwhile, YouTube ad growth of 9% lagged overall ad growth (14%), signaling that short-form competition (TikTok, Reels) is still chipping away at attention and ad budgets.
Gemini at 750M MAU means distribution is working—Google has “laid the pipes.” But the next proof is commercial: retention, paid conversion, enterprise seat expansion, and measurable ARPU uplift from AI features. And if a deeper Gemini–iPhone partnership materializes in 2026, the funnel gets even larger.
Alphabet is using $185B to make a simple claim: the endgame belongs to whoever can keep spending on infrastructure without blinking. The question is: who else can realistically follow at that scale?
2. TSMC plans 3nm mass production in Kumamoto, Japan: about $17B in investment
TSMC plans to mass-produce 3nm chips at its Kumamoto site in Japan, with investment around $17B. Kumamoto’s second fab, originally expected to target 6–12nm, is being upgraded to 3nm—positioning Japan as the fourth region capable of 3nm volume production.
Commentary:
Japan’s most advanced domestic logic capability has long been stuck around 45–65nm, with even mature nodes (28nm+) heavily dependent on overseas foundries. A 3nm upgrade is strategically massive: major Japanese players (e.g., Sony, Toyota, SoftBank) could access leading-edge capacity domestically instead of relying entirely on cross-border supply chains.
This is also TSMC leaning harder into geographic diversification—paying higher overseas cost to buy supply-chain resilience and customer trust. But the real determinant is execution: can TSMC deliver stable 3nm yields and ramps on schedule outside Taiwan?
Costs matter too. Japan's labor, power, and land are materially more expensive than Taiwan's; it's notable that roughly 40% of the $17B is earmarked for infrastructure. For TSMC, this is "pay more to reduce systemic risk." For Japan, it's a strategic re-anchoring of advanced manufacturing capacity.
As "model deltas" become increasingly transient, "infrastructure and delivery" are turning into the durable moat. Over the next year, the big story may be less about who ships the smartest model and more about who can scale compute, lower unit costs, and close the loop into real revenue. Who do you think is the next company willing to push capex into the $100B+ range?