Straight from the Desk
Syz the moment
Live feeds, charts, breaking stories, all day long.
The US has 5,381 data centers — more than China (449) and every other major nation combined.
The American bet is simple: own the most compute, win the AI race. But China isn’t playing that game. Instead of chasing data center volume, China open-sourced frontier models (DeepSeek, Qwen, Baichuan) that run on cheap hardware:
- DeepSeek trained a frontier model for $5–6M (vs. tens of millions in the US).
- Inference costs are ~280x cheaper than ChatGPT.
- Modular data centers deploy in weeks, built around ultra-low-cost power.
China isn’t scaling infrastructure; it’s scaling efficiency, and commoditizing intelligence.
Meanwhile, the US is hitting a wall: the power grid.
- Data centers already use 6% of US electricity, headed to 11% by 2030.
- Spare grid capacity has fallen from 26% to 19%, on track for under 15%.
- Some regions face 7-year waitlists just to connect new facilities. Ohio alone rejected 17 GW of new data center interconnection requests.
You can build data centers, but can you power them? China can. By 2025, its installed capacity hits 3.99 TW (up 19% YoY), with renewables nearly half of all generation. In the first five months of 2025 alone, China added 197 GW of solar and 46 GW of wind. By 2030, China is expected to have 400 GW of spare power capacity, over 3x global data center demand.
The US built the most data centers. China built the power to scale whatever it wants. The real race isn’t about who has more compute today; it’s about who can power their compute tomorrow. And on that dimension, China is pulling ahead. Source: StockMarket.news, Apollo
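A quick sanity check on the power numbers above. This is a back-of-envelope sketch in Python; all inputs are the post’s own figures, and the outputs are just what those figures imply, not a forecast.

```python
# All inputs below are the figures quoted in the post.

CHINA_SPARE_GW_2030 = 400        # China's projected spare capacity by 2030
SPARE_TO_DEMAND_MULTIPLE = 3     # "over 3x global data center demand"

# Implied global data-center power demand consistent with the 3x claim
implied_dc_demand_gw = CHINA_SPARE_GW_2030 / SPARE_TO_DEMAND_MULTIPLE

US_DC_SHARE_NOW = 0.06           # data centers' share of US electricity today
US_DC_SHARE_2030 = 0.11          # projected share by 2030

# Relative growth in the US data-center share of electricity
growth = US_DC_SHARE_2030 / US_DC_SHARE_NOW - 1

print(f"Implied global DC demand: ~{implied_dc_demand_gw:.0f} GW")
print(f"US DC electricity share growth by 2030: +{growth:.0%}")
```

So the "3x" claim implies global data-center demand of roughly 133 GW in 2030, and the US share figures imply data-center electricity use growing about 83% faster than the grid overall.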
“The AI Bubble” in perspective.
What if the doomsayers are right, but they have missed, and continue to miss, the last +30% of the “bubble” while waiting for their told-you-so moment? That is indeed what has already happened for most of this year. Source: Emre Akcakmak, Bloomberg, FT
🔥 OpenAI’s “Go Big or Go Bust” Strategy Just Went Public — and the numbers are insane.
According to leaked financials, OpenAI is preparing to lose $74B in 2028 alone (yes, one year) before expecting to swing to real profitability by 2030.
What about this year?
- $13B in revenue
- $9B in cash burn
- A burn rate of ~70% of revenue ‼️
And it only gets wilder:
- OpenAI expects three-quarters of its 2028 revenue to be wiped out by operating losses. Meanwhile, competitor Anthropic expects to break even in 2028.
- OpenAI expects to burn $115B cumulatively through 2029.
OpenAI’s commitments:
- Up to $1.4T over 8 years for compute deals
- Nearly $100B on backup data-center capacity
- Aiming for $200B in revenue by 2030 (a 15x jump from today)
💡 The read-through: this is the biggest strategic divergence in AI right now:
- Anthropic = disciplined scaling
- OpenAI = moonshot economics
OpenAI is effectively saying: “We’ll lose tens of billions now to own the entire future later.”
But there’s a catch: 95% of businesses still get zero real value from AI today. And OpenAI is funding its hyperscale buildout not from revenue (as AWS did), but from debt, investors, and chip-supplier deals, while losing money on every ChatGPT interaction.
This ends one of two ways:
🚀 The most valuable company in history
💥 Or a case study in overestimating demand
There’s no middle lane when you’re burning cash faster than any startup in history... Source: hedgie on X
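The headline ratios check out against the post’s own figures. A minimal arithmetic sketch:

```python
# Inputs are the figures quoted in the post.
revenue_2025_b = 13      # $13B revenue this year
cash_burn_b = 9          # $9B cash burn this year

# Burn as a share of revenue: ~0.69, the "~70%" in the post
burn_rate = cash_burn_b / revenue_2025_b

# $200B 2030 target vs today: ~15.4x, the "15x jump"
target_revenue_2030_b = 200
multiple = target_revenue_2030_b / revenue_2025_b

print(f"Burn rate: {burn_rate:.0%}, revenue multiple: {multiple:.1f}x")
```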
A very interesting article by the FT >>>
Key takeaways:
➡️ 1. Jensen Huang’s Warning Isn’t Just Self-Interest
Although Nvidia benefits from greater global AI investment, Huang’s claim that China may win the AI race has substantive grounding. The argument isn’t only about chips; it’s increasingly about energy.
➡️ 2. AI Progress Is Becoming Limited by Electricity, Not Chips
- Training frontier models consumes massive electricity. A single GPT-4-scale model can use ~463,000 MWh/year, more than 35,000 US homes.
- As AI workloads expand, data centre electricity consumption could more than double by 2030.
- By 2040, data centres may consume 1,800 TWh annually, enough to power 150 million US homes.
Conclusion: the bottleneck is shifting from access to high-end chips to access to cheap, abundant power.
➡️ 3. China Has a Structural Energy Advantage
China is rapidly expanding renewable energy capacity:
- Added 356 GW of new renewable energy last year (solar + wind). Solar alone grew 277 GW, far exceeding additions in the US.
- Massive government-backed projects linking industrial policy and grid expansion: solar in Inner Mongolia, hydropower in Sichuan, high-voltage lines to coastal tech hubs.
- Local governments also subsidize electricity for Chinese tech giants (Alibaba, Tencent, ByteDance), lowering the effective cost of AI training, even with less advanced chips like Huawei’s Ascend 910B.
➡️ 4. The US Faces Growing Power Constraints
- US wholesale electricity prices near data-centre clusters are up as much as 267% over five years.
- Investment in large wind and solar projects is declining due to policy and regulatory uncertainty.
- The White House has ended subsidies for wind and solar, slowing capacity growth.
Outcome: the US is adding compute demand faster than energy supply.
➡️ 5. Chip Superiority Alone May Not Decide the Winner
Nvidia’s H100 and Blackwell chips still outperform Chinese alternatives. But the historical “chip supremacy” model may matter less as:
- Chip performance grows only single digits yearly.
- China’s energy capacity grows double digits yearly.
- More cheap power → more compute hours → more model training → faster innovation.
➡️ 6. AI Dominance Will Belong to Those With Cheap Energy
The article frames AI as part of a much older pattern: Britain dominated through cheap coal; the US dominated through oil and hydroelectric power. Now, AI dominance will go to those who can run the most computation, not just build the best chips.
‼️ Final takeaway: the future of AI power belongs to countries that can provide abundant, inexpensive electricity, and right now China is building that capacity faster than anyone else.
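The article’s two home-equivalence comparisons are internally consistent, which is worth checking since such figures often aren’t. A short sketch using only the article’s numbers:

```python
# Inputs are the article's figures; per-home consumption is what they imply.

# A single GPT-4-scale model: ~463,000 MWh/year, "more than 35,000 US homes"
model_mwh_per_year = 463_000
homes_equivalent = 35_000
implied_home_mwh = model_mwh_per_year / homes_equivalent   # ~13.2 MWh/home

# By 2040: 1,800 TWh, "enough to power 150 million US homes"
dc_2040_twh = 1_800
homes_2040 = 150_000_000
implied_home_mwh_2040 = dc_2040_twh * 1_000_000 / homes_2040   # 12 MWh/home

# Both comparisons imply ~12-13 MWh per US home per year, in the same
# ballpark as typical published US household consumption (~10-11 MWh),
# so the two figures are consistent with each other.
print(implied_home_mwh, implied_home_mwh_2040)
```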
Anthropic commits $50B to build custom AI data centers across multiple US locations including Texas and New York.
Partnership with UK-based Fluidstack Ltd will bring sites online throughout 2026, creating 800 permanent jobs and 2,400 construction jobs. This is the first major data center build Anthropic is developing directly, rather than through cloud partners like Amazon or Google. Source: Shanu Mathew @ShanuMathew93
🚀 Morgan Stanley: “The AI Memory Super Cycle Has Arrived — and It’s Unlike Anything We’ve Seen Before.”
Morgan Stanley says the new AI-driven memory super cycle will far surpass any past cycle in scale, speed, and earnings power.
💡 Here’s what’s different this time:
- Led by AI data centers and cloud giants, not consumer devices.
- Price sensitivity is gone. Memory is now a strategic must-have, not a cost item.
- HBM (High Bandwidth Memory) demand is exploding, squeezing traditional DRAM supply.
📈 The numbers are jaw-dropping:
- Q4 server DRAM contract prices up ~70% (vs. 30% expected).
- DDR5 spot prices +336% since September.
- NAND up 20–30%, and still rising amid severe shortages.
- Enterprise SSD demand expected to surge 50%+ YoY by 2026.
🏭 Suppliers in control: SK hynix and Samsung now hold unprecedented pricing power. Morgan Stanley remains Overweight on both, expecting record profits and new share price highs.
🔥 The key insight: “This isn’t a typical memory cycle. It’s a structural shift, driven by AI inference workloads and hyperscaler demand. Earnings, not valuations, will define the peak.”
💰 Even after massive price hikes, memory is still below its last cycle peak ($1/Gb vs. $1.25 in 2018). Morgan Stanley sees further upside as AI capex accelerates.
📊 Bottom line: this AI memory super cycle is longer, stronger, and more profitable than any before. Morgan Stanley expects 2026–27 earnings 30–50% above market consensus for SK hynix and Samsung. “We’re in uncharted territory — this is not just a cycle. It’s a paradigm shift.”
Data centre operator CoreWeave’s stock has fallen more than 20 per cent over the past two weeks, alongside the drop in bigger names.
On Tuesday, the company’s shares were down a further 16 per cent after it lowered its forecast for annual revenue as a result of expected data centre delays. The cost to protect against a default on CoreWeave’s debt has jumped as the equity price has fallen, with the group’s five-year credit default swaps trading at 505 basis points, from below 350bp at the start of October, according to LSEG data. Source: FT
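For readers unfamiliar with CDS quotes, the spread move can be translated into dollar terms. A minimal sketch assuming a plain running-spread contract on a hypothetical $10M notional (single-name CDS often trade with upfront points plus a fixed coupon in practice, so this is an approximation):

```python
# Hypothetical notional; the 505bp and ~350bp spreads are from the post.
notional = 10_000_000

# A basis point is 1/10,000 of notional per year of protection.
cost_now = notional * 505 / 10_000       # at 505bp today
cost_october = notional * 350 / 10_000   # at ~350bp at the start of October

print(f"Annual protection cost: ${cost_now:,.0f} now vs ${cost_october:,.0f} in early October")
```

In other words, insuring $10M of CoreWeave debt against default now costs roughly $505,000 a year, up from about $350,000 six weeks earlier.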
The index is derived from state-level employment, wage, and unemployment data, capturing how many U.S. states experience significant labour-market deterioration at any given time.
About $1.5T may come from investment-grade bonds, plus $150B from leveraged finance and up to $40B a year in data-center securitizations. Even then, there’s still roughly a $1.4T funding gap likely filled by private credit and governments. Source: Wall St Engine
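The funding components above can be totted up. One caveat: the $40B securitization figure is quoted per year, so the 5-year horizon below is an assumption for illustration, not from the source.

```python
# Inputs (in $T) are the post's figures; the horizon is an assumption.
ig_bonds_t = 1.5                 # investment-grade bonds
leveraged_finance_t = 0.15       # leveraged finance
securitization_per_year_t = 0.04 # data-center securitizations, per year
years = 5                        # ASSUMED horizon, not stated in the source

identified_t = ig_bonds_t + leveraged_finance_t + securitization_per_year_t * years
gap_t = 1.4                      # residual gap for private credit / governments

print(f"Identified sources: ~${identified_t:.2f}T")
print(f"Implied total funding need: ~${identified_t + gap_t:.2f}T")
```

Under that assumption, the named sources cover about $1.85T, implying a total funding need on the order of $3.25T once the $1.4T gap is added.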

