Straight from the Desk
Syz the moment
Live feeds, charts, breaking stories, all day long.
🔥 OpenAI’s “Go Big or Go Bust” Strategy Just Went Public — and the numbers are insane.
According to leaked financials, OpenAI is preparing to lose $74B in 2028 alone (yes, one year) before expecting to swing to real profitability by 2030.

What about this year?
- $13B in revenue
- $9B in cash burn
- A burn rate of ~70% of revenue

‼️ And it only gets wilder:
- OpenAI expects three-quarters of its 2028 revenue to be wiped out by operating losses.
- Meanwhile, competitor Anthropic expects to break even in 2028.
- OpenAI expects to burn $115B cumulatively through 2029.

OpenAI’s commitments:
- Up to $1.4T over 8 years for compute deals
- Nearly $100B on backup data-center capacity
- Aiming for $200B in revenue by 2030 (a 15x jump from today)

💡 The read-through: this is the biggest strategic divergence in AI right now.
- Anthropic = disciplined scaling
- OpenAI = moonshot economics

OpenAI is effectively saying: “We’ll lose tens of billions now to own the entire future later.” But there’s a catch: 95% of businesses still get zero real value from AI today. And OpenAI is funding its hyperscale buildout not from revenue (as AWS did), but from debt, investors, and chip-supplier deals, while losing money on every ChatGPT interaction.

This ends one of two ways:
🚀 The most valuable company in history
💥 Or a case study in overestimating demand

There’s no middle lane when you’re burning cash faster than any startup in history...

Source: hedgie on X
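A minimal Python sanity check of the post’s own figures; the ~70% burn rate and the 15x revenue jump fall straight out of the quoted numbers:

```python
# Sanity-check the ratios quoted above (dollar figures are the post's own).
revenue_2025 = 13e9           # $13B revenue this year
cash_burn_2025 = 9e9          # $9B cash burn this year
revenue_2030_target = 200e9   # $200B revenue target by 2030

burn_rate = cash_burn_2025 / revenue_2025
revenue_multiple = revenue_2030_target / revenue_2025

print(f"Burn rate: {burn_rate:.0%}")                         # ~69%, i.e. the ~70% quoted
print(f"Revenue multiple to 2030: {revenue_multiple:.1f}x")  # ~15.4x, the "15x jump"
```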
A very interesting article from the FT >>>
Key takeaways:

➡️ 1. Jensen Huang’s Warning Isn’t Just Self-Interest
Although Nvidia benefits from greater global AI investment, Huang’s claim that China may win the AI race has substantive grounding. The argument isn’t only about chips; it’s increasingly about energy.

➡️ 2. AI Progress Is Becoming Limited by Electricity, Not Chips
- Training frontier models consumes massive electricity. A single GPT-4-scale model can use ~463,000 MWh/year, more than 35,000 U.S. homes.
- As AI workloads expand, data centre electricity consumption could more than double by 2030.
- By 2040, data centres may consume 1,800 TWh annually, enough to power 150 million U.S. homes (see the back-of-envelope check after these takeaways).
Conclusion: the bottleneck is shifting from access to high-end chips to access to cheap, abundant power.

➡️ 3. China Has a Structural Energy Advantage
China is rapidly expanding renewable energy capacity:
- It added 356 GW of new renewable capacity last year (solar + wind); solar alone grew 277 GW, far exceeding additions in the U.S.
- Massive government-backed projects link industrial policy and grid expansion: solar in Inner Mongolia, hydropower in Sichuan, and high-voltage lines to coastal tech hubs.
- Local governments also subsidize electricity for Chinese tech giants (Alibaba, Tencent, ByteDance), lowering the effective cost of AI training even with less advanced chips like Huawei’s Ascend 910B.

➡️ 4. The U.S. Faces Growing Power Constraints
- U.S. wholesale electricity prices near data-centre clusters are up as much as 267% over five years.
- Investment in large wind and solar projects is declining due to policy and regulatory uncertainty.
- The White House has ended subsidies for wind and solar, slowing capacity growth.
Outcome: the U.S. is adding compute demand faster than energy supply.

➡️ 5. Chip Superiority Alone May Not Decide the Winner
Nvidia’s H100 and Blackwell chips still outperform Chinese alternatives, but the historical “chip supremacy” model may matter less as:
- Chip performance grows only single digits yearly, while China’s energy capacity grows double digits yearly.
- More cheap power → more compute hours → more model training → faster innovation.

➡️ 6. AI Dominance Will Belong to Those With Cheap Energy
The article frames AI as part of a much older pattern: Britain dominated through cheap coal; the U.S. dominated through oil and hydroelectric power. Now, AI dominance will go to those who can run the most computation, not just build the best chips.

‼️ Final takeaway: the future of AI power belongs to countries that can provide abundant, inexpensive electricity, and right now China is building that capacity faster than anyone else.
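A back-of-envelope check on the homes-equivalent claims above. The ~10.5 MWh/year average household consumption is an assumption on my part (roughly the EIA’s commonly cited figure), not from the article:

```python
# Check the "homes powered" equivalences quoted in the takeaways.
# Assumption (not from the article): an average U.S. home uses ~10.5 MWh/year.
MWH_PER_HOME_PER_YEAR = 10.5

gpt4_scale_mwh = 463_000   # ~463,000 MWh/year for one GPT-4-scale model
dc_2040_twh = 1_800        # projected annual data-centre demand by 2040

homes_one_model = gpt4_scale_mwh / MWH_PER_HOME_PER_YEAR
homes_2040 = dc_2040_twh * 1_000_000 / MWH_PER_HOME_PER_YEAR  # TWh -> MWh

print(f"One frontier model: ~{homes_one_model:,.0f} homes")  # ~44,000, consistent with ">35,000"
print(f"2040 data centres: ~{homes_2040/1e6:.0f}M homes")    # ~171M, consistent with "150 million"
```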
Anthropic commits $50B to build custom AI data centers across multiple US locations including Texas and New York.
A partnership with UK-based Fluidstack Ltd will bring sites online throughout 2026, creating 800 permanent jobs and 2,400 construction jobs. This is the first major data center build Anthropic is developing directly rather than through cloud partners like Amazon or Google. Source: Shanu Mathew @ShanuMathew93
🚀 Morgan Stanley: “The AI Memory Super Cycle Has Arrived — and It’s Unlike Anything We’ve Seen Before.”
Morgan Stanley says the new AI-driven memory super cycle will far surpass any past cycle in scale, speed, and earnings power.

💡 Here’s what’s different this time:
- Led by AI data centers and cloud giants, not consumer devices.
- Price sensitivity is gone: memory is now a strategic must-have, not a cost item.
- HBM (High Bandwidth Memory) demand is exploding, squeezing traditional DRAM supply.

📈 The numbers are jaw-dropping:
- Q4 server DRAM contract prices up ~70% (vs. 30% expected).
- DDR5 spot prices +336% since September.
- NAND up 20–30%, and still rising amid severe shortages.
- Enterprise SSD demand expected to surge 50%+ YoY by 2026.

🏭 Suppliers in control: SK hynix and Samsung now hold unprecedented pricing power. Morgan Stanley remains Overweight on both, expecting record profits and new share-price highs.

🔥 The key insight: “This isn’t a typical memory cycle. It’s a structural shift — driven by AI inference workloads and hyperscaler demand. Earnings, not valuations, will define the peak.”

💰 Even after massive price hikes, memory is still below its last cycle peak ($1/Gb vs. $1.25 in 2018; the implied headroom is computed below). Morgan Stanley sees further upside as AI capex accelerates.

📊 Bottom line: this AI memory super cycle is longer, stronger, and more profitable than any before. Morgan Stanley expects 2026–27 earnings 30–50% above market consensus for SK hynix and Samsung. “We’re in uncharted territory — this is not just a cycle. It’s a paradigm shift.”
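Two quick conversions implied by the note’s own price points; nothing here beyond arithmetic on the quoted figures:

```python
# Headroom to the last cycle peak and the DDR5 spot multiple (figures from the note).
current_price = 1.00   # $/Gb today
peak_2018 = 1.25       # $/Gb at the 2018 cycle peak
ddr5_move = 3.36       # DDR5 spot +336% since September

headroom = peak_2018 / current_price - 1
print(f"Upside before matching the 2018 peak: {headroom:.0%}")      # 25%
print(f"DDR5 spot multiple since September: {1 + ddr5_move:.2f}x")  # ~4.4x
```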
Data centre operator CoreWeave’s stock has fallen more than 20 per cent over the past two weeks, alongside the drop in bigger names.
On Tuesday, the company’s shares were down a further 16 per cent after it lowered its forecast for annual revenue as a result of expected data centre delays. The cost to protect against a default on CoreWeave’s debt has jumped as the equity price has fallen, with the group’s five-year credit default swaps trading at 505 basis points, from below 350bp at the start of October, according to LSEG data. Source: FT
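For context on the CDS move: a spread quoted in basis points is the running annual cost of insuring the debt. A minimal sketch on an illustrative $10M notional (the notional is my assumption, not from the FT piece):

```python
# Annual cost of default protection implied by a CDS spread.
# The $10M notional is illustrative only, not from the FT article.
def annual_protection_cost(spread_bp: float, notional: float) -> float:
    """Running premium: the spread (in basis points) applied to the notional."""
    return spread_bp / 10_000 * notional

notional = 10_000_000
print(f"At 350bp: ${annual_protection_cost(350, notional):,.0f}/yr")  # $350,000
print(f"At 505bp: ${annual_protection_cost(505, notional):,.0f}/yr")  # $505,000
```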
The index is derived from state-level employment, wage, and unemployment data, capturing how many U.S. states experience significant labour-market deterioration at any given time.
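The construction described reads like a diffusion-style count. A hypothetical sketch of that idea, assuming we simply count states whose unemployment rate rose past a threshold; the 0.5-point threshold and the sample data are illustrative, not the index’s actual methodology:

```python
# Hypothetical sketch of a state-level deterioration count (diffusion-style index).
# The 0.5pt threshold and sample data are illustrative, not the published methodology.
def states_deteriorating(unemployment_change_by_state: dict[str, float],
                         threshold_pts: float = 0.5) -> int:
    """Count states whose unemployment rate rose by at least `threshold_pts`."""
    return sum(1 for chg in unemployment_change_by_state.values()
               if chg >= threshold_pts)

sample = {"TX": 0.2, "CA": 0.7, "NY": 0.6, "FL": -0.1, "OH": 0.5}
print(f"States deteriorating: {states_deteriorating(sample)} of {len(sample)}")  # 3 of 5
```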
About $1.5T may come from investment-grade bonds, plus $150B from leveraged finance and up to $40B a year in data-center securitizations. Even then, there’s still roughly a $1.4T funding gap likely filled by private credit and governments. Source: Wall St Engine
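The rough arithmetic behind that mix, assuming the $40B-a-year securitizations run over a five-year window (the window is my assumption, made only to put the per-year figure on the same footing as the lump sums):

```python
# Funding-mix arithmetic (figures from the post; the 5-year window is an assumption).
ig_bonds = 1.5e12                 # investment-grade bonds
leveraged_finance = 150e9
securitizations_per_year = 40e9
years = 5

identified = ig_bonds + leveraged_finance + securitizations_per_year * years
gap = 1.4e12                      # remaining gap per the post
implied_total_need = identified + gap

print(f"Identified sources: ${identified/1e12:.2f}T")                 # ~$1.85T
print(f"Implied total funding need: ${implied_total_need/1e12:.2f}T")  # ~$3.25T
```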
🚨 TSMC’s Growth Just Slowed — and the Entire AI Supply Chain Is Watching
Bloomberg reports TSMC’s October sales rose 16.9%: still strong, but slower than the recent surge.

Why it matters: TSMC sits at the choke point of advanced chips. When it slows, the ripple hits GPUs, memory, packaging, the whole AI stack.

📉 A slowdown in monthly sales doesn’t always mean weak demand. It can be:
- Order timing (customers shifting deliveries)
- Yield ramps (new node transitions)
- Inventory digestion (buyers catching up)

But zoom out…

💰 Cloud giants plan to spend >$400B on AI infrastructure next year, a 21% jump (quick check below). That money flows straight into NVIDIA GPUs, high-bandwidth memory, advanced packaging, and TSMC’s cutting-edge wafers.

🗣️ NVIDIA’s CEO says demand is “getting stronger month by month” and even met with TSMC’s CEO to ask for more capacity. Translation: supply, not demand, is still the bottleneck.

TSMC also builds for AMD, Qualcomm, and Apple, so who gets priority at the fab affects entire product launches.

🎯 The company says capacity is “very tight,” and chip designers are literally chasing slots in Hsinchu.

Source: Bloomberg, Rohan Paul @rohanpaul_ai
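The quick implied-base check on the capex line above: if next year’s >$400B is a 21% jump, the current run rate works out to roughly $330B:

```python
# Implied current-year AI infrastructure spend from the quoted growth rate.
next_year_capex = 400e9   # >$400B planned next year
growth = 0.21             # the "21% jump"

this_year_capex = next_year_capex / (1 + growth)
print(f"Implied current-year spend: ~${this_year_capex/1e9:.0f}B")  # ~$331B
```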
Elon predicts that OpenAI will eat Microsoft alive and believes MSFT should stop supporting it.
Source: The AI Investor @The_AI_Investor