Straight from the Desk
Syz the moment
Live feeds, charts, breaking stories, all day long.
TPU > GPU ???
Google's AI chips, TPUs (tensor processing units), are having a moment. These semiconductors were used to train its latest genAI model, Gemini 3, which has received rave reviews, and they are cheaper to use than Nvidia's offerings.

🚀 But here's the real reason Google invented the TPU

Back in 2013, Google ran a simple forecast that scared everyone: if every Android user used voice search for just three minutes a day, Google would need to double its global data centers. Not because of video. Not because of storage. Because AI was too expensive to run on general-purpose chips.

So Google made a bold move: 👉 build its own AI chip, the TPU. Fifteen months later, it was already powering Google Maps, Photos, and Translate, long before the public even knew it existed.

⚡ Why TPUs Matter

GPUs are great, but they were built for video games, not AI. TPUs were built only for AI: no extra baggage, no wasted energy, just raw efficiency and speed. That focus paid off:
- TPUs deliver better performance per dollar
- They use less energy
- They are faster for many AI tasks
- And with each generation, Google doubles performance

Even Nvidia's CEO, Jensen Huang, openly respects Google's TPU program.

🤔 Then why don't more companies use TPUs?

Simple: most engineers grew up with Nvidia + CUDA, and TPUs only run on Google Cloud. Switching ecosystems is hard, even if the tech is better.

☁️ The Bigger Picture: Google's Cloud Advantage

AI is crushing cloud margins because everyone depends on Nvidia. Google isn't dependent: it owns both the chip and the software stack. That means:
✔️ lower costs
✔️ better margins
✔️ faster innovation
✔️ a defensible advantage competitors can't easily copy

Some experts now say TPUs are as good as, or even better than, Nvidia's best chips.

🔥 The Punchline

Google didn't build TPUs to sell chips. It built them to survive its own AI growth. Today, TPUs might be Google Cloud's biggest competitive weapon for the next decade. And the moment Google fully opens them to the world?
The AI infrastructure game changes. Source: zerohedge, uncoveralpha
Very interesting chart by Jen Zhu @jenzhuscott showing that Google has the most comprehensive AI stack of all its peers and competitors.
This means it has more defensibility against the incestuous financing games that are now at the core of the "AI bubble". She also notes that Gemini's market share has grown rapidly from 5.6% twelve months ago to 13.7% now, mostly at the expense of ChatGPT; and this was before the launch of Gemini 3. Source: The Information
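The market-share shift quoted above can be checked with a quick back-of-envelope calculation; the two share figures (5.6% and 13.7%) are taken directly from the post, everything else is simple arithmetic:

```python
# Gemini chatbot market share, per the figures quoted above.
start_share = 5.6   # % twelve months ago
end_share = 13.7    # % now (before Gemini 3 launched)

growth_multiple = end_share / start_share   # how many times larger
added_points = end_share - start_share      # percentage points gained

print(f"Growth multiple: {growth_multiple:.2f}x")        # ~2.45x in one year
print(f"Points gained: {added_points:.1f} pts")          # +8.1 pts
```

In other words, Gemini's share roughly two-and-a-half-folded in a year, which is consistent with the "5% to 14%" rounding used in the next post.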
Google went from ~5% to ~14% market share BEFORE the Gemini 3 launch 👀
WSJ: "Gemini 3's surge past ChatGPT and other competitors on benchmark tests has handed Google an elusive victory". OpenAI/ChatGPT continues to hold a massive lead, but its market share has clearly been eroding. DeepSeek, Perplexity, and Claude come after Gemini. Microsoft/Copilot is barely visible on the chart. Source: Josh Wolfe @wolfejosh
The world now has 11,030 data centers:
• Americas: 4,995
• Europe: 3,476
• Asia & Oceania: 2,068
• Africa & the Middle East: 494
Source: Investing visuals @InvestingVisual, Visual Capitalist
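The regional counts above can be turned into shares of the global total; note that the four regions sum to 11,033, marginally above the 11,030 headline figure, presumably a rounding or reporting difference in the source:

```python
# Regional data center counts, as cited above (Visual Capitalist).
regions = {
    "Americas": 4995,
    "Europe": 3476,
    "Asia & Oceania": 2068,
    "Africa & the Middle East": 494,
}

total = sum(regions.values())  # 11,033 (headline says 11,030)
for name, count in regions.items():
    print(f"{name}: {count} ({count / total:.1%})")
# Americas lead with roughly 45% of the world's data centers.
```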
45% of fund managers surveyed by Bank of America in November said an "AI bubble" was the biggest tail risk for markets, spiking from just 11% in September.
Over half of these investors said they think AI stocks are already in a bubble. Source: BofA
🔥 3 Bullish Signals from NVIDIA’s Earnings Call Last Night — and why the AI trade is far from over.
Most CEOs play it safe on earnings calls. Jensen Huang did the opposite. Here are the three comments everyone in tech, AI, and markets should pay attention to:

🚀 1. "No AI Bubble", Just Three Structural Shifts
1️⃣ The migration from CPU ➝ accelerated computing
2️⃣ Generative AI hitting a tipping point across every workload
3️⃣ The rise of agentic AI
All three require massive infrastructure builds. And the kicker? Inference demand is exploding and is set to become a major revenue engine for NVIDIA.

💰 2. "Funding Is NOT the Problem"
Worried customers are running out of capital? NVIDIA is not. Hyperscalers are already monetizing AI, sovereign buying is ramping, and agentic AI opens entirely new revenue pools. Translation: the money is there, and it's accelerating.

🧠 3. "The Ecosystem Is the Moat"
This one flew under the radar but is huge. The CFO pointed out that A100 GPUs from SIX years ago are still fully utilized, thanks to the CUDA software stack. It means:
- Longer useful life for GPUs
- Better ROI on data center capex
- A deeper, stickier NVIDIA ecosystem
Plus, NVIDIA is expanding partnerships across enterprise platforms and top AI developers.

📈 Bottom Line
This was a monster print:
✔ Strong results
✔ Confident guidance
✔ Constructive multi-year outlook
After hours, NVIDIA popped ~5%, and AI-related names rallied across the board.

📊 Valuation Check (Yes, Really)
NVIDIA's stock has actually de-rated lately: earnings kept growing, the share price didn't. And now, with Q4 guidance out, investors will pivot to 2026-2027. Here's the jaw-dropper: ➡️ using 2027 FactSet consensus, NVIDIA trades at 21× P/E. Twenty. One. Times. Earnings. For the company powering the entire AI revolution.

🔮 The Broader Message
The AI trade is alive. Healthy. And nowhere near done.
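For readers unfamiliar with the "21× on 2027 consensus" shorthand, a forward P/E simply divides today's share price by the earnings per share analysts expect in a future year. The price and EPS below are HYPOTHETICAL placeholders chosen so the ratio lands near the quoted multiple; they are not actual FactSet figures:

```python
def forward_pe(price: float, forward_eps: float) -> float:
    """Forward price-to-earnings: today's price over expected future EPS."""
    return price / forward_eps

# Hypothetical inputs for illustration only (not FactSet consensus data):
price = 189.0     # assumed share price
eps_2027 = 9.0    # assumed 2027 consensus EPS
print(f"Forward P/E: {forward_pe(price, eps_2027):.0f}x")  # -> 21x
```

The lower the forward multiple relative to expected earnings growth, the more the de-rating argument in the post holds.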

