Straight from the Desk
Syz the moment
Live feeds, charts, breaking stories, all day long.
TPU > GPU ???
Google's AI chips - TPUs, or Tensor Processing Units - are having a moment. These semiconductors were used to train its latest genAI model, Gemini 3, which has received rave reviews, and they are cheaper to use than Nvidia's offerings.

🚀 But here's the real reason Google invented the TPU
Back in 2013, Google ran a simple forecast that scared everyone: if every Android user used voice search for just three minutes a day, Google would need to double its global data centers. Not because of video. Not because of storage. Because AI was too expensive to run on general-purpose chips. So Google made a bold move: 👉 build its own AI chip, the TPU. Fifteen months later, it was already powering Google Maps, Photos, and Translate, long before the public even knew it existed.

⚡ Why TPUs Matter
GPUs are great, but they were built for video games, not AI. TPUs were built only for AI. No extra baggage. No wasted energy. Just raw efficiency and speed. That focus paid off:
- TPUs deliver better performance per dollar
- They use less energy
- They are faster for many AI tasks
- And with each generation, Google doubles performance
Even Nvidia's CEO, Jensen Huang, openly respects Google's TPU program.

🤔 Then why don't more companies use TPUs?
Simple: most engineers grew up with Nvidia and CUDA, and TPUs only run on Google Cloud. Switching ecosystems is hard, even if the tech is better.

☁️ The Bigger Picture: Google's Cloud Advantage
AI is crushing cloud margins because everyone depends on Nvidia. Google isn't. It owns both the chip and the software stack. That means:
✔️ lower costs
✔️ better margins
✔️ faster innovation
✔️ a defensible advantage competitors can't easily copy
Some experts now say TPUs are as good as, or even better than, Nvidia's best chips.

🔥 The Punchline
Google didn't build TPUs to sell chips. It built them to survive its own AI growth. Today, TPUs might be Google Cloud's biggest competitive weapon for the next decade. And the moment Google fully opens them to the world? The AI infrastructure game changes.

Source: zerohedge, uncoveralpha
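To make the ecosystem point concrete: one way teams avoid CUDA lock-in is Google's JAX framework, which XLA-compiles the same numerical code for whatever accelerator is present - TPU on Google Cloud, GPU or CPU elsewhere. A minimal sketch (the function and array shapes are illustrative, not from any particular workload):

```python
import jax
import jax.numpy as jnp

# jax.devices() reports the active backend: TPU cores on a Cloud TPU VM,
# otherwise GPUs or CPUs. The code below is identical on all of them.
print(jax.devices())

@jax.jit  # XLA-compiles this function for the active backend
def predict(w, x):
    # A toy linear layer: (batch, features) @ (features, outputs)
    return jnp.dot(x, w)

x = jnp.ones((8, 128))   # batch of 8 inputs, 128 features each
w = jnp.ones((128, 4))   # weights mapping 128 features to 4 outputs
y = predict(w, x)
print(y.shape)  # (8, 4)
```

The same script runs unchanged on a TPU VM, which is the kind of portability that lowers the switching cost the post describes.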
Google $GOOGL is now up 15% since Buffett's Berkshire disclosed a $4B stake
Source: Michael Burry Stock Tracker ♟ @burrytracker
Very interesting chart by Jen Zhu @jenzhuscott showing that Google has the most comprehensive AI stack of all its peers and competitors.
That gives it more defensibility against the incestuous financing games that are now at the core of the "AI bubble". She also notes that Gemini's market share has grown rapidly, from 5.6% 12 months ago to 13.7% now, mostly at the expense of ChatGPT - and this was before the launch of Gemini 3. Source: The Information reporting
Google went from 5% to 14% market share BEFORE Gemini 3 launch👀
WSJ: "Gemini 3's surge past ChatGPT and other competitors on benchmark tests has handed Google an elusive victory". OpenAI/ChatGPT continues to hold a massive lead, but its market share has clearly been eroding. DeepSeek, Perplexity and Claude come after Gemini. Microsoft/Copilot is barely visible on the chart. Source: Josh Wolfe @wolfejosh

