Freestyle - January 2025
AI Cloud King: Azure or AWS, Reddit's New Chatbot, 1300% eCommerce Surge, YouTube’s AI Toggle
Freestyle is where we examine the changing tides of technology from our front-row seats. These are raw, evolving thoughts—half-baked ideas meant to spark conversation. The real refinement happens when you reply, challenge, and build on what we put out there. 🤝
AI Cloud Showdown: Microsoft, AWS, Stargate
It’s been over two years since the "ChatGPT moment" brought large language models into the mainstream. Now that the hype has matured, the real battle is shaping up in the cloud infrastructure arena: who’s winning the AI workloads war—Microsoft or Amazon? And with the recent twist of Stargate entering the fold, things just got even more interesting.

Sizing the Demand
Amazon (AWS):
On its recent earnings call, Amazon loosely suggested that AWS’s AI revenue is now a multi-billion-dollar business. Ben Thompson at Stratechery posited that AWS is at about $500M in AI revenue per quarter, or $2B annualized. While that’s small compared to AWS’s overall scale, it’s reportedly growing at triple-digit percentages—an encouraging sign for Amazon’s AI ambitions.
Microsoft (Azure):
Meanwhile, Microsoft said on its last earnings call that it will surpass $10B in annualized AI revenue next quarter. Analysts at Bernstein did a great job breaking down that figure:
~$2B from Copilots (GitHub Copilot, M365 Copilot, etc.)
~$2.5B from ChatGPT inference
~$5.5B from Azure OpenAI Services (customers running models via the API)
The name “Azure OpenAI Services” can be a bit of a misnomer, given Microsoft is now serving an array of first- and third-party AI models outside of OpenAI. Still, the broad takeaway is that Microsoft’s early and deep partnership with OpenAI has funneled a lot of high-value AI workloads onto Azure’s infrastructure.
A Glimpse at the Scoreboard
The closest apples-to-apples comparison, if we trust both Bernstein’s and Stratechery’s analyses, is that AWS AI sits around $2B annualized, while Azure OpenAI Services is at $5.5B annualized. By that measure, Azure’s AI infrastructure business is roughly 2.75x the size of Amazon’s.
New AI customers are likely increasingly choosing Azure because of Microsoft’s experience co-developing and hosting OpenAI’s complex workloads. In a BG2 podcast interview, Satya Nadella highlighted that Microsoft won several enterprise customers—Shopify, Stripe, and Spotify among them—directly on the back of Azure’s perceived AI leadership.
Meanwhile, AWS has been adding custom hardware like Trainium2 chips and ramping up large-scale clusters (for example, a reported 400k-chip training cluster buildout for Anthropic). While that may help lure in large training workloads, the early momentum in enterprise inference seems to be tilting toward Azure’s platform.
Microsoft’s Bet on OpenAI Paying Off
Microsoft’s heavy investment in OpenAI—estimated at around $14B when factoring in staged commitments—appears to be paying off in the form of incremental cloud revenue. Even conservatively assuming half of Azure’s $5.5B OpenAI Services revenue is tied directly to OpenAI, that still represents roughly $2.75B. Add in another $2.5B from ChatGPT inference hosted on Azure, and you get roughly $5.25B of high-margin revenue funneled in from OpenAI workloads.
And these numbers could double as soon as next year. That kind of rapid revenue scaling makes it easy to justify Microsoft’s initial investment, even without factoring in its potential equity stake in OpenAI.
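For those who want the arithmetic laid out, here is a minimal back-of-envelope sketch of the figures above. The inputs are the Bernstein and Stratechery estimates (not disclosed financials), and the 50% attribution share is the conservative assumption from the previous paragraph.

```python
# Back-of-envelope on the AI cloud figures discussed above (all $B, annualized).
# Inputs are analyst estimates, not disclosed financials.
azure_openai_services = 5.5   # Bernstein estimate: Azure OpenAI Services
chatgpt_inference = 2.5       # Bernstein estimate: ChatGPT inference hosted on Azure
aws_ai_revenue = 2.0          # Stratechery estimate: AWS AI revenue
openai_share = 0.5            # conservative share of Azure OpenAI Services tied directly to OpenAI

openai_linked = azure_openai_services * openai_share + chatgpt_inference
print(f"Azure OpenAI Services vs. AWS AI: {azure_openai_services / aws_ai_revenue:.2f}x")  # 2.75x
print(f"OpenAI-linked Azure revenue: ~${openai_linked:.2f}B")                              # ~$5.25B
```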
Enter Stargate
The Microsoft-OpenAI partnership has been a constant subject of scrutiny. Over the last year, the media spotlight has swung between reports of friction—Sam Altman being temporarily ousted as CEO and alleged frustrations about Microsoft throttling compute—and more recent developments like Microsoft hiring Mustafa Suleyman and the Inflection AI team. The Stargate announcement added yet another twist.
“The initial equity funders in Stargate are SoftBank, OpenAI, Oracle, and MGX. SoftBank and OpenAI are the lead partners for Stargate, with SoftBank having financial responsibility and OpenAI having operational responsibility. Masayoshi Son will be the chairman.
“Arm, Microsoft, NVIDIA, Oracle, and OpenAI are the key initial technology partners. The buildout is currently underway, starting in Texas, and we are evaluating potential sites across the country for more campuses as we finalize definitive agreements.”
While some speculated that this marked a fraying of the Microsoft-OpenAI relationship, Sam Altman quickly put that rumor to rest, tweeting:
absolutely not! very important and huge partnership, for a long time to come.
we just need moar compute.
— Sam Altman (@sama)
1:21 AM • Jan 22, 2025
Microsoft also released a blog post that shed light on some key—but subtle—details of its relationship with OpenAI. While Microsoft is no longer the exclusive cloud provider, it still holds the right of first refusal to build out capacity for OpenAI when needed. More importantly, nothing has changed when it comes to Microsoft’s exclusive rights to host OpenAI’s APIs. That means the $2.75B in Azure OpenAI Services revenue remains secure, and the $2.5B from ChatGPT inference still seems untouchable—or at least something Microsoft can ensure stays in-house by building the necessary infrastructure before anyone else gets a shot.
What does appear to be shifting is Microsoft’s appetite to bankroll OpenAI’s seemingly limitless model training ambitions. From where I’m sitting, Microsoft seems to be managing to have its cake and eat it too—reaping the benefits of OpenAI’s ecosystem (and the associated revenue streams) without shouldering the ever-expanding costs of compute for next-gen models.
But the story doesn’t feel fully written yet, and I’m left wondering when the other shoe might drop. For instance, the Microsoft blog post carefully avoids addressing the AGI clause overhang—the provision that could strip Microsoft of proprietary access to OpenAI’s models if OpenAI achieves AGI. That ambiguity leaves lingering questions about the long-term balance of power in this partnership.
For now, the scoreboard reads: AI Cloud King = Azure (but let’s check back next quarter). The competitive dynamics of cloud computing are famously fluid—AWS didn’t earn its behemoth status by chance, and Trainium2 is generating a lot of buzz among our friends at top developers in Silicon Valley. Future earnings calls will tell us more, but as it stands, the early numbers suggest Redmond is pulling ahead in the race for enterprise AI infrastructure.
Reddit Joins the Chatbot Battle
In a surprise move, Reddit recently debuted its own chatbot, Reddit Answers. While some might wonder why the forum powerhouse would wade into an arena already crowded by the likes of ChatGPT, Claude, Gemini, and Perplexity, the broader business strategy behind Reddit’s chatbot experiment is more subtle than it first appears.

Boon from LLM Demand
It turns out that generative AI has been a windfall for Reddit, helping the stock price rise ~4x since its March 2024 IPO.
Translation Use Cases: By leveraging large language models (LLMs), Reddit is expanding engagement via real-time translations of forum content into new languages. This spawns user growth in regions where Reddit previously had little traction.
Selling Access to Data: Reddit reportedly inked high-value contracts with two major model developers, Google and OpenAI. Why would they pay for Reddit’s corpus? Because Reddit is a treasure trove of unfiltered, real-time conversations on everything from CPG product comparisons to WallStreetBets investment hype. That breadth and recency can be invaluable for training (and retraining) AI models to understand shifting consumer sentiment. Reddit reported $33.2M of “Other Revenue” last quarter, or $133M annualized, which is believed to consist mostly of data-licensing deals with model providers.
Why Launch a Chatbot?
If Reddit is already raking in $133M from third-party chatbot providers, why invest in “Reddit Answers” at all? The answer likely lies in strategic leverage: by having an in-house chatbot—even one that might not gain massive scale—Reddit signals to AI providers, “We can build our own if we want to.” This is reminiscent of Apple’s position when negotiating with Google over default search on iOS; because Google can’t risk losing that default status, it keeps paying higher fees (now reported at $20B annually). A homegrown chatbot also means Reddit could eventually pull back its data from other AI providers—a “credible threat” scenario that typically raises the value of any licensing deal.
Reddit’s strategic move underscores a bigger shift in the AI landscape: unique, high-quality datasets are becoming increasingly valuable. Until recently, most large language models scraped enormous amounts of text for free. Now, as the market matures, data owners like Reddit are getting smarter. For example, news publishers like The New York Times have challenged unlicensed scraping, while The Associated Press has struck a revenue-sharing deal with OpenAI.
As chatbots begin to rival (and potentially replace) Google’s traditional search model, we’re witnessing a broader shift in how content is valued online. Rather than solely being rewarded for sheer volume of views, data owners will increasingly be paid based on the quality and uniqueness of their information. This signals a major transformation in the way online content is monetized. A change in incentives will surely result in a change in the underlying content.
Chatbots Drive a 1300% eCommerce Surge
Chatbots drove a whopping 1300% increase in referral traffic to eCommerce sites this holiday season—a jaw-dropping leap, especially given that ChatGPT’s user base “only” rose ~3.5x over a similar stretch (from around 100M to 350M MAUs, according to multiple reports). A newly published Adobe report spells out the trend:
“This season, traffic to retail sites from generative AI-powered chat bots (shoppers clicking on a link to a retail site) increased by 1,300% compared to the year prior. Cyber Monday saw the biggest growth in chat bot usage, up 1,950% YoY. While the base of users remains modest, the uptick shows the value that chat bots are playing as shopping assistants. In an Adobe survey of 5,000 U.S. consumers, 7 in 10 respondents who have used generative AI for shopping believe it enhances their experience. Additionally, 20% of respondents turn to generative AI to find the best deals, followed by quickly finding specific items online (19%) and getting brand recommendations (15%).”
Outsized growth in shopping behavior compared to overall chatbot user growth suggests people are finding all sorts of new ways to exploit these chatty helpers. Perplexity has rolled out a shopping feature that hunts down relevant products from multiple retailers—price comparisons, reviews, and one-click checkout delivered straight to your chat window. OpenAI joined the party with a “Search” option that curates images, real-time pricing, and user reviews, then sprinkles in suggestions for alternate picks. Meanwhile, Amazon went and built its own shopping chatbot that’s deeply embedded in the Amazon ecosystem, helping shoppers navigate product hunts, comparisons, and even the checkout process—though it’s confined to Amazon listings and doesn’t feed into that 1300% traffic figure.
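For a rough sense of how outsized that gap is, here is the implied math. Treat it as directional only, since the traffic and user-growth figures come from different sources and cover slightly different windows.

```python
# Directional math behind the traffic-vs-users comparison above.
traffic_growth_pct = 1300                        # Adobe: chatbot-referred retail traffic, YoY
traffic_multiple = 1 + traffic_growth_pct / 100  # a 1,300% increase means 14x the prior level
user_multiple = 350 / 100                        # ChatGPT user base, ~100M -> ~350M (reported)

per_user_multiple = traffic_multiple / user_multiple
print(f"Traffic grew {traffic_multiple:.0f}x while users grew {user_multiple:.1f}x")
print(f"Implied referral traffic per user: ~{per_user_multiple:.1f}x")  # ~4x
```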

With so many folks turning to AI for product advice, eCommerce brands should jump on “chatbot search optimization.” A startup named Profound is already helping merchants sync up their product data with large language models to boost discoverability in chat-based queries. Perplexity runs a “merchant program” in which merchants fill out a form sharing product details, which makes their products more likely to surface in shopping queries. All in all, the astounding surge in AI-driven shopping points to a broader shift in the retail world, where these conversational sidekicks are increasingly steering us toward our next purchase—and maybe even helping us find that perfect holiday deal.
YouTube’s AI Toggle: Shaping Creator Monetization
YouTube has introduced an “opt-in” setting for creators who want third-party AI models to train on their videos. While third-party models must wait for creators to opt in, Google retains the right to train on all YouTube uploads by default—giving it exclusive access to the world’s largest trove of video data (over 500 hours uploaded per minute).
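To put that upload figure in perspective, here is a quick scale check, taking the commonly cited 500-hours-per-minute number at face value:

```python
# Rough scale of YouTube's video corpus growth at ">500 hours uploaded per minute".
hours_per_minute = 500
hours_per_year = hours_per_minute * 60 * 24 * 365   # ~262.8M hours of new video per year
years_of_footage = hours_per_year / (24 * 365)      # ~30,000 years of continuous footage

print(f"~{hours_per_year / 1e6:.0f}M hours uploaded per year")
print(f"~{years_of_footage:,.0f} years of continuous footage added annually")
```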

It’s no surprise that Veo, Google DeepMind’s video model, excels in realism. In side-by-side demos of AI-generated videos, such as slicing a steak, Veo is the only model that avoids glaring physics errors—while Sora struggles with basic object stability. If Google’s lead in video training continues to compound, YouTube could become a significant advantage not just for AI-generated video but also for training future robotics and real-world interaction models developed by Google.

But perhaps the bigger twist is how this “opt-in” system could reshape the YouTube creator economy. Note what I wrote above in the Reddit section: “Rather than solely being rewarded for sheer volume of views, data owners will increasingly be paid based on the quality and uniqueness of their information. This signals a major transformation in the way online content is monetized. A change in incentives will surely result in a change in the underlying content.”
Today, YouTubers earn money from ads tied to view counts. Tomorrow, model providers might pay directly for the right to train on specialized or niche content—rewarding quality and uniqueness over sheer virality. After all, an intensely detailed channel on, say, precision CAD design could be far more useful to an LLM than a general entertainment channel with millions of followers. If that scenario unfolds, creators might start optimizing for knowledge depth rather than mass appeal. Of course, it’s unclear whether YouTube will facilitate such monetary deals or if this opt-in feature is a strategic move to deflect antitrust scrutiny while Google reaps exclusive benefits. For now, it marks a shift in who controls the data—and who could profit from it—within the evolving AI ecosystem. We’ll be watching closely.
🤙
The opinions expressed in this newsletter are my own, subject to change without notice, and do not necessarily reflect those of Timeless Partners, LLC (“Timeless Partners”). This newsletter is an informal collection of thoughts, articles, and reflections that have recently caught my attention. For discussion purposes, I may present perspectives that contradict my own. I fully expect to change my mind on some topics over time and hope that readers approach these ideas with the same mental flexibility.
Nothing in this newsletter should be interpreted as investment advice, research, or valuation judgment. This newsletter is not intended to, and does not, relate specifically to any investment strategy or product that Timeless Partners offers. Any strategy discussed herein may be unsuitable for investors depending on their specific objectives and situation. Investing involves risk and there can be no assurance that an investment strategy will be successful.
Links to external websites are for convenience only. Neither I, nor Timeless Partners, is responsible for the content or use of such sites. Information provided herein, including any projections or forward-looking statements, targets, forecasts, or expectations, is only current as of the publication date and may become outdated due to subsequent events. The accuracy, completeness, or timeliness of the information cannot be guaranteed, and neither I, nor Timeless Partners, assume any duty to update this newsletter. Actual events or outcomes may differ significantly from those contemplated herein.
It should not be assumed that either I or Timeless Partners has made or will make investment recommendations in the future that are consistent with the views expressed herein. We may make investment recommendations, hold positions, or engage in transactions that are inconsistent with the information and views expressed herein. Moreover, it should not be assumed that any security, instrument, or company identified in the newsletter is a current, past, or potential portfolio holding of mine or of Timeless Partners, and no recommendation is made as to the purchase, sale, or other action with respect to such security, instrument, or company.
Neither I, nor Timeless Partners, make any representation or warranty, express or implied, as to the accuracy, completeness or fairness of the information contained in this newsletter and no responsibility or liability is accepted for any such information. By accessing this newsletter, the reader acknowledges its understanding and acceptance of the foregoing statement.