The Best Investment Theme For The New Trump Presidency

There is no shortage of investing ideas being thrown around that could potentially do well under the new Trump administration – but what would actually work?

Last week, Donald Trump won the latest US presidential election; he will be sworn in as the USA’s new President on 20 January 2025. A new political leader’s inauguration is often accompanied by a huge rush of investment themes, and it’s no exception this time.

For my own investment activities, the only theme I’m in favour of with the new Trump presidency – in fact, with any new presidency – is to look at a stock as a piece of a business, and assess the value of that business. Why? Because there’s a long history of investment themes accompanying shifts in political leadership that have soured. In a November 2014 article for The Motley Fool, Morgan Housel shared some examples:

“During the 1992 election, a popular argument was that Bill Clinton’s proposed remake of the U.S. healthcare system would be disastrous for pharmaceutical stocks… by the end of Clinton’s presidency pharmaceutical companies were some of the most valuable companies in the world. Pfizer increased 791% during Clinton’s presidency. Amgen surged 611%. Johnson & Johnson popped 385%. Merck jumped 299%. Those crushed the market, with the S&P 500 rising 251% from January 1993 to January 2001…

…During the 2000 election, Newsweek wrote that if George W. Bush wins, the ensuing tax changes could “help banks, brokers and other investment firms.” By the end of Bush’s second term, the KBW Bank Index had dropped almost 80%. The article also recommended pharmaceutical stocks thanks to Bush’s light touch on regulation. The NYSE Pharmaceutical Index lost nearly half its value during Bush’s presidency…

…During the 2008 election, many predicted that an Obama victory would be a win for green energy like solar and wind and a loss for big oil… The opposite happened: The iShares Clean Energy ETF is down 51% since then, while Chevron (CVX) is up 110%.

During the 2012 election, Fox Business wrote that if Obama wins, “home builders such as Pulte and Toll Brothers could see increased demand for new homes due to a continuation of the Obama Administration’s efforts to limit foreclosures, keeping homeowners in their existing properties.” Their shares have underperformed the S&P 500 by 26 percentage points and 40 percentage points since then, respectively.”

It was more of the same in the presidential elections that came after Housel’s article.

When Trump won the 2016 US presidential election for his first term as President, CNBC proclaimed the banking sector as a strong beneficiary because of his promises to ease banking regulations. But from the day Trump was sworn into office (presidents-elect are typically sworn in on 20 January of the year following the election) till the time he stepped down four years later, the KBW Nasdaq Bank Index was up by less than 20%, whereas the S&P 500 was up by nearly 70%. The KBW Nasdaq Bank Index tracks the stock market performance of 24 of America’s largest banks.

CNBC surveyed more than 100 investment professionals shortly after Joe Biden won the 2020 elections. They thought that “consumer discretionary, industrials and financials will perform the best under a Biden administration.” From Biden’s first day as President till today, the S&P 500 is up by slightly under 60%. Meanwhile, the S&P 500 Consumer Discretionary Index, which comprises consumer discretionary companies within the S&P 500 index, has gained just around 30%. The Dow Jones Industrials Index (a collection of American industrial companies) and the KBW Nasdaq Bank Index are both also trailing the S&P 500 with their respective gains of around 40% and 20%.

I have no idea if the hot themes for Trump’s second term as President will end up performing well. But given the weight of the historical evidence, I have no interest in participating in them. Politics and investing seldom mix well.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I don’t have a vested interest in any company mentioned. Holdings are subject to change at any time.

Market View: Levels to watch for US equities in Nov; Market reaction to US Sep PCE, BOJ rate decision; Earnings out of Amazon, Apple, Meta, Microsoft; Singtel

Last week, on 01 November 2024, I was invited for a short interview on Money FM 89.3, Singapore’s first business and personal finance radio station, by Chua Tian Tian, the co-host of the station’s The Evening Runway show. We discussed a number of topics, some of which are:

  • What the Bank of Japan’s interest rate decision and the US September Personal Consumption Expenditure numbers mean for stocks (Hints: It’s important to differentiate between the economy and the stock market; even the US Federal Reserve has very little control over the movement of US stocks, according to recent research from New York University finance professor Aswath Damodaran)
  • What the Australian Competition and Consumer Commission’s lawsuit against Optus Mobile means for Singtel (Hint: Optus represents only a minority of Singtel’s overall earnings, so even if Optus’s entire business were zeroed, it would not be catastrophic for Singtel; but it’s very unlikely that Optus’s business would be materially diminished because of the lawsuit)
  • The latest earnings results of the mega-cap US technology companies (Hint: Apple is increasingly becoming a services business; Microsoft’s latest comments on its AI revenues are positive for the sustainability of the business; Meta is already seeing clear improvements in its core advertising business from its AI investments; Intel’s future depends on the success of its foundry business, which is struggling at the moment because Intel’s most advanced chip designs are actually outsourced to Taiwan Semiconductor Manufacturing Company)
  • What the upcoming US presidential election means for the US stock market (Hint: The returns an investor could have earned from 1950 to 2024 by staying invested across all US presidents absolutely dwarf what the investor could have earned from only investing under Republican presidents or Democratic presidents)

You can check out the recording of our conversation below!


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have a vested interest in Apple, Meta Platforms, Microsoft, and Taiwan Semiconductor Manufacturing Company. Holdings are subject to change at any time.

The US Stock Market And US Presidents

History’s verdict on how US stocks have performed under different US presidents

The US presidential election is just a few weeks away. And as usual, large swathes of participants in the US stock market are trying to predict the victor because they think it will have significant consequences on how US stocks perform. I don’t have a crystal ball. But I do have history’s verdict, thanks to excellent research I came across recently from the US-based wealth management firm Ritholtz Wealth Management.

Here’s a table showing the annualised returns of the S&P 500 for each US President, going back to Theodore Roosevelt’s first term in 1901:

Table 1; Source: Ritholtz Wealth Management 

I think the key takeaway from the table is that how the US stock market performs does not depend on which political party the US President belongs to. Republican presidents have presided over bad episodes for US stocks (Herbert Hoover, Richard Nixon, and George W. Bush, for example) as well as fantastic times (Calvin Coolidge, Dwight Eisenhower, and Ronald Reagan, for example). The same goes for Democratic presidents, who have led the country through both poor stock market returns (Woodrow Wilson and Franklin Roosevelt, for example) and great gains (Franklin Roosevelt, Lyndon Johnson, and Barack Obama, for example) – Roosevelt’s long presidency spanned both. Presidents do not have that much power over the financial markets. Don’t let politics influence your investing decision-making.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I don’t have a vested interest in any company mentioned. Holdings are subject to change at any time.

The Problems With China’s Economy And How To Fix Them

An analysis of China’s balance sheet recession, and what can be done about it.

Economist Richard Koo (Gu Chao Ming) is the author of the book The Other Half of Macroeconomics and the Fate of Globalization. Investor Li Lu published a Mandarin review of the book in November 2019, which I translated into English in March 2020. When I translated Li’s review, I found myself nodding in agreement with Koo’s unique concept of a balance sheet recession as well as his analyses of Japan’s economic collapse in the late 1980s and early 1990s, and the Japanese government’s responses to the crash.

When I realised that Koo was interviewed last week in an episode of the Bloomberg Odd Lots podcast to discuss the Chinese government’s recent flurry of stimulus measures, I knew I had to tune in – and I was not disappointed. In this article, I want to share my favourite takeaways (the paragraphs in italics are transcripts from the podcast).

Takeaway #1: China is currently facing a balance sheet recession, and in a balance sheet recession, the economy can shrink very rapidly and be stuck for a long time

I think China is facing balance sheet recession and balance sheet recession happens when a debt-financed bubble bursts, asset prices collapse, liabilities remain, people realise that their balance sheets are under water or nearly so, and they all try to repair their balance sheets all at the same time…

…Suppose I have $1000 of income and I spend $900 myself. The $900 is already someone else’s income so that’s not a problem. But the $100 that I saved will go through people like us, our financial institutions, and will be lent to someone who can use it. That person borrows and spends it, then total expenditure in economy will be $900 that I spent, plus $100 that this guy spent, to get $1000 against original income of $1000. That’s how economy moves forward, right? If there are too many borrowers and economy is doing well, central banks will raise rates. Too few, central bank will lower rates to make sure that this cycle is maintained. That’s the usual economy.

But what happens in the balance sheet recession is that when I have $1000 in income and I spend $900 myself, that $900 is not a problem. But the $100 I decide to save ends up stuck in the financial system because no one’s borrowing money. And China, so many people are refusing to borrow money these days because of that issue. Then economy shrinks from $1000 to $900, so 10% decline. The next round, the $900 is someone else’s income, when that person decides to save 10% and spends $810 and decides to save $90, that $90 gets stuck in the financial system again, because repairing financial balance sheets could take a very long time. I mean, Japanese took nearly 20 years to repair their balance sheets.

But in the meantime, economy can go from $1000, $900, $810, $730, very, very quickly. That actually happened in United States during the Great Depression. From 1929 to 1933, the United States lost 46% of its nominal GDP. Something quite similar actually happened in Spain after 2008, when unemployment rates skyrocketed to 26% in just three and a half years or so. That’s the kind of danger we face in the balance sheet recession.
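Koo’s arithmetic is worth making concrete. Below is a minimal sketch, purely my own illustration rather than anything from Koo, of the spiral he describes: each round, 90% of income is spent and becomes the next round’s income, while the 10% that is saved sits idle because no one borrows it.

```python
# Illustration of the balance sheet recession spiral Koo describes:
# households save 10% of income, but with no borrowers, the saved portion
# never re-enters the spending stream, so each round's income shrinks by 10%.
income = 1000.0
savings_rate = 0.10

for round_no in range(1, 6):
    print(f"Round {round_no}: income = ${income:,.0f}")
    income *= 1 - savings_rate  # only the spent 90% becomes next round's income

# Prints $1,000 -> $900 -> $810 -> $729 -> $656.
# In a normal economy, banks lend out the saved 10% and the borrower spends it,
# keeping total expenditure (and hence the next round's income) at $1,000.
```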

Takeaway #2: Monetary policy (changing the level of interest rates) is not useful in dealing with a balance sheet recession – what’s needed is fiscal policy (government spending), but it has yet to arrive for China

I’m no great fan of using monetary policy, meaning policies from the central bank to fight what I call a balance sheet recession…

…Repairing balance sheets of course is the right thing to do. But when everybody does it all at the same time, we enter the problem of fallacy of composition, in that even though everybody’s doing the right things, collectively we get the wrong results. And we get that problem in this case because in the national economy, if someone is repairing balance sheets, meaning paying down debt or increasing savings, someone has to borrow those funds to keep the economy going. But in usual economies, you bring interest rates down, there’ll be people out there willing to borrow the money and spend it. That’s how you keep the economy going.

But in the balance sheet recession, you bring interest rates down to very low levels – and Chinese interest rates are already pretty low. But even if you bring it down to zero, people will be still repairing balance sheets because if you are in negative equity territory, you have to come out of that as quickly as possible. So when you’re in that situation, you cannot expect private sector to respond to lowering of interest rates or quantitative easing, forward guidance, and all of those monetary policy, to get this private sector to borrow money again because they are all doing the right things, paying down debt. So when you’re in that situation, the economy could weaken very, very quickly because all the saved funds that are returned to the banking system cannot come out again. That’s how you end up with economy shrinking very, very rapidly.

The only way to stop this is for the government, which is outside of the fallacy of composition, to borrow money. And that’s the fiscal policy of course, but that hasn’t come out yet. And so yes, they did the quick and easy part with big numbers on the monetary side. But if you are in balance sheet recession, monetary policy, I’m afraid is not going to be very effective. You really need a fiscal policy to get the economy moving and that hasn’t arrived yet.

Takeaway #3: China’s fiscal policy for dealing with the balance sheet recession needs to be targeted, and a good place to start would be to complete all unfinished housing projects in the country, followed by developing public works projects with a social rate of return that’s higher than Chinese government bond yields

If people are all concerned about repairing their balance sheets, you give them money to spend and too often they just use it to pay down debt. So even within fiscal stimulus, you have to be very careful here because tax cuts, I’m afraid, are not very effective during balance sheet recessions because people use that money to repair their balance sheets. Repairing balance sheets is of course the right thing to do, but it will not add to GDP when they’re using the tax cuts to pay down debt or rebuild their savings. So that will not add to consumption as much as you would expect under ordinary circumstances. So I would really like to see government just borrow and spend the money because that will be the most effective way to stop the deflationary spiral…

… I would use money first to complete all the apartments that were started but are not yet complete. In that case you might have to take some heavy handed actions, but basically the government should take over these companies and the projects, and start putting money so that they’ll complete the projects. That way, you don’t have to decide what to make, because the things that are already in the process of being built – or the construction drawings are there, workers are there, where to get the materials. And in many cases, potential buyers already know. So in that case, you don’t waste time thinking about what to build, who’s to design, and who the order should go to.

Remember President Obama, when he took over in 2009, US was in a balance sheet recession after the collapse of the housing bubble. But he was so careful not to make the Japanese mistake of building bridges to nowhere and roads to nowhere. He took a long time to decide which projects should be funded. But in that year-and-a-half or so, I think the US lost quite a bit of time because during that time, economy continued to weaken. There were no shovel-ready projects.

But in the Chinese case, I would argue that these uncompleted apartments are the shovel-ready projects. You already know who wants them, who paid their down payments and all of that. So I will spend the money first on those projects, complete those projects, and use the time while the money is used to complete these apartments.

I would use the magic wand to get the brightest people in China to come into one room and ask them to come up with public works projects with a social rate of return higher than 2.0%. The reason is that the Chinese government bond yield is about 2.0-something. If these people can come up with public works projects with a social rate of return higher than let’s say 2.1%, then those projects will be basically self-financing. It won’t be a burden on future taxpayers. Then once the apartments are complete, if the economy is still struggling from balance sheet recession, I would like to spend the money on those projects that these bright people might come up with.
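To see why a social rate of return above the government’s borrowing cost makes a project effectively self-financing, as Koo argues, here’s a minimal sketch in which cumulative project benefits compound at the 2.1% hurdle rate while the debt compounds at the roughly 2% bond yield he cites. The horizons and the 100-unit principal are my own assumptions for illustration:

```python
# Sketch of Koo's self-financing condition: a project pays for itself when its
# social rate of return compounds faster than the government's borrowing cost.
bond_yield = 0.020      # ~2% Chinese government bond yield, per the podcast
social_return = 0.021   # Koo's suggested hurdle rate for public works projects
principal = 100.0       # amount borrowed and invested (arbitrary units)

for year in (10, 30, 50):  # horizons chosen purely for illustration
    benefits = principal * (1 + social_return) ** year
    debt_owed = principal * (1 + bond_yield) ** year
    print(f"Year {year}: cumulative benefits {benefits:.1f} vs debt owed {debt_owed:.1f}")

# Because social_return > bond_yield, benefits stay ahead of the debt at every
# horizon, so the project need not burden future taxpayers.
```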

Takeaway #4: The central government in China actually has a budget deficit that is a big part of the country’s GDP, unlike what official statistics say

But in China, even though same rules should have applied, local governments were able to sell lots of land, make a lot of money in the process, and then they were able to do quite a bit of fiscal stimulus, which also of course added to their GDP. That model will have to be completely revised now because no one wants to buy land anymore. So the big source of revenue of local governments are gone and as a result, many of them are very close to bankrupt. Under the circumstances, I’m afraid central government will have to take over a lot of these problems from the local government. So this myth that Chinese central government, the budget deficit is not a very big part of GDP, that myth will have to be thrown out. Central government will have to take on, not all of it perhaps, but some of the liabilities of the local governments so that local governments can move forward.

Takeaway #5: There’s plenty of available capital for the Chinese central government to borrow, and the low yields of Chinese government bonds are a sign of this

So even though budget deficit of China might be very large, the money is there for government to borrow. If the money is not there for the government to borrow, Chinese government bond yields should have gone up higher and higher. But as you know, Chinese government 10-year government bond yields almost down to 2.001% or 2%. It went that low because there are not enough borrowers out there. Financial institutions have to place this money somewhere, all these deleveraged funds coming back into the financial institutions, newly generated savings, all the money that central bank put in, all comes to basically people like us in the financial institutions, the fund managers. But if the private sector is not borrowing money, the only borrower left is the government.

So even if the required budget deficit might be very large to stabilize the economy, the funds are available in the financial market. The government just has to borrow that and spend it. So financing should not be a big issue for governments in balance sheet recession. Japan was running huge budget deficits and a lot of conventional-minded economists who never understood the dynamics of balance sheet recession were warning about Japan’s budget deficit growing sky high, and then interest rates going sky high. Well, interest rates kept on coming down because of the mechanism that I just described to you, that all those funds coming into the financial sector cannot go to the private sector, end up going to our government bond market. And I see the same pattern developing in China today.

Takeaway #6: Depending on exports is a great way for a country to escape from a balance sheet recession, but this route is not available for China because its economy is already running the largest trade surplus in the world

Export is definitely one of the best ways if you can use it, to come out of balance sheet recession. But China, just like Japan 30 years ago, is the largest trade surplus country in the world. And if the world’s largest trade surplus country in the world tries to export its way out, very many trading partners will complain. You are already such a large destabilizing factor on the world trade, now you’re going to destabilize it even more.

I remember 30 years ago that United States, Europe, and others were very much against Japan trying to export its way out. Because of their displeasure, particularly the US displeasure, Japanese yen, which started at 160 yen when the bubble burst in 1990, ended up 80 yen to the dollar five years later, in 1995. What that indicated to me was that if you’re running trade deficit, you can probably export your way out and no one can really complain because you are a deficit country to begin with. But if you are the surplus country, and if you’re the largest trade surplus country in the world, there will be huge pushback against that kind of move by the Chinese. We’re already seeing that, with very many countries complaining that China should not export its problems.

Takeaway #7: Regulatory uncertainties for businesses that are caused by the Chinese central government may have played a role in the corporate sector’s unwillingness to borrow

Aside from a balance sheet recession, which is a very, very serious disease to begin with, we have those other factors that started hurting the Chinese economy, I would say, starting as early as 2016.

When you look at the flow of funds data for the Chinese economy, you notice that the Chinese corporate sector started reducing their borrowings, starting around 2016. So until 2016, Chinese companies were borrowing all the household sector savings generated, which is of course the ideal world. The household sector saving money, the corporate sector borrowing money. But starting around 2016, you see corporate sector borrowing less and less. And at around the Covid time, corporate sector was actually a net saver, not a net borrower. So that trend, I think has to do with what you just described, that regulatory uncertainties got bigger and bigger under the current leadership and I think people began to realize that even after you make these big investments in the new projects, they may not be able to expect the same revenue stream that they expected earlier because of this regulatory uncertainty.

Takeaway #8: China’s economy was already running a significant budget deficit prior to the bubble bursting, and this may have made the central government reluctant to step in as borrower of last resort now to fix the balance sheet recession

If the household sector is saving money, but the corporate sector is not borrowing money, you need someone else to fill that gap. And actually that gap was filled by Chinese government, mostly decentralized local governments. But if that temporary jolt of fiscal stimulus had turned the economy around, then those local government interventions would’ve been justified. But because this was a much more deeply rooted – here, I would use structural problems, this regulatory uncertainties and middle income trap and so forth – local government just had to keep on borrowing and spending money to keep the economy going. That was happening long before the bubble burst. So if you look at total, or what I call general government spending – not just the central government, but the general government – they were in financial deficit to the tune of almost 7% of GDP by 2022. This is before the bubble burst.

So if you are already running a budget deficit, 7% of GDP before the onset of balance sheet recession, then whatever you have to do to stop balance sheet recession, it has to be on top of the 7%. Suppose you need 5% of GDP equivalent to keep the economy going, then you’re talking about a 12% of GDP budget deficit. I think that’s one of the reasons why Chinese policy makers, even though many of them are fully aware that in the balance sheet recession, you need the government to come in, they haven’t been able to come to a full consensus yet because even before the bubble burst, the Chinese government was running a large budget deficit.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I don’t have a vested interest in any company mentioned. Holdings are subject to change at any time.

What The USA’s Largest Bank Thinks About The State Of The Country’s Economy In Q3 2024

Insights from JPMorgan Chase’s management on the health of American consumers and businesses in the third quarter of 2024.

JPMorgan Chase (NYSE: JPM) is currently the largest bank in the USA by total assets. Because of this status, JPMorgan is naturally able to feel the pulse of the country’s economy. The bank’s latest earnings conference call – for the third quarter of 2024 – was held last week and contained useful insights on the state of American consumers and businesses. The bottom line is this: the world remains treacherous, but the US economy – and the consumer – remains on solid footing.

What’s shown between the two horizontal lines below are quotes from JPMorgan’s management team that I picked up from the call.


1. The geopolitical situation looks treacherous to JPMorgan’s management, and could have major impacts on the economy in the short term

We have been closely monitoring the geopolitical situation for some time, and recent events show that conditions are treacherous and getting worse. There is significant human suffering, and the outcome of these situations could have far-reaching effects on both short-term economic outcomes and more importantly on the course of history.

2. The US economy remains resilient, but there are risks; JPMorgan’s management wants to be prepared for any environment, as they think the future can become quite turbulent

While inflation is slowing and the U.S. economy remains resilient, several critical issues remain, including large fiscal deficits, infrastructure needs, restructuring of trade and remilitarization of the world. While we hope for the best, these events and the prevailing uncertainty demonstrate why we must be prepared for any environment…

…I’ve been quite clear that I think things — or the future could be quite turbulent. 

3. Net charge-offs for the whole bank (effectively bad loans that JPMorgan can’t recover) rose from US$1.5 billion a year ago to US$2.1 billion; Consumer & Community Banking’s net charge-offs rose from US$1.4 billion a year ago to US$1.9 billion (a quick arithmetic check on these figures follows the quotes below)

Credit costs were $3.1 billion, reflecting net charge-offs of $2.1 billion and a net reserve build of $1 billion, which included $882 million in Consumer, primarily in Card and $144 million in Wholesale. Net charge-offs were up $590 million year-on-year, predominantly driven by Card…

…In terms of credit performance this quarter, credit costs were $2.8 billion driven by Card and reflected net charge-offs of $1.9 billion, up $520 million year-on-year and a net reserve build of $876 million predominantly from higher revolving balances.
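As referenced in the header above, here’s a quick arithmetic check, using only the figures in the quotes, of how the credit-cost pieces fit together and what the year-ago net charge-off bases must have been:

```python
# Firmwide credit costs = net charge-offs + net reserve build (US$ billions).
net_charge_offs = 2.1
reserve_build = 0.882 + 0.144  # consumer + wholesale reserve builds
print(f"Credit costs: ~US${net_charge_offs + reserve_build:.1f}B")  # ~US$3.1B

# Year-ago net charge-offs implied by the stated year-on-year increases:
print(f"Firmwide, a year ago: ~US${2.1 - 0.59:.1f}B")  # ~US$1.5B
print(f"CCB, a year ago:      ~US${1.9 - 0.52:.1f}B")  # ~US$1.4B
```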

4. JPMorgan’s credit card outstanding loans were up double-digits

Card outstandings were up 11% due to strong account acquisition and the continued normalization of revolve. 

5. Auto originations are down

In Auto, originations were $10 billion, down 2%, while maintaining strong margins and high-quality credit. 

6. JPMorgan’s investment banking fees had strong growth in 2024 Q3, signalling higher appetite for capital-markets activity from companies; management is cautiously optimistic about companies’ enthusiasm towards capital markets activities, but headwinds persist 

IB fees were up 31% year-on-year, and we ranked #1 with year-to-date wallet share of 9.1%. And advisory fees were up 10%, benefiting from the closing of a few large deals. Underwriting fees were up meaningfully with debt up 56% and equity up 26% primarily driven by favorable market conditions. In light of the positive momentum throughout the year, we’re optimistic about our pipeline, but the M&A, regulatory environment and geopolitical situation are continued sources of uncertainty.

7. Management is seeing muted demand for new loans from companies partly because they can easily access capital markets; demand for loans in the multifamily homes market is muted; management is not seeing any major increase in appetite for borrowing after the recent interest rate cut

In the middle market and large corporate client segments, we continue to see softness in both new loan demand and revolver utilization, in part due to clients’ access to receptive capital markets. In multifamily, while we are seeing encouraging signs in loan originations as long-term rates fall, we expect overall growth to remain muted in the near term as originations are offset by payoff activity…

…[Question] Lower rates was supposed to drive a pickup in loan growth and conversion of some of these Investment Banking pipelines. I mean, obviously, we just had one cut and it’s early. But any beginning signs of this in terms of the interest in borrowing more, and again, conversion of the banking pipelines?

[Answer] Generally no, frankly, with a couple of minor exceptions…

… I do think that some of that DCM [debt capital markets] outperformance is in the types of deals that are opportunistic deals that aren’t in our pipeline. And those are often driven by treasurers and CFOs sort of seeing improvement in market levels and jumping on those. So it’s possible that, that’s a little of a consequence of the cuts…

…I mentioned we did see, for example, a pickup in mortgage applications and a tiny bit of pickup in refi. In our multi-family lending business, there might be some hints of more activity there. But these cuts were very heavily priced, right? The curve has been inverted for a long time. So to a large degree, this is expected. So I’m not — it’s not obvious to me that you should expect immediate dramatic reactions, and that’s not really what we’re seeing.

8. Management expects the yield curve to remain inverted

The way we view the curve remains inverted. 

9. Management thinks asset prices are elevated, but they are unclear to what extent

We have at a minimum $30 billion of excess capital. And for me, it’s not burning a hole in my pocket…

…Asset prices, in my view, and you — and like you’ve got to take a view sometimes, are inflated. I don’t know if they’re extremely inflated or a little bit, but I’d prefer to wait. We will be able to deploy it. Our shareholders will be very well served by this waiting…

…I’m not that exuberant about thinking even tech valuations or any valuations will stay at these very inflated values. And so I’m just — we’re just quite patient in that. 

10. Consumer spending behaviour is normalising, so a rotation out of discretionary spending into non-discretionary spending is not a sign of consumers preparing for a downturn; retail spending is not weakening; management sees the consumer as being on solid footing; management’s base case is that there is no recession

I think what there is to say about consumer spend is a little bit boring in a sense because what’s happened is that it’s become normal. So meaning — I mean I think we’re getting to the point of where it no longer makes sense to talk about the pandemic. But maybe one last time.

One of the things that you had was that heavy rotation into T&E as people did a lot of traveling, and they booked cruises that they hadn’t done before, and everyone was going out to dinner a lot, whatever. So you had the big spike in T&E, the big rotation into discretionary spending, and that’s now normalized.

And you would normally think that rotation out of discretionary into nondiscretionary would be a sign of consumers battening down the hatches and getting ready for a much worse environment. But given the levels that it started from, what we see it as is actually like normalization. And inside that data, we’re not seeing weakening, for example, in retail spending.

So overall, we see the spending patterns as being sort of solid and consistent with the narrative that the consumer is on solid footing and consistent with the strong labor market and the current central case of a kind of no-landing scenario economically. But obviously, as we always point out, that’s one scenario, and there are many other scenarios.

11. Management thinks that the Federal Reserve’s quantitative tightening (QT) should be wound down because there are signs of stress in certain corners of the financial markets caused by QT

[Question] You I think mentioned QT stopping at some point. We saw the repo sort of market spike at the end of September. Just give us your perspective on the risk of market liquidity shock as we move into year-end. How — and do you have a view on how quickly Fed should recalibrate QT or actually stop QT to prevent some [indiscernible]?

[Answer] The argument out there is that the repo spike that we saw at the end of this quarter was an indication that maybe the market is approaching that lowest comfortable level of reserves that’s been heavily speculated about, and recognizing that, that number is probably higher and driven by the evolution of firms’ liquidity requirements as opposed to some of the more traditional measures…

…It would seem to add some weight to the notion that maybe QT should be wound down. And that seems to be increasingly the consensus, that, that’s going to get announced at some point in the fourth quarter.

12. Management sees inflationary factors in the environment

I’m not actually sure they can actually do that because you have inflationary factors out there, partially driven by QE. 


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I don’t have a vested interest in any company mentioned. Holdings are subject to change at any time.

How Recessions and Interest Rate Changes Affect Stocks

Knowing how stocks have performed in the past in the context of recessions and changes in interest rates provides us with possible paths that stocks could take in the future.

After years of investing in stocks, I’ve noticed that stock market participants place a lot of emphasis on how recessions and changes in interest rates affect stocks. This topic is even more important right now for investors in US stocks, given fears that a recession could happen soon in the country, and the interest rate cut last month by the Federal Reserve, the country’s central bank. I have no crystal ball, so I have no idea how the US stock market would react if a recession were to arrive in the near future and/or the Federal Reserve were to continue lowering interest rates.

What I have is historical context. History is of course not a perfect indicator of the future, but it can give us context for possible future outcomes. I’ve written a few articles over the years in this blog discussing the historical relationships between stocks, recessions, and movements in interest rates, some of which are given below (from oldest to the most recent):

I thought it would be useful to collect the information from these separate pieces into a single place, so here goes!

The history of recessions and stocks

These are the important historical relationships between recessions and stocks:

  • It’s not a given that stocks will definitely fall during a recession. According to a June 2022 article by Ben Carlson, Director of Institutional Asset Management at Ritholtz Wealth Management, there have been 12 recessions in the USA since World War II (WWII). The average return for the S&P 500 (a broad US stock market benchmark) when all these recessions took place was 1.4%. There were some horrible returns within the average. For example, the recession that stretched from December 2007 to June 2009 saw the S&P 500 fall by 35.5%. But there were also decent returns. For the recession between July 1981 and November 1982, the S&P 500 gained 14.7%.
  • Holding onto stocks in the lead-up to, through, and in the years after a recession has mostly produced good returns. Carlson also showed in his aforementioned article that if you had invested in the S&P 500 six months prior to each of the 12 recessions since WWII and held on for 10 years after each of them, you would have earned a positive return on every occasion. Furthermore, the returns were largely rewarding. The worst was a total gain of 9.4% for the recession that lasted from March 2001 to November 2001. The best was the first post-WWII recession that happened from November 1948 to October 1949, a staggering return of 555.7%. After taking away the best and worst returns, the average was 257.2%. 
  • Avoiding recessions flawlessly would have caused your return to drop significantly. Data from Michael Batnick, Carlson’s colleague at Ritholtz Wealth Management, showed that a dollar invested in US stocks at the start of 1980 would be worth north of $78 around the end of 2018 if you had simply held the stocks and done nothing. But if you had invested the same dollar in US stocks at the start of 1980 and side-stepped the ensuing recessions to perfection, you would have had less than $32 at the same endpoint.
  • Stocks tend to bottom before the economy does. The three most recent recessions in the USA prior to COVID-19 were the recessions that lasted from July 1990 to March 1991, from March 2001 to November 2001, and from December 2007 to June 2009. During the first recession in this sample, data on the S&P 500 from Yale economist Robert Shiller, who won a Nobel Prize in 2013, showed that the S&P 500 bottomed in October 1990, months before the recession ended. In the second episode, the S&P 500 found its low 15 months after the end of the recession, in February 2003, because of the aftermath of the dotcom bubble’s bursting. For the third recession, the S&P 500 reached a trough in March 2009, three months before the recession ended. Moreover, after the December 2007 – June 2009 recession ended, the US economy continued to worsen in at least one important way over the next few months. In March 2009, the unemployment rate was 8.7%. By June, it had risen to 9.5%, and it crested at 10% in October. But by the time the unemployment rate peaked at 10%, the S&P 500 was 52% higher than its low in March 2009. Even if we are right today that the economy will be in worse shape in the months ahead, stocks may already have bottomed or be near a bottom – only time can tell.
  • The occurrence of multiple recessions has not stopped the upward march of stocks. The logarithmic chart below shows the performance of the S&P 500 (including dividends) from January 1871 to February 2020. It turns out that US stocks have done exceedingly well over these 149 years (up 46,459,412% in total including dividends, or 9.2% per year; a quick calculation linking these two figures follows the chart below) despite the US economy having encountered numerous recessions. If you’re investing for the long run, recessions are nothing to fear.
Figure 1; Source: Robert Shiller data; National Bureau of Economic Research
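As referenced above, here’s the quick calculation linking the chart’s total return to the annualised figure; it’s simple compound-growth arithmetic using the period length stated in the text:

```python
# Converting the 46,459,412% total return (January 1871 to February 2020)
# into an annualised rate via compound growth.
total_return_pct = 46_459_412
years = 149  # period length cited in the text

growth_multiple = 1 + total_return_pct / 100   # ending value as a multiple
annualised = growth_multiple ** (1 / years) - 1
print(f"{annualised:.3f}")  # ~0.092, i.e. roughly 9.2% per year as cited
```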

The history of interest rates and stocks

These are the important historical relationships between interest rates and stocks:

  • Rising interest rates have been met with rising valuations. According to Robert Shiller’s data, the US 10-year Treasury yield was 2.3% at the start of 1950. By September 1981, it had risen to 15.3%, the highest rate recorded in Shiller’s dataset. In that same period, the S&P 500’s price-to-earnings (P/E) ratio moved from 7 to 8. In other words, the P/E ratio for the S&P 500 increased slightly despite the huge jump in interest rates. It’s worth noting too that the S&P 500’s P/E ratio of 7 at the start of 1950 was not a result of earnings that were temporarily inflated. Yes, there’s cherry picking with the dates. For example, if I had chosen January 1946 as the starting point, when the US 10-year Treasury yield was 2.2% and the P/E ratio for the S&P 500 was 19, then it would be a case of valuations falling alongside rising interest rates. But this goes to show that while interest rates have a role to play in the movement of stocks, it is far from the only thing that matters.
  • Stocks have climbed in rising interest rate environments. In a September 2022 piece, Carlson showed that the S&P 500 climbed by 21% annually from 1954 to 1964 even when the yield on 3-month Treasury bills (a good proxy for the Fed Funds rate, which is the key interest rate set by the Federal Reserve) surged from around 1.2% to 4.4% in the same period. In the 1960s, the yield on the 3-month Treasury bill doubled from just over 4% to 8%, but US stocks still rose by 7.7% per year. And then in the 1970s, rates climbed from 8% to 12% and the S&P 500 still produced an annual return of nearly 6%.
  • Stocks have done poorly in both high and low interest rate environments, and have also done well in both high and low interest rate environments. Carlson published an article in February 2023 that looked at how the US stock market performed in different interest rate regimes. It turns out there’s no clear link between the two. In the 1950s, the 3-month Treasury bill (which is effectively a risk-free investment, since it’s a US government bond with one of the shortest maturities around) had a low average yield of 2.0%; US stocks returned 19.5% annually back then, a phenomenal gain. In the 2000s, US stocks fell by 1.0% per year when the average yield on the 3-month Treasury bill was 2.7%. Meanwhile, a blockbuster 17.3% annualised return in US stocks in the 1980s was accompanied by a high average yield of 8.8% for the 3-month Treasury bill. In the 1970s, the 3-month Treasury bill yielded a high average of 6.3% while US stocks returned just 5.9% per year. 
  • A cut in interest rates by the Federal Reserve is not guaranteed to be a good or bad event for stocks. Josh Brown, CEO of Ritholtz Wealth Management, shared fantastic data in an August 2024 article on how US stocks have performed in the past when the Federal Reserve lowered interest rates. His data, in the form of a chart, go back to 1957, and I reproduced them in tabular format in Table 1; it shows how US stocks did in the next 12 months following a rate cut, as well as whether a recession occurred in the same window. I also split the data in Table 1 according to whether a recession had occurred shortly after a rate cut, since eight of the 21 rate-cut cycles from the Federal Reserve since 1957 took place without an impending recession: Table 2 shows the same data as Table 1 but only for rate cuts that came with a recession, while Table 3 is for rate cuts without a recession (a sketch of how these averages are computed follows the tables below). What the data show is that US stocks have historically done well, on average, in the 12 months following a rate cut. The overall record, seen in Table 1, is an average 12-month forward return of 9%. When a recession happened shortly after a rate cut, the average 12-month forward return was 8%; when a recession did not happen shortly after a rate cut, the average 12-month forward return was 12%. A recession is not necessarily bad for stocks. As Table 2 shows, US stocks have historically delivered an average return of 8% over the next 12 months after rate cuts that came with impending recessions. But it’s also not a guarantee that stocks will produce good returns in the 12 months after a rate cut even if a recession does not occur, as can be seen from the August 1976 episode in Table 3.
Table 1; Source: Josh Brown
Table 2; Source: Josh Brown
Table 3; Source: Josh Brown
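To illustrate the mechanics behind the averages in the tables above, here’s a minimal sketch that splits rate-cut episodes by whether a recession followed and averages the 12-month forward returns. The episode data below are placeholders for illustration only, not Josh Brown’s actual figures:

```python
# Each episode: (year of first cut, 12-month forward S&P 500 return,
# whether a recession occurred shortly after). Placeholder values only.
episodes = [
    (1957, 0.12, True),
    (1976, -0.06, False),
    (1984, 0.18, False),
    (1989, 0.11, True),
    (2001, -0.12, True),
    (2019, 0.15, False),
]

def average_return(subset):
    return sum(ret for _, ret, _ in subset) / len(subset)

with_recession = [e for e in episodes if e[2]]
without_recession = [e for e in episodes if not e[2]]

print(f"All rate cuts:       {average_return(episodes):+.1%}")
print(f"Recession followed:  {average_return(with_recession):+.1%}")
print(f"No recession:        {average_return(without_recession):+.1%}")
```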

Conclusion

Knowing how stocks have performed in the past in the context of recessions and changes in interest rates provides us with possible paths that stocks could take in the future. But it’s also worth bearing in mind that anything can happen in the financial markets. Things that have never happened before do happen, so there are limits to learning from history. Nonetheless, there’s a really important lesson from all the data seen above that I think is broadly applicable even far into the future, and it is that one-factor analysis in finance – “if A happens, then B will occur” – should be largely avoided because clear-cut relationships are rarely seen.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I currently have no vested interest in any company mentioned. Holdings are subject to change at any time. 

The Federal Reserve Has Much Less Power Over Financial Markets Than You Think 

It makes sense to mostly ignore the Federal Reserve’s actions when assessing opportunities in the stock market.

Last week, the Federal Reserve, the USA’s central bank, opted to lower the federal funds rate (the key interest rate controlled by it) by 50 basis points, or 0.5%. The move, both before and after it was announced, was heavily scrutinised by market participants. There’s a widely held belief that the Federal Reserve wields tremendous influence over nearly all aspects of financial market activity in the USA.

But Aswath Damodaran, the famed finance professor from New York University, made an interesting observation in a recent blog post: The Federal Reserve actually does not have anywhere close to the level of influence over America’s financial markets as many market participants think.

In his post, Damodaran looked at the 249 calendar quarters from 1962 to 2024, classified them according to how the federal funds rate changed, and compared the changes to how various metrics in the US financial markets moved. There were 96 quarters in the period where the federal funds rate was raised, 132 quarters where it was cut, and 21 quarters where it was unchanged. Some examples of what he found (a sketch of the classification mechanics appears after the excerpt below):

  • A median change of -0.01% in the 10-year Treasury rate was seen in the following quarter after the 96 quarters where the federal funds rate increased, whereas a median change of 0.07% was seen in the following quarter after the 132 quarters where the federal funds rate was lowered. Put another way, the 10-year Treasury rate has historically tended to (1) decrease when the federal funds rate increased, and (2) increase when the federal funds rate decreased. This means that the Federal Reserve has very little control over longer-term interest rates. 
  • A median change of -0.13% in the 15-year mortgage rate was seen in the following quarter after the quarters where the federal funds rate increased, whereas a median change of -0.06% was seen in the following quarter after the quarters where the federal funds rate was lowered. It turns out that the Federal Reserve also exerts little control over the types of interest rates that consumers directly interact with on a frequent basis.
  • A median change of 2.85% in US stocks was seen in the following quarter after the quarters where the federal funds rate increased, a median change of 3.07% was seen in the following quarter after the quarters where the federal funds rate was lowered, and a median change of 5.52% was seen in the following quarter after the quarters where the federal funds rate was unchanged. When discussing the stock-market related data, Damodaran provided a provocative question and answer: 

“At the risk of disagreeing with much of conventional wisdom, is it possible that the less activity there is on the part of the Fed, the better stocks do? I think so, and stock markets will be better served with fewer interviews and speeches from members of the FOMC and less political grandstanding (from senators, congresspeople and presidential candidates) on what the Federal Reserve should or should not do.”
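For readers curious about the mechanics of Damodaran’s exercise, here’s a minimal sketch of the classify-then-take-the-median approach, assuming you have quarterly series for the federal funds rate and the 10-year Treasury yield. The few rows of data below are hypothetical; Damodaran’s actual dataset covers 1962 to 2024:

```python
import pandas as pd

# Hypothetical quarterly data; Damodaran's actual dataset spans 1962-2024.
df = pd.DataFrame(
    {"fed_funds": [4.00, 4.50, 4.50, 4.00, 3.50],
     "ten_year": [3.80, 3.95, 3.90, 3.70, 3.60]},
    index=pd.period_range("2023Q1", periods=5, freq="Q"),
)

# Classify each quarter by how the federal funds rate changed...
df["ff_change"] = df["fed_funds"].diff()
df["regime"] = pd.cut(df["ff_change"], bins=[-99, -1e-9, 1e-9, 99],
                      labels=["cut", "unchanged", "hike"])

# ...and look at the change in the 10-year yield over the *following* quarter.
df["next_qtr_10y_change"] = df["ten_year"].diff().shift(-1)

print(df.groupby("regime", observed=True)["next_qtr_10y_change"].median())
```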

I have always paid scant attention to what the Federal Reserve is doing when making my investing decisions. My view, born from observations of financial market history* and a desire to build a lasting investment strategy, is that business fundamentals trump macro-economics. Damodaran’s data lends further support to my stance of mostly ignoring the Federal Reserve’s actions when I assess opportunities in the stock market.

*A great example can be found in Berkshire Hathaway, Warren Buffett’s investment conglomerate. Berkshire produced an 18.7% annual growth rate in its book value per share from 1965 to 2018, which drove a 20.5% annual increase in its stock price. Throughout those 53 years, Berkshire endured numerous macro worries, such as the Vietnam War, the Black Monday stock market crash, the “breaking” of the Bank of England, the Asian Financial Crisis, the bursting of the Dotcom Bubble, the Great Financial Crisis, Brexit, and the US-China trade war. Damodaran’s aforementioned blog post also showed that the federal funds rate moved from around 5% in the mid-1960s to more than 20% in the early-1980s and then to around 2.5% in 2018. And yet, an 18.7% input (Berkshire’s book value per share growth) still resulted in a 20.5% output (Berkshire’s stock price growth).
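A small compounding sketch shows how closely Berkshire’s stock-price “output” tracked its book-value “input” over that stretch; the multiples below are implied purely by the annual rates and the 53-year period given above:

```python
# Compounding the annual rates cited in the text over the 53-year stretch.
years = 53
book_value_cagr = 0.187   # annual book value per share growth
stock_price_cagr = 0.205  # annual stock price growth

print(f"Book value multiple:  {(1 + book_value_cagr) ** years:,.0f}x")
print(f"Stock price multiple: {(1 + stock_price_cagr) ** years:,.0f}x")
# Roughly 9,000x and 20,000x respectively: decades of macro shocks barely
# dented the link between business performance and stock performance.
```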


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have no vested interest in any company mentioned. Holdings are subject to change at any time.

More Of The Latest Thoughts From American Technology Companies On AI (2024 Q2)

A collection of quotes on artificial intelligence, or AI, from the management teams of US-listed technology companies in the 2024 Q2 earnings season.

Last month, I published The Latest Thoughts From American Technology Companies On AI (2024 Q2). In it, I shared commentary in earnings conference calls for the second quarter of 2024, from the leaders of technology companies that I follow or have a vested interest in, on the topic of AI and how the technology could impact their industry and the business world writ large. 

A few more technology companies I’m watching hosted earnings conference calls for 2024’s second quarter after I prepared the article. The leaders of these companies also had insights on AI that I think would be useful to share. This is an ongoing series. For the older commentary:

Here they are, in no particular order:

Adobe (NASDAQ: ADBE)

Adobe’s management believes that Adobe’s approach to AI is highly differentiated; the greatest differentiation is at the interface layer, as Adobe is able to rapidly integrate AI across its product portfolio and allow users to realise value

Adobe’s customer-centric approach to AI is highly differentiated across data, models and interfaces…

…Our greatest differentiation comes at the interface layer with our ability to rapidly integrate AI across our industry-leading product portfolio, making it easy for customers of all sizes to adopt and realize value from AI. 

Adobe’s Firefly models are trained on data that allow outputs to be commercially safe and management thinks this feature of being commercially safe is really important to enterprises; Adobe now has Firefly models for imaging, vector, design, and video; Firefly is in a wide range of Adobe products; Firefly has powered more than 12 billion generations since its launch in March 2023 (was 9 billion in 2024 Q1); management’s strategy is to build Firefly models into more streamlined and precise workflows within Adobe’s products; Adobe has Firefly Service APIs for organisations to generate content at scale, and the API calls tripled quarter-on-quarter; Firefly Service APIs are gaining real traction

We train our Firefly models on data that allows us to offer customers a solution designed to be commercially safe. We have now released Firefly models for imaging, vector and design and just previewed a new Firefly video model…

… Firefly-powered features in Adobe Photoshop, Illustrator, Lightroom, and Premiere Pro help creators expand upon their natural creativity and accelerate productivity. Adobe Express is a quick and easy create anything application, unlocking creative expression for millions of users. Acrobat AI Assistant helps extract greater value from PDF documents. Adobe Experience Platform AI Assistant empowers brands to automate workflows and generate new audiences and journeys. Adobe GenStudio brings together content and data, integrating high-velocity creative expression with the enterprise activation needed to deliver personalization at scale…

…We have now surpassed 12 billion Firefly-powered generations across Adobe tools…

… Our strategy is to build technology that will create more streamlined and precise workflows within our tools through features like text-to-template in Express, Generative Fill in Photoshop, Generative Recolor in Illustrator, Generative Remove in Lightroom and the upcoming Generative Extend for Video in Premiere Pro. We’re exposing the power of our creative tools and the magic of generative AI through Firefly Service APIs so organizations can generate and assemble content at scale…

…The introduction of the new Firefly video model earlier this week at IBC is another important milestone in our journey. Our video model, like the other models in the Firefly family, is built to be commercially safe with fine-grain control and application integration at its core. This will empower editors to realize their creative vision more productively in our video products, including Premiere Pro…

…Strong demand for Firefly Services, which provide APIs, tools and services for content generation, editing and assembly, empowering organizations to automate content production while maintaining quality and control. Total API calls tripled quarter over quarter…

…Firefly Services, which is you can think of that also as a consumption model where we have that, it’s off to a really good start. Our ability to give enterprises the ability to automate content, create custom models within enterprises, we’re seeing real traction because it’s a differentiated solution and that it’s designed to be commercially safe…

…One other thing I’d just emphasize there is that the commercial safety is so important to businesses of all sizes, frankly, and that is something that we feel very, very differentiated.

Adobe released significant advancements in AI Assistant across Adobe Acrobat and Reader in 2024 Q2 (FY2024 Q3) and saw 70% sequential growth in AI interactions in AI Assistant; the advancements in AI Assistant include content creation capabilities; Tata Consultancy Services used AI Assistant in Adobe Acrobat to create event summaries of hours of conference videos in minutes; management intends to actively promote subscription plans for Adobe Acrobat and Reader that include generative AI capabilities

For decades, PDF has been the de facto standard for storing unstructured data, resulting in the creation and sharing of trillions of PDFs. The introduction of AI Assistant across Adobe Acrobat and Reader has transformed the way people interact with and extract value from these documents. In Q3, we released significant advancements, including the ability to have conversations across multiple documents and support for different document formats, saving users valuable time and providing important insights. We are thrilled to see this value translate into AI Assistant usage with over 70% quarter-over-quarter growth in AI interactions. 

In addition to consumption, we’re focused on leveraging generative AI to expand content creation in Adobe Acrobat. We’ve integrated Adobe Firefly Image Generation into our edit PDF workflows. We’ve optimized AI Assistant in Acrobat to generate content fit for presentations, e-mails and other forms of communication, and we’re laying the groundwork for richer content creation, including the generation of Adobe Express projects.

The application of this technology across verticals and industries is virtually limitless. Tata Consultancy Services recently used Adobe Premiere Pro to transcribe hours of conference videos and then used AI Assistant in Acrobat to create digestible event summaries in minutes. This allowed them to distribute newsletters on session content to attendees in real-time.

We’re excited to leverage generative AI to add value to content creation and consumption in Acrobat and Reader in the months ahead. Given the early adoption of AI Assistant, we intend to actively promote subscription plans that include generative AI capabilities over legacy perpetual plans that do not.

Adobe GenStudio is integrated across Experience Cloud and Creative Cloud and helps marketers quickly plan, create, store, deliver, and measure marketing content; Vanguard used Adobe GenStudio to increase quality engagement with investors by 176% through one-to-one personalisation, and to enjoy millions in savings

Customers are embracing the opportunity to address their content supply chain challenges with Adobe GenStudio. With native integrations across Experience Cloud and Creative Cloud, GenStudio empowers marketers to quickly plan, create, store, deliver, and measure marketing content and drive greater efficiency in their organizations. Financial services leader Vanguard is creating an integrated content supply chain to serve the strategic goal of deepening their relationships with a broad range of investors. Leveraging the GenStudio solution, Vanguard was able to increase quality engagement by 176% by focusing on one-to-one personalization and to realize millions in savings by improving content velocity and resource allocation with an end-to-end content creation workflow.

Adobe’s management has been very consistent over the past 1-1.5 years in how they have approached AI, and that is, Adobe would be developing a broad set of models for the creative community, and the models would be highly differentiated based on quality, commercial safety, and integrability into Adobe’s product portfolio

I think we’ve been incredibly consistent with what we’ve said, dating back 1 year, 1.5 years ago, where we talked about the fact that we were going to develop the broadest set of models for the creative community. And we were going to differentiate the models based on quality, commercial safety, integrability into our tools and controllability. And as you’ve seen very methodically over the last 18 months, we continue to bring more and more of that innovation to life. And that fundamentally is working as we’ve now started to integrate it much more actively into our base. If you look at it with photography, we now have in our tool Generative Remove; we have AI-assisted edits in design; we have Generative Pattern and Generative Fill Shape. In Photoshop, we have Gen Remove and Gen Fill, and I could continue on with all the generations, but we’ve also now started to integrate it in Firefly Services for what we’re enabling enterprises to be able to access and use in terms of batch work and through APIs.

Adobe’s management is seeing the accelerated use and consumption of generative AI credits in Adobe’s products play out the way they expected it to; total consumption credits are going up with the introduction of each new generative AI capability 

If you look at sort of how that’s played out, as we talked about, we’re seeing accelerated use and generative credits being consumed because of that deeper integration into all of our tools, and that is playing out as we expected…

…And we do see with every subsequent capability we integrate into the tool, total credits consumed going up. 

Adobe’s management is seeing Adobe enjoy indirect monetisation from the AI features of its products, such as (1) the products carrying more value and commanding higher pricing, (2) users being retained better when they use generative AI features, and (3) higher conversion of users when they try out Adobe products

When you look at then how that converts to monetization, first and foremost, we’ve integrated a lot of that value into our core products with more value and more pricing. We’re also seeing that when people use these generative features, they retain better. We’re also seeing that when people come to Adobe to try our Creative Cloud applications or Express application, they’re able to convert better. And so there are all these ancillary implied benefits that we’re getting.

For direct monetisation of the AI features in Adobe’s products, management is thinking of (1) instituting caps on generative AI credit consumption, (2) having AI plans with different AI capabilities; but direct monetisation is currently still not the key focus that management has, because they want to focus on proliferation and usage of generative AI across the user base

In terms of direct monetization, what we’ve said in the past is that the current model is around generative credits, which is I think where you’re going with this. And we do see with every subsequent capability we integrate into the tool, total credits consumed going up. Now what we are trying to do as we go forward, we haven’t started instituting the caps yet. And part of this is, as we’ve said all along, we want to really focus our attention on proliferation and usage across our base. We see a lot of users excited about it. It’s some of the most actively used features that we’ve ever released. And we want to avoid the generation anxiety that people feel. But we’re watching very closely as the economy of generative credits evolves, and we’re going to look at instituting those caps at some point when we feel the time is right and/or we’re also looking at other alternative models. What we did with Acrobat AI Assistant has proven to be very effective. And so we’re also considering other opportunities like having standard CC plans that have a core set of generative capabilities but also having premium API — sorry, premium AI plans that will include things more like video and other things.

Adobe’s management thinks Adobe’s generative AI video models are already pretty capable, but they are going to get better over time; management thinks that the real value of generative AI video models is not in their ability to create a video through a description the user gives, but in their ability to extend the video

I don’t know if you had a chance to see some of the videos we put out there integrated directly into Premiere, also text to video, images to video, more controllability. We also have the ability now to generate not just scenes with humans and dogs and organic animals, but all these overlays and things that creative professionals actually want to work with. And so we’re very excited about the set of things that they can get out of the box to get going. And human faces and things will just continue to get better…

…I spent a couple of hours with our video team. They have just absolutely hit it out of the park. I mean, the work that they have done, which is leveraging the image models with video, and again, I think to David’s point, the integration with Premiere, that’s where we’ve always said, it’s the integration of the model and the application that differentiates it. I think when other models first came out, people were like, “Wow, you can describe it.” That’s just such a small part of where the value is. And the real value is, you have a video, you want to extend it. It’s a game changer in terms of what we can do. So really excited about the stuff that we’re doing in video.

MongoDB (NASDAQ: MDB)

MongoDB’s management sees AI as a longer-term opportunity for MongoDB; management is seeing companies largely still experimenting with AI applications currently; management thinks inference workloads will come, but monetisation of AI apps will take time

AI continues to be an additional long-term opportunity for our business. At the start of the fiscal year, we told you that we didn’t expect AI to be a meaningful tailwind for our business in fiscal year 2025, which has proven accurate. Based on recent peer commentary, it seems that the industry now mostly agrees with this view. Companies are currently focusing their spending on the infrastructure layer of AI and are still largely experimenting with AI applications. Inference workloads will come and should benefit MongoDB greatly in the long run, but we are still very early, and the monetization of AI apps will take time. AI demand is a question of when, not if.

MongoDB’s management has been talking to customers and they think MongoDB is the ideal database for AI apps for five reasons: (1) AI workloads involve a wide variety of data types and MongoDB’s document-model database is meant to handle this variety well, thus providing a well-rounded one-stop solution, (2) MongoDB’s database is high-performance and scalable, and allows AI workloads to utilise real-time operational data, (3) MongoDB’s database is integrated with leading app development frameworks and AI platforms, (4) MongoDB’s database has enterprise-grade security and compliance features, and (5) MongoDB’s database can be run anywhere the customer chooses; management feels very good about MongoDB’s positioning for AI

Our discussions with customers and partners give us increasing conviction that we are the ideal data layer for AI apps for a number of key reasons.

First, more than any other type of modern workload, AI-driven workloads require the underlying database to be capable of processing queries against rich and complex data structures quickly and efficiently. Our flexible document model is uniquely positioned to help customers build sophisticated AI applications because it is designed to handle different data types, your source data, vector data, metadata and generated data right alongside your live operational data, obviating the need for multiple database systems and complex back-end architectures.

Second, MongoDB offers a high performance and scalable architecture. As the latency of LLMs improve, the value of using real-time operational data for AI apps will become even more important.

Third, we are seamlessly integrated with leading app development frameworks and AI platforms, enabling developers to incorporate MongoDB into their existing workflows while having the flexibility to choose the LLM and other specific tools that best suit their needs.

Fourth, we meet or exceed the security and compliance requirements expected from an enterprise database, including enterprise-grade encryption, authorization and auditability.

Lastly, customers can run MongoDB anywhere, on-premise or as a fully managed service in 1 of the 118 global cloud regions across 3 hyperscalers, giving them the flexibility to run workloads to best meet their application use cases and business needs…

… As the performance of these LLMs improves and their latency comes down, accessing real-time data becomes really important. Like, say, you’re calling and talking to a customer support chatbot: you want that chatbot to have up-to-date information about that customer so that it can provide the most relevant and accurate information possible…

…I think it’s a quickly evolving space, but we feel very good about our positioning for AI, even though it’s still very early days.

MongoDB’s management sees 3 ways AI can accelerate MongoDB’s business over time: (1) AI will drive down the cost of building applications, as all past platform shifts have done, thus leading to more apps and higher demand for databases, (2) MongoDB can be the database of choice for developers building AI applications (see Point 9 on MongoDB’s new AI Applications Program), and (3) MongoDB can help customers modernise their application estate (see Point 10 for more on this opportunity)

We see 3 main opportunities where we believe AI will accelerate our business over time. The first is that the cost of building applications in the world of AI will come down as we’ve seen with every previous platform shift, creating more applications and more data requiring more databases. The second opportunity is for us to be the database of choice for customers building greenfield AI applications…

…The third opportunity is to help customers modernize their legacy application estate. 

MongoDB’s management made the MongoDB AI Applications Program (MAAP) generally available in July 2024; MAAP brings the cloud computing hyperscalers and prominent AI model-building startups into one ecosystem to reduce the complexity and difficulty for MongoDB’s customers when they build new AI applications 

While we see that there’s a tremendous amount of interest in and planning for new AI-powered applications, the complexity and fast-moving nature of the AI ecosystem slows customers down. That’s why we launched the MongoDB AI Applications Program, or MAAP, which became generally available to customers last month. MAAP brings together a unique ecosystem, including the 3 major cloud providers, AWS, Azure and GCP, as well as Accenture and AI pioneers like Anthropic and Cohere. MAAP offers customers reference architectures and an end-to-end technology stack that includes prebuilt integrations, professional services and a unified support system to help customers quickly build and deploy AI applications.

Modernising legacy application estates is a big opportunity, as most of the $80 billion database market is still in legacy relational databases; MongoDB has the Relational Migrator product to help customers migrate from legacy relational databases to the company’s document-model database; management thinks AI can significantly improve the process of modernising legacy applications by helping with understanding legacy code and rewriting it in modern form; MongoDB launched a few pilots with customers earlier in 2024 to modernise their legacy applications with the help of AI and the results are exciting; the CIO (Chief Information Officer) of an insurance company in the pilots said the modernisation process was the first tangible return he had seen on his AI investments; management thinks it will take time for the modernisation program to contribute meaningful revenue to MongoDB, but they are excited

Most of the existing $80-billion-plus database industry is built on dated relational architecture. Modernizing legacy applications has always been part of our business, and we have taken steps over the years to simplify and demystify this complex process through partnerships, education and, most recently, our Relational Migrator product. AI offers a potential step-function improvement, lowering the cost and reducing the time and risk of modernizing legacy applications…

…Earlier this year, we launched several pilots with our customers where we work with them to modernize mission-critical applications, leveraging both AI tooling and services. The early results from these pilots are very exciting as our customers are experiencing significant reductions in time and cost of modernization. In particular, we have seen dramatic improvements in time and cost to rewrite application code and generate test suites. We see increasing interest from customers that want to modernize their legacy application estate, including large enterprise customers. As a CIO of one of the world’s largest insurance companies said about our pilot, this is the first tangible return he’s seen on his AI investments. While it’s still early days and generating meaningful revenue from this program will take time, we are excited about the results of our pilots and the growing pipeline of customers eager to modernize their legacy estate…

…Since day one, since our IPO, we’ve been getting customers to migrate off relational to MongoDB. But one of the biggest friction points has been that while it’s easy to move the data, you can map the schema from a relational schema to a document schema and you can automate that, the biggest stumbling block is that the customer has to or some third party has to rewrite the application, which, by definition, creates more costs, more time and in some cases, more risk especially for older apps, where the development teams who built those apps no longer exist. So what’s been compelling about AI is that AI has finally created a shortcut to overcome that big hurdle. And so essentially, you can start basically diagnosing the code, understand the code, recreate a modern version of that code and generate test suites to make sure the new code performs like the old code. So that definitely gets people’s interest because now, all of a sudden, what may take years or multiyears, you can do in a lot less time. And the pilots that we have done, the time and cost savings have been very, very compelling.

That being said, we’re in the very early days. There’s a lot of interest. We have a growing pipeline of customers across, frankly, all parts of the world from North America to EMEA and even the Pac Rim. And so we’re quite excited about the opportunity. But again, I would say it’s very early days.

Delivery Hero, a leading local delivery platform, is using MongoDB Atlas Vector Search to provide AI-powered hyperpersonalised results to users; Delivery Hero found that MongoDB Atlas Vector Search helped it build solutions for less cost than alternative technologies

Delivery Hero, a long-time MongoDB Atlas customer, is the world’s leading local delivery platform, operating in 70-plus countries across 4 continents. Their quick commerce service enables customers to select fresh produce for delivery from local grocery stores. Approximately 10% of the inventory is fast-moving perishable produce that can go quickly out of stock. The company risks losing revenue and increasing customer churn if customers don’t have viable alternatives to their first choice. To address these risks, they are now using state-of-the-art AI models and MongoDB Atlas Vector Search to give hyperpersonalized alternatives to customers in real time if items they want to order are out of stock. With the introduction of MongoDB Atlas Vector Search, the data science team recognized that they could build a highly performant, real-time solution more quickly and for less cost than alternative technologies.

MongoDB’s management believes that general-purpose LLMs (large language models) will win and will use RAG (retrieval augmented generation) as the primary way to combine generally available data with proprietary data; management is seeing advanced RAG use-cases in answering complex questions

There are some questions about LLMs, whether a general-purpose LLM or a fine-tune LLM, what the trade-offs are. Our belief is that given the performance of LLMs, you’re going to see the general purpose LLMs probably win and will use RAG as the predominant approach to marry generally available data with proprietary data. And then you are starting to see things like advanced RAG use cases where you get much more sophisticated ways to ask complex questions, provide more accurate and detailed answers and better adapt to different types of information and queries.
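
For readers who want to see the mechanics behind this, here is a minimal sketch of the RAG pattern management is describing: proprietary documents are embedded into vectors, the closest matches to a query are retrieved, and a general-purpose LLM answers using only that retrieved context. This is my own illustration, not MongoDB’s code; the model names, the toy in-memory store, and the helper functions are all assumptions.

```python
# Minimal RAG sketch (illustrative; model names and helpers are assumptions).
# Step 1: embed the question. Step 2: retrieve the closest proprietary
# documents. Step 3: ask a general-purpose LLM to answer from that context.

from openai import OpenAI  # any general-purpose LLM provider would work

client = OpenAI()

def embed(text: str) -> list[float]:
    # Convert text into a vector so similarity can be computed.
    resp = client.embeddings.create(model="text-embedding-3-small", input=text)
    return resp.data[0].embedding

def retrieve(query_vec: list[float], store: list[dict], k: int = 3) -> list[dict]:
    # Toy in-memory retrieval by cosine similarity; a production system
    # would use a vector database (e.g. Atlas Vector Search) instead.
    def cosine(a: list[float], b: list[float]) -> float:
        dot = sum(x * y for x, y in zip(a, b))
        na = sum(x * x for x in a) ** 0.5
        nb = sum(x * x for x in b) ** 0.5
        return dot / (na * nb)
    return sorted(store, key=lambda d: cosine(query_vec, d["embedding"]), reverse=True)[:k]

def answer(question: str, store: list[dict]) -> str:
    docs = retrieve(embed(question), store)
    context = "\n\n".join(d["text"] for d in docs)
    prompt = (f"Answer using only the context below.\n\n"
              f"Context:\n{context}\n\nQuestion: {question}")
    resp = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[{"role": "user", "content": prompt}],
    )
    return resp.choices[0].message.content
```

The point of the pattern is that the proprietary data stays in the retrieval layer; the general-purpose model never has to be retrained on it, which is exactly why management expects general-purpose LLMs plus RAG to win.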

MongoDB’s management is seeing most AI workloads happen in the cloud, but they also see a lot of customers using open-source LLMs and running those workloads locally

We predominantly see most of the AI workloads in the cloud, but there are definitely lots of customers who are looking at using open source LLMs, in particular, things like Llama, and running those workloads locally.
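
As a concrete illustration of the local option, here is a sketch using the open-source llama-cpp-python bindings; the GGUF filename is an assumption (any locally downloaded Llama checkpoint in that format would do).

```python
# Sketch: running an open-source Llama model entirely on local hardware.
# The model file path is an illustrative assumption.

from llama_cpp import Llama

llm = Llama(model_path="./llama-3-8b-instruct.Q4_K_M.gguf", n_ctx=4096)

out = llm(
    "Q: Why might a team run an LLM locally rather than in the cloud? A:",
    max_tokens=128,
    stop=["Q:"],  # stop before the model invents a follow-up question
)
print(out["choices"][0]["text"])
```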

MongoDB’s management believes MongoDB wins AI workloads against Postgres because MongoDB can handle complex data types whereas Postgres, which is a relational – or SQL – database, struggles

MongoDB is designed to handle these different data structures, and I talked about how we can help unify metadata, operational data, vector data and generated data all in one platform. Relational databases, and Postgres is one of them, have limitations in how they can handle different types of data. In fact, when the data gets too large, these relational databases have to do what’s called off-row storage, and it creates a performance overhead on these relational platforms. Postgres has this thing called TOAST, which stands for The Oversized-Attribute Storage Technique. And it’s basically a way to handle these different data types, but it creates a massive performance overhead. So we believe that we are architecturally far better for these more complex AI workloads than relational databases.
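
To make the architectural point concrete, here is a sketch of the kind of single document MongoDB’s model allows: operational fields, nested structures, arrays, metadata, and a vector embedding living side by side. The connection string and field names are illustrative assumptions.

```python
# One document holding operational data, metadata and a vector embedding
# together, with no JOINs or off-row storage. All names are illustrative.

from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@cluster.example.mongodb.net")
products = client["shop"]["products"]

products.insert_one({
    "sku": "SKU-1042",
    "name": "Organic strawberries 250g",
    "price": 3.49,                                    # operational data
    "stock": {"warehouse_a": 12, "warehouse_b": 0},   # nested structure
    "tags": ["produce", "perishable"],                # arrays are first-class
    "metadata": {"supplier": "Farm Co", "shelf_life_days": 4},
    "embedding": [0.12, -0.08, 0.33],                 # vector (truncated for brevity)
})
```

A relational schema would typically split this across several tables and, for the large vector column, fall back to exactly the off-row storage mechanisms described above.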

MongoDB’s management is seeing growing adoption of Vector, and Vector is helping attract new customers to MongoDB; an existing Atlas customer, a financial news organisation, migrated from Elasticsearch to Atlas Search in order to use MongoDB’s Vector Search capabilities; a European energy company is using Vector Search for a geospatial search application

On Vector, we’re continuing to see growth in adoption, and we see Vector is effective in attracting new customers to the MongoDB platform. A world-renowned financial news organization, which is already running in Atlas, migrated from Elasticsearch to Atlas Search using Search Nodes to take advantage of our Vector Search capabilities to build a site search that combines lexical search with semantic search to find the most relevant articles for a user query. And a European energy company built a geospatial search application using Atlas Search and Vector Search; the app was built on-prem and uses the cloud to vectorize geospatial data and facilitate research and discovery.
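
The “lexical plus semantic” combination the financial news customer built maps onto two Atlas aggregation stages: $search for keywords and $vectorSearch for embeddings. The sketch below merges the two result lists with reciprocal rank fusion; the index names, field names, and the fusion method are my assumptions, since the call doesn’t specify how the customer combined them.

```python
# Hybrid search sketch: lexical ($search) plus semantic ($vectorSearch),
# merged with reciprocal rank fusion. Index/field names are assumptions.

def hybrid_search(articles, query_text: str, query_vector: list[float], k: int = 10):
    lexical = articles.aggregate([
        {"$search": {"index": "default",
                     "text": {"query": query_text, "path": "body"}}},
        {"$limit": k},
        {"$project": {"_id": 1}},
    ])
    semantic = articles.aggregate([
        {"$vectorSearch": {"index": "vector_index", "path": "embedding",
                           "queryVector": query_vector,
                           "numCandidates": 10 * k, "limit": k}},
        {"$project": {"_id": 1}},
    ])
    # Reciprocal rank fusion: a document ranked highly by either method wins.
    scores: dict = {}
    for results in (lexical, semantic):
        for rank, doc in enumerate(results):
            scores[doc["_id"]] = scores.get(doc["_id"], 0.0) + 1.0 / (60 + rank)
    return sorted(scores, key=scores.get, reverse=True)[:k]
```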

MongoDB’s management is seeing MongoDB’s customers improve their software development productivity with the help of AI, but the rate of improvement is all over the place

[Question] We’ve talked before in the past that AI is just driving a lot of new code, making developers significantly more productive. Have you seen that behavior in any of your existing customers on Atlas where maybe their utilization rate goes up or the number of applications built per customer goes up?

[Answer] A common question I ask our customers when I meet with them in terms of what code generation tools that they’re using and what benefits they’re gaining. The answers tend to be a little bit all over the map. Some people see 10%, 15% productivity improvement. Some people say 20%, 25% productivity improvement. Some people say it helps my senior developers be more productive. Some people say it helps my junior developers become more like senior developers. So the answers tend to be all over the map.

Nvidia (NASDAQ: NVDA)

Nvidia’s Data Center revenue had incredibly strong growth in 2024 Q2, driven by demand for the Hopper GPU computing platform; compute revenue was up by 2.5x while networking revenue was up by 2x

Data Center revenue of $26.3 billion was a record, up 16% sequentially and up 154% year-on-year, driven by strong demand for NVIDIA Hopper, GPU computing and our networking platforms. Compute revenue grew more than 2.5x. Networking revenue grew more than 2x from the last year.

Even as Nvidia is getting ready to launch its Blackwell-architecture GPUs, customers are still buying the Hopper-architecture GPUs; the H200 platform, based on the Hopper architecture, started ramping in 2024 Q2 and offers 40% more memory bandwidth than the H100; management thinks that the reasons why the Hopper-architecture chips still enjoy strong demand despite the imminent arrival of the Blackwell-architecture chips are (1) AI companies need chips today to process data right now, and (2) AI companies are in a race to build the best model and they’re all racing to be the first

Customers continue to accelerate their Hopper architecture purchases while gearing up to adopt Blackwell…

…NVIDIA H200 platform began ramping in Q2, shipping to large CSPs, consumer Internet and enterprise companies. The NVIDIA H200 builds upon the strength of our Hopper architecture, offering over 40% more memory bandwidth compared to the H100…

…The demand for Hopper is really strong. And it’s true, the demand for Blackwell is incredible. There’s a couple of reasons for that. The first reason is, if you just look at the world’s cloud service providers and the amount of GPU capacity they have available, it’s basically none…

…A generative AI company spends the vast majority of their invested capital into infrastructure so that they could use an AI to help them create products. And so these companies need it now. They just simply can’t afford — you just raise money, they want you to put it to use now. You have processing that you have to do. You can’t do it next year. You got to do it today. And so that’s one reason. The second reason for Hopper demand right now is because of the race to the next plateau. The first person to the next plateau gets to introduce some revolutionary level of AI. The second person who gets there is incrementally better or about the same. And so the ability to systematically and consistently race to the next plateau and be the first one there is how you establish leadership…

…We believe our Hopper will continue to grow into the second half. We have many new and existing Hopper products that we believe will continue to ramp in the coming quarters, including Q3, with new products moving into Q4. So Hopper in the second half, versus the first half, is a growth opportunity.

Nvidia’s management thinks that the next generation of AI models will need 10-20 times more compute to train

Next-generation models will require 10 to 20x more compute to train with significantly more data. The trend is expected to continue.

Nvidia’s management sees inferencing accounting for 40% of Data Center revenue over the last 4 quarters (was 40% as of 2024 Q1)

Over the trailing 4 quarters, we estimate that inference drove more than 40% of our Data Center revenue.

Nvidia’s management is seeing demand coming from builders of frontier AI models, consumer Internet companies, and companies building generative AI applications for a wide range of use cases

Demand for NVIDIA is coming from frontier model makers, consumer Internet services, and tens of thousands of companies and start-ups building generative AI applications for consumers, advertising, education, enterprise and health care, and robotics. 

Nvidia’s Data Center revenue in China grew sequentially in 2024 Q2, but still remains below the level seen prior to export controls; management expects tough competition in China

Our Data Center revenue in China grew sequentially in Q2 and is a significant contributor to our Data Center revenue. As a percentage of total Data Center revenue, it remains below levels seen prior to the imposition of export controls. We continue to expect the China market to be very competitive going forward.

Nvidia has leadership in inference

The latest round of MLPerf inference benchmarks highlighted NVIDIA’s inference leadership, with both NVIDIA Hopper and Blackwell platforms combining to win gold medals on all tests.

Nvidia’s Blackwell family of chips combines GPUs, CPUs, DPUs (data processing units), NVLink, and networking; the GB200 NVL72 system in the Blackwell family links up 72 GPUs to act as 1 GPU and is up to 30 times faster for LLM (large language model) inference workloads; Nvidia has made a change to the Blackwell architecture to improve production yields; Blackwell’s production is expected to ramp in the fourth quarter of 2024; management sees demand for Blackwell exceeding supply by a wide margin into 2025; there are more than 100 different Blackwell architecture systems; Nvidia’s Blackwell systems come in both air-cooled and liquid-cooled flavours; management expects Nvidia’s Data Center business to grow significantly in 2025 and 2026, powered by the Blackwell system; management sees Blackwell as a step-function improvement over Hopper that delivers 3-5 times more AI throughput than Hopper; Blackwell required 7 one-of-a-kind chips to build; Nvidia designed and optimised the Blackwell system end-to-end

The NVIDIA GB200 NVL72 system with the fifth-generation NVLink enables all 72 GPUs to act as a single GPU and deliver up to 30x faster inference for LLM workloads, unlocking the ability to run trillion-parameter models in real time…

…We executed a change to the Blackwell GPU mask to improve production yields. Blackwell production ramp is scheduled to begin in the fourth quarter and continue into fiscal year ’26. In Q4, we expect to get several billion dollars in Blackwell revenue…

Demand for Blackwell platforms is well above supply, and we expect this to continue into next year…

…There are something like 100 different types of Blackwell-based systems that were built and shown at Computex, and we’re enabling our ecosystem to start sampling those…

…We offer multiple configurations of Blackwell. Blackwell comes in either a Blackwell classic, if you will, that uses the HGX form factor that we pioneered with Volta. I think it was Volta. And so we’ve been shipping the HGX form factor for some time. It is air cooled. The Grace Blackwell is liquid cooled…

…We expect to grow our Data Center business quite significantly next year. Blackwell is going to be a complete game changer for the industry. And Blackwell is going to carry into the following year…

…Blackwell is a step-function leap over Hopper. Blackwell is an AI infrastructure platform, not just the GPU. It also happens to be the name of our GPU, but it’s an AI infrastructure platform. As we reveal more of Blackwell and sample systems to our partners and customers, the extent of Blackwell’s lead becomes clear. The Blackwell vision took nearly 5 years and 7 one-of-a-kind chips to realize: the Grace CPU, the Blackwell dual GPU in a CoWoS package, ConnectX DPU for east-west traffic, BlueField DPU for north-south and storage traffic, NVLink switch for all-to-all GPU communications, and Quantum and Spectrum-X for both InfiniBand and Ethernet that can support the massive burst traffic of AI. Blackwell AI factories are building-sized computers. NVIDIA designed and optimized the Blackwell platform full stack, end-to-end, from chips, systems, networking, even structured cables, power and cooling, and mountains of software to make it fast for customers to build AI factories. These are very capital-intensive infrastructures. Customers want to deploy it as soon as they get their hands on the equipment and deliver the best performance and TCO. Blackwell provides 3 to 5x more AI throughput in a power-limited data center than Hopper…

…The Blackwell system lets us connect 144 GPUs in 72 GB200 packages into 1 NVLink domain, with an aggregate NVLink bandwidth of 259 terabytes per second in 1 rack. Just to put that in perspective, that’s about 10x higher than Hopper.  

Nvidia’s Ethernet for AI revenue doubled sequentially; management sees Nvidia’s Ethernet product, Spectrum-X, enjoying wide support from the AI ecosystem; Spectrum-X performs 1.6 times better than traditional Ethernet; management plans to launch new Spectrum-X products every year and thinks that Spectrum-X will soon become a multi-billion dollar product

Our Ethernet for AI revenue, which includes our Spectrum-X end-to-end Ethernet platform, doubled sequentially with hundreds of customers adopting our Ethernet offerings. Spectrum-X has broad market support from OEM and ODM partners and is being adopted by CSPs, GPU cloud providers and enterprises, including xAI to connect the largest GPU compute cluster in the world. Spectrum-X supercharges Ethernet for AI processing and delivers 1.6x the performance of traditional Ethernet. We plan to launch new Spectrum-X products every year to support demand for scaling compute clusters from tens of thousands of GPUs today to millions of GPUs in the near future. Spectrum-X is well on track to become a multibillion-dollar product line within a year.

Japan’s government is working with Nvidia to build an AI supercomputer; Nvidia’s management thinks sovereign AI revenue will reach the low double-digit billions this year; management is seeing countries wanting to build their own generative AI that incorporates their own language, culture, and data

Japan’s National Institute of Advanced Industrial Science and Technology is building its AI Bridging Cloud Infrastructure 3.0 supercomputer with NVIDIA. We believe sovereign AI revenue will reach low double-digit billions this year…

…It certainly is a unique and growing opportunity, something that surfaced with generative AI and the desires of countries around the world to have their own generative AI that would be able to incorporate their own language, incorporate their own culture, incorporate their own data in that country.

Most of the Fortune 100 companies are working with Nvidia on AI projects

We are working with most of the Fortune 100 companies on AI initiatives across industries and geographies. 

Nvidia’s management is seeing a range of applications driving the company’s growth; these applications include (1) Amdocs’ smart agent which is reducing customer service costs by 30%, and (2) Wistron’s usage of Nvidia AI Omniverse to reduce cycle times in its factories by 50%

A range of applications are fueling our growth, including AI-powered chatbots, generative AI copilots and agents to build new, monetizable business applications and enhance employee productivity. Amdocs is using NVIDIA generative AI for their smart agent, transforming the customer experience and reducing customer service costs by 30%. ServiceNow is using NVIDIA for its Now Assist offering, the fastest-growing new product in the company’s history. SAP is using NVIDIA to build Joule copilot. Cohesity is using NVIDIA to build their generative AI agent and lower generative AI development costs. Snowflake, who serves over 3 billion queries a day for over 10,000 enterprise customers, is working with NVIDIA to build copilots. And lastly, Wistron is using NVIDIA AI Omniverse to reduce end-to-end cycle times for their factories by 50%.

Every automobile company that is developing autonomous vehicle technology is working with Nvidia; management thinks that automotive will account for multi-billions in revenue for Nvidia; Nvidia won the Autonomous Grand Challenge at the recent Computer Vision and Pattern Recognition Conference

Every automaker developing autonomous vehicle technology is using NVIDIA in their data centers. Automotive will drive multibillion dollars in revenue across on-prem and cloud consumption and will grow as next-generation AV models require significantly more compute…

…At the Computer Vision and Pattern Recognition Conference, NVIDIA won the Autonomous Grand Challenge in the End-to-End Driving at Scale category, outperforming more than 400 entries worldwide.

Nvidia’s management announced Nvidia AI Foundry – a platform for building custom AI models – in 2024 Q2; users of Nvidia AI Foundry are able to customise Meta’s Llama 3.1 foundation AI model; Nvidia AI Foundry is the first platform where users are able to customise an open-source, frontier-level foundation AI model; Accenture is already using Nvidia AI Foundry 

During the quarter, we announced a new NVIDIA AI foundry service to supercharge generative AI for the world’s enterprises with Meta’s Llama 3.1 collection of models… 

…Companies for the first time can leverage the capabilities of an open source, frontier-level model to develop customized AI applications to encode their institutional knowledge into an AI flywheel to automate and accelerate their business. Accenture is the first to adopt the new service to build custom Llama 3.1 models for both its own use and to assist clients seeking to deploy generative AI applications.

Companies from many industries are using NIMs (Nvidia inference microservices) for deployment of generative AI; AT&T saw 70% cost savings and 8 times latency reduction with NIMs; over 150 partners are embedding NIMs across the AI ecosystem; Nvidia recently announced NIM Agent Blueprints, a catalog of reference AI applications; Nvidia is using NIMs to open the Nvidia Omniverse to new industries

NVIDIA NIMs accelerate and simplify model deployment. Companies across health care, energy, financial services, retail, transportation, and telecommunications are adopting NIMs, including Aramco, Lowe’s, and Uber. AT&T realized 70% cost savings and 8x latency reduction after moving to NIMs for generative AI, call transcription and classification. Over 150 partners are embedding NIMs across every layer of the AI ecosystem.

We announced NIM Agent Blueprints, a catalog of customizable reference applications that include a full suite of software for building and deploying enterprise generative AI applications. With NIM Agent Blueprints, enterprises can refine their AI applications over time, creating a data-driven AI flywheel. The first NIM Agent Blueprints include workloads for customer service, computer-aided drug discovery, and enterprise retrieval augmented generation. Our system integrators, technology solution providers, and system builders are bringing NVIDIA NIM Agent Blueprints to enterprises…

…We announced new NVIDIA USD NIMs and connectors to open Omniverse to new industries and enable developers to incorporate generative AI copilots and agents into USD workloads, accelerating our ability to build highly accurate virtual worlds.

Nvidia’s AI Enterprise software platform is powering Nvidia’s software-related business to approach a $2 billion annual revenue run-rate by the end of this year; management thinks Nvidia AI Enterprise represents great value for customers by providing GPUs at a price of $4,500 per GPU per year; management thinks the TAM (total addressable market) for Nvidia’s AI software business can be significant

NVIDIA NIM and NIM Agent Blueprints are available through the NVIDIA AI Enterprise software platform, which has great momentum. We expect our software, SaaS and support revenue to approach a $2 billion annual run rate exiting this year, with NVIDIA AI Enterprise notably contributing to growth…

…At $4,500 per GPU per year, NVIDIA AI Enterprise is an exceptional value for deploying AI anywhere. And for NVIDIA’s software TAM, it can be significant as the CUDA-compatible GPU installed base grows from millions to tens of millions. 
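
A quick back-of-envelope (my own, not Nvidia's) shows why that TAM claim is plausible. At the quoted price, an installed base in the tens of millions implies:

$$10{,}000{,}000 \text{ GPUs} \times \$4{,}500 \text{ per GPU per year} = \$45 \text{ billion per year}$$

Even modest attach rates on a base of that size would make the software line material.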

Computers that contain Nvidia’s RTX chip can deliver up to 1,300 AI TOPS (tera operations per second); there are more than 200 RTX AI computer models from computer manufacturers; there is an installed base of 100 million RTX AI computers; a game called Mecha BREAK is the first game to use Nvidia ACE, a generative AI service for creating digital humans

Every PC with RTX is an AI PC. RTX PCs can deliver up to 1,300 AI TOPS, and there are now over 200 RTX AI laptop designs from leading PC manufacturers. With 600 AI-powered applications and games and an installed base of 100 million devices, RTX is set to revolutionize consumer experiences with generative AI. NVIDIA ACE, a suite of generative AI technologies, is available for RTX AI PCs. Mecha BREAK is the first game to use NVIDIA ACE, including our small language model, Nemotron-4 4B, optimized for on-device inference.

Foxconn, the largest electronics manufacturer in the world, and Mercedes-Benz, the well-known auto manufacturer, are using Nvidia Omniverse to produce digital twins of their manufacturing plants

The world’s largest electronics manufacturer, Foxconn, is using NVIDIA Omniverse to power digital twins of the physical plants that produce NVIDIA Blackwell systems. And several large global enterprises, including Mercedes-Benz, signed multiyear contracts for NVIDIA Omniverse Cloud to build industrial digital twins of factories.

Many robotics companies are using Nvidia’s AI robot software

Boston Dynamics, BYD Electronics, Figure, Intrinsic, Siemens, Skild AI and Teradyne Robotics are using the NVIDIA Isaac robotics platform for autonomous robot arms, humanoids and mobile robots.

Nvidia’s management is seeing some customers save up to 90% in computing costs by transitioning from general-purpose computing (CPUs) to accelerated computing (GPUs)

We know that accelerated computing, of course, speeds up applications. It also enables you to do computing at a much larger scale, for example, scientific simulations or database processing. But what that translates directly to is lower cost and lower energy consumed. And in fact, this week, there’s a blog that came out that talked about a whole bunch of new libraries that we offer. And that’s really the core of the first platform transition, going from general-purpose computing to accelerated computing. And it’s not unusual to see someone save 90% of their computing cost. And the reason for that is, of course, you just sped up an application 50x. You would expect the computing cost to decline quite significantly.
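
The arithmetic behind that 90% figure is straightforward once you assume a price premium for the accelerated hardware; the 5x premium below is my illustrative assumption, not an Nvidia number:

$$\text{new cost} \approx \text{old cost} \times \frac{\text{price premium}}{\text{speedup}} = \text{old cost} \times \frac{5}{50} = 10\% \text{ of the original}$$

In other words, even if the accelerated node costs several times more per hour, a 50x speedup means the job occupies it for a tiny fraction of the time, and the bill falls accordingly.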

Nvidia’s management believes that generative AI is a new way to write software and is changing how every layer of computing is done

Generative AI, taking a step back about why it is that we went so deeply into it, is because it’s not just a feature, it’s not just a capability, it’s a fundamental new way of doing software. Instead of human-engineered algorithms, we now have data. We tell the AI, we tell the model, we tell the computer what are the expected answers, what are our previous observations, and then for it to figure out what the algorithm is, what’s the function. It learns a universal — AI is a bit of a universal function approximator and it learns the function. And so you could learn the function of almost anything, and anything that you have that’s predictable, anything that has structure, anything that you have previous examples of. And so now here we are with generative AI. It’s a fundamental new form of computer science. It’s affecting how every layer of computing is done from CPU to GPU, from human-engineered algorithms to machine-learned algorithms, and the type of applications you could now develop and produce is fundamentally remarkable.

Nvidia’s management thinks AI models are still seeing the benefits of scaling

There are several things that are happening in generative AI. So the first thing that’s happening is the frontier models are growing in quite substantial scale. And we’re still all seeing the benefits of scaling.

The amount of compute needed to train an AI model goes up much faster than the size of the model; management thinks the next generation of AI models could require 10-40 times more compute 

Whenever you double the size of a model, you also have to more than double the size of the data set to go train it. And so the amount of flops necessary in order to create that model goes up quadratically. And so it’s not unexpected to see that the next-generation models could take 10x, 20x, 40x more compute than last generation.
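
The quadratic claim follows from a common approximation in the scaling-law literature (my framing, not something Nvidia cited on the call): training compute scales roughly as $C \approx 6ND$, where $N$ is the parameter count and $D$ is the number of training tokens. So:

$$\frac{C_{\text{next}}}{C_{\text{prev}}} = \frac{N_{\text{next}}}{N_{\text{prev}}} \times \frac{D_{\text{next}}}{D_{\text{prev}}} = 2 \times 2 = 4$$

Doubling the model while doubling the data quadruples the compute, and scaling each factor by roughly 3x to 6x lands in the 10x to 40x range management mentions.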

Nvidia’s management is seeing more frontier model makers in 2024 than in 2023

Surprisingly, there are more frontier model makers than last year.

Nvidia’s management is seeing advertising-related computing needs shifting from being powered by CPUs to being powered by GPUs and generative AI

The largest systems, largest computing systems in the world today, and you’ve heard me talk about this in the past, which are recommender systems moving from CPUs. It’s now moving from CPUs to generative AI. So recommender systems, ad generation, custom ad generation targeting ads at very large scale and quite hyper-targeting, search and user-generated content, these are all very large-scale applications that have now evolved to generative AI.

Nvidia’s management is seeing generative AI startups generating tens of billions of dollars of revenue opportunities for cloud computing providers

The number of generative AI start-ups is generating tens of billions of dollars of cloud renting opportunities for our cloud partners

Nvidia’s management is seeing that cloud computing providers have zero GPU capacity available because they are using it for internal workloads (such as accelerating data processing) and renting it out to model makers and other AI startups

If you just look at the world’s cloud service providers and the amount of GPU capacity they have available, it’s basically none. And the reason for that is because they’re either being deployed internally for accelerating their own workloads, data processing, for example…

…The second is, of course, the rentals. They’re renting capacity to model makers. They’re renting it to start-up companies. 

Nvidia’s management thinks Nvidia’s GPUs are the only AI GPUs that process and accelerate data; before the advent of generative AI, the number one use case of Nvidia’s GPUs was to accelerate data processing

NVIDIA’s GPUs are the only accelerators on the planet that process and accelerate data. SQL data, Pandas data, data science toolkits like Pandas and the new one, Polars: these are the most popular data processing platforms in the world, and aside from CPUs, which, as I’ve mentioned before, are really running out of steam, NVIDIA’s accelerated computing is really the only way to get a performance boost out of that. And so the #1 use case long before generative AI came along is the migration of applications one after another to accelerated computing.

Nvidia’s management thinks that those who purchase Nvidia AI chips are getting immediate ROI (return on investment) for a few reasons: (1) GPUs are a better way to build data centers compared to CPUs because GPUs save money on data processing compared to CPUs, (2) cloud computing providers who rent out GPUs are able to rent out their GPUs the moment they are built up in the data center because there are many generative AI companies clamouring for the chips, and (3) generative AI improves a company’s own services, which delivers a fast ROI

The people who are investing in NVIDIA infrastructure are getting returns on it right away. It’s the best ROI infrastructure, computing infrastructure investment you can make today. And so one way to think through it, probably the easiest way to think through it is just to go back to first principles. You have $1 trillion worth of general-purpose computing infrastructure. And the question is, do you want to build more of that or not?

And for every $1 billion worth of general-purpose CPU-based infrastructure that you stand up, you probably rent it for less than $1 billion. And so because it’s commoditized, there’s already $1 trillion on the ground. What’s the point of getting more? And so the people who are clamoring to get this infrastructure, one, when they build out Hopper-based infrastructure and soon, Blackwell-based infrastructure, they start saving money. That’s tremendous return on investment. And the reason why they start saving money is because data processing saves money, and data processing is probably just a giant part of it already. And so recommender systems save money, so on and so forth, okay? And so you start saving money.

The second thing is everything you stand up are going to get rented because so many companies are being founded to create generative AI. And so your capacity gets rented right away and the return on investment of that is really good.

And then the third reason is your own business. Do you want to either create the next frontier yourself or your own Internet services, benefit from a next-generation ad system or a next-generation recommender system or a next-generation search system? So for your own services, for your own stores, for your own user-generated content, social media platforms, for your own services, generative AI is also a fast ROI.

Nvidia’s management is seeing a significant number of data centers wanting liquid-cooled GPU systems because the use of liquid cooling enables 3-5 times more AI throughput compared to the past, resulting in cheaper TCO (total cost of ownership)

The number of data centers that want to go to liquid cooled is quite significant. And the reason for that is because we can, in a liquid-cooled data center, in any power-limited data center, whatever size of data center you choose, you could install and deploy anywhere from 3 to 5x the AI throughput compared to the past. And so liquid cooling is cheaper. Our TCO is better, and liquid cooling allows you to have the benefit of this capability we call NVLink, which allows us to expand it to 72 Grace Blackwell packages, which has essentially 144 GPUs.

Nvidia does not do the full integration of its GPU systems into a data center because it is not the company’s area of expertise

Our customers hate that we do integration. The supply chain hates us doing integration. They want to do the integration. That’s their value-add. There’s a final design-in, if you will. It’s not quite as simple as shimmying into a data center, but the design fit-in is really complicated. And so the design fit-in, the installation, the bring-up, the repair-and-replace, that entire cycle is done all over the world. And we have a sprawling network of ODM and OEM partners that does this incredibly well.

Nvidia has released many new libraries for CUDA, across a wide variety of use cases, for AI software developers to work with

Accelerated computing starts with CUDA-X libraries. New libraries open new markets for NVIDIA. We released many new libraries, including CUDA-X accelerated Polars, Pandas and Spark, the leading data science and data processing libraries; cuVS for vector databases, which is incredibly hot right now; Aerial and Sionna for 5G wireless base stations, a whole world of data centers that we can go into now; Parabricks for gene sequencing; and AlphaFold2 for protein structure prediction, which is now CUDA accelerated.
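
To give a feel for what “CUDA-X accelerated Pandas” means in practice, here is a sketch using RAPIDS cuDF’s pandas accelerator mode, which lets existing pandas code run on the GPU without rewrites (the CSV filename is an illustrative assumption):

```python
# Sketch: GPU-accelerating unchanged pandas code with RAPIDS cuDF's
# pandas accelerator. Operations run on the GPU where supported and
# fall back to CPU pandas where they are not.

import cudf.pandas
cudf.pandas.install()

import pandas as pd  # now transparently backed by cuDF on the GPU

df = pd.read_csv("transactions.csv")  # illustrative file
summary = (df.groupby("customer_id")["amount"]
             .sum()
             .sort_values(ascending=False))
print(summary.head())
```

This kind of drop-in acceleration is what the earlier cost-savings argument rests on: the code does not change, only the hardware it lands on.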

Nvidia now has 3 networking platforms for GPUs

We now have 3 networking platforms, NVLink for GPU scale-up, Quantum InfiniBand for supercomputing and dedicated AI factories, and Spectrum-X for AI on Ethernet. NVIDIA’s networking footprint is much bigger than before. 

Salesforce (NYSE: CRM)

Agentforce is a new architecture and product that management believes will be fundamental to Salesforce’s AI leadership in the next decade; Salesforce will be introducing Agentforce at its upcoming Dreamforce customer event; Agentforce is an autonomous AI and management will be getting every attendee at Dreamforce to turn on their own AI agents; Salesforce is already building agents for Workday, which will be Salesforce’s first Agentforce partner; Agentforce allows companies to build custom agents for sales, service, marketing, and commerce; management believes that within a year, most companies will be deploying autonomous AI agents at scale, and these agents will have a big positive impact on companies’ operations; Agentforce is currently management’s singular focus; many companies are already using Agentforce, including one of the world’s largest healthcare companies, which is resolving more than 90% of patient inquiries with Agentforce and thinks Agentforce is much better than any other competing AI platform; a very large media company is using Agentforce to resolve 90% of employee and consumer issues; management thinks Salesforce is the first company to deploy high-quality enterprise AI agents at scale; Agentforce is not a co-pilot, it is an autonomous agent that is accurate and can be deployed right out of the box; Agentforce agents can do advanced planning and reasoning with minimal human input; management sees Agentforce as a trusted colleague that will complement human users; management sees thousands of companies using Agentforce by January 2025; early trials of Agentforce have shown remarkable success

We’re going to talk about a whole different kind of sales force today, a different kind of architecture and a product that we didn’t even talk about on the last earnings call that is going to be fundamental to our future and a manifestation of our decade of AI leadership, which is Agentforce. Now in just a few weeks, we’re going to kick off Dreamforce, and I hope all of you are planning to be there, the largest AI event in the world with more than 45,000 trailblazers in San Francisco. And this year, Dreamforce is really becoming Agentforce…

…We’re going to show our new Agentforce agents and how we’ve reimagined enterprise software for this new world of autonomous AI. And every customer, I’m going to try to get every customer who comes to Dreamforce to turn agents on while they’re there…

This idea goes beyond just having sales agents and service agents. You probably read or heard, maybe you saw on CNBC, that we’re building the agents for Workday, and we’re going to be building custom agents for so many of you as well with Agentforce, because it is a development platform as well as this incredible capability to radically extend your sales and service organizations.

So when you arrive at the Dreamforce campus, you’re going to see a big sign outside that says, humans with agents drive customer success together. And that’s because we now so strongly believe the future isn’t about having a sales force or a service force or a marketing force or a commerce force or an analytics force. The future is about also having an Agentforce. And while many customers today don’t yet have agent forces, but they do have sales forces or service forces, I assure you that within a year, we’re all going to have agent forces, and we’re going to have them at scale. And it’s going to radically extend our companies and it’s going to augment our employees, make us more productive. It’s going to turn us into these incredible margin and revenue machines. It’s going to be pretty awesome…

…with this Agentforce platform, we’re making it easy to build these powerful autonomous agents for sales, for service, for marketing, for commerce, automating the entire workflow on their own, embedding agents in the flow of work and getting our customers to the agent future first. And this is our primary goal of our company right now. This is my singular focus…

…We’re going to talk about the customers who have it, customers like OpenTable and Wiley and ADP and RBC and so many others who are deploying these agents and running them on top of our Data Cloud and our apps…

…At Dreamforce, you’re going to hear one of the very largest health care companies in the world. It’s got 20 million consumers here in the United States who is resolving more than 90% of all patient inquiries with Agentforce and they’re benchmarking us significantly higher than any other competing AI platform, and that’s based on some incredible AI breakthroughs that we have had at Salesforce…

…One of these very large media companies that we work with, a lot of probably know who have everything, every possible media asset, while they’re just resolving 90% of all of their employee and consumer issues with Agentforce, pretty awesome. So there’s nothing more transformational than agents on the technology horizon that I can see and Salesforce is going to be the first company at scale to deploy enterprise agents and not just any enterprise agents, the highest quality, most accurate agents in the world…

…We’re seeing that breakthrough occur because with our new Agentforce platform, we’re going to make a quantum leap forward in AI, and that’s why I want you all at Dreamforce, because I want you to have your hands on this technology to really understand this. This is not co-pilots…

…These agents are autonomous. They’re able to act with accuracy. They’re able to come right out of the box. They’re able to go right out of the platform…

…These agents don’t require a conversational prompt to take action. You can do advanced planning, reasoning with minimal human input. And the example of this incredible health care company, you’re going to be able to say to the agent, “Hey, I want to look at my labs, I want to understand this. It looks like I need repeat labs. Can you reschedule those for me? It looks like I need to see my doctor, can you schedule that for me? I also want to get an MRI, I want to get this.” And the level of automation that we’re going to be able to provide and unleash the productivity back into these organizations is awesome…

…This is going to be like having trusted colleagues who can handle these time-consuming tasks, whether it’s engaging an inbound lead or resolving a customer or patient inquiry or whatever it is. This is humans with agents driving customer success together. Agentforce agents can be set up in minutes, are easily scalable, and work around the clock in any language. And by the beginning of next fiscal year, we will have thousands of customers using this platform, and we will have helped them hands-on to deploy it successfully. The early trials have been remarkable; seeing these customers have this success has been just awesome…

…We’re just at the beginning of building an Agentforce ecosystem with companies able to build agents on our platform for their workforce and use cases, and we’re excited to have Workday as our first agent force partner.

Salesforce has been able to significantly reduce hallucinations with its AI products, and thus deliver highly accurate results, through the use of new retrieval-augmented generation (RAG) techniques.

The accuracy of our results, the reduction of hallucinations and the level of capability of the AI is unlike anything I think that any of us have ever seen, and we’ve got some incredible new techniques, especially incredible new retrieval-augmented generation techniques, that are delivering us the capability to deliver this accuracy for our customers.
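For readers unfamiliar with the pattern being referred to, below is a minimal, generic sketch of how retrieval-augmented generation grounds a model's answer in retrieved documents. This is purely illustrative and is not Salesforce's implementation: the toy corpus, the bag-of-words retriever, and the call_llm stub are all assumptions.

```python
# A minimal, generic sketch of the retrieval-augmented generation (RAG) pattern.
# This is NOT Salesforce's implementation; the toy corpus, the bag-of-words
# retriever, and the call_llm stub are illustrative assumptions.
from collections import Counter
import math

CORPUS = [
    "Order #1234 shipped on 2024-08-01 via standard post.",
    "Refunds are processed within 5 business days.",
    "Loyalty points expire 12 months after they are earned.",
]

def embed(text):
    """Toy 'embedding': a bag-of-words token count."""
    return Counter(text.lower().split())

def cosine(a, b):
    """Cosine similarity between two bag-of-words vectors."""
    dot = sum(a[t] * b[t] for t in a)
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0

def retrieve(query, k=2):
    """Rank documents by similarity to the query and keep the top k."""
    q = embed(query)
    return sorted(CORPUS, key=lambda doc: cosine(q, embed(doc)), reverse=True)[:k]

def call_llm(prompt):
    """Stub standing in for any LLM API call."""
    return f"[LLM response to a prompt of {len(prompt)} characters]"

def answer(query):
    # Grounding the prompt in retrieved documents is the step that curbs
    # hallucination: the model is told to answer only from supplied context.
    context = "\n".join(retrieve(query))
    prompt = f"Answer using ONLY this context:\n{context}\n\nQuestion: {query}"
    return call_llm(prompt)

print(answer("When do my loyalty points expire?"))
```

The key design idea is that the model is asked to answer only from the retrieved context, which makes its output auditable against source documents and is what curbs hallucination.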

Salesforce’s management still sees the company as the No.1 AI CRM in the world

Of course, Salesforce is the #1 AI CRM.

In 2024 Q2, Einstein delivered 25 trillion transactions and more than 1 trillion workflows; Wyndham is using Einstein to reduce average call times and free up service agents for higher-value work

We’re just operating at this incredible scale, delivering 25 trillion Einstein transactions across all of the clouds during the quarter, that’s 25 trillion and more than 1 trillion workflows…

…MuleSoft allows Wyndham to unlock business-critical data from various platforms and onboard future franchisees faster. And with Einstein generated recommended service replies, average call times have been reduced and service agents can focus on higher priority work

Salesforce’s management thinks many of the company’s customers have a misconception about AI in that they need to build and train their own AI models; management is able to use Salesforce’s AI models and resolve issues much better than its customers’ own models; management thinks Salesforce’s AI models have the highest efficacy

I think that there’s a lot of misconceptions about AI with my customers. I have been out there very disappointed with the huge amount of money that so many of these customers have wasted on AI. They are trying to DIY their AI…

…This idea that our customers are going to have to build their own models, train their own models, retrain their own models, retrain them again: I’m meeting with these customers, and they’re so excited, and they say, “Oh, I built this model, and we’re resolving 10%, 20%, 30%, 40% of this or that.” And I’m like, really? Take a look at our models and our capability, where you don’t have to train or retrain anything and you’re going to get more than 90%. And then they say, wait a minute, how do you do that? And this is a moment where every customer needs to realize you don’t need to DIY your AI. You can use a platform like Salesforce to get the highest efficacy of artificial intelligence, the best capability to fully automate your company, achieve all of your goals, and you can do it with professional enterprise software…

…I met with one of the largest CIOs in the world, who was telling me how excited he was about the B2C part of his business: he had built this model and was touting its accuracy rates. And I was like, really? Let me show you what we’re doing here. And then he said to me, why am I doing this? Why am I not just using your platform? And I said, good question. So these customers are spending millions of dollars, but are they really getting the results that they want? It feels like the early days of cloud, the early days of social and mobile. Customers feel like they have to DIY it, that they can make it all happen themselves, but they don’t need to. And you can just see that to deliver this very high-quality capability, they can use a deeply integrated platform like Salesforce.

Salesforce’s management is seeing the company’s customers get immediate ROI (return on investment) from deploying AI automation solutions because the solutions can be easily and quickly customised and configured

We’ve created an out-of-the-box platform to deliver all of this for them. So this could be service reply recommendations, account summaries, report generation, and, as you’ve seen in Slack, this kind of auto-summarization and recaps. All of these amazing things, the level of automation, the amount of code that our team has written, the transformation of our platform in the last 18 months, it’s remarkable. And customers love it because they can take the platform and all of these generative AI use cases and customize them for their own needs, or configure them using our capability, and they’re doing that without writing a line of code. It’s clicks, not code. They deploy in days, not weeks; they’re doing this in months, not years; and they’re getting immediate ROI.

Salesforce’s management thinks many of the company’s customers are really disappointed with Microsoft Copilot because of a lack of accuracy

So many customers are so disappointed in what they bought from Microsoft Copilot because they’re not getting the accuracy and the responses that they want. Microsoft has disappointed so many customers with AI.

Wiley is achieving a double-digit percentage increase in customer satisfaction and deflection rate, and a 50% increase in case resolution, with the first generation of Agentforce; Royal Bank of Canada and ADP are seeing 90% case resolution rates with the second generation of Agentforce; OpenTable is using Agentforce to support its interactions with 60,000 restaurants and 160 million diners

Wiley is a long-standing Salesforce customer. It’s one of our first deployments in the first Agentforce trial. It’s pretty awesome. And you all know they make textbooks and it’s back-to-school. But maybe you don’t know that Wiley has to surge their sales and service organization at back-to-school time when everyone’s buying these textbooks. Well, now they can use agents to do that surge. They don’t have to go buy a bunch of gig workers and bring them in, and that agent capacity is so exciting for them. What we saw with Wiley was, and this is a quote from them, “We’re seeing a double-digit percentage increase in customer satisfaction and deflection rate compared to older technologies in these early weeks of our busiest season.” So that was very reassuring to us, that we have the right thing happening. And Wiley has already seen a 50% increase in case resolution. That’s with our first generation of Agentforce.

As I mentioned, the second generation of Agentforce, which we have with customers already, including some of these amazing organizations like Royal Bank of Canada, ADP and others is 90% case resolution. It is an awesome moment in this tech business.

OpenTable is another super great story. You all know they are managing 60,000 restaurants, with 160 million diners to support. They’re on Agentforce now. They require that incredible scale to deliver top-notch customer service; that’s why they’re using the product. It’s been awesome to get their results, and it can be all kinds of questions: resolving basic issues, account activations, reservation management, loyalty point expiration. Agentforce for Service can easily answer all of these questions, like when do my points expire for a diner, as well as a follow-up question like, what about in Mexico? Can I make this change? That’s where we’re delivering those incredible moments for OpenTable, giving them this kind of productivity enhancement.

Agentforce is driving growth in cloud products’ sales for Salesforce

Agentforce for Sales, you can imagine extending your sales force with SDRs and BDRs who are agents that are going out and building pipeline for you, generating all kinds of demand and even really closing deals. So this is going to drive Sales Cloud growth (it already is) and Service Cloud growth (it already is), because customers are going to extend their sales and service organizations and become a lot more productive with these agents.

Salesforce will be releasing industry-specific AI agents in the coming months

In the coming months, we’re going to release Agentforce agents for other roles, including industry-specific agents, health agents, as I mentioned. 

Data Cloud provides the foundation for Agentforce because it holds a huge amount of data and metadata; management continues to believe that data is the foundation of AI; Data Cloud federates and connects to all other data clouds of a user to deliver super accurate AI; Data Cloud is Salesforce’s fastest-growing organic product and will be the fastest to hit $1 billion, $5 billion, and $10 billion in revenue; Data Cloud customers were up 130% year-on-year in 2024 Q2; the number of Data Cloud customers spending more than $1 million annually has doubled; Data Cloud processed 2.3 quadrillion records in 2024 Q2 (was 2 quadrillion in 2024 Q1); Data Cloud consumption was up 110% year-on-year in 2024 Q2; American Family Insurance is using Data Cloud to create a 360-degree view of customers; Adecco Group is using Data Cloud to create seamless access to information for 27,000 of its employees; Wyndham is using Data Cloud to unify profiles of 165 million guest records, many of which are duplicates across multiple sources

This type of performance from our Agentforce platform wouldn’t be possible without Data Cloud. One of the reasons that our agents are so accurate is because of the huge amount of data and metadata that we have. And data is the foundation for every AI transformation. And with Data Cloud, we’re providing a high-performance data lake that brings together all our customer and business data, federating data from external repositories through this incredible zero-copy alliance. So customers can use our Data Cloud and then federate and connect to all their other data clouds, and then we can bring it all together to deliver this super accurate AI.

That’s why Data Cloud is absolutely our fastest-growing organic product in history. It will be the fastest product to $1 billion, and it’s probably going to be the fastest product to $5 billion and $10 billion. In Q2, the number of paid Data Cloud customers grew 130% year-over-year, and the number of customers spending more than $1 million annually has already doubled. In the second quarter alone, and this is amazing, Data Cloud processed 2.3 quadrillion records with 110% platform consumption growth year-over-year…

…American Family Insurance, with millions of policyholders nationwide, is using Data Cloud to consolidate data from multiple sources through our zero-copy partner network, creating a 360-degree view of its customers, enabling quick segmentation and activating lead data, including their real-time web interactions. The Adecco Group expanded their Data Cloud in the quarter, a great example of a company leveraging its gold mine of data to gain a unified view of its customers. Connecting all this data means that 27,000 Adecco employees using Salesforce will have seamless access to key information, including financial metrics and job fulfillment status, to help Adecco improve their job fill rate ratio and reduce their cost to serve…

…Wyndham utilizes Data Cloud to unify profiles of 165 million guest records, many of which were duplicates across many sources like Amazon Redshift and the Sabre Reservation System as well as Sales Cloud, Marketing Cloud and Service Cloud. 

Salesforce has rewritten all of its software to be under one unified platform; management thinks building AI agents without a unified platform is risky; the decision to unite all of Salesforce’s software was made 18 months ago with the shift to AI

We’ve automated every customer touch point and now we’re bringing these apps, data and agents together. It’s these 3 levels, and this isn’t 3 separate pieces of code or 3 different platforms or 3 different systems; this is 1 platform. We’ve rewritten all of our acquisitions, all of our core modules, our Data Cloud and our agents as 1 unified platform, which is how we are delivering not only this incredible functionality but this high level of accuracy and capability. And from this first-hand experience in meeting with these customers around the globe, I can unequivocally tell you that building these agents without a complete, integrated platform is like trying to assemble a plane mid-flight; it’s risky, chaotic and not likely to succeed…

…With the shift to AI, it just became clear 18 months ago, we need to hit the accelerator pedal and rewrite all these things onto the core platform because customers are going to get this incredible value by having 1 integrated system, and it scales from small companies to extremely large companies. 

New bookings for Salesforce’s AI products more than doubled quarter-on-quarter in 2024 Q2; Salesforce signed 1,500 AI deals in 2024 Q2; aircraft maker Bombardier is using Salesforce’s AI products to arm sales reps with better information on, and recommendations for, prospects

We’re already accelerating this move from AI hype to AI reality for thousands of customers with amazing capabilities across our entire AI product portfolio. New bookings for these products more than doubled quarter-over-quarter. We signed 1,500 AI deals in Q2 alone. Some of the world’s largest brands are using our AI solutions, including Alliant, Bombardier and CMA CGM. Bombardier, the maker of some of the world’s top-performing aircraft, is enabling sales reps to sell smarter by consolidating need-to-know information on prospects in advance of meetings and providing recommendations on how to best engage with them through Einstein Copilot and Prompt Builder.

Salesforce has a new team called Salesforce CTOs that will work alongside customers in deploying AI agents

To help our customers navigate this new world, we just launched a new team called Salesforce CTOs. These are deeply technical individuals who work alongside our customers to help them create and execute a plan for every stage of their AI journey to become agent first companies. 

Salesforce sees itself as customer zero for all its AI products, including Agentforce, and it is deploying its own AI products internally with success; 35,000 Salesforce employees are using Einstein as an AI assistant; Salesforce has used Slack AI to create more than 500,000 channel summaries since February 2024, saving nearly 3 million hours of work

We’re continuing our own AI journey internally as Customer Zero of all of our products, with great results. We now have 35,000 employees using Einstein as a trusted AI assistant, helping them work smarter and close deals faster. And since we launched Slack AI in February, our employees have created more than 500,000 channel summaries, saving nearly 3 million hours of work. We’ll, of course, deploy Agentforce agents soon in a variety of different roles and tasks to augment, automate and deliver productivity and unmatched experiences for all employees and customers at scale.

Salesforce will be introducing Industry Toolkit at Dreamforce; Industry Toolkit contains more than 100 ready-to-use AI-powered actions; Industry Toolkit can be used with Agentforce 

At Dreamforce, we’re excited to share our new Industry Toolkit, which features more than 100 ready-to-use, customizable AI-powered actions. All of these actions can be applied to build industry-specific agents with Agentforce.

Salesforce’s management wants to see 1 billion AI agents by FY2026; around 100 million to 200 million consumers have already been identified in ongoing trials

I’ll just give you my own personal goals, so I’m not giving any guidance here. My goal is that by the end of fiscal year ’26, we will have 1 billion agents. Already, just looking at the number of consumers identified in the trials that we have going on, we have like 100 million identified or more; okay, call it 200 million. But the funny thing is, of course, it’s only 1 agent. So let’s just think of it as a manifestation of all these agents talking to all these consumers.

Salesforce already has a long history of selling non-human consumption-based products; with AI agents, management sees pricing on a consumption basis or on a per conversation basis (at $2 per conversation); management thinks AI agents is a very high-margin opportunity

On pricing. When you think about apps, you think about humans, because humans use apps, but not in all cases. So for example, the Data Cloud is a consumption product. The Commerce Cloud is a consumption product. Of course, the e-mail product, Marketing Cloud, is a consumption product. Heroku is a consumption product. So of course, we’ve had non-human consumption-based products for quite a long time at Salesforce…

…When we look at pricing, it will be on a consumption basis. And when we think about that, we think about saying to our customers, and we have, that it’s about $2 per conversation. So that is kind of how we think about it: we’re going to have a lot of agents out there, even though it’s only 1 agent. It’s a very high-margin opportunity, as you can imagine. And look, you have to think about it this way: these agents are the new website. This is your new phone number. This is how your customers are going to be connecting with you in this new way, and we’re going to be helping our customers to manage these conversations. So a per-conversation charge is a good way to look at it, or we’re selling additional consumption credits like we do with our Data Cloud.
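To make the consumption maths concrete, here is a back-of-envelope sketch using the US$2-per-conversation figure from the call; the customer tiers and conversation volumes below are hypothetical.

```python
# Back-of-envelope revenue maths for per-conversation agent pricing.
# The US$2 figure is from the call; the tiers and volumes are hypothetical.
PRICE_PER_CONVERSATION = 2.00  # USD per conversation, as stated by management

monthly_conversations = {
    "small customer": 10_000,
    "mid-market customer": 250_000,
    "large enterprise": 5_000_000,
}

for name, volume in monthly_conversations.items():
    revenue = volume * PRICE_PER_CONVERSATION
    print(f"{name}: {volume:,} conversations/month -> ${revenue:,.0f}/month")
```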

Veeva Systems (NYSE: VEEV)

Veeva’s management is seeing the company’s customers appreciate the patience it has displayed in adopting AI; customers started using Veeva’s Vault Direct Data API for AI use cases in 2024 Q2; Vault Direct Data API provides data access 100 times faster than traditional APIs; management thinks that an advantage of providing API access for AI use cases is the possibility of partners developing use cases that management could not even foresee; customers have to pay a fee to turn on Vault Direct Data API and the fee covers Veeva’s compute costs; there are no heavy implementation tasks needed for Vault Direct Data API

When groundbreaking technology like GenAI is first released, it takes time for things to settle and become clearer. That’s starting to happen now. Customers have appreciated our taking the long view on AI and our orientation to tangible value rather than hype. In Q2, our first early customers started using the Vault Direct Data API to power AI and other use cases. The ability to retrieve data 100 times faster than traditional APIs is a major software platform innovation and will be a big enabler of AI that uses data from Vault applications…

… When you make an API like the Direct Data API, you don’t know the innovation you’re unleashing. And that’s the whole point because the data can be consumed so fast and transactionally accurately, use cases that weren’t practical before can become practical. I mean if I step back way back when to designing the first salesforce.com API, I knew it was going to unleash a lot of innovation, and you just don’t know. It’s not predictable, and that’s the good thing…

…[Question] Looking at Vault Direct Data API, how seamless is it for customers to turn it on and start using it? Is it something that needs an implementation? 

[Answer] That is something that’s purchased by the customer, so that is something that is not free for the customers to use. They purchase it. The fee is not that large. It covers our compute cost, that type of thing… 

…After that, no, there’s no implementation. You turn it on, and it’s on. And that’s that.

Veeva’s AI Partner Program is progressing well and has seen roughly 30 AI use cases being developed by more than 10 partners across Veeva Development Cloud and Veeva Commercial Cloud; the AI use cases in Veeva Commercial Cloud are mostly related to data science while the use cases in Veeva Development Cloud are mostly related to generation of documents and reports; management does not want to compete with the partners that are in the AI Partner Program

Our AI Partner Program is also progressing well. We now have more than 10 AI partners supporting roughly 30 use cases across R&D and Commercial. We also continue to explore additional AI application opportunities beyond our current AI solutions…

… [Question] You talked about some of the early traction you’re seeing with the AI Partner Program. Can you maybe talk about what are some of the use cases you’ve seen so far?

[Answer] The types of use cases in commercial often have to do with data science. So things like next best action, dynamic targeting, pre-call planning, things like that. And then in R&D, they can be more things like document generation, generate a clinical study report or doing specific medical coding, things like that. So those are the type of use cases…

…In terms of us monitoring that and informing our own road map, I guess there may be some of that. But mostly, that type of innovation really comes from internally our own thinking with our customers. We don’t want to really disrupt our partners, especially when the partners are having customer success. If there’s a major use case that we’re very clear that customers need and for some reason, the ecosystem is not delivering customer success, yes, maybe we might step in there. But I would guess that what we would do would be more holistic, I guess, in some sense and not specifically something a partner would tackle because we’re generally going to have more resources and more ability to sway our own road map than a partner would, and we want to be respectful to the ecosystem.

Zoom Video Communications (NASDAQ: ZM)

Zoom’s management is seeing customers seeking out the AI capabilities of Zoom’s Contact Center packages; Zoom’s management saw the ASP (average selling price) for its Contact Center product double sequentially because of the product’s AI-tier, which comes with higher pricing

We are seeing increased adoption of our advanced Contact Center packages, as customers seek to utilize our AI capabilities to enhance agent performance…

…If you remember, we started with one pricing tier. We eventually added two more, and the AI agent that Eric was speaking about earlier is in the highest tier. We actually saw our ASPs for Contact Center almost double quarter-over-quarter because it’s such a premium feature. And when I look at the Q2 deals, the majority of them were purchases in one of the top 2 tiers, so all of that is contributing to what I would say is not only expansion in terms of seat count but expansion in terms of value being derived from the product.

Zoom AI Companion uses generative AI to produce meeting summaries, live translations, image generation and more; Zoom AI Companion is now enabled on over 1.2 million accounts; management will be upgrading AI Companion as Zoom transitions into the 2.0 phase of AI-enabled work; customers really like Zoom AI Companion; Zoom AI Companion is provided at no additional cost; in Zoom meetings, the No.1 use case of Zoom AI Companion is to create meeting summaries; management is constantly improving the quality of Zoom AI Companion’s meeting summaries; customers are giving positive feedback on Zoom AI Companion

Today, AI Companion enhances an employee’s capabilities using generative AI to boost productivity through features like meeting summary, chat compose, image generation, live translation and enhanced features in Contact Center. As these features have grown in popularity, we are happy to share that Zoom AI Companion is now enabled on over 1.2 million accounts…

…Our progress broadening Zoom Workplace, building out enhanced AI tools for Contact Center and amassing a large base of AI users sets us up well to transition into the 2.0 phase of AI-enabled work. In this phase, Zoom AI Companion will move beyond enhancing skills to simplifying your workday, providing contextual insights, and performing tasks on your behalf. It will do this by operating across our collaboration platform to ensure your day is interconnected and productive…

…Our customers really like Zoom AI Companion. First of all, it works so well. Secondly, it comes at no additional cost, not like some other vendors who charge the customer a lot. In our case, this is a part of our package…

…You take a meeting, for example, right? For sure, the #1 use case is meeting summary, right? And we keep improving that quality; the [indiscernible] and the meeting summaries are getting better and better. Like in July, we had another upgrade, quality-wise even better than previous deliveries, right?…

…[Question] One question I had is, when you’re looking at Zoom AI Companion, we’ve heard a lot of great things in the field, with customers comparing it favourably to other products that are offered out there. Can you remind us how you think about tracking the product’s success internally, given that you don’t charge for it directly, beyond having millions of people using it?

[Answer] The metric that we’ve been talking about here is account activation. So we’re looking at how many, not individual users, but actual customer accounts have activated it… And they also share stories about how Zoom AI Companion’s very accurate summaries and action items are helping their employees’ productivity. And yes, there’s a lot of very positive feedback about adopting Zoom AI Companion.

Zoom’s management intends to monetise AI services for the Contact Center product, but not for Zoom Workplace

[Question] Now that you’re seeing more adoption, Kelly, of Zoom AI Companion, how do you think about the cost of providing these generative AI features and capabilities? And do you think Zoom could eventually charge on a usage basis for power users? Generally just trying to weigh cost versus revenue opportunities here.

[Answer] I mean, when we launched AI Companion, we did not want to charge the customer, but that’s for Workplace. For the business services like Contact Center, all those new offerings, I think for sure we are going to monetize. As I mentioned in previous earnings calls, for new solutions and the business services’ AI, I think we are going to charge; they have AI Companion as well, right? But for Workplace, our core UC offering and collaboration offering, we do not want to charge. And I really appreciate our AI team’s great effort, right, their focus on the quality, focus on the cost reduction and so on and so forth.

AI services are a drag on Zoom’s margins at the moment (as the company is providing a lot of AI services for free now) but management sees them as important investments for growth

[Question] Just on gross margins: what is the impact of generative AI, and what can you do to alleviate some of that?

[Answer] I mean, we’re guiding to 79% for this year, which reflects the prioritization of AI but also the very strong discipline that we continue to apply. And we are holding to our long-term target for gross margins of 80%. But of course, we think at this point in time it’s very important to prioritize these investments as they really set us up for future growth.

Zoom’s dev ops team is saving costs for the company to make room for more AI investments

I also want to give credit to our dev ops team. On the one hand, for sure, we are going to buy more and more GPUs, right, and also leverage that. On the other hand, our team tries to save money from other areas, fully automating things, and so on and so forth, right? So that’s another way for us to save cost, right, to make some room for AI.

The regulatory environment for AI in the US and Europe has so far had very little impact on Zoom’s business because Zoom’s management has been adamant and clear that it is not going to use customers’ data to train its AI models

[Question] Are you seeing anything in the broad sweep of AI regulation in the U.S. or Europe that you think can dampen innovation?

[Answer] That’s the reason why, when we launched AI Companion, we already mentioned we are not going to use any of our customer data to train our AI models, right? We take customer data very, very seriously, right? And our customers know that; they trust our brand and trust what we’re doing. And so far, I do not see any impact in terms of regulation. And again, AI is moving rapidly, right? In EMEA especially, we all look at the potential regulation. But so far, the actual impact to us, to our business, I think is extremely limited. Take meeting summary: it’s a very important feature, customers like it, and we do not use our customer data to train our AI model, so why not keep using the feature? I think there’s no impact so far.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have a vested interest in Adobe, MongoDB, Salesforce, Veeva Systems, and Zoom Video Communications. Holdings are subject to change at any time.

Does News Move The Stock Market?

If we’re constantly looking for news to explain short-term stock price movements, how often can we be right?

A great book I started reading recently is Making Sense of Chaos by economist J. Doyne Farmer. In the book, Farmer discusses his ideas for understanding economies through the lens of complexity science, which is the study of complex adaptive systems. The book referenced an interesting academic finance paper published in 1988 titled What Moves Stock Prices. The paper, authored by David Cutler, James Poterba, and Larry Summers, investigated the influence of news on stock prices.

Farmer described their work as such:

“Cutler, Poterba and Summers began by finding the 100 largest daily fluctuations in the S&P 500 index between 1946 and 1987. They then looked at the New York Times on the day after each move and recorded a summary of the paper’s explanation for the price change. The authors made a subjective judgement as to whether these explanations could plausibly be considered ‘real news’ – or at least real enough to have triggered a sizable change in stock price.”
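As a rough illustration, the screening step Farmer describes can be reproduced in a few lines. The sketch below assumes a hypothetical sp500.csv file of daily closing prices; the authors' actual dataset covered 1946 to 1987.

```python
# A sketch of the Cutler-Poterba-Summers screen: rank trading days by the
# size of the index's daily move. Assumes a hypothetical sp500.csv with
# 'date' and 'close' columns; the paper used S&P 500 data from 1946-1987.
import pandas as pd

prices = pd.read_csv("sp500.csv", parse_dates=["date"]).set_index("date")
returns = prices["close"].pct_change().dropna()

# The 100 largest daily moves by absolute size, as the authors compiled
# before checking the next day's New York Times for an explanation.
largest_moves = returns.loc[returns.abs().nlargest(100).index]
print(largest_moves.head(12).to_string())
```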

The largest daily move in the paper’s dataset occurred on 19 October 1987 – now famously known as Black Monday – when the S&P 500 fell by 20.5%. Interestingly, there was no substantial news to explain the collapse. Farmer mentioned in his book:

“The explanations for the 20 per cent drop on October 19, 1987, were ‘worry over dollar decline and trade deficit’ and ‘fear of US not supporting dollar’. Cutler, Poterba and Summers didn’t classify this as news, and I agree. ‘Worry’ and ‘fear’ are subjective statements about the emotional state of the market that have no specific reference to external events.”

Farmer went on to mention:

“Of the dozen largest price fluctuations [shown below], only four were attributed to real news events, a ratio that they found also roughly applied to the largest 100 moves.”

In other words, as I have suspected to be the case for as long as I have been investing, stock prices are indeed more often than not driven by factors outside of the news. I find this to be an important trait of the stock market to know because if we’re constantly looking for news to explain short-term stock price movements, we’re likely to be wrong often, and this can impair our investment decision-making process.

The twelve largest daily price fluctuations in Cutler, Poterba and Summers’ dataset for What Moves Stock Prices:

  1. Date: 19 October 1987
    • Daily change: -20.5%
    • Explanation given: Worry over dollar decline and trade deficit; Fear of US not supporting dollar
  2. Date: 21 October 1987
    • Daily change: 9.1%
    • Explanation given: Interest rates continue to fall; deficit talks in Washington; bargain hunting
  3. Date: 26 October 1987
    • Daily change: -8.3%
    • Explanation given: Fear of budget deficits; margin calls; reaction to falling foreign stocks
  4. Date: 3 September 1946
    • Daily change: -6.7%
    • Explanation given: “… no basic reason for the assault on prices.”
  5. Date: 28 May 1962
    • Daily change: -6.7%
    • Explanation given: Kennedy forces rollback of steel price hike
  6. Date: 26 September 1955
    • Daily change: -6.6%
    • Explanation given: Eisenhower suffers heart attack
  7. Date: 26 June 1950
    • Daily change: -5.4%
    • Explanation given: Outbreak of Korean War
  8. Date: 20 October 1987
    • Daily change: 5.3%
    • Explanation given: Investors looking for “quality stocks”
  9. Date: 9 September 1946
    • Daily change: -5.2%
    • Explanation given: Labor unrest in maritime and trucking industries
  10. Date: 16 October 1987
    • Daily change: -5.2%
    • Explanation given: Fear of trade deficit; fear of higher interest rates; tension with Iran
  11. Date: 27 May 1970
    • Daily change: 5.0%
    • Explanation given: Rumours of change in economic policy; “… the stock surge happened for no fundamental reason”
  12. Date: 11 September 1986
    • Daily change: -4.8%
    • Explanation given: Foreign governments refuse to lower interest rates; crackdown on triple witching announced

Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have no vested interest in any companies mentioned. Holdings are subject to change at any time.

Stocks and Interest Rate Cuts

How has the US stock market historically performed when the Federal Reserve had cut interest rates?

A topic I’ve noticed buzzing among financial market participants lately is what would happen to the US stock market if and when the Federal Reserve, the US’s central bank, cuts interest rates later this year.

There is a high likelihood of a rate cut coming, although there is more uncertainty around the timing and the extent of any cut. In a speech last week, the central bank’s chair, Jerome Powell, said (emphases are mine):

“The time has come for policy to adjust. The direction of travel is clear, and the timing and pace of rate cuts will depend on incoming data, the evolving outlook, and the balance of risks.”

I have no crystal ball, but I do have historical context. Josh Brown, CEO of Ritholtz Wealth Management, a US-based investment firm, recently shared fantastic data on how US stocks have performed in the past when the Federal Reserve lowered rates. His data, in the form of a chart, goes back to 1957, and I reproduced it in tabular format in Table 1; it shows how US stocks did in the 12 months following a rate cut, as well as whether a recession occurred in the same window:

Table 1; Source: Josh Brown

I also split the data in Table 1 according to whether a recession had occurred shortly after a rate cut, since eight of the 21 past rate-cut cycles from the Federal Reserve since 1957 took place without an impending recession. Table 2 shows the same data as Table 1 but for rate cuts with a recession; Table 3 is for rate cuts without a recession.

Table 2; Source: Josh Brown
Table 3; Source: Josh Brown
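For anyone who wants to replicate this kind of split on their own data, here is a minimal sketch. The rate-cut dates, recession flags, and the sp500.csv price file are hypothetical placeholders, not Josh Brown's actual dataset.

```python
# A sketch of the forward-return split behind Tables 2 and 3. The cut dates,
# recession flags, and sp500.csv price file are hypothetical placeholders.
import pandas as pd

prices = pd.read_csv("sp500.csv", parse_dates=["date"]).set_index("date")["close"]

rate_cuts = pd.DataFrame({
    "cut_date": pd.to_datetime(["1995-07-06", "2001-01-03", "2007-09-18"]),
    "recession_followed": [False, True, True],
})

def forward_return(start, months=12):
    """Price return from the first trading day on/after `start` to ~`months` later."""
    window = prices.loc[start : start + pd.DateOffset(months=months)]
    return window.iloc[-1] / window.iloc[0] - 1

rate_cuts["fwd_12m"] = rate_cuts["cut_date"].apply(forward_return)

# Average 12-month forward return, split by whether a recession followed
print(rate_cuts.groupby("recession_followed")["fwd_12m"].mean())
```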

With all the data found in Tables 1, 2, and 3, here are my takeaways:

  • US stocks have historically done well, on average, in the 12 months following a rate cut. The overall record, seen in Table 1, is an average 12-month forward return of 9%. When a recession happened shortly after a rate cut, the average 12-month forward return is 8%; when a recession did not happen shortly after a rate cut, the average 12-month forward return is 12%.
  • Drawdowns – the maximum peak-to-trough decline in stocks over a given time period – have occurred nearly all the time following a rate cut (see the sketch after this list). This is not surprising. It’s a feature of the stock market that you would often have to endure a sharp shorter-term fall in stock prices in order to earn a positive longer-term return.
  • A recession is not necessarily bad for stocks. As Table 2 shows, US stocks have historically delivered an average return of 8% over the next 12 months after rate cuts that came with impending recessions.
  • It’s not a guarantee that stocks will produce good returns in the 12 months after a rate cut even if a recession does not occur, as can be seen from the August 1976 episode in Table 3.
  • My most important takeaway is that a rate cut is not guaranteed to be a good or bad event for stocks. One-factor analysis in the financial markets – “if A happens, then B will occur” – should be largely avoided because clear-cut relationships are rarely seen.
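Below is a minimal sketch of the drawdown measure described above, computed on a made-up price series.

```python
# A sketch of the drawdown measure: the maximum peak-to-trough decline of a
# price series over a window. The price series below is made up.
import pandas as pd

prices = pd.Series([100, 104, 98, 91, 95, 103, 110, 107, 115], dtype=float)

running_peak = prices.cummax()         # highest price seen so far
drawdowns = prices / running_peak - 1  # decline from that running peak
max_drawdown = drawdowns.min()

print(f"Max drawdown: {max_drawdown:.1%}")  # -12.5%, the slide from 104 to 91
```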

It’s worth bearing in mind that it’s not a certainty that the Federal Reserve will be cutting rates in the near future. Anything can happen in the financial markets. And even if a rate cut does happen, no one knows for sure how the US stock market would perform. History is not a perfect indicator of the future and the best it can do is to give us context for the upcoming possibilities. 


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have no vested interest in any companies mentioned. Holdings are subject to change at any time.