All articles

Company Notes Series (#2): BayCurrent Consulting

Editor’s note: We’re testing out a new series for the blog, the “Company Notes Series”, where we periodically share our notes on companies we’ve studied in the recent past but currently have no vested interest in (we may invest in or sell shares in the companies mentioned at any time). The notes are raw and not updated, and the “as of” date for the data is given at the start of the notes. The first edition in the series can be found here. Please give us your thoughts on the series through the “Contact Us” page; your feedback will determine if we continue with it. Thanks in advance!


Start of notes for BayCurrent Consulting

Data as of 31 May 2023

Background

  • Founded in March 1998 as PC Works Co. Ltd for the purpose of consulting, system integration and outsourcing related to management, operations and IT. In December 2006, PC Works Co. Ltd changed its name to BayCurrent Consulting Co. Ltd. In April 2014, Byron Holdings Co. Ltd was set up. In June 2014, Byron Holdings Co. Ltd acquired BayCurrent Consulting Co. Ltd and then the combined entity changed its name to BayCurrent Consulting Co. Ltd.
  • HQ: Tokyo, Japan
  • Listed in September 2016 on the Tokyo Stock Exchange’s Mothers section; moved to the First Section of the Tokyo Stock Exchange in December 2018; moved to the Prime Market of the Tokyo Stock Exchange in April 2022
  • Ticker: TSE: 6532
  • Total number of consultants as of FY2023 (financial year ended February 2023) is 2,961; total number of employees as of FY2023 is 3,310

Business

  • BayCurrent is a consulting firm that supports a wide range of themes such as strategy, digital, and operations for Japan’s leading companies in various industries. BayCurrent provides planning and execution support to clients, such as company-wide strategy planning and business strategy planning to support decision-making by management, and support for examining business operations using digital technology.
  • Examples of the projects that BayCurrent is currently working on under Digital Consulting:
    • Finance, Cashless payment, and Design: Building a UX improvement process to continue achieving high customer satisfaction over the long term
    • Pharmaceutical manufacturing, Digital technologies, and Market research: Formulating plans to enter the Japanese market of advanced digital medical equipment business for foreign companies 
    • Telecommunication, Metaverse, and Business planning: Developing plans to use metaverse and examine the use of AI toward the smart city concept
    • Automobiles, AI, and Business creation: Building a model for business using AI and supporting its implementation, aiming to reduce the risk of traffic accidents
  • Examples of the projects that BayCurrent is currently working on under Sustainability Consulting:
    • Energy, ESG, and Support for practice: Forming a scheme and supporting negotiations for realizing offshore wind power business
    • Finance and Carbon neutrality: Considering policies in response to TCFD (Task Force on Climate-Related Financial Disclosures) in anticipation of sales of solutions in the future
    • High-tech, EV, and Business planning: Considering business domains and creating a road map for popularizing EVs (electric vehicles) to reduce CO2 emissions
    • Manufacturing, ESG, and Supply chain management: Considering the possibility of commercializing supplier ESG assessments and risk management
  • See also Figures 1, 2, 3, and 4 for examples of BayCurrent’s projects
Figure 1
Figure 2
Figure 3
Figure 4
  • BayCurrent groups its customers into three industry categories: Finance (banking, securities, insurance etc); Telecommunications/Media/High-Tech; and Others (energy, entertainment, government offices, food etc). In FY2023, 25% of revenue was from Finance, 35% was from Telecommunications/Media/High-Tech, and 40% from Others. In FY2019, the split was 40% from Finance, 30% from Telecommunications/Media/High-Tech, and 30% from Others. BayCurrent’s largest customer in FY2023 was Pfizer Japan, accounting for 12.0% of revenue; no other customer appears to have accounted for more than 10% of revenue during the year.
  • In FY2023, revenue from customers based in Japan was at least 90% of BayCurrent’s total revenue.

Market opportunity

  • According to IDC Japan’s “Forecast for domestic business consulting market: 2021-2025” (announced on 1 July 2021), the Japanese consulting market is expected to have a CAGR of 7.8% from around ¥900 billion in 2020 to more than ¥1.2 trillion in 2025; within the consulting market is the digital consulting sub-segment which is expected to have a CAGR of 30.1% from more than ¥100 billion in 2020 to around ¥500 billion in 2025. BayCurrent ended FY2023 with revenue of just ¥76.1 billion.
  • According to BayCurrent: “In today’s business environment, the challenges faced by corporate managers are becoming more diverse and complex due to intensifying market competition and changes in market structure. There is a growing need for consultants with a high level of expertise. Furthermore, with the further development of digital technology in the future, the need for the utilization of new technologies in business is expected to increase year by year, and the consulting market is expected to continue to grow at a high rate.”
  • In Japan, there’s an initiative called DX (Digital Transformation) that the Japanese government began promoting heavily in 2018 with the publication of the “DX [Digital Transformation]” report by the Ministry of Economy, Trade, and Industry (METI) during the year. METI warned that Japan would face an economic loss of ¥12 trillion per year by 2025 if traditional mainframes and backbone core systems were not updated and the shortage of ICT engineers was not addressed. Moreover, in 2016, 20% of companies had been operating their core systems for 21 years or more, and 40% for 11 to 20 years; if this situation persisted, then by 2025 the percentage of companies operating core systems for 21 years or more would reach 60%. Japanese companies appear to have heeded the government’s DX call. Surveys conducted by METI and Fujitsu in 2020 indicated that almost half of SMEs were actively promoting DX companywide, while large companies with more than 5,000 employees showed an adoption rate close to 80%. These are tailwinds for BayCurrent Consulting.
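As a quick sanity check, the IDC market-size projections above can be reproduced by simple compounding. A minimal sketch (the base figures below are approximations of the quoted starting values; the ¥135 billion base for digital consulting is our assumption, chosen to be consistent with “more than ¥100 billion”):

```python
# Sanity-check IDC Japan's market-size projections by forward compounding.
# Base figures are approximate (billions of yen); the digital base of 135
# is an assumption consistent with "more than ¥100 billion" in 2020.
def compound(base, cagr, years):
    """Grow `base` at `cagr` (as a decimal) for `years` years."""
    return base * (1 + cagr) ** years

overall_2025 = compound(900, 0.078, 5)   # overall consulting market, 2020 -> 2025
digital_2025 = compound(135, 0.301, 5)   # digital consulting sub-segment

print(f"Overall market 2025: ~¥{overall_2025:.0f}bn")   # exceeds ¥1.2 trillion, as projected
print(f"Digital consulting 2025: ~¥{digital_2025:.0f}bn")  # around ¥500 billion
```

Both projected endpoints fall out of the stated CAGRs, so the IDC figures are internally consistent.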

Growth strategy

  • BayCurrent is focused on further increasing the added value of its consulting services; the recruitment and training of human resources; and providing an attractive work environment. 
  • BayCurrent’s support services for corporate managers in all industries are knowledge-intensive, so management believes that improvements in its consultants’ ability to make proposals and solve problems will drive the company’s growth. For this reason, management strives to recruit excellent people with varied backgrounds and focuses on creating an environment and terms of employment that let each consultant work with peace of mind. Management has established a wide variety of training programs and study sessions to improve its consultants’ skills in strategic planning and solving management issues. Management believes that BayCurrent is able to formulate viable strategies that meet clients’ needs precisely because the company’s consultants are professionals who have worked on numerous projects across industries and service areas; for this reason, management strives not to limit its consultants to specific fields. Figure 5 shows the establishment of the BayCurrent Institute, a business management research institute.
  • Management also distributes knowledge obtained through dialogue with university professors working on research subjects and with members of the management teams of leading companies, in order to gain public visibility. Recent examples of such work:
    • Participated in FIN/SUM, one of Japan’s largest fintech conferences, co-hosted by the Financial Services Agency and Nikkei Inc. Joji Noritake, BayCurrent’s Managing Executive Officer and CDO, delivered a standalone lecture on “Sustainable customer experience connects emotional memories” and took part in a panel discussion on “Possibility of future individual investment through digital technology”
    • Participation in Green CPS Consortium, an organization aimed at building eco-friendly industry and society by controlling material loss, energy loss, and other aspects in all economic activities while driving economic growth
    • Made a donation to the VR/metaverse in the corporate sponsored practical research program of the University of Tokyo Virtual Reality Educational Research Center. The research program conducts basic research on the creation and operation of metaverse space and conducts demonstration experiments to develop practical applications of the metaverse in society. 
  • Growth of number of consultants vs growth of revenue (note the higher revenue growth vs consultant growth):
Table 1
Figure 5

Financials

  • Financials from FY2016 to FY2023 (financials in ¥; earliest data we could find was for FY2016): 
Table 2
  • Solid CAGRs in revenue:
    • FY2016-FY2023: 25.1%
    • FY2018-FY2023: 30.1%
    • FY2023: 32.4% growth
  • Profitable since at least FY2016. Net income CAGRs and average net income margins:
    • FY2016-FY2023: 52.3% CAGR, 15.3% average margin
    • FY2018-FY2023: 60.3% CAGR, 18.1% average margin
    • FY2023: 43.3% growth, 27.6% margin
  • Positive operating cash flow since at least FY2016. Operating cash flow CAGRs and average operating cash flow margins:
    • FY2016-FY2023: 34.0% CAGR, 19.4% average margin
    • FY2018-FY2023: 45.0% CAGR, 21.6% average margin
    • FY2023: 35.5% growth, 27.2% margin
  • Free cash flow positive since at least FY2016. Free cash flow CAGRs and average free cash flow margins:
    • FY2016-FY2023: 34.0% CAGR, 19.1% average margin
    • FY2018-FY2023: 45.8% CAGR, 21.3% average margin
    • FY2023: 33.6% growth, 26.7% margin
  • Balance sheet was initially in a net-debt position and has been in a net-cash position from FY2020 onwards; high net-cash position of ¥33 billion in FY2023
  • Minimal dilution as weighted average diluted share count increased by only 0.8% per year for FY2016-FY2023, and -0.3% in FY2023
  • Management aims for a total shareholder return ratio (dividends and share buybacks) of around 40% of earnings; dividend payout ratio is typically 20%-30% under IFRS. In FY2023, interim dividend of ¥14 per share (adjusting for 1-for-10 stock split in November 2022) and final dividend of ¥23 per share, for a total dividend of ¥37 per share for FY2023, representing a payout ratio of 27%.
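For reference, the CAGR figures above come from the standard compound-growth formula. A minimal sketch (the FY2016 revenue figure below is back-solved from the stated 25.1% CAGR and FY2023 revenue of ¥76.1 billion, so it is an approximation rather than a reported number):

```python
def cagr(begin, end, years):
    """Compound annual growth rate between two values."""
    return (end / begin) ** (1 / years) - 1

# FY2023 revenue was ¥76.1bn; the ~¥15.9bn FY2016 figure is back-solved
# from the stated 25.1% CAGR, not taken from BayCurrent's filings.
revenue_cagr = cagr(15.9, 76.1, 7)
print(f"FY2016-FY2023 revenue CAGR: {revenue_cagr:.1%}")  # ~25.1%
```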

Management

  • Yoshiyuki Abe, 57, is President and CEO. Joined the original BayCurrent Consulting Co. Ltd in September 2008 and became an executive director in November of the same year. He became President in December 2016 after major turmoil at BayCurrent in H2 2016:
    • The company failed to win deals to match its idle consultants, and so suffered a sharply lower utilisation rate
    • It also faced employee defections following talk of withdrawing from engagements, which cost it credibility with clients it had supported for many years
    • Earnings forecasts were revised downward on 9 December 2016; on the same day, the then-President left office
  • Kentaro Ikehira, 46, is Executive Vice President. Became Vice President in May 2021. Joined the original BayCurrent Consulting Co. Ltd in September 2007.
  • Kosuke Nakamura, 41, is CFO. Became CFO in May 2021. Joined the original BayCurrent Consulting Co. Ltd in January 2007.
  • Management has a long history of significantly beating their own mid-term growth projections. Examples:
    • In FY2018 earnings presentation, a projection for FY2019-FY2021 was given where revenue was expected to have a CAGR of 15%-20%, ending at ¥32-35 billion. Actual FY2021 revenue was ¥42.8 billion. 
    • In FY2022 earnings presentation, a projection for FY2022-FY2026 was given where revenue was expected to have a CAGR of 20% to end at ¥100 billion and EBITDA was expected to end at ¥30 billion. Projection given for FY2024 was for revenue of ¥94.6 billion and EBITDA of ¥36 billion – so the FY2026 medium-term projection could be achieved or beaten as early as FY2024
  • Management has set a target of FY2029 revenue of ¥250 billion, which represents a 20% CAGR from FY2024’s projected revenue of ¥94.6 billion.

Compensation of Management

  • Yoshiyuki Abe’s total FY2023 compensation was ¥333 million, consisting of ¥40 million of fixed pay, ¥192 million of performance-linked remuneration, and ¥101 million of restricted stock compensation. Total compensation in FY2023 was just 1.6% of both FY2023 net income and FY2023 free cash flow
  • Yoshiyuki Abe’s total FY2022 compensation was ¥297 million, FY2021 compensation was ¥206 million, and FY2020 compensation was ¥137 million.
  • Comparison of Yoshiyuki Abe’s compensation growth vs BayCurrent’s revenue/net income/FCF growth over past few years:
Table 3

Valuation (as of 31 May 2023)

  • 31 May 2023 share price of ¥5,110
  • Trailing revenue per share is ¥496.47, hence PS is 10.3
  • Trailing diluted EPS is ¥137.19, hence PE is 37.2
  • Trailing FCF per share is ¥132.71, hence PFCF is 38.5
  • Reminder that management’s FY2029 revenue target implies a 20% CAGR from FY2024 – the valuation does not look too rich if BayCurrent is able to grow as projected
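The multiples above can be reproduced directly from the per-share figures:

```python
# Trailing valuation multiples as of 31 May 2023, computed from the
# per-share figures quoted above.
price = 5110.0          # 31 May 2023 share price, yen
revenue_ps = 496.47     # trailing revenue per share
diluted_eps = 137.19    # trailing diluted EPS
fcf_ps = 132.71         # trailing free cash flow per share

ps = price / revenue_ps
pe = price / diluted_eps
pfcf = price / fcf_ps

print(f"PS: {ps:.1f}, PE: {pe:.1f}, PFCF: {pfcf:.1f}")  # PS: 10.3, PE: 37.2, PFCF: 38.5
```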

Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have no vested interest in any company mentioned. Holdings are subject to change at any time.

What We’re Reading (Week Ending 29 September 2024)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since the readership of The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general.

Here are the articles for the week ending 29 September 2024:

1. Digging Into The Coal Industry – Matt Franz

There are two main types of coal: thermal and metallurgical (“met”) or coking coal.

Met coal has more carbon and less ash, moisture, and sulfur than thermal coal. It is rarer and commands a higher price.

Met coal is a key ingredient in steel. To make steel, met coal is first turned into coke by heating it to 1,000ºC in the absence of oxygen. Without oxygen, the coal does not burn. The coal swells and then releases its gaseous volatile matter. Nearly pure crystalline carbon is all that remains. Coke is to coal what charcoal is to wood. Caking is a coal’s ability to be converted into coke. Thermal coal has no caking ability, which is why it is much cheaper.

Coke is mixed with iron ore, flux (e.g. limestone), and hot air in a blast furnace to create iron. Iron is put into a basic oxygen furnace where oxygen reduces the metal’s carbon content. It is further refined to remove impurities and alloys are added to make steel.

The quality of met coal influences the quality of the coke, iron, and steel produced. A blast furnace fed with higher quality coke will require less of it, lowering production costs. Steel makers have an economic incentive to pay more for higher quality met coal.

It takes 0.8-1.05 tons of met coal to produce one ton of steel. (1.3-1.5 tons of met coal make one ton of coke. 0.6-0.7 tons of coke make one ton of steel.) That’s a lot! 70% of the world’s steel is made this way…
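The conversion arithmetic in the excerpt checks out: multiplying the two quoted ranges gives the met-coal-per-ton-of-steel range. A quick sketch:

```python
# Conversion ranges quoted above (tons of input per ton of output).
coal_per_ton_coke = (1.3, 1.5)   # met coal -> coke
coke_per_ton_steel = (0.6, 0.7)  # coke -> steel (blast furnace route)

# Multiply the ranges to get met coal needed per ton of steel.
coal_per_ton_steel = (coal_per_ton_coke[0] * coke_per_ton_steel[0],
                      coal_per_ton_coke[1] * coke_per_ton_steel[1])

print(round(coal_per_ton_steel[0], 2), round(coal_per_ton_steel[1], 2))  # 0.78 1.05
```

That matches the quoted 0.8-1.05 tons of met coal per ton of steel (with the low end rounded up).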

… The major met coal exporting nations are the US, Canada, and Australia. The major importers are countries with large steel industries relative to their domestic met coal supplies – China, India, Japan, and South Korea.

The US usually exports ~70% of its met coal. Export contract prices are tied to international benchmark indices. Domestic contracts tend to specify a fixed price and a fixed volume for one year…

…Thermal coal has a lower calorific value (CV) and a lower cost than met coal. It is primarily used to generate electricity. It is also used to make cement. It takes 0.1-0.12 tons of coal to make one ton of cement.

Thermal coal has been in decline in Europe since the 1980s and in the US since the 2000s. It continues to grow in Asia. Worldwide, coal demand reached its highest level ever in 2022 and again in 2023…

…Today, coal remains the world’s largest energy source for electricity generation. Coal may be losing share as an energy source, but it continues to grow in absolute tons.

China is the world’s largest producer and consumer of coal. In the early 2000s coal produced 75-80% of its electricity. Today it’s more like 60-65%. Coal lost share but grew in absolute terms. Chinese electricity production rose eightfold and its coal consumption rose sevenfold…

…Coal’s peak share of U.S. energy occurred around 2007-2008 at 50% of electricity. Today it’s 20-22%. That’s still a meaningful amount. The decline in US coal was driven as much by fracking and its byproduct, cheap natural gas, as environmental considerations. Should US natural gas get expensive, we could see a shift back towards coal…

…Coal’s share of energy production is falling slower than the increase in total energy demand. Jevons Paradox describes this situation. Demand for energy is elastic. As energy costs decrease, demand for energy increases even more. On balance, energy demand increases, even as energy consumption becomes more efficient…

In Energy and Civilization: A History (2017), Vaclav Smil explains that energy transitions often take more than a century. The transition from biomass (wood) to coal in Western Europe took 200 years. The transition from coal to oil began in the late 19th century, but oil didn’t overtake coal as America’s dominant energy source until the 1940s, approximately 50-60 years later. The transition is still far from complete, and that’s despite crude being more energy dense and easier to transport (via pipeline).

One of the factors affecting the speed of the transition is infrastructure. The transition from coal to oil in America was slowed by the need to replace steam engines with diesel engines. The modern analog is the cost and complexity of switching a power plant from coal to natural gas.

Price is another factor. If there’s a new fuel that is much cheaper than the legacy fuel, there’s an economic incentive to rebuild the infrastructure faster. A wide disparity between coal and natural gas prices that is expected to continue will drive more US coal plants to switch to gas. That’s less likely to happen in Asia, where gas is less plentiful and LNG infrastructure is more expensive.

Once again, this suggests that the last ton of coal will be very expensive, not very cheap. Thermal coal may be a sunset industry, but it is going to be a beautiful sunset.

2. Fed up with Fed Talk? Factchecking Central Banking Fairy Tales! – Aswath Damodaran

As I drove to the grocery store on Fed Cut Wednesday, I had the radio on, and in the news at the top of the hour, I was told that the Fed had just cut interest rates, and that consumers would soon see lower rates on their mortgages and businesses on their loans. That delusion is not restricted to newscasters, since it seems to be widely held among politicians, economists and even market watchers. The truth, though, is that the Fed sets only one interest rate, the Fed Funds rate, and that none of the rates that we face in our lives, either as consumers (on mortgages, credit cards or fixed deposits) or businesses (business loans and bonds), are set by or even indexed to the Fed Funds Rate…

…While the Federal Open Market Committee controls the Fed Funds rate, there are a whole host of rates set by buyers and sellers in bond markets. These rates are dynamic and volatile, and you can see them play out in the movements of US treasury rates (with the 3-month and 10-year rates highlighted) and in corporate bond rates (with the Baa corporate bond rate shown).

There is a final set of rates, set by institutions, and sometimes indexed to market-set rates, and these are the rates that consumers are most likely to confront in their day-to-day lives. They include mortgage rates, set by lenders, credit card rates, specified by the credit card issuers, and fixed deposit rates on safety deposits at banks.  They are not as dynamic as market-set rates, but they change more often than the Fed Funds rate…

…To test whether changes in the Fed Funds rate are a precursor for shifts in market interest rates, I ran a simple (perhaps even simplistic) test. I looked at the 249 quarters that compose the 1962-2024 time period, breaking down each quarter into whether the effective Fed Funds rate increased, decreased or remained unchanged during the quarter. I followed up by looking at the change in the 3-month and 10-year US treasury rates in the following quarter:

Looking at the key distributional metrics (the first quartile, the median, the third quartile), it seems undeniable that the “Fed as leader” hypothesis falls apart. In fact, in the quarters after the Fed Funds rate increases, US treasury rates (short and long term) are more likely to decrease than increase, and the median change in rates is negative. In contrast, in the periods after the Fed Funds rate decreases, treasury rates are more likely to increase than decrease, and post small median increases…

…In the quarter after the Fed Funds rate increase, mortgage rates and fixed deposit rates are more likely to fall than rise, with the median change in the 15-year mortgage rate being -0.13% and the median change in the fixed deposit rate at -0.05%. In the quarter after the Fed Funds rate decreases, the mortgage rate does drop, but by less than it did during the Fed rate raising quarters. In short, those of us expecting our mortgage rates to decline in the next few months, just because the Fed lowered rates on Wednesday, are being set up for disappointment…

…How else can you explain why interest rates remained low for the last decades, other than the Fed? The answer is recognizing that market-set rates ultimately are composed of two elements: an expected inflation rate and an expected real interest rate, reflecting real economic growth…

…Interest rates were low in the last decade primarily because inflation stayed low (the lowest inflation decade in a century) and real growth was anemic. Interest rates rose in 2022, because inflation made a comeback, and the Fed scrambled to catch up to markets; and most interestingly, interest rates are down this year, because inflation is down and real growth has dropped…

…The Fed’s major signaling device remains the changes in the Fed Funds rate, and it is worth pondering what the signal the Fed is sending when it raises or lowers the Fed Funds rate. On the inflation front, an increase or decrease in the Fed Funds rate can be viewed as a signal that the Fed sees inflationary pressures picking up, with an increase, or declining, with a decrease. On the economic growth front, an increase or decrease in the Fed Funds rate, can be viewed as a signal that the Fed sees the economy growing too fast, with an increase, or slowing down too much, with a decrease…

…Viewed through this mix, you can see that there are two contrary reads of the Fed Funds rate cut of 50 basis points on Wednesday. If you are an optimist, you could take the action to mean that the Fed is finally convinced that inflation has been vanquished, and that lower inflation is here to stay. If you are a pessimist, the fact that it was a fifty basis point decrease, rather than the expected twenty-five basis points, can be construed as a sign that the Fed is seeing more worrying signs of an economic slowdown than have shown up in the public data on employment and growth…

…If you remove the Fed’s role in crises, and focus on the effects of just its actions on the Fed Funds rate, the effect of the Fed on the equity market becomes murkier…

…The S&P 500 did slightly better in quarters after the Fed Funds rate decreased than when the rate increased, but reserved its best performance for quarters after those where there was no change in the Fed Funds rate. At the risk of disagreeing with much of conventional wisdom, is it possible that the less activity there is on the part of the Fed, the better stocks do? I think so, and stock markets will be better served with fewer interviews and speeches from members of the FOMC and less political grandstanding (from senators, congresspeople and presidential candidates) on what the Federal Reserve should or should not do…

… The truth is that the Fed is acting in response to changes in markets rather than driving those actions, and it is thus more follower than leader. That said, there is the very real possibility that the Fed may start to believe its own hype, and that hubristic central bankers may decide that they set rates and drive stock markets, rather than the other way around…

…I believe that it is time for us to put the Fed delusion to rest. It has distracted us from talking about things that truly matter, which include growing government debt, inflation, growth and how globalization may be feeding into risk, and allowed us to believe that central bankers have the power to rescue us from whatever mistakes we may be making.

3. ‘There Are Real Issues in China Now,’ Ray Dalio Says (Transcript here) – Bloomberg Television

I think that there are real issues in China now, and they changed, really in the last four years, and that is that they need a restructuring. Individuals, 70% of their money was in real estate. Real estate has gone down. Stocks have gone down. Salaries have gone down. And so, as a result, they’re not spending and they’re concerned and they’re holding money in cash…

…At the same time, you have the government sector is a problem because most of the government spending – 83% of government spending – is spent by local governments. Those local governments got their money by selling land for real estate. Okay, there are no land sales and they borrowed a lot of money…

…It’s a situation that’s more challenging than Japan in 1990. It needs a restructuring in order to be able to do that. And then there’s also the question: Is the property ownership, is it respected? And Deng Xiaoping during his period said, “It’s glorious to be rich.” Is it still glorious to be rich?…

…Yes, there’s fantastic innovation in terms of technology, there’s nothing like it other than in the United States. Europe certainly isn’t a competitor in that. However, it’s very much government-directed. Can there still be entrepreneurship and that inventiveness? These are the big cosmic questions…

…I see investing in China as largely a very attractively-priced place that now has a lot of questions regarding the issues that I just referred to…

…There’s a small percentage of our portfolio which is in China, and we’ll stay in China, you know, through this process.

4. OpenAI’s New Model, How o1 Works, Scaling Inference – Ben Thompson

There are two important things happening: first, o1 is explicitly trained on how to solve problems, and second, o1 is designed to generate multiple problem-solving streams at inference time, choose the best one, and iterate through each step in the process when it realizes it made a mistake…

…There has been a lot of talk about the importance of scale in terms of LLM performance; for auto-regressive LLMs that has meant training scale. The more parameters you have, the larger the infrastructure you need, but the payoff is greater accuracy because the model is incorporating that much more information. That certainly still applies to o1, as the chart on the left indicates.

It’s the chart on the right that is the bigger deal: o1 gets more accurate the more time it spends on compute at inference time. This makes sense intuitively given what I laid out above: the more time spent on compute the more time o1 can spend spinning up multiple chains-of-thought, checking its answers, and iterating through different approaches and solutions.

It’s also a big departure from how we have thought about LLMs to date: one of the “benefits” of auto-regressive LLMs is that you’re only generating one answer in a serial manner. Yes, you can get that answer faster with beefier hardware, but that is another way of saying that the pay-off from more inference compute is getting the answer faster; the accuracy of the answer is a function of the underlying model, not the amount of compute brought to bear. Another way to think about it is that the more important question for inference is how much memory is available; the more memory there is, the larger the model, and therefore, the greater amount of accuracy.

In this, o1 represents a new inference paradigm: yes, you need memory to load the model, but given the same model, answer quality does improve with more compute.

5. Learning from Richard Lawrence of Overlook – 14.3% CAGR for 30 years – Eugene Ng

The birth of Overlook’s Cap on Subscriptions originated when Richard Lawrence had lunch in 1992 in New York with Crosby Smith, a representative of the Dillon Family. Richard was asked why he would not just raise capital to generate fees like other investment managers. Crosby proposed that if Richard limited initial subscriptions into the fund at $30 million, the Dillon Family would invest $1 million. They shook hands, and Overlook had its first investor.

The Overlook Cap on Subscriptions was born in that spontaneous moment. Richard Lawrence considers the Cap on Subscriptions to have been the single most significant business decision in Overlook’s 30-year history. In the early 1990s, Overlook decided to cap new subscriptions at 8-9% growth per year. This policy enabled the company to grow its AUM steadily…

…The Cap on Subscriptions smooths fund inflows, lowering the cyclical volatility of AUM. One can then limit inflows of funds at the top of the market and have a ready queue of investors waiting to jump in during market declines, making an investment fund more anti-fragile, especially during market selloffs.

In addition, the Cap incentivizes investors to make a long-term commitment, which is aligned with a long-term investment horizon. Investors usually have to wait 6–12 months to gain access, so there are no short-term gains for investors trying to time the markets. The Cap effectively self-selects patient long-term investors.

What is the downside of having such a Cap? As you can imagine, such a Cap on Subscriptions is not for AUM/asset gatherers. The fund size grows much slower and takes longer to scale, and investment managers collect much lower fees…

…Time-weighted return (TWR): The TWR calculates the compound growth of a portfolio’s Net Asset Value on a per-share basis over a specified period of time. Fund managers most often disclose this number.

Capital-weighted return (CWR): CWR calculates the Internal Rate of Return (IRR) for an individual investor’s return and the return collectively earned by all investors in the fund. The CWR accounts for all cash flows into and out of the investor’s specific account and the fund since inception. Most fund managers do not report CWR, and CWRs typically underperform TWRs for most funds.

The Discount (the “Discount”) and the Premium: The Discount is the difference between the TWR and the CWR for a specific fund; a Discount occurs when the CWR falls short of the TWR. Peter Lynch was producing world-leading returns when he ran Magellan (high TWR), but the fund’s underlying investors performed far worse (low CWR).
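The gap between the two measures is easiest to see with numbers. Below is a toy illustration (the fund returns, cash flows, and investor behaviour are all hypothetical, not Overlook’s or Magellan’s actual figures) of how performance-chasing produces a Discount:

```python
# Hypothetical example: a fund's time-weighted return (TWR) can be positive
# while its capital-weighted return (CWR) is negative, producing a "Discount".

def twr(period_returns):
    """Annualized time-weighted return: compound the per-period NAV returns."""
    growth = 1.0
    for r in period_returns:
        growth *= 1 + r
    return growth ** (1 / len(period_returns)) - 1

def irr(cashflows, lo=-0.99, hi=10.0, tol=1e-9):
    """Internal rate of return via bisection; cashflows[t] occurs at year t."""
    def npv(rate):
        return sum(cf / (1 + rate) ** t for t, cf in enumerate(cashflows))
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid) > 0:   # NPV still positive: the root lies at a higher rate
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# The fund returns +50% in year 1, then -20% in year 2.
fund_returns = [0.50, -0.20]

# An investor puts in $100 at inception, then chases performance with $900
# more after the strong first year, and holds to the end of year 2.
ending_value = 100 * 1.5 * 0.8 + 900 * 0.8   # = 840
cashflows = [-100, -900, ending_value]        # cash flows at years 0, 1, 2

print(f"TWR: {twr(fund_returns):+.1%}")   # about +9.5% a year
print(f"CWR: {irr(cashflows):+.1%}")      # about -14.7% a year
```

The fund manager can truthfully report a positive TWR, yet the dollars actually invested lost money because most of them arrived just before the bad year.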

Discounts typically happen for two reasons:

First, a fund manager can generate exceptional results as measured by TWRs at the fund’s inception when assets under management (AUM) are small. Then, the manager gets “discovered” and/or “promoted,” and an explosion of money enters the fund, to the great delight of the fund manager. However, with the larger asset base, the now-famous fund manager performs poorly, dragging down his TWR while crushing his CWR.

Second, CWRs are hurt when investments are poorly timed. Investors chase funds promoting hot themes, then bail out when markets turn down. This behavior inevitably decreases their CWRs. But even buying smartly and selling poorly, or buying poorly and selling smartly, can result in a Discount.

On average, the Discount increases when some of the following conditions prevail:

  1. Funds experience fast growth of AUM: the Discount tends to increase as the absolute value of a fund increases.
  2. Funds are invested in trendy asset classes.
  3. Funds are exposed to excessive valuation risk.
  4. Funds have excessive exposure to fund-of-funds’ investors. 
  5. Funds are operated in higher volatility sectors…

…At first, Richard thought the elimination of Overlook’s Discount was due to the success of its Investment Philosophy or the luck of its investors in timing their investments. He eventually realized it is overwhelmingly due to the legal Cap on Subscriptions.

Overlook’s Investment Philosophy has helped the firm outperform its benchmark on a TWR basis, but that does not explain the elimination of the Discount. Investor luck in timing is not a factor either, as Overlook’s investors have added funds consistently over time.

Instead, the answer lies exclusively in the legal Cap on Subscriptions because the Cap has allowed a limited amount of funds to enter Overlook steadily over the past 30 years. Control over the growth of AUM is the key to eliminating the Discount. The legal Cap on Subscriptions is the hero of Overlook’s story.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have no vested interest in any company mentioned. Holdings are subject to change at any time.

The Federal Reserve Has Much Less Power Over Financial Markets Than You Think 

It makes sense to mostly ignore the Federal Reserve’s actions when assessing opportunities in the stock market.

Last week, the Federal Reserve, the USA’s central bank, opted to lower the federal funds rate (the key interest rate controlled by it) by 50 basis points, or 0.5%. The move, both before and after it was announced, was heavily scrutinised by market participants. There’s a widely held belief that the Federal Reserve wields tremendous influence over nearly all aspects of financial market activity in the USA.

But Aswath Damodaran, the famed finance professor from New York University, made an interesting observation in a recent blog post: the Federal Reserve actually has nowhere near the level of influence over America’s financial markets that many market participants think it does.

In his post, Damodaran looked at the 249 calendar quarters from 1962 to 2024, classified them according to how the federal funds rate changed, and compared the changes to how various metrics in the US financial markets moved. There were 96 quarters in the period where the federal funds rate was raised, 132 quarters where it was cut, and 21 quarters where it was unchanged. Some examples of what he found:

  • A median change of -0.01% in the 10-year Treasury rate was seen in the following quarter after the 96 quarters where the federal funds rate increased, whereas a median change of 0.07% was seen in the following quarter after the 132 quarters where the federal funds rate was lowered. Put another way, the 10-year Treasury rate has historically tended to (1) decrease when the federal funds rate increased, and (2) increase when the federal funds rate decreased. This means that the Federal Reserve has very little control over longer-term interest rates. 
  • A median change of -0.13% in the 15-year mortgage rate was seen in the following quarter after the quarters where the federal funds rate increased, whereas a median change of -0.06% was seen in the following quarter after the quarters where the federal funds rate was lowered. It turns out that the Federal Reserve also exerts little control over the types of interest rates that consumers directly interact with on a frequent basis.
  • A median change of 2.85% in US stocks was seen in the following quarter after the quarters where the federal funds rate increased, a median change of 3.07% was seen in the following quarter after the quarters where the federal funds rate was lowered, and a median change of 5.52% was seen in the following quarter after the quarters where the federal funds rate was unchanged. When discussing the stock-market related data, Damodaran provided a provocative question and answer: 
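Damodaran’s method of classifying quarters and comparing medians can be sketched in a few lines. The quarterly figures below are made up purely for illustration; his actual dataset covered 249 quarters from 1962 to 2024:

```python
# Sketch of the method: classify each quarter by the change in the fed funds
# rate, then take the median change in the 10-year Treasury rate in the
# following quarter for each group. All data points here are hypothetical.
from statistics import median

# (change in fed funds rate this quarter, change in 10-year rate next quarter)
quarters = [
    (+0.50, -0.05), (+0.25, +0.02), (+0.75, -0.10),   # rate hikes
    (-0.50, +0.10), (-0.25, +0.04), (-0.75, +0.07),   # rate cuts
    (0.00, +0.01),  (0.00, -0.02),                    # unchanged
]

groups = {"raised": [], "cut": [], "unchanged": []}
for ffr_change, next_10y_change in quarters:
    if ffr_change > 0:
        groups["raised"].append(next_10y_change)
    elif ffr_change < 0:
        groups["cut"].append(next_10y_change)
    else:
        groups["unchanged"].append(next_10y_change)

for label, changes in groups.items():
    print(f"{label:>9}: median next-quarter 10-year change {median(changes):+.2f}%")
```

The same grouping can be repeated against mortgage rates or stock returns, which is exactly the comparison the bullet points above summarise.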

“At the risk of disagreeing with much of conventional wisdom, is it possible that the less activity there is on the part of the Fed, the better stocks do? I think so, and stock markets will be better served with fewer interviews and speeches from members of the FOMC and less political grandstanding (from senators, congresspeople and presidential candidates) on what the Federal Reserve should or should not do.”

I have always paid scant attention to what the Federal Reserve is doing when making my investing decisions. My view, born from observations of financial market history* and a desire to build a lasting investment strategy, is that business fundamentals trump macro-economics. Damodaran’s data lends further support to my stance of mostly ignoring the Federal Reserve’s actions when I assess opportunities in the stock market.

*A great example can be found in Berkshire Hathaway, Warren Buffett’s investment conglomerate. Berkshire produced an 18.7% annual growth rate in its book value per share from 1965 to 2018, which drove a 20.5% annual increase in its stock price. Throughout those 53 years, Berkshire endured numerous macro worries, such as the Vietnam War, the Black Monday stock market crash, the “breaking” of the Bank of England, the Asian Financial Crisis, the bursting of the Dotcom Bubble, the Great Financial Crisis, Brexit, and the US-China trade war. Damodaran’s aforementioned blog post also showed that the federal funds rate moved from around 5% in the mid-1960s to more than 20% in the early-1980s and then to around 2.5% in 2018. And yet, an 18.7% input (Berkshire’s book value per share growth) still resulted in a 20.5% output (Berkshire’s stock price growth).
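The power of those compounding rates is easy to verify with a quick calculation. The CAGRs and 53-year span come from the paragraph above; the ending multiples are my own arithmetic:

```python
# Back-of-envelope compounding check on the Berkshire figures cited above:
# 18.7% a year (book value per share) and 20.5% a year (stock price),
# each sustained for the 53 years from 1965 to 2018.

def cumulative_multiple(cagr, years):
    """Total growth multiple from compounding `cagr` for `years` years."""
    return (1 + cagr) ** years

book_value_multiple = cumulative_multiple(0.187, 53)   # roughly 8,800x
stock_price_multiple = cumulative_multiple(0.205, 53)  # roughly 19,600x

print(f"Book value per share grew about {book_value_multiple:,.0f}x")
print(f"Stock price grew about {stock_price_multiple:,.0f}x")
```

Note how a gap of less than two percentage points in annual growth more than doubles the ending multiple over 53 years; compounding rewards the input far more than any single macro episode along the way detracts from it.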


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have no vested interest in any company mentioned. Holdings are subject to change at any time.

What We’re Reading (Week Ending 22 September 2024)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general.

Here are the articles for the week ending 22 September 2024:

1. Mario Draghi outlines his plan to make Europe more competitive – Mario Draghi

Across different measures, a wide gap in GDP has opened up between the European Union and America. Europe’s households have paid the price in forgone living standards. On a per-person basis, real disposable income has grown almost twice as much in America as in the EU since 2000…

…Europe largely missed out on the digital revolution led by the internet and the productivity gains it brought: in fact, the productivity gap between the EU and America since 2000 is largely explained by the tech sector. The EU remains weak in the emerging technologies that will drive future growth. European companies specialise in mature technologies where the potential for breakthroughs is limited.

The problem is not that Europe lacks ideas or ambition. But innovation is blocked at the next stage: it is not translated into commercialisation, and innovative firms that want to scale up are hindered by inconsistent and restrictive regulations. Many European entrepreneurs prefer to seek financing from American venture capitalists and scale up in the American market…

…EU companies face electricity prices that are two to three times those in America. Natural-gas prices are four to five times higher. Over time, decarbonisation will help shift power generation towards secure, low-cost clean-energy sources. But fossil fuels will still set the energy price for most of the time for at least the remainder of this decade. Unless Europe better transfers the benefits of clean energy to end-users, energy prices will continue to dampen growth…

…As the era of geopolitical stability fades, the risk of rising insecurity becoming a threat to growth and freedom is increasing. Europe is particularly exposed. The EU relies on a handful of suppliers for critical raw materials and is heavily dependent on imports of digital technology.

2. When Chasing More Dividends Leaves You With Less – Jason Zweig

In July and August, as investors became more convinced interest rates will fall, exchange-traded funds specializing in dividend-paying stocks took in $4.5 billion in new money, estimates Ryan Issakainen, a strategist at First Trust, an ETF manager in Wheaton, Ill.

Although funds with big payouts sound safe, high income can lead to a poor outcome. You need to guard against needless tax bills, overexposure to narrow segments of the market and the chance of deep long-term losses…

…To see the potential downside of these funds, though, consider Global X SuperDividend, an ETF with $784 million in assets.

It yields nearly 11%.

That’s huge compared to the income returns of roughly 1.3% on the S&P 500, 2.1% on the Dow Jones Industrial Average and 5% on short-term U.S. Treasurys.

The SuperDividend fund’s supersized yield comes at a cost. The fund launched in June 2011 at $75; this week the shares traded around $22. That’s a 70% decline.

If you’d bought the ETF at its inception and held continuously through the end of August, you’d have lost 9%—after accounting for all those jumbo dividends along the way…
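It may seem odd that a 70% price decline translates into only a 9% total loss. A back-of-envelope reconciliation (my own rough model, assuming a constant reinvested yield and ignoring taxes and dividend timing) shows that reinvested dividends account for the difference:

```python
# Rough reconciliation of a ~70% price decline with a -9% total return:
# with dividends reinvested at a constant yield y, the share count grows by
# (1 + y) each year, so total return = (1 + y)^years * price return factor.
# All simplifications here are my own, not from the article.

start_price, end_price = 75.0, 22.0
years = 13.2  # June 2011 launch to late August 2024

price_return_factor = end_price / start_price   # about 0.29, i.e. a 70%+ drop
total_return_factor = 1 - 0.09                  # the reported -9% total return

required_yield = (total_return_factor / price_return_factor) ** (1 / years) - 1

print(f"Price alone: {price_return_factor - 1:+.0%}")
print(f"Implied average reinvested yield: {required_yield:.1%} a year")
```

The implied average yield of roughly 9% a year sits plausibly below the fund’s current ~11% yield, since the yield has varied over the fund’s life. The jumbo dividends really were paid out; they just barely kept pace with the erosion of the underlying share price.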

… A company that pays a steady stream of growing dividends is probably in robust financial health, but one that pays gigantic dividends is probably struggling and may be desperate to attract investors. Put a bunch of those into an ETF, and you get lots of income but even more risk…

…High-dividend funds often hold many more energy and financial stocks than broader portfolios do. That can raise risk.

In 2008, both First Trust’s Dow Jones Global Select Dividend and its Stoxx European Select Dividend had roughly 50% of their assets in financial stocks—right before the global financial crisis struck.

Over the 12 months ended March 31, 2009, as the MSCI World index lost 42.2% and European stocks overall sank 49.6%, First Trust’s Global Select fell 53.2% and European Select lost 63.9%—even after factoring in their dividends…

…Although a moderate dividend can be a sign of robust corporate health, a huge dividend can be a distress signal. A dividend four or five times greater than that of the overall market isn’t a green light; it’s a red flag.

3. Learning From Peter Keefe – John Garrett

The investment philosophy [at our new fund, Rockbridge Capital] is exactly the same: great businesses, great managers, bargain price. That remains unchanged.

The implementation has evolved over time. Great businesses, great managers, great price—it’s kind of like mom and apple pie. I mean, who’s opposed to it? It’s axiomatic that these things work, but I believe your approach to implementation should change over time…

…You don’t really know what makes a business great. You don’t really understand what contributes to compounding. You want a business with all the great characteristics—growth, rapid growth, sustainable growth—but you don’t know how to evaluate one business against another. You don’t know which businesses are mayflies and which are incredibly durable with multi-decade runways.

Learning how to discern and implement those three criteria does evolve over time. Another thing that evolves is the recognition that there are only a tiny number of businesses you will own over the course of a career that will compound and give you that 100-bagger effect or the 300-bagger effect—what Munger called the Lollapalooza effect. Those opportunities are incredibly rare.

But you spend your entire career looking for them. On day one, when you enter the business, you might think, ‘Well, maybe I’ll find it today,’ but you’re probably not going to find it today, tomorrow, or the day after. So what has evolved for me is the realization that when you find a compounder, don’t let it go…

…Every time I’ve trimmed a position and it involved a great business, it wound up being a huge mistake.

Now, we had this conundrum recently. We own a lot of Microsoft, which we bought back in the Ballmer days. So it’s been in the portfolio over 10 years. We’ve made 10 times our money in the business, and it has appreciated to become a very significant percentage of our portfolios.

Microsoft got a big bid recently because of the artificial intelligence stuff, and I don’t know enough about artificial intelligence to have a responsible opinion. But you can argue that there’s a trillion dollars’ worth of value in Microsoft attributable to AI. Do I trim the position? Well, based on the mistakes I’ve made in the past, no. But at the same time, is a 35 or 40 multiple sustainable for a company that’s already worth three trillion dollars? It’s hard to make that argument. And particularly when you’re managing both taxable and tax-exempt capital, you can make a pretty good argument that you should trim it. But again, that’s never worked out for me. So we are where we are…

…Every time we’ve had a business that’s compounded more than 10x—and we’ve had a couple that have compounded at 100x—there’s always been a leader and visionary who is a person of humility, thinking about their business in multi-decade timelines. Without exception, 10, 100, 200-baggers were always a person…

… They’re not thinking about an exit or the next thing; they’re thinking in 10, 20, 30-year time periods.

These people are artists. They’re focused on building something of great value—not just to accumulate wealth, but to create something valuable to society. To borrow from Tom Gayner, these are businesses that do something for people instead of to people. They are financially interested, but the finances are a means of keeping score rather than acquiring more things or a better jet. Those are the people I shy away from. The real artists see beauty in what they’re building and are focused on creating value for all stakeholders, especially the owners of the business.

When discussing people who want to serve all stakeholders, it’s not about rank-ordering which stakeholders to reward first. It’s about understanding that a business can do well for its employees, shareholders, and vendors. Munger talked about this all the time…

…People ask, ‘What makes you different?’ Well, it’s not my process. Everybody wants great businesses and great managers and to buy them at a bargain price. Nobody says they’re not a value investor or that they don’t like what Buffett does. So I think a major differentiator in this business is temperament. If I have an advantage, it’s that I don’t feel like I’m coming unglued when the world is coming unglued. I don’t know why that is; it’s just part of my makeup, but it’s an advantage because low prices are good for investors…

…The biggest compounder I’ve ever had in the investment business was American Tower. I was fortunate enough to figure out American Tower before it was even a public company. It was a footnote in the 10-K of a company called American Radio Systems. American Radio Systems was run by a brilliant, thoughtful capital allocator who fits into this liberal arts bucket I talked about earlier. Steve Dodge went to Yale and was an English major there.

Steve did cable transactions for one of the big New York banks. He got the idea that recurring revenue businesses or contractual revenue were great. So he moved into the cable business and then into the radio business. Around the time of the Telecom Act in the mid-1990s, digital cellular networks were beginning to roll out. Steve had people come to him and say, ‘We’d like to hang some of these digital antennas on your radio antennas.’ They also owned a portfolio of television broadcast antennas. They needed structures in suitable locations for these antennas.

That’s the genesis of American Tower, which was just a footnote. I remember calling Steve and asking about it. He basically hung up on me. I had a good relationship with him, so I knew I was onto something.

Long story short, American Tower was spun off and went to over $40 a share. Then came the dot-com bust. There had been a land rush in the tower business, and many companies had gotten levered up.

This was when I learned one of my early lessons about leverage, although it eventually helped me. American Tower dropped to under 80 cents a share from $44. Now that’s a drawdown.

I went up to Boston, where American Tower was headquartered. Chuck Akre was with me, and we met with Steve. He said, ‘I’ll tell you anything that I can legally tell you. I want you to know upfront that I don’t have much time. I have a business that needs my attention. It needs more attention than I can possibly give it because there’s only 24 hours in a day. I think that we can save this thing and I’m not sure that we can, but I also want to tell you, I am solely responsible. This is the worst thing that’s happened to me in my business career, but you’re looking at the guy who made the mistakes that got us in the pickle that we’re in.’

There were none of the usual excuses like ‘The dog ate my homework,’ or blaming the pandemic or the dot-com bust. Steve gave us none of that.

Steve figuratively raised his hand and said, ‘I messed it up, and I am sorry. I will do my best to get you and all the other shareholders out of this pickle.’

That kind of character in a moment of great crisis inspired me and others to make American Tower a more significant position, despite its distress.

We were convinced that the business wasn’t going to zero. It had one of the greatest business models in public companies’ history. A business where 100% of incremental revenue flows through to free cash flow and was growing by 20 to 30% a year. It was highly likely the business would be recapitalized. I can’t think of a financing environment where it wouldn’t be.

Steve’s character and willingness to accept responsibility were crucial in our decision to increase our position. It went up 300-fold from there.

4. Light-Based Chips Could Help Slake AI’s Ever-Growing Thirst for Energy – Amos Zeeberg

Recent results suggest that, for certain computational tasks fundamental to modern artificial intelligence, light-based “optical computers” may offer an advantage…

…In theory, light provides tantalizing potential benefits. For one, optical signals can carry more information than electrical ones—they have more bandwidth. Optical frequencies are also much higher than electrical ones, so optical systems can run more computing steps in less time and with less latency.

And then there’s the efficiency problem. In addition to the environmental and economic costs of relatively wasteful electronic chips, they also run so hot that only a tiny fraction of the transistors—the tiny switches at the heart of all computers—can be active at any moment. Optical computers could, in theory, run with more operations taking place simultaneously, churning through more data while using less energy…

…Seeing the potential advantages, researchers have long tried to use light for AI, a field with heavy computational needs. In the 1980s and 1990s, for instance, researchers used optical systems to build some of the earliest neural networks. Demetri Psaltis and two colleagues at the California Institute of Technology created a clever facial recognition system using one of these early optical neural networks (ONNs). They stored images of a subject—one of the researchers, in fact—as holograms in a photorefractive crystal. The researchers used the holograms to train an ONN, which could then recognize new images of the researcher and distinguish him from his colleagues.

But light also has shortcomings. Crucially, photons generally don’t interact with each other, so it’s hard for one input signal to control another signal, which is the essence of what ordinary transistors do. Transistors also work exceptionally well. They’re now laid down on coin-size chips by the billion, the products of decades of incremental improvements…

…The process of multiplying matrices, or arrays of numbers, undergirds a lot of heavy-duty computing. In neural networks, specifically, matrix multiplication is a fundamental step both in how networks are trained on old data and in how new data is processed in trained networks. And light just might be a better medium for matrix multiplication than electricity.
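For readers unfamiliar with the operation, a neural-network layer boils down to multiplying a weight matrix by an input vector. Here is a minimal sketch in plain Python of the computation the optical chips aim to accelerate:

```python
# A neural-network layer is, at its core, one matrix-vector multiplication:
# each output is a weighted sum of the inputs. Training and inference both
# reduce to enormous numbers of these multiply-accumulate operations, which
# is why a faster, cheaper medium for them matters so much.

def matmul(weights, x):
    """Multiply a weight matrix by an input vector."""
    return [sum(w * xi for w, xi in zip(row, x)) for row in weights]

# A tiny "layer": 2 outputs computed from 3 inputs.
W = [[0.5, -1.0, 2.0],
     [1.5,  0.0, -0.5]]
x = [1.0, 2.0, 3.0]

print(matmul(W, x))  # [4.5, 0.0]
```

In the optical schemes described in the article, the inputs are encoded into light beams and the weights into phase shifts, so the weighted sums emerge physically at the photodetectors rather than being computed transistor by transistor.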

This approach to AI computation exploded in 2017, when a group led by Dirk Englund and Marin Soljačić of the Massachusetts Institute of Technology described how to make an optical neural network built on a silicon chip. The researchers encoded the various quantities they wanted to multiply into beams of light, then sent the beams through a series of components that altered the beam’s phase—the way its light waves oscillated—with each phase alteration representing a multiplication step. By repeatedly splitting the beams, changing their phase, and recombining them, they could make the light effectively carry out matrix multiplication. At the end of the chip, the researchers placed photo detectors that measured the light beams and revealed the result.

The researchers taught their experimental device to recognize spoken vowels, a common benchmark task for neural networks…

…Since that 2017 paper, the field has seen steady improvement, as various researchers have come up with new kinds of optical computers. Englund and several collaborators recently unveiled a new optical network they call HITOP, which combines multiple advances. Most importantly, it aims to scale up the computation throughput with time, space, and wavelength. Zaijun Chen, a former MIT postdoc now based at the University of Southern California, said this helps HITOP overcome one of the drawbacks of optical neural networks: It takes significant energy to transfer data from electronic components into optical ones, and vice versa. But by packing the information into three dimensions of light, Chen said, it shoves more data through the ONN faster and spreads the energy cost over many calculations. This drives down the cost per calculation. The researchers reported that HITOP could run machine-learning models 25,000 times larger than previous chip-based ONNs.

To be clear, the system is still far from matching its electronic predecessors; HITOP performs about 1 trillion operations per second, whereas sophisticated Nvidia chips can chug through 300 times as much data, said Chen, who hopes to scale up the technology to make it more competitive. But the optical chip’s efficiency is compelling. “The game here is that we lowered the energy cost 1,000 times,” Chen said…

…While optical computing has advanced quickly over the past several years, it’s still far from displacing the electronic chips that run neural networks outside of labs. Papers announce photonic systems that work better than electronic ones, but they generally run small models using old network designs and small workloads. And many of the reported figures about photonic supremacy don’t tell the whole story, said Bhavin Shastri of Queen’s University in Ontario. “It’s very hard to do an apples-to-apples comparison with electronics,” he said. “For instance, when they use lasers, they don’t really talk about the energy to power the lasers.”

Lab systems need to be scaled up before they can show competitive advantages. “How big do you have to make it to get a win?” McMahon asked. The answer: exceptionally big. That’s why no one can match a chip made by Nvidia, whose chips power many of the most advanced AI systems today. There is a huge list of engineering puzzles to figure out along the way—issues that the electronics side has solved over decades. “Electronics is starting with a big advantage,” said McMahon.

Some researchers think ONN-based AI systems will first find success in specialized applications where they provide unique advantages. Shastri said one promising use is in counteracting interference between different wireless transmissions, such as 5G cellular towers and the radar altimeters that help planes navigate. Early this year, Shastri and several colleagues created an ONN that can sort out different transmissions and pick out a signal of interest in real time and with a processing delay of under 15 picoseconds (15 trillionths of a second)—less than one-thousandth of the time an electronic system would take, while using less than 1/70 of the power.

5. Warren Buffett Case Study: Arbitrage – Dirtcheapstocks

By 1981, Arcata was the second largest printing services organization in the U.S. In addition, Arcata owned 77,500 acres of Northern California timberlands, which it used for timber harvesting, reforestation and milling.

Arcata was to be acquired by KKR. The stock was trading around $33/share at the time of the deal announcement. KKR’s $37 offer represented a reasonable premium over the current share price. But there was one other interesting bit of information.

“In 1978 the U.S. Government had taken title to 10,700 acres of Arcata timber, primarily old-growth redwood, to expand Redwood National Park. The government had paid $97.9 million, in several installments, for this acreage, a sum Arcata was contesting as grossly inadequate. The parties also disputed the interest rate that should apply to the period between the taking of the property and final payment for it. The enabling legislation stipulated 6% simple interest; Arcata argued for a much higher and compounded rate.” – Warren Buffett

“Buying a company with a highly speculated, large-sized claim in litigation creates a negotiating problem, whether the claim is on behalf of or against the company. To solve this problem, KKR offered $37.00 per Arcata share plus two-thirds of any additional amounts paid by the government for the redwood lands.” – Warren Buffett…

…“We started buying Arcata stock, then around $33.50, on September 30 and in eight weeks purchased about 400,000 shares, or 5% of the company. The initial announcement said that the $37.00 would be paid in January 1982. Therefore, if everything had gone perfectly, we would have achieved an annual rate of return of about 40% — not counting the redwood claim, which would have been frosting.” – Warren Buffett
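Buffett’s “about 40%” figure checks out with simple annualization. The roughly 3.5-month holding period, from end-September 1981 to a mid-January 1982 payout, is my assumption based on the dates in the quote:

```python
# Annualizing the Arcata arbitrage return cited above. The buy price and
# payout come from Buffett's letter; the 3.5-month holding period is an
# assumption inferred from the dates he mentions.

buy_price, payout = 33.50, 37.00
holding_years = 3.5 / 12

total_return = payout / buy_price - 1                       # about 10.4%
annualized = (1 + total_return) ** (1 / holding_years) - 1  # about 41%

print(f"Total return: {total_return:.1%}")
print(f"Annualized:   {annualized:.0%}")
```

A 10% gross spread compounds to roughly 40% annualized over a few months, which is why merger arbitrage can be attractive even when the headline spread looks modest, provided the deal closes on schedule.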

“All did not go perfectly. In December it was announced that the closing would be delayed a bit. Nevertheless, a definitive agreement was signed on January 4. Encouraged, we raised our stake, buying at around $38.00 per share and increasing our holdings to 655,000 shares, or over 7% of the company. Our willingness to pay up – even though the closing had been postponed – reflected our leaning toward ‘a whole lot’ rather than ‘zero’ for the redwoods.” – Warren Buffett…

…“On March 12, KKR said its earlier deal wouldn’t work, first cutting its offer to $33.50, then two days later raising it to $35.00. On March 15, however, the directors turned this bid down and accepted another group’s offer of $37.50 plus one-half of any redwood recovery.” – Warren Buffett…

…“The trial judge appointed two commissions, one to look at the timber’s value, the other to consider the interest rate questions. In January 1987, the first commission said the redwoods were worth $275.7 million and the second commission recommended a compounded, blended rate of return working out to about 14%.” – Warren Buffett

“In August 1987 the judge upheld these conclusions, which meant a net amount of about $600 million would be due Arcata. The government then appealed. In 1988, though, before this appeal was heard, the claim was settled for $519 million. Consequently, we received an additional $29.48 per share, or about $19.3 million. We will get another $800,000 or so in 1989.” – Warren Buffett

The final result: 39% IRR…

…The greatest investor to ever live earns a 39% IRR in a low-risk arb deal. The most striking part of this case is not the return generated – but the lack of risk taken.

Arcata was a profitable, growing business. Take a look at its five-year history leading up to the deal.

Arcata had strong operating businesses that earned sufficient sums to cover its interest burden with plenty of comfort.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Microsoft. Holdings are subject to change at any time.

More Of The Latest Thoughts From American Technology Companies On AI (2024 Q2)

A collection of quotes on artificial intelligence, or AI, from the management teams of US-listed technology companies in the 2024 Q2 earnings season.

Last month, I published The Latest Thoughts From American Technology Companies On AI (2024 Q2). In it, I shared commentary in earnings conference calls for the second quarter of 2024, from the leaders of technology companies that I follow or have a vested interest in, on the topic of AI and how the technology could impact their industry and the business world writ large. 

A few more technology companies I’m watching hosted earnings conference calls for 2024’s second quarter after I prepared the article. The leaders of these companies also had insights on AI that I think would be useful to share. This is an ongoing series. For the older commentary:

Here they are, in no particular order:

Adobe (NASDAQ: ADBE)

Adobe’s management believes that Adobe’s approach to AI is highly differentiated; the greatest differentiation is at the interface layer, as Adobe is able to rapidly integrate AI across its product portfolio and allow users to realise value

Adobe’s customer-centric approach to AI is highly differentiated across data, models and interfaces…

…Our greatest differentiation comes at the interface layer with our ability to rapidly integrate AI across our industry-leading product portfolio, making it easy for customers of all sizes to adopt and realize value from AI. 

Adobe’s Firefly models are trained on data that allow outputs to be commercially safe and management thinks this feature of being commercially safe is really important to enterprises; Adobe now has Firefly models for imaging, vector, design, and video; Firefly is in a wide range of Adobe products; Firefly has powered more than 12 billion generations since its launch in March 2023 (was 9 billion in 2024 Q1); management’s strategy is to build Firefly models into more streamlined and precise workflows within Adobe’s products; Adobe has Firefly Service APIs for organisations to generate content at scale, and the API calls tripled quarter-on-quarter; Firefly Service APIs are gaining real traction

We train our Firefly models on data that allows us to offer customers a solution designed to be commercially safe. We have now released Firefly models for imaging, vector and design and just previewed a new Firefly video model…

… Firefly-powered features in Adobe Photoshop, Illustrator, Lightroom, and Premiere Pro help creators expand upon their natural creativity and accelerate productivity. Adobe Express is a quick and easy create-anything application, unlocking creative expression for millions of users. Acrobat AI Assistant helps extract greater value from PDF documents. Adobe Experience Platform AI Assistant empowers brands to automate workflows and generate new audiences and journeys. Adobe GenStudio brings together content and data, integrating high-velocity creative expression with the enterprise activation needed to deliver personalization at scale…

…We have now surpassed 12 billion Firefly-powered generations across Adobe tools…

… Our strategy is to build technology that will create more streamlined and precise workflows within our tools through features like text-to-template in Express, Generative Fill in Photoshop, Generative Recolor in Illustrator, Generative Remove in Lightroom and the upcoming Generative Extend for Video in Premiere Pro. We’re exposing the power of our creative tools and the magic of generative AI through Firefly Service APIs so organizations can generate and assemble content at scale…

…The introduction of the new Firefly video model earlier this week at IBC is another important milestone in our journey. Our video model, like the other models in the Firefly family, is built to be commercially safe with fine-grain control and application integration at its core. This will empower editors to realize their creative vision more productively in our video products, including Premiere Pro…

…Strong demand for Firefly Services, which provide APIs, tools and services for content generation, editing and assembly, empowering organizations to automate content production while maintaining quality and control. Total API calls tripled quarter over quarter…

…Firefly Services, which is you can think of that also as a consumption model where we have that, it’s off to a really good start. Our ability to give enterprises the ability to automate content, create custom models within enterprises, we’re seeing real traction because it’s a differentiated solution and that it’s designed to be commercially safe…

…One other thing I’d just emphasize there is that the commercial safety is so important to businesses of all sizes, frankly, and that is something that we feel very, very differentiated.

Adobe released significant advancements in AI Assistant across Adobe Acrobat and Reader in 2024 Q2 (FY2024 Q3) and saw 70% sequential growth in AI interactions in AI Assistant; the advancements in AI Assistant include content creation capabilities; Tata Consultancy Services used AI Assistant in Adobe Acrobat to create event summaries of hours of conference videos in minutes; management intends to actively promote subscription plans for Adobe Acrobat and Reader that include generative AI capabilities

For decades, PDF has been the de facto standard for storing unstructured data, resulting in the creation and sharing of trillions of PDFs. The introduction of AI Assistant across Adobe Acrobat and Reader has transformed the way people interact with and extract value from these documents. In Q3, we released significant advancements, including the ability to have conversations across multiple documents and support for different document formats, saving users valuable time and providing important insights. We are thrilled to see this value translate into AI Assistant usage with over 70% quarter-over-quarter growth in AI interactions. 

In addition to consumption, we’re focused on leveraging generative AI to expand content creation in Adobe Acrobat. We’ve integrated Adobe Firefly Image Generation into our edit PDF workflows. We’ve optimized AI Assistant in Acrobat to generate content fit for presentations, e-mails and other forms of communication, and we’re laying the groundwork for richer content creation, including the generation of Adobe Express projects.

The application of this technology across verticals and industries is virtually limitless. Tata Consultancy Services recently used Adobe Premiere Pro to transcribe hours of conference videos and then used AI Assistant in Acrobat to create digestible event summaries in minutes. This allowed them to distribute newsletters on session content to attendees in real-time.

We’re excited to leverage generative AI to add value to content creation and consumption in Acrobat and Reader in the months ahead. Given the early adoption of AI Assistant, we intend to actively promote subscription plans that include generative AI capabilities over legacy perpetual plans that do not.

Adobe GenStudio is integrated across Experience Cloud and Creative Cloud and helps marketers quickly plan, create, store, deliver, and measure marketing content; Vanguard used Adobe GenStudio to increase quality engagement with investors by 176% through one-to-one personalisation, and to enjoy millions in savings

Customers are embracing the opportunity to address their content supply chain challenges with Adobe GenStudio. With native integrations across Experience Cloud and Creative Cloud, GenStudio empowers marketers to quickly plan, create, store, deliver, and measure marketing content and drive greater efficiency in their organizations. Financial services leader Vanguard is creating an integrated content supply chain to serve the strategic goal of deepening their relationships with a broad range of investors. Leveraging the GenStudio solution, Vanguard was able to increase quality engagement by 176% by focusing on one-to-one personalization and to realize millions in savings by improving content velocity and resource allocation with an end-to-end content creation workflow.

Adobe’s management has been very consistent over the past 1-1.5 years in how they have approached AI, and that is, Adobe would be developing a broad set of models for the creative community, and the models would be highly differentiated based on quality, commercial safety, and integrability into Adobe’s product portfolio

I think we’ve been incredibly consistent with what we’ve said, dating back 1 year, 1.5 years ago, where we talked about the fact that we were going to develop the broadest set of models for the creative community. And we were going to differentiate the models based on quality, commercial safety, integrability into our tools and controllability. And as you’ve seen very methodically over the last 18 months, we continue to bring more and more of that innovation to life. And that fundamentally is working as we’ve now started to integrate it much more actively into our base. If you look at it with photography, we now have in our tool Generative Remove, we have AI-assisted edits in design, we have Generative Pattern, Generative Fill Shape. In Photoshop, we have Gen Remove. We also have Gen Fill, and I can continue on with all the generations, but we’ve also now started to integrate it in Firefly Services for what we’re enabling enterprises to be able to access and use in terms of batch work and through APIs.

Adobe’s management is seeing the accelerated use and consumption of generative AI credits in Adobe’s products play out the way they expected it to; total consumption credits are going up with the introduction of each new generative AI capability 

If you look at sort of how that’s played out, as we talked about, we’re seeing accelerated use and generative credits being consumed because of that deeper integration into all of our tools, and that is playing out as we expected…

…And we do see with every subsequent capability we integrate into the tool, total credits consumed going up. 

Adobe’s management is seeing Adobe enjoying indirect monetisation from the AI features of its products, such as (1) the products having more value and pricing, (2) users being retained better when they use generative AI features, and (3) higher conversion of users when they try out Adobe products

When you look at then how that converts to monetization, first and foremost, we’ve integrated a lot of that value into our core products with more value and more pricing. We’re also seeing that when people use these generative features, they retain better. We’re also seeing that when people come to Adobe to try our Creative Cloud applications or Express application, they’re able to convert better. And so there are all these ancillary implied benefits that we’re getting. 

For direct monetisation of the AI features in Adobe’s products, management is thinking of (1) instituting caps on generative AI credit consumption, (2) having AI plans with different AI capabilities; but direct monetisation is currently still not the key focus that management has, because they want to focus on proliferation and usage of generative AI across the user base

In terms of direct monetization, what we’ve said in the past is that the current model is around generative credits, which is I think where you’re going with this. And we do see with every subsequent capability we integrate into the tool, total credits consumed going up. Now what we are trying to do as we go forward, we haven’t started instituting the caps yet. And part of this is, as we’ve said all along, we want to really focus our attention on proliferation and usage across our base. We see a lot of users excited about it. It’s some of the most actively used features that we’ve ever released. And we want to avoid the generation anxiety that people feel. But we’re watching very closely as the economy of generative credits evolves, and we’re going to look at instituting those caps at some point when we feel the time is right and/or we’re also looking at other alternative models. What we did with Acrobat AI Assistant has proven to be very effective. And so we’re also considering other opportunities like having standard CC plans that have a core set of generative capabilities but also having premium API — sorry, premium AI plans that will include things more like video and other things.

Adobe’s management thinks Adobe’s generative AI video models are already pretty capable, but they are going to get better over time; management thinks that the real value of generative AI video models is not in their ability to create a video through a description the user gives, but in their ability to extend the video

I don’t know if you had a chance to see some of the videos we put out there integrated directly into Premiere, also text to video, images to video, more controllability. We have also the ability now to generate not just scenes with humans and dogs and organic animals, but all these like overlays and things that creative professionals actually want to work with. And so we’re very excited about the set of things that they can get out of the box to get going. And human faces and things will just continue to get better…

…I spend a couple of hours with our video team. They have just absolutely hit it out of the park. I mean, the work that they have done, which is leveraging the image models with video, and again, I think to David’s point, the integration with Premier, that’s where we’ve always said, it’s the integration of the model and the application that differentiates it. I think when other models first came out, people were like, “Wow, you can describe it.” That’s just such a small part of where the value is. And the real value is, you have a video, you want to extend it. It’s a game changer in terms of what we can do. So really excited about the stuff that we’re doing in video. 

MongoDB (NASDAQ: MDB)

MongoDB’s management sees AI as a longer-term opportunity for MongoDB; management is seeing companies largely still experimenting with AI applications currently; management thinks inference workloads will come, but monetisation of AI apps will take time

AI continues to be an additional long-term opportunity for our business. At the start of the fiscal year, we told you that we didn’t expect AI to be a meaningful tailwind for our business in fiscal year 2025, which has proven accurate. Based on recent peer commentary, it seems that the industry now mostly agrees with this view. Companies are currently focusing their spending on the infrastructure layer of AI and are still largely experimenting with AI applications. Inference workloads will come and should benefit MongoDB greatly in the long run, but we are still very early, and the monetization of AI apps will take time. AI demand is a question of when, not if.

MongoDB’s management has been talking to customers and they think MongoDB is the ideal database for AI apps for five reasons: (1) AI workloads involve a wide variety of data types and MongoDB’s document-model database is meant to handle this variety well, thus providing a well-rounded one-stop solution, (2) MongoDB’s database is high-performance and scalable, and allows AI workloads to utilise real-time operational data, (3) MongoDB’s database is integrated with leading app development frameworks and AI platforms, (4) MongoDB’s database has enterprise-grade security and compliance features, and (5) MongoDB’s database can be run anywhere the customer chooses; management feels very good about MongoDB’s positioning for AI

Our discussions with customers and partners give us increasing conviction that we are the ideal data layer for AI apps for a number of key reasons.

First, more than any other type of modern workload, AI-driven workloads require the underlying database to be capable of processing queries against rich and complex data structures quickly and efficiently. Our flexible document model is uniquely positioned to help customers build sophisticated AI applications because it is designed to handle different data types, your source data, vector data, metadata and generated data right alongside your live operational data, obviating the need for multiple database systems and complex back-end architectures.

Second, MongoDB offers a high performance and scalable architecture. As the latency of LLMs improve, the value of using real-time operational data for AI apps will become even more important.

Third, we are seamlessly integrated with leading app development frameworks and AI platforms, enabling developers to incorporate MongoDB into their existing workflows while having the flexibility to choose the LLM and other specific tools that best suit their needs.

Fourth, we meet or exceed the security and compliance requirements expected from an enterprise database, including enterprise-grade encryption, authorization and auditability.

Lastly, customers can run MongoDB anywhere, on-premise or as a fully managed service in 1 of the 118 global cloud regions across 3 hyperscalers, giving them the flexibility to run workloads to best meet their application use cases and business needs…

… As the performance of these LLMs and latency of these LLMs increase, accessing real-time data becomes really important like, say, you’re calling and talking to a customer support chatbot, that you want that chatbot to have up-to-date information about that customer so that they can provide the most relevant and accurate information possible…

…I think it’s a quickly evolving space, but we feel very good about our positioning for AI, even though it’s still very early days.
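The document model that management describes can be pictured directly. Below is a minimal Python sketch of a single hypothetical MongoDB document carrying operational data, metadata, a vector embedding, and AI-generated output side by side; all field names and values are illustrative, not taken from any real schema.

```python
# A single (hypothetical) MongoDB document combining the data types the
# quote lists: operational fields, metadata, a vector embedding, and
# AI-generated output, so no second database is needed for the embedding.
support_ticket = {
    "_id": "ticket-4821",
    # operational data
    "customer_id": "cust-0017",
    "status": "open",
    "message": "My last invoice was charged twice.",
    # metadata
    "channel": "chat",
    "language": "en",
    # vector data: an embedding of `message` (toy 4-dim vector here;
    # real embedding models produce hundreds to thousands of dimensions)
    "message_embedding": [0.12, -0.44, 0.91, 0.03],
    # generated data: an LLM-produced draft reply stored alongside
    "draft_reply": "Thanks for flagging this; we are reviewing the duplicate charge.",
}

# The record travels as one unit: no joins across separate systems are
# needed to assemble the context an AI feature would consume.
assert len(support_ticket["message_embedding"]) == 4
```

The point of the sketch is the shape of the data, not the database client: the embedding lives in the same record as the operational fields it describes.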

MongoDB’s management sees 3 ways AI can accelerate MongoDB’s business over time: (1) AI will drive down the cost of building applications, as all past platform shifts have done, thus leading to more apps and higher demand for databases, (2) MongoDB can be the database of choice for developers building AI applications (see Point 9 on MongoDB’s new AI Applications Program), and (3) MongoDB can help customers modernise their application estate (see Point 10 for more on this opportunity)

We see 3 main opportunities where we believe AI will accelerate our business over time. The first is that the cost of building applications in the world of AI will come down as we’ve seen with every previous platform shift, creating more applications and more data requiring more databases. The second opportunity is for us to be the database of choice for customers building greenfield AI applications…

…The third opportunity is to help customers modernize their legacy application estate. 

MongoDB’s management made the MongoDB AI Applications Program (MAAP) generally available in July 2024; MAAP brings the cloud computing hyperscalers and prominent AI model-building startups into one ecosystem to reduce the complexity and difficulty for MongoDB’s customers when they build new AI applications 

While we see that there’s a tremendous amount of interest in and planning for new AI-powered applications, the complexity and fast-moving nature of the AI ecosystem slows customers down. That’s why we launched the MongoDB AI Applications Program, or MAAP, which became generally available to customers last month. MAAP brings together a unique ecosystem, including the 3 major cloud providers, AWS, Azure and GCP, as well as Accenture and AI pioneers like Anthropic and Cohere. MAAP offers customers reference architectures and an end-to-end technology stack that includes prebuilt integrations, professional services and a unified support system to help customers quickly build and deploy AI applications.

Modernising legacy application estates is a big opportunity, as most of the $80 billion database market is still in legacy relational databases; MongoDB has the Relational Migrator product to help customers migrate from legacy relational databases to the company’s document-model database; management thinks AI can significantly improve the process of modernising legacy applications by helping with understanding legacy code and rewriting them as modern versions; MongoDB launched a few pilots with customers earlier in 2024 to modernise their legacy applications with the help of AI and the results are exciting; the CIO (Chief Information Officer) of an insurance company in the pilots said the modernisation process was the first tangible return he had seen in his AI investments; management thinks it will take time for the modernisation program to contribute meaningful revenue to MongoDB, but they are excited 

Most of the existing $80-billion-plus database industry is built on dated relational architecture. Modernizing legacy applications has always been part of our business, and we have taken steps over the years to simplify and demystify this complex process through partnerships, education and most recently, our Relational Migrator product. AI offers a potential step function improvement, lowering the cost and reducing their time and risk to modernize legacy applications…

…Earlier this year, we launched several pilots with our customers where we work with them to modernize mission-critical applications, leveraging both AI tooling and services. The early results from these pilots are very exciting as our customers are experiencing significant reductions in time and cost of modernization. In particular, we have seen dramatic improvements in time and cost to rewrite application code and generate test suites. We see increasing interest from customers that want to modernize their legacy application estate, including large enterprise customers. As a CIO of one of the world’s largest insurance companies said about our pilot, this is the first tangible return he’s seen on his AI investments. While it’s still early days and generating meaningful revenue from this program will take time, we are excited about the results of our pilots and the growing pipeline of customers eager to modernize their legacy estate…

…Since day one, since our IPO, we’ve been getting customers to migrate off relational to MongoDB. But one of the biggest friction points has been that while it’s easy to move the data, you can map the schema from a relational schema to a document schema and you can automate that, the biggest stumbling block is that the customer has to or some third party has to rewrite the application, which, by definition, creates more costs, more time and in some cases, more risk especially for older apps, where the development teams who built those apps no longer exist. So what’s been compelling about AI is that AI has finally created a shortcut to overcome that big hurdle. And so essentially, you can start basically diagnosing the code, understand the code, recreate a modern version of that code and generate test suites to make sure the new code performs like the old code. So that definitely gets people’s interest because now, all of a sudden, what may take years or multiyears, you can do in a lot less time. And the pilots that we have done, the time and cost savings have been very, very compelling.

That being said, we’re in the very early days. There’s a lot of interest. We have a growing pipeline of customers across, frankly, all parts of the world from North America to EMEA and even the Pac Rim. And so we’re quite excited about the opportunity. But again, I would say it’s very early days.

Delivery Hero, a leading local delivery platform, is using MongoDB Atlas Vector Search to provide AI-powered hypersonalised results to users; Delivery Hero found that MongoDB Atlas Vector Search helped it build solutions for less cost than alternative technologies

Delivery Hero, a long-time MongoDB Atlas customer, is the world’s leading local delivery platform, operating in 70-plus countries across 4 continents. Their quick commerce service enables customers to select fresh produce for delivery from local grocery stores. Approximately 10% of the inventory is fast-moving perishable produce that can go quickly out of stock. The company risks losing revenue and increasing customer churn if customers don’t have viable alternatives to their first choice. To address these risks, they are now using state-of-the-art AI models and MongoDB Atlas Vector Search to give hyperpersonalized alternatives to customers in real time if items they want to order are out of stock. With the introduction of MongoDB Atlas Vector Search, the data science team recognized that they could build a highly performant, real-time solution more quickly and for less cost than alternative technologies. 
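A retrieval in the spirit of the Delivery Hero use case can be sketched as an Atlas aggregation pipeline. The `$vectorSearch` stage below follows Atlas Vector Search syntax, but the index name, field names, and vectors are hypothetical, and the pipeline is shown as a plain Python structure rather than run against a live cluster.

```python
# Given the embedding of an out-of-stock item, retrieve similar products
# that are still available. All names and vectors are illustrative.
out_of_stock_embedding = [0.08, 0.71, -0.22, 0.35]  # toy 4-dim vector

pipeline = [
    {
        "$vectorSearch": {
            "index": "product_embedding_index",   # hypothetical index name
            "path": "embedding",                  # field holding each product's vector
            "queryVector": out_of_stock_embedding,
            "numCandidates": 200,                 # ANN candidates to scan
            "limit": 5,                           # alternatives to return
        }
    },
    # keep only items that can actually be delivered right now
    {"$match": {"in_stock": True}},
    {"$project": {"name": 1, "price": 1, "_id": 0}},
]

# Against a live Atlas cluster this would run as:
# db.products.aggregate(pipeline)
```

The follow-on `$match` and `$project` stages show why running vector search inside the operational database is convenient: the similarity results can be filtered and shaped with ordinary query stages in the same pipeline.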

MongoDB’s management believes that general-purpose LLMs (large language models) will win and will use RAG (retrieval augmented generation) as the primary way to combine generally available data with proprietary data; management is seeing advanced RAG use-cases in answering complex questions

There are some questions about LLMs, whether a general-purpose LLM or a fine-tune LLM, what the trade-offs are. Our belief is that given the performance of LLMs, you’re going to see the general purpose LLMs probably win and will use RAG as the predominant approach to marry generally available data with proprietary data. And then you are starting to see things like advanced RAG use cases where you get much more sophisticated ways to ask complex questions, provide more accurate and detailed answers and better adapt to different types of information and queries.
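The RAG pattern described here, retrieving proprietary context by vector similarity and handing it to a general-purpose LLM, can be sketched in a few lines of plain Python. The documents, embeddings, and prompt format below are all toy stand-ins; in practice the embeddings come from a model and the prompt goes to an LLM API.

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Tiny proprietary "knowledge base": (text, embedding) pairs.
docs = [
    ("Refunds are processed within 5 business days.", [0.9, 0.1, 0.0]),
    ("Our office is closed on public holidays.",      [0.1, 0.9, 0.0]),
]

def retrieve(query_embedding, k=1):
    """Return the k documents most similar to the query embedding."""
    ranked = sorted(docs, key=lambda d: cosine(d[1], query_embedding), reverse=True)
    return [text for text, _ in ranked[:k]]

# A refund-like query vector should pull the refund policy as context.
question = "How long do refunds take?"
context = retrieve([0.85, 0.15, 0.0])

# The general-purpose LLM then answers grounded in the retrieved context.
prompt = f"Answer using only this context:\n{context[0]}\n\nQuestion: {question}"
assert "Refunds" in prompt
```

This is the "marry generally available data with proprietary data" step in miniature: the model's general knowledge comes from pre-training, while the company-specific facts arrive at query time through retrieval.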

MongoDB’s management is seeing most AI workloads happen in the cloud, but they also see a lot of customers using open-source LLMs and running those workloads locally

We predominantly see most of the AI workloads in the cloud, but there are definitely lots of customers who are looking at using open source LLMs, in particular, things like Llama, and running those workloads locally.

MongoDB’s management believes MongoDB wins against Postgres for winning AI workloads because MongoDB can handle complex data types whereas Postgres, which is a relational – or SQL – database, struggles

MongoDB is designed to handle these different data structures. And I talked about how we can help unify metadata, operational data, vector data and generated data all in one platform. Relational databases, and Postgres is one of them, have limitations in terms of how they can handle different types of data. In fact, when the data gets too large, these relational databases have to do what’s called off-row storage, and it creates a performance overhead on these relational platforms. Postgres has this thing called TOAST, which stands for The Oversized-Attribute Storage Technique. It’s basically a way to handle these different data types, but it creates a massive performance overhead. So we believe that we are architecturally far better for these more complex AI workloads than relational databases.
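The architectural contrast being drawn can be illustrated without a database at all. The sketch below models the same record relationally, split across "tables" that must be joined back together, and as a single document. It is a conceptual toy showing the shape of the two approaches, not a claim about either system's internals or performance.

```python
# Relational modelling: the record is split across three "tables"
# linked by ticket_id (a large vector would additionally spill into
# off-row storage such as Postgres's TOAST once it exceeds the page size).
tickets    = [{"ticket_id": 1, "status": "open"}]
metadata   = [{"ticket_id": 1, "channel": "chat"}]
embeddings = [{"ticket_id": 1, "vector": [0.1, 0.2, 0.3]}]

def assemble(tid):
    """Reconstruct the full record: the joins a relational query must do."""
    row = next(t for t in tickets if t["ticket_id"] == tid).copy()
    row.update(next(m for m in metadata if m["ticket_id"] == tid))
    row["vector"] = next(e for e in embeddings if e["ticket_id"] == tid)["vector"]
    return row

# Document modelling: the full record already is the unit of storage.
document = {"ticket_id": 1, "status": "open", "channel": "chat",
            "vector": [0.1, 0.2, 0.3]}

assert assemble(1) == document
```

Both forms hold identical information; the difference is how much reassembly work stands between storage and a query that needs the whole record.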

MongoDB’s management is seeing growing adoption of Vector, and Vector is helping attract new customers to MongoDB; an existing Atlas customer, a financial news organisation, migrated from Elastic Search to Atlas Search in order to use MongoDB’s Vector Search capabilities; an European energy company is using Vector Search for a geospatial search application

On Vector, we’re continuing to see growth in adoption, and we see Vector is effective in attracting new customers to the MongoDB platform. A world-renowned financial news organization, which is already running in Atlas, migrated from Elasticsearch to Atlas Search using Search Nodes to take advantage of our Vector Search capabilities to build a site search that combines lexical search with semantic search to find the most relevant articles for a user query. And a European energy company built a geospatial search application using Atlas Search and Vector Search; the app was built on-prem and extended to the cloud to vectorize geospatial data and facilitate research and discovery.
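Combining lexical with semantic search, as in the site-search example, amounts to blending two scores per document. Here is a minimal sketch with a crude keyword-overlap function standing in for a real lexical scorer such as BM25, and toy two-dimensional embeddings standing in for model output:

```python
import math

def cosine(a, b):
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(x * x for x in b)))

def lexical_score(query, text):
    """Crude keyword overlap standing in for a real BM25 lexical score."""
    q, t = set(query.lower().split()), set(text.lower().split())
    return len(q & t) / len(q)

articles = [
    {"title": "Fed raises interest rates", "embedding": [0.9, 0.1]},
    {"title": "Tech stocks rally on AI optimism", "embedding": [0.2, 0.8]},
]

def hybrid_search(query, query_embedding, alpha=0.5):
    """Blend lexical and semantic scores; alpha weights the lexical part."""
    scored = []
    for a in articles:
        score = (alpha * lexical_score(query, a["title"])
                 + (1 - alpha) * cosine(query_embedding, a["embedding"]))
        scored.append((score, a["title"]))
    return max(scored)[1]

# "rates" matches lexically, and the query vector also leans toward the
# first article, so both signals agree here.
assert hybrid_search("interest rates outlook", [0.95, 0.05]) == "Fed raises interest rates"
```

The value of the hybrid approach shows up when the two signals disagree, for example when a query uses synonyms the lexical scorer misses; the semantic score can still surface the right article.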

MongoDB’s management is seeing MongoDB’s customers improve their software development productivity with the help of AI, but the rate of improvement is all over the place

[Question] We’ve talked before in the past that AI is just driving a lot of new code, making developers significantly more productive. Have you seen that behavior in any of your existing customers on Atlas where maybe their utilization rate goes up or the number of applications built per customer goes up?

[Answer] A common question I ask our customers when I meet with them in terms of what code generation tools that they’re using and what benefits they’re gaining. The answers tend to be a little bit all over the map. Some people see 10%, 15% productivity improvement. Some people say 20%, 25% productivity improvement. Some people say it helps my senior developers be more productive. Some people say it helps my junior developers become more like senior developers. So the answers tend to be all over the map.

Nvidia (NASDAQ: NVDA)

Nvidia’s Data Center revenue had incredibly strong growth in 2024 Q2, driven by demand for the Hopper GPU computing platform; compute revenue was up by 2.5x while networking revenue was up by 2x

Data Center revenue of $26.3 billion was a record, up 16% sequentially and up 154% year-on-year, driven by strong demand for NVIDIA Hopper, GPU computing and our networking platforms. Compute revenue grew more than 2.5x. Networking revenue grew more than 2x from the last year.

Even as Nvidia is getting ready to launch its Blackwell-architecture GPUs, customers are still buying the Hopper-architecture GPUs; the H200 platform, based on the Hopper architecture, started ramping in 2024 Q2 and offers 40% more memory bandwidth than the H100; management thinks that the reasons why the Hopper-architecture chips still enjoy strong demand despite the imminent arrival of the Blackwell-architecture chips are (1) AI companies need chips today to process data right now, and (2) AI companies are in a race to build the best model, and being first to the next plateau confers leadership

Customers continue to accelerate their Hopper architecture purchases while gearing up to adopt Blackwell…

…NVIDIA H200 platform began ramping in Q2, shipping to large CSPs, consumer Internet and enterprise companies. The NVIDIA H200 builds upon the strength of our Hopper architecture and offers over 40% more memory bandwidth compared to the H100…

…The demand for Hopper is really strong. And it’s true, the demand for Blackwell is incredible. There’s a couple of reasons for that. The first reason is, if you just look at the world’s cloud service providers and the amount of GPU capacity they have available, it’s basically none…

…A generative AI company spends the vast majority of their invested capital into infrastructure so that they could use an AI to help them create products. And so these companies need it now. They just simply can’t afford — you just raise money, they want you to put it to use now. You have processing that you have to do. You can’t do it next year. You got to do it today. And so that’s one reason. The second reason for Hopper demand right now is because of the race to the next plateau. The first person to the next plateau gets to introduce some revolutionary level of AI. The second person who gets there is incrementally better or about the same. And so the ability to systematically and consistently race to the next plateau and be the first one there is how you establish leadership…

…We believe our Hopper will continue to grow into the second half. We have many new products for Hopper or existing products for Hopper that we believe will start continuing to ramp in the next quarters, including our Q3 and those new products moving to Q4. So let’s say, Hopper, therefore, versus H1 is a growth opportunity for that. 

Nvidia’s management thinks that the next generation of AI models will need 10-20 times more compute to train

Next-generation models will require 10 to 20x more compute to train with significantly more data. The trend is expected to continue.
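The 10 to 20x figure can be made concrete with the widely used rule of thumb that training compute scales as roughly 6 x parameters x tokens. The model sizes and token counts below are purely illustrative assumptions, not estimates of any actual model.

```python
# Rough training-compute arithmetic using the common approximation
# C ~= 6 * N * D (FLOPs ~= 6 x parameters x training tokens).
def train_flops(params, tokens):
    return 6 * params * tokens

# Hypothetical current-generation model: 400B params, 10T tokens.
current_gen = train_flops(params=400e9, tokens=10e12)

# "10 to 20x more compute to train with significantly more data":
# e.g. ~4x the parameters and ~4x the training tokens gives 16x.
next_gen = train_flops(params=1.6e12, tokens=40e12)

ratio = next_gen / current_gen
assert abs(ratio - 16.0) < 1e-6  # within the 10-20x range described
```

Because compute is the product of model size and data, a 10 to 20x jump need not come from a 10 to 20x larger model; modest multiples on both axes compound to the same effect.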

Nvidia’s management sees inferencing accounting for 40% of Data Center revenue over the last 4 quarters (was 40% as of 2024 Q1)

Over the trailing 4 quarters, we estimate that inference drove more than 40% of our Data Center revenue.

Nvidia’s management is seeing demand coming from builders of frontier AI models, consumer Internet companies, and companies building generative AI applications for a wide range of use cases

Demand for NVIDIA is coming from frontier model makers, consumer Internet services, and tens of thousands of companies and start-ups building generative AI applications for consumers, advertising, education, enterprise and health care, and robotics. 

Nvidia’s Data Center revenue in China grew sequentially in 2024 Q2, but still remains below the level seen prior to export controls; management expects tough competition in China

Our Data Center revenue in China grew sequentially in Q2 and was a significant contributor to our Data Center revenue. As a percentage of total Data Center revenue, it remains below levels seen prior to the imposition of export controls. We continue to expect the China market to be very competitive going forward.

Nvidia has leadership in inference

The latest round of MLPerf inference benchmarks highlighted NVIDIA’s inference leadership, with both NVIDIA Hopper and Blackwell platforms combining to win gold medals on all tests.

Nvidia’s Blackwell family of chips combines GPUs, CPUs, DPUs (data processing units), NVLink, and networking; the GB200 NVL72 system in the Blackwell family links up 72 GPUs to act as 1 GPU and is up to 30 times faster for LLM (large language model) inference workloads; Nvidia has made a change to the Blackwell architecture to improve production yields; Blackwell’s production is expected to ramp in the fourth quarter of 2024; management sees demand for Blackwell exceeding supply by a wide margin up to 2025; there are more than 100 different Blackwell architecture systems; Nvidia’s Blackwell systems come in both air-cooled and liquid-cooled flavours; management expects Nvidia’s Data Center business to grow significantly in 2025 and 2026, powered by the Blackwell system; management sees Blackwell as a step-function improvement over Hopper that delivers 3-5 times more AI throughput than Hopper; Blackwell required 7 one-of-a-kind chips to build; Nvidia designed and optimised the Blackwell system end-to-end

The NVIDIA GB200 NVL72 system with the fifth-generation NVLink enables all 72 GPUs to act as a single GPU and deliver up to 30x faster inference for LLM workloads, unlocking the ability to run trillion-parameter models in real time…

…We executed a change to the Blackwell GPU mask to improve production yields. The Blackwell production ramp is scheduled to begin in the fourth quarter and continue into fiscal year ’26. In Q4, we expect to get several billion dollars in Blackwell revenue…

Demand for Blackwell platforms is well above supply, and we expect this to continue into next year…

…There are something like 100 different types of Blackwell-based systems that are built that were shown at Computex, and we’re enabling our ecosystem to start sampling those…

…We offer multiple configurations of Blackwell. Blackwell comes in either a Blackwell classic, if you will, that uses the HGX form factor that we pioneered with Volta. I think it was Volta. And so we’ve been shipping the HGX form factor for some time. It is air cooled. The Grace Blackwell is liquid cooled…

…We expect to grow our Data Center business quite significantly next year. Blackwell is going to be a complete game changer for the industry. And Blackwell is going to carry into the following year…

…Blackwell is a step-function leap over Hopper. Blackwell is an AI infrastructure platform, not just a GPU. It also happens to be the name of our GPU, but it’s an AI infrastructure platform. As we reveal more of Blackwell and sample systems to our partners and customers, the extent of Blackwell’s lead becomes clear. The Blackwell vision took nearly 5 years and 7 one-of-a-kind chips to realize: the Grace CPU, the Blackwell dual GPU in a CoWoS package, the ConnectX DPU for East-West traffic, the BlueField DPU for North-South and storage traffic, the NVLink switch for all-to-all GPU communications, and Quantum and Spectrum-X for InfiniBand and Ethernet that can support the massive burst traffic of AI. Blackwell AI factories are building-size computers. NVIDIA designed and optimized the Blackwell platform full stack, end-to-end, from chips, systems, networking, even structured cables, power and cooling, and mountains of software, to make it fast for customers to build AI factories. These are very capital-intensive infrastructures. Customers want to deploy it as soon as they get their hands on the equipment and deliver the best performance and TCO. Blackwell provides 3 to 5x more AI throughput in a power-limited data center than Hopper…

…The Blackwell system lets us connect 144 GPUs in 72 GB200 packages into 1 NVLink domain, with an aggregate NVLink bandwidth of 259 terabytes per second in 1 rack. Just to put that in perspective, that’s about 10x higher than Hopper.  
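The quoted NVL72 figures are internally consistent, and a quick back-of-envelope check shows how they fit together. The per-GPU NVLink bandwidth below is an assumption drawn from published fifth-generation NVLink figures, not from the call itself:

```python
# Back-of-envelope check of the GB200 NVL72 figures quoted above.
# Assumption (not from the quote): each Blackwell GPU provides roughly
# 1.8 TB/s of fifth-generation NVLink bandwidth.
packages = 72          # GB200 packages in one NVL72 rack
gpus_per_package = 2   # each GB200 pairs 2 Blackwell GPUs with 1 Grace CPU
nvlink_bw_per_gpu_tb = 1.8

gpus = packages * gpus_per_package
aggregate_bw_tb = gpus * nvlink_bw_per_gpu_tb

print(gpus)                         # 144 GPUs in one NVLink domain
print(round(aggregate_bw_tb, 1))    # roughly 259 TB/s, matching the quote
```

The 144-GPU and ~259 TB/s numbers in the quote fall straight out of this multiplication.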

Nvidia’s Ethernet for AI revenue doubled sequentially; management sees Nvidia’s ethernet product, Spectrum-X, enjoying wide support from the AI ecosystem; Spectrum-X performs 1.6 times better than traditional Ethernet; management plans to launch new Spectrum-X products every year and thinks that Spectrum-X will soon become a multi-billion dollar product

Our Ethernet for AI revenue, which includes our Spectrum-X end-to-end Ethernet platform, doubled sequentially with hundreds of customers adopting our Ethernet offerings. Spectrum-X has broad market support from OEM and ODM partners and is being adopted by CSPs, GPU cloud providers and enterprises, including xAI to connect the largest GPU compute cluster in the world. Spectrum-X supercharges Ethernet for AI processing and delivers 1.6x the performance of traditional Ethernet. We plan to launch new Spectrum-X products every year to support demand for scaling compute clusters from tens of thousands of GPUs today to millions of GPUs in the near future. Spectrum-X is well on track to become a multibillion-dollar product line within a year.

Japan’s government is working with Nvidia to build an AI supercomputer; Nvidia’s management thinks sovereign AI revenue will reach the low double-digit billions this year; management is seeing countries wanting to build their own generative AI that incorporates their own language, culture, and data

Japan’s National Institute of Advanced Industrial Science and Technology is building its AI Bridging Cloud Infrastructure 3.0 supercomputer with NVIDIA. We believe sovereign AI revenue will reach low double-digit billions this year…

…It certainly is a unique and growing opportunity, something that surfaced with generative AI and the desires of countries around the world to have their own generative AI that would be able to incorporate their own language, incorporate their own culture, incorporate their own data in that country.

Most of the Fortune 100 companies are working with Nvidia on AI projects

We are working with most of the Fortune 100 companies on AI initiatives across industries and geographies. 

Nvidia’s management is seeing a range of applications driving the company’s growth; these applications include (1) Amdocs’ smart agent, which is reducing customer service costs by 30%, and (2) Wistron’s usage of Nvidia AI Omniverse to reduce cycle times in its factories by 50%

A range of applications are fueling our growth, including AI-powered chatbots, generative AI copilots and agents to build new, monetizable business applications and enhance employee productivity. Amdocs is using NVIDIA generative AI for their smart agent, transforming the customer experience and reducing customer service costs by 30%. ServiceNow is using NVIDIA for its Now Assist offering, the fastest-growing new product in the company’s history. SAP is using NVIDIA to build Joule copilot. Cohesity is using NVIDIA to build their generative AI agent and lower generative AI development costs. Snowflake, who serves over 3 billion queries a day for over 10,000 enterprise customers, is working with NVIDIA to build copilots. And lastly, Wistron is using NVIDIA AI Omniverse to reduce end-to-end cycle times for their factories by 50%.

Every automobile company that is developing autonomous vehicle technology is working with Nvidia; management thinks that automotive will account for multi-billions in revenue for Nvidia; Nvidia won the Autonomous Brand Challenge at the recent Computer Vision and Pattern Recognition Conference

Every automaker developing autonomous vehicle technology is using NVIDIA in their data centers. Automotive will drive multibillion dollars in revenue across on-prem and cloud consumption and will grow as next-generation AV models require significantly more compute…

…At the Computer Vision and Pattern Recognition Conference, NVIDIA won the Autonomous Brand Challenge in the end-to-end driving at scale category, outperforming more than 400 entries worldwide. 

Nvidia’s management announced Nvidia AI Foundry – a platform for building custom AI models – in 2024 Q2; users of Nvidia AI Foundry are able to customise Meta’s Llama 3.1 foundation AI model; Nvidia AI Foundry is the first platform where users are able to customise an open-source, frontier-level foundation AI model; Accenture is already using Nvidia AI Foundry 

During the quarter, we announced a new NVIDIA AI foundry service to supercharge generative AI for the world’s enterprises with Meta’s Llama 3.1 collection of models… 

…Companies for the first time can leverage the capabilities of an open source, frontier-level model to develop customized AI applications to encode their institutional knowledge into an AI flywheel to automate and accelerate their business. Accenture is the first to adopt the new service to build custom Llama 3.1 models for both its own use and to assist clients seeking to deploy generative AI applications.

Companies from many industries are using NIMs (Nvidia inference microservices) for deployment of generative AI; AT&T saw 70% cost savings and 8 times latency reduction with NIMs; over 150 partners are embedding NIMs across the AI ecosystem; Nvidia recently announced NIM Agent Blueprints, a catalog of reference AI applications; Nvidia is using NIMs to open the Nvidia Omniverse to new industries

NVIDIA NIMs accelerate and simplify model deployment. Companies across health care, energy, financial services, retail, transportation, and telecommunications are adopting NIMs, including Aramco, Lowe’s, and Uber. AT&T realized 70% cost savings and 8x latency reduction after moving to NIMs for generative AI, call transcription and classification. Over 150 partners are embedding NIMs across every layer of the AI ecosystem.

We announced NIM Agent Blueprints, a catalog of customizable reference applications that include a full suite of software for building and deploying enterprise generative AI applications. With NIM Agent Blueprints, enterprises can refine their AI applications over time, creating a data-driven AI flywheel. The first NIM Agent Blueprints include workloads for customer service, computer-aided drug discovery, and enterprise retrieval augmented generation. Our system integrators, technology solution providers, and system builders are bringing NVIDIA NIM Agent Blueprints to enterprises…

…We announced new NVIDIA USD NIMs and connectors to open Omniverse to new industries and enable developers to incorporate generative AI copilots and agents into USD workloads, accelerating our ability to build highly accurate virtual worlds.

Nvidia’s AI Enterprise software platform is powering Nvidia’s software-related business to approach a $2 billion annual revenue run-rate by the end of this year; management thinks Nvidia AI Enterprise represents great value for customers by providing GPUs at a price of $4,500 per GPU per year; management thinks the TAM (total addressable market) for Nvidia’s AI software business can be significant

NVIDIA NIM and NIM Agent Blueprints are available through the NVIDIA AI Enterprise software platform, which has great momentum. We expect our software, SaaS and support revenue to approach a $2 billion annual run rate exiting this year, with NVIDIA AI Enterprise notably contributing to growth…

…At $4,500 per GPU per year, NVIDIA AI Enterprise is an exceptional value for deploying AI anywhere. And for NVIDIA’s software TAM, it can be significant as the CUDA-compatible GPU installed base grows from millions to tens of millions. 
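The software TAM argument in the quote can be sized with simple arithmetic: a fixed $4,500 per GPU per year multiplied across a growing CUDA-compatible installed base. The installed-base figures below are illustrative assumptions, not NVIDIA disclosures:

```python
# Rough sizing of the NVIDIA AI Enterprise software opportunity implied by
# the quote: $4,500 per GPU per year across the CUDA-compatible installed
# base. Installed-base sizes are illustrative, spanning the quote's
# "millions to tens of millions" range.
price_per_gpu_per_year = 4_500

for installed_base in (1_000_000, 10_000_000, 50_000_000):
    annual_revenue = price_per_gpu_per_year * installed_base
    print(f"{installed_base:>11,} GPUs -> ${annual_revenue / 1e9:.1f}B/year")
```

At 10 million GPUs the implied software opportunity is already $45 billion a year, which is why management calls the TAM significant.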

Computers that contain Nvidia’s RTX chips can deliver up to 1,300 AI TOPS (tera operations per second); there are more than 200 RTX AI laptop models from leading PC manufacturers; there is an installed base of 100 million RTX AI computers; a game called Mecha BREAK is the first game to use Nvidia ACE, a generative AI service for creating digital humans

Every PC with RTX is an AI PC. RTX PCs can deliver up to 1,300 AI TOPS, and there are now over 200 RTX AI laptops designed from leading PC manufacturers. With 600 AI-powered applications and games and an installed base of 100 million devices, RTX is set to revolutionize consumer experiences with generative AI. NVIDIA ACE, a suite of generative AI technologies, is available for RTX AI PCs. Mecha BREAK is the first game to use NVIDIA ACE, including our small language model, Nemotron-4 4B, optimized for on-device inference.

Foxconn, the largest electronics manufacturer in the world, and Mercedes-Benz, the well-known auto manufacturer, are using Nvidia Omniverse to produce digital twins of their manufacturing plants

The world’s largest electronics manufacturer, Foxconn, is using NVIDIA Omniverse to power digital twins of the physical plants that produce NVIDIA Blackwell systems. And several large global enterprises, including Mercedes-Benz, signed multiyear contracts for NVIDIA Omniverse Cloud to build industrial digital twins of factories.

Many robotics companies are using Nvidia’s AI robot software

Boston Dynamics, BYD Electronics, Figure, Intrinsic, Siemens, Skild AI and Teradyne Robotics are using the NVIDIA Isaac robotics platform for autonomous robot arms, humanoids and mobile robots.

Nvidia’s management is seeing some customers save up to 90% in computing costs by transitioning from general-purpose computing (CPUs) to accelerated computing (GPUs)

We know that accelerated computing, of course, speeds up applications. It also enables you to do computing at a much larger scale, for example, scientific simulations or database processing. But what that translates directly to is lower cost and lower energy consumed. And in fact, this week, there’s a blog that came out that talked about a whole bunch of new libraries that we offer. And that’s really the core of the first platform transition, going from general-purpose computing to accelerated computing. And it’s not unusual to see someone save 90% of their computing cost. And the reason for that is, of course, you just sped up an application 50x. You would expect the computing cost to decline quite significantly.
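The cost claim above follows from simple arithmetic: a large speedup shrinks the total bill even when the accelerated node rents for a multiple of a CPU node. The per-hour prices below are illustrative assumptions, not NVIDIA or cloud-provider figures:

```python
# Toy illustration of the cost argument in the quote: a 50x application
# speedup can cut total job cost ~90% even if the accelerated node costs
# several times more per hour. All prices here are made-up assumptions.
cpu_node_cost_per_hour = 1.0
gpu_node_cost_per_hour = 5.0   # assume the accelerated node rents at 5x
speedup = 50.0                 # the quote's example: application sped up 50x

job_hours_cpu = 100.0
cpu_cost = job_hours_cpu * cpu_node_cost_per_hour
gpu_cost = (job_hours_cpu / speedup) * gpu_node_cost_per_hour

savings = 1 - gpu_cost / cpu_cost
print(f"{savings:.0%}")  # 90% lower cost despite the pricier node
```

The savings fraction is 1 − (price multiple ÷ speedup), so any speedup well above the price multiple produces the kind of 90% savings management describes.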

Nvidia’s management believes that generative AI is a new way to write software and is changing how every layer of computing is done

Generative AI, taking a step back about why it is that we went so deeply into it, is because it’s not just a feature, it’s not just a capability, it’s a fundamental new way of doing software. Instead of human-engineered algorithms, we now have data. We tell the AI, we tell the model, we tell the computer what are the expected answers, what are our previous observations, and then for it to figure out what the algorithm is, what’s the function. It learns a universal — AI is a bit of a universal function approximator and it learns the function. And so you could learn the function of almost anything, and anything that you have that’s predictable, anything that has structure, anything that you have previous examples of. And so now here we are with generative AI. It’s a fundamental new form of computer science. It’s affecting how every layer of computing is done from CPU to GPU, from human-engineered algorithms to machine-learned algorithms, and the type of applications you could now develop and produce is fundamentally remarkable.

Nvidia’s management thinks AI models are still seeing the benefits of scaling

There are several things that are happening in generative AI. So the first thing that’s happening is the frontier models are growing in quite substantial scale. And we’re still all seeing the benefits of scaling.

The amount of compute needed to train an AI model goes up much faster than the size of the model; management thinks the next generation of AI models could require 10-40 times more compute 

Whenever you double the size of a model, you also have to more than double the size of the data set to go train it. And so the amount of flops necessary in order to create that model goes up quadratically. And so it’s not unexpected to see that the next-generation models could take 10x, 20x, 40x more compute than last generation.
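The "quadratic" claim can be made concrete with the common training-compute rule of thumb C ≈ 6·N·D, where N is the parameter count and D is the number of training tokens. This approximation is a widely used heuristic from the scaling-laws literature, not something stated on the call, and the model sizes below are hypothetical:

```python
# Illustrating the quote's claim that flops grow quadratically when model
# size and data set are scaled together, using the standard heuristic
# C ~= 6 * N * D (N = parameters, D = training tokens). The heuristic and
# the example model sizes are assumptions, not from the call.
def train_flops(params: float, tokens: float) -> float:
    return 6 * params * tokens

base = train_flops(70e9, 1.4e12)      # hypothetical 70B model, 1.4T tokens
doubled = train_flops(140e9, 2.8e12)  # double both model and data

print(doubled / base)  # 4.0: doubling N and D quadruples compute
```

Doubling the model while more-than-doubling the data, as the quote describes, pushes the multiple past 4x, which is how 10x to 40x generation-over-generation compute jumps arise.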

Nvidia’s management is seeing more frontier model makers in 2024 than in 2023

Surprisingly, there are more frontier model makers than last year.

Nvidia’s management is seeing advertising-related computing needs shifting from being powered by CPUs to being powered by GPUs and generative AI

The largest computing systems in the world today, and you’ve heard me talk about this in the past, are recommender systems, and they are now moving from CPUs to generative AI. So recommender systems, ad generation (custom ad generation, targeting ads at very large scale with hyper-targeting), search, and user-generated content, these are all very large-scale applications that have now evolved to generative AI.

Nvidia’s management is seeing generative AI startups generating tens of billions of revenue-opportunities for cloud computing providers

The number of generative AI start-ups is generating tens of billions of dollars of cloud renting opportunities for our cloud partners.

Nvidia’s management is seeing that cloud computing providers have zero GPU capacity available because they are using it for internal workloads (such as accelerating data processing) and renting it out to model makers and other AI startups

If you just look at the world’s cloud service providers and the amount of GPU capacity they have available, it’s basically none. And the reason for that is because they’re either being deployed internally for accelerating their own workloads, data processing, for example…

…The second is, of course, the rentals. They’re renting capacity to model makers. They’re renting it to start-up companies. 

Nvidia’s management thinks Nvidia’s GPUs are the only AI GPUs that process and accelerate data; before the advent of generative AI, the number one use case of Nvidia’s GPUs was to accelerate data processing

NVIDIA’s GPUs are the only accelerators on the planet that process and accelerate data. SQL data, pandas data, data science toolkits like pandas, and the new one, Polars: these are the most popular data processing platforms in the world, and aside from CPUs which, as I’ve mentioned before, are really running out of steam, NVIDIA’s accelerated computing is really the only way to get boosted performance out of that. And so the #1 use case long before generative AI came along is the migration of applications one after another to accelerated computing.

Nvidia’s management thinks that those who purchase Nvidia AI chips are getting immediate ROI (return on investment) for a few reasons: (1) GPUs are a better way to build data centers compared to CPUs because GPUs save money on data processing compared to CPUs, (2) cloud computing providers who rent out GPUs are able to rent out their GPUs the moment they are built up in the data center because there are many generative AI companies clamouring for the chips, and (3) generative AI improves a company’s own services, which delivers a fast ROI

The people who are investing in NVIDIA infrastructure are getting returns on it right away. It’s the best ROI infrastructure, computing infrastructure investment you can make today. And so one way to think through it, probably the easiest way to think through it is just to go back to first principles. You have $1 trillion worth of general-purpose computing infrastructure. And the question is, do you want to build more of that or not?

And for every $1 billion worth of general-purpose CPU-based infrastructure that you stand up, you probably rent it for less than $1 billion. And so because it’s commoditized, there’s already $1 trillion on the ground. What’s the point of getting more? And so the people who are clamoring to get this infrastructure, one, when they build out Hopper-based infrastructure and, soon, Blackwell-based infrastructure, they start saving money. That’s tremendous return on investment. And the reason why they start saving money is because data processing saves money, and data processing is probably just a giant part of it already. And so recommender systems save money, so on and so forth, okay? And so you start saving money.

The second thing is everything you stand up are going to get rented because so many companies are being founded to create generative AI. And so your capacity gets rented right away and the return on investment of that is really good.

And then the third reason is your own business. Do you want to either create the next frontier yourself or your own Internet services, benefit from a next-generation ad system or a next-generation recommender system or a next-generation search system? So for your own services, for your own stores, for your own user-generated content, social media platforms, for your own services, generative AI is also a fast ROI.

Nvidia’s management is seeing a significant number of data centers wanting liquid-cooled GPU systems because the use of liquid cooling enables 3-5 times more AI throughput compared to the past, resulting in cheaper TCO (total cost of ownership)

The number of data centers that want to go to liquid cooled is quite significant. And the reason for that is because we can, in a liquid-cooled data center, in any power-limited data center, whatever size of data center you choose, you could install and deploy anywhere from 3 to 5x the AI throughput compared to the past. And so liquid cooling is cheaper. Our TCO is better, and liquid cooling allows you to have the benefit of this capability we call NVLink, which allows us to expand it to 72 Grace Blackwell packages, which has essentially 144 GPUs.

Nvidia does not do the full integration of its GPU systems into a data center because it is not the company’s area of expertise

Our customers hate that we do integration. The supply chain hates us doing integration. They want to do the integration. That’s their value-add. There’s a final design-in, if you will. It’s not quite as simple as shimmying into a data center, but the design fit-in is really complicated. And so the design fit-in, the installation, the bring-up, the repair-and-replace, that entire cycle is done all over the world. And we have a sprawling network of ODM and OEM partners that does this incredibly well.

Nvidia has released many new libraries for CUDA, across a wide variety of use cases, for AI software developers to work with

Accelerated computing starts with CUDA-X libraries. New libraries open new markets for NVIDIA. We released many new libraries, including CUDA-X accelerated Polars, pandas and Spark, the leading data science and data processing libraries; cuVS for vector databases (this is incredibly hot right now); Aerial and Sionna for 5G wireless base stations, a whole world of data centers that we can go into now; Parabricks for gene sequencing; and AlphaFold2 for protein structure prediction, which is now CUDA-accelerated.

Nvidia now has 3 networking platforms for GPUs

We now have 3 networking platforms, NVLink for GPU scale-up, Quantum InfiniBand for supercomputing and dedicated AI factories, and Spectrum-X for AI on Ethernet. NVIDIA’s networking footprint is much bigger than before. 

Salesforce (NYSE: CRM)

Agentforce is a new architecture and product that management believes will be fundamental to Salesforce’s AI leadership in the next decade; Salesforce will be introducing Agentforce at its upcoming Dreamforce customer event; Agentforce is an autonomous AI, and management will be getting every attendee at Dreamforce to turn on their own AI agents; Salesforce is already building agents for Workday, which will be Salesforce’s first Agentforce partner; Agentforce allows companies to build custom agents for sales, service, marketing, and commerce; management believes that within a year, most companies will be deploying autonomous AI agents at scale, and these agents will have a big positive impact on companies’ operations; Agentforce is currently management’s singular focus; many companies are already using Agentforce, including one of the world’s largest healthcare companies, which is resolving more than 90% of patient inquiries with Agentforce and thinks Agentforce is much better than any other competing AI platform; a very large media company is using Agentforce to resolve 90% of employee and consumer issues; management thinks Salesforce is the first company to deploy high-quality enterprise AI agents at scale; Agentforce is not a co-pilot, it is an autonomous agent that is accurate and can be deployed right out of the box; users of Agentforce can do advanced planning and reasoning with minimal input; management sees Agentforce as being a trusted colleague that will complement human users; management sees thousands of companies using Agentforce by January 2025; early trials of Agentforce have shown remarkable success

We’re going to talk about a whole different kind of sales force today, a different kind of architecture and a product that we didn’t even talk about on the last earnings call that is going to be fundamental to our future and a manifestation of our decade of AI leadership, which is Agentforce. Now in just a few weeks, we’re going to kick off Dreamforce, and I hope all of you are planning to be there, the largest AI event in the world with more than 45,000 trailblazers in San Francisco. And this year, Dreamforce is really becoming Agentforce…

…We’re going to show our new Agentforce agents and how we’ve reimagined enterprise software for this new world of autonomous AI. And every customer, I’m going to try to get every customer who comes to Dreamforce to turn agents on while they’re there…

…This idea that you’re not just going to have sales agents and service agents (you probably read or heard, maybe you saw on CNBC, that we’re building the agents for Workday), and we’re going to be building custom agents for so many of you as well with Agentforce, because it is a development platform as well as this incredible capability to radically extend your sales and service organizations.

So when you arrive at the Dreamforce campus, you’re going to see a big sign outside that says, humans with agents drive customer success together. And that’s because we now so strongly believe the future isn’t about having a sales force or a service force or a marketing force or a commerce force or an analytics force. The future is about also having an Agentforce. And while many customers today don’t yet have agent forces, they do have sales forces or service forces, and I assure you that within a year, we’re all going to have agent forces, and we’re going to have them at scale. And it’s going to radically extend our companies and it’s going to augment our employees, make us more productive. It’s going to turn us into these incredible margin and revenue machines. It’s going to be pretty awesome…

…with this Agentforce platform, we’re making it easy to build these powerful autonomous agents for sales, for service, for marketing, for commerce, automating the entire workflow on their own, embedding agents in the flow of work and getting our customers to the agent future first. And this is our primary goal of our company right now. This is my singular focus…

…We’re going to talk about the customers who have it, customers like OpenTable and Wiley and ADP and RBC and so many others who are deploying these agents and running them on top of our Data Cloud and our apps…

…At Dreamforce, you’re going to hear from one of the very largest health care companies in the world. It’s got 20 million consumers here in the United States, and it is resolving more than 90% of all patient inquiries with Agentforce, and they’re benchmarking us significantly higher than any other competing AI platform, and that’s based on some incredible AI breakthroughs that we have had at Salesforce…

…One of these very large media companies that we work with, one a lot of you probably know, who have everything, every possible media asset, they’re resolving 90% of all of their employee and consumer issues with Agentforce, pretty awesome. So there’s nothing more transformational than agents on the technology horizon that I can see, and Salesforce is going to be the first company at scale to deploy enterprise agents, and not just any enterprise agents, the highest-quality, most accurate agents in the world…

…We’re seeing that breakthrough occur because with our new Agentforce platform, we’re going to make a quantum leap in AI, and that’s why I want you all at Dreamforce, because I want you to have your hands on this technology to really understand this. This is not co-pilots…

…These agents are autonomous. They’re able to act with accuracy. They’re able to come right out of the box. They’re able to go right out of the platform…

…These agents don’t require a conversational prompt to take action. You can do advanced planning, reasoning with minimal human input. And the example of this incredible health care company, you’re going to be able to say to the agent, “Hey, I want to look at my labs, I want to understand this. It looks like I need repeat labs. Can you reschedule those for me? It looks like I need to see my doctor, can you schedule that for me? I also want to get an MRI, I want to get this.” And the level of automation that we’re going to be able to provide and unleash the productivity back into these organizations is awesome…

…This is going to be like having these trusted colleagues that can handle these time-consuming tasks, whether it’s engaging with an inbound lead or resolving a customer or patient inquiry or whatever it is. This is humans with agents driving customer success together, and Agentforce agents can be set up in minutes, are easily scalable, and work around the clock in any language. And by the beginning of next fiscal year, we will have thousands of customers using this platform. And we will have helped them hands-on to make it successful and deploy it. The early trials have been remarkable; seeing these customers have this success has been just awesome…

…We’re just at the beginning of building an Agentforce ecosystem, with companies able to build agents on our platform for their workforce and use cases, and we’re excited to have Workday as our first Agentforce partner.

Salesforce has been able to significantly reduce hallucinations with its AI products, and thus deliver highly accurate results, through the use of new retrieval augmented generation (RAG) techniques

The accuracy of our results, the reduction of hallucinations, and the level of capability of AI is unlike anything I think that any of us have ever seen, and we’ve got some incredible new techniques, especially incredible new retrieval-augmented generation techniques, that are delivering us the capability to deliver this accuracy for our customers.
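Salesforce has not published the specific techniques Benioff refers to, but the general RAG pattern he names can be sketched in a few lines: retrieve the most relevant document, then constrain the model's prompt to that retrieved context so answers stay grounded. The toy keyword-overlap retriever below stands in for a real vector search; all names and documents are hypothetical:

```python
# A minimal, illustrative sketch of retrieval-augmented generation (RAG).
# A real system would use embedding-based vector search; this toy retriever
# ranks documents by keyword overlap and grounds the prompt in the top
# match, which is how RAG reduces hallucination.
def retrieve(query: str, docs: list[str], k: int = 1) -> list[str]:
    q = set(query.lower().split())
    scored = sorted(docs,
                    key=lambda d: len(q & set(d.lower().split())),
                    reverse=True)
    return scored[:k]

def build_prompt(query: str, docs: list[str]) -> str:
    context = "\n".join(retrieve(query, docs))
    return (f"Answer using ONLY the context below.\n"
            f"Context:\n{context}\n\nQuestion: {query}")

docs = ["Refunds are processed within 5 business days.",
        "Support hours are 9am to 5pm Pacific."]
print(build_prompt("How long do refunds take?", docs))
```

Because the prompt tells the model to answer only from retrieved context, a well-behaved model cannot easily invent policies that are absent from the knowledge base, which is the accuracy mechanism the quote alludes to.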

Salesforce’s management still sees the company as the No.1 AI CRM in the world

Of course, Salesforce is the #1 AI CRM.

In 2024 Q2, Einstein delivered 25 trillion transactions and 1 trillion workflows; Wyndham is using Einstein to reduce average call times, freeing up service agents for higher-value work

We’re just operating at this incredible scale, delivering 25 trillion Einstein transactions across all of the clouds during the quarter, that’s 25 trillion and more than 1 trillion workflows…

…MuleSoft allows Wyndham to unlock business-critical data from various platforms and onboard future franchisees faster. And with Einstein-generated recommended service replies, average call times have been reduced and service agents can focus on higher-priority work.

Salesforce’s management thinks many of the company’s customers have a misconception about AI in that they need to build and train their own AI models; management is able to use Salesforce’s AI models and resolve issues much better than its customers’ own models; management thinks Salesforce’s AI models have the highest efficacy

I think that there’s a lot of misconceptions about AI with my customers. I have been out there very disappointed with the huge amount of money that so many of these customers have wasted on AI. They are trying to DIY their AI…

…This idea that our customers are going to have to build their own models, train their own models, retrain their own models, retrain them again. And I’m meeting with these customers, and they’re so excited when they say, “Oh, I built this model, and we’re resolving 10%, 20%, 30%, 40% of this or that.” And I’m like, really? Take a look at our models and our capability, where you don’t have to train or retrain anything and you’re going to get more than 90%. And then they say, wait a minute, how do you do that? And this is a moment where every customer needs to realize you don’t need to DIY your AI. You can use a platform like Salesforce to get the highest efficacy of artificial intelligence, the best capability to fully automate your company, achieve all of your goals, and you can do it with professional enterprise software…

…I met with one of the largest CIOs in the world, who was telling me how excited he was for the B2C part of this business, that he built this model and its accuracy rates, and I was like, really, let me show you what we’re doing here. And then he said to me, why am I doing this? Why am I not just using your platform? And I said, good question. So these customers are spending millions of dollars, but are they really getting the results that they want? It feels like the early days of cloud, the early days of social and mobile. Customers feel like they have to DIY it, that they can make it all happen themselves, but they don’t need to. To deliver this very high-quality capability, they can use a deeply integrated platform like Salesforce.

Salesforce’s management is seeing the company’s customers get immediate ROI (return on investment) from deploying AI automation solutions because the solutions can be easily and quickly customised and configured

We’ve created an out-of-the-box platform to deliver all of this for them. So this could be service reply recommendations, account summaries, report generation (you’ve seen in Slack this kind of auto summarization, recaps), all of these amazing things. The level of automation, the amount of code that our team has written, the transformation of our platform in the last 18 months, it’s remarkable. And customers love it because they can take the platform and then, for all of these generative AI use cases, customize it for their own needs or configure it using our capability, because they’re doing that without writing a line of code. It’s clicks, not code; deploy them in days, not weeks. They’re doing this in months, not years, and getting immediate ROI.

Salesforce’s management thinks many of the company’s customers are really disappointed with Microsoft Co-pilot because of a lack of accuracy

So many customers are so disappointed in what they bought from Microsoft Copilot because they’re not getting the accuracy and the responses that they want. Microsoft has disappointed so many customers with AI.

Wiley is achieving a double-digit percentage increase in customer satisfaction and deflection rate, and a 50% increase in case resolution, with the first generation of Agentforce; Royal Bank of Canada and ADP are seeing 90% case resolution rates with the second generation of Agentforce; OpenTable is using Agentforce to support its interactions with 60,000 restaurants and 160 million diners

Wiley is a long-standing Salesforce customer. It’s one of our first deployments in the first Agentforce trial. It’s pretty awesome. And you all know they make textbooks and it’s back-to-school. But maybe you don’t know that Wiley has to surge their sales and service organization at back-to-school time when everyone’s buying these textbooks. Well, now they can use agents to do that surge. They don’t have to go buy a bunch of gig workers and bring them in, and that agent capacity is so exciting for them. What we saw with Wiley was, and this is a quote from them, “we’re seeing double-digit percentage increase in customer satisfaction and deflection rate compared to older technologies and in these early weeks of our busiest season.” So that was very reassuring to us that we have the right thing that’s happening. And Wiley has already seen a 50% increase in case resolution. That’s with our first generation of Agentforce.

As I mentioned, the second generation of Agentforce, which we have with customers already, including some of these amazing organizations like Royal Bank of Canada, ADP and others is 90% case resolution. It is an awesome moment in this tech business.

OpenTable is another super great story. You all know they are managing 60,000 restaurants, with 160 million diners to support. They’re on Agentforce now. They require that incredible scale to deliver top-notch customer service. That’s why they’re using the product. It’s been awesome to get their results, and it can be all kinds of questions: resolving basic issues, account activations, reservation management, loyalty point expiration. Agentforce for Service can easily answer all of these questions, like when a diner asks when their points expire, or a follow-up question like, what about in Mexico? Can I make this change? That’s where we’re delivering those incredible moments for OpenTable, giving them this kind of productivity enhancement.

Agentforce is driving growth in cloud products’ sales for Salesforce

With Agentforce for Sales, you can imagine extending your sales force with SDRs and BDRs who are agents going out and building pipeline for you, generating all kinds of demand and even closing deals. So this is going to drive Sales Cloud growth (it already is) and Service Cloud growth (it already is), because customers are going to extend their sales and service organizations and become a lot more productive with these agents.

Salesforce will be releasing industry-specific AI agents in the coming months

In the coming months, we’re going to release Agentforce agents for other roles, including industry-specific agents, health agents, as I mentioned. 

Data Cloud provides the foundation for Agentforce because it holds a huge amount of data and metadata; management continues to believe that data is the foundation of AI; Data Cloud federates and connects to all other data clouds of a user to deliver super accurate AI; Data Cloud is Salesforce’s fastest-growing organic product and will be the fastest to hit $1 billion, $5 billion, and $10 billion in revenue; Data Cloud customers were up 130% year-on-year in 2024 Q2; the number of Data Cloud customers spending more than $1 million annually has doubled; Data Cloud processed 2.3 quadrillion records in 2024 Q2 (was 2 quadrillion in 2024 Q1); Data Cloud consumption was up 110% year-on-year in 2024 Q2; American Family Insurance is using Data Cloud to create a 360-degree view of customers; Adecco Group is using Data Cloud to create seamless access to information for 27,000 of its employees; Wyndham is using Data Cloud to unify profiles of 165 million guest records, many of which are duplicates across multiple sources

This type of performance from our Agentforce platform wouldn’t be possible without Data Cloud. One of the reasons that our agents are so accurate is because of the huge amount of data and metadata that we have. And data is the foundation for every AI transformation. And with Data Cloud, we’re providing a high-performance data lake that brings together all our customer and business data, federating data from external repositories through this incredible zero-copy alliance. So customers can use our Data Cloud and then federate and connect to all their other data clouds, and then we can bring it all together to deliver this super accurate AI.

That’s why Data Cloud is absolutely our fastest-growing organic product in history. It will be the fastest product to $1 billion, and it’s going to probably be the fastest product to $5 billion, $10 billion. In Q2, the number of paid Data Cloud customers grew 130% year-over-year and the number of customers spending more than $1 million annually has already doubled. In the second quarter alone, and this is amazing, Data Cloud processed 2.3 quadrillion records with 110% platform consumption growth year-over-year…

…American Family Insurance, with millions of policyholders nationwide, is using Data Cloud to consolidate data from multiple sources through our zero-copy partner network, creating a 360-degree view of customers, enabling quick segmentation and activating lead data, including their real-time web interactions. The Adecco Group expanded their Data Cloud in the quarter, a great example of a company leveraging its gold mine of data to gain a unified view of its customers. Connecting all this data means that 27,000 Adecco employees using Salesforce will have seamless access to key information, including financial metrics and job fulfillment status, to help Adecco improve their job fill rate ratio and reduce their cost to serve…

…Wyndham utilizes Data Cloud to unify profiles of 165 million guest records, many of which were duplicates across many sources like Amazon Redshift and the Sabre Reservation System as well as Sales Cloud, Marketing Cloud and Service Cloud. 
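Unifying 165 million guest records, many of them duplicates, is an identity-resolution problem. The sketch below is a hypothetical illustration, not Data Cloud’s actual matching logic: it merges records from several sources on a normalized email key, keeping the first non-empty value seen for each field.

```python
# Toy identity resolution: unify duplicate guest records from several
# sources into one profile per normalized email address.
# (Illustrative only; real systems match on many fuzzy keys, not just email.)

def unify(records: list[dict]) -> dict[str, dict]:
    profiles: dict[str, dict] = {}
    for rec in records:
        key = rec["email"].strip().lower()        # normalize the match key
        profile = profiles.setdefault(key, {})
        for field, value in rec.items():
            if value and not profile.get(field):  # keep first non-empty value
                profile[field] = value
    return profiles

# Hypothetical raw records from three source systems:
records = [
    {"email": "Ana@example.com", "name": "Ana Diaz", "phone": ""},           # CRM
    {"email": "ana@example.com ", "name": "", "phone": "+1-555-0100"},       # warehouse export
    {"email": "bo@example.com", "name": "Bo Chen", "phone": "+1-555-0200"},  # reservations
]
profiles = unify(records)
print(len(profiles))  # 2 unified profiles from 3 raw records
```

The two Ana records collapse into one profile that carries both her name and her phone number, which is the essence of the “unified profile” Wyndham describes.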

Salesforce has rewritten all of its software to be under one unified platform; management thinks building AI agents without a unified platform is risky; the decision to unite all of Salesforce’s software was made 18 months ago with the shift to AI

We’ve automated every customer touch point and now we’re bringing these apps, data and agents together. It’s these 3 levels, and this isn’t 3 separate pieces of code or 3 different platforms or 3 different systems. This is 1 platform. We’ve rewritten all of our acquisitions, all of our core modules, our Data Cloud and our agents as 1 unified platform, which is how we are delivering not only this incredible functionality but this high level of accuracy and capability. And from this first-hand experience in meeting with these customers around the globe, I can unequivocally tell you that building these agents without a complete integrated platform is like trying to assemble a plane mid-flight; it’s risky, chaotic and not likely to succeed…

…With the shift to AI, it just became clear 18 months ago, we need to hit the accelerator pedal and rewrite all these things onto the core platform because customers are going to get this incredible value by having 1 integrated system, and it scales from small companies to extremely large companies. 

Bookings for Salesforce’s AI products more than doubled quarter-on-quarter in 2024 Q2; Salesforce signed 1,500 AI deals in 2024 Q2; aircraft maker Bombardier is using Salesforce’s AI products to arm sales reps with better information on, and recommendations for, prospects

We’re already accelerating this move from AI hype to AI reality for thousands of customers with amazing capabilities across our entire AI product portfolio. New bookings for these products more than doubled quarter-over-quarter. We signed 1,500 AI deals in Q2 alone. Some of the world’s largest brands are using AI solutions, including Alliant, Bombardier and CMA CGM. Bombardier, the maker of some of the world’s top-performing aircraft, is enabling sales reps to sell smarter by consolidating need-to-know information on prospects in advance of meetings and providing recommendations on how to best engage with them through Einstein Copilot and Prompt Builder.

Salesforce has a new team called Salesforce CTOs that will work alongside customers in deploying AI agents

To help our customers navigate this new world, we just launched a new team called Salesforce CTOs. These are deeply technical individuals who work alongside our customers to help them create and execute a plan for every stage of their AI journey to become agent first companies. 

Salesforce sees itself as customer zero for all its AI products, including Agentforce, and it is deploying its own AI products internally with success; 35,000 Salesforce employees are using Einstein as an AI assistant; Salesforce has already used Slack AI to create 500,000 channel summaries since February 2024, saving 3 million hours of work

We’re continuing our own AI journey internally as a Customer Zero of all of our products with great results. We now have 35,000 employees using Einstein as a trusted AI assistant, helping them work smarter and close deals faster. And since we launched Slack AI in February, our employees have created more than 500,000 channels — channel summaries, saving nearly 3 million hours of work. We’ll, of course, deploy Agentforce agents soon in a variety of different roles and tasks to augment, automate and deliver productivity and unmatched experiences for all employees and customers at scale.

Salesforce will be introducing Industry Toolkit at Dreamforce; Industry Toolkit contains more than 100 ready-to-use AI-powered actions; Industry Toolkit can be used with Agentforce 

At Dreamforce, we’re excited to share our new Industry Toolkit, which features more than 100 ready-to-use, customizable AI-powered actions. All of these actions can be applied to build industry-specific agents with Agentforce.

Salesforce’s management wants to see 1 billion AI agents by FY2026; there are already 100 million to 200 million agents identified in trials

I’ll just give you my own personal goals. So I’m not giving any guidance here. My goal is that by the end of fiscal year ’26 we will have 1 billion agents. Already, just looking at the number of consumers identified in the trials that we have going on, we have like 100 million identified or more. Okay, call it 200 million. But the funny thing is, of course, it’s only 1 agent. But let’s just think of it as a manifestation of all these agents talking to all these consumers.

Salesforce already has a long history of selling non-human consumption-based products; with AI agents, management sees pricing on a consumption basis or on a per conversation basis (at $2 per conversation); management thinks AI agents is a very high-margin opportunity

On pricing. When you think about apps, you think about humans, because humans use apps, though not in all cases. So for example, the Data Cloud is a consumption product. The Commerce Cloud is a consumption product. Of course, the e-mail product, Marketing Cloud, is a consumption product. Heroku is a consumption product. So of course, we’ve had non-human consumption-based products for quite a long time at Salesforce…

…When we look at pricing, it will be on a consumption basis. And when we think about that, we think about saying to our customers, and we have, that it’s about $2 per conversation. So, that is kind of how we think about it, that we’re going to have a lot of agents out there, even though it’s only 1 agent. It’s a very high-margin opportunity, as you can imagine. And look, you have to think about it: these agents are the new website. This is your new phone number. This is how your customers are going to be connecting with you in this new way, and we’re going to be helping our customers to manage these conversations. And a per-conversation charge is probably a good way to look at it, or we’re selling additional consumption credits like we do with our Data Cloud.
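At the stated $2 per conversation, the consumption economics are easy to model. The daily volume below is our own hypothetical input, not Salesforce guidance:

```python
# Back-of-envelope consumption pricing at $2 per agent conversation.
# The conversation volume is a hypothetical illustration, not company guidance.

PRICE_PER_CONVERSATION = 2.00  # USD, per the earnings call

def annual_revenue(conversations_per_day: int,
                   price: float = PRICE_PER_CONVERSATION) -> float:
    """Revenue from one customer's agent traffic over a year."""
    return conversations_per_day * 365 * price

# A customer whose agent handles 10,000 conversations a day:
rev = annual_revenue(10_000)
print(f"${rev:,.0f} per year")  # $7,300,000 per year
```

Since the marginal cost of each conversation is largely compute, the gap between this revenue line and inference costs is what management means by a “very high margin opportunity.”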

Veeva Systems (NYSE: VEEV)

Veeva’s management is seeing the company’s customers appreciate the patience they have displayed in adopting AI; customers started using Veeva’s Vault Direct Data API for AI use cases in 2024 Q2; Vault Direct Data API provides data access 100 times faster than traditional APIs; management thinks that the advantage of providing API access for AI use cases is the possibility of partners developing use cases that management could not even forsee; customers have to pay a fee to turn on Vault Direct Data API and the fee is for covering Veeva’s compute costs; there’s no heavy implementation tasks needed for Vault Direct Data API

When groundbreaking technology like GenAI is first released, it takes time for things to settle and become clearer. That’s starting to happen now. Customers have appreciated our taking the long view on AI and our orientation to tangible value rather than hype. In Q2, our first early customers started using the Vault Direct Data API to power AI and other use cases. The ability to retrieve data 100 times faster than traditional APIs is a major software platform innovation and will be a big enabler of AI that uses data from Vault applications…

… When you make an API like the Direct Data API, you don’t know the innovation you’re unleashing. And that’s the whole point because the data can be consumed so fast and transactionally accurately, use cases that weren’t practical before can become practical. I mean if I step back way back when to designing the first salesforce.com API, I knew it was going to unleash a lot of innovation, and you just don’t know. It’s not predictable, and that’s the good thing…

…[Question] Looking at Vault Direct Data API, how seamless is it for customers to turn it on and start using it? Is it something that needs an implementation? 

[Answer] That is something that’s purchased by the customer, so that is something that is not free for the customers to use. They purchase it. The fee is not that large. It covers our compute cost, that type of thing… 

…After that, no, there’s no implementation. You turn it on, and it’s on. And that’s that.
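The “100 times faster” claim makes intuitive sense once you compare bulk, file-style extraction with paging through a traditional record-by-record API. The back-of-envelope model below uses hypothetical latency and throughput figures of our own, not Veeva benchmarks:

```python
# Back-of-envelope: why bulk extraction beats paging a traditional REST API.
# Latency and throughput figures are hypothetical assumptions, not Veeva numbers.

def paged_api_seconds(records: int, page_size: int = 200,
                      latency_s: float = 0.05) -> float:
    """Each page costs one round trip; total cost scales with page count."""
    pages = -(-records // page_size)  # ceiling division
    return pages * latency_s

def bulk_export_seconds(records: int, rows_per_second: int = 500_000) -> float:
    """One streamed file: cost scales with raw row throughput."""
    return records / rows_per_second

n = 10_000_000
print(paged_api_seconds(n))    # 2500.0 seconds of round trips
print(bulk_export_seconds(n))  # 20.0 seconds to stream the file
```

With these assumed figures, the bulk path is about 125x faster, the same order of magnitude as the speedup Veeva describes: per-request overhead, not raw data volume, dominates the paged approach.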

Veeva’s AI Partner Program is progressing well and has seen 30 AI use cases being developed by 10 partners across Veeva Development Cloud and Veeva Commercial Cloud; the AI use cases in Veeva Commercial Cloud are mostly related to data science while the use cases in Veeva Development Cloud are mostly related to generation of documents and reports; management does not want to compete with the partners that are in the AI Partner Program 

Our AI Partner Program is also progressing well. We now have more than 10 AI partners supporting roughly 30 use cases across R&D and Commercial. We also continue to explore additional AI application opportunities beyond our current AI solutions…

… [Question] You talked about some of the early traction you’re seeing with the AI Partner Program. Can you maybe talk about what are some of the use cases you’ve seen so far?

[Answer] The types of use cases in commercial often have to do with data science. So things like next best action, dynamic targeting, pre-call planning, things like that. And then in R&D, they can be more things like document generation, generate a clinical study report or doing specific medical coding, things like that. So those are the type of use cases…

…In terms of us monitoring that and informing our own road map, I guess there may be some of that. But mostly, that type of innovation really comes from internally our own thinking with our customers. We don’t want to really disrupt our partners, especially when the partners are having customer success. If there’s a major use case that we’re very clear that customers need and for some reason, the ecosystem is not delivering customer success, yes, maybe we might step in there. But I would guess that what we would do would be more holistic, I guess, in some sense and not specifically something a partner would tackle because we’re generally going to have more resources and more ability to sway our own road map than a partner would, and we want to be respectful to the ecosystem.

Zoom Video Communications (NASDAQ: ZM)

Zoom’s management is seeing customers seeking out the AI capabilities of Zoom’s Contact Center packages; Zoom’s management saw the ASP (average selling price) for its Contact Center product double sequentially because of the product’s AI-tier, which comes with higher pricing

We are seeing increased adoption of our advanced Contact Center packages, as customers seek to utilize our AI capabilities to enhance agent performance…

…If you remember, we started with one pricing tier. We eventually added two more, and the AI agent that Eric was speaking about earlier is in the highest tier. We actually saw our ASPs for Contact Center almost double quarter-over-quarter because it’s such a premium feature. And when I look at the Q2 deals, the majority of them were purchasing in one of the top 2 tiers, so all of that is contributing to what I would say is not only expansion in terms of seat count but expansion in terms of value being derived from the product.

Zoom’s AI Companion uses generative AI to produce meeting summaries, live translations, image generation and more; Zoom AI Companion is now enabled on over 1.2 million accounts; management will be upgrading AI Companion as Zoom transitions into the 2.0 phase of AI-enabled work; customers really like Zoom AI Companion; Zoom AI Companion is provided at no additional cost; in Zoom meetings, the No.1 use case of Zoom AI Companion is to create meeting summaries; management is constantly improving the quality of Zoom AI Companion’s meeting summaries; customers are giving positive feedback on Zoom AI Companion

Today, AI Companion enhances an employee’s capabilities using generative AI to boost productivity through features like meeting summary, chat compose, image generation, live translation and enhanced features in Contact Center. As these features have grown in popularity, we are happy to share that Zoom AI Companion is now enabled on over 1.2 million accounts…

…Our progress broadening Zoom Workplace, building out enhanced AI tools for Contact Center and amassing a large base of AI users sets us up well to transition into the 2.0 phase of AI-enabled work. In this phase, Zoom AI Companion will move beyond enhancing skills to simplifying your workday, providing contextual insights, and performing tasks on your behalf. It will do this by operating across our collaboration platform to ensure your day is interconnected and productive…

…Our customers really like Zoom AI Companion. First of all, it works so well. Secondly, at no additional cost, not like some other vendors who have to charge the customer a lot. And in our case, this is a part of our package…

…You take a Meeting, for example, right? For sure, the #1 use case is meeting summary, right? And we keep improving that quality, like the [indiscernible] and the meeting summaries are getting better and better. Like in July, we had another upgrade, quality-wise even better than previous deliveries, right?…

… [Question] One question I had is when you’re looking at Zoom AI Companion, we’ve heard a lot of great things in the field if customers kind of comparing that to other products that are offered out there. Can you kind of remind us about how you guys think about tracking success with the product internally, given that you don’t kind of charge for it directly beyond having millions of people using it?

[Answer] The metrics that we’ve been talking about on here is account activation. So looking at how many — it’s not individual users, it’s actual customer accounts that have activated it… And also they share the stories like how Zoom AI Companion like is very accurate summary, action items are helping their employees’ productivity as well. And yes, a lot of very positive feedback about adopting Zoom AI Companion.

Zoom’s management has intention to monetise AI services for the Contact Center product, but not for Zoom Workplace

[Question] Now that you’re seeing more adoption, Kelly, of Zoom Companion, how do you think about the cost of providing these generative AI features and capabilities? And do you think Zoom could eventually charge on a usage basis for power users of the generally just trying to weigh cost versus revenue opportunities here?

[Answer] I mean, when we launched AI Companion, right, we did not want to charge the customer. However, that’s for the Workplace; for the business services like a Contact Center, all those new offerings, I think for sure, we are going to monetize. As I mentioned in the previous earnings calls, for new solutions or the business services AI, I think we are going to charge; they have their own AI Companion, right? But for the Workplace and our core UC and collaboration offering, we do not want to charge. I really appreciate our AI team’s great effort, right? And their focus on the quality, focus on the cost reduction, and so on and so forth.

AI services are a drag on Zoom’s margins at the moment (as the company is providing a lot of AI services for free now) but management sees them as important investments for growth

[Question] Just on gross margins, like the impact of generative AI and maybe what you can do to alleviate some of that off there.

[Answer] I mean, we’re guiding to 79% for this year, which reflects the prioritization of AI, but also the very strong discipline that we continue to apply. And we are holding to our long-term target for gross margins of 80%. But of course, we think at this point in time, it’s very important to prioritize these investments as they really set us up for future growth.

Zoom’s dev ops team is saving costs for the company to make room for more AI investments

I also want to give credit to our dev ops team. On the one hand, for sure, we are going to buy more and more GPUs, right? And also, to leverage that, our team tried to save money from other areas, fully automated, and so on and so forth, right? So that’s another way for us to save the cost, right, to make some room for AI.

The regulatory environment for AI in the US and Europe has so far had very little impact on Zoom’s business because Zoom’s management has been adamant and clear that it is not going to use customers’ data to train its AI models

[Question] Are you seeing anything in the broad sweep of AI regulation in the U.S. or Europe that you think can dampen innovation?

[Answer] That’s the reason why, when we launched AI Companion, as we already mentioned, we are not going to use any of our customer data to train our AI models, right? And we take customer data very, very seriously, right? And our customers know that; they trust our brand and trust what we’re doing. And so far, I do not see any impact in terms of regulation. And again, AI is moving rapidly, right? So, in EMEA especially, we all look at the potential regulation. But so far, the impact on us, on our business, I think it’s extremely limited. Take meeting summary: it’s a very important feature, customers like it, and we do not use our customer data to train our AI model. So why not keep using the feature? I think there’s no impact so far.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have a vested interest in Adobe, MongoDB, Salesforce, Veeva Systems, and Zoom Video Communications. Holdings are subject to change at any time.

What We’re Reading (Week Ending 15 September 2024)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general.

Here are the articles for the week ending 15 September 2024:

1. The ROI on Generative AI – Tanay Jaipuria

The poster child for this has been Klarna which leveraged AI to elevate their customer support. Their AI assistant has taken over the work of 700 employees, reducing resolution times from 11 minutes to just 2 minutes while maintaining high customer satisfaction levels…

...Microsoft casually dropped that they too are expecting to save hundreds of millions of dollars a year on call centers after adopting Generative AI.

“Dynamics with Gen AI built in is sort of really … the category that gets completely transformed with Gen AI, contact centers being a great example. We, ourselves, are on course to save hundreds of millions of dollars in our own Customer Support and Contact Center Operations. I think we can drive that value to our customers”…

…We’re also hearing examples of measurable, tangible benefits from enterprises, as Amazon shared about their software development assistant Q which has saved them over 4,500 developer years in a recent code migration task…

…”With Q’s code transformation capabilities, Amazon has migrated over 30,000 Java JDK applications in a few months, saving the company $260 million and 4,500 developer years compared to what it would have otherwise cost. That’s the game changer. And think about how this Q transformation capability might evolve to address other elusive but highly desired migrations.”…

…eBay launched a new AI-assisted selling flow, and are already seeing improvements in customer satisfaction as well as faster time to list and get value for Sellers…

…YUM Brands is enhancing customer experiences at Taco Bell by rolling out voice AI driven drive-through systems. This technology is not only improving customer satisfaction but also boosting team member productivity, and the results are so promising they are accelerating their roll-out timelines…

Manulife is an example of a company already seeing large ROI in using AI to assist salespeople…

…”We’re using GenAI and machine learning models to make it really easy for agents to understand customer opportunities but also to generate these personalized communications at the click of a button to help them engage with more customers more often.

In our first 2 weeks live, about 68% of our agents had already used the new GenAI capabilities. And in July, we will be broadening that user base to about 2,000.

Based on our analysis in Singapore, we anticipate a 17% uplift in repurchase rates for our customer base when this is fully rolled out to all of our agents.”…

…Rocket Mortgage is utilizing AI to automate the transcription of client calls and complete mortgage applications…

…”Now the Rocket Logic Assistant seamlessly generates over 300,000 detailed transcripts every week from outbound calls. It supports over 100 data points on mortgage applications saving our bankers from inputting tens of millions of data fields each week.”…

…Walmart has harnessed generative AI to enhance its product catalog, improving the quality of over 850 million pieces of data…

…”We’ve used multiple large language models to accurately create or improve over 850 million pieces of data in the catalog. Without the use of generative AI, this work would have required nearly 100x the current head count to complete in the same amount of time.”…

Mastercard is leveraging the new advances in Generative AI to enhance fraud detection, achieving a 20% increase in accuracy. 

2. The Agent Era – Patrick O’Shaughnessy and Bret Taylor

Patrick

It’s such an interesting story because I think it becomes ultra relevant in today’s world. And you hear a lot about this, maybe the mythical 10x engineer, the 100x engineer, 1,000x engineer, the leverage available to one person with a growing tool kit.

And maybe that’s a great excuse to bridge the conversation into agents. I think everyone listening will have heard that term and maybe have thought about it a little bit, have gotten excited about the prospect of some sort of autonomous agent doing work on their behalf or their company’s behalf. But it would be great for you to ground us in your definition of what one of these things is, if this becomes a really critical part of the world of technology in the next year or two. I think it would be great for everyone just to have a level-set, simple definition from your perspective on what an agent is and does.

Bret

I’ll start with maybe the academic flavor of this, but then I’ll move into what I think is maybe the more — what I believe is the more relevant definition. But agent is like the word app: there’s not one definition, and I think it will be a noun that is quite meaningful in the age of AI. The word agent in the context of AI comes from the word agency; essentially, an agent is a system that can reason and take action autonomously, is the way I think about it. And a system that is agentic is one where software and AI can reason and make decisions and take action without human intervention, which is really exciting but something that is relatively new, though the idea is certainly not new.

I think the effectiveness of reasoning with AI systems has become so meaningfully better over the past couple of years that I think the concept is — like many parts of AI, the ideas are not new, but the effectiveness is, and so we’re living in an era of agents now.

In practice, I think the word agent, just like the word app or site in the age of the web, will become important to all of us. So one agent that I think is important is what my company Sierra does, which is your company’s conversational AI. And so just imagine you’re a retailer. I think you’ll put as much care and attention into your AI agent as you do your website or your mobile app. Or if you’re a bank, and you’ll put as much care and attention to your AI agent, which can help a customer look up the balance of their checking account or perhaps be an interface to your investment banking arm or wealth management arm. Or if you’re a streaming service, your agent might help people sign up for a plan or upgrade or downgrade their subscription, as an example.

In that case, an agent is something like a website or mobile app that’s branded and is yours. And there are parts of it that are about agency and sort of the AI definition of the word. But more importantly, it’s your thing. It’s your digital asset. It becomes the digital manifestation of your brand.

And that’s what my company Sierra does. And we think that’s one really important part of an agent. Just like in 1995, the way you existed online was to have a website, we think in 2025, the way you will engage with your customers will be your AI agent, and we think it’s a really important new category.

But then taking, okay, what are the other types of agents out there? One will be, I’d like to think of them as persona-based agents. They’re internally facing. They do a job. You’ve talked about software engineering. I think there’ll be software engineering agents that will work to produce software. I was looking at a start-up called Harvey, I think, that’s making a legal LLM, which is super interesting. And I think across many job functions, there will be AI agents that produce the output of a — whether it’s a paralegal or a software engineer or an operations analyst, things like that. So that’s one.

So there’s your company’s agent, there’s a persona-based agent that does a job, and then the third one — category is probably personal agents. So this is the agent that will work on your behalf, whether it’s helping you plan a vacation or organize your calendar or perhaps triage your inbox and things like that. I think technically, they’re all similar, but my guess is they’re different enough in what job they accomplish for you that there’s — probably different companies will build those different categories of agent.

If you’re building a software to be a personal assistant agent, the breadth of systems you have to integrate with is infinite because different people use different calendars and different this and different that, and there’s lots of interesting investment into that. If you’re building a coding agent, it’s a much more narrow use case but very deep, and you’re probably evaluating it based on benchmarks of the effectiveness of the software produced and the robustness of the software it produces…

…Patrick

What do you think are the next most important unlocks for the power of these agents? You mentioned their access to tools, access to the Internet. I’ve heard people talk about the ability to have some sort of stored memory about you, the customer or the specific customer, or just memory in general that doesn’t just live inside of a context window that’s always re-fed in or something.

Are those the three things that we need to unlock the next tier of productivity out of agents? Are there other things that you and Sierra are focused on? I’d love to get down to the nitty-gritty capabilities and roadblocks that you’re thinking about and working on that might make these things as ubiquitous as you think they will be.

Bret

Yes. I’ll start with the vantage point of Sierra. We help companies build customer-facing AI agents. Today, if you’re setting up a new Sonos speaker, you can chat with an AI agent they’ve built on our platform to help you set it up. If you’re a SiriusXM subscriber, you can chat with Harmony, which is their AI agent they’ve built on our platform. And if you’re a WeightWatchers member, if you click on the 24/7 live coaching tab in their app, that’s an AI agent they’ve built on our platform.

One of the things that I think is a nuanced problem that is not strictly technical in nature is just the act of actually designing conversational customer experiences is a relatively new discipline. I remember in the early days of the Internet, most websites looked like DVD intro screens, like they’re very graphical, there’s four big buttons. It’s really interesting to go down the Wayback Machine and look at them.

And I would say it took a number of years to evolve into sort of the design idioms that we recognize with websites today. And now if you go to a retailer, they’ll have a hamburger menu on the top left, and the way you filter through items and these — they’re sort of emergent from people’s lived experiences, both designing and using websites.

And now you can talk to almost any web developer, and they’ll not only choose similar technologies to make a website, but even the design process in Photoshop or Figma to design a website follows sort of established practices, some of which are obvious and some of which are actually subtle, like why did this become the way these things are done; it’s the cumulative experience we have building with them.

The difference between a website or a mobile app and an AI agent is both the breadth and non-determinism of AI agents. So if you have a menu on a website, you can control what links are there, and it’s essentially multiple choice: here are the options available to you. If you have an AI agent with a free-form text box, people can type whatever they want into that. And so your customer experience is defined by you, but it’s also defined by your customers, by what they write in there.

It reminds me — going back to my web analogies here, it reminds me of going from Yahoo Directory to Google Search. Rather than having a taxonomy of everything available, it’s just free form, and there’s a much longer tail of queries in Google than there was in Yahoo! because of the expressiveness of a search box versus a directory.

And I think that that’s one of the really interesting and, I think, exciting opportunities with conversational AI for customer experiences: it’s a really authentic way to actually hear from your customers what they want from you. And so it sort of stands to reason, your website was the rails on which your customers communicate with you, and this is a free form that I think is much more expressive. And we’ve had multiple customers learn things about their customers that they didn’t expect by providing this really free-form experience.

And then similarly, I think the other really interesting thing when I mentioned non-determinism is the word agent comes from agency, and it’s really how much creativity do you want to give your AI in interacting with your customers. I think if you start from a position of control, you can say, I want to put guardrails around everything, but then your conversational customer experience is somewhat robotic. You’ve essentially defined the multiple-choice options of your customers’ experience. If you give your agent too much agency, in the extreme case, it will hallucinate, but in the more practical case, it just might not protect your brand in the way that you want it to.

And I would say that design question is both a technology question, which obviously we’re quite invested in solving, and I’m really excited about some of the work we’ve done there, but there’s a deeper question here, too, that’s actually a philosophical branding and design question as well. And what we’re trying to do at Sierra is not necessarily predefining answers to those questions. I think every company and every brand will have a different perspective on what’s correct for their brand experience but provide a platform that’s powerful and expressive enough. Whatever your answers are personally to that question, you can build your agent on Sierra.

Patrick

It’s so interesting to think about the customer experience going to a website where I buy shoes or something. I think one of your first customers was flip-flops, and there was a funny story around that, but I’m going to buy a pair of sandals, let’s say, on a website. And rather than click around, I just describe what I want and I can imagine like another pane on the right just starts showing me stuff. And then maybe I check out through this same thing as well, and that’s a simple version of tooling or ability to take action.

I’m curious what the hardest parts for you have been to build. It’s quite technically daunting to even think about how to build something like this, let alone one that’s adjustable and tunable to my specific brand. So talk a little bit about how hard of a technical challenge this is for Sierra, like the degree of difficulty you’ve encountered relative to, say, your expectation.

Bret

Yes. It’s a really wonderful question. I think that generative AI broadly is a technology with which it’s very easy to make a demo and very hard to make an industrial-grade system. And I think that’s the area of technical challenge that we’re really trying to dive into. And I think it’s one thing to say that this system does the correct thing 90% of the time. And it’s really an inkblot test whether 90% is a really good number or a horrible number.

And it also depends on the process. And so if it’s a consumer application that was helping you with your homework, maybe 90% is decent. If it’s operating a revenue-impacting part of your business or there’s a compliance concern, it’s absolutely unacceptable to be wrong 10% of the time.

And so a lot of the challenges that we’re facing are, we like to say that software systems are moving from rule-based to goals- and guardrails-based. And it’s a very different mental model for building software systems. Rule-based systems, if you think about just the software development life cycle that’s evolved over the past 20 years, it’s really about how you make more and more robust rule-based systems, how do you ensure that the same input produces the same output, that it’s reliable, that it’s stable, and there’s a lot of true innovation in the way we make software to make them more secure and robust.
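The rule-based versus goals-and-guardrails contrast can be sketched in a few lines of Python. Everything below (the rules table, the guardrail checks, the fallback message) is a hypothetical illustration of the pattern, not Sierra’s actual design:

```python
# Rule-based: the same input always maps to the same scripted output.
RULES = {
    "refund": "Please fill out form R-1 to request a refund.",
    "hours": "We are open 9am-5pm, Monday to Friday.",
}

def rule_based_reply(intent: str) -> str:
    return RULES.get(intent, "Sorry, I can't help with that.")

# Goals-and-guardrails: the model's reply is free-form, but every draft is
# checked against explicit constraints before it reaches the customer.
GOALS = "Resolve the customer's issue politely."
GUARDRAILS = [
    lambda reply: "guarantee" not in reply.lower(),  # no promises we can't keep
    lambda reply: len(reply) < 500,                  # keep responses concise
]

def guarded_reply(draft_from_model: str) -> str:
    if all(check(draft_from_model) for check in GUARDRAILS):
        return draft_from_model
    return "Let me connect you with a human agent."  # safe fallback

print(rule_based_reply("refund"))
print(guarded_reply("I guarantee a full refund today!"))  # trips a guardrail
```

The rule-based path is predictable but brittle; the guarded path lets the model improvise while bounding what can actually be said.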

Now if you have parts of your system that are built on large language models, those parts are really different than most of the software that we’ve built on in the past. Number one is they’re relatively slow. To generate a page view on a website takes nanoseconds at this point (might be slightly exaggerating, down to milliseconds), but even with the fastest models, it’s quite slow in the way tokens are emitted.

Number two is it can be relatively expensive. And again, it really varies based on the number of parameters in the model. But again, the marginal cost of that page view is almost zero at this point. You don’t think about it. Your cost as a software platform is almost exclusively in your head count. With AI, you can see the margin pressure that a lot of companies face, particularly if they’re training models or even doing inference with high-parameter-count models.
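The marginal-cost point can be made concrete with a back-of-envelope calculation. All the numbers below are assumptions chosen for illustration, not actual vendor pricing:

```python
# Assumed unit costs (illustrative only):
page_view_cost = 0.00001    # serving one cached page view: ~$0.00001
tokens_per_reply = 500      # a typical LLM agent reply
cost_per_1k_tokens = 0.01   # assumed inference price per 1,000 tokens

llm_reply_cost = tokens_per_reply / 1000 * cost_per_1k_tokens
print(f"One LLM reply costs ~{llm_reply_cost / page_view_cost:.0f}x a page view")
```

Even with cheap inference, each reply carries a real marginal cost, which is the margin pressure Bret describes.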

Number three is they’re nondeterministic fundamentally, and you can tune certain models to more reliably have the same output for the same input. But by and large, it’s hard to reproduce behaviors on these systems. What gives them creativity also leads to non-determinism.
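The tuning Bret mentions usually means lowering the sampling temperature. A minimal sketch of temperature sampling over next-token scores (a toy stand-in for a real model’s logits): at temperature near zero, sampling collapses to the single top token, which is deterministic; higher temperatures spread probability over more tokens, which is creative but nondeterministic.

```python
import math
import random

def sample_token(logits: dict, temperature: float, rng: random.Random):
    """Sample one token from {token: score} using temperature scaling."""
    if temperature <= 1e-6:                     # greedy: always the top token
        return max(logits, key=logits.get)
    # Softmax with temperature: lower T sharpens, higher T flattens.
    weights = {t: math.exp(s / temperature) for t, s in logits.items()}
    r = rng.random() * sum(weights.values())
    for token, w in weights.items():
        r -= w
        if r <= 0:
            return token
    return token                                # guard against rounding

logits = {"yes": 2.0, "no": 1.0, "maybe": 0.5}
rng = random.Random(0)
print(sample_token(logits, 0.0, rng))   # "yes" every time (argmax)
print(sample_token(logits, 1.5, rng))   # varies with the random draw
```

This is the trade-off in miniature: the same knob that buys reproducibility (temperature 0) removes the variation that reads as creativity.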

And so this combination of it, we’ve gone from cheap, deterministic, reliable systems to relatively slow, relatively expensive but very creative systems. And I think it violates a lot of the conventions that software engineers think about — have grown to think about when producing software, and it becomes almost a statistical problem rather than just a methodological problem.

And so that’s really what we’ve tried to solve. We shared it on our website, but we have a process we call the agent development life cycle; the name comes from the software development life cycle, as in, here’s what you should do with these agentic platforms. We’ve also developed a lot of unique technology to make these systems more robust, having one AI model supervise another AI model and layering different models on top of each other to produce statistically more robust results.
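The "one model supervises another" pattern can be sketched as a small loop. Here `generate()` and `supervise()` are hypothetical placeholders for calls to two different models; the structure, not the placeholder logic, is the point:

```python
def generate(query: str) -> str:
    # Placeholder for the primary model producing a draft answer.
    return f"Draft answer to: {query}"

def supervise(query: str, draft: str) -> bool:
    # Placeholder for a second model grading the draft
    # (is it grounded? on-policy? brand-safe?).
    return "Draft answer" in draft          # toy acceptance check

def robust_answer(query: str, max_attempts: int = 3) -> str:
    """Retry generation until the supervisor accepts, else fall back."""
    for _ in range(max_attempts):
        draft = generate(query)
        if supervise(query, draft):
            return draft
    return "Escalating to a human agent."   # fall back rather than guess

print(robust_answer("Where is my order?"))
```

Layering a checker over a generator trades latency and cost for a statistically lower error rate, which is exactly the industrial-grade-versus-demo gap described above.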

And then as importantly, we’ve developed ways that folks who aren’t experts in AI can express the behavior that they want in their agent. You shouldn’t have to be an AI expert to make an agent just like you shouldn’t have to have a PhD in computer science to make a website. I don’t think we’re there yet, but that’s really what we’re trying to solve.

And broadly speaking, I would say, on the spectrum that has fundamental research institutions like OpenAI at one end, we’re not that; we’re applied. We’re really thinking about how do we engineer on top of these foundation and frontier models to produce robust, reliable agents for our customers.

Patrick

I love the title of this one Kevin Kelly book, What Technology Wants, and I’m curious what agents want. If I’m a customer, I’m a prospective customer, and I want to go work with Sierra to make the best possible version of a conversational agent for my customers to use, what can the companies provide that make the agent do the best job?

Bret

Yes, it’s a great question. I would say that there’s two types of knowledge that I think really produce a really robust agent. One is the factual knowledge of your company. This just grounds the agent so that it won’t just make something up.

There’s a pretty widely-used technique called retrieval augmented generation in AI right now that effectively means rather than relying on the knowledge encoded in the model to answer questions, you present the model with knowledge, maybe stored in a knowledge base or a database and say, “Hey, summarize the content from here. Don’t rely on the information you’ve been trained on.”

That has been an effective technique for two reasons. One is that it means that you don’t necessarily need to train or fine-tune a model to use it with proprietary data, which is a much cheaper deployment methodology. And it also can be effective at preventing hallucinations as well because you’re effectively — rather than relying on the AI to determine what it knows or doesn’t know, you present the AI with the knowledge that it’s allowed to know, a simple way of putting it.
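The retrieval-augmented generation idea reduces to two steps: find the relevant snippet, then instruct the model to answer only from it. A toy sketch, using naive word overlap where a real system would use embedding search (the knowledge base entries are invented for illustration):

```python
KNOWLEDGE_BASE = [
    "Returns are accepted within 30 days with a receipt.",
    "Standard shipping takes 3-5 business days.",
    "Gift cards never expire and are not refundable.",
]

def retrieve(question: str) -> str:
    """Pick the snippet sharing the most words with the question."""
    q_words = set(question.lower().split())
    return max(KNOWLEDGE_BASE,
               key=lambda doc: len(q_words & set(doc.lower().split())))

def build_prompt(question: str) -> str:
    """Ground the model: answer from the retrieved context, not training data."""
    context = retrieve(question)
    return ("Answer using ONLY the context below. If the answer is not in the "
            f"context, say you don't know.\n\nContext: {context}\n\n"
            f"Question: {question}")

print(build_prompt("How long does shipping take?"))
```

The prompt is then sent to the model; because the model is told what it is "allowed to know," it has far less room to invent an answer.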

And that’s factual knowledge. And I would say that’s necessary, but woefully incomplete because that would enable your AI agent to answer questions, but it wouldn’t necessarily enable it to orchestrate a complex process or take action on your customers’ behalf.

The other type of knowledge is procedural knowledge. We have a Sonos speaker. It stops working. What would the best Sonos engineer ask you and do to figure out whether it’s a problem with your hardware, a problem with your Sonos app or a problem with your Wi-Fi? Like what is the process by which you do that.

If you’re a subscription streaming service, what is the process of processing an upgrade or downgrade to your membership? Are there different offers available based on your membership level? Do you have a promotion running? What’s been the most effective technique to keep people, a subscriber for a long period of time?

This is all the stuff that, if you were a person, you’d be an expert in. And so it means coming in with not only here’s the factual knowledge for our company, but here’s the processes that represent our greatest customer experience. What does the best salesperson do? What does the best customer service person do? The most effective marketer at your company, how do they describe your products? And that’s often where we work with our customers to improve when they deploy AI.

And then the third thing is just access to the underlying systems themselves. I think AI agents shouldn’t just be about answering questions or having a conversation; they should actually be able to take action on your behalf, whether that’s a retailer processing a return, a subscription service changing your level of membership, or connecting to the telemetry system of a consumer electronics company. So we can say, “Hey, we know your device phoned home. You’re connected. We now figured out this other problem.”

Or even with something like SiriusXM sending a signal down from a satellite to refresh your radio if your radio stopped working. So three ingredients, factual knowledge, procedural knowledge and systems integrations, I think, are the three key ingredients. And then with the right methodology, your agent can do anything that a person could do on a computer, which is just an incredible opportunity for customer experiences.
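The third ingredient, systems integration, is what turns an answer into an action: the agent maps a resolved intent onto a call into a backend system. A minimal sketch with an invented tool registry and order record (none of this reflects any real retailer’s API):

```python
# Hypothetical backend state the agent is allowed to act on.
ORDERS = {"A123": {"status": "delivered", "returnable": True}}

def process_return(order_id: str) -> str:
    order = ORDERS.get(order_id)
    if order is None:
        return "I couldn't find that order."
    if not order["returnable"]:
        return "That item isn't eligible for return."
    order["status"] = "return_started"      # the actual side effect
    return f"Return started for order {order_id}."

TOOLS = {"process_return": process_return}  # registry the agent may call

def act(intent: str, **kwargs) -> str:
    """Dispatch a resolved intent to a registered tool, or decline."""
    tool = TOOLS.get(intent)
    return tool(**kwargs) if tool else "I can only answer questions about that."

print(act("process_return", order_id="A123"))
```

Everything dangerous lives behind the registry: the agent can only take actions the integrator has explicitly exposed, which is one concrete form of the guardrails discussed earlier.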

3. Here’s What Happens When Credit Markets Go Dark – Joe Weisenthal, Tracy Alloway, Jared Ellias, and Elisabeth de Fontenay

Joe (11:53):

You spell out this evolution of the debt markets and the historical things you’re taught in law school about the dangers of single lenders. We’ve talked to people in the industry and they have their explanations for why this particular market has boomed. But from your research, what would you say are the drivers of this? Or when you talk to people, what problems does the private credit market solve for them?

Elisabeth (12:19):

The interesting thing about this is that there’s multiple stories going on at the same time. So one is that, this is just actually substituting for a lot of the activity that banks did because the banks, ever since the financial crisis, have been really constrained for a lot of reasons. One, they’ve primarily been constrained because of regulation, sort of regulation designed to discourage them from making risky loans and to push them, you know, toward diversification in their portfolio, and so on. And just their evolving model of doing business, that they prefer to be sort of the middleman and get some fees rather than lend directly. [There are] all kinds of reasons why banks have retreated from particularly the lower middle market, but also all the way to the largest companies. A second story is just that there’s been too much bank regulation. So, I’m not going to take a position on whether that’s true or not, but that bank regulation is stifling the banks and they can’t really lend and so on.

A third story is one that we find really interesting and appealing, which is that, it may just be that it never really made all that much sense to fund loans using bank deposits. That essentially, you have a very short-term liability, which is customer deposits, and very long-term assets. So some of these loans, of course, are multi-year loans. And that’s just a fundamental mismatch that banks have always struggled with and that bank regulation has always struggled with. And this is a really nice, neat solution to that. And the reason it’s showing up now is that, thanks to sort of loosening of some of the securities laws and other things, it’s finally the case that you can get these investment funds that are big enough to actually take over the role of banks. And for them, the sort of positive side of private credit is that you now have a better match between the funding source, which is you have these big institutional investors putting capital into private credit funds that is locked in for a number of years, and you’re matching that really well against the loans that are also multi-year. So in some sense, it’s actually a better fit than banks for financing this type of loan… 
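The maturity-mismatch point can be reduced to a toy comparison: a bank funds a five-year loan with deposits that can leave any day, while a closed-end private credit fund locks investor capital for longer than the loan. The figures below are invented purely to illustrate the mechanism:

```python
loan_maturity_years = 5

bank_funding_lockup_years = 0   # deposits are withdrawable on demand
fund_capital_lockup_years = 7   # committed capital in a closed-end fund

def uncovered_years(funding_lockup: float) -> float:
    """Years of the loan left exposed if funding can be pulled when lockup ends."""
    return max(loan_maturity_years - funding_lockup, 0)

print(uncovered_years(bank_funding_lockup_years))  # classic run risk
print(uncovered_years(fund_capital_lockup_years))  # funding outlasts the loan
```

In the bank case the entire loan term is exposed to a funding run; in the fund case the committed capital outlasts the loan, which is the "better fit" Elisabeth describes.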

...Joe (20:43):

It sounds pretty good to me. Okay, so there are fewer legal fees, less creditor-on-creditor violence, asset-liability matching, the better user experience. So what’s the catch? I don’t see any problems.

Elisabeth (20:57):

One potential problem is, of course, these are, in some cases, absolutely massive loans. And so you do lose diversification benefits. These are very risky investments. I would say, the private credit structure has a partial solution to that problem, which is that, the investors themselves in a private credit fund oftentimes are so massive themselves that they really don’t lose diversification, which is to say, their portfolios are so large that they can make this enormous investment in one private credit fund because that’s a tiny piece of their portfolio. So that’s one downside of private credit. The other, of course, is the absence of trading. So before, you had pretty good signals of what your position was worth. There were lots of syndicated loans that had pretty active trading and there were indices tracking all of this. The [Loan Syndications and Trading Association] LSTA provides lots of data on the loan market, and, of course, the bond market is public in terms of the pricing there. So exit is always going to be a concern in this market, and I don’t think this market really has been truly tested yet. So we’ll have to find out. But that illiquidity can be an issue depending on what kind of investor you are and what your expectation is for getting out of these things…

…Tracy (30:14):

Just to play devil’s advocate for a second, I think this is something you actually deal with in the paper, but one of the things you hear from people in the private credit industry is that, ‘Oh, well, if you’re getting funding from a private entity, maybe a single lender or maybe a club of lenders but it’s a smaller group than you would have in the public market, maybe there’s greater potential for working out your issues if you get into trouble. So you can renegotiate your debt with a smaller group of creditors and maybe they know your business better than like a big fund that is buying pieces of all these different types of bonds and things like that.’ What’s your response to that argument? This idea that, well, private credit actually allows you to have more room for workouts or maybe even stave off bankruptcy for longer?

Jared (31:09):

So, I guess my answer is that, that all sounds great, but it’ll depend. And it’s hard to really understand which way any of these forces cut. The one thing that’s clear cut, that’s important, is, we’re losing the claims trading markets. Like, that’s just going to look a lot different. Like, the active market in the claims of Chapter 11 debtors, when that debtor is a private credit funded firm. But, as to the question of, ‘Well, you know, aren’t these private credit lenders smarter, more versatile, more nimble, able to commit capital? And won’t that be good for companies?’ You know, at the end, it depends. So something you worry about is, well, maybe private credit lenders will have incentives not to adjust their marks on their books and instead just to do ‘amend and extends’, and just keep loans going when the company really needed to liquidate or should have filed for bankruptcy sooner.

Think about how different the GM bankruptcy would’ve been had they filed for bankruptcy in like 2005 versus 2009 when their business had already eroded so much. So we think of that erosion as something that limits reorganization options. And it’s not necessarily obvious how private credit interacts with that. Because private credit lenders have their own incentives and maybe their incentives are to say, ‘Look, we make loans to sponsor backed companies and if the sponsor wants to continue, we’re going to keep doing that because we really want to participate in their next deals.’ Or they could say like, ‘Let’s pull the plug on these things earlier.’

So something that I’ve heard from lawyers working in this space is that when private credit lenders replace like your mid-market banks, like your Citizens and that kind of bank, when you have like a private credit lender with a $30 million loan that might have been done by a syndicate of two regional banks, the private credit lenders are much more aggressive and much more willing to pull the plug on the company and to own the asset than that bank might have been. But the world could look very different for larger companies, where private credit lenders might be easier for companies to do workouts with. So it’s really hard to tell. But I’m certainly a bit skeptical of the idea that all of this is unidirectional and that private credit is just better in every way for everything. It’s different and there’ll be different pros and cons and we’ll learn more about them, and the law will adapt and hopefully deal with some of the ways in which the incentives of private credit lenders distort bankruptcy outcomes.

Tracy (33:28):

Since you mentioned GM, could you maybe talk about another specific example of a liquidation playing out a bit late, as you describe it? I’m still salty over the collapse of Red Lobster, which you mentioned in your paper. So could you talk a little bit about that one and what it tells us about private credit?

Jared (33:47):

Sure. So, something that has been the case over the past few years is you’ve had private equity owned restaurants and retailers that just ended up doing quick liquidations after stalling for a very long time. Red Lobster is really interesting. Red Lobster had been struggling for a little while and then Fortress Investment Group, which was its private credit lender, came in and took over the company and basically just owned the asset very quickly. And something that is so interesting about that is that, traditionally, other lenders would’ve been a lot more cautious about doing that, because other lenders are very cognizant of what we call ‘lender liability’ and this line of law that suggests that you shouldn’t, if you’re a lender, play too much of a role in business decisions of companies that you lend to.

And like, there’s an example of like a private credit lender just behaving in this really aggressive way, which is interesting. Like, again, it’s hard to tell exactly what’s going to happen, but certainly that example doesn’t fit well with the story of, well, you know, the private credit lender is just like the banker and you know, it’s your corner bank in 1925, who’s going to work with you on your farm. The answer is, maybe some of the time that’s the story, but other times, you’re dealing with a very sophisticated party who may have different incentives and be worried about different things than traditional bank lenders or investors in the broadly syndicated market.

4. Flash Crashes Are Getting Faster – Ben Carlson

In the spring of 1962, the stock market was already in the midst of a double-digit correction. Then on May 28, there was a flash crash, sending stocks down nearly 7% in a single day. It was the biggest one day sell-off since the Great Depression…

…It’s becoming clearer by the day that last Monday’s stock market swoon was also a flash crash. As of August 5, the S&P 500 was down more than 6% for the month. It’s now positive in August…

…Flash crashes happened in the 1920s, they happened in the 1960s and they happen today.

The biggest difference between now and then is the interconnected nature of the global markets. You have computer and algorithmic trading. Information flows at the speed of light. Every piece of economic data is parsed in real-time with a fine-tooth comb.

Overreactions can happen much faster now.

Just look at the biggest gap downs over the past 40+ years:

This chart shows the biggest difference between the opening price of the stock market and the prior day’s close. All of them have occurred this decade outside of the 1987 crash…

…We are likely to see more of these flash crashes in the future due to a combination of increased leverage in the system, globalized markets and computer trading.

The hard part for investors is that it’s now easier to lose control during these types of market events. You don’t have to call your broker on the phone to place a trade. You can change your entire portfolio on your phone with the push of a button.

Just because markets are getting faster does not mean your decisions must be made faster.

5. Gaining Currency – Rachel Cheung

In its effort to cement its role as an innovation powerhouse, China’s most ambitious technological debut was also its most controversial: The digital yuan was rolled out as the legal tender of choice for the Olympic games. Instead of cash or Visa (the corporate sponsor that had dominated the sports event for three decades), visitors were encouraged to exchange foreign currencies for digital yuan at automated teller machines and to pay digitally through the e-CNY app on their phones or through a card that can be used offline…

…Yet, despite all the attention, the launch of the digital yuan largely fell flat. The COVID-19 pandemic meant Olympic visitors were confined to “bubbles” with little opportunity to travel, shop and dine out, and very few foreigners chose to use the digital yuan over their credit cards. Beijing saw just $315,000 in digital yuan processed every day over the course of the games — a small fraction of the usual revenues at the Olympics. At the 2008 Olympics in Beijing, for instance, the city generated roughly $264 million per day…

…But while China acknowledged its Olympic failure, it has also quietly doubled down on the digital yuan, including a big push to drive adoption. Last year, several cities began paying civil servants and collecting taxes in digital yuan. Jiangsu province saw the most recorded transactions in the country after it gave away 30 million yuan ($4.18 million) in digital “red envelopes.” And this past May, the digital yuan expanded for the first time outside of mainland China when it became available for use in Hong Kong. Though there is no timeline for a nationwide launch yet, China has rolled out pilot schemes in 26 cities and 17 provinces since 2019.

The efforts have paid off. In a press briefing last week, the PBOC announced that total transactions reached 7 trillion yuan ($982 billion) in June — a four-fold jump since last June.

Digital yuan usage is still only a fraction of China’s $40-trillion payment market, of course. The total number of e-CNY wallets opened — 120 million as of last July — also trails behind that of Alipay, which had over a billion users by 2020 and recorded $118 trillion worth of transactions in one year alone.

But as Beijing continues to crack down on its fintech giants, it is creating room for the digital yuan to rise. In fact, officials see the transition to digital currency as both necessary and inevitable. According to Yi Gang, former governor of the PBOC, the current moment of transition is not unlike that of the Ming Dynasty, when the government started taking tax payments in silver instead of labor and grains. China’s currency has evolved with time, he said during a speech at Fudan University in April, and “the digital yuan is no exception.”…

…Officials are also trying to expand the scope of e-CNY beyond consumer retail transactions. The Bank of China, for instance, has tested the use of “smart contracts” for afterschool programs in Chengdu of Sichuan province: Parents can pay a deposit in e-CNY to educational institutions, and the latter only receives the money after the lessons are taken.

These business-to-business and government programming applications could be a “game changer,” according to Warwick Powell, a senior fellow at Taihe Institute, a Beijing-based think tank, because they “ensure that the provision of certain funds can only be used for certain activities.”
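The deposit-then-conditional-release mechanism described above is essentially an escrow pattern. Here is a minimal sketch of that logic in Python; all names and the per-lesson release rule are illustrative assumptions, not the actual e-CNY smart-contract design:

```python
class TuitionEscrow:
    """Illustrative escrow: a parent's deposit is released to the
    school only as lessons are actually delivered (hypothetical model)."""

    def __init__(self, deposit: int, lessons_total: int):
        self.deposit = deposit            # amount locked, in e-CNY cents
        self.lessons_total = lessons_total
        self.lessons_taken = 0
        self.released = 0                 # amount already paid out

    def mark_lesson_taken(self) -> int:
        """Record one delivered lesson and release its share of the deposit."""
        if self.lessons_taken >= self.lessons_total:
            raise ValueError("all lessons already delivered")
        self.lessons_taken += 1
        per_lesson = self.deposit // self.lessons_total
        self.released += per_lesson
        return per_lesson

    def refundable(self) -> int:
        """Funds still locked; refundable to the parent if lessons stop."""
        return self.deposit - self.released


escrow = TuitionEscrow(deposit=100_000, lessons_total=10)  # 1,000 yuan
escrow.mark_lesson_taken()
print(escrow.released, escrow.refundable())  # -> 10000 90000
```

The point Powell makes follows directly from this structure: because release is tied to a verifiable condition, the funds can only ever flow to the activity they were earmarked for.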

Yet that same function triggers concern for others. For instance, although some local governments and banks have offered loans in e-CNY, companies are reluctant to take them, says Yang You, a finance professor at University of Hong Kong. “The nature of e-CNY is that a policymaker can generate a loan and see where it flows to,” says You. But companies, he notes, would much prefer non-traceable loans, despite repeated assurances from the People’s Bank of China that it will not hold information against them…

…Instead, the PBOC says the digital yuan follows a principle of “anonymity for small value and traceable for high value” as a way of striking a balance between privacy protection and combating criminal activities, such as tax evasion and money laundering. The e-CNY wallet, for instance, requires users to undergo a more complex verification process in order to unlock higher transaction limits…

… If anything, the search for an alternative to the U.S.-backed Swift, the global messaging network for the banking system, has gained momentum since the U.S.-led sanctions on Russia.

“China has used the sanctions as a reason to advance the cause of de-dollarization,” says Elizabeth Economy, a senior fellow at the Hoover Institution at Stanford University and recent advisor to the Department of Commerce. “It has made the case that the United States is weaponizing the dollar, hence other countries should begin to trade in their own currencies. It’s actually a deft diplomatic move on the part of China.”

According to the Bank for International Settlements (BIS), a survey of 86 central banks last year showed a sharp uptick in experiments with “wholesale CBDC” — transactions between banks and other financial institutions, rather than consumers and businesses. In October, for instance, the e-CNY set a new milestone: At the Shanghai Petroleum and Natural Gas Exchange, the state-owned PetroChina used digital yuan to purchase a million barrels of oil from an undisclosed seller.

“There’s still a conversation about the e-yuan [for domestic retail transactions], but there’s more discussion about a regional payment system,” says Victor Shih, an associate professor of political economy at the University of California. “An alternative to Swift potentially has more legs.”

The oil purchase seems to be a one-off so far, but a new project called mBridge hopes to make such transactions routine. It is a collaborative effort between the “innovation hub” of BIS and the central banks of five jurisdictions: China, Hong Kong, Thailand, United Arab Emirates, and most recently, Saudi Arabia. 

Underpinned by distributed ledger technology (which records transactions in multiple places at the same time), mBridge aims to be a multi-CBDC platform that can support instant cross-border payments. The idea is to make international settlement faster and cheaper than Swift. But it also means settlement would no longer depend on the U.S. dollar.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Amazon, Mastercard, and Microsoft. Holdings are subject to change at any time.

Does News Move The Stock Market?

If we’re constantly looking for news to explain short-term stock price movements, how often can we be right?

A great book I started reading recently is Making Sense of Chaos by economist J. Doyne Farmer. In the book, Farmer discusses his ideas for understanding economies through the lens of complexity science, which is the study of complex adaptive systems. The book referenced an interesting academic finance paper published in 1988 titled What Moves Stock Prices. The paper, authored by David Cutler, James Poterba, and Larry Summers, investigated the influence of news on stock prices.

Farmer described their work as such:

“Cutler, Poterba and Summers began by finding the 100 largest daily fluctuations in the S&P 500 index between 1946 and 1987. They then looked at the New York Times on the day after each move and recorded a summary of the paper’s explanation for the price change. The authors made a subjective judgement as to whether these explanations could plausibly be considered ‘real news’ – or at least real enough to have triggered a sizable change in stock price.”

The largest daily move in the paper’s dataset occurred on 19 October 1987 – now famously known as Black Monday – when the S&P 500 fell by 20.5%. Interestingly, there was no substantial news to explain the collapse. Farmer mentioned in his book:

“The explanations for the 20 per cent drop on October 19, 1987, were ‘worry over dollar decline and trade deficit’ and ‘fear of US not supporting dollar’. Cutler, Poterba and Summers didn’t classify this as news, and I agree. ‘Worry’ and ‘fear’ are subjective statements about the emotional state of the market that have no specific reference to external events.”

Farmer went on to mention:

“Of the dozen largest price fluctuations [shown below], only four were attributed to real news events, a ratio that they found also roughly applied to the largest 100 moves.”

In other words, as I have suspected to be the case for as long as I have been investing, stock prices are indeed more often than not driven by factors outside of the news. I find this to be an important trait of the stock market to know because if we’re constantly looking for news to explain short-term stock price movements, we’re likely to be wrong often, and this can impair our investment decision-making process.

The twelve largest daily price fluctuations in Cutler, Poterba and Summers’ dataset for What Moves Stock Prices:

  1. Date: 19 October 1987
    • Daily change: -20.5%
    • Explanation given: Worry over dollar decline and trade deficit; Fear of US not supporting dollar
  2. Date: 21 October 1987
    • Daily change: 9.1%
    • Explanation given: Interest rates continue to fall; deficit talks in Washington; bargain hunting
  3. Date: 26 October 1987
    • Daily change: -8.3%
    • Explanation given: Fear of budget deficits; margin calls; reaction to falling foreign stocks
  4. Date: 3 September 1946
    • Daily change: -6.7%
    • Explanation given: “… no basic reason for the assault on prices.”
  5. Date: 28 May 1962
    • Daily change: -6.7%
    • Explanation given: Kennedy forces rollback of steel price hike
  6. Date: 26 September 1955
    • Daily change: -6.6%
    • Explanation given: Eisenhower suffers heart attack
  7. Date: 26 June 1950
    • Daily change: -5.4%
    • Explanation given: Outbreak of Korean War
  8. Date: 20 October 1987
    • Daily change: 5.3%
    • Explanation given: Investors looking for “quality stocks”
  9. Date: 9 September 1946
    • Daily change: -5.2%
    • Explanation given: Labor unrest in maritime and trucking industries
  10. Date: 16 October 1987
    • Daily change: -5.2%
    • Explanation given: Fear of trade deficit; fear of higher interest rates; tension with Iran
  11. Date: 27 May 1970
    • Daily change: 5.0%
    • Explanation given: Rumours of change in economic policy; “… the stock surge happened for no fundamental reason”
  12. Date: 11 September 1986
    • Daily change: -4.8%
    • Explanation given: Foreign governments refuse to lower interest rates; crackdown on triple witching announced
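Farmer’s four-of-twelve tally can be reproduced from the list above with a quick sketch. The `real_news` flags below are my own reading of the explanations (the steel-price rollback, Eisenhower’s heart attack, the Korean War outbreak, and the 1946 labor unrest), not the authors’ exact coding:

```python
# Each entry: (date, daily % change, real_news?) -- flags are my own
# reading of the explanations above, not Cutler/Poterba/Summers' coding.
moves = [
    ("1987-10-19", -20.5, False),  # worry/fear over dollar, trade deficit
    ("1987-10-21",   9.1, False),  # rates, deficit talks, bargain hunting
    ("1987-10-26",  -8.3, False),  # fear of deficits, margin calls
    ("1946-09-03",  -6.7, False),  # "no basic reason for the assault"
    ("1962-05-28",  -6.7, True),   # Kennedy forces steel price rollback
    ("1955-09-26",  -6.6, True),   # Eisenhower heart attack
    ("1950-06-26",  -5.4, True),   # outbreak of Korean War
    ("1987-10-20",   5.3, False),  # investors seeking "quality stocks"
    ("1946-09-09",  -5.2, True),   # maritime/trucking labor unrest
    ("1987-10-16",  -5.2, False),  # fear of deficit, rates, Iran tension
    ("1970-05-27",   5.0, False),  # rumours; "no fundamental reason"
    ("1986-09-11",  -4.8, False),  # rates, triple-witching crackdown
]

news_driven = sum(1 for _, _, is_news in moves if is_news)
print(f"{news_driven} of {len(moves)} moves attributed to real news")
# -> 4 of 12, the roughly one-in-three ratio Farmer cites
```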

Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have no vested interest in any companies mentioned. Holdings are subject to change at any time.

What We’re Reading (Week Ending 08 September 2024)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general.

Here are the articles for the week ending 08 September 2024:

1. #360 Robert Kierlin: Founder of Fastenal – David Senra

What’s the most important part of Fastenal’s success that outsiders discovering the company for the first time don’t understand? The number one thing is the people aspect. The goal is to unleash entrepreneurial passion, a commitment that I will be self-driven to do better than what you can expect. It is a mindset.

This is what they’re telling their employees, “Run your business like you own it.” When you trust people to solve problems and make decisions and you let them go, that’s where the magic happens. That is the story of this company. Fastenal embraces a spirit of radical decentralization and autonomy.

“Each of its 2,700 stores operates as a stand-alone business with a clear leader and full P&L responsibility. We grow from the ground up based on the actions and decisions of thousands of people who run their businesses like they own it. I want those people to stay with us forever. They will never have to stop at a certain level.”…

…“We are now one giant organization. We are 2,700 small businesses wrapped up into one big company. Society tells us, you’re a big company, act like it. We say no, you don’t get to define us, we define ourselves. We’d go against the grain in almost everything that we do.” And so there’s just crazy stats about the company that, “More than 95% of our current batch of general managers have been promoted from within”.

“For nearly all of our senior leaders have worked their way up from entry-level positions.” So that leads into the second thing I want to tell you about, which is this interview with the CEO who succeeded Kierlin (we call him Bob) when he stepped down. His name is Will Oberton.

Oberton started out at Fastenal as a store clerk. So he went all the way from — they mean business. He went all the way from store clerk to CEO…

…“By keeping operating costs very low, Fastenal is able to pay their employees incrementally higher wages, and thus, more effectively develop and retain talented salespeople. The quality of service and depth of knowledge that the employees have eventually brings in more revenue, which grows the business and allows it to further lower operating expenses as a percentage of revenue, thus allowing for more hiring of top-quality employees, which brings in more revenue. This is an overlooked virtuous circle of sorts.”…

…Why is it so important to have everybody working as a single cohesive team, to have everybody thinking that their role that they’re playing is just as important as the person next to them? “At Fastenal, we believe that you could be the best salesperson in the world. But if the order-picker doesn’t pick it right or the truck driver doesn’t get it there on time or the billing clerk doesn’t bill it correctly, you end up with an unhappy customer.”

“Everyone is key. You are better off working to make everyone equal so they stay focused” and he goes back, what do you think he’s about to say? I bet you can already finish his sentence for him. “You are better off working to make everybody equal so they stay focused on the common goal of pleasing the customer.”

[00:26:01] He’s going to give us more advice on how to do that. You need to install a reward system that keeps everyone focused on the common goal. He’s talking about incentives. If you have Fastenal’s common goal of growing our company through customer service, you will avoid any rewards that don’t fit that goal. And so when I got to this part of the book, I thought about Charlie Munger’s like three rules for incentives.

And so this is what he said, “Number one, everyone underestimates the power of incentives. Number two, never ever think about anything else before thinking about the power of incentives. And number three, which Bob is nailing, the most important rule in management, get the incentives right.”

And again, you have to be careful of these subgroups that are going to naturally develop in your company because his whole point is like “Listen, your incentives have to — they have to fit your overall common goal,” right, the common goal of pleasing the customer. And so he gives us an example, “If you do these incentives based on like separate groups, they can optimize for things that go against your common goal.”

So he gives an example that this is a really smart idea. “We do not reward production people for minimizing scrap. If some of that scrap you eliminate comes from the extra parts that guarantee you have a full order quantity ready when the customer wants it.” The incentive superpower that Munger talks about, you clearly see by picking up the book…

… I want to go back to that story of the CEO that was meeting with Buffett, the CEO that succeeded Kierlin. So this giant part of Fastenal’s business now after, this was invented after this book was written, okay, the first version in 1997, was the fact that they have these vending machines.

And the way I think about the vending machine is like think of anytime you’ve been in like a hardware store, right? You’ve got ACE Hardware or Home Depot or anywhere else. And think about how all the equipment and supplies are presented, kind of like searching through, it’s kind of like a chaotic mess.

So Will Oberton, who was the former CEO, but he’s no longer CEO now, but he’s the one that was CEO after Bob, okay? Oberton also developed an industrial vending machine system. There’s a video on YouTube that’s fascinating about this. It’s from Fastenal. Fastenal has their own YouTube channel. You can see the vending machine if you just type in Fastenal vending machine, if you’re interested in this, I thought it’s actually cool.

Oberton developed an industrial vending machine, and I searched for it after I read this because like I got to see what this looks like. Oberton had developed an industrial vending machine system, helping Bob realize a lifelong goal. In 1951, as a 12-year-old working in his father’s auto parts store, Bob was bothered by the fact that his dad had to send customers searching for nuts and bolts to someone else’s store.

He imagined that a vending machine installed at his father’s place might pop out fasteners like gumballs. Once on his own, he tried to convert a cigarette vending machine to this purpose. He couldn’t get it to work. So he started selling fasteners over the counter. Thus, Fastenal was born. 40 years later, working with a snack machine manufacturer and off-the-shelf software, Will Oberton got the job done.

Fastenal’s vending machines have been a big hit with customers. So their vending machines are actually installed in their customers’ locations. It cannot get simpler than this. You got to watch the video, I’m telling you. Oberton got the job done. Fastenal’s vending machines have been a hit with customers, generally helping them save 30% on supplies.

[00:46:05] The machines have cut down on theft and enabled automated reordering. That business, which I think is now like 15 years old, was only 4 years old when this new idea already started contributing 36% of the overall sales of Fastenal, and I think it’s like over 40% now.

2. A French Bank Like No Other in Europe Seeks to Export Its Model –  Phil Serafino and Albertina Torsoli

Bpifrance is a bank like no other in Europe.

The French lender has made more than €50 billion ($56 billion) in loans to small and mid-sized businesses and has €52 billion in stakes in almost 1,000 companies. It has backed everything from a startup wanting to take tourists to the edge of space in balloons and a chain of trendy Parisian nightspots to the automotive giant Stellantis NV. A force to reckon with on French deals for M&A advisers like Goldman Sachs Group Inc. and JPMorgan Chase & Co., it has lured away bankers from firms like UBS Group AG and Rothschild & Co…

…No other European country has an agency quite like Bpifrance: a for-profit, state-owned merchant bank with a mandate to foster national champions. Its wide-ranging lending activities are financed largely by borrowings guaranteed by its ultimate backer: the French taxpayer. And for all the political turmoil at the moment in France, its interventionist policies are likely to find favor no matter which coalition — from the left or the right — ends up forming a new government.

More than a decade after it was created under then-President François Hollande and his economic adviser — one Emmanuel Macron — Bpifrance exemplifies 21st-century French capitalism: Entrepreneurs build businesses with cash, nudges and nurturing from the state, which in turn wants them to create jobs at home and develop innovative technologies. Explicit in the deal: The government will fend off foreign interlopers if necessary…

…Bpifrance’s investment prowess and risk management haven’t really been tested because Dufourcq hasn’t faced a prolonged economic downturn, enjoying a favorable wind at his back almost from the start — even during the pandemic, when the French state opened the cash taps to prevent businesses from going under.

Its stock-picking bets also haven’t always paid off. A stake in train-car maker Alstom SA, for example, has lost about a third of its value since the investment early last year. Shares of Stellantis, in which the bank has a 6.4% holding, have slumped about 45% from their peak in March as the carmaker struggles to fix problems at its US and European operations.

Also, for much of the bank’s existence, it could finance itself at rock-bottom interest rates, something that’s no longer the case. A slowing economy and higher rates also may start to hurt companies that borrow from the bank: Bpifrance’s loans classified as doubtful stood at 4.7% at the end of 2022, up from less than 4% in recent years, according to the bank’s annual reports. It didn’t disclose the statistic in its 2023 report.

Dufourcq shrugs off such concerns, noting that the three decades-old agencies that combined to form Bpifrance survived some deep financial crises, and says his bank often says no to risky investment proposals.

While some European countries have national development banks, Bpifrance is unusual for the breadth of its offerings. It operates 50 offices around France, often sending representatives door to door to drum up business. In addition to debt and equity investments, it offers financing and credit insurance to exporters and training and consulting services to entrepreneurs — including on how to shrink their carbon footprint.

3. How Richmond Fed President Tom Barkin Sees The Economy Right Now – Cale Brooks, Tracy Alloway, Joe Weisenthal, and Tom Barkin

I talked to someone from Germany yesterday — this is going to make me interested but not your [audience]. Our savings rate went up at the beginning of Covid to about 15 or 16%. Same thing happened in Germany. Our savings rate has come down to about three and a half. Theirs is still at 17. So, why are German consumers not spending the way American consumers are? That’s an interesting topic. It’s something we spent some time on. It’s the kind of thing we spent some time on…

Tracy (08:01):

What’s the theory?

Tom (08:04):

Well, so the thing that really makes it crazy interesting is, there’s a whole social safety net in Europe that doesn’t exist here. And so, most of the time you think people are saving for retirement, they’re saving for a rainy day, or they’re saving because they’re worried about losing their job. Well, in Germany, they kept everyone’s job during the pandemic and you’ve got a pension. So why are they saving? And I think the best explanation I’ve gotten, it’s actually something on my list to study going forward, is there’s just a lot more precautionary feeling about the situation in Europe, the risk vis-à-vis Ukraine, and what’s happening over there. And it’s just a culture that maybe has just gotten a lot more cautious due to geopolitics if nothing else.

Joe (08:43):

That does sound really interesting. By and large, I mean obviously, the situation in the Argentine economy is radically different than it is here in the US. Germany is probably still, all things considered, similar cyclically to the US. Does it feel like, by and large, at least among developed countries’ central bankers, that there is a strong set of common mysteries perhaps? Or are they really like, everyone’s sort of seeing different things in their own country? I mean I’m sure it’s a mix of both, but how much of a global factor is there?

Tom (09:16):

Much more in common than different. The whole practice of central banking has been, I’d say, globalized over the years. And central bankers really do think about inflation targeting, for example, in the same ways. And there are banks, like New Zealand and Australia, that, back in 2000 or even before that, set inflation targets before the rest of us and we learned from them. And so there’s a lot of learning, there’s a lot of discussion. I think there’s very much a common framework. Now, the economies are very different.

I mean, the US economy has come through this unbelievably well, the European economies have not. And so we have a much stronger economy. So much of our economy is services, so much is supplied to ourselves. A lot of this deglobalization is felt much more on the European side. The challenges in China right now are felt much more on the European side. And then emerging market countries, they really just are worried we’re going to increase rates further and they’re going to end up offside. And so, they’re very dependent on the strength or weakness of our dollar…

…Tom (11:43):

I think the economy, since we were together three or four months ago, the economy’s moved in a very different way. First of all, on the inflation side, I might’ve even said four or five months ago I was looking for inflation to sustain and broaden. So, it’s sustained. We’ve got very low readings for four months in a row. And it’s now across the basket, whereas six months ago [or] eight months ago, it was really just in goods. And so the concern about inflation reaccelerating has definitely come down significantly. At the same time, the labor market stats have also softened. And so, the phrase I’ve been using is, ‘people aren’t hiring but they’re not firing,’ and that’s just not a likely sustainable outcome. Either demand will continue and people will start hiring again or you’ll start to see layoffs. And so I think there’s more concern on the labor market and less concern on inflation relatively…

…Tom (13:00):

So consumers, you hear a lot of talk about people saying that consumers [are] weak and people are running out of savings. That’s not what I’m hearing. What I’m hearing is consumers are still spending but they’re choosing. And, the way I think about it is, they now have the time when they go into a store and they see something that’s at a price they don’t like to say, ‘I think I’m going to do something else.’ And so if you look at Walmart’s results, they would talk about people trading down. If you look at Target’s results, they talked about the kind of reaction they’re getting to lower prices. McDonald’s results in the $5 value meal. I’ve talked to hotel chains that every room is booked, but they can’t raise price at all because the second they raise price, people just won’t buy it and won’t book it. I talked to a fast food leader who’s rolling out software actually to encourage their franchisees not to raise prices anymore…

…Tracy (15:15):

What’s the urgency, then, on supporting the labor market? And there’s obviously a debate going on right now about how fast deterioration in that market actually happens. We had Claudia Sahm on the podcast recently and she was talking about, ‘maybe it’s different this time,’ but how are you thinking about the pace or the rate of change in the labor market?

Tom (15:37):

So the other thing that’s happening in the labor market is a lot more supply of labor, and part of that is participation, prime-age participation hitting 20-plus-year highs, and immigration, which is up significantly. And so the last jobs report where unemployment went up from 4.1 to 4.3, you actually added jobs, 114,000 jobs. We just added 420,000 people to the workforce. So the denominator got bigger. And so, you know, there’s some people who look at the unemployment rate and say, ‘Oh my gosh, the labor market’s about to fall off a cliff.’ That’s not how I see it. I see a loosening labor market being driven by a lot more supply. Now, what’s the urgency? We’re not in a situation, I don’t believe, where there is this big cliff there, but when we make policy, you’re trying to make it for a year from now, right? Because [of] the lags of monetary policy, you’re trying to meet a year from now.

And so you’ve got a labor market which has slowly cooled and you’ve got inflation which has now gradually cooled. And so, you sort of say, ‘Well, which do I worry more about?’ And it’s been very clear for the last two and a half years that all you worry about is inflation. And now those are much more balanced…
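Barkin’s denominator point is simple arithmetic: the unemployment rate can rise even while jobs are added, as long as the labor force grows faster. A quick sketch using his figures; the baseline labor-force level is my own assumption, roughly in line with mid-2024 BLS data:

```python
labor_force = 168_400_000          # assumed baseline, roughly mid-2024 BLS level
unemployed = 0.041 * labor_force   # starting 4.1% unemployment rate

# Barkin's figures: +114,000 jobs, +420,000 labor-force entrants
new_entrants, jobs_added = 420_000, 114_000

labor_force += new_entrants
unemployed += new_entrants - jobs_added  # entrants not absorbed by new jobs

new_rate = unemployed / labor_force
print(f"unemployment rate: {new_rate:.1%}")  # -> 4.3%, despite job gains
```

Because the denominator (labor force) grew by more than employment did, the rate ticks up from 4.1% to 4.3% even though 114,000 jobs were added, which is exactly the “loosening driven by supply” reading Barkin describes.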

…Tom (21:57):

Well, I see inflation upside risk in two places. First is, we’re at 2.5% for the last 12 months. Our target’s 2%. So while we’re doing great at bringing it down from when it was once 7.1%, core is still at 2.5%. And even the most optimistic forecasts for the back half of this year don’t believe it’ll get to 2%, because the numbers were so good on a 12-month basis…

Joe (22:20):

We’re talking about on a year-to-year basis, as opposed to, like, a three-month sequential basis, yeah, okay.

Tom (22:22):

On a 12 month basis. Because the last half of last year was also very good. And so, we’re at least six months away, even with really good inflation data, from the inflation numbers hitting 2%. And if the numbers are just pretty good, not really good, there’s a risk that we plateau at some level over 2%. That’s one risk. The other risk is I do see medium term inflation pressures that are out there. We have a conflict in the Middle East that could spiral. Deglobalization is a very real risk and that means that the imports of goods could be more expensive going forward, or if we even reshore, more expensive. Housing’s a place where, if rates start coming down, one of the things I worry about is that will spool up demand from people who’ve been waiting to buy a house till mortgage rates come down, but there won’t be any new houses built. I mean that effect is two years, three years out.

And so what happens if you have more demand for houses with the same kind of supply? Or even if more houses come on the market, everyone who puts their house on the market is a buyer and a seller. So you’d still have this excess of demand over supply. So those things are potential inflationary risks. Now, good policy works against that, and if we do the right thing with rates, that will work against it, but that’s why I just want to make sure I understand it and see it before I declare victory.

Tracy (23:37):

What’s been the most surprising thing that you’ve heard at Jackson Hole this year? You talked about German savings rate, but beyond that, is there anything that caught your eye or your interest?

Tom (23:48):

Alan Blinder asked a question today that I thought was pretty interesting. He said, ‘When you think about monetary policy lags, why aren’t you talking about how to shorten them?’ And I’ve said, almost as it’s a given, that when we raise or lower rates, it takes 12 to 18 months for the full effect to go into the economy. Well, part of that is because the economy doesn’t behave in a way that would allow it to happen quicker. An example: I think the number is, in 2009, 60% of [the] mortgages in this country were adjustable rate. Today it’s 8%. And so when we raise or lower rates, it doesn’t flow through to mortgages quickly and certainly not even like it did 15 years ago. And I’m not saying we should change the mortgage market, but it does make you stop and think, how much of our policy, the effectiveness of our policy tools, is a given or how much could actually change over time as the economy changed?

4. No Priors Ep. 78 | With AWS CEO Matt Garman (Transcript here) – Sarah Guo, Elad Gil, and Matt Garman

…Now that we’re at a $100 billion run rate, I think 85% of workloads are still running on-prem today by most estimations, somewhere in that range. Pick your number, whether it’s 80 to 90, or whatever it is. That’s enormous. If there’s still 10x growth of just existing workloads – forget all the new genAI workloads that are being created every day – these are just existing workloads to move, there’s a 10x number in there, so that business is massive…

…Gil (12:40): You mentioned that 80% of workloads still haven’t migrated over. What do you think are the main blockers to that today? Is it just momentum? Are there specific features? Are there big things still to build?

Garman (12:48): There’s some technologies that I think… Look, if I had an easy button, and by the way we’re trying to build an easy button, but if I had an easy button that would just migrate mainframes to a modern cloud architecture today, almost everyone would push that button. But it doesn’t quite exist today and it’s not as simple as like, “Great, I’ll go run your mainframe in the cloud.” That’s not what customers want. They want to actually modernize those workloads and move them into microservices and containerized workloads and other things like that. So that’s one, is there’s just a bunch of workloads like that that are old, and the customer’s running a big SAP thing and they want to move it to the cloud but it just takes time because it’s tied to a bunch of other things like that. There’s also a bunch of workloads that, as you get out of core IT workloads, are in line of business, that are the next set of things. Whether that’s say telco workloads that are running the 5G infrastructure around the world, we’ve slowly been moving those to the cloud and helping those customers get that flexibility and that agility of running those in the cloud as well. But they’re slower to move.

If you think about all the compute that runs factories out there today on factory floors, most of those have not been modernized. And there’s a huge opportunity, by the way, for AI to totally revolutionize how you think about factory workflows and efficiency there. But a lot of that hasn’t moved. There’s on-prem infrastructure that people are still amortising, there are still people whose jobs it is to run on-prem data centers, and so they’re resistant to moving things. There’s a bunch of factors in there, and so some of it just takes time, some of it is technology pieces, and some of it is that we still have stuff to go build and innovate to make it easier for customers to do that.

Guo (14:37): I’d love to hear about just the initial investigation of generative AI as a technology change and how AWS began to react to it, invest in it, because to some degree it puts us all back in the on-prem co-lo era of the world, where to get one of these, if you’re doing any sort of real pre-training, to get your startup off the ground, you’re back to, “I’ll buy a bunch of DGX boxes somewhere and I need to think about the cost and management of that.”

Garman (15:07): Actually most people are still buying those but in the cloud. But it’s not a serverless type of thing. Most people are still not buying H100s and hosting them in a co-lo or anything like that. And increasingly, I think that’s going to get harder and harder as you move to liquid cooling and larger clusters. It is a super interesting space. I think we’ve been working on this space for how many years now – we’ve been investing in AI broadly for the last 10 years, and it’s why we started five or six years ago investing at the infrastructure layer and building our own processors, because we knew this was coming, we saw this path coming and we knew that that’s also not a short-term investment. It’s one of those things you got to invest way ahead. And then we were investing and building generative AI models, and then OpenAI made a generational leap forward with what they were able to do, what was possible, and many people have talked about this. But it really in some ways was a discovery as much as anything about just what was possible and unleashed the new set of capabilities.

So we actually as a business took half a step back and said, “These are going to be transformational abilities and assuming that this technology gets better and better and better over time, how do we make it so that every company out there can go build using those technologies?” Different than, “How can I go build a consumer application that people are going to be interested in?”, we took it from the point of view of AWS, “Just what are the building blocks that I can help all of our customers, whether they’re startups, whether they’re enterprises etc, go build interesting generative AI applications.” We started from first principles. Customers are going to care a ton about security. That’s not going to change. They’re not going to all of a sudden not care about securing their infrastructure.

We also had two more hypotheses. One, the idea that there wasn’t just going to be one model. We thought that there was going to be a lot of models for a lot of different purposes, and there’d be big models and small models, and people would want to combine them in new and interesting ways. I think the last two years have probably played that out. I think when OpenAI first launched, it wasn’t as obvious, but that was one of the bets that we made. The third one is that we view that every enterprise that was building on us, the interesting IP that they were going to bring to the table was mostly going to be their data, and they were going to care that their data didn’t leak back into a model or escape from their environment. So we built a bunch of what we did starting from those principles of how do we make sure that these things are secure, that their data is secure, that they can have access to every piece of technology that the customers need to go build interesting applications, and they can do it in a cost effective way. That’s how we approach the space.

I think we now have a platform in Bedrock, in Trainium chips and Inferentia chips, and then a bunch of the other capabilities around as well as the suite of models that we offer, both proprietary as well as open source ones – or open weights ones. I think we’re starting to see that momentum pick up and we’re seeing more and more customers really like that story. They like that platform to build from, and we’re seeing enterprises really lean in and want to build in that space because it gives them a lot of that control that they want as they go and build applications…

…Gil (26:25): The other place that a lot of people are spending time right now in terms of bottlenecks to utilization or usage or future-proofing, is actually more on the chip side or semiconductor or system side and in terms of DC capacity. Obviously you all have been building Trainium chips and other things which I think is really exciting to see that evolution. How do you think about future GPU shortages? Does that go away, when? I’m sort of curious about how you think about forward-looking capacity, and is the industry actually ready in terms of building out data centers, building out semiconductors, all the rest of it, packaging.

Garman (26:56): I think we’re probably going to be in a constrained world for the next little bit of time. Some of these things, they take time. Look how long it takes to build a semiconductor fab. It’s not a short lead time and that’s several years and TSMC is running fast to try to ramp up capacity, but it’s not just them. It’s the memory providers and frankly data centers that we’re building. There’s a lot of pieces in that value chain that I think as you look at the demand for AI which has been – exponential might be undershooting it – some of those components that support that I think are catching up and I think AWS is well positioned to try to do that better than others are.

We’ve spent a long time thinking about – in the last 18 years, learning how do we think about smart investing, how do we think about capital allocation. We’ve spent a bunch of time thinking about how do we acquire our own power, how do we ensure that it’s green and carbon neutral power, all super important things. We’re the largest purchaser of renewable energy over the last… new contracts, so actually going out and adding and supporting new renewable energy projects. We’re the largest provider I think, each of the last four or five years. So we’ve been leaning into that for a while to ramp up this and this is just a step up. So we’re thinking about how are we acquiring enough power. Our own chips are a way to support the growth of Nvidia chips, and so I think the more diversity there, the better off we are. We’re a huge partner of Nvidia’s. Nvidia actually runs their AI training clusters in AWS because we actually have the most stable infrastructure of anyone else, so they actually get the best performance from us. We love that partnership and we have a great and growing relationship with them. We think things like Trainium are a good diversification, and I think there will be some workloads that run better on Trainium and are cheaper on Trainium over time, as well as on Inferentia.

I think inference is one of those workloads that – today it’s 50/50 maybe of training and inference. But in order for the math to work out, inference workloads have to dominate, otherwise all this investment in these big models isn’t really going to pay off, so hopefully for the industry that all happens. But I think we’re probably going to be tight for the next little bit of time, because the demand is almost infinite. I mean it seems infinite right now.

5. Timing the Stock Market Using Valuations – Ben Carlson

I’ve never found a legitimate way to utilize valuations to determine entry or exit points in the stock market. Maybe when things get to extremes but even then valuations can be unreliable.

In early 2017, I wrote a piece for Bloomberg about stock market valuations:…

...This was the lede:

Something happened in the stock market this week that has only occurred twice since 1871: Robert Shiller’s favorite valuation method for the S&P 500, the cyclically adjusted price-to-earnings ratio, reached 30. So, is it time to worry?

The only other times in history when the CAPE ratio reached 30 were in 1929 and 2000, right before massive market crashes. So it made sense that some investors were worried about the stock market being overvalued.
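For readers unfamiliar with the metric, the CAPE ratio is simply price divided by the ten-year average of inflation-adjusted earnings. A minimal sketch, with purely hypothetical numbers:

```python
# Sketch of the CAPE (Shiller P/E) calculation: price divided by the
# 10-year average of inflation-adjusted earnings. Numbers are hypothetical.
def cape(price, real_earnings_10y):
    """CAPE = price / average of the last 10 years of real earnings."""
    return price / (sum(real_earnings_10y) / len(real_earnings_10y))

# e.g. an index at 2,400 with average real earnings of 80 gives a CAPE of 30
print(cape(2400, [80] * 10))  # 30.0
```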

The S&P 500 is up nearly 170% since then, good enough for annual gains of roughly 14% per year.
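As a quick sanity check on those figures (the dates here are approximate assumptions, roughly early 2017 to mid-2024), the annualized number follows from compounding the total return:

```python
# Hypothetical sanity check: converting a cumulative return into an
# annualized rate (CAGR). Dates are approximate assumptions.
total_return = 1.70   # "up nearly 170%" since early 2017
years = 7.5           # early 2017 to mid-2024, roughly

# CAGR = (1 + total return) ** (1 / years) - 1
cagr = (1 + total_return) ** (1 / years) - 1
print(f"{cagr:.1%}")  # roughly 14% per year, matching the article
```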

Sometimes valuations matter, but other times, the market doesn’t care about your price-to-earnings ratios.

The same is true during bear markets. Sometimes stocks get downright cheap but not all the time…

…Three of the four bear markets this century didn’t see the CAPE ratio come close to previous bear market valuation levels. If your plan was to get more aggressive when the market got cheap enough, you would still be waiting.

The problem with using valuations as a timing indicator is that even if they do work on average, missing out on just one bull market can be devastating. You could be waiting a mighty long time to get back into the stock market and miss out on big gains in the meantime.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Amazon (parent of AWS). Holdings are subject to change at any time.

The Buyback Endemic

Buying back stock at unreasonably high valuations is not a good use of capital and can destroy shareholder value.

Buybacks can be a good way for companies to enhance shareholder value. Share buybacks reduce the number of shares outstanding, allowing companies to pay a higher dividend per share in the future.

But not all buybacks are good. Done at the wrong price, buybacks can actually be a poor use of capital. Yet I have seen many companies buy back shares recklessly, with little consideration of the share price.
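A stylized example, with all numbers hypothetical, makes the point concrete: a buyback helps the shareholders who stay only when shares are repurchased below intrinsic value.

```python
# Hypothetical illustration: per-share intrinsic value for remaining
# shareholders after a buyback, at two different repurchase prices.
def value_per_share_after_buyback(intrinsic_value, shares, cash, price):
    """Spend `cash` repurchasing stock at `price`; return the per-share
    intrinsic value left for the shareholders who remain."""
    shares_bought = cash / price
    return (intrinsic_value - cash) / (shares - shares_bought)

# A business intrinsically worth $1,000 with 100 shares: $10.00 per share.
cheap = value_per_share_after_buyback(1000, 100, cash=100, price=5)   # below value
rich = value_per_share_after_buyback(1000, 100, cash=100, price=20)   # above value
print(cheap, rich)  # 11.25 vs ~9.47: the same dollars spent can help or hurt
```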

The problem likely stems from a few causes.

Wrong mindset

First, some executives do not have a good grasp of what buybacks are. Take this statement from Tractor Supply’s management in its 2024 second-quarter earnings report for example:

“The Company repurchased approximately 0.5 million shares of its common stock for $139.2 million and paid quarterly cash dividends totaling $118.5 million, returning a total of $257.7 million of capital to shareholders in the second quarter of 2024.”

The issue with this statement is that it lumps dividends and share repurchases into the same bracket, implying that repurchases are a form of returning capital to shareholders. The truth is that share repurchases do not return cash to long-term shareholders, only to exiting ones. If management mistakes repurchases for capital return, it may do buybacks regularly instead of opportunistically.

Although I am singling out Tractor Supply’s management, they are just one out of many management teams that seem to have the wrong mindset when it comes to buybacks.

Incentives

Additionally, executive compensation schemes may encourage management to buy back stock even if it is not the best use of capital. 

For instance, Adobe’s executives have an annual cash remuneration plan that is determined in part by whether they achieve certain earnings-per-share goals. This may lead management to buy back stock simply to boost the company’s earnings per share. But doing so when prices are high is not a good use of capital. When Adobe’s stock price is high, it would be better for management to return cash to shareholders as dividends – but management may not want to pay dividends, as dividends do not increase the company’s earnings per share.

Again, while I am singling out Adobe’s management, there are numerous other companies that have the same incentive problem.

Tax avoidance

I have noticed that the buyback phenomenon is more prevalent in countries where dividends are taxed.

The US, for instance, seems to have a buyback endemic where companies buy back stock regardless of the price. This may be because US investors have to pay tax on dividends, which makes buybacks a more tax-efficient way to return capital to shareholders. By contrast, Singapore investors do not pay taxes on dividends, so Singapore companies do not do buybacks as often.

However, simply doing buybacks for tax efficiency reasons without considering the share price can still harm shareholders. Again, management teams need to weigh both the pros and cons of buybacks before conducting them.

Final thoughts

There is no quick fix to this problem, but there are some steps that I believe companies can take to address the issue.

First, fix the incentives problem. A company’s board of directors needs to recognise that incentives that are not structured thoughtfully can encourage reckless buybacks regardless of the share price.

Second, management teams need to educate themselves on how to increase long-term value for shareholders and to understand the difference between buybacks and dividends.

Third, management teams need to understand the implications of taxes properly. Although it is true that taxes can affect shareholders’ total returns when a company pays a dividend, it is only one factor when it comes to shareholder returns. Executive teams need to be coached on these aspects of capital allocation.

Only through proper education and well-structured incentives will the buyback endemic be solved.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have a vested interest in Adobe and Tractor Supply. Holdings are subject to change at any time.

What We’re Reading (Week Ending 01 September 2024)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since the readership of The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general. 

Here are the articles for the week ending 01 September 2024:

1. Aidan Gomez, Co-founder & CEO @Cohere: What No One Understands About Foundation Models (Transcript here) – Harry Stebbings and Aidan Gomez

Aidan Gomez: It’s definitely true that if you throw more compute at the model, if you make the model bigger, it’ll get better. It’s kind of like it’s the most trustworthy way to improve models. It’s also the dumbest. Right? Like, if all else fails, just make it bigger. And so for folks who have a lot of money, that’s a really compelling strategy. It’s super low risk. You know it’s going to get better. Just scale the model up, pay more money, pay for more compute and go. I believe in it. I just think it’s extremely inefficient. There are much better ways. If you look at the past, let’s say year and a half, I guess by now it would be between ChatGPT coming out, or GPT-4 coming out, and now GPT-4, if it’s true what they say, and it’s 1.7 trillion parameters this big MoE, we have models that are better than that model, that are like 13 billion parameters. And so the scale of change, how quickly that became cheaper, is absurd, kind of surreal. And so, yes, you can achieve that quality of model just by scaling, but you probably shouldn’t.

Harry Stebbings: Do we continue to see those same scaling advantages, or does it actually plateau at some point? As you said there, we always hear about Moore’s Law. At some point, it just becomes a better calculator for the iPhone.

Aidan Gomez: It certainly requires exponential input. You need to continuously be doubling your compute in order to sustain linear gains in intelligence. But I think that probably goes on for a very, very, very long time. It’ll just keep getting smarter. But you run into economic constraints, right? Not a lot of people bought the original GPT-4, certainly not a lot of enterprises, because it was huge. It was massive. Super inefficient to serve. So costly, not smart enough to justify that cost. There’s a lot of pressure on making smaller, more efficient models smarter via data and algorithms methods, rather than just scaling up due to market forces. Just pressure on price.
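Gomez’s claim that linear gains demand exponential input is the familiar logarithmic shape of scaling curves. A toy sketch with a purely illustrative quality function (not any published scaling law):

```python
import math

# Toy model: suppose quality grows logarithmically with compute. Then each
# *doubling* of compute buys the same fixed increment of quality, i.e.
# exponential input for linear gains.
def toy_quality(compute):
    return math.log2(compute)  # illustrative assumption, not a real law

gains = [toy_quality(2 ** (n + 1)) - toy_quality(2 ** n) for n in range(5)]
print(gains)  # every doubling yields the same +1.0 improvement
```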

Harry Stebbings: Will we live in this world of unbundled, verticalised models, which are much more efficient and smaller, designed for specific use cases? Or will there be much larger three to five models which kind of rule it all?

Aidan Gomez: There will be both. There will be both. The one pattern I think we’ve seen emerge over the past couple years is that people love prototyping with a generally smart model. They don’t want to prototype with a specific model. They don’t want to spend the time fine tuning a model to make it specifically good at the thing that they care about. What they want to do is just grab an expensive big model prototype with that, prove that it can be done, and then distill that into an efficient focus model at the specific thing they care about. That pattern has really emerged. I think we’ll continuously exist in a world of multiple models, some focused and verticalized, others completely horizontal…

…Aidan Gomez: Yeah, sometimes we do. Sometimes we do. There’s this very obvious next step for models, which is you need to let them think and work through problems. You need to let them fail. They need to try something, fail, understand why they failed, roll that back, and make another attempt. And so, at present, there’s no notion of problem solving in models.

Harry Stebbings: And when we say problem solving, that is the same as reasoning, correct?

Aidan Gomez: Yeah.

Harry Stebbings: Why is that so hard? And why do we not have any notion of that today?

Aidan Gomez: I think it’s not that reasoning is hard, it’s that there’s not a lot of training data that demonstrates reasoning out on the Internet. The Internet is a lot of the output of a reasoning process. Like, you don’t show your work when you’re writing something on the web. You sort of present your conclusion, present your idea, which is the output of loads of thinking and experience and discussion. So we just lack the training data. It’s just not freely available. You have to build it yourself. And so that’s what companies like Cohere and OpenAI and Anthropic, etc, that’s what we’re doing now, is collecting data that demonstrates human reasoning…

…Harry Stebbings: One thing I’m concerned about bluntly, or I look at with hesitation, is you see OpenAI price dumping. You see Meta releasing for free and Mark proclaiming the value of open source and an open ecosystem. Are we seeing this real diminishing value of these models? And is it a race to the bottom and a race to zero?

Aidan Gomez: I think if you’re only selling models for the next little while, it’s going to be a really tricky game. It won’t be a small market. There will be a lot.

Harry Stebbings: This question may be really stupid. Who’s only selling models and who’s selling models and something else?

Aidan Gomez: I don’t want to name names, but let’s say Cohere right now only sells models. We have an API, and you can access our models through that API. I think that that will change soon. There are going to be changes in the product landscape and what we offer to sort of push not away from that, but to add on to that picture and that product suite. But if you’re only selling models, it’s going to be difficult because it’s going to be like a zero margin business because there’s so much price dumping, people are giving away the model for free. It’ll still be a big business, it’ll still be a pretty high number because people need this tech. It’s growing very, very quickly, but the margins at least now are going to be very, very tight.

And so that’s why there is a lot of excitement at the application layer. And I think that discourse in the market is probably right to point out that value is occurring beneath, like at the chip layer because everyone is spending insane amounts of money on chips to build these models in the first place. And then above at the application layer where you see stuff like ChatGPT, which is charged on a per user basis, $20 a month type thing, that seems to be where at this phase, value is accruing. I think that the model layer is an attractive business in the long term, but in the short term with the status quo, it is a very low margin, commoditized business if we just break it down…

…Aidan Gomez: I think it will be. Right now, chips are just exceptionally high margin and there’s very, very little choice in the market. That’s changing. I think it’s going to change faster than other people think. But I’m very confident.

Harry Stebbings: I think you’ve also seen the stockpiling of GPUs change a lot. Before, there were signs of a real supply chain shortage.

Aidan Gomez: Yes. Yeah.

Harry Stebbings: And now it’s not so much.

Aidan Gomez: No. Yeah. The shortage is going down. I think it’s becoming clear there are going to be more options available and not just on the inference side. Inference is already quite heterogeneous. You actually already have loads of options on the inference side, which is like not the training of the models, but the serving. On the training side, the picture has been, it’s essentially one company that creates the chips that you can use to train big models. That’s still true today. But – actually it’s not true today. There’s two companies. You can definitely train big models on TPUs. Those are actually now a usable platform for super large scale model training. And I think Google has proven that quite convincingly. And then there’s Nvidia. But I think soon, AMD, Trainium, these platforms are going to really be ready for primetime…

…Harry Stebbings: On enterprises, Canva is obviously making a hard push for enterprise. You sell into amazing enterprises. What’s the number one blocker today for why enterprises don’t adopt?

Aidan Gomez: It’s mostly trust in the technology. So security. Everyone is very sketched out by the current state of things. Who’s training.

Harry Stebbings: Sketched out means concerned?

Aidan Gomez: Yeah, yeah, right.

Harry Stebbings: Not like a flop.

Aidan Gomez: Well, they’re hoping that they don’t have a flop. So they’re really scared that someone’s going to take their data, train on it, and put them in some sort of security vulnerability, or that they’ll lose IP. I think that’s a very valid concern because people have been training on user data.

Harry Stebbings: Is there anything you can do to reassure them other than, “hey we’re using new synthetic data?”

Aidan Gomez: Yeah. So our deployment model is set up to do that. We focus on private deployments inside their VPC, on prem. What that means is just, it’s on their hardware, completely privately. We’re not asking them to send data over to us. We’ll process it and give you back the response from the model. We’re saying we’ll bring our models to where your data is. We can’t see any of it.

Harry Stebbings: Will we see the movement back to on-prem in this new world?

Aidan Gomez: When I speak to folks, it’s super conflicted in financial services. Yeah. People are pulling away from cloud. They’re pulling away from cloud. They’re building out their own data center capacity. Everywhere else still seems to be we need to migrate to cloud. It doesn’t make sense for us to have these data centers. I think that it probably depends on the vertical that you’re looking at…

…Harry Stebbings: Are we still in the experimental budgets for enterprise? Everyone’s like, oh, we’re just playing with budgets now. Is that fair? Or are we actually moving into mainstream?

Aidan Gomez: It’s really started to shift. So last year, 100%, it was like the year of the proof of concept. Everyone was sort of testing it out, playing around with it. But recently there’s been a big shift to urgency to get this tech into production. I think a lot of enterprises are scared of being caught flat footed. They’ve spent a year running POCs and testing stuff out. Now they’re sprinting towards, I want to put this into production, transform my product, augment my workforce.

Harry Stebbings: What’s the number one use case for them in terms of what they need or want?

Aidan Gomez: The number one use case…

Harry Stebbings: Because it feels like every board is saying, hey, what’s your AI strategy? And it’s like, what does that actually mean? Is it Klarna, who’s very much, we want to optimize our customer service and we’re going to do that. Is that the number one? Customer service? Is it employee augmentation and productivity?

Aidan Gomez: I think it’s employee augmentation. It’s these models becoming a partner or a colleague to your entire workforce. That’s the most popular use case.

Harry Stebbings: I think Copilot is the right way to do that.

Aidan Gomez: I think Copilot is great and it’s the right idea of augmenting a workforce with an assistant. But it’s siloed again within an ecosystem, so it plugs into Office and the Microsoft suite of products. Enterprises don’t just use Microsoft. They use Microsoft for their email and docs and spreadsheets and then they use Salesforce for their CRM. They have SAP for their ERP, they have some HRM, they have internal software that they built for themselves. And if you really want to augment the workforce, you need to have a platform for developing these assistants, these agents, that’s agnostic to a particular toolset and that prioritizes the tool sets rationally across what people actually use, what the market actually uses. So I don’t think that that’s going to be done by Copilot.

Harry Stebbings: You mentioned the word agent there. Agents is one of the hottest topics in ventureland. Do you think it’s justified – the hype around agents, agentic behavior, what it does to workflows?

Aidan Gomez: I mean, the hype is justified 100%. That’s the promise of AI. The promise of these models is that they would be able to carry out work by themselves that just dramatically transforms productivity. Once you have a model that can go off and do things independently over a very long time horizon. So no longer like, I’m gonna do this one thing for you immediately in return and I’m done. But like, over the next six months, I’m going to be pumping deals into your top of funnel or something like that, right? Like doing outbound for you. It just completely transforms what an organization can do. The hype is justified. I think my critique would be, is that work going to be most effectively done outside the model builders or within? Who’s going to be best positioned to actually build that product?

Harry Stebbings: Why would it be best done within the models?

Aidan Gomez: Completely depends on the quality of the model. It entirely depends on the model. Like, the model is the reasoner behind the agent, and you have to be able to intervene at that level. If you’re not able to actually transform the model to be better at the thing that you care about – if you’re not the one building the model, if you’re just a consumer of the model – you’re structurally disadvantaged to build that product…

…Aidan Gomez: I think there’s sort of like a meme that’s going around of people saying we plateaued, nothing’s coming, it’s slowing down. I actually really think that’s wrong and not just from like a we need to 10x compute and that type of thing perspective and trust me, it’ll get better. But from a methods perspective. So when I was talking about reasoners and planners and models that can try things, fail and recover from that failure, and carry out tasks that take a long time to accomplish, these are, for the technologists, obvious things that just don’t exist in the technology today. We just haven’t had time to turn our focus there and add that capability into the model. For the past year plus, folks have been focusing on that and it will be ready for production, so we’ll see that come out, and I think that will be a big change in terms of capability…

…Harry Stebbings: What does AI not do today that you think it will do in three years? It will be completely transformative.

Aidan Gomez: I think robotics is the place where there will be big breakthroughs. The cost needs to come down, but it’s been coming down. And then we need models that are much more robust, because a lot of the barriers have fallen away now, unlike before. The reasoners and planners inside of these robots, the software behind them, were brittle, and you had to program each task you wanted them to accomplish. And it was super hard-coded to a specific environment. So you have to have a kitchen that is laid out exactly like this.

Harry Stebbings: Exactly the same dimensions, nothing different.

Aidan Gomez: Yeah, so it was very brittle. And on the research side, using foundation models, using language models, they’ve actually come up with much better planners that are more dynamic, that are able to reason more naturally around the world. I know this is already being worked on. There’s like 30 humanoid robotic startups and that type of thing. But soon someone’s going to crack the nut of general purpose humanoid robotics that are cheap and robust. And so that will be a big shift. I don’t know if that comes in the next five years or ten years, it’s going to be somewhere in that range…

…Harry Stebbings: So what have you changed your mind on most in the last 12 months?

Aidan Gomez: The importance of data. I underrated it dramatically. I thought it was just scale. And a lot of proof points have happened internally at Cohere that have just transformed my understanding of what matters in building this technology.

Harry Stebbings: So now it’s the quality of data.

Aidan Gomez: Yeah, quality. Like a single bad example, right, amongst like billions. It’s so sensitive. It is a bit surreal how sensitive the models are to their data. Everyone underrates it.

2. Chip War’s Chris Miller on Putin, China, and The Future – Mario Gabriele and Chris Miller

Which current or historical figure has most impacted your thinking?

Vladimir Putin. He is the most striking embodiment of my belief that you can’t understand people through traditional utility functions.

My background is in Russian studies, and I’m struck by the extent to which our analysis of Putin has changed over time. Twenty years ago, when he first came to power, he portrayed himself – and with some level of accuracy, I think – as a relatively modern leader of Russia. He was reforming the tax system and doing stuff that political leaders do. When we talked about his motivations at the time, the focus was often very financial. I remember very distinguished economists who I respect greatly saying, “Isn’t it the case that Putin is primarily driven by money?” And indeed, there are lots of examples of Putin being hugely corrupt and his friends stealing all sorts of stuff. He’s got his gaudy palaces on the shores of the Black Sea.

But we’ve learned that it’s not all about money. When he invaded Ukraine in 2022, Putin cited Peter the Great and Catherine the Great as justifications for territorial conquest. It’s an illustration that “modern people” are not always driven by modern impulses. The desire for power and glory and control, the desire to be on top and dominate others – for better or worse – are central to many people’s utility functions. These impulses might seem more base, but I think, to some degree, they’re present within all of us. You ignore them at your peril…

What is the most significant thing you’ve changed your mind about over the past decade?

I’ve changed my mind about the usefulness of thinking like an economist. Even though I may criticize them sometimes, I have great admiration for economists. But they think of everything in terms of utility functions and how to maximize them. They only know how to calculate that in dollars and cents. Though that’s valid, I’ve come to appreciate its limitations.

I’ve spent a lot of time over the past decade studying great entrepreneurs and geopolitical competition. Fundamentally, neither founders nor countries think like economists. Great founders may have shareholders who would like them to consider return on equity, but that’s not how they make decisions. Think of Jensen Huang ten years ago – even though Wall Street was warning him against it, he still poured Nvidia’s money into building out CUDA and the ecosystem around it. If your mode of thinking is purely economic – focused on return on equity or maximizing shareholder value – you miss a lot of what actually drives competitive, successful people.

The same thing is true at the international level. Governments don’t think like economists, either. They try to maximize glory or territory or reputation or power. Like great entrepreneurs, sometimes they simply want to win, just for the sake of besting an adversary. There are ultimately so many things that drive nations and the humans within them that are non-quantifiable. Often, they’re much more significant than strictly quantifiable economic variables…

What risk are we radically underestimating as a species? What are we overestimating?

We’re underestimating the risk of a great power conflict – World War III. World wars happen roughly every half-century. We shouldn’t forget that. Whether as part of a world war or not, the use of a nuclear weapon in conflict within the next 50 years also seems highly plausible.

You can see points of tension across the border between China and the Western sphere. You see it in the South China Sea with the Philippines, in the East China Sea with Taiwan, and in the Himalayas with India – and those are just the border disputes. It’s easy to imagine how that could spiral in an escalatory manner.

If you put a dollar value on the cost of this kind of conflict, it would be measured in the many trillions. Yet the amount of time we spend thinking about it is not remotely commensurate with that outcome. Some people console themselves by saying, “It’s high magnitude but low risk, so the expected cost is low.” I’m not so sure about that. If you talk about the risk this year, maybe it’s low. But if you think about it over the next decade and factor in the risk compounding every year, suddenly, I don’t think those assumptions hold.

If you think the risk is high, we have two options. You can either offer concessions or build up your capabilities to deter more successfully. From the US perspective, we’ve been doing a little bit of the latter and a little bit of the former under Biden – but not much of either. I think it’s intellectually coherent to say, “Let’s do more of one, or more of the other.” I think it’s not intellectually coherent to say, “Let’s just do a little bit of both,” when in reality, defense spending is at historic lows relative to the post-Cold War period.

3. Joel Greenblatt: Value and Special Situation Investment Lecture with Rob Goldstein (Transcript here) – Joel Greenblatt and Rob Goldstein

Rob Goldstein (02:32): We came across Moody’s in early 2000 when it was in the process of being spun off. It was obvious that Moody’s was one of the great businesses that we had ever seen and the problem was it was trading at 21 times forward earnings, and 24 times trailing earnings. So the question we had to ask ourselves was just how much of that greatness was already reflected in the stock price. Just to give a little perspective, typically at that time, we would buy stocks at 10 times earnings and sell it at 14 or maybe even 15 times earnings if we got lucky. So the thought of paying up for a business like this was really a new thing for us. So what I did was I compared Moody’s to Coke… 

…Goldstein (04:06): Okay. So several decades ago Buffett figured out that if he identified a really great business he could pay what seemed like a lot of money and still make a fortune. In 1988, Buffett bought $600 million of Coke stock. He paid around 13 times forward earnings, 15 times trailing earnings, and back then the value investment community didn’t understand why that was any great bargain. But 12 years later, the $600 million was worth over $7 billion. So Coke became the classic example of paying up for a great business and making a fortune doing it so that’s why I looked at Coke…

…Goldstein (05:39): Okay, so there’s three really good things about Coke. [Writes on board: (1) Organic Growth, (2) High ROE, (3) Lasting Competitive ADVANTAGE]. To sum up, those are the three really important things to remember about Coke. In addition it was a relatively easy business to understand and it was a predictable business. Most businesses are neither of those things…

…Okay, my first slide. We have Moody’s historical financials, and in the 19 years prior to 2000, revenues had grown at a compounded annual rate of 15% and operating profits had grown at a compounded annual rate of 17%. Not many companies have that kind of terrific performance. In the 19-year period, year-over-year revenue declined only one time, and that decline was just a few percent and happened after a period of rapid growth. So you know they’ve done great in the past. But does past success equal future success, and is Moody’s a great business? How should we think about that?…

…Just to explain where this growth came from, because it’s important for the rest of the analysis. 30 years ago, when you got a loan, the lending institution would retain that loan. Today, many of these loans are securitized and sold into the capital markets. The guy originating the loan is not necessarily the guy financing the loan. Today there’s trillions of dollars of these securities, including credit card loans, home equity loans, commercial mortgage loans, auto loans, etc. To do these securitizations, you need ratings. Financing loans through the capital markets is more efficient than the old way, so one would expect that the growth would continue. In addition, Europe was way behind the US in terms of their growth curve of issuing these asset-backed securities, and Asia was behind Europe. They were just sort of starting to go down that path. So basically there was lots of growth ahead.

We talked about good return on capital which we can get to later. In terms of the lasting competitive advantage, we talked about why there can be no new entrants and we touched on why there won’t be any pricing pressure, because their fees seem reasonable in the larger scope of things. You really have to go to S&P and Moody’s to get ratings, and they both know that, so they’re not going to be very negotiable on price. So the company was in the right place at the right time, and the same factors responsible for the past growth would be expected to continue into the future. So we concluded that Moody’s was a great business…

…This is a price chart of Coke. How much did Berkshire Hathaway make over those 12 years? We’ll assume that he paid $5 a share on $6.88. 12 years later, his stock was $58 a share – be right around here – and he had collected $4.75 in dividends over that time. Just to keep it simple, let’s assume he was able to earn 6% on those dividends that he received, so let’s value the dividends at $6. So his $5 turned into $64, and he’s got a 23.7% rate of return on his investment, annualised.
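Goldstein’s return figure can be checked with a quick Python sketch (ours, not part of the lecture; the inputs are the figures he quotes):

```python
# Check the Coke arithmetic: a $5 cost basis grows to a $58 share price,
# plus $4.75 of dividends assumed to compound at ~6% to roughly $6.
total_value = 58 + 6              # ending share price + dividends valued at $6
cost_basis = 5
years = 12
cagr = (total_value / cost_basis) ** (1 / years) - 1
print(f"{cagr:.1%}")              # ~23.7% annualised, matching the transcript
```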

I basically pulled these numbers out of an annual report at the time. Question is, why did Buffett do so well on his Coke investment?…

…You’re correct, over the 10-year period, revenues grew at 8.8% and unit case volumes at 7%. Oh it is industry… oh no, the industry’s 4%-5%. So over the 10-year period they got some price increases. Of course their costs also went up. They were able to grow their unit case volume at 7% a year, so they had organic growth; they didn’t need that much of it. That translated to 12% operating income growth. There was a little bit of leverage so they got 13% net income growth, and they bought back stock, so they got 15% EPS growth over that time.

The other reason why he did so well was because – we just talked about this – he only had to reinvest 20% of the earnings back into the business. So that meant that in addition to the buybacks, he was able to pay out dividends.

Just one formula I’m going to put up on the board because we’re gonna come back to it later, is [writes on board: Growth rate divided by reinvestment rate equals return on equity]. So their growth rate was 12%, reinvestment rate was 0.2, so the return on equity was 60%. So that’s how the business performed and in addition, he did so well because there was big PE expansion. He paid about 15 times forward earnings when he bought the stock and in 2000 when we looked at it, it was trading at north of 30 times expected 12-month earnings.
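The board formula can be written out as a tiny Python helper (our sketch, not part of the lecture; the numbers are the ones in the transcript):

```python
# Board formula: growth rate / reinvestment rate = return on equity,
# i.e. growth = reinvestment rate x ROE, rearranged.
def implied_roe(growth_rate, reinvestment_rate):
    return growth_rate / reinvestment_rate

# Coke: 12% growth while reinvesting 20% of earnings -> 60% ROE
print(f"{implied_roe(0.12, 0.20):.0%}")
```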

So how can we expect Moody’s to perform for us over the next 12 years? What growth rate should we assume?…

…Well we settled on 12%, and the reason why we settled on 12% is because (1) management guidance was low single digits, and (2) 12% seemed very reasonable considering the historical operating performance had been so much better, and in the belief that the same factors responsible for the past growth were going to continue…

…So we felt very comfortable that they could grow at very healthy rates in the future. An estimated 12% operating earnings growth rate for Moody’s happened to be very convenient, because that was Coke’s growth rate during those 10 years we looked at. So for the remaining analysis I could now just focus exclusively on the difference in return on capital and how that impacted the different valuations…

…What would you guess Moody’s return on capital was?

Attendees (33:06): [Indecipherable]

Goldstein (33:09): That’s exactly right. Their return on capital was infinite, because they had no – they had $50 million in PP&E; they needed desks and computers for 2,000 employees and that was it. In addition, their customers paid on time or in advance. They were in a very strong position. They could demand payment upfront, and you typically see that kind of thing with companies that earn good returns on capital. But the answer was their returns on capital were infinite. Very few businesses are like that.

So Coke needed to spend 20% of its earnings on… So they earn a dollar, they spend 20 cents, and you have 80 cents left over. Moody’s would spend nothing. They’d have a dollar left over. So how much more was Moody’s earnings stream worth than Coke’s? 25%? Okay, a dollar is 25% higher than eighty cents.

Now does this mean that the higher return on capital makes Moody’s worth 25% more than Coke? Well yes and no…

…Goldstein (36:02): The question is, everything else staying the same, does the fact that Moody’s has a higher return on capital mean that their business is worth 25% more than Coke?

Attendees (36:18): Yes, in terms of free cash flow.

Goldstein (36:22): In the short term that’s correct. But in the longer term, they’re not gonna grow at this 12% rate forever. So if you assume that in the very long run the growth rate drops to 5%, then if you go back to this formula [pointing to formula on growth rate, reinvestment rate, and ROE], you see that for Coke, that will mean they need to reinvest 8%-10% of their earnings in the business as their growth rate drops. This formula here is what we used to calculate the 60% ROE for Coke; growth rate over return on capital equals the reinvestment rate. If the growth rate at some point in the future drops to 5% – 20 years down the road or whenever – and the return on equity for Coke is the 60% we calculated, that means the reinvestment rate would be 8.3%. The slower the growth, the less capital you need, the more capital you can pay out. So let’s just assume that at some point Coke will be paying out 90%-92% of earnings. So we split the difference instead: let’s assume that this return on capital thing is going to mean that Moody’s is worth 15% more than Coke. It’s just somewhere in the middle between 25% and 10%, or 8%…

…Okay so you just raised my next point, which was: is there something else you need to consider? What are they going to do with the money? So we saw that Coke returns all their excess capital, and we felt that Moody’s was very likely to return all their excess capital. In fact, they were gonna put more of that money into buybacks, because that’s what management had said they were gonna do. So we basically took this important point and we could leave it out of our analysis at this point, because capital returns were basically gonna be equal for both companies.

So can we justify 21 times earnings? 13 times 1.15 – the benefits from the higher return on capital – so you can pay 15 times earnings and get the same thing. How about 21 times earnings?

Attendees (46:20): [Indecipherable]

Goldstein (46:31): We concluded that because Moody’s had a much higher return on capital, the business was worth 15% more.

Attendees (46:48): 13 was the PE in Coke in 1988, but you’re saying Moody’s can justify a PE of 15?

Goldstein (46:57): Based on the higher return on capital. We saw that we were going to use similar revenue growth assumptions. Growth rate was the same, ROE was different, and let’s assume this [pointing to reinvestment rate] is the same.

Attendees (47:10): [Indecipherable]

Goldstein (47:18): Yes, and I’m gonna get to that in a minute. The analysis I made was: what would have happened if, back in 1988, Buffett had paid 18 times earnings, or $7 a share, for his stock? He still would have done great. He would have made 8 times his money, and he would have had a compounded annual return of 20%. Still a great purchase. So he could have paid 18 times earnings at that point and still have done great. So $5 to $7 increases the price by 40%, and that gets you your 21 multiple – that’s what we had to get. Which is why we used 1.4. Does that make sense?

Well, let’s say you went back to 1988 and you said that he couldn’t pay 13 times earnings, he had to pay 18 times earnings – how would he have done on his investment? He still would have done great. So basically he did so well that he had so much room that he could have paid a lot more for his stock and still had a very good investment. Not as good, but still very good. He would have made 20% a year, each year, over those 12 years, and that 40% number got us to our 21 multiple.

[Equation on board: 13 x 1.15 = 15, 15 x 1.4 = 21]

So we kind of backed into it that way, and that was the original analysis. And the reasoning was very sound despite the shortcut we used. Actually, the first time I spoke in Joel’s class, one of the students, like you, said it didn’t – that interest rates or something had to do with this – and immediately I knew that interest rates had a lot to do with it. Only I had never really thought about it.

So what happened was after Buffett purchased Coke, interest rates over the remaining 12 years dropped from 9% to 6% [uses projector for a chart showing interest rates]. So 9% over here, down to 6%. So if you priced a 30-year bond and asked how its price would change if rates went from 9% to 6%, the answer would be that it would go up by 42%. If it was a perpetuity, it would go up by 50%, but it’s not, it’s a 30-year bond. So it’d go up by 42%, and that’s the right way of really looking at things. So, it so happens that – this was somewhat random – but the 42% is basically the 40% that we came up with right here [pointing to the equation on the board of “13 x 1.15 = 15, 15 x 1.4 = 21”].
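The 42% figure is just a bond repricing, which a short Python sketch can reproduce (ours, not part of the lecture; it assumes annual coupons on a bond issued at par):

```python
# Reprice a 30-year bond issued at par with a 9% coupon after yields fall to 6%.
def bond_price(coupon_rate, yield_rate, years, face=100.0):
    # Present value of the annual coupons plus the face value at maturity.
    coupons = sum(coupon_rate * face / (1 + yield_rate) ** t
                  for t in range(1, years + 1))
    principal = face / (1 + yield_rate) ** years
    return coupons + principal

price = bond_price(0.09, 0.06, 30)
print(f"{price / 100 - 1:.0%}")  # ~41-42% gain, vs the 50% a perpetuity would show
```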

Attendees (51:50): [Indecipherable] For Moody’s, now interest rates are low…

Goldstein (52:08): Okay, let’s see how Buffett would have done. It’s a very good question actually. Let’s see how he would have done. So if interest rates drop from 9% to 6%, things are worth 40% more; if rates go up, the way the math works, they’re worth 30% less. So going from 1 to 1.4 is 40% up; from 1.4 to 1 is 30% less. Had his stock traded for 30% less at the time we did this analysis, he would have had a $40.60 stock, he would have gotten $6 in dividends, so he’d have $46.60 over his original $5 investment – he would have made over 8 times his money. I did the math, so I know that’s a compounded annual return of 20%. It’s not as good as the 23.7%, but it’s still very good.
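The downside scenario Goldstein runs through can be checked the same way (our sketch, not part of the lecture; the inputs are his figures):

```python
# If rates had not fallen, the ending multiple would be ~30% lower (1 / 1.4 ~ 0.71).
ending_price = 58 * (1 - 0.30)            # ~$40.60 instead of $58
total_value = ending_price + 6            # plus ~$6 of dividends
cagr = (total_value / 5) ** (1 / 12) - 1  # on the $5 cost basis over 12 years
print(f"${total_value:.2f} -> {cagr:.1%} a year")  # ~$46.60 and ~20% a year
```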

So taking your point, it wasn’t that we were expecting to do 23.7%; we were assuming that if interest rates stayed the same or went down, we could expect to make 20%, and that’s probably what he was looking at when he bought Coke. I don’t think he was betting on lower interest rates, although who knows what he was doing. That makes sense, right?…

…What happened to Moody’s had a good part and a bad part. The good part is that it did trade up. In October, when the stock was actually spun off, it was up 20% from where it was in March. And by that following April – so I guess that had been just over a year – the stock was up 50% from where it had started. Now what happened with Moody’s is – and here’s the sad part, because we sold our stock too early on this one – but what happened was in 2001, the business exploded to the upside. Profits didn’t grow 12%, they grew at 40%. The next year, they didn’t grow at 12%, they grew at 35%. So profits have compounded over the following 6-plus years at 25%, at least 25%. I guess that’s what happens when you use conservative assumptions. But the stock was up 6 or 7 times since then, and a lot of those gains came early, before earnings really took off.

Attendees (59:10): So how do you decide on what price to sell?

Goldstein (59:13): That’s a very good question. We obviously made a bad decision. It went up a bunch, earnings had started to shoot up, yet we thought – we’ve got higher hurdle rates than a guy managing zillions of dollars, so with the stock up 50%, we had to think seriously about selling and putting the money into something else. When you make these analyses, hindsight is 20/20 and everything is so easy in retrospect. But in real time when you’re doing this, you’re obviously worried that the stock’s at 30 times earnings – what happens if I’m wrong, what happens if things do poorly next year and all of a sudden you’re not paying 30 times earnings, you’re paying 40 times earnings. Now the business looks shaky. So it’s never as easy at the time as it is after the fact. But we sold when it was up 50% or more.

4. Why China Is Starting a New Trade War –  Lingling Wei and Jason Douglas

Interviews with policy advisers in Beijing and people who have consulted with Chinese officials show that China’s leadership faced a pivotal crossroads last year, as the country’s real-estate bust brought the economy to one of its weakest points in decades.

Some advisers argued that China’s economy needed a fundamental rethink, graduating from its traditional heavy reliance on manufacturing and construction and instead prioritizing more domestic consumption—a shift that would make China more like the U.S., and potentially put it on a more stable growth path.

Instead, Chinese leader Xi Jinping ordered officials to double down on the country’s state-led manufacturing model, with billions of dollars in fresh subsidies and credit. He used a slogan to make sure officials got the message: “Establish the new before breaking the old,” or xian li hou po in Chinese.

The “new” in Xi’s model doesn’t mean a pivot to a new growth model. Instead, it is the top leader’s way of refining his idea of which kinds of manufacturing the state should back. In essence, the phrase calls for building industries China wants to dominate in the future—such as EVs, semiconductors and green energy—while also maintaining the country’s traditional areas of strength in “old” sectors such as steel. Any overcapacity problems can be punted to the future…

…Two principles have guided Xi’s thinking, Chinese policy advisers say. The first is that China must build an all-encompassing industrial supply chain that can keep the domestic economy running in the event of severe sanctions by the U.S. and other Western countries. In the top leader’s views, advisers say, industrial security sits at the core of China’s stability as tensions with the developed world rise.

The second is a deep-rooted philosophical objection to U.S.-style consumption, which Xi sees as wasteful.

That leaves China with few options other than investing in exports to stabilize its weakened economy and create jobs to make up for losses in domestic construction…

…Loans to industry, including manufacturing firms, have increased 63% since the end of 2021, while Chinese banks have pulled back sharply on lending to real-estate developers.

Government subsidies, though long central to China’s economic playbook, have also ramped up significantly. Companies listed on the Shenzhen and Shanghai stock exchanges declared $33 billion in government subsidies in 2023, according to figures from data provider Wind—23% more than in 2019…

…In all, 99% of publicly listed Chinese companies now disclose some form of subsidy, according to the Kiel Institute, a German think tank. China spends about 4.9% of its gross domestic product on nurturing industries—several times higher than the U.S., Germany and Japan, according to Scott Kennedy, a China expert at the Center for Strategic and International Studies in Washington.

Craig Allen, president of the U.S.-China Business Council, a lobbying group for American companies in China, said Xi’s manufacturing fixation was on display when he met recently with the governor of one of China’s poorest farm provinces.

When Allen asked the governor about his economic priorities, the governor listed semiconductors, software, biotechnology, robotics, aerospace, batteries, and EVs.

“I would have thought that addressing the immediate needs of his overwhelmingly rural constituents, such as improving agricultural harvests, might be at the top of his economic priorities list,” Allen said.

The fire hose of financial support looks set to keep spraying. The People’s Bank of China in April said it set up a new facility with roughly $70 billion to help bank lending to tech firms. In May, a national fund aimed at financing semiconductor production raised $48 billion from state-owned banks and other government-linked investment vehicles…

…“China’s production of advanced electric vehicles, lithium-ion batteries and photovoltaic products, first met our domestic demand, but also enrich global supply,” Chinese premier Li Qiang said in an address to the World Economic Forum’s June meeting in Dalian, China. The real source of China’s manufacturing edge isn’t government subsidies but its huge scale, which helps pin down costs, he added…

…China has added capacity to produce some 40 million vehicles a year, even though it sells only around 22 million at home. It’s on track to make around 750 gigawatts of solar cells this year, despite only needing 220 gigawatts domestically in 2023. And it is expected to account for 80% of the world’s new supply this year in basic chemicals such as ethylene and propylene, used to make garbage bags, toys and cosmetics—even though prices in China have been falling for 19 months, a sign of oversupply.

At the same time, output of steel, one of China’s “old” industries, increased last year despite waning domestic demand due to the continuing property crisis. Industry executives say Beijing has been prodding them to invest more in upgrading steel production through clean technologies and other means…

…China has suffered from persistent overcapacity in the past, at times raising ire from its trading partners for depressing global prices for steel and other goods.

In 2015, Xi entrusted his economic czar at the time, Liu He, to implement reforms that led to closures of many small and privately owned steel mills and other businesses. For a while, it seemed as if Xi and his economic team were ready to finally tackle overproduction.

But as tensions with the U.S. escalated in recent years, and China’s economy weakened, Xi’s views changed, Chinese policy advisers say. He grew more concerned about ensuring China could produce everything it needed in the event of a conflict with the U.S., and became less sympathetic to Western complaints.

5. What is behind China’s perplexing bond-market intervention? – The Economist

Many governments live in fear of bond-market “vigilantes”, investors who punish errant policies by aggressively selling the sovereign’s debt, driving down its price and thereby pushing up its yield. Financial regulators also worry about bond-market malfunctions, such as unsettled trades, when one party to a transaction fails to honour its promises. These mishaps can send ripples of anxiety through an entire financial system.

Such fears do not seem to apply to China’s financial authorities. On August 9th regulators in the southern province of Jiangxi ordered several rural banks not to settle their recent purchases of government bonds, according to Bloomberg, a news service. Similar lenders elsewhere have also been reported to the People’s Bank of China (PBoC), the country’s central bank, for using their own accounts to buy bonds on behalf of others. Rural banks have been instructed to stick to their main business of lending to local enterprises, rather than to the central government.

The measures are part of an attempt by the central bank to stem a relentless rally in the government’s bonds. Earlier this month yields dropped below 2.1% on ten-year securities, down from almost 2.6% at the start of the year. The causes are clear: China’s economy has slowed, borrowers have retreated and inflation has vanished. Nonetheless, officials have been warning since April that yields would not stay low for ever. In July the PBoC unveiled plans to sell government securities borrowed from other financial institutions if required. The central bank was, in other words, “preparing to short its own government’s bonds”, as Adam Wolfe of Absolute Strategy Research, a consultancy, put it. In the end, the bank left the vigilantism to other members of its posse. On August 5th state-owned banks sold bonds heavily, driving the price down and the yield back up a notch…

…In the long run, the best way to lift yields is to warm up the economy, which is likely to require more borrowing and spending from the central government. Its fiscal stimulus would be more powerful if the central bank supports spending with further interest-rate cuts. In other words, yields may have to fall before they can rise. If China’s government is to succeed in reflating the economy, the PBoC will need to act like an accomplice, not a vigilante.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Alphabet (parent of Google), Meta Platforms, Microsoft, and Salesforce. Holdings are subject to change at any time.