What We’re Reading (Week Ending 12 January 2025)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since the audience for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general.

Here are the articles for the week ending 12 January 2025:

1. The art of outlasting: What we can learn from timeproof Japanese businesses – Eric Markowitz

Japan is home to an extraordinary number of shinise, or long-established businesses. A 2008 study found that Japan had over 21,000 companies older than 100 years, including more than 3,000 that had crossed the 200-year mark. These firms are not just historical artifacts — they are vibrant examples of how to endure and thrive in a rapidly changing world. Their strategies — balancing tradition with adaptability, patience with practicality — are a masterclass in long-term thinking that today’s entrepreneurs and executives would be wise to study…

…What ties these stories together is an approach to business that’s almost rebellious in its patience. While the modern world glorifies disruption and speed, Japan’s ancient companies remind us that longevity is often about playing the long game. It’s about building something so solid, so aligned with its environment, that it can weather any storm. But let’s not romanticize this too much. Strip away the poetry of water metaphors and ancient traditions, and you’ll find ruthless pragmatism at the core of these businesses’ survival.

When Japan’s post-war construction boom faded, Kongo Gumi didn’t just stick to temples — they pivoted hard into office buildings and apartments while maintaining their temple maintenance business as a hedge. During the lean years of the 1990s recession, Hōshi Ryokan cut costs to the bone while refusing to lay off staff, with family members taking deep pay cuts to keep their centuries-old workforce intact. Okaya transformed from selling samurai swords to becoming a global steel trader, making calculated bets on new technologies and markets while keeping their supply chain relationships rock solid.

These companies didn’t just drift through history — they clawed their way through wars, depressions, and cultural upheavals, making brutal choices about what to preserve and what to sacrifice. Their longevity wasn’t achieved through Zen-like detachment, but through gritted teeth and white-knuckled adaptability.

2. Notes on China – Dwarkesh Patel

I got quite mixed messages about the state of public opinion in China. This is to be expected in a society where you can’t establish common knowledge. One person told me that the new generation is quite nationalist, unlike the older reform generation which personally experienced the catastrophes of Mao and the tangible benefits of liberalization. He made the rather insightful point that this tilt in Chinese public opinion increasingly gives lie to the American talking point, “We’re against the CCP, not the Chinese people.” In fact, he went on to say that the current regime is way more liberal than what would result from an election in China.

Another person told me that these Chinese nationalists were only a vocal minority, similar to the wokes in America circa 2020. While they make up only about 10% of the population, they aggressively shout down others on Weibo (China’s Twitter equivalent). Most people find them annoying but feel uncomfortable confronting them directly. This matches what a student who graduated from a top university there told me – the vast majority of his classmates are simply apolitical. And in our own interactions with locals, we saw little evidence of widespread nationalism. In fact, when my Chinese-speaking trip mate would mention he was from the UK to taxi drivers, they would often respond enthusiastically: “Oh wonderful, we love the UK!”…

…We chatted up quite a lot of young people on night life streets. I was struck by how many young people expressed feeling stressed or overwhelmed. We met a musician in Chengdu who was writing songs about youth anxiety. We chatted up some modeling school students – even they complained about the intense pressure they felt. We met a guy who had studied in Australia but returned to China during COVID. He explained that many of his friends with prestigious degrees are moving away from Shanghai and Beijing – Yes, the pay there can be twice as high as in second or third tier cities. But the competitiveness is insane. And in order to actually land the high skilled positions, they have to work truly insane hours (9-9-6 is not a myth). He said that many of his friends were opting for these less ambitious lower-paying careers in smaller cities, where the rent is lower and the pressure is manageable…

…I’m still puzzled by how China can have both a demographic collapse and massive youth unemployment. You’d think with fewer young people being born, the ones who are around would be in high demand. One explanation I heard while there is that there are plenty of menial jobs available, but today’s educated youth – who’ve gone through high school and college – just won’t take the low-skilled positions their parents and grandparents did. Meanwhile, there’s a real shortage of the high-skilled jobs that would actually match their education and aspirations. It’s a mismatch between the jobs available and the jobs young people feel qualified for and willing to do…

…The biggest surprise from talking to Chinese VCs and people at AI labs was how capital constrained they felt. Moonshot AI, one of China’s leading AI labs, raised $1 billion at a $3 billion valuation. Meanwhile, just xAI’s new cluster alone will cost $3-4 billion.

The tech ecosystem feels quite shell shocked from the 2021 crackdown. One VC half-jokingly asked if I could help him get his money out of China. If you keep your money in China, you’re basically stuck choosing between terrible options. You can either accept a measly 2% yield from state banks, or throw it into China’s perpetually struggling stock market. This helps explain why valuations for Chinese companies are chronically low – the exit opportunities just suck. Even if you build (or invest in) something great, there’s no guarantee the company will be able to raise the next round. And even if you do raise again and succeed, the government might randomly cancel your IPO. And even if you somehow make it to the public markets, Chinese equities have been performing terribly anyways. It’s a good reminder of how easy it is to completely wreck an innovation ecosystem that depends on risk-taking investors.

3. Is AI progress slowing down? – Arvind Narayanan and Sayash Kapoor

To be clear, there is no reason to doubt the reports saying that many AI labs have conducted larger training runs and yet not released the resulting models. But it is less clear what to conclude from it. Some possible reasons why bigger models haven’t been released include:

  • Technical difficulties, such as convergence failures or complications in achieving fault tolerance in multi-datacenter training runs.
  • The model was not much better than GPT-4 class models, and so would be too underwhelming to release.
  • The model was not much better than GPT-4 class models, and so the developer has been spending a long time trying to eke out better performance through fine tuning.

To summarize, it’s possible that model scaling has indeed reached its limit, but it’s also possible that these hiccups are temporary and eventually one of the companies will find ways to overcome them, such as by fixing any technical difficulties and/or finding new data sources…

…Industry leaders don’t have a good track record of predicting AI developments. A good example is the overoptimism about self-driving cars for most of the last decade. (Autonomous driving is finally real, though Level 5 — full automation — doesn’t exist yet.) As an aside, in order to better understand the track record of insider predictions, it would be interesting to conduct a systematic analysis of all predictions about AI made in the last 10 years by prominent industry insiders.

There are some reasons why we might want to give more weight to insiders’ claims, but also important reasons to give less weight to them. Let’s analyze these one by one. It is true that industry insiders have proprietary information (such as the performance of as-yet-unreleased models) that might make their claims about the future more accurate. But given how many AI companies are close to the state of the art, including some that openly release model weights and share scientific insights, datasets, and other artifacts, we’re talking about an advantage of at most a few months, which is minor in the context of, say, 3-year forecasts.

Besides, we tend to overestimate how much additional information companies have on the inside — whether in terms of capability or (especially) in terms of safety. Insiders warned for a long time that “if only you knew what we know…” but when whistleblowers finally came forward, it turned out that they were mostly relying on the same kind of speculation that everyone else does.

Another potential reason to give more weight to insiders is their technical expertise. We don’t think this is a strong reason: there is just as much AI expertise in academia as in industry. More importantly, deep technical expertise isn’t that important to support the kind of crude trend extrapolation that goes into AI forecasts. Nor is technical expertise enough — business and social factors play at least as big a role in determining the course of AI. In the case of self-driving cars, one such factor is the extent to which societies tolerate public roads being used for experimentation. In the case of large AI models, we’ve argued before that the most important factor is whether scaling will make business sense, not whether it is technically feasible…

…As an example, Sutskever had an incentive to talk up scaling when he was at OpenAI and the company needed to raise money. But now that he heads the startup Safe Superintelligence, he needs to convince investors that it can compete with OpenAI, Anthropic, Google, and others, despite having access to much less capital. Perhaps that is why he is now talking about running out of data for pre-training, as if it were some epiphany and not an endlessly repeated point.

To reiterate, we don’t know if model scaling has ended or not. But the industry’s sudden about-face has been so brazen that it should leave no doubt that insiders don’t have any kind of crystal ball and are making similar guesses as everyone else, and are further biased by being in a bubble and readily consuming the hype they sell to the world…

…Inference scaling is useful for problems that have clear correct answers, such as coding or mathematical problem solving. In such tasks, at least one of two related things tends to be true. First, symbolic reasoning can improve accuracy. This is something LLMs are bad at due to their statistical nature, but can overcome by using output tokens for reasoning, much like a person using pen and paper to work through a math problem. Second, it is easier to verify correct solutions than to generate them (sometimes aided by external verifiers, such as unit tests for coding or proof checkers for mathematical theorem proving).
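The authors’ point that verifying a solution is easier than generating one can be made concrete with a small toy example of our own (not from the article), using integer factorisation: finding a factor takes a search, while checking a proposed factorisation is a single multiplication.

```python
# Toy illustration (ours, not from the article) of the generate-vs-verify asymmetry:
# "generating" an answer requires a search, while "verifying" a candidate is cheap.

def verify(n: int, p: int, q: int) -> bool:
    """Cheap check: one multiplication plus range checks."""
    return 1 < p < n and 1 < q < n and p * q == n

def generate(n: int) -> tuple[int, int]:
    """Expensive search: trial division until a non-trivial factor is found."""
    d = 2
    while d * d <= n:
        if n % d == 0:
            return d, n // d
        d += 1
    raise ValueError(f"{n} is prime; it has no non-trivial factors")

if __name__ == "__main__":
    n = 104_729 * 1_299_709        # product of two known primes
    p, q = generate(n)             # slow step: roughly 100,000 trial divisions
    print(p, q, verify(n, p, q))   # fast step: prints 104729 1299709 True
```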

In contrast, for tasks such as writing or language translation, it is hard to see how inference scaling can make a big difference, especially if the limitations are due to the training data. For example, if a model works poorly in translating to a low-resource language because it isn’t aware of idiomatic phrases in that language, the model can’t reason its way out of this.

The early evidence we have so far, while spotty, is consistent with this intuition. Focusing on OpenAI o1, it improves compared to state-of-the-art language models such as GPT-4o on coding, math, cybersecurity, planning in toy worlds, and various exams. Improvements in exam performance seem to strongly correlate with the importance of reasoning for answering questions, as opposed to knowledge or creativity: big improvements for math, physics and LSATs, smaller improvements for subjects like biology and econometrics, and negligible improvement for English.

Tasks where o1 doesn’t seem to lead to an improvement include writing, certain cybersecurity tasks (which we explain below), avoiding toxicity, and an interesting set of tasks at which thinking is known to make humans worse…

…We think there are two reasons why agents don’t seem to benefit from reasoning models. First, such models require different prompting styles than regular models, and current agentic systems are optimized for prompting regular models. Second, as far as we know, reasoning models so far have not been trained using reinforcement learning in a setting where they receive feedback from the environment — be it code execution, shell interaction, or web search. In other words, their tool use ability is no better than the underlying model before learning to reason…

…The furious debate about whether there is a capability slowdown is ironic, because the link between capability increases and the real-world usefulness of AI is extremely weak. The development of AI-based applications lags far behind the increase of AI capabilities, so even existing AI capabilities remain greatly underutilized. One reason is the capability-reliability gap — even when a certain capability exists, it may not work reliably enough that you can take the human out of the loop and actually automate the task (imagine a food delivery app that only works 80% of the time). And the methods for improving reliability are often application-dependent and distinct from methods for improving capability. That said, reasoning models also seem to exhibit reliability improvements, which is exciting.

Here are a couple of analogies that help illustrate why it might take a decade or more to build products that fully take advantage of even current AI capabilities. The technology behind the internet and the web mostly solidified in the mid-90s. But it took 1-2 more decades to realize the potential of web apps. Or consider this thought-provoking essay that argues that we need to build GUIs for large language models, which will allow interacting with them with far higher bandwidth than through text. From this perspective, the current state of AI-based products is analogous to PCs before the GUI.

4. Waymo still doing better than humans at preventing injuries and property damage – Andrew J. Hawkins

The study is the product of the collaboration between Waymo and insurer Swiss Re, which analyzed liability claims related to collisions from 25.3 million fully autonomous miles driven by Waymo in four cities: Phoenix, San Francisco, Los Angeles, and Austin. They then compared those miles to human driver baselines, which are based on Swiss Re’s data from over 500,000 claims and over 200 billion miles traveled.

They found that the performance of Waymo’s vehicles was safer than that of humans, with an 88 percent reduction in property damage claims and a 92 percent reduction in bodily injury claims. Across 25.3 million miles, Waymo was involved in nine property damage claims and two bodily injury claims. The average human driving a similar distance would be expected to have 78 property damage and 26 bodily injury claims, the company says.
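As a quick sanity check of our own, the quoted reduction percentages follow directly from the claim counts in the excerpt:

```python
# Recomputing the reductions from the claim counts quoted above (our check, not Waymo's).

waymo_property, waymo_injury = 9, 2      # Waymo claims over 25.3 million autonomous miles
human_property, human_injury = 78, 26    # expected claims for a human baseline over the same distance

property_reduction = 1 - waymo_property / human_property
injury_reduction = 1 - waymo_injury / human_injury

print(f"Property damage reduction: {property_reduction:.0%}")  # ~88%
print(f"Bodily injury reduction:   {injury_reduction:.0%}")    # ~92%
```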

Waymo’s vehicles also performed better when compared to new vehicles equipped with all the latest safety tech, including automatic emergency braking, lane-keep assist, and blind spot detection. When compared to this group, Waymo’s autonomous driving system showed an 86 percent reduction in property damage claims and a 90 percent reduction in bodily injury claims.

5. SITALWeek #454 – Brad Slingerlend

I think we are approaching the point where we can start to estimate the value of AI for developers and the companies/consumers who are going to buy the next wave of innovative applications. I think the salient question for AI (and, frankly, humanity!) is: How much AI reasoning can you get for a human-equivalent salary? In other words, for a certain salary, how much compute power will it take to match or outperform a human (assuming the AI can collaborate with other humans/AIs using the same methods and tools a human would)…

… LLMs are shifting from a pure token-in/token-out model to a test-time scaling model, which may offer us better inroads for estimating costs. Essentially, they are thinking harder before spitting out a reply; thus, rather than just predicting the next words in a response using a probability model (see You Auto-Complete Me), they are doing some deep thinking to arrive at more accurate, useful answers. This is a major leap in capability that comes with a major leap in cost. OpenAI raised prices for their o1 model to $200/mo (Pro subscription) from $20 (Plus subscription). For developers, use of o1’s advanced reasoning API comes at 3-4x the cost of their “general purpose” GPT-4o. If o1 were priced at a typical Western office worker wage of $40/hr, the reasoning of the model would equate to around 5 hours of work per month. We also don’t know if the $200/mo price point is profitable for OpenAI or if they are just relying on Microsoft to further subsidize their business model (which brings us back to the principal-agent problem I started this section off with). So, all of my hand waving here seems to imply you can get a decent amount of human-equivalent reasoning for an amount of money in the realm of human labor cost. If true, after a few more years of advancements in semiconductors and AI models, we should have markedly affordable “human reasoning as a service”, an explosion in demand, and a wide range of outcomes for how much human supervision of AI will be required (it may be that human jobs stay relatively flat, but each human is 2x productive, then 4x, etc.).
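Following the article’s own assumptions (a US$200/month Pro subscription and a US$40/hour office-worker wage), the “around 5 hours of work per month” figure is simply the ratio of the two. A minimal sketch of ours:

```python
# Back-of-the-envelope calculation using the article's own numbers.

subscription_per_month = 200.0  # USD, OpenAI Pro price quoted in the article
human_wage_per_hour = 40.0      # USD, "typical Western office worker wage" assumed in the article

hours_equivalent = subscription_per_month / human_wage_per_hour
print(f"{hours_equivalent:.0f} hours of human-equivalent reasoning per month")  # 5
```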

Following this logic, at current AI reasoning costs, companies would need to lay off one human for every AI human equivalent they hire and would probably lose more skill/knowledge than they gain. In other words, based on my attempts to guess the cost of replacing human reasoning, today’s AI offerings aren’t likely compelling enough. In a couple years, however, maybe you will be able to lay off one human and hire a handful of AIs, which, by collaborating with each other and humans, may yield superior results. Even today, extremely high-value tasks, such as in-depth research or stock market predictions, may be able to take advantage of the high-cost test-time scaling AI models. And, if any of this math is in the realm of reason, you can easily see that AI may not require such high-value-add applications to be cost effective in the near to medium future. The proof will come within the next couple of years as today’s entrepreneurs develop the next generation of apps leveraging LLMs and overtaking human capabilities: If these apps are at price points that outcompete human employees, a significant wave of change could come much faster to society. 


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Alphabet (parent of Google and Waymo) and Microsoft. Holdings are subject to change at any time.

Company Notes Series (#4): engcon

Editor’s note: This is the latest edition in the “Company Notes Series”, where we periodically share our notes on companies we’ve studied in the recent past but currently have no vested interest in (we may invest in or sell shares in the companies mentioned at any time). The notes are raw and not updated, and the “as of” date for the data is given at the start of the notes. The first three editions in the series can be found here, here, and here. Please give us your thoughts on the series through the “Contact Us” page; your feedback will determine if we continue with it. Thanks in advance!

Start of notes for engcon

Data as of 31 December 2023

Background

  • Year founded: 1990
  • Listed in Stockholm Stock Exchange (Sweden) since 17 June 2022
  • Headquarters: Strömsund, Sweden

Business

  • engcon manufactures tiltrotator systems that turn excavators into tool carriers (see Figure 1). The hydraulic tools provided by the company include detachable grippers, stone and sorting grabs, combi grabs, and more. See engcon’s YouTube video for more.
Figure 1
  • engcon’s tiltrotator solutions are developed, manufactured and subsequently fitted on new or existing excavators. Dealers serve as a link between excavator manufacturers (OEMs, or original equipment manufacturers), tiltrotator manufacturers, and end-customers. End-customers are contractors, companies that own excavators, and excavator rental companies. engcon has partnerships with OEMs that increase the reach of the company’s products and prepare excavators for faster and easier installations of tiltrotators; the partnerships also provide valuable insight into which technologies OEMs are developing for the future, and engcon contributes knowledge of end-customer requirements.
  • engcon’s tiltrotator solutions are focused on excavators in the weight class of 2 tonnes to 33 tonnes.
  • The production of engcon’s tiltrotator solutions happens in the company’s production sites in Strömsund, Sweden and Niepruszewo, Poland. engcon’s tiltrotator solutions consist of various components designed by the company. Some of the components are also manufactured at engcon’s aforementioned production sites but most of the components are purchased from suppliers in Sweden and Northern Europe. 
  • engcon had sales in 16 markets across the globe in 2022 and its sales split by geographic region in 2022 and 9M 2023 is shown in Figure 2 below. The years in which engcon entered its various markets are:
    • Sweden: 1990
    • Finland and Norway: 1995
    • Denmark and Germany: 2003
    • UK: 2004
    • France: 2014
    • Netherlands: 2016
    • USA: 2017
    • Japan: 2018
    • South Korea and Australia: 2020
    • Canada, Belgium, Ireland, and Austria: 2021-2022
Figure 2
  • The majority of engcon’s sales take place through a global network of dealers. Sales also take place through collaboration with OEM dealer networks. A limited number of products, mainly buckets and tools, are sold through engcon’s website in Sweden, Finland and Denmark.
  • No single customer accounted for >10% of engcon’s sales in 2022, so there’s no customer concentration. But there may be supplier concentration for engcon: engcon’s 10 largest suppliers in 2021 accounted for 58% of the company’s total purchases of raw materials and components.
  • A tiltrotator had an average price (including engcon and competitors) of SEK 176,000 (around US$19,000) in 2021. Dealers typically earn 30% of the price of a tiltrotator.
  • engcon released its 3rd-gen tiltrotator solution in May 2022. The 3rd-gen system is equipped with technology that has never been used on tiltrotators and that takes a clear step towards the electrified, connected and autonomous excavators of the future. The 3rd-gen’s load-sensing technology leads to reduced fuel consumption, improved precision, less wear and tear, and lower maintenance costs. The reduced energy need simplifies the use of alternative fuels for excavators, such as electricity and hybrid solutions. With help from a new sensor technology, the newly developed control system can precisely calculate the tilt and rotation of the tiltrotator, which means improved user-friendliness and greater potential for autonomous operations. Furthermore, the newly developed control system enables a more efficient remote connection, thereby improving remote support as well as the ability to remotely configure equipment.

Market opportunity

Newly manufactured excavator market for engcon

  • Globally, 665,000 excavators were sold in 2021. Of these 665,000 excavators, a total of 181,775 excavators belonging to the 2-33 tonne weight class (engcon’s focus is on excavators in that weight class) were sold in the Nordics, Europe, Americas, and Asia/Oceania; these regions are engcon’s key geographical markets as shown in Figure 2, and are named as the Focus Markets by the company. In the same year (2021), 12,934 tiltrotators for newly manufactured excavators, and 1,750 tiltrotators for existing excavators, were sold. The value of the tiltrotators sold was SEK 2.6 billion (around US$285 million). (A rough consistency check of these figures appears after Figure 3 below.)
  • The number of excavators sold in the Focus Markets compounded at 6% per year for 2016-2019. COVID affected the market in 2020, but ultimately, the number of excavators sold in the Focus Markets still compounded at 2% per year for 2016-2021. The historical growth in the excavator market for each of engcon’s Focus Markets:
    • Nordic: 7,206 excavators sold in 2021, CAGR (compound annual growth rate) of 3% for 2016-2019, CAGR of 1% for 2016-2021
    • Europe: 76,097 excavators sold in 2021, CAGR of 6% for 2016-2019, CAGR of 2% for 2016-2021
    • Americas: 62,972 excavators sold in 2021, CAGR of 10% for 2016-2019, CAGR of 4% for 2016-2021
    • Asia/Oceania: 35,481 excavators sold in 2021, CAGR of 2% for 2016-2019, CAGR of -1% for 2016-2021
  • The number of tiltrotators sold in the Focus Markets had a CAGR of 11% for 2016-2021, including a 15% decline in 2020 because of COVID. The value of tiltrotators sold in the Focus Markets had a CAGR of 15% for 2016-2021.
  • According to PwC, the value of the tiltrotators market is expected to compound at 19% annually from 2021 to 2026, driven by: (1) greater demand for productivity increases; (2) population growth and urbanisation; (3) lack of labour; (4) sustainability requirements; (5) excavators transitioning to becoming multi-purpose tool carriers and more autonomous; and (6) digitalisation and electrification of the construction market.
  • According to PwC: (1) Excavators equipped with tiltrotators are able to replace 2.2 other construction machines on average; (2) a tiltrotator can increase the productivity of an excavator by 25%; (3) the use of a tiltrotator can save 6,000 litres of diesel annually, thus reducing 16,200 kg of CO2 emissions per year; (4) excavators with tiltrotators have a better safety profile as operators can exchange tools from within the cabin.
  • The penetration rate of tiltrotators in newly manufactured excavators was 2% globally in 2021, 85% in the Nordics (92% in Sweden), and 7% in the Focus Markets. The penetration rate is closely connected to the maturity of the market, which can be divided into 3 phases: development, acceleration, and mature. In the development phase, the penetration rate increases from 0% to 20%-25%. In the acceleration phase, the penetration rate has passed 20% and risen to 60%. The tipping point between the development phase and the acceleration phase is where the tiltrotator takes the step to becoming an established market standard. Authorities and clients, such as major construction and civil engineering companies, place requirements on excavators to be equipped with a tiltrotator for efficiency and safety reasons. Once the tipping point has been reached, the sales of tiltrotators to both new excavators and the aftermarket tend to gain momentum.
  • The market for tiltrotator manufacturers has 5 major operators (see Figure 3) that account for 95% of sales. engcon is the largest, with a market share of 45%. Tiltrotator manufacturers can be divided into 4 groups: global manufacturers, local manufacturers, other operators whose core operations are not tiltrotators, and excavator manufacturers (OEMs) with in-house manufactured tiltrotators. The 5 largest tiltrotator manufacturers are all global manufacturers, 4 of which are Swedish. All 5 collaborate with OEMs, and their product portfolios include quick couplers, tools, and other advanced attachments for excavators. engcon’s market share has increased from 42% in 2019 and 43% in 2020 to the current 45%.
Figure 3
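A rough consistency check of our own on the 2021 figures above: units sold multiplied by the average tiltrotator price lands close to the stated market value of SEK 2.6 billion. The SEK/USD rate is only implied from the notes’ own price conversion, so treat it as approximate.

```python
# Consistency check (ours) of the 2021 tiltrotator market figures quoted in these notes.

new_units, retrofit_units = 12_934, 1_750   # tiltrotators sold in 2021 (new + existing excavators)
avg_price_sek = 176_000                     # average price across engcon and competitors
sek_per_usd = 176_000 / 19_000              # implied FX rate from the notes (~9.3 SEK/USD)

market_value_sek = (new_units + retrofit_units) * avg_price_sek
print(f"SEK {market_value_sek / 1e9:.2f} billion")              # ~2.58 vs ~2.6 stated
print(f"US${market_value_sek / sek_per_usd / 1e6:.0f} million") # ~279 vs ~285 stated
```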

Existing excavator market for engcon

  • The number of newly manufactured excavators in engcon’s Focus Markets for 2022-2026 that will not be equipped with tiltrotators is expected to be 960,000. This provides a large pool of retrofitting potential for engcon.

Management and major shareholders

  • engcon has Class A and Class B shares. Class A shares carry 10 votes per share while Class B shares have 1 vote per share. The Class B shares are public-listed. At end-2022, engcon had a total share count of 151.788 million (35.34 million Class A shares, 116.44 million Class B shares).
  • Stig Engstrom, 62, is the founder of engcon. He handed over the CEO role to Orjan Westerlund in 2003, and has been on the board of engcon since. Stig Engstrom controlled 29.04 million Class A shares and 24.74 million Class B shares at end-2022, which amounted to 35.4% of engcon’s total share count, but 67.1% of the total votes.
  • Stig Engstrom’s ex-wife, Monica Engstrom, has been on engcon’s board since 2004. Monica Engstrom controlled 6.31 million Class A shares and 42.21 million Class B shares at end-2022, which amounted to 32.0% of engcon’s total share count, but 22.4% of the total votes. (A quick check of these ownership and voting figures appears after Table 2 below.)
  • engcon’s CEO is Krister Blomgren, 58, who has been in the role since 2011. Blomgren controlled 1.259 million engcon Class B shares as of end-2022, which is 0.8% of the total share count. 
  • Other members of engcon’s management team are shown in Table 1 below (some of them have long tenures, which is good):
Table 1
  • Remuneration of Stig Engstrom and Krister Blomgren for 2019-2022 is shown in Table 2 below. Not many details are given on how they are compensated beyond the total amounts. The big jump in compensation for Blomgren in 2022 bears watching, but is only a tiny percentage of engcon’s profit and cash flow during the year.
Table 2
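A quick check of our own on the ownership and voting figures above, using 10 votes per Class A share and 1 vote per Class B share as stated:

```python
# Reproducing the capital and voting percentages from the shareholdings quoted above.

TOTAL_A, TOTAL_B = 35.34e6, 116.44e6   # Class A and Class B shares outstanding at end-2022
TOTAL_SHARES = 151.788e6
TOTAL_VOTES = TOTAL_A * 10 + TOTAL_B   # Class A carries 10 votes, Class B carries 1

def stake(a_shares: float, b_shares: float) -> tuple[float, float]:
    """Return (share of capital, share of votes) for a holding."""
    capital = (a_shares + b_shares) / TOTAL_SHARES
    votes = (a_shares * 10 + b_shares) / TOTAL_VOTES
    return capital, votes

for name, a, b in [("Stig Engstrom", 29.04e6, 24.74e6),
                   ("Monica Engstrom", 6.31e6, 42.21e6)]:
    capital, votes = stake(a, b)
    print(f"{name}: {capital:.1%} of capital, {votes:.1%} of votes")
# Stig Engstrom: 35.4% of capital, 67.1% of votes
# Monica Engstrom: 32.0% of capital, 22.4% of votes
```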

Financials

  • engcon’s revenue compounded at 16% per year from 2012 to 2022, and its EBIT margin doubled from 11% to 22% over that period. See Figure 4.
Figure 4
Table 3
  • From Table 3 above, engcon’s revenue CAGR from 2019 to the 12 months ended 30 September 2023 is 16.7%. Net income CAGR is 25.6%, and FCF CAGR is 44.8%. Average net income margin is 15.9%, and average FCF margin is 14.0%. (A minimal sketch of how these CAGRs are computed appears after this list.)
  • engcon saw a large pull-forward of orders in 2021 Q4 and 2022 Q1, mainly in Nordic and Europe, due to price increases and uncertainty concerning delivery times, combined with an uncertain business environment and long lead times. So engcon expects 2023’s overall revenue growth to be just 8% (2023 Q1 growth was 55%, 2023 Q2 was -5%, and 2023 Q3 was -6%). Operating income also fell sharply in 2023 Q2 (down 12%) and 2023 Q3 (down 51%).
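The growth rates quoted from Table 3 are compound annual growth rates (CAGR). A minimal sketch of the calculation follows; the revenue figures below are hypothetical placeholders (Table 3 itself is not reproduced in these notes), and the span from full-year 2019 to the 12 months ended 30 September 2023 is treated as roughly 3.75 years.

```python
# CAGR sketch with hypothetical placeholder values, chosen only to land near the 16.7% quoted.

def cagr(start_value: float, end_value: float, years: float) -> float:
    """Compound annual growth rate over a possibly fractional number of years."""
    return (end_value / start_value) ** (1 / years) - 1

hypothetical_2019_revenue = 1_000.0   # placeholder, not engcon's actual figure
hypothetical_ltm_revenue = 1_785.0    # placeholder, not engcon's actual figure
print(f"{cagr(hypothetical_2019_revenue, hypothetical_ltm_revenue, 3.75):.1%}")  # ~16.7%
```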

Valuation

  • Stock price on 31 December 2023: SEK 93.30
  • Trailing EPS = SEK 2.33; trailing P/E = 40
  • Trailing FCF per share = SEK 2.80; trailing P/FCF = 33 (see the quick check below)
  • For a company that is very likely going to post a further year-on-year decline in net income and FCF in 2023 Q4, those valuations look high.
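A quick arithmetic check of our own on the multiples above, from the stock price, trailing EPS, and trailing FCF per share given:

```python
# Recomputing the trailing multiples from the per-share figures in these notes.

price_sek = 93.30
trailing_eps_sek = 2.33
trailing_fcf_per_share_sek = 2.80

print(f"Trailing P/E:   {price_sek / trailing_eps_sek:.0f}")            # ~40
print(f"Trailing P/FCF: {price_sek / trailing_fcf_per_share_sek:.0f}")  # ~33
```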

Risks

  • engcon protects its business via patents, of which the most important relates to EC-Oil, which is a quick coupler system that allows for the replacement of hydraulic tools from the excavator’s cabin without the mounting of hoses and electrical cables. The patent, which has a maximum validity up to and including 2024, is not assessed to be business-critical, but it still helps to distinguish engcon’s tiltrotator systems and make it more difficult for competitors to copy. When the patent for EC-Oil expires, it may be difficult for engcon to have a distinctive product offering. 
  • The sale of excavators globally has been stronger than what I expected before researching engcon. But the overall construction market – including the sale of excavators – is probably sensitive to recessions. So future recessions are a risk.
  • There’s the possibility that someone comes up with a superior tiltrotator or similar solution to what engcon has.
  • In the shorter term, engcon has clearly been over-earning in 2021 and 2022, and is now suffering the hangover in 2023. Will the hangover last a long time? That’s a possibility, despite tiltrotators being a superior solution. 
  • In June 2022, Rototilt Group filed a lawsuit against engcon that alleged that the company had infringed upon a patent. The adjusted damages claimed amounted to approximately SEK 200 million. The alleged infringement relates to sensor technology in the Q-safe locking system. In May 2023, the Swedish Patent and Market Court announced its verdict regarding Rototilt’s lawsuit against engcon. The court determined that no infringement had taken place and therefore dismissed Rototilt’s action. At the same hearing, engcon claimed that Rototilt’s patent should be declared invalid. However, the court determined that the patent was valid. Following appeals, both parties were granted leave to appeal by the Swedish Patent and Market Court. A ruling in the higher court is expected in spring 2024 at the earliest.

Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have no vested interest in any company mentioned. Holdings are subject to change at any time.

What We’re Reading (Week Ending 05 January 2025)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since the audience for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general.

Here are the articles for the week ending 05 January 2025:

1. Mike Alkin – Talking Uranium (Transcript here) – Bill Brewster and Mike Alkin

Alkin: So coming to this market, I did that. I spent a good almost couple of years doing supply/demand on my own. There’s 430 reactors around the world. And understanding the country where they operate, the attitude towards nuclear, understanding the math involved. Often as investors, you look for heuristics. How many reactors are there? How many pounds per reactor would there be? You’re looking for rules of thumb. As you start peeling the onion back, I realize that rules of thumb don’t apply here because the amount of uranium needed for the reactor fleet around the world is not always the same. It depends upon enrichment capacity. We won’t go down that rabbit hole, but there’s a whole other segment you need to learn.

As I was doing that, I would go to these conferences and I would talk to nuclear fuel buyers, people who buy this stuff. It was hard for me at first to really understand what I was dealing with because as somebody at that time having well over 20 years of experience as a hedge fund investor, I talked to people in all industries that were on all sides of the equation. But the people buying it typically were curious as to what we were thinking when we were questioning them. If we were talking to a buyer at a company that was buying a product, they would say “What are you as an investor hearing? What are you hearing from the other side? What are my competitors saying? What are you hearing about inventories?” They were inquisitive. That was not this cohort. As I started speaking to nuclear fuel buyers, I was met with an enormous wall put in front of me telling me, “I’m an outsider, I’m not a nuclear engineer, I don’t know what I’m doing, I should basically stay away and they’ve got it.”

I thought it was that attitude that just said to me, “Something’s not right here because the numbers I’m coming up with, whether I’m looking at inventories or the amount of the cost of the supply, or the actual demand” – for context, at the time the price of uranium was $17, $18, $19 a pound. It would say what it was trading for in the market. As I did the analysis, I realized that the average cost was somewhere in the mid-$50s. I’m not that sharpest tool in the shed but I know that if something costs you mid-$50s to make, you can’t sell it for $17 for very long. So it was then that I had to peel back the onion saying, “Why are they producing it at that price?” Then you start to understand that the uranium market is one driven mostly by long term contracts. Well north of 80% on average will trade in a long-term window with contracts that cover 5, 7, 10, 12, 15 years depending on the contract. But that’s where most of the pounds trade. After the Fukushima event, a lot of these uranium producers, when the spot market had declined precipitously, were still selling into much higher prices. My understanding of that when I was talking to fuel buyers at these nuclear conferences, they were telling me that the price of uranium was $17 and $18, it was going to $10, it was going to $5. There was all this uranium out there.

That’s not what my math was showing me. What my math was showing me was that the long term contracts that had been signed before Fukushima melted down in 2011 were going to start to expire, and rather rapidly. Uranium producers could not sell $17, $18, $20 uranium when it cost them 2.5 times that. At some point, production would have to start to shut down.

So you ask, “Do you think you’re crazy?” Yes, because as I’m talking to people who are obviously very sharp – they’re nuclear engineers – but it’s understanding, as you realize, as an investor, you have to understand incentives and you have to understand market structure. Charlie Munger would always say, “Show me the incentive, I’ll show you the outcome.” It was as I was starting to go and talk to these folks and realizing a couple of things. Number one is, they had no interest in what I was learning on my journey. Even though I’m not a nuclear engineer, I’m still somebody who’s a market participant. I’m still somebody that while I don’t speak their language, sitting at a dinner table or a lunch table or at a bar having a beer with them, I certainly could hold my own in supply/demand conversation. And as I would talk about what I was learning and uncovering, I was shot down at every step. I thought, “Wow, that’s interesting because I’m seeing a recency bias. What is now will always be.” So they were kind of latched onto that.

Then as I started peeling that, I’m thinking, “Why is this?” I’ve been doing this a very long time. Over the years, I’ve been wrong many times. I’ve been right more often than not. But you’re wrong and you try and understand where you’ve been wrong. I was thinking, “What is it? Why are they so uninterested in hearing what an outsider’s view is?” As I started to explore that more, you start to understand the makeup and the cost structure of a nuclear reactor, which I have known, but it really started to come into clear vision for me was the fuel. Uranium is just one part of the fuel cycle that goes in. You have uranium, they convert uranium from a powder into a gas. It then gets enriched, it then gets fabricated into pellets. That takes 18 to 24 months to do this stuff. There’s many different stages of the fuel cycle. As I was starting to think about what are the costs of that, all those stages are probably around 20% to 25%. What’s the cost of the uranium? That depends on the price. But it could be mid-single digits, high-single digits, somewhere around that. As you start talking to them about that, you realize it’s not a meaningful cost.

For comparative purposes, if I’m running a natural gas power plant or a coal power plant, my feedstock, the natural gas and the coal are 80% to 90% of the cost of operating it. Here, the uranium is single digits cost of operating it. The vision that started to come to me was uninterested market participants. They’re in the market very infrequently. Why are they uninterested? Because the cost is de minimis. Not to say it’s meaningless, but it’s de minimis. Then as I started to explore and ask questions, “Why are you not as concerned about this?” I was obviously met with a wall.

But what started to come to me was – and I asked flat out at a particular dinner at a World Nuclear Conference – I asked one, actually there were four fuel buyers at a dinner, I said, “If you all had a really enterprising fuel buyer that did the supply/demand work and said, “I think consensus is wrong. Here we are, $17, $18, $20 a pound. We should be buying uranium because the forecasts going out of the future are for deficits to be forming.” Let me ask you a question. Do you all, if the price were to go parabolic and you had all these great cost savings for your plant, do you participate that in any way, shape or form? Are you rewarded financially? Are you rewarded with a promotion?” The answer was I got laughed at. “What are you talking about? We’re paid to secure fuel.” These were buyers. As you come to a market as an investor, you think buyers are traders – they’re commercial creatures. These aren’t. These are really smart nuclear engineers that happen to buy a product that happens to not be a major cost component. There’s infrequent price discovery on their part and so it’s a lesson in understanding incentives and market structure…

Alkin: One of the things you see now is you have expert networks who provide hedge funds and mutual funds experts to speak to in any industry. If you’re a hedge fund wanting to get up to speed right now on the nuclear power industry, you’re going to say, “Get me three nuclear fuel buyers. I’d like to speak to them about uranium.” They’re going to get on the phone and they’re going to speak to them. For years – though I’m sure they’ve been doing this – they can get on the phone and speak to three fuel buyers and they say, “Yeah, there’s plenty of uranium out there.” Those are the same folks who, when the price was $17, were telling me that, versus here you’re seeing floors and ceilings at $125 and $135. They are the gift that keeps on giving. Yet the way the structure of the research process is, they’re going to expert networks. They find these people, and if you don’t understand how the sausage is made, you’re going to be misled. They’re not purposely misleading you. It’s just what their own beliefs are. For me, that’s a beautiful thing. I’ve been doing this a long time now, almost 30 years as a professional investor, and I’ve never seen a cohort of people who are so uninterested in hearing the other side of the story. So far I’ve seen prices move up 4x against them and they still have the same attitude.

Brewster: To your point, it doesn’t sound like they’re very incentivized to care.

Alkin: There’s very little to no incentive to care, other than maybe you would think pride? I don’t know. But it doesn’t matter. It’s just not a thing. We actually chuckle because when we go to these conferences, you talk to them in a hallway or in a bar, it’s as though you’re an adversary. It’s very bizarre. They don’t have an incentive. It doesn’t matter what they pay. So that’s the bizarre thing.

2. Chip Cities Rise in Japan’s Fields of Dreams – Gearoid Reidy

In Chitose, a city of 100,000 in the northernmost main island of Hokkaido, billboards seek recruits for the Self-Defense Forces, which saw a 50% shortfall last year. When I arrived on a fully booked plane from Tokyo packed with salarymen in cheap suits and expensive watches, it was easy to see where the competition was coming from: a half-dozen towering cranes jutting into the sky, a jarring contrast against the surrounding countryside…

…Those cranes are building the first fab for Rapidus Corp., a public-private venture that aims to skip Japan to the head of the chip production queue. Founded just two years ago, it hopes to produce cutting-edge, 2-nanometer chips by 2027, in cooperation with IBM Corp. It’s fraught with risks, and the government’s record in promoting industry is spotty. But this is just the latest and most ambitious example of a series of bets on chips, with Prime Minister Shigeru Ishiba recently pledging an extra ¥10 trillion ($66 billion) on top of ¥3.9 trillion invested since 2021. Near the other end of the Japanese archipelago, 1,500 kilometers (930 miles) to the southwest, is another. In Kumamoto, on the island of Kyushu, mass production is soon set to begin at a $7 billion semiconductor plant.

Here, Taiwan Semiconductor Manufacturing Co., drawn by government subsidies and the region’s supply chain, opened its first Japanese plant in February. A second is in the works, with authorities lobbying for a third. It’s triggered an influx of Taiwanese workers into a city where until recently almost everyone was Japanese…

…As many as 6,000 laborers are employed to build Rapidus. But talk is of the arrival of permanent workers once test production begins. That’ll bring at least 1,000 high-earning jobs, along with their supply chains. On my visit, ASML Holding NV, the Dutch maker of chipmaking equipment, had just opened offices, with 50 staff expected. Every second building seems to be being torn down and rebuilt…

…The scale of the ambition creates the risk of spectacular failure, one many in Japan’s media fully expect. Skepticism is warranted, considering previous government-led efforts, from DRAM maker Elpida Memory Inc., sold to Micron Technology Inc. after its 2012 bankruptcy, to troubled Japan Display Inc.

The economy was already doing well even before talk of Rapidus, Mayor Ryuichi Yokota told me, describing the fab as a “Big Bang” that has the city scrambling. Yet at night, when the construction crews leave, the silence is deafening. I couldn’t feel the billions I expected to find flowing, just a cold wind that would soon begin to turn to snow…

…The risk from disaster is unpredictable; but what if these experiments simply don’t work out? Japan has spent billions on subsidies to bring a foreign company in Kumamoto. And when it comes to Rapidus, the risks are immense. Even if the company can find the talent it needs (the country is expected to have a shortfall of 40,000 engineers), the technology succeeds and yields are acceptable, it still has to outcompete rivals — including TSMC — to attract customers with an unproven product.

Chitose mayor Yokota shrugged off these concerns. “I’m convinced it will succeed,” he said, resolute that researchers currently studying with IBM in the US will return, like Meiji-era scholars, with secrets Japan can use to rebuild.

3. Before Berkshire: Warren Buffett’s Tab Card Triumph – Kingswell and Alice Schroeder

He decided that he would come in and invest in this company — Mid-Continent Tab Card Co. — but, interestingly, he did not take Wayne and John’s word for it. The numbers they gave him were really enticing, but again he went through and he acted like a horse handicapper.

Here’s another point of departure from what almost anybody else would do. Everybody that I know — or knew as an analyst — would have created a model for this company and would have projected out its earnings and would have looked at its return on investment in the future. Warren didn’t do that. In fact, in going through hundreds of his files, I’ve never seen anything that resembled a model.

What he did is he did what you would do with a horse. He figured out the one or two factors that could make the horse succeed or fail — and, in this case, it was sales growth and making the cost advantage continue to work. Then, he took all of the historical data, quarter by quarter for every single plant, he got similar information as best he could from every competitor they had, and he filled pages with little hen scratches of all this information and he studied that information.

And, then, he made a yes/no decision. He looked at it: They were getting 36% margins [and] they were growing over 70% a year on a million of sales. Those were the historic numbers. He looked at them in great detail — just like a horse handicapper studying the tip sheet — and then he said to himself, “I want a 15% return on $2 million of sales.” And then he said, “Yeah, I can get that.” And he came in as an investor.

So what he did is he incorporated his whole earnings model and compounding discounted cash flow into that one sentence. “I want 15% on $2 million of sales.”

Why 15%? Because Warren is not greedy. He always wants a mere 15% day one return on an investment and then it compounds from there. That’s all he has ever wanted. He’s happy with that. It’s a very simple thing. There’s nothing fancy about it…

…The $2 million of sales was pretty simple, too. It had $1 million [and] it was growing 70%. There was a big margin of safety built into these numbers. It had a 36% profit margin and he said, “I’ll take half that.”

He ended up putting $60,000 of his personal non-partnership money into this company, which was about 20% of his net worth at the time. He got 16% of the company’s stock, plus some subordinated notes.

4. China’s Bond Yields Scream the ‘D’ Word – Lingling Wei

Over the past week, just as Chinese leaders tried to get the public—and markets—excited with another round of stimulus talk, China’s 10-year sovereign yield kept falling to fresh lows. Now, the yield is around 1.7%, a full percentage-point plunge from a little over a year ago. The return on the 30-year government bond has also dropped below 2%.

The sovereign-debt yield still has a ways to go before falling to zero, but the speed of the drop is astonishing. The lower the yield falls, the deeper the market is signaling economic stress.

…In reality, Beijing is sticking to the formula of boosting demand through investment. The official thinking is, investment creates jobs, which would in turn create demand. That means more roads will be built, factories will be expanded and debts will continue to rise. Already, residents in some cities are complaining about the inconvenience from old roads being dredged up as authorities search for ways to invest.

One big irony is the source of bond buying—the force pushing down the yields.

State-owned banks, insurance firms and funds, the very institutions Beijing is counting on to support the economy, are the major purchasers of government bonds. These institutions would rather park their money in the safety of bonds than finance business projects or otherwise put it to work.

“What’s good to invest in these days when demand is so low?” a Chinese banker told me, referring to weak business and consumer spending.

5. An Interview with Gregory Allen About the State of China Chip Export Controls – Ben Thompson and Gregory Allen

Here’s the question though. China doesn’t generally seem to be operating, and for good reason under the circumstances, under a real stringent return on invested capital calculation. I mean the 7nm chips that are being produced, we know with I think a pretty high degree of certainty, the yields are terrible.

GA: The yields are dreadful.

But they’re doing it anyway just because it needs to be done and this sort of ties into another thing. You referenced Dylan Patel and SemiAnalysis, who have been pretty strident critics of the enforcement of chip controls. But I think a good point he has made is that China, unlike the US, is not necessarily constrained in power or in the ability to build a ton of data centers, and so there’s a bit where they could just sort of — it’s not great, but they could just be way less efficient and accomplish similar things. Is there a bit where these export controls are fashioned with Western/US constraints and concerns about how you go about building this stuff that might make them less impactful in the long run?

GA: Yeah, the export controls have not achieved their wildest dreams. There was a faction in the Biden administration that says, “Bwahaha, we found the secret weapon, and China’s AI dreams are gone” — that theory is just dead. Where we are now is at more of a cost imposition strategy. “We are going to make this as expensive and complicated as possible for you to do it, we’re going to try and slow you down, we’re going to try and increase your costs, and that is the race that we’re going to run”.

I mean, if you think about it, we’re switching from a mode in which the US AI ecosystem and the Chinese AI ecosystem were largely fused such that if we’re running a race, you can imagine there’s US people giving China Gatorade and those new Nike shoes that make you run faster. Now we’re moving to a moment where we’re trying to trip them in the race, that’s the change in mindset that we’ve experienced, and it’s not working to its most extreme form, but there is real cost imposition. It takes the form of the fact that SMIC has to operate at these dreadful yields and the economics are terrible, and the fact that when they’re building all of these data centers, they’re having to use lousy chips, they’re having to buy more of them, and they’re having to deal with the higher energy costs of all of that.

It’s true that China does have just this extraordinary willingness to spend, but the point is we’re in this race, we’re in this competition, and it gives us an edge, not an infinite edge, but a meaningful edge.

This is a field, maybe you don’t have an answer to this, but there are some that argue that actually the better approach to some of these chips is a much more expensive, a much more high speed memory approach that has much lower latency using SRAM instead of High Bandwidth Memory. Is there a possibility that we actually pushed China down a different route towards developing these chips that maybe ends up being better because we thought HBM was the right way?

GA: I think that’s probably not what’s going to happen. It’s definitely worth saying that that could happen, a version of that kind of happened with YMTC and their NAND memory. There were multiple different approaches they could have taken technologically. All the Western and US allied Asian firms picked one way because it was obviously the best economics, and they held all the intellectual property, they held all the patents and so YMTC basically said, “Okay, we’re going to go down this other road and because we’re so heavily subsidized, it doesn’t really matter that it’s going to be more expensive”, and they did ultimately figure out how to get it to work.

I think what you’re describing, the SRAM in massive quantities thing verges on the neuromorphic architecture, and it’s not that that’s impossible, and it’s not that that’s never going to happen, but it’s clearly not the right step for China right now. I think they have a path to domestic HBM production and that’s so much easier for them to chase than a SRAM revolution. I think traditionally they would just wait for somebody else to try and figure out and demonstrate that it’s possible and then they would throw infinite resources at it…

...For all of these chip controls, all this stuff that you’ve covered and written about, does any of it matter, if you add it all up, in comparison to that point that they don’t have EUV?

GA: EUV is the highest return on investment export control that we have had and are likely to have. It’s definitely the case that some of the other stuff hurts. If you talk about SMIC, for example, increasing their yields on their 7nm line and expanding the capacity of their 7nm line, they actually are bottlenecked by US equipment, a lot of US metrology equipment, etc. But if you want to talk about why they can’t—

But they do have the equipment, they just need to figure out how to duplicate it. The challenge with EUV is they don’t even have one, so duplicating it is that much harder.

GA: Yes, exactly. It’s a lot harder to reverse engineer something that you don’t have a copy of; it really helps to have a copy of it. So I would say the EUV thing really matters, but there are areas where China is facing headwinds that aren’t part of the EUV story.

So just to take one example, in DRAM, Micron still doesn’t use EUV in their production of DRAM, and they’re a globally competitive firm. So CXMT, the Chinese domestic champion of DRAM, the reason why they’re not currently globally competitive is not the absence of EUV, but I do think you could make a story that it is the absence of all this other stuff that we’ve been refusing to sell…

You’re not necessarily a geopolitical analyst, but the thing that scares me about all this, and I think I’ve asked you this every time, is that we’re saying the administration needs to do better at enforcing these laws, laws that guarantee a power imbalance in the long run, and that is usually very destabilizing. China might think: if we’re going to have a fundamental power imbalance anyway, how about we take Taiwan off the board, because that will screw everyone? Now we’re equal again. Do you worry about this? You’re a strong advocate for doing this better.

GA: So, number one, I don’t know that I ever agreed that a balance of power is the stable universe. In 1994, the Taiwanese defense budget was half of the Chinese defense budget; now the Chinese defense budget is infinity times that of the Taiwanese defense budget. And by contrast, in 1997, I think there was a single U.S. aircraft carrier battle group that was more than capable of defeating the entire Chinese Navy and the entire Chinese Air Force. That was a massive power imbalance, and it was a very stable relationship. And by the way, it was a relationship in which a lot of people got rich and had productive free trade and all these kinds of happy relationships. So the idea that power parity is the path to peace here, I don’t know that I necessarily agree with that; I don’t think the historical record really bears that out.

Now, you could argue: if we’re going to make bold moves and try and seize a decisive advantage, could those bold moves be destabilizing? Yeah, I definitely think so.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in ASML and TSMC. Holdings are subject to change at any time.

What We’re Reading (Week Ending 29 December 2024)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership-audience for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general. 

Here are the articles for the week ending 29 December 2024:

1. Quantum Computers Cross Critical Error Threshold – Ben Brubaker

In the 1990s, researchers worked out the theoretical foundations for a way to overcome these errors, called quantum error correction. The key idea was to coax a cluster of physical qubits to work together as a single high-quality “logical qubit.” The computer would then use many such logical qubits to perform calculations. They’d make that perfect machine by transmuting many faulty components into fewer reliable ones…

…This computational alchemy has its limits. If the physical qubits are too failure-prone, error correction is counterproductive — adding more physical qubits will make the logical qubits worse, not better. But if the error rate goes below a specific threshold, the balance tips: The more physical qubits you add, the more resilient each logical qubit becomes.

Now, in a paper published today in Nature, Newman and his colleagues at Google Quantum AI have finally crossed the threshold. They transformed a group of physical qubits into a single logical qubit, then showed that as they added more physical qubits to the group, the logical qubit’s error rate dropped sharply…

…At first, many researchers thought quantum error correction would be impossible. They were proved wrong in the mid-1990s, when researchers devised simple examples of quantum error-correcting codes. But that only changed the prognosis from hopeless to daunting.

When researchers worked out the details, they realized they’d have to get the error rate for every operation on physical qubits below 0.01% — only one in 10,000 could go wrong. And that would just get them to the threshold. They would actually need to go well beyond that — otherwise, the logical qubits’ error rates would decrease excruciatingly slowly as more physical qubits were added, and error correction would never work in practice…
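One way to see why being barely under the threshold is not enough is the standard rule-of-thumb scaling for a distance-d code (the form below is a textbook approximation, stated here as an assumption rather than a figure from the article):

```latex
p_L \;\approx\; A \left(\frac{p}{p_{\mathrm{th}}}\right)^{(d+1)/2}
```

If the physical error rate p sits only slightly below the threshold p_th, the ratio in the parentheses is close to 1, so raising the code distance d (that is, adding physical qubits) lowers the logical error rate p_L only very slowly; the suppression becomes rapid only once p is several times smaller than p_th.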

…That variation, called the surface code, is based on two overlapping grids of physical qubits. The ones in the first grid are “data” qubits. These collectively encode a single logical qubit. Those in the second are “measurement” qubits. These allow researchers to snoop for errors indirectly, without disturbing the computation.

This is a lot of qubits. But the surface code has other advantages. Its error-checking scheme is much simpler than those of competing quantum codes. It also only involves interactions between neighboring qubits — the feature that Preskill found so appealing.

In the years that followed, Kitaev, Preskill and a handful of colleagues fleshed out the details of the surface code. In 2006, two researchers showed that an optimized version of the code had an error threshold around 1%, 100 times higher than the thresholds of earlier quantum codes. These error rates were still out of reach for the rudimentary qubits of the mid-2000s, but they no longer seemed so unattainable…

…Fowler, Martinis and two other researchers wrote a 50-page paper that outlined a practical implementation of the surface code. They estimated that with enough clever engineering, they’d eventually be able to reduce the error rates of their physical qubits to 0.1%, far below the surface-code threshold. Then in principle they could scale up the size of the grid to reduce the error rate of the logical qubits to an arbitrarily low level. It was a blueprint for a full-scale quantum computer…

…When you put the theory of quantum computing into practice, the first step is perhaps the most consequential: What hardware do you use? Many different physical systems can serve as qubits, and each has different strengths and weaknesses. Martinis and his colleagues specialized in so-called superconducting qubits, which are tiny electrical circuits made of superconducting metal on silicon chips. A single chip can host many qubits arranged in a grid — precisely the layout the surface code demands.

The Google Quantum AI team spent years improving their qubit design and fabrication procedures, scaling up from a handful of qubits to dozens, and honing their ability to manipulate many qubits at once. In 2021, they were finally ready to try error correction with the surface code for the first time. They knew they could build individual physical qubits with error rates below the surface-code threshold. But they had to see if those qubits could work together to make a logical qubit that was better than the sum of its parts. Specifically, they needed to show that as they scaled up the code — by using a larger patch of the physical-qubit grid to encode the logical qubit — the error rate would get lower.

They started with the smallest possible surface code, called a “distance-3” code, which uses a 3-by-3 grid of physical qubits to encode one logical qubit (plus another eight qubits for measurement, for a total of 17). Then they took one step up, to a distance-5 surface code, which has 49 total qubits. (Only odd code distances are useful.)
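To make those qubit counts concrete, here is a minimal sketch (plain Python, my own bookkeeping rather than anything from the article) of the usual surface-code arithmetic, where a distance-d patch uses d² data qubits plus d² − 1 measurement qubits:

```python
def surface_code_qubits(d: int) -> tuple[int, int, int]:
    """Return (data, measurement, total) qubit counts for a distance-d surface code."""
    data = d * d              # grid of data qubits
    measurement = d * d - 1   # interleaved measurement qubits
    return data, measurement, data + measurement

# Reproduces the counts quoted above: 17, 49, and 97 total qubits.
for d in (3, 5, 7):
    print(d, surface_code_qubits(d))
```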

In a 2023 paper, the team reported that the error rate of the distance-5 code was ever so slightly lower than that of the distance-3 code. It was an encouraging result, but inconclusive — they couldn’t declare victory just yet…

…At the beginning of 2024, they had a brand-new 72-qubit chip, code-named Willow, to test out. They spent a few weeks setting up all the equipment needed to measure and manipulate qubits…

…Then a graph popped up on the screen. The error rate for the distance-5 code wasn’t marginally lower than that of the distance-3 code. It was down by 40%. Over the following months, the team improved that number to 50%: One step up in code distance cut the logical qubit’s error rate in half…

…The team also wanted to see what would happen when they continued to scale up. But a distance-7 code would need 97 total qubits, more than the total number on their chip. In August, a new batch of 105-qubit Willow chips came out…

…When the group returned the following morning, they saw that going from a distance-5 to a distance-7 code had once again cut the logical qubit’s error rate in half. This kind of exponential scaling — where the error rate drops by the same factor with each step up in code distance — is precisely what the theory predicts. It was an unambiguous sign that they’d reduced the physical qubits’ error rates well below the surface-code threshold…
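As a rough illustration of what that exponential suppression implies if it were to continue, here is a small sketch that simply extrapolates the roughly 2x improvement per step in code distance; the starting error rate is a made-up placeholder, not Willow’s measured value:

```python
# Illustrative projection only: assume each step up in code distance (d -> d + 2)
# keeps cutting the logical error rate by the same factor of ~2 described above.
base_distance = 3
base_error = 3e-3              # hypothetical logical error rate at distance 3
suppression_per_step = 2.0

for d in range(3, 16, 2):
    steps = (d - base_distance) // 2
    projected = base_error / suppression_per_step ** steps
    print(f"distance {d:2d}: ~{projected:.1e}")
```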

…At the same time, researchers recognize that they still have a long way to go. The Google Quantum AI team only demonstrated error correction using a single logical qubit. Adding interactions between multiple logical qubits will introduce new experimental challenges.

Then there’s the matter of scaling up. To get the error rates low enough to do useful quantum computations, researchers will need to further improve their physical qubits. They’ll also need to make logical qubits out of something much larger than a distance-7 code. Finally, they’ll need to combine thousands of these logical qubits — more than a million physical qubits.

2. History: Kodak & Fujifilm – Find Value

Ultimately, Kodak couldn’t adapt to the changing world and filed for bankruptcy in 2012.

In the game for over 100 years, Kodak survived two World Wars and the Great Depression and helped humans photograph the moon and Mars. Like Coca-Cola and McDonald’s, it used to be one of the most recognized brands in the world…

…Faced with a sharp decline in sales from its cash cow product, Fujifilm acted swiftly and changed its business through innovation and external growth. Under Shigetaka Komori (President in 2000), Fujifilm quickly carried out massive reforms. In 2004, Komori came up with a six-year plan called VISION75.

The management restructured its film business by downscaling the production lines and closing redundant facilities. In the meantime, the R&D departments moved to a newly built facility to unify the research efforts and promote better communication and innovation culture among engineers.

Realizing that the digital camera business would not replace the lucrative film business due to its low margins, Fujifilm performed a massive diversification based on capabilities and innovation.

Even before launching the VISION75 plan, Komori had taken stock of the company’s technologies and compared them with the demands of the international market, after which the R&D team came up with a chart listing all the existing in-house technologies that could match future markets.

For instance, Fujifilm was able to predict the boom of LCD screens and invested heavily in this market. Leveraging on photo film technology, they created FUJITAC, a variety of high-performance films essential for making LCD panels for TV, computers, and smartphones. Today, FUJITAC owns 70% of the market for protective LCD polarizer films.

Fujifilm also targeted unexpected markets like cosmetics. The rationale behind cosmetics comes from 70 years of experience in gelatin, the chief ingredient of photo film which is derived from collagen. Human skin is 70% collagen. Fujifilm also possessed deep knowledge in oxidation, a process connected both to the aging of human skin and to the fading of photos over time.

When promising technologies didn’t exist internally, Fujifilm proceeded by mergers and acquisitions. Based on technological synergies, it acquired Toyama Chemical in 2008 to enter the drug business. Delving further into the healthcare segment, Fujifilm also bought a radio-pharmaceutical company now called Fujifilm RI Pharma. It also reinforced its position in existing joint ventures such as Fuji Xerox, which became a consolidated subsidiary in 2001 after Fujifilm purchased an additional 25% share in the partnership.

Fast forward nine years after the peak of film sales: by 2010, Fujifilm was a new company. In 2000, 60% of sales and 70% of profits came from the film ecosystem; by 2010, the “Imaging segment” accounted for less than 16% of sales. Fujifilm managed to emerge victorious through a restructuring and diversification strategy…

…Unlike Fujifilm which recognized early on that photography was a doomed business and tackled new markets with a completely different portfolio, Kodak made multiple wrong moves and persisted in the decaying film industry.

It was not that Kodak didn’t want to change; it tried hard, but it went about it wrong. Kodak’s management didn’t fully recognize that the rise of digital imaging would have dire consequences for the future of photo printing. It tried to replicate the film print business model in the digital world. But Facebook launched in 2004, and people were simply not going to print pictures anymore.

Interestingly, Kodak understood the impact of digitalization and predicted that pictures would be shared online. It acquired a photo-sharing website called Ofoto in 2001. Unfortunately, the company used Ofoto to get people to print digital pictures. It failed to realize that online photo sharing was the new business, not just a way to expand printing sales…

…While Fujifilm invested heavily in the pharmaceutical and healthcare sector to reduce its exposure to the challenging photo industry, Kodak sold its highly profitable Healthcare Imaging branch in 2007 to put more resources into its losing consumer camera division.

3. One Bed, Two Dreams: Building Silicon Valley Bank in China with Ken Wilcox (Transcript here) – Bernard Leong and Ken Wilcox

Wilcox: In the US, banks sometimes fail. When I started my career 40 years ago in banking, we had 18,000 banks. Today we have about 5,000. What happened to all of them? Where did 13,000 banks go? Some of them got acquired, but many of them failed. When a bank makes too many bad loans, the Federal Reserve causes it to fail and it disappears. In China, banks don’t fail. First of all, banks are fundamentally owned by the government and when they make too many bad loans, they don’t typically fail. Usually the government, the regulators, come and somebody gets arrested and the government re-capitalizes the bank. It’s often very quiet – it’s not even necessarily announced to the world – and the bank keeps on going. What does that mean? That means that Chinese banks can take more risk than US banks can. In the US, we had almost no competitors because everybody thought “Lending to technology companies is way too risky, so we’ll just let Silicon Valley Bank do it. None of the rest of us will try.” In China, many, many, many banks want to copy us and do the same thing, because they’re not worried about what happens if we lose too much money. So that’s another big difference there…

…Wilcox: After I’d been there for several months, it occurred to me one day that my main conversation partner, the guy who is the Chairman, who was from Shanghai Pudong Development Bank, it occurred to me that he actually wears three hats. The only hat I wear is banker / businessman. But he had a banker / businessman hat, and he had a party hat, and he had a government hat. Then I started to wonder, when I’m talking with him, which hat is he wearing? It took me a long time before I figured out he doesn’t even think he has three hats. He thinks they’re all the same hat, so he’s not even thinking about it the same way I was. So I think that’s quite confusing. 

It’s also confusing when a US company comes to China and finds out that it’s going to get a Party Committee in its organization. They get very confused because they don’t know what a Party Committee is. If you ask people in government or in the party, “What’s a Party Committee? We’re going to have one, but I don’t understand what it is,” it’s hard for them to explain. You get multiple definitions, and then you don’t know what is actually going to happen. Some people will tell me, “When you get a Party Committee, it’ll be so good because all the employees in your organization who are members of the party will have a place to gather once a month and discuss things.” Then somebody else says, “When you get a Party Committee, it’ll be so much easier because the Party Committee will help you put on social events for all of the employees.” But then somebody else told me, “No, when you get a Party Committee, it’ll be like another board, but a secret board. You won’t know who’s on it and they will influence what the real board does – or what I would call the real board.” Then other people told me, “Don’t pay any attention. That’s all silliness. There is no such thing as a Party Committee.” So it’s very, very confusing…

…Wilcox: I’ll give you the best example and that is that I believe based on the years I spent in China, that ultimately the main reason they wanted us in China – and they actually were very determined to get us to come to China. I remember that early on, a couple of years before my wife and I moved to China, I had a series of meetings with a very high-level government official who’s also got a lot of status in the party. He was saying to me, “Ken, we really want you to bring your bank to China. Your bank is more important than any bank we’ve ever met. You’re more important than – he explicitly said this – he says, You’re more important than Morgan Stanley and more important than Goldman Sachs. And by the way Ken, you’re one of the smartest Americans we’ve met.” So you think to yourself, “Well this is an exaggeration, but it does feel nice.” He obviously is going to help me get established in China. But what I didn’t realize is that the main reason they wanted us in China was so that they could study our business model and figure out how to copy it over time. That was something I wasn’t expecting, but I should have if I were less naive. If I were better prepared, I would have realized that was the intention. So the original title, the working title I had for my book, which I had to change because the publisher didn’t like it, my original title was, “One Bed, Two Dreams”, because that’s a phrase that most Chinese are familiar with. It explains why it didn’t work well, because my dream was working with all these Chinese technology companies and helping them do business with the rest of the world, and their dream was learning our business model.

The result was that when they gave us our license, they also told us that we would not be able to use Chinese currency for three years. That made it almost impossible to do business for the first three years. The people that said these things were both members of the government and members of the party. So I don’t know which one was talking. But they said, “We understand that you won’t be able to do much business for the first three years because the companies that you want to work with all want renminbi, they don’t want US dollars. But you can still be a good citizen. You can do what we would do, and that is we here in China help each other. So you can be helpful and prove that you care about China by teaching other banks your business model during the three years when you can’t really do much business. We’ll give you subsidy to help support you during the three years when you can’t earn much money because you can’t really do any business.” Then at the end of the three years when they gave us permission to use renminbi, they said to us, “We are so happy that you came to China and we really admire your business model and we admire it so much that we’re starting a bank of our own using your business model. Would you mind staying a little longer and being an advisor to this new bank that’s going to use your business model?” It felt like they were stealing my intellectual property but I’m not sure they thought of it that way…

…Wilcox: When General Motors went over to China in 1985, the Chinese really didn’t have an auto industry. They wanted General Motors there not because they wanted General Motors to make a lot of money, but because they wanted to learn about automobile manufacturing, and because it took so long to build up the knowledge base, General Motors was welcome for about 30 years. But now General Motors is slowly losing market share and will probably withdraw from China. China has made so much progress, partially because the Chinese are hardworking and smart, and partially because they had General Motors there to learn from, and once General Motors retracts and goes back to the US, the auto industry in China will begin exporting and competing globally. I think the Chinese have done such a good job of, first of all, learning from the foreign automakers, and then taking it further, that the foreign automakers are in huge trouble. I think China’s automobile industry will dominate in the future.

4. Weekend thoughts: crypto, mania, and reflexivity follow up – Andrew Walker

When I first saw the “BTC yield” metric, I thought it was pretty crazy. MSTR is trading at approaching 3x the value of their bitcoin; if they issue stock and use all the proceeds to buy bitcoin, of course it’s going to cause their bitcoin holdings per share to go up…. and even more so if they issue debt and use that to buy bitcoin and then judge themselves on a per share basis! Taken to its extreme, if you thought BTC yield was truly the be all, end all of value creation, and the higher the BTC yield the better, then any company following a pure BTC yield strategy should lever themselves up to the maximum amount possible, no matter the terms, and use all of the proceeds to buy BTC. Obviously no one does that because it would be insanity and eventually banks would stop lending, but I illustrate that only to show that purely maximizing BTC yield is clearly not value maximizing….

But, if you look at the fine print, BTC yield is even crazier than simply suggesting increasing BTC per share is the only value creation metric that matters. If you really look at the MSTR BTC yield table above or read their disclosures, you’ll notice that the BTC yield assumes that all of their convertible debt converts…

…So, go back to MSTR’s BTC yield table; they have a set of 2029 converts that convert at $672.40/share. Those are far, far out of the money (MSTR’s stock trades for ~$400/share as I write this)…. yet MSTR’s BTC yield assumes those converts are in the money / will convert for their BTC yield.

That is an insane assumption that casually assumes MSTR’s shares almost double. And, again, by taking this assumption to its extreme, we can see how wild it is. Like all things, convert debt involves different trade-offs; for example, you could get a higher strike price by taking on a higher interest rate (i.e. if your strike price is ~$670 at a 0% interest rate, you could probably push it up to ~$770 by taking on a 3% interest rate, or ~$870 by taking on a 6% interest rate). MSTR has issued all of these convert debt deals at 0% interest rates, which is a great pitch (“we’re borrowing for free, we don’t have to pay a carry to buy BTC, etc”)…. but if BTC yield is all that matters, MSTR could start issuing convertible debt with really high interest rates, which would jack the strike price of the converts up, thus decreasing dilution and increasing the BTC yield…

…MSTR fans would say “but raising converts with interest doesn’t make sense; it’s no longer free money / now it has a carry cost.” And I understand that argument…. but convertible debt isn’t free money either, and I just do this to highlight how insane BTC yield is as a be all / end all metric!…
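To make the complaint concrete, here is a minimal sketch with made-up numbers (the bitcoin holdings, share count, and convert size below are hypothetical, not MSTR’s actual figures) showing how the “assume the converts convert” treatment flatters the per-share math relative to treating the converts as debt that must be repaid:

```python
# Hypothetical figures chosen for illustration only -- not MSTR's actuals.
btc_held = 100_000                 # bitcoin on the balance sheet
btc_price = 100_000                # assumed bitcoin price, in dollars
shares_basic = 200_000_000         # shares outstanding today
convert_principal = 3_000_000_000  # face value of 0% convertible debt, in dollars
convert_strike = 672.40            # conversion price per share (far above a ~$400 stock)

# Company-style view: assume the converts convert, so the debt disappears and the
# conversion shares simply join the denominator.
diluted_shares = shares_basic + convert_principal / convert_strike
btc_per_share_if_converted = btc_held / diluted_shares

# Conservative view: the converts stay debt and must be repaid in cash, so net the
# principal against the bitcoin before dividing by today's share count.
net_btc = btc_held - convert_principal / btc_price
btc_per_share_if_debt = net_btc / shares_basic

print(f"assuming conversion : {btc_per_share_if_converted:.6f} BTC per share")
print(f"treating it as debt : {btc_per_share_if_debt:.6f} BTC per share")
```

With these invented numbers, the “converts will convert” view shows roughly 40% more bitcoin backing per share than the “converts are debt” view, which is the gap the author is pointing at.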

…The BTC yield that all of these companies present assumes that their convert debt converts, and that is a big / crazy assumption…. but it’s interesting to think about what will happen in five years. There is, of course, a world where BTC goes to $250k (or higher) and all of these stocks moon. In that world, the converts will be well in the money, and all of this worry will sound silly…. but there is also a world where BTC stalls out or drops over the next few years, and that world is really interesting. All of these companies are raising converts with 5-7 year maturities, so if BTC doesn’t moon and the converts aren’t in the money, you’re going to have all of the BTC standard companies facing a maturity wall at the same time. What happens then? I doubt they can roll the converts at anything close to the same terms (remember, cheap converts require high volatility, and if the stocks have stalled out for five years vol is going to be a lot lower), so they’ll either need to sell a ton of equity to pay down the debt (which will be tough; there probably won’t be much enthusiasm for the stock, and I’m not sure the market would be able to absorb the hypothetical amount of stock they’d need to issue without some enthusiasm)…. or you’ll have a wave of BTC standard companies all looking to sell down some of their bitcoin to pay off converts at the exact same time.

5. Satya Nadella | BG2 (Transcript here) – Bill Gurley, Brad Gerstner, and Satya Nadella

Gerstner: Shifting maybe to enterprise AI, Satya. The Microsoft AI business has already been reported to be about $10 billion. You’ve said that it’s all inference and that you’re not actually renting raw GPUs to others to train on, because your inference demand is so high. As we think about this, there’s a lot of skepticism out there in the world as to whether or not major workloads are moving. If you think about the key revenue products that people are using today, how they’re driving that inference revenue for you, and how that may be similar or different from Amazon or Google, I’d be interested in that.

Nadella: The way for us this thing has played out is, you got to remember most of our training stuff with OpenAI is sort of more investment logic. It’s not in our quarterly results – it’s more in the other income, based on our investment.

Gerstner: Other income or loss right?

Nadella: That is right. That’s how it shows up. So most of the revenue, or all the revenue, is pretty much our API business, or in fact, to your point, ChatGPT’s inference costs are there, so that’s a different piece. The fact is the big-hit apps of this era are ChatGPT, Co-Pilot, GitHub Co-Pilot, and the APIs of OpenAI and Azure OpenAI. In some sense, if you had to list out the 10 most-hit apps, these would probably be four or five of them, and so therefore that’s the biggest driver.

The advantage we have had, and OpenAI has had, is that we’ve had two years of runway pretty much uncontested. To your point, Bill made the point that everybody’s awake now, and that might be. I don’t think there will ever again be a two-year lead like this, but who knows? You say that, and then somebody else drops some sample that suddenly blows the world away. But that said, I think it’s unlikely that that type of lead could be established with some foundation model. But we had that advantage; that was the great advantage we’ve had with OpenAI. OpenAI was able to really build out this escape velocity with ChatGPT.

But on the API side, the biggest thing that we were able to gain was… Take Shopify or Stripe or Spotify. These were not customers of Azure; they were all customers of GCP or AWS. So suddenly we got access to many, many more logos, who are all “digital natives” using Azure in some shape or fashion. So that’s sort of one. When it comes to the traditional enterprise, I think it’s scaling. Literally, people are playing with Co-Pilot on one end and building agents on the other end using Foundry. But these things are design wins and project wins and they’re slow, but they’re starting to scale. Again, the fact that we’ve had two years of runway on it, I think…

I like that business a lot more, and that’s one of the reasons why: the adverse selection problem here would have been lots of tech startups all looking for their H100 allocations in small batches. Having watched what happened to Sun Microsystems in the dotcom era, I always worry about that. You just can’t chase everybody building models. In fact, even on the investor side, I think the sentiment is changing, which is that people now want to be more capital-light and build on top of other people’s models and so on and so forth. If that’s the case, everybody who was looking for H100s will not be looking for them anymore. So that’s what we’ve been selective on.

Gerstner: You’re saying for the others that training of those models and those model clusters was a much bigger part of their AI revenue versus yours? 

Nadella: I don’t know. This is where I’m speaking for other people’s results. It’s just I go back and say, “What are the other big-hit apps?” I don’t know what they are. What models do they run? Where do they run them? When I look at the DAU numbers of any of these AI products, there is ChatGPT, and then there is – even Gemini, I’m very surprised at the Gemini numbers, obviously I think it’ll grow because of all the inherent distribution. But it’s kind of interesting to say that they’re not that many. In fact, we talk a lot more about AI scale, but there is not that many hit apps. There is ChatGPT, Github Co-Pilot, there’s Co-Pilot, and there’s Gemini. I think those are the four I would say, in a DAU, is there anything else that comes to your mind?…

…Gurley: Satya, on the enterprise side, obviously the coding space is off to the races and you guys are doing well and there’s a lot of venture-backed players there. On some of the productivity apps, I have a question about the Co-Pilot approach, and I guess Marc Benioff’s been obnoxiously critical on this front, calling it Clippy 2 or whatever. Do you worry that someone might think through AI from first principles, from the ground up, and that some of the infrastructure, say in an Excel spreadsheet, isn’t necessary if you did an AI-first product? The same thing, by the way, could be said about the CRM, right? There’s a bunch of fields and tasks that may be able to be obfuscated for the user.

Nadella: It’s a very, very, very important question. The SaaS applications or biz apps, let me just speak of our own Dynamics thing. The approach at least we’re taking is, I think the notion that business applications exist, that’s probably where they’ll all collapse in the agent era. Because if you think about it, they are essentially CRUD databases with a bunch of business logic. The business logic is all going to these agents, and these agents are going to be multi-repo CRUD. They’re not going to discriminate between what the back-end is, they’re going to update multiple databases, and all the logic will be in the AI tier so to speak. Once the AI tier becomes the place where all the logic is, then people will start replacing the backends, right? In fact it’s interesting, as we speak, I think we are seeing pretty high rates of wins on Dynamics backends and the agent use, and we are going to go pretty aggressively and try and collapse it all, whether it’s in customer service, whether it is in… 

By the way, the other fascinating thing that’s increasing is not just CRM, but even what we call finance and operations, because people want a more AI-native biz app. That means the biz app, the logic tier, can be orchestrated by AI and AI agents. So in other words, Co-Pilot to agent to my business application should be very seamless.

Now in the same way, you could even say, “Why do I need Excel?” Interestingly enough, one of the most exciting things for me is Excel with Python; it’s like GitHub with Co-Pilot. So what we’ve done is, when you have Excel – by the way, this would be fun for you guys – you should just bring up Excel, bring up Co-Pilot, and start playing with it. Because it’s no longer like – it is like having a data analyst, so it’s no longer just making sense of the numbers that you have. It will do the plan for you. It will literally – like how GitHub Co-Pilot Workspace creates the plan and then executes the plan – this is like a data analyst who is using Excel as a sort of row/column visualization, an analysis scratch pad. So the Co-Pilot is using Excel as a tool, with all of its action space, because it can generate and it has a Python interpreter. That is in fact a great way to reconceptualize Excel. At some point you could say, “I’ll generate all of Excel,” and that is also true. After all, there’s a code interpreter, so therefore you can generate anything.

So yes, I think there will be disruption. The way we are approaching it, at least for our M365 stuff, is: one, build Co-Pilot as that organizing layer, the UI for AI; get all agents in, including our own agents. You can say Excel is an agent to my Co-Pilot, Word is an agent – they’re kind of specialized canvases. If I’m doing a legal document, let me take it into Pages and then to Word and have the Co-Pilot go with it, go into Excel and have the Co-Pilot go with it. That’s sort of a new way to think about the work in workflow…

…Gurley: Satya, there’s been a lot of talk about model scaling. Historically, the talk was about 10x-ing the cluster size, over and over again, not just once or twice. X.AI is still making noise about going in that direction. There was a podcast recently where they flipped everything on its head and said, “If we’re not doing that anymore, it’s way better, because we can just move on to inference, which is getting cheaper, and you won’t have to spend all this capex.” I’m curious, those are two views of the same coin. But what’s your view on LLM model scaling and training cost, and where we’re headed in the future?

Nadella: I’m a big believer in scaling laws, I’ll say that first. In fact, if anything, the bet we placed in 2019 was on scaling laws, and I stand by that. In other words, don’t bet against scaling laws. But at the same time, let’s also be grounded on a couple of different things.

One is that these exponentials on scaling laws will become harder, just because as the clusters become bigger, the distributed computing problem of doing large-scale training becomes harder. That’s one side of it. But I would still say – and I’ll let the OpenAI folks speak for what they’re doing – pre-training I think is not over, it continues. But the exciting thing, which again OpenAI and Sam have talked about, is what they’ve done with o1. This Chain of Thought with autograding is just fantastic. In fact, it is basically test-time compute or inference-time compute as another scaling law. You have pre-training, and then you have effectively this test-time sampling that creates tokens that can go back into pre-training, creating even more powerful models that then run on your inference. So that, I think, is a fantastic way to increase model capability.

The good news of test-time or inference-time compute is… there are two separate things. Sampling is like training, when you’re using it to generate tokens for your pre-training. But also, when customers are using o1, they’re using more of your meters, so you are getting paid for it. Therefore, there is more of an economic model, so I like it. In fact, that’s why I said I have a good structural position, with 60-plus data centers all over the world.

Gurley: It’s a different hardware architecture for one of those scaling versus the other, for the pre-training versus…

Nadella: Exactly. I think the best way to think about it is, it’s a ratio. Going back to Brad’s thing about ROIC, this is where I think you have to really establish a stable state. In fact, whenever I’ve talked to Jensen, I think he’s got it right, which is you want to buy some every year. Think about it, when you depreciate something over 6 years, the best way is what we have always done, which is you buy a little every year and you age it, you age it, you age it. You use the leading node for training and then the next year it goes into inference, and that’s sort of the stable state I think we will get into across the fleet for both utilization and the ROIC and then the demand meets supply.

Basically, to your point about everybody asking, “Have the exponentials stopped?”, one of the other things is that economic reality will also act as a stop, right? At some point everybody will look and say, “What’s the economically rational thing to do?” Which is, “What if I double capability every year but I’m not able to sell that inventory?” And the other problem is the winner’s curse, which is: you don’t even have to publish a paper, the other folks just have to look at your capability and do a distillation… It’s like piracy. You can sign all kinds of terms of use, but it’s impossible to control distillation. That’s one. The second thing is, you don’t even have to do anything, you just have to reverse engineer that capability, and you do it in a more compute-efficient way. So given all this, I think there will be a governor on how much people will chase. Right now, a little bit, everybody wants to be first. It’s great, but at some point the economic reality will set in on everyone, and the network effects are at the app layer, so why would I want to spend a lot on some model capability when the network effects are all on the app?…

…Gurley: Does your answer to Brad’s question about the balancing of GPU ROI, does that answer the question as to why you’ve outsourced some of the infrastructure to Coreweave in that partnership that you have?

Nadella: That we did because we all got caught with the hit called ChatGPT. It was impossible. There’s no supply chain planning I could have done. None of us knew what was going to happen. What happened in November of ‘22, that was just a bolt from the blue, therefore we had to catch up. So we said, “We’re not going to worry about too much inefficiency.” That’s why whether it’s Coreweave or many others – we bought all over the place. That is a one time thing and then now it’s all catching up. That was just more about trying to get caught up with demand.

Gerstner: Are you still supply-constrained Satya?

Nadella: Power, yes. I am not chip supply-constrained. We were definitely constrained in ‘24. What we have told the street is that’s why we are optimistic about the first half of ‘25, which is the rest of our fiscal year and then after that I think we’ll be in better shape going into ‘26 and so on. We have good line of sight.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Alphabet (parent of Google), Amazon, Meta Platforms (parent of Facebook), and Microsoft. Holdings are subject to change at any time.

What We’re Reading (Week Ending 22 December 2024)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership-audience for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general. 

Here are the articles for the week ending 22 December 2024:

1. Meet Willow, our state-of-the-art quantum chip – Hartmut Neven

Errors are one of the greatest challenges in quantum computing, since qubits, the units of computation in quantum computers, have a tendency to rapidly exchange information with their environment, making it difficult to protect the information needed to complete a computation. Typically the more qubits you use, the more errors will occur, and the system becomes classical.

Today in Nature, we published results showing that the more qubits we use in Willow, the more we reduce errors, and the more quantum the system becomes…

…This historic accomplishment is known in the field as “below threshold” — being able to drive errors down while scaling up the number of qubits…

…There are other scientific “firsts” involved in this result as well. For example, it’s also one of the first compelling examples of real-time error correction on a superconducting quantum system — crucial for any useful computation, because if you can’t correct errors fast enough, they ruin your computation before it’s done. And it’s a “beyond breakeven” demonstration, where our arrays of qubits have longer lifetimes than the individual physical qubits do, an unfakable sign that error correction is improving the system overall.

As the first system below threshold, this is the most convincing prototype for a scalable logical qubit built to date. It’s a strong sign that useful, very large quantum computers can indeed be built…

…As a measure of Willow’s performance, we used the random circuit sampling (RCS) benchmark. Pioneered by our team and now widely used as a standard in the field, RCS is the classically hardest benchmark that can be done on a quantum computer today…

…Willow’s performance on this benchmark is astonishing: It performed a computation in under five minutes that would take one of today’s fastest supercomputers 10^25, or 10 septillion, years. If you want to write it out, it’s 10,000,000,000,000,000,000,000,000 years. This mind-boggling number exceeds known timescales in physics and vastly exceeds the age of the universe. It lends credence to the notion that quantum computation occurs in many parallel universes, in line with the idea that we live in a multiverse, a prediction first made by David Deutsch…

…Willow was fabricated in our new, state-of-the-art fabrication facility in Santa Barbara — one of only a few facilities in the world built from the ground up for this purpose. System engineering is key when designing and fabricating quantum chips: All components of a chip, such as single and two-qubit gates, qubit reset, and readout, have to be simultaneously well engineered and integrated. If any component lags or if two components don’t function well together, it drags down system performance…

…The next challenge for the field is to demonstrate a first “useful, beyond-classical” computation on today’s quantum chips that is relevant to a real-world application. We’re optimistic that the Willow generation of chips can help us achieve this goal. So far, there have been two separate types of experiments. On the one hand, we’ve run the RCS benchmark, which measures performance against classical computers but has no known real-world applications. On the other hand, we’ve done scientifically interesting simulations of quantum systems, which have led to new scientific discoveries but are still within the reach of classical computers. Our goal is to do both at the same time — to step into the realm of algorithms that are beyond the reach of classical computers and that are useful for real-world, commercially relevant problems.

2. X (previously Twitter) thread on quantum computing and Google’s Willow – Jeffrey Scholz

Like a regular computer, a quantum computer keeps bits in groups. So a 64-bit quantum computer would have a vector of 64 2D vectors serving as its “word.”

Here is where the speedup happens: in a regular computer, each of the 64 bits doesn’t know anything about the value of any of the other bits.

If we want one bit to affect another bit, we have to explicitly combine them with a logic gate.

However, in a quantum computer, each of the 64 qubits can “talk to each other” via “quantum entanglement.”

Running a quantum circuit means you plug in a quantum vector, run it through a bunch of matrix multiplications, then collapse the output.

The final vector will be the correct answer. Technically, quantum computers can give wrong answers, but if you run the computation multiple times, then you will get the correct answer on average…
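A toy version of that “vector in, matrix multiplications, collapse, repeat” picture, for a single qubit (this example is mine, using numpy, not something from the thread):

```python
import numpy as np

# One qubit starts in |0> = [1, 0]. Running a circuit is just matrix multiplication.
state = np.array([1.0, 0.0])

# Hadamard gate: puts the qubit into an equal superposition of |0> and |1>.
H = np.array([[1.0,  1.0],
              [1.0, -1.0]]) / np.sqrt(2)
state = H @ state

# "Collapsing" the output: a measurement returns 0 or 1 with probabilities given by
# the squared amplitudes, so repeating the run many times recovers the answer on average.
probabilities = np.abs(state) ** 2
samples = np.random.default_rng(0).choice([0, 1], size=1_000, p=probabilities)
print(probabilities)         # [0.5 0.5]
print(np.bincount(samples))  # roughly 500 zeros and 500 ones
```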

…The current problem with quantum computers is that as the circuit gets bigger, they become less correct on average. All of the “talking to each other” creates so much noise the system stops working.

Once your probability of being correct drops below a certain threshold your quantum computer becomes useless. This is a major blocker for current quantum compute.

Let’s look at a specific (oversimplified but helpful) example. Suppose you shine a laser beam into an ice cube.

Actually simulating what the laser will do when it exits the ice cube is very hard because quantum phenomena are involved.

To actually compute what the laser will do means you have to explicitly compute quantum entanglement, which is slow for classical computers but “built in” to a quantum computer.

However, you can *estimate* the distribution of how the laser will scatter without a quantum computer, so you can have at least a rough idea if your answer might be correct…

…By analogy, this is what Google was doing. The computation Google was doing was a “pseudo-random quantum circuit” (think pseudorandom ice cube) but we know a quantum circuit is just matrix multiplications (on crack). Therefore, it is a bunch of random matrix multiplications with an output that looks right.

Google’s actual breakthrough was that the output of the circuit “looks correct” — which sounds underwhelming — and compared to the headlines, it definitely is. The academic breakthrough is that Google was able to use a larger circuit and notice an apparent *increase* in accuracy when modeling how a laser shines through an ice cube. That is noteworthy.

You can definitely tell if a computation has failed, and it seemed to be failing less as the circuit got bigger…

…However, note that the problem is “rigged” in favor of quantum computers. The benchmark is explicitly modeling a quantum phenomenon, so *of course* we get a speedup.

In other words, Google created a random distribution on the output that “seems correct.” Why does it “seem correct?” well because by design, the computation cannot be run on a classical computer. But if we can’t run it on a classical computer, how do we know the quantum computer is actually giving the right answer? The answer is we don’t, and this is a serious gap…

…Quantum computing is kind of at the stage right now where some smart teenager wired a few logic gates together in a random fashion and said “hey look, my circuit made a random output and didn’t explode!” Compared to previous attempts, it is an improvement. But he is still a long way from training an LLM.

3. Volatility: A Double-Edged Sword for Long-Term Equity Investors – Daniel Crowley

The ability to measure risk in a portfolio has long been a puzzle for the financial world. When Harry Markowitz introduced Modern Portfolio Theory in 1952, he revolutionized how institutions approached risk and return. His use of standard deviation as a proxy for volatility offered a clean, mathematical way to quantify the unpredictability of markets. It gave investors a seemingly precise tool to compare assets and assess portfolio risk. Over time, this approach became gospel, with concepts like beta and the Sharpe ratio reinforcing volatility as the core measure of risk.

But here’s the problem: volatility tells only part of the story. Financial markets don’t follow the neat patterns of a normal distribution, which is what these models assume. Extreme events occur far more often than traditional models predict. We’ve seen this play out time and again—from the collapse of Long-Term Capital Management to the Great Financial Crisis. The models couldn’t account for the market’s tendency to behave irrationally and with far greater extremes than the math suggested. That’s why I’ve come to view volatility not as risk itself but as a signal, an invitation to investigate further…

…Volatility is often misunderstood because it treats upward and downward price movements as equal. A stock with erratic upward swings may have high volatility but poses little risk if the business fundamentals are sound. Conversely, a stock that steadily declines might appear “safe” on paper but can quietly destroy wealth.
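A toy illustration of that asymmetry (the monthly return series below are invented, not from the article): standard deviation flags the erratic-but-rising series as far “riskier” than the steadily declining one, even though only the latter destroys wealth.

```python
import numpy as np

# Two made-up monthly return series.
erratic_riser = np.array([0.12, -0.06, 0.18, -0.04, 0.15, -0.05, 0.20, -0.03])
steady_decliner = np.array([-0.02] * 8)

for name, r in (("erratic riser", erratic_riser), ("steady decliner", steady_decliner)):
    cumulative = np.prod(1 + r) - 1   # compounded return over the period
    volatility = r.std()              # the usual "risk" proxy
    print(f"{name}: volatility {volatility:.3f}, cumulative return {cumulative:+.1%}")
```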

The market’s reliance on volatility as a measure of risk often misses these nuances.

This misunderstanding creates a divide among investors. On one side are those who cling to volatility as the ultimate arbiter of risk, building models that rely on neat equations and assumptions about market behavior. On the other are those who dismiss it entirely, treating volatility as irrelevant noise.

My view lies somewhere in the middle. Volatility is neither good nor bad—it’s just a clue. It’s a signal to dig deeper and assess whether the market’s movements are justified by changes in a business’s intrinsic value.

What I’ve come to appreciate about volatility is its ability to surface opportunity. Markets are emotional, driven by fear, greed, and short-term thinking. Prices frequently diverge from reality, creating moments where high-quality businesses are available at steep discounts. When markets panic, as they did during the COVID-19 pandemic or the Great Financial Crisis, those who can stay calm and look beyond the noise can identify extraordinary opportunities.

Volatility, far from being a risk, is often the price of admission for outsized returns.

4. The AI nuclear renaissance – SMRs role – Rihard Jarc

The global nuclear power market is about 10% of global electricity (about $350-$400B annually) and around 32% of zero-carbon electricity generation.

As of 2023, nuclear energy accounted for about 18.6% of total electricity generation in the United States. The International Energy Agency (IEA) highlights that global nuclear power output must more than double by 2050 to meet net-zero emission targets. Most of the U.S.’s nuclear power plants are over 50 years old and nearing the end of their operational lives. While their lifespans have been extended to support the grid, they will need to be replaced in the coming decades…

…The introduction of ChatGPT and the AI boom that we have experienced in the last 2 years have only accelerated this, as AI workloads and AI chips consume much more energy than traditional data center workloads. This Nuclear Energy expert gives a good example:

» If you provide a simple search in Google, you consume 0.3 Wh of electricity. If you do the same with ChatGPT or Alexa or Gemini, any AI that we can imagine, this 0.3 Wh transforms into 2.9 Wh, so it means 10X the consumption.«…

…Driven by artificial intelligence (AI), cloud computing, and digital transformation, U.S. data centers consumed an estimated 150 TWh of electricity in 2023, equivalent to around 3% of the nation’s power demand. According to Goldman Sachs estimates, global data center demand hovered at 340 TWh in 2023, which is about 1.3% of worldwide electricity use. U.S. data center power use is expected to roughly triple between 2023 and 2030 and will require about 47 gigawatts of new generation capacity…
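As a small back-of-the-envelope check (my own arithmetic using the figures quoted above, not numbers from the article), annual consumption in TWh can be converted to an average draw in gigawatts, the unit the new-capacity estimate is quoted in:

```python
# Convert annual energy use (TWh per year) into average power draw (GW).
HOURS_PER_YEAR = 8_760

def average_gw(twh_per_year: float) -> float:
    return twh_per_year * 1_000 / HOURS_PER_YEAR  # 1 TWh = 1,000 GWh

us_2023 = average_gw(150)        # ~17 GW average draw in 2023
us_2030 = average_gw(150 * 3)    # ~51 GW if consumption roughly triples
print(round(us_2023, 1), round(us_2030, 1), round(us_2030 - us_2023, 1))
```

The increase in average draw works out to roughly 34 GW, which is smaller than the ~47 GW of new capacity cited; the gap is plausibly because new generation capacity has to cover capacity factors and peak demand rather than just the average.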

…Nuclear energy has become very attractive because companies want to be carbon-neutral and have stable power. An additional benefit of nuclear power is that it can provide more stable long-term contracts that are less sensitive to inflation and supply chain problems…

…Interest in nuclear energy, particularly Small Modular Reactors (SMRs), is growing, as they have been heralded as a solution to streamline nuclear power production, offering flexibility, lower upfront costs, and modular deployment. The simplest way to imagine an SMR is as a smaller version of the traditional nuclear reactor. One of their most significant benefits is that they are modular: they are designed to be built in factories, not on-site. Because they are built in factories, they are easier to assemble and control, from quality checks to a more predictable supply chain and workforce quality. When assembled, they are shipped to the site of the nuclear plant, where they are stacked together to form the whole plant. In terms of energy output, traditional nuclear plants produce between 1,000-1,600 megawatts electric (MWe) per reactor, while SMRs are around 50-300 MWe per module. Some SMRs are also said to be safer due to passive safety features, which rely on natural processes like convection to prevent meltdowns in emergencies. But they also come with cons. The primary one is that they are much smaller than traditional nuclear plants, so they do not have the cost benefits of economies of scale. Because of that, producing the same amount of energy is more expensive than at a traditional nuclear plant…

…Over 25 countries, according to the International Atomic Energy Agency (IAEA), are investing in SMRs. In March, Wood Mackenzie estimated the pipeline of SMR projects was worth more than $176 billion and that SMRs could account for as much as 30% of the global nuclear fleet by 2050…

…We can look at the example of NuScale, which has its Pressurised Water Reactor design. Its levelized cost of electricity ranges from $89-135/MWh, while traditional nuclear plants are in the $110-160/MWh range. However, looking at the most common alternative for data centers, which is a combination of solar and gas: gas costs $45-70/MWh, and solar plus storage costs $30-60/MWh…

…State-backed projects in countries like China and Russia have made more progress, leveraging integrated supply chains, controlled costs, and assured revenue streams. But even for them, the costs to build these reactors have been much higher than the first estimates…

…We must also face reality, which says that only 2 SMRs are operational right now, one of which is in Russia and the other one in China.

Another important topic when assessing nuclear energy is the problem of nuclear waste and its storage. Most SMR designs produce a similar amount of nuclear waste per unit of output as traditional nuclear plants, so the problem of storing nuclear waste remains.

5. How to invest without relying on target prices – Chin Hui Leong

The US stock market is soaring to new heights. But what does that mean for your stock returns in 2025? I would like to give you a definite answer but if I did so, I would be lying to you. In fact, you should view anyone who gives you target prices with suspicion.

Here’s the hard truth: No one can control where the market is headed in the short term. Yet, the allure of target prices persists…

…The answer lies in the inherent difficulty in predicting the future of rapidly evolving technologies.

The best example is Amazon.com. In mid-2010, when I first invested in the company, it had just reported US$24.5 billion in annual revenue, primarily from its online retail business. Here is the twist: it was impossible to know what the business would look like a decade later…

…Fast forward to 2023, and AWS had become a financial cash cow with nearly US$90 billion in annual revenue and an impressive US$24.6 billion in operating income. In other words, AWS, an insignificant division back in 2009, had generated more operating income in 2023 than the entire company’s revenue in 2009…

…I like to go back to the reason why valuation is used in the first place: to reduce your investment risk. The way I see it, valuation is one of the many ways you can employ to manage risk. But valuation is not the only risk in investing.

A weak, shrinking business can pose risks that no amount of stock valuation can solve. Hence, starting with high-quality businesses is my preferred approach.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Alphabet (parent of Google) and Amazon. Holdings are subject to change at any time.

Company Notes Series (#3): Golden Throat Holdings Group Company

Editor’s note: This is the latest edition in the “Company Notes Series”, where we periodically share our notes on companies we’ve studied in the recent past but currently have no vested interest in (we may invest in or sell shares in the companies mentioned at any time). The notes are raw and not updated, and the “as of” date for the data is given at the start of the notes. The first two editions in the series can be found here and here. Please give us your thoughts on the series through the “Contact Us” page; your feedback will determine if we continue with it. Thanks in advance!

Start of notes for Golden Throat Holdings

Data as of 16 January 2023

History of Golden Throat Holdings and current management/major shareholders

  • Current HQ: Guangxi Zhuang, China
  • IPO date: July 2015, on Hong Kong Stock Exchange
  • Golden Throat Holdings’ history dates back to 1956 when Liuzhou No.2 Sweet Factory (柳州市糖果二廠), the predecessor of Golden Throat Company (also known as Guangxi Golden Throat), was established. Golden Throat Company today manufactures and sells lozenges and other pharmaceutical and food products.
  • Golden Throat Holdings’ flagship product is Golden Throat Lozenges (OTC), which was launched in 1994. Wang Yao Fa contributed to the creation of the formula for the Golden Throat Lozenges (OTC) product and his portrait was historically used by Golden Throat Holdings on the product packaging; the portrait was changed to Jiang Peizhen in 2015.
  • Golden Throat Company (the main operating entity in China of Golden Throat Holdings) was established in Liuzhou, Guangxi Zhuang, China, on 18 September 1998 by Jiang Peizhen as the original controlling shareholder. She has been involved with Golden Throat Holdings for over 60 years, since 1956.
  • Jiang and her son, Zeng Yong, control 69.79% of Golden Throat’s shares (the 69.79% equates to 516.0137 million shares) as of 30 June 2022. At the 11 January 2023 share price of HK$1.98, their stake equates to HK$1.02 billion.
  • Jiang, 76, is currently chairman and non-executive director of Golden Throat Holdings, while Zeng, 48, is an executive director and vice chairman of the board. Zeng has been involved with Golden Throat Holdings since 1995. Both Jiang and Zeng have been in their respective roles since February 2015.

Golden Throat Holdings’ business

  • Revenue in 2021 was RMB 820.5 million, of which 99.6% was from Mainland China.
  • The company reports its revenue by three product categories, which include Golden Throat Lozenges (OTC), Golden Throat Lozenge Series Products, and other products.
  • Golden Throat Lozenge (OTC): A type of lozenge mainly designed to relieve symptoms of sore and dry throat and hoarse voice caused by acute pharyngitis. Golden Throat Lozenges (OTC) was approved as an over-the-counter medicine by the National Medical Products Administration (NMPA), China’s equivalent of the FDA in the USA. As such, Golden Throat Lozenges (OTC) can be purchased by the public in pharmacies without requiring the prescription of a qualified medical professional. As of 31 December 2021, Golden Throat Lozenges (OTC) were exported to the United States, Canada, Russia, the European Union, Australia, Southeast Asia, the Middle East, Mexico, Africa, and Mongolia, a newly explored export country in 2019. For the year ended 31 December 2021, Golden Throat Lozenges (OTC) accounted for 90.1% of Golden Throat Holdings’ total revenue.
  • Golden Throat Lozenge Series Products: Includes seven products comprising Dule Lozenges (都樂含片), sugar-free Dule Lozenges, and five other sugar-free flavours of this series, namely orange (香橙), fructus momordicae (羅漢果), chrysanthemum (桑菊), American ginseng (西洋參) and hawthorn (山楂). A major difference between Golden Throat Lozenges (OTC) and Golden Throat Lozenge Series Products is that the former is approved as an over-the-counter medicine, whereas the latter is approved as a food product. The sugar-free series of Golden Throat Lozenge Series Products was launched in 2013, which supplements the company’s original sales channel and provides consumers with more diversified choices. As of 31 December 2021, Golden Throat Lozenge Series Products were exported to 17 countries and regions, and accounted for 8.7% of Golden Throat Holdings’ total revenue in 2021.
  • Other products: Accounted for approximately 1.2% of Golden Throat Holdings’ total revenue in 2021. Includes: (1) Yinxingye Tablet (銀杏葉片), which is designed to facilitate blood circulation, remove blood stasis and dredge energy channels, and was approved as a prescription medicine by the NMPA; (2) a new product, Golden Throat Intestinal Series (金嗓子腸寶), which provides nutrition for probiotics (i.e. a prebiotic); and (3) Golden Throat Compound Probiotic Lozenges, which was launched in June 2022 and was developed by Golden Throat Holdings and the “Food Microbial Function Development” scientific research team of Beijing Agricultural College. Golden Throat Compound Probiotic Lozenges addresses the lack of self-developed probiotics in China. Golden Throat Holdings has developed six kinds of proprietary probiotic bacteria in three new flavors, and the company is committed to using “Chinese bacteria” to improve the physique of Chinese citizens. Golden Throat Compound Probiotics adopts internationally leading three-layer embedding technology, 360-degree thermal-radiation freeze-drying technology, and an automated ingredient fermentation and cultivation system.
  • Golden Throat Holdings has established an extensive and structured sales and distribution network throughout China for its (i) over-the-counter medicines, (ii) food products, and (iii) prescription medicines. As of 31 December 2021 and 30 June 2022, substantially all of the company’s revenue was generated from sales to distributors. In 2021, there was only one customer that accounted for more than 10% of Golden Throat Holdings’ revenue (11.7%); there was no such customer in 2020.
  • Golden Throat Holdings has a well-established brand in China: 
    • In October 2021, in the 2021 ranking of China nonprescription medicines enterprises and product brands, Golden Throat Lozenges (OTC) was recognised as No. 1 amongst Chinese traditional medicines (Throat) by the China Nonprescription Medicines Association.
    • Golden Throat Holdings was ranked 43rd amongst the nonprescription manufacturing enterprises in the 2021 ranking of China non-prescription medicines enterprises and product brands.
    • Golden Throat Holdings was listed in the Top 500 Chinese Brands at the 14th China Brand Festival in August 2020.
    • In August 2020, Golden Throat Holdings claimed the title of “2019 China Traditional Medicines Pharmaceutical Industry Top 100 Enterprise” at the China Pharmaceutical Industry Top 100 Annual Assembly.
    • In 2019, Golden Throat was awarded the Best Brand Value Award at the China Financial Market Awards 2019, and won the Huapu Award at the 13th China Brand Festival in August.
    •  In 2017, the Golden Throat (金嗓子) brand was selected as a world famous brand by the China America Branding Strategy Forum and also ranked amongst the listed companies on the Forbes China Up-and-Comers List.

Golden Throat Holdings’ market and future expansion

  • According to a 2015 Euromonitor Report, retail sales value of lozenges in China increased 10.4% per year from RMB 2.09 billion in 2009 to RMB 3.42 billion in 2014, and was expected to increase to RMB 5.46 billion in 2019, at a CAGR of 9.7% (a quick check of these growth rates follows this list). Lozenges accounted for 72% of the total throat remedies market in China in 2014; the throat remedies market primarily includes over-the-counter medicines and medicated confectionery (which are food).
  • In 2021, the plants and office buildings of a new medicine production and research and development base for Golden Throat Holdings, located at Luowei Industrial Concentration Area, Liuzhou, Guangxi Zhuang Autonomous Region, were completed, as were the commissioning of production lines and trial production. Golden Throat Holdings completed the overall relocation in the second half of 2021. The new production base covers a usable area of about 60,000 square metres, including research and development centres, production plants, warehouses and administrative office buildings. “The fully automated production line in the production plant will improve the efficiency of the production process. A brand-new modern production enterprise will be formed with the new production and research and development base, new factories, new workflow and new production lines, which will completely upgrade the management platform and manufacturing platform of the factories, comprehensively improving the manufacturing quality and technology content of the products, enhancing the comprehensive competitiveness of the Company, and will lay a solid foundation for expanding and strengthening the Company.” The new production base increased Golden Throat’s production capacity for its main products by 57% to 198.5 million boxes of Golden Throat Lozenges. See video of the new production base: https://news.gxtv.cn/article/detail_567c4b49e6924346917643b221fe9555.html
  • Also in 2021, Golden Throat Holdings selected a 48 mu (~32,000 square metres) piece of land in the south of the new drug production and R&D base as the site for the second phase of the new Golden Throat Base, which is expected to have a usable area of approximately 50,000 square metres after completion. The second phase will house a food production plant and a food research and development centre. After completion, a high-tech R&D team, smart manufacturing and smart sales will be introduced to develop more comprehensive health products. The second phase of the Golden Throat new base will form the core of Golden Throat Doctor Workstation, the Golden Throat Professor Workstation, the Golden Throat Research Institute, the Golden Throat Gastrointestinal Research Institute, and the Golden Throat Heart and Brain Research Institute. It will also facilitate the development of new products such as genetic medicines, traditional Chinese medicine prescriptions, specialty medical devices, and specialty health foods. As of 30 June 2022, the second phase of the Golden Throat new base is in the initial stage of construction.
  • The Golden Throat WeChat Mini Program Mall was launched in early 2020. “We will continue to expand online sales channel in 2022, and we believe there would be breakthroughs in our online business in the future.”
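As a quick sanity check on the Euromonitor growth figures quoted above, here is a minimal Python sketch of the compound annual growth rate (CAGR) arithmetic; the inputs are the RMB figures from the report, and the small gap versus the quoted 9.7% forecast CAGR presumably reflects rounding in the underlying numbers.

```python
# Minimal CAGR check for the lozenge retail-sales figures quoted from the
# 2015 Euromonitor Report (values in RMB billions).
def cagr(start: float, end: float, years: int) -> float:
    """Compound annual growth rate between two values over a number of years."""
    return (end / start) ** (1 / years) - 1

print(f"2009-2014 actual CAGR:   {cagr(2.09, 3.42, 5):.1%}")  # ~10.4%, as quoted
print(f"2014-2019 forecast CAGR: {cagr(3.42, 5.46, 5):.1%}")  # ~9.8% vs the quoted 9.7%
```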

Golden Throat’s sales volumes and pricing of products

  • There was a change in packaging configuration in August 2013, so numbers for 2012 and 2013 are not like-for-like comparisons with numbers in later years.
  • Golden Throat Holdings has managed to raise the prices for its Golden Throat Lozenges (OTC) products over time, while keeping gross margin steady, keeping sales volume steady (although less steady than gross margin), and increasing revenue → signs of pricing power for the Golden Throat Lozenges (OTC) product
  • Golden Throat Holdings has managed to raise the prices for its Golden Throat Lozenge Series Products over time, while increasing gross margin, increasing sales volume, and increasing revenue → signs of pricing power for Golden Throat Lozenge Series Products
  • Golden Throat Holdings’ sales volume was hurt in 2020 because of COVID, but the company still maintained or increased its product prices.
  • Golden Throat’s sales volume for Golden Throat Lozenge (OTC) products did not increase much over time because the volume was already near the company’s capacity – prior to the expansion mentioned in the section above on market and future expansion, Golden Throat’s annual production capacity was ~126 million boxes of the Golden Throat Lozenge (OTC) product.

Golden Throat financial performance

Annual numbers

  • Revenue has grown over time but had some ups and downs – same with net profit
  • Was always generating positive operating cash flow and free cash flow (with the exception of 2017), although there’s no clear growth in cash flows.
  • Balance sheet was always in a strong net-cash position
  • No history of dilution (IPO happened in 2015 – immediately after the IPO, there were around 726.36 million shares)
  • There was a dividend paid in every year since the company’s IPO, and it has increased over time; the dividend also looks fairly sustainable

Half-yearly numbers

  • Revenue growth in H1 2022 was affected by a resurgence of COVID in China, and so was net income
  • But cash flows have improved tremendously and balance sheet remains rock-solid
  • Worth noting that Golden Throat’s borrowings are all on fixed rates, so there’s no danger of rising interest rates negatively affecting the company’s profit and/or cash flow

Management’s integrity and kindness

  • There are related party transactions (RPTs), but they are minimal. In 2021, Golden Throat Holdings incurred RMB 9.576 million in expenses to procure raw ingredients (such as liquid isomalt, isomalt AG, syrup, and probiotics) from a related entity, Changbao; in 2020, the amount was RMB 4.388 million. These amounts make up only a single-digit percentage of total net profit (and even much smaller percentage of total revenue) in their respective years.
  • The remuneration of Jiang Peizhen and Zeng Yong has largely increased at a faster rate than Golden Throat Holdings’ revenue, net income, and FCF over the years, especially after the company’s IPO. But their remuneration levels only make up a single-digit percentage of Golden Throat Holdings’ net income (see table below).
  • Golden Throat Holdings ended 2021 with 937 full-time employees, of whom 100 are disabled persons. In August 2020, Golden Throat Holdings provided electric vehicles for employees commuting to work. The EVs are produced by Liuzhou SGMW (柳州上汽通用五菱) and Golden Throat Holdings ordered over 700 of them from SGMW. Management thinks the EVs “would not only solve the transportation problem of employees with long commuting distance, but also effectively stimulate domestic demand and help economic growth and recovery.”

Valuation

  • Valuation numbers based on 11 January 2023 share price of HK$1.98
  • Trailing PE (price-to-earnings) of 7.8, trailing PFCF (price-to-free cash flow) of 7.7
  • Net-cash per share of HK$0.88
  • Trailing PE net of cash of 5.0, trailing PFCF ratio net of cash of 4.9
  • Trailing dividend yield of a massive 9.1%
  • Management wanted to acquire the company in August 2021 at HK$2.80 per share together with Affirma (an emerging-markets private equity firm owned and operated by the former senior leadership team of Standard Chartered Private Equity; it managed over US$3.5 billion in assets at the time of the announcement). I think this price could be seen as a floor on the value of Golden Throat Holdings. Golden Throat’s trailing earnings per share and free cash flow per share were RMB 0.30 (~HK$0.36) and RMB 0.18 (~HK$0.21), respectively, based on the company’s financials for the first half of 2021, meaning the acquisition price valued the company at a trailing PE and trailing PFCF ratio of just 7.8 and 13.1. Net of cash, the PE and PFCF ratios would be 5.3 and 8.8 (a sketch of this arithmetic follows this list).
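To make the arithmetic above easy to follow, here is a minimal Python sketch of the net-of-cash multiple calculation, using the offer price and per-share figures quoted in the bullet above. Note that the HK$0.88 net-cash-per-share figure is the January 2023 number and the HK$ conversions are rounded, which is why the outputs differ slightly from the 13.1 and 8.8 figures stated.

```python
# Minimal sketch of the price-to-earnings and price-to-free-cash-flow arithmetic
# for the August 2021 privatisation offer, with and without netting out cash.
def multiple(price, per_share_metric, net_cash_per_share=0.0):
    """Price divided by a per-share metric, optionally netting cash out of the price."""
    return (price - net_cash_per_share) / per_share_metric

offer_price = 2.80   # HK$ per share, August 2021 offer
eps = 0.36           # trailing earnings per share, HK$ (~RMB 0.30)
fcf_ps = 0.21        # trailing free cash flow per share, HK$ (~RMB 0.18)
net_cash_ps = 0.88   # net cash per share, HK$ (January 2023 figure)

print(f"Trailing PE at offer price:   {multiple(offer_price, eps):.1f}")                 # ~7.8
print(f"Trailing PFCF at offer price: {multiple(offer_price, fcf_ps):.1f}")               # ~13.3
print(f"Trailing PE net of cash:      {multiple(offer_price, eps, net_cash_ps):.1f}")     # ~5.3
print(f"Trailing PFCF net of cash:    {multiple(offer_price, fcf_ps, net_cash_ps):.1f}")  # ~9.1
```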

Final thoughts (as of 16 January 2023)

  • Very cheap valuation right now
  • Possibility of much higher revenue in 2023 (compared to 2022 and 2021) as China has reopened and Chinese citizens depend on the Golden Throat Lozenge (OTC) product to soothe their ailments from COVID or otherwise; 2022’s overall numbers may be lower than in 2021 as China was in lockdown mode for most of 2022 and only opened up late in the year.
  • Selling prices for Golden Throat Lozenge (OTC) products on Tmall are currently easily more than RMB 10 per box, and more commonly around RMB 12-14 per box (see screenshots below, taken on 16 Jan 2023 from the Tmall app – sidenote: Tmall has a better reputation than Taobao). The unit sale price to distributors reported by the company in H1 2022 was just RMB 7.0 per box; I think it’s reasonable to expect the unit sale price to distributors for 2023 – as well as overall volume – to be materially higher than in 2022 and 2021, thereby boosting profit and cash flow margins for Golden Throat Holdings.
  • Golden Throat Holdings had expanded production capacity in 2021, and is building a new plant right now.
  • Golden Throat Holdings has also received strong government support for the production of its products. See the following English translations of a Mandarin article from the Guangxi government website:
    • “On January 4, Wei Guanghui, a member of the party group and deputy director of the Food and Drug Administration of the Autonomous Region, led a team to Guangxi Liangmianzhen Yikang Pharmaceutical Co., Ltd. and Guangxi Golden Throat Pharmaceutical Co., Ltd. to provide door-to-door service guidance on the production of Golden Throat Lozenges, and to pay close attention to ensuring the supply of drugs for COVID-19 prevention and control.”
    • “Golden Throat Lozenges were selected into the ‘Catalogue of Drugs for Novel Coronavirus Infection (First Edition)’ issued by the Beijing Municipal Health Commission. In order to meet the clinical needs of the general public, the company has expanded its capacity and is producing at full capacity, and the Food and Drug Administration of the Autonomous Region has followed up on the whole process.”
    • “The working hours for production of Golden Throat Lozenges have been extended from the original 8 hours to 12 hours, and daily production has increased from 7.37 million tablets to 9.21 million tablets, which strongly supports the anti-epidemic needs of people across the country.”
  • For now, I see Golden Throat Holdings as a deep-value stock, but it could also change into a growth stock if its plans for new products such as genetic medicines, traditional Chinese medicine prescriptions, specialty medical devices, and specialty health foods succeed.
  • One risk to the company’s future business prospects is if its Golden Throat Lozenge (OTC) product price gets controlled by the government. According to the IPO prospectus, “there had been no fixed or maximum prices promulgated by any authorities in China on Golden Throat Lozenges (OTC).” There’s been no update on the matter that I could find in subsequent annual reports.

Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have no vested interest in any company mentioned. Holdings are subject to change at any time.

What We’re Reading (Week Ending 15 December 2024)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership-audience for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general. 

Here are the articles for the week ending 15 December 2024:

1. SpaceX: Rocket Ship – Matt Reustle and Luke Ward

Luke

So if we take the CapEx part of that first, NASA estimated that the cost to develop the Falcon 9 from scratch would be about $4 billion. But SpaceX ended up doing it for about a tenth of that price. So to begin with, that’s an order of magnitude improvement in the level of investment required.

SpaceX gives you the prices for launches on their website. So $70 million per launch of a Falcon 9 flight—that’s already 20 times cheaper than the Space Shuttle was per kilogram into orbit. But the real kicker, as you point out, is the operating leverage that comes from having partial reusability…

…Starship is designed to be fully and rapidly reusable. So unlike Falcon 9, which is only partially reusable, it’s also meant to be able to fly multiple times every day. It’s going to have a payload capacity that’s about 100 tons to orbit at the beginning, but probably rising to closer to 200 tons to orbit over time.

And Musk has suggested that a variable cost of around $10 million per launch is the ballpark figure which they’d be aiming for at scale in a steady state, ambitiously maybe even falling to $2 million—a figure which has been touted. If you believe those kinds of performance levels are feasible, that gets the cost down to around $10 per kilogram. That’s over 100 times cheaper than the Falcon 9 we’re talking about at the moment. And that would have a dramatic effect on what’s economically feasible for humanity to do in space…
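For a rough sense of how the quoted figures translate into cost per kilogram, here is a back-of-envelope sketch (ours, not from the podcast); the Falcon 9 payload figure of roughly 17.5 tons to low Earth orbit in reusable configuration is an assumption not stated in the transcript.

```python
# Back-of-envelope cost-per-kilogram comparison implied by the figures quoted above.
falcon9_price = 70e6            # $ per launch, SpaceX's published Falcon 9 price
falcon9_payload_kg = 17_500     # assumed payload to LEO in reusable configuration

starship_cost = 2e6             # $ per launch, the most ambitious variable-cost figure cited
starship_payload_kg = 200_000   # the longer-term payload target cited

falcon9_per_kg = falcon9_price / falcon9_payload_kg    # roughly $4,000/kg
starship_per_kg = starship_cost / starship_payload_kg  # roughly $10/kg

print(f"Falcon 9:  ~${falcon9_per_kg:,.0f}/kg")
print(f"Starship:  ~${starship_per_kg:,.0f}/kg")
print(f"Ratio:     ~{falcon9_per_kg / starship_per_kg:.0f}x")  # consistent with 'over 100 times cheaper'
```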

…Matt

Satellites in Low Earth Orbit—there is quite a bit of history in terms of that being the obvious space use case, that having an existing economy. I think Starlink is an extension of that. Different, absolutely, but an extension of what was going on.

Are there brand new industries being unlocked, or obvious things with line of sight that open up from a space economy perspective, that you see either today or in the near future? And when I say near future, you could extend that out however far you think is reasonable.

Luke

A lot of these options which SpaceX has to develop, brand new markets that don’t exist already, are a function ultimately of the cost curve. Take semiconductor manufacturing on Earth; at the moment, we spend billions of dollars per fab to recreate the conditions which are readily accessible in space for free, if you can get there.

And so there’s some point on the cost curve intersecting between the cost of building a fab and the cost of launching a fab or the equipment of a fab into orbit and operating there instead. Same can be said of pharmaceutical research. The crystallization structures which are able to happen in space are different from the ones which are able to happen under the influence of gravity.

So if you think about pricing on pharmaceuticals, extending patent lives, etc., if you can move the manufacturing or the research lab for cutting-edge pharmaceuticals into space, you could make high-value, low-volume products. Something which would really make sense to do and doesn’t require a huge technological innovation to happen.

The list can go on and on—artificial organs, for example, being able to manufacture perfectly spherical lenses. There’s lots and lots of things which could be made.

Maybe the way to think about that is that space-based manufacturing could be the next large market for this if the costs can continue to come down. Starship having the volume of an A380 or a 747—think of the equivalent size of factory that represents. And if that can be launched every single day and recovered every single day for $10 per kilogram, that could be a really compelling way to do quite a lot of manufacturing.

Incidentally, that’s something that Jeff Bezos really focuses on in his vision for space as opposed to Mars per se, is where we can move a lot of the heavy-polluting industry off the planet. And why don’t we turn Earth into this perfect nature reserve, and all these polluting aspects of manufacturing can go into orbit, which again is very compelling.

Probably needs a lot more innovation to deliver communications from orbit, but I’d say it’s maybe an inevitability if the cost gets to a low enough point. You think how much solar energy is available without the atmospheric attenuation, for example—you know, 24/7. There’s lots of compelling reasons why if it’s cheap enough, at some point a lot of these things probably should happen, not just could happen.

Matt

The solar energy point, great example of something that is an entirely different dynamic in space than on Earth. What would the other things be? Just out of curiosity, when you mentioned semiconductors or pharmaceuticals, is it just purely gravity? Are there other things that are happening in space or not happening in space that happen on Earth that would drive that difference?

Luke

There’s the vacuum conditions—so there isn’t an atmosphere—so the level of impurities which you need to get rid of for a vapor deposition machine, for example. You don’t have the same kind of challenges there of having to have this deep vacuum.

Then, arguably, in space, because you don’t have gravity, you could construct much larger structures there rather than construct them on the ground and then launch them.

So again, that volume constraint which we were talking about earlier, in terms of how big your payload is—if you’re able to get enough stuff up there and assemble it in space, as we did with the International Space Station, things can be much, much larger given the payload bay of Starship than they could with the Space Shuttle.

Matt

When you think about low Earth orbit versus geosynchronous orbit versus something like Mars—which I think was the original vision with Elon and SpaceX—how much does that change the economics when you extend out?

Is it orders of magnitude where it’s an exponential cost curve to go further out? Even just if we focus on the launch and use a satellite for an example, before we get into all the manufacturing dynamics, is there any way to contextualize that from a cost perspective?

Luke

The really good news here is that gravitational force decreases with the square of distance. So the biggest challenge is getting off the surface and into orbit. Once you’re there, from an energy point of view, it’s a lot easier to go anywhere else in the solar system.

So if you were to take Falcon 9 again as the example, for the same price, it can place 20 tons into low Earth orbit, or it can place 4 tons into Martian orbit. That’s despite the latter being over a million times further away. Now, this feeds into what I think is probably the biggest misconception about SpaceX and its Mars ambitions.

I’d say for most people, the idea of a commercial entity pursuing exploration is naive at best. But I’d argue that long-term investors should be absolutely ecstatic about SpaceX having this mission as a forcing function. Firstly, it’s the key to getting the best people in the world to come and work for the organization and allow it to innovate in a manner and speed that others simply can’t match. That’s a huge competitive advantage.

Secondly, the way to get more cargo to Mars is actually about figuring out how to get more cargo into orbit around Earth, because that’s where the cost is all concentrated. It’s all in that first initial leap off the surface of our planet. So rather than framing Starship as a system that makes it possible to get to other planets, think about it instead being a system that could make it enormously more profitable to operate a business in Earth orbit and unlock brand new commercial use cases there as well…

…Luke

When we talk to SpaceX, they’re still very much focused on the here and now in the next couple of years. They have ambitions for things which they could do, but the focus is very much on the core business: serving the core customers, serving Starlink, getting Starship to launch status. We’ll deal with the next things next.

They’ve got so many things which they could be doing at the moment. When we come to this, a lot of this is us hypothesizing of how that could evolve beyond information which they’ve given us. The trend which you’ve seen of them to be vertical integrators could be quite informative. It might be that they end up being the ones who are commercializing a lot of these other services.

Rather than having a customer paying them for it at substantial scale, it would make more sense for them to do it. Could you start seeing some of these aspects? If they get into space-based manufacturing, for example, could that be priced on a value-added basis rather than a subscription basis or a volume basis? Certainly seems possible. If you start running data centers in space because it’s easier to power or cool them, etc., could you start offering data storage and machine learning alongside Starlink connectivity?

The further you look out, the more and more wacky it can get, but it’s also potentially financially plausible as well. You maybe have to take a bit of inspiration from science fiction here, but it’s quite a common trope in some of these movies of these large mega-corporations—the Weyland-Yutani Corporation from the Alien movies, or the Resources Development Administration from the Avatar films—where one mega-corporation was able to dominate access to space early on and then ends up controlling the entire extrasolar economy because of the advantages it had at that really early stage…

…Luke

Human spaceflight at the moment definitely has been the preserve of the rich and famous, but at scale it becomes cheaper and cheaper. And if we are talking about launching, Starship could be used as much for sending cargo and people to other points on the planet as to other points in space. And so one option that the government’s looking into is this notion of rocket cargo delivery. Starship would be able to deliver 200,000 kg anywhere on the planet within 40 minutes.

What does that do for sort of a rapid reaction force, and what does that do for next-day delivery? At some stage, it’s going to be feasible to put a lot of astronauts or paying passengers on something like that, and it will be a quicker and potentially more efficient way to do long-distance travel. These things really could get quite wild, but it could be plausible at some stage. Again, that’s not the reason to invest in the company today; that’s not the basis of what they’re doing, and it’s a lot of people getting excited about things.

But come back in 10 years, I’d be disappointed if you or I weren’t able to go into space at some point in our lifetime for the cost of a premium economy ticket or something like that.

2. Japan vs Big Tech – Daye Deng

Put simply, US big tech has grown so dominant that it’s singlehandedly blowing a hole in the trade balance of a nation as large as Japan…

…In 2023, Japan recorded JPY 5.5 trillion in so-called digital trade deficit. The Ministry of International Trade and Industry (MITI) projects this to grow to JPY 8 trillion by 2030, at which point it could surpass Japan’s annual import of crude oil.

Japan’s total goods and services trade deficit in 2023 was JPY 6 trillion, with the digital deficit accounting for JPY 5.5 trillion…

…Japan has been in a structural deficit for goods trade over the past two decades. This may come as a surprise to those who have held onto the old idea that Japan is an export powerhouse.

There are several reasons for the shift:

  • Japanese firms have moved production overseas. This isn’t entirely negative since Japanese firms (and their profits) continue to grow, but it has contributed to a widening trade deficit.
  • Japan’s loss of global competitiveness in certain industries, like chips and appliances, to rivals such as South Korea.
  • Rising cost of imports driven by energy shocks, rising overseas inflation, and weak yen.

The third point deserves elaboration. Japan’s reliance on imported energy has long been a critical structural weakness. For example, following the 2011 Fukushima nuclear disaster, Japan significantly reduced domestic nuclear energy production and increased its reliance on imported LNG, which became a major contributor to the trade deficit.

A similar pattern emerged post-Covid. Global oil and commodity prices surged. This was compounded by high rates of overseas inflation on general imports. On top of that, a historically weak yen made imports even more expensive…

…Since 2014, the Japanese government has been disclosing the digital deficit, which has grown 2.6-fold from 2014 to JPY 5.5 trillion in 2023. This is a net figure derived from JPY 9.2 trillion paid for digital services and JPY 3.7 trillion received from abroad…

…The picture is quite clear: on the services side, Japan is taking its hard-earned surplus from tourism and spending it all on paying for digital services.

How will this play out? While I’m personally bullish on the Japanese tourism industry, it still has natural growth constraints. However, there is no ceiling on how much Japan can continue to spend on digital services. In fact, digital services spend could accelerate given:

  • Japan is already playing catch-up in the digital realm, and is behind other major countries in many key digital metrics.
  • AI is poised to make Japan’s digital dependency crisis even worse, in a world where firms like Nvidia and those that are able to scale AI services (e.g. hyperscalers) dominate AI economics.

Without an AI champion of its own, Japan has few options if it wants to avoid being left behind in the new digital paradigm…

…Based on our discussion so far, does it surprise you that the Japanese yen has been weak?

“According to an analysis by Mizuho Research & Technologies, if the digital deficit doubles from the 2023 level by the end of March 2026, it will add another 5 to 6 yen of depreciation in the Japanese currency’s value against the dollar.”

– Nikkei Asian Review

Or let me put it another way — would you feel bullish about the currency of a country that relies on tourism as its primary growing surplus, while ultimately funneling all those earnings (and more) into paying for essential energy imports and ever-increasing digital spend on big tech?…

…In recent years we’ve seen how hard Japan has been trying to reclaim its position in the semiconductor industry. But do they only care about hardware and not its digital sovereignty? Will Japan continue to sit back and let US tech giants profit endlessly, or will it finally confront its position as a digital colony?

3. Guyana and the mystery of the largest ranch in the Americas – Swen Lorenz

Many mistakenly believe that Guyana is located in Africa – when it’s actually nestled right next to Venezuela…

…In 2015, ExxonMobil discovered oil off the coast of Guyana.

The discovery changed the course of the country. Long one of the poorest nations of the Western hemisphere, Guyana has since become the world’s fastest growing economy.

Since 2015, its GDP per capita has more than quintupled. In 2022 and 2023, its economy grew by 67% and 33%, respectively. Another stunner of a year is forecast for 2024, with 34% GDP growth.

The former British colony benefits from a large amount of oil wealth spread around a relatively small population of 800,000 people. Per head, there is twice as much oil as in Saudi Arabia. To put things in perspective, Guyana’s landmass is nearly as big as the UK, but it only has 1.2% of the UK’s population…

…Just a week ago, ExxonMobil reported that it had reached 500m barrels of oil produced in Guyana since output began in 2019. The goal is to lift production to 1.3m barrels per day by 2027, up from currently 650,000 barrels. In comparison, the UK’s North Sea produces just 1m barrels per day…

…Supporters of the country’s energy projects claim that they will bring untold riches to the population. Indeed, Guyana recently started to hand out cheques to its citizens, including the Guyanese diaspora of 400,000 people, who the government encourages to come back as it needs more labour to support the strong economic growth.

4. Capital, Compute & AI Scaling – Patrick O’Shaughnessy, Chetan Puttagunta, and Modest Proposal

Modest

Everyone knows the Mag 7 represent a larger percent of the S&P 500 today. But beyond that, I think thematically AI has permeated far broader into industrials, into utilities and really makes up, I would argue, somewhere between 40 and 45% of the market cap as a direct play on this. And if you even abstract to the rest of the world, you start bringing in ASML, you bring in TSMC, you bring in the entire Japanese chip sector. And so if you look at the cumulative market cap that is a direct play on artificial intelligence right now, it’s enormous…

… I think at the micro level this is a really powerful shift if we move from pre-training to inference time and there are a couple big ramifications.

One, it better aligns revenue generation and expenditures. I think that is a really, really beneficial outcome for the industry at large. In the pre-training world, you were going to spend 20, 30, $40 billion on CapEx, train the model over 9 to 12 months, do post-training, then roll it out, then hope to generate revenue off of that in inference. In a test-time compute scaling world you are now aligning your expenditures with the underlying usage of the model. So just from a pure efficiency and scalability standpoint on the financial side, this is much, much better for the hyperscalers.

I think a second big implication, and again we have to say we don’t know that pre-training scaling is going to stop, is that if you do see this shift towards inference time, you need to start to think about how you re-architect the network design. Do you need million-chip superclusters in locations with low-cost energy and land, or do you need smaller, lower-latency, more efficient inference-time data centers scattered throughout the country? And as you re-architect the network, what are the implications for power utilization and grid design?

A lot of the, I would say, narratives that have underpinned huge swaths of the investment world I think have to be rethought. And I would say today, because this is a relatively new phenomenon, I don’t believe that the public markets have started to grapple with what that potential new architecture looks like and how that may impact some of the underlying spend…

Chetan

But at the moment, at this plateauing time, we’re starting to see these small teams catch up to the frontier. And what I mean by frontier is: where are the state-of-the-art models, especially around text, performing? We’re seeing these small teams of quite literally two to five people jumping to the frontier with spend that is not one order, but multiple orders of magnitude less than what these large labs were spending to get there.

I think part of what’s happened is the incredible proliferation of open-source models. Specifically, what Meta’s been doing with LLaMA has been an extraordinary force here. LLaMA 3.1 comes in three flavors, 405 billion, 70 billion, 8 billion. And then LLaMA 3.2 comes in 1 billion, 3 billion, 11 billion, and 90 billion.

And you can take these models, download them, put them on a local machine, you can put them in a cloud, you can put them on a server, and you can use these models to distill, fine-tune, train on top of, modify, et cetera, et cetera, and catch up to the frontier with pretty interesting algorithmic techniques.

And because you don’t need massive amounts of compute, or you don’t need massive amounts of data, you could be particularly clever and innovative about a specific vertical space, or a specific technique, or a particular use case to jump to the frontier very, very quickly…
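As a purely illustrative aside (not from the podcast): the workflow described here, pulling down an open-weight Llama checkpoint and running it locally as the starting point for fine-tuning or distillation, might look roughly like the sketch below using the Hugging Face transformers library. The model identifier and settings are assumptions, and the gated Llama checkpoints require accepting Meta's license.

```python
# Illustrative sketch only: load an open-weight Llama model and run local inference,
# the typical first step before fine-tuning or distilling on top of it.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "meta-llama/Llama-3.1-8B-Instruct"  # assumed checkpoint; any open-weight model works

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision so the 8B model fits on a single GPU
    device_map="auto",           # requires the accelerate package
)

prompt = "In one sentence, what is model distillation?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```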

…Chetan

The force of Llama today has been two things, and I think this has been very beneficial to Meta. One is the transformer architecture that Llama is using, which is a sort of standard architecture, but it has its own nuances.

And if the entire developer ecosystem that’s building on top of Llama is starting to just assume that that Llama 3 transformer architecture is the foundational and sort of standard way of doing things, it’s sort of standardizing the entire stack towards this Llama way of thinking, all the way from how the hardware vendors will support your training runs to the hyperscalers and on and on and on. And so standardizing on Llama itself is starting to become more and more prevalent.

And so if you were to start a new model company, what ends up happening is starting with Llama today is not only great because Llama is open source, it’s also extraordinarily efficient because the entire ecosystem is standardizing on that architecture…

…Modest

So I think the interesting part for OpenAI was because they just raised the recent round and there was some fairly public commentary around what the investment case was. You’re right, a lot of it oriented around the idea that they had escape velocity on the consumer side and that ChatGPT was now the cognitive reference and that over time they would be able to aggregate an enormous consumer demand side and charge appropriately for that and that it was much less a play on the enterprise API and application building.

And that’s super interesting if you actually play out what we’ve talked about. When you look at their financials, if you take out training runs, if you take out the need for this massive upfront expenditure, this actually becomes a wildly profitable company quite quickly in their projections. And so in a sense it could be better.

Now then the question becomes what’s the defensibility of a company that is no longer step function advancing on the frontier?…

…Chetan

These products are truly, as a software investor, absolutely amazing.

They require a total rethinking from first principles on how these things are architected. You need unified data layers, you need new infrastructure, you need new UI and all this kind of stuff. And it’s clear that the startups are significantly advantaged against incumbent software vendors. And it’s not that the incumbent software vendors are standing still, it’s just that innovator’s dilemma in enterprise software is playing out much more aggressively in front of our eyes today than it is in consumer.

I think in consumer, the consumer players recognize it, are moving it, and are doing stuff about it. Whereas I think in enterprise, even if you recognize it, even if you have the desire to do something, the solutions are just not built in a way that is responsive to dramatic re-architecture. Now could we see this happening? Could a giant SaaS company just pause selling for two years and completely re-architect their application stack?

Sure, but I just don’t see that happening. And so if you just look at any sort of analysis on what’s happening on AI software spend, something like it’s 8x year-over-year growth between 2023 and 2024 on just pure spend. It’s gone from a couple of hundred million dollars to well over a billion in just a year’s time…

…Modest

If you listen to AWS, one of the fascinating things they say is they call AWS a logistics business.

I don’t think anyone externally would sort of look at cloud computing and say, oh yeah, that’s a logistics business. But their point is essentially what they have to do is they have to forecast demand and they have to build supply on a multi-year basis to accommodate it.

And over 20 years they’ve gotten extraordinarily good at that. What has happened in the last two years, and I talked about this last time, is you have had an enormous surge in demand hitting inelastic supply, because you can’t build data center capacity in three weeks. And so if you get back to a more predictable cadence of demand where they can look at it and say, okay, we know now where the revenue generation is coming from.

It’s coming from test time, it’s coming from Chetan and his companies rolling out. Now we know how to align supply with that. Now it’s back to a logistics business. Now it’s not grab every mothballed nuclear site in the country and try to bring it online.

And so instead of this land grab, I think you get a more reasonable, sensible, methodical rollout of it maybe. And I actually would guess that if this path is right, that inference overtakes training much faster than we thought and gets much bigger than we may have suspected.

But I think the path there in the network design is going to look very different and it’s going to have very big ramifications for the people who were building the network, who were powering the network, who were sending the optical signals through the network. And all of that, I think, has not really started to come up in the probability-weighted distributions of a huge chunk of the public market.

And look, I think most people overly fixate on NVIDIA because they are sort of the poster child of this, but there are a lot of people downstream from NVIDIA that will probably suffer more because they have inferior businesses. NVIDIA is a wonderful business doing wonderful things. They just happen to have seen the largest surge in surplus. I think that there are ramifications far, far beyond who is making the bleeding edge GPU, even though I do think there will be questions about, okay, does this new paradigm of test time compute allow for customization at the chip level much more than it would have if we were only scaling on pre-train…

…Modest

If you think about a training exercise, you’re trying to utilize the chips at the highest possible percentage for a long period of time. So you’re trying to put 50,000 or 100,000 chips in a single location and utilize them at the highest rate possible for nine months. What’s left behind is a hundred-thousand-chip cluster that, if you were to repurpose it for inferencing, is arguably not the most efficient build, because inference is peaky and bursty and not consistent.

And so this is what I’m talking about that I just think from first principles you are going to rethink how you want to build your infrastructure to service a much more inference focused world than a training focused world. And Jensen has talked about the beauty of NVIDIA is that you leave behind this in place infrastructure that can then be utilized.

And in a sunk cost world you say, sure, of course if I’m forced to build a million chip supercluster in order to train a $50 billion model, I might as well sweat the asset when I’m done. But from first principles it seems clear you would never build a 350,000 chip cluster with 2 1/2 gigawatts of power in order to service the type of request that Chetan’s talking about.

And so if you end up with much more edge computing with low latency and high efficiency, what does that mean for optical networking? What does that mean for the grid? What does that mean for the need for on site power versus the ability to draw from the local utility?…

…Chetan

There’s a semiconductor company called Cerebras, and they recently announced that for inference on Llama 3.1 405B, Cerebras can generate 900-plus tokens per second, which is a dramatic order-of-magnitude increase. I think it’s like 70 or 75 times faster than GPUs for inference, as an example. And so as we move to the inference world, the semiconductor layer, the networking layer, et cetera, there’s tons of opportunities for startups to really differentiate themselves…

…Modest

On a less sort of dramatic view, the way I think about this, there’s AlphaGo, which famously made that move that no one had ever seen, I think it’s move 37, that everybody was super confused about, and it ended up winning. And another example I love is Noam Brown, because I like poker, who talked about how his poker bot confused the pros: it was playing high-stakes no-limit, and it continually over-bet with dramatically larger sizes than pros had ever seen before.

And he thought the bot was making a mistake. And ultimately it destabilized the pros so much. Think about that: a computer destabilized humans in their approach, to the point that they have to some extent now taken over-betting into their own game.

And so those are two examples where if we think about pre-training being bounded by the data set that we’ve given it, if we don’t have synthetic data generation capabilities, here you have two examples where algorithms did something outside of the bounds of human knowledge. And that’s what’s always been confusing to me about this idea that LLMs on their own could get to superintelligence, is functionally they’re bounded by the amount of data we give them up front.

5. Will China Take Over the Global Auto Industry? – Brad Setser

China has, according to the New York Times, the capacity to produce over 40 million internal combustion engine (ICE) cars a year.

Goldman Sachs thinks China will also have the capacity to produce around 20 million electric vehicles by the end of 2024…

…China’s internal market is around 25 million cars, and not really growing—so rising domestic EV sales progressively free up internal combustion engine capacity for export. Domestic demand for traditional cars is likely to be well under 10 million cars next year given the enormous shift toward EVs now underway inside China…

…Historically, the autos market has been largely regional (setting aside trade in luxury cars, where volumes are smaller). Most cars sold in China were made in China, most cars sold in Europe are produced in Europe, most cars sold in North America are produced in North America, and so on. The U.S. did import a few million cars, on net, from Asia, and China imported a million or so luxury cars from Europe, but those were the exceptions rather than the rule.

That could change, absent hefty restrictions on Chinese auto imports (like the 100 percent tariff the U.S. now levies on EVs imported from China).

The global market—with massive overcapacity in China’s internal combustion engine (ICE) sector, massive capacity expansion in China’s EV sector, effectively unlimited credit for Chinese manufacturing firms from China’s state banks, and a Chinese yuan that is weaker against the dollar than it was back in 2008—is pushing for global auto manufacturing to become more like global electronics manufacturing, with a concentration of global production in a single region and, for that matter, a single country…

…Overcapacity in China’s automotive sector is not, in fact, all that new.

China’s traditional automotive sector was dominated by the joint ventures (“JVs”) formed by the large foreign firms and their (typically state-owned) Chinese partners. Chinese auto demand took off after the global financial crisis, and global firms responded by massively expanding their Chinese production capacity – as only the German luxury makers were interested in paying the 25 percent tariff and supplying the Chinese market from abroad.

But demand growth eventually slowed, and by 2018, the Wall Street Journal was reporting that the Chinese market was oversupplied…

…China’s EV industry—like EV industries in the U.S. and Europe—initially received substantial state backing. Chinese EV manufacturers benefitted from downstream subsidies that built out China’s battery and battery chemical industry, as well as access to the world’s cheapest steel. EV firms benefitted from cheap state financing—both equity injections from a myriad of state-backed funds and loans from state banks who (still) have to meet lending quotas.

Moreover, China was quite explicitly protectionist in the application of its “consumer” EV subsidies.

Only EVs that were on state lists of qualifying vehicles were eligible for the subsidy, and the subsidy was only provided to cars that were made in China…

…And initially, only cars that were made in China with a battery made in China by a Chinese firm qualified for the lists…

…The only exception to the basic rule that qualifying for the list required using a battery made in China by a Chinese firm simply confirmed the broad pattern of discrimination: Chinese-owned Volvo was allowed to use a Korean battery in one of its early EVs.

State support has not disappeared in any way as China’s EV industry took off. Looking at direct cash subsidies from the central government to the manufacturers misses the myriad of ways China, Inc. helps out firms producing in China…

…Nio received a significant ($1.9 billion) equity investment from the City of Hefei and the Province of Anhui, helping to offset ongoing losses. That equity injection was on top of state support for a factory in Hefei, which The New York Times reports was effectively a gift from the local government.

“‘The local government provided the land and the building’, said Ji Huaqiang, Nio’s vice president for manufacturing. ‘Nio does not own the factory or the land — it is renting, but the factory was custom built for Nio’”

That kind of support explains how Nio managed to build out its EV capacity even when its existing factories weren’t really being used that much:

“Nio’s two factories give it the capacity to assemble 600,000 cars a year, even though its annual rate of sales this autumn [2023] is only about 200,000 cars. Nio is nonetheless already building a third plant.”…

...What’s even more striking is that the investments that built out China’s EV capacity came in a market that was already saturated with modern auto production capacity. That kind of investment wouldn’t have taken place without state guidance and support, support that was intended both to develop an indigenous Chinese industry (see Made in China 2025) and to support a green transition that would reduce Chinese dependence on imported fossil energy. It was the result of policy driven by the central government and backed financially by all levels of government. It also worked: China is now the world leader in EVs and batteries…

…If the world’s global firms can only compete with Chinese firms by using Chinese batteries and Chinese parts, that will hollow out much of the automotive industries of Europe and North America—a European brand on a Chinese-made car with a Chinese battery and drive train won’t sustain the current European auto supply chain or current European employment in the auto industry.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in ASML, Meta, and TSMC. Holdings are subject to change at any time.

What We’re Reading (Week Ending 08 December 2024)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership-audience for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general. 

Here are the articles for the week ending 08 December 2024:

1. Why China’s Economy Opened Up in the 1970s – Joe Weisenthal, Tracy Alloway, and Odd Arne Westad

Joe (13:32):

What does it mean when you talk about history being “contingent?” You used that word a couple of times and I actually don’t know if I fully understand what that means, but when you’re telling these stories, or this story, and you’re keeping in mind the contingency in history, can you talk a little bit more about this idea?

Odd (13:48):

So you’ll see from the book that we go in and out from the sort of micro to the macro level of telling history. And if you look at the night when the coup against the radicals — the so-called Gang of Four within the party — took place, which we describe in some detail, you know, what happens from hour to hour…

Joe (14:10):

Right, this was the moment in which the left faction, after Mao dies, was arrested, and allowed for a sort of more moderate path to emerge.

Odd (14:21):

That’s right. And it was in effect a military coup. I mean, it was undertaken by the military and the security forces against the people who Mao himself had put in charge of the party, including his widow who was most prominent of all, Jiang Qing. Now that night, and the following few days, things could have ended up very differently. I mean, Shanghai, the biggest city in China by far, was still under control of the radicals. There were military units that supported the radical approach to politics. This could have ended up very differently from what it did.

And as we describe in the book, some of the plotters, some of the coup-makers themselves, in those days that followed the coup itself, were completely surprised by how little resistance there had been from the left. And how little chaos there had been on the streets. So that’s what I mean with it being contingent. I mean, this is something that obviously connects to the larger picture that we see today — going back to your sort of three level version of what happened in China. But it didn’t seem that obvious at the time. And it could have gone in very different directions from what we’re seeing today.

Tracy (15:30):

How important was the fraying of the relationship between China and the Soviet Union in the 1960s, early 1970s to spurring or catalyzing that opening up? Because it does feel like the sudden emergence of the Soviet Union as an external enemy, it feels like that led China in some respects to open up to the US and some other countries.

Odd (15:56):

This is a sort of trajectory that I think it’s really important to get right, because what Mao and his group of leaders did in the late 1960s was to turn to the United States as an ally — a pseudo ally, security ally — against the Soviet Union because they were so deadly afraid that there would be a war with the Soviets — a war that China certainly would have lost, given the state that Chinese communists themselves had pulled China into during the Cultural Revolution. So what Mao did was to turn to the enemy far away, the United States, to help back him against an enemy much closer to home, the Soviet Union, which they had this falling out with mainly for ideological reasons.

From Mao’s perspective, this was always intended to be a strictly security oriented pseudo alliance. It was directed against the Soviet Union. Mao to the end of his days was puzzled that United States would support the real communists, meaning him, against the fake communists, meaning the Soviet Union. But as long as they were willing to do that, he was certainly willing to reap the benefits. But he never intended that this would have any effect in terms of the increasingly radical communist direction that he was taking for China internally, domestically.

So that’s when what happens in 1976, after Mao’s death, becomes so significant, because the people who then took over, they thought, ‘Aha! We have this relationship with the United States. They are supporting us for their own reasons in the Cold War against the Soviet Union. We can now also make use of this to supercharge Chinese reform.’ If it hadn’t been for that relationship, strictly security oriented, that already existed between China and the United States, I doubt that would have been possible. So it’s very important, when thinking about the longer-term US-China relationship, to think about that origin and how this actually got started. Very different from the way most people think about it, where the security element and the reform element are sort of conflated into one…

…Odd (36:05):

I think it was both. I mean in the Xi Jinping case, I think he was picked by the party as the, what Chinese would call, the core leader, back in the early twenty-teens, in response to what was seen as a bunch of real problems, from a Chinese Communist Party perspective, over liberalization, decentralization, corruption, strength of private companies that meddled in a lot of things that the communists didn’t want them to meddle in. They wanted to get a strong leader in who could deal with those issues, in a way that his predecessors, Jiang Zemin [and] Hu Jintao, had not been able to do. So they wanted a strong leader. It’s just that, I think even for many communist leaders of that generation, they got more than they bargained for. So that’s where the personality aspect comes in. They got a leader who really wanted to return, at least on some issues, to the Maoist or even the sort of pre-Mao period in terms of the CCP’s history, and to emphasize the party’s position over what even many party leaders back 10 [or] 15 years ago thought would be good for China.

And it’s a classic example of responding to real world problems — not unknown in this country, right? — by going very far in one direction, hoping that that would resolve the problem that is there, and then getting stuck in a way with the kind of leader that you have in this case, in Xi Jinping. So I think that’s the story, the way we can tell it now. I hope at some point to be able to tell that story based on archives and primary documents, as an historian, we can’t do that yet. But I think at some point, we’ll be able to do that, and then it’ll be fascinating to test that hypothesis about how this happened.

Tracy (37:54):

So just on the revolution from below point, one of the things that you emphasize in the book is a lot of the stuff that happens in this time period is a result of people feeling that they are heading somewhere, that there’s a grander Chinese vision that can be achieved. And so that motivates people to actually do something. I’m curious, just going up to the present day, do you get a sense that people feel that? That there’s like a direction that China is heading in that it’s clear to people what they are trying to do?

Odd (38:33):

At the moment, absolutely not. I think it’s very, very clear that a lot of people in China do not understand where the country is heading and what the reasons are. And you know, you don’t spend much time in Beijing before you realize that these days. I think it was very different in the time period that we are talking about, which was generally a time of uplift, at least in economic and social terms. And it’s right to say, I mean as many historians have said, that there was an element of a bargain in this. That, at least for some Chinese, not everyone, but for some Chinese, maybe particularly in business, they would accept a dictatorship for what it was and then went on getting rich and establishing some of these great or middling fortunes that you find so many of in China today. And that is good. I mean that was positive. It was much, much better than the dark past that we described at the beginning of the book.

It was just that, China wasn’t able to take what, in our view, is a necessary step to improve its political system, its overall attempt at trying to become a more open, more pluralistic country in the period when the going was good, when there was a general sense that China was making advances, domestically and internationally. Now, I think even if people from within the Chinese Communist Party after Xi Jinping would try to move in a direction of increased liberalization — which I think they will have to do at some point because people are just very unhappy with the kind of system that is there at the moment — it would be much more difficult, because the going is not that good. And probably it’s never going to be that good again. I mean, it was a remarkable period of economic transformation, 10% per year growth rates. It would’ve been possible to carry out necessary reform. But these people didn’t want to do it because they had become so preoccupied with holding onto power themselves. And I think, historically, that that might turn out to be the biggest mistake that the Chinese Communist Party has made.

2. Tim Cook Wants Apple to Literally Save Your Life – Steven Levy and Tim Cook

Some companies charge for AI-enhanced services. Did you consider that?

We never talked about charging for it. We view it sort of like multitouch, which enabled the smartphone revolution and the modern tablet.

You’ve personally been using Apple Intelligence for a while. What has been most useful for you?

We’re an email-based company, and I get enormous numbers from users, employees, partners, and so forth. Having it summarize author responses is a game changer, and having it prioritize things for you so you’re not doing your usual triage. Then, of course, there are fun things like the Image Playground.

I’ve heard you say that Apple Intelligence could make you funnier, which seems strange.

I think it can make you friendlier, which, in many ways, can be funnier as well.

Having AI speak for people makes me wonder whether the nature of communication will degrade. If Apple Intelligence writes something funny, who’s being funny, the sender or the AI?

It’s still coming from you. It’s your thoughts and your perspective. You and I both remember the productivity that came from the advent of the personal computer. It was no longer you punching your calculator, you were doing something on a spreadsheet. It was no longer you at the typewriter, you were using a word processor. Logic Pro helps musicians create music, but they’re still the author.

One of your demos involves a fictional recent graduate applying for a job. The cover letter is colloquial and somewhat sophomoric, but with Apple Intelligence a single click changes it to look like a savvy, smart person wrote it. If I’m a recruiter who hired that person, maybe I will feel tricked if they don’t live up to the professionalism of that letter.

I don’t think so. By using the tool, it comes across as more polished. It’s still your decision to use the tool. It’s like you and I collaborating on something—one plus one can equal more than two, right?…

When you’re thinking about things late at night, don’t you sometimes ask what it would mean if computers had superhuman intelligence?

Oh, of course. Not just for Apple, but for the world. There’s so much extraordinary benefit for humanity. Are there some things you have to have guardrails on? Of course. We’re very deeply considerate about things that we do and don’t do. I hope that others are as well. AGI itself is a ways away, at a minimum. We’ll sort out along the way what the guardrails need to be in such an environment…

Meta and Snap are leading us to mixed-reality glasses that we’d wear continually. Is the bigger, heavier Vision Pro ultimately headed that way?

Yes, it’s a progression over time in terms of what happens with form factors. AR is a huge deal. With Vision Pro, we’ve progressed to what is clearly the most advanced technology we’ve ever done, and I think the most advanced technology in the world in terms of electronics problems. We’ll see where it goes.

Apple has created a lot of consumer tools for medical technology. What’s the strategy for biological metrics and prosthetics?

It’s clear to me that if you zoom out way into the future, and you look back and ask what Apple’s biggest contribution was, it will be in the health area. That’s what I really believe. When we started pulling that string with the Apple Watch, it was a cascade of events. We started with something simple, like monitoring your heart rate, and then figured out we could pick up heart signals to get to an EKG and an AFib determination. Now we are monitoring sleep apnea. I’ve gotten so many notes over time from people who would have not survived had it not been for the alert on their wrist.

Apple plans to give AirPods the ability to correct for hearing loss. I bet the makers of expensive hearing aids are freaking out.

It’s not about competing against hearing aids on the market. It’s about trying to convince people who have hearing loss to use their AirPods. The vast majority of people with hearing issues have not been diagnosed. For some people, hearing aids have a stigma, and we can counter that with AirPods. And we can have people diagnose themselves. It’s the democratization of health…

We’re doing this interview at Apple Park, which is now seven years old. Have you been surprised by anything that couldn’t have been anticipated when it was just blueprints?

It’s promoted collaboration even more than I thought. That was a key component of the design, but there are so many places here where you just unexpectedly run into people. In the cafeteria, at the coffee bar, outside when you’re going across the pathway. Also, there’s a connection here to Steve that is incredible and very deep. We have the theater named after him and think about him all the time, but I can feel him in other spaces too.

3. 2024: The State of Generative AI in the Enterprise – Tim Tully, Joff Redfern, Derek Xiao, with Claude Sonnet 3.5

AI spending surged to $13.8 billion this year, more than 6x the $2.3 billion spent in 2023—a clear signal that enterprises are shifting from experimentation to execution, embedding AI at the core of their business strategies…

…Today, 60% of enterprise generative AI investments come from innovation budgets, reflecting the early stages of generative AI adoption. However, with 40% of generative AI spending sourced from more permanent budgets—58% of which is redirected from existing allocations—businesses are demonstrating a growing commitment to AI transformation…

…While foundation model investments still dominate enterprise generative AI spend, the application layer is now growing faster, benefiting from coalescing design patterns at the infrastructure level. Companies are creating substantial value by using these tools to optimize workflows across sectors, paving the way for broader innovation…

…In 2024, much of the action happened at the application layer. With many architectural design patterns established, app layer companies are leveraging LLMs’ capabilities across domains to unlock new efficiencies and capabilities. Enterprise buyers are seizing the moment, pouring $4.6 billion into generative AI applications in 2024, an almost 8x increase from the $600 million reported last year…

…Code copilots lead the charge with 51% adoption, making developers AI’s earliest power users…

…Support chatbots have captured significant usage, with 31% enterprise adoption…

…Enterprise search + retrieval and data extraction + transformation (28% and 27%, respectively) reflect a strong drive to unlock and harness the valuable knowledge hidden within data silos scattered across organizations…

…Meeting summarization ranks fifth in use cases (24% adoption), saving time and boosting productivity by automating note-taking and takeaways…

…When selecting generative AI applications, enterprises have clear priorities: Return on investment and industry-specific customization matter most when selecting new tools. Surprisingly, price isn’t a major issue; just 1% of the enterprise leaders we surveyed mentioned price as a selection concern. Buyers are playing the long game: They are far more focused on tools that can deliver measurable value (30%) and that understand the unique context of their work (26%) over those offering the lowest price tag (1%)…

…When AI pilots stutter or stall, it’s often due to challenges not adequately considered during the selection process. Although buyers aren’t checking price tags, implementation costs, cited in 26% of failed pilots, frequently catch them off guard. Data privacy hurdles (21%) and disappointing return on investment (ROI) (18%) also throw pilots off course. Technical issues, especially around hallucinations (15%), round out the top reasons for failure…

…Traditionally slow to adopt tech, healthcare is now leading generative AI adoption with $500 million in enterprise spend…

…Historically resistant to tech, the legal industry ($350 million in enterprise AI spend) is now embracing generative AI to manage massive amounts of unstructured data and automate complex, pattern-based workflows…

…With its complex data, strict regulations, and critical workflows, financial services ($100 million in enterprise AI spend) are primed for AI transformation…

…From Hollywood screens to creators’ smartphones, generative AI is reshaping media and entertainment ($100 million in enterprise AI spend)…

…Foundation models still dominate. The LLM layer commands $6.5 billion of enterprise investment…

…Rather than relying on a single provider, enterprises have adopted a pragmatic, multi-model approach. Our research shows organizations typically deploy three or more foundation models in their AI stacks, routing to different models depending on the use case or results…

…Among closed-source models, OpenAI’s early mover advantage has eroded somewhat, with enterprise market share dropping from 50% to 34%. The primary beneficiary has been Anthropic,* which doubled its enterprise presence from 12% to 24% as some enterprises switched from GPT-4 to Claude 3.5 Sonnet when the new model became state-of-the-art. When moving to a new LLM, organizations most commonly cite security and safety considerations (46%), price (44%), performance (42%), and expanded capabilities (41%) as motivations…

…To power RAG, enterprises must store and access relevant query knowledge efficiently. While traditional databases like Postgres (15%) and MongoDB (14%) remain common, AI-first solutions continue to gain ground. Pinecone,* an AI-native vector database, has already captured 18% of the market.
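To make the multi-model and RAG patterns described above a little more concrete, here is a minimal sketch in Python. It is not any vendor's actual API: the model names, the embed() stand-in, and the sample documents are all invented for illustration, and a real deployment would call hosted models and a managed vector database instead.

```python
# Toy sketch (not any vendor's actual API) of two patterns from the report:
# routing different use cases to different foundation models, and retrieving
# context from a small vector index before answering. Everything here is a
# placeholder invented for illustration.
import numpy as np

def embed(text: str) -> np.ndarray:
    """Stand-in embedding: hash words into a fixed-size vector (illustrative only)."""
    vec = np.zeros(64)
    for word in text.lower().split():
        vec[hash(word) % 64] += 1.0
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# A minimal "vector database": documents stored alongside their embeddings.
DOCS = [
    "Q3 revenue grew 12% year over year.",
    "The support team resolved 80% of tickets within a day.",
    "New hires must complete security training in week one.",
]
INDEX = [(doc, embed(doc)) for doc in DOCS]

def retrieve(query: str, k: int = 1) -> list[str]:
    """Return the k documents whose embeddings are closest to the query (dot product)."""
    q = embed(query)
    scored = sorted(INDEX, key=lambda pair: -float(q @ pair[1]))
    return [doc for doc, _ in scored[:k]]

# Multi-model routing: pick a model per use case, mirroring the report's finding
# that enterprises deploy several foundation models side by side.
MODEL_FOR_USE_CASE = {
    "code": "model_a",      # e.g. a code-tuned model
    "support": "model_b",   # e.g. a cheaper, faster model for chatbots
    "analysis": "model_c",  # e.g. a frontier model for hard reasoning
}

def answer(query: str, use_case: str) -> str:
    model = MODEL_FOR_USE_CASE.get(use_case, "model_c")
    context = retrieve(query)
    # In a real system this would call the chosen model's API with query + context.
    return f"[{model}] context={context!r} question={query!r}"

print(answer("How fast are support tickets handled?", "support"))
```

The sketch only mirrors the shape of the stack the survey describes: several models behind a simple router, with retrieval supplying context before any model is called.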

4. An Interview with Understanding AI Author Timothy B. Lee – Ben Thompson and Timothy B. Lee

As a side note, just as you sort of referenced it in passing, there is always the question of where are the productivity gains, when it came to, first the PC, and then the Internet? Is your sense that those just take a while to show up? Is there just a massive amount of consumer surplus that is not measured? What’s your big picture take on that question?

TL: There’s a couple of things. One is it takes a while to show up because to really get the big gains from a new general purpose technology, often you need to reorganize a lot of other business processes. There’s a famous analogy economists like to use for when they originally electrified the economy. The first thing they tried to do was take the old steam-powered factories that just had one big crankshaft and put an electric motor in, and that didn’t get you much improvement because the electricity was not cheap.

It was arguably worse.

TL: But then ten to twenty years later, people figured out, “Oh, we can have a bunch of small electric motors, one at each workstation, and now factories can be a lot more efficient”, but you had to build new factories and new businesses to do that…

Believe me, I think we’re around the same age, I know exactly what you mean and feel. That said, I feel like the big company — Wikipedia came out back when I was in college, or around that time, and of course everyone, professors or teachers, banned the use of it. But what you quickly realized is that the key way to use Wikipedia is the sources. You go to Wikipedia, and then it has links to all the sources, then you have your original source documentation. I do feel like ChatGPT is just such a better version of that, particularly with the search version, and when it does sources, it’s just like, “What if we make a Wikipedia that just fills in all sorts of white space about knowledge”, and it’s pretty tough to beat in that regard.

TL: Yeah, absolutely. And as with Wikipedia, you have to be smart about it. You can’t assume that everything is accurate, you have to check your work. But I definitely find, anytime I have, if I’m trying to make a list of things and I want to know all the companies in a particular category, it’s a pain in the ass to find that on Google. Whereas if you ask ChatGPT, “Here’s like three companies in this category, give me more on the list”, it’ll know a bunch more of them. There’s so many things like that. So yeah, definitely, I don’t want to say never use it or it’s not useful. It’s definitely useful, but it’s 1% to 2% more productive over the course of a week rather than really transformational…

...Again, to go back to your perspective of looking at it over the last 18, 20 months since you started, do you think we’ve hit a wall with AI? You started wondering this publicly actually last December when Gemini came out and you felt a little underwhelmed, particularly given Google’s advantages. You weren’t sure at the time, was Google underperforming for Google specific reasons, maybe have we gotten as far as we can with GPT-4? What’s your evaluation 11 months on from that article?

TL: The thing I’ve noticed is that we keep hearing about there’s going to be a GPT-5—

It’s not here.

TL: There’s going to be a new big model and it hasn’t been released, and I don’t have enough sources on the inside at those companies to know why that’s happening. But it could be they’re just still working on it and it’s going to come out next month and blow my mind, but every month that ticks by makes me a little more skeptical. Especially because the other trend we’ve seen is these companies are releasing these smaller models that are almost as good as the big models.

And then even to some extent, I was pretty impressed by o1, but what o1 did is kind of different. It wasn’t like scaling up the model, it’s like we’re going to do more inference time compute. In certain ways, it was much better, but it wasn’t better overall.

So it’s still a pretty rough hypothesis, but my hypothesis is that there’s kind of a limit to what the current LLM architectures can do and we’re sort of bumping up against that in various — I mean, another thing, we’ve had multimodal models that are much better, so we can do real-time voice and we can do images, so there’s new things it can do. But in terms of just the increase of overall reasoning capability, it doesn’t seem like we’ve had a big jump, really since March of 2023 when GPT-4 came out, and so I’m not going to make a strong prediction because again, it could come out next month and amaze me, but every month that ticks by I get a little bit more wondering what’s going on.

What do you think is the limitation? Is it data, compute or is it just a fundamental limitation of the transformer architecture?

TL: My guess is it’s a fundamental limitation of the transformer architecture, and I think the main issue is that the transformer architecture requires all of the model state to be in these vectors for individual words, and then it keeps a record of that forever — the whole context, there’s no process where you summarize and abstract away. If you think about your life, you think about something that happened ten years ago, you don’t remember every single thing you said, everything that others said, you have an abstract memory that, “Oh, in 2014 I remember I lived in this place and I had this job”, and things you learn kind of work their way into the brain, but it’s organized in a good way. LLMs just don’t have a way to do that.

So if I think about how people expect that at some point you’re going to have an LLM who’s like a personal assistant, who maybe will work with you over your career and know all your habits and make all your appointments and stuff, then to do that, I just think this architecture, where you remember every token exactly, do attention over that whole corpus, and don’t have any way of synthesizing and abstracting and forgetting unimportant things, just as a computer scientist, that doesn’t seem viable to me…
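Lee's contrast between keeping the whole context and abstracting it can be pictured with a toy sketch. This is an assumption-heavy illustration, not how any particular LLM is implemented: it simply compares a memory that retains every token, the way attention over a full context does, with one that folds history into a bounded summary (crudely approximated here by truncation).

```python
# Toy illustration (assumptions only, not an actual LLM implementation) of the
# contrast Lee describes: an assistant that keeps every token of its history
# versus one that compresses old interactions into a short abstract summary.

def full_context_memory(history: list[str], new_message: str) -> list[str]:
    """Transformer-style memory: every past token stays in the context forever."""
    return history + [new_message]

def summarizing_memory(summary: str, new_message: str, max_len: int = 120) -> str:
    """Human-style memory: fold the new message into a bounded summary.
    Here 'summarization' is just truncation; a real system would distill meaning."""
    combined = (summary + " | " + new_message).strip(" |")
    return combined[-max_len:]  # keep the summary bounded regardless of history length

messages = [f"appointment {i} on day {i}" for i in range(1000)]

ctx: list[str] = []
summ = ""
for m in messages:
    ctx = full_context_memory(ctx, m)
    summ = summarizing_memory(summ, m)

print(len(" ".join(ctx)))  # grows without bound as the relationship gets longer
print(len(summ))           # stays fixed, at the cost of forgetting detail
```

The full-context memory grows without bound while the summary stays fixed, which is a crude way of seeing why an assistant that must recall every token of a multi-year relationship looks impractical under this architecture.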

Do you think there’s a bubble now then?

TL: That’s always a hard question to say. Part of what’s hard about bubbles is that often people start calling a bubble pretty early and then the bubble keeps growing and people keep saying there’s a bubble.

Right. If people think there’s a bubble, there is not a bubble, that’s my heuristic.

TL: Well, there’s that, but also, at some point, the stock or the house price or whatever will peak and then go down, and the people who said it was a bubble right at the top will be right, but some people who called it way at the beginning were probably wrong.

I do expect a period where AI gets overly frothy and then crashes. Whether we’re currently there or just headed for that, is a little hard to say. I do not expect a dot-com bust level expansion, because as you were saying, I do think that this technology has clear benefits, it’s mostly big technology companies, it’s not as venture-funded. In fact, some of the early really crazy-funded companies have already been acquired.

So, yeah, I think the level of hype right now is a little too high and there’ll be some pullback, but I don’t think you’ll see a big crash and I don’t think you’ll see much of a pullback from deployment, because I think there really is enough value here that there’s going to be a big market for a lot of people working on it, and a lot of valuable stuff will come out of it in a pretty direct way.

I saw a new theory this week that actually really resonated with me. So this might be new to you, so I’m going to drop it to you on the spot. I think the big question on if you’re thinking about bubbles, you go back to a Carlota Perez model of the importance of bubbles and driving, you go back to the dot-com era, the really important part was the telecoms build out, which was, at the time, some people called it, and in retrospect, clearly insane. If you’re rolling out all this fiber and everyone’s doing it, the costs are going to go to zero, you’re all going to go bankrupt because it’s all financed by debt, as large infrastructure usually is. But the long-term payoff from that was massive, right? That, basically, booted off the whole Web 2.0 era where now everyone, suddenly, had broadband. Recessions suck, but there was a huge societal benefit that did come from that build out.

You go back to previous ones, whether it be electricity or steam, you had these similar cycles and the big question was, “What’s the societal beneficial output of an AI bubble if there is a bubble?” and chips never quite fit, because chips wear out and chips get better. So, if you buy a bunch of chips, but they’re five-year-old chips, what’s the benefit there? Doug O’Laughlin put this tweet out here, that has been really striking to me. He said, “Internet Bubble:Telecom::AI:Power/DCs”, and to me, that makes sense. If you’re going to actually build more nuclear power, or you’re going to do massive investments in solar and batteries, or whatever it might be to fuel these sorts of things, those are investments that, 1) can definitely make you go bankrupt because you’re taking out a bunch of debt to fund it, but 2) will retain value for many, many, many years to come. What do you think of that analogy? To me, it seems pretty compelling.

TL: Yeah, I one hundred percent agree with that. I mean, I was actually going to say the part of it that seems most bubbly is this stuff about Microsoft leasing Three Mile Island for 20 years. Again, going back to what we were talking about before, “Do I think the scaling law thing is going to run out of steam?”, my guess is it probably will. I don’t know if we’re on the verge of that, but, anyway, so I would not be surprised if people look back ten years from now, and say, “Oh, man, all that money companies spent on data centers and power, that was kind of a waste of money”. But then, like you said, the country needs more power, and at some point, probably, we’ll want to be training really big models and so, if we have a bunch of huge data centers that we can use to train models, probably, we’ll get some value out of that. It’s tech companies spending the money so the social cost is probably not that high.

5. 7% of Book Value; 1x EBITDA; Cash is 2.5x Larger than Market Cap – Dirtcheapstocks

Highlands REIT, Inc. (Ticker HHDS) was created in 2016 when it was spun out of InvenTrust Properties Corp.

HHDS was formed to hold non-core assets of InvenTrust.

Today, HHDS owns 13 apartment houses, 3 retail properties, 1 office property and 1 correctional facility…

…HHDS has:

  • $205MM of book value.
  • $16.7MM of net operating income (NOI) in 2023.
  • $17MM of NOI in 2022.
  • $85MM of net debt.
  • 57% of NOI generated from multifamily assets

What do you think? Is Highlands worth book value? Is it worth half of book value?

If we want to value the business at an 8 cap, the equity must be worth $124MM.

Within the last two weeks, HHDS has been valued as low as $14.4MM.

That’s less than 1x NOI, and 7% of book value…
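The arithmetic behind that $124MM figure is straightforward to reproduce. Below is a quick sketch using the NOI, net debt, and book value quoted above; the 8% cap rate is the author's assumed valuation yardstick, not a market quote.

```python
# Back-of-envelope valuation from the figures quoted in the article.
noi = 16.7e6        # 2023 net operating income
net_debt = 85e6     # net debt
book_value = 205e6  # book value of equity
cap_rate = 0.08     # the author's assumed "8 cap"

property_value = noi / cap_rate            # ~$209MM of implied real estate value
equity_value = property_value - net_debt   # ~$124MM left over for shareholders

recent_market_cap = 14.4e6                 # recent low valuation cited in the article
print(f"Implied equity value: ${equity_value/1e6:.0f}MM")
print(f"Market cap vs implied equity: {recent_market_cap/equity_value:.0%}")
print(f"Market cap vs book value: {recent_market_cap/book_value:.0%}")
```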

…Most companies valued at $14MM might have a few hundred shareholders of record. Apple is valued at $3.5 Trillion, and it has 23,000 record holders.

Highlands has 143,000 record holders…

…Here’s my theory: When Highlands was spun out of InvenTrust, every shareholder was given ownership individually. There are 143,000 separate people/entities that own this stock. And this stock was an afterthought. It was just a few noncore assets being spun out of a $2 billion REIT…

…HHDS, perhaps wanting to ward off future material purchases by Mackenzie, announced a tender offer in October 2023. While Mackenzie was tendering at $0.04/share earlier that summer, HHDS was willing to pay $0.12 – $0.17/share. What’s more, HHDS was committing $20MM to the share buyback.

HHDS would repurchase 13-19% of its shares if fully subscribed.

A few weeks later, HHDS increased the buyback to $25MM!

In the end, $23.7MM was spent to buy in 169MM shares – nearly 20% of the outstanding share count…
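As a rough sanity check on those buyback numbers (the implied pre-buyback share count below is only a back-out from the "nearly 20%" figure, not a reported number):

```python
# Rough check of the tender-offer figures quoted above.
spent = 23.7e6
shares_bought = 169e6
avg_price = spent / shares_bought             # ~$0.14/share, inside the $0.12-$0.17 range
implied_shares_before = shares_bought / 0.20  # "nearly 20%" implies roughly 845MM shares pre-buyback
print(f"Average price paid: ${avg_price:.3f} per share")
print(f"Implied pre-buyback share count: ~{implied_shares_before/1e6:.0f}MM")
```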

…HHDS showed up as an expert market security, even though it’s SEC registered.

But I found that the traditional expert market brokers couldn’t buy shares.

Then I went to alternative market brokers. They’d be happy to take my money, and told me I could get as much volume at $0.10 as my heart desired.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Alphabet (parent of Google), Apple, Meta, Microsoft, and MongoDB. Holdings are subject to change at any time.

What We’re Reading (Week Ending 01 December 2024)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership-audience for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general. 

Here are the articles for the week ending 01 December 2024:

1. America, China, and the Death of the International Monetary Non-System – Russell Napier

Something changed in America in the 1990s. The U.S. federal funds rate began a decline from above 5 percent to reach the effective zero bound by 2009. U.S. ten-year Treasury yields declined from above 6 percent to levels not even recorded during the Great Depression. Credit to the U.S. nonfinancial corporate sector rose from 56 percent of GDP to a new all-time high of 87 percent, and U.S. Government debt rose from 60 percent of GDP to a recent high of 106 percent, very near the peak level recorded during World War II. The valuation of U.S. equities rose from a cyclically adjusted price-to-earnings ratio (CAPE) of 15x to the current level of 34x, having reached a new all-time high of 44x in 2000. U.S. tangible investment declined from 7 percent of GDP to as low as just 1 percent of GDP, a level only previously recorded in the Great Depression and briefly in the hiatus of investment after World War II…

…Today, we have an international monetary system that does not have a name…

…It is a non-system to the extent that its terms and conditions were never agreed upon by all the parties involved, but instead it was born from choices made by a few, most notably China, that the other parties accepted and adjusted to. The extremes of interest rates, debt levels, asset price valuation, and investment in tangible assets in the United States are just part of that global adjustment to the new international monetary system that grew from China’s unilateral decision to manage its exchange rate beginning in 1994. This system would never have been agreed to in any negotiation, as it was a system replete with distortions that would lead to dangerously large imbalances with dangerous political ramifications…

…The crucial distortion imposed by China’s decision in 1994 was a decoupling of developed world growth rates from interest rates, the discount rates used in asset valuations, which many assumed to be a new normal. When interest rates appear to be permanently depressed relative to growth rates, asset valuations rise, leverage increases, and investors are incentivized to pursue gain through rising asset prices rather than through investment in new productive capacity. The decoupling of growth and interest rates was driven by the People’s Bank of China’s (PBOC) appearance as a non-price-sensitive buyer of U.S. Treasury securities, and indirectly by the role China’s excessive fixed-asset investment played in reducing global inflation and hence interest rates…

…For developed-world companies facing the cheap resources, cheap finance, and cheap exchange rate of China, there was little incentive to invest in tangible assets at home. In the United States, in particular, where companies are managed to maximize return on equity and returns to shareholders, the corporation was able to benefit from both cheap Chinese production and the low interest rates that allowed balance sheets to be levered to buy back equity. In other countries, with different social contracts and less focus on rewarding management via stock options, closing productive capacity and pursuing financial engineering were more difficult. Thus, it was U.S. corporations that most fully adapted to the new international monetary system.

When the Bretton Woods system was established, severe restrictions were placed on the free movement of capital. The architects of that system recognized that maintaining exchange rate stability would not be possible if capital were allowed to move freely. Our current system permits, at least into and within the developed world, the free movement of capital. In this system, the private sector capital that left the developed world for China was transformed, via PBOC exchange rate inter­vention, into an accumulation of developed-world debt securities financed by the creation of renminbi reserves…

…China’s inability to run sufficient surpluses since 2014 to generate sufficient broad money growth and prevent the escalation of its already high debt-to-GDP ratio is not widely recognized as a similar problem. Yet China’s move to a flexible exchange rate to avoid a debt deflation and create sufficient growth in broad money to reduce its debt burden will end the non-system as surely as President Nixon’s announcement that the U.S. dollar was no longer linked to gold ended Bretton Woods. Few analysts understand the impact that this move will have on the international monetary system and the long-accumulating distortions to credit, money, asset prices and the global economy.

When China moves to a flexible exchange rate, it is difficult to foresee how just one new international monetary system could replace the non-system. Given current geopolitical tensions, the prospect of China and the United States hashing out a new Bretton Woods–style agreement is highly unlikely…

…Predicting how any new U.S.-centric monetary system will develop is not easy, but such a system must allow for excessively high debts, the legacy of the non-system, to be inflated away. While much of the focus is on the high U.S. total nonfinancial debt-to-GDP ratio of 255 percent, there are many countries in the world struggling under even higher debt ratios: Canada, 311 percent; France, 315 percent; Japan, 400 percent; Netherlands, 316 percent; Switzerland, 297 percent, etc. The rise and rise of debt-to-GDP levels, a product of the gap between interest rates and growth rates under the non-system, will now have to be addressed.

With austerity, default, hyperinflation, or very high real GDP growth unlikely to be the solution, a new global monetary system will have to be created that offers a path of moderation toward reducing debt-to-GDP levels. That path of moderation is likely to take the form of financial repression—such as that imposed upon savers in the aftermath of World War II, to force their savings to fund the investment needed for postwar reconstruction, but at interest rates that did not reward them for the current and expected levels of inflation. That is a world in which bankers will create more credit and more money and more inflation than they have in recent decades. Higher nominal GDP growth combined with imposed purchases of low-yielding debt securities will, over time, reduce debt-to-GDP levels, just as it did in the decades following World War II. Whatever the new international monetary system looks like, it will have to accommodate the financial repression that will finally begin to reduce debt-to-GDP levels…

…In the long period in which developed-world debts will have to be inflated away, policymakers will have to take a view as to which section of society will bear the heaviest cost. One of the quickest and least painful ways to enforce a deleveraging is through encouraging a rapid re-equitization of the private sector. The ability of all corporations to deduct interest expense in calculating their taxes has to be reconsidered. In an era when much greater fixed-asset investment is essential, the tax privilege of deducting interest expense should not be available to corporations using debt to lever up an existing income stream; rather, the tax code should reward corporations using debt to build new businesses and new income streams. There are of course losers from such a change in taxation, but they are those who have been the winners from the prolonged period of falling interest rates and rising asset prices that have been the key feature of our now failing non-system. A long financial repression is in nobody’s interest, and the longer it prevails, the more likely it will create wealth redistributions that threaten social stability. Proactive intervention to force re-equitization upon a small section of society through the withdrawal of a tax privilege is painful for some but is a more equitable path to reducing high debt-to-GDP levels while facilitating greater investment.

To reduce the high and dangerous debt-to-GDP ratios of the developed world, nominal GDP must grow faster than total credit. This can be achieved by increasing the growth rate in bank credit while limiting the growth in nonbank credit. While the non-system was a key driver of the rise and rise of debt-to-GDP, the disintermediation of credit also played a key role. It is commercial bankers who create money, and if nominal GDP growth is to remain at a high enough level to reduce debt-to-GDP levels, bank balance sheets must grow faster than they have over the past three decades. Commercial banks create money when they expand their balance sheets, and if they do not create enough money, nominal GDP growth will remain low while credit growth, spurred by the growth in nonbank credit, can remain high. A combination of faster growth in bank credit combined with the restriction of the growth in nonbank credit will be at the core of reducing debt-to-GDP ratios. The targeted ending of interest deductibility in the computation of corporate income tax, mentioned earlier, can assist in promoting the growth in bank credit and hence money at the expense of growth in nonbank credit. If it is bankers who are at the vanguard of funding the necessary investment renaissance in the United States, and not credit markets, then the move to lower debt-to-GDP levels will be less painful than if we are forced to take the hard path of austerity, default, hyperinflation, or a very long financial repression. A new focus on the growth of bank credit and therefore money is at the core of any policy to reduce dangerously high debt-to-GDP ratios.
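The mechanism Napier is describing, where debt-to-GDP ratios fall whenever nominal GDP compounds faster than total credit, is easy to see with stylized numbers. The growth rates in the sketch below are illustrative assumptions, not figures from the essay.

```python
# Stylized financial-repression arithmetic: if nominal GDP grows faster than
# total credit, the debt-to-GDP ratio drifts down without austerity or default.
# All growth rates here are made-up assumptions for illustration.
debt_to_gdp = 2.55         # starting ratio, e.g. 255% of GDP
credit_growth = 0.04       # total credit grows 4% per year
nominal_gdp_growth = 0.07  # nominal GDP grows 7% per year (real growth plus tolerated inflation)

for year in range(1, 21):
    debt_to_gdp *= (1 + credit_growth) / (1 + nominal_gdp_growth)
    if year % 5 == 0:
        print(f"Year {year}: debt-to-GDP = {debt_to_gdp:.0%}")
```

Under these assumed rates, two decades of a three-point gap take the ratio from 255 percent down to roughly 145 percent, which is the kind of slow, repression-driven deleveraging the essay anticipates.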

2. Are U.S. Stocks Overvalued? – Ben Carlson

The S&P 500 is up nearly 90% since election day 2020 yet valuations are essentially identical.

How can that be?…

…Stock prices are up a lot but fundamentals have kept pace. In fact, the stock market has actually gotten less expensive over the past couple of years because of earnings growth…

…It’s also important to point out that much of the valuation premium on the S&P 500 comes from the largest stocks…

…These stocks have high valuations for good reason — they’re some of the best-run corporations in the world…

…The good news for valuation-conscious investors is there is plenty of value outside of the mega-cap stocks. Valuations for small and mid cap stocks are still pretty cheap. They are far less expensive now than they were before the pandemic. Maybe there’s a reason for that but stocks don’t get cheap for no reason.

3. Amazon’s Moonshot Plan to Rival Nvidia in AI Chips – Matt Day, Ian King, and Dina Bass

Nvidia’s biggest customers — cloud providers like Amazon Web Services, Microsoft Corp.’s Azure and Alphabet Inc.’s Google Cloud Platform — are eager to reduce their reliance on, if not replace, Nvidia chips. All three are cooking up their own silicon, but Amazon, the largest seller of rented computing power, has deployed the most chips to date…

…Fifteen years ago, the company invented the cloud computing business and then, over time, started building the infrastructure that sustains it. Reducing its reliance on one incumbent after another, including Intel Corp., Amazon ripped out many of the servers and network switches in its data centers and replaced them with custom-built hardware. Then, a decade ago, James Hamilton, a senior vice president and distinguished engineer with an uncanny sense of timing, talked Jeff Bezos into making chips…

…After almost four decades in the business, Hamilton knows taking Amazon’s chip ambitions to the next level won’t be easy. Designing reliable AI hardware is hard. Maybe even harder is writing software capable of making the chips useful to a wide range of customers. Nvidia gear can smoothly handle just about any artificial intelligence task. The company is shipping its next-generation chips to customers, including Amazon, and has started to talk up the products that will succeed them a year from now. Industry observers say Amazon isn’t likely to dislodge Nvidia anytime soon…

… The unit’s first chip was designed to power something called inference — when computers trained to recognize patterns in data make a prediction, such as whether a piece of email is spam. That component, called Inferentia, rolled out to Amazon’s data centers by December 2019, and was later used to help the Alexa voice assistant answer commands. Amazon’s second AI chip, Trainium1, was aimed at companies looking to train machine learning models. Engineers also repackaged the chip with components that made it a better fit for inference, as Inferentia2.

Demand for Amazon’s AI chips was slow at first, meaning customers could get access to them immediately rather than waiting weeks for big batches of Nvidia hardware. Japanese firms looking to quickly join the generative AI revolution took advantage of the situation. Electronics maker Ricoh Co., for example, got help converting large language models trained on English-language data to Japanese.

Demand has since picked up, according to Gadi Hutt, an early Annapurna employee who works with companies using Amazon chips. “I don’t have any excess capacity of Trainium sitting around waiting for customers,” he says. “It’s all being used.”

Trainium2 is the company’s third generation of artificial intelligence chip. By industry reckoning, this is a make-or-break moment. Either the third attempt sells in sufficient volume to make the investment worthwhile, or it flops and the company finds a new path. “I have literally never seen a product deviate from the three-generation rule,” says Naveen Rao, a chip industry veteran who oversees AI work at Databricks Inc., a purveyor of data and analytics software.

Databricks in October agreed to use Trainium as part of a broad agreement with AWS. At the moment, the company’s AI tools primarily run on Nvidia. The plan is to displace some of that work with Trainium, which Amazon has said can offer 30% better performance for the price, according to Rao. “It comes down to sheer economics and availability,” Rao says. “That’s where the battleground is.”…

…Amazon’s Trainium2 will likely be deemed a success if it can take on more of the company’s internal AI work, along with the occasional project from big AWS customers. That would help free up Amazon’s precious supply of high-end Nvidia chips for specialized AI outfits. For Trainium2 to become an unqualified hit, engineers will have to get the software right — no small feat. Nvidia derives much of its strength from the comprehensiveness of its suite of tools, which let customers get machine-learning projects online with little customization. Amazon’s software, called Neuron SDK, is in its infancy by comparison.

Even if companies can port their projects to Amazon without much trouble, checking that the switch-over didn’t break anything can eat up hundreds of hours of engineers’ time, according to an Amazon and chip industry veteran, who requested anonymity to speak freely. An executive at an AWS partner that helps customers with AI projects, who also requested anonymity, says that while Amazon had succeeded in making its general-purpose Graviton chips easy to use, prospective users of the AI hardware still face added complexity.

“There’s a reason Nvidia dominates,” says Chirag Dekate, a vice president at Gartner Inc. who tracks artificial intelligence technologies. “You don’t have to worry about those details.”…

…  “We’re particularly impressed by the price-performance of Amazon Trainium chips,” says Tom Brown, Anthropic’s chief compute officer. “We’ve been steadily expanding their use across an increasingly wide range of workloads.”

Hamilton says Anthropic is helping Amazon improve quickly. But he’s clear-eyed about the challenges, saying it’s “mandatory” to create great software that makes it easy for customers to use AWS chips.

4. Key Square Capital 2024 January letter – Scott Bessent and the Key Square team

In essence, a second Trump administration would be expected to embrace a “Peace Through Strength” trade policy. Of course, in the case of recalcitrant trade partners, Trump can always offer them a negotiating session with former US Trade Representative Robert Lighthizer who will likely play a prominent role in his second term.

Our base case is that a re-elected Donald Trump will want to create an economic lollapalooza and engineer what he will likely call “the greatest four years in American history.” Economist Ed Yardeni believes that post-Covid America has the potential to have a boom similar to the “Roaring Twenties” of a century ago. We believe that a returning President Trump would like this to be his legacy. In this scenario, the greatest risk factor, in our opinion, would be a sudden rise in long-end rates.

The talk of revenge will likely be limited to a small group of political enemies, and the wider policies of the administration will be oriented toward de-regulation, energy independence, reviving U.S. manufacturing and extending the tax cuts. We find it unlikely that across-the-board tariffs, as currently reported by the media, would be enacted at the same time as he moves to fix the immigration crisis. The tariff gun will always be loaded and on the table but rarely discharged. Of course, strategic and national security issues around China will remain.

Another differentiated view that we have is that Trump will pursue a weak dollar policy rather than implementing tariffs. Tariffs are inflationary and would strengthen the dollar – hardly a good starting point for a US industrial renaissance. Weakening the dollar early in his second administration would make U.S. manufacturing competitive. A weak dollar and plentiful, cheap energy could power a boom. The current Wall Street consensus is for a strong dollar based on the tariffs. We strongly disagree. A strong dollar should emerge by the end of his term if the US reshoring effort is successful.

5. Scott Bessent Sees a Coming ‘Global Economic Reordering.’ He Wants to Be Part of It – Peter Rudegeair and Gregory Zuckerman

In his first interview following his selection, Bessent said his policy priority will be to deliver on Trump’s various tax-cut pledges. Those include making his first-term cuts permanent, and eliminating taxes on tips, social-security benefits and overtime pay…

…Bessent became one of Trump’s closest advisers by adding depth to his economic proposals and defending his plans for more-activist trade policies. He has argued that the president-elect’s plans to extend tax cuts and deregulate parts of the U.S. economy would create an “economic lollapalooza.”…

…Bessent has long been worried about the U.S.’s heavy debt and thinks the main way it can be reduced is by boosting growth, which increases tax revenues.

He has advised Trump to pursue a policy he calls 3-3-3, inspired by former Japanese Prime Minister Shinzo Abe, who revitalized the Japanese economy in the 2010s with his “three-arrow” economic policy. Bessent’s “three arrows” include cutting the budget deficit to 3% of gross domestic product by 2028, spurring GDP growth of 3% through deregulation and producing an additional 3 million barrels of oil or its equivalent a day.

To get government spending under control, Bessent has advocated extending the 2017 Tax Cuts and Jobs Act but with what are called pay-fors to lower its cost. That would involve either reducing spending or increasing revenue elsewhere to offset the impact. He also proposed freezing nondefense discretionary spending and overhauling the subsidies for electric vehicles and other parts of the Inflation Reduction Act.

Earlier this year, Bessent thought about tariffs as a negotiating tool, telling investors in a letter that the “tariff gun will always be loaded and on the table but rarely discharged.” He has since argued for them more forcefully, especially as a source of tax revenue.

In a speech last month titled “Make the International Economic System Great Again,” Bessent argued for increasing tariffs on national-security grounds and for inducing other countries to lower trade barriers with the U.S.  


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Alphabet, Amazon, and Microsoft. Holdings are subject to change at any time.

What We’re Reading (Week Ending 24 November 2024)

The best articles we’ve read in recent times on a wide range of topics, including investing, business, and the world in general.

We’ve constantly been sharing a list of our recent reads in our weekly emails for The Good Investors.

Do subscribe for our weekly updates through the orange box in the blog (it’s on the side if you’re using a computer, and all the way at the bottom if you’re using mobile) – it’s free!

But since our readership-audience for The Good Investors is wider than our subscriber base, we think sharing the reading list regularly on the blog itself can benefit even more people. The articles we share touch on a wide range of topics, including investing, business, and the world in general. 

Here are the articles for the week ending 24 November 2024:

1. Cash! – The Brooklyn Investor

Over the past few years, people have kept talking about mean reversion to value and whatnot, but I have ignored that for the most part for the reasons I’ve been saying here. The growth / value spread just seems to me so much reflecting values being taken away from the old economy into the new one. Yes, sounds like 1999 bubble, but it just seems true. Retail just seems to be going down the drain, old school marketing / advertising just seems to be losing to online marketing etc…

…The massive transfer of wealth has been going on for decades, or more than a century. Industrialization just sucked the wealth and value out of skilled workers / craftsman and transferred it to large corporations via factories. Formerly skilled workers were transferred into factories that required no skill (therefore, lower income). All the value-added accrued to the owners of the factories (capitalists). Same with national chain restaurants and retail. WMT transferred wealth from the local shops / restaurants to Arkansas; former store-owners end up having to work at WMT for lower pay (as unskilled workers). This is nothing new.

Now, the same thing is happening at so many levels at the same time that it is quite frightening. Just as a simple example, I’ve mentioned this before, but companies like Squarespace and Wix (or free options like WordPress) have sort of wiped out a large part of the web development world. People who knew a little HTML / CSS / Javascript might have been able to make a living not too long ago, but not now. All that ‘wealth’ is transferred to the companies that provide the platform for people to build it themselves.

Photographers are complaining for similar reasons. You no longer need to hire a photographer for low-end projects. You can just buy photos from various photos sites for very low prices, or even have AI generate the exact photo you need. I have used AI to generate artwork, photos and text in various volunteer work, and it is scary. I thought to myself, jeez, I would have paid an art student $300 for this 3 years ago; now I do it for free online via AI…

…This is why when people say the stock market as a percentage of GDP is going up, the concentration of stocks in the market is getting too high etc., I think it is obvious that this is happening because the wealth and value is actually being more and more focused and concentrated, so the market is only reflecting reality…

…A similar group of very rich and smart people are saying that long term rates can’t stay low and they must move substantially higher due to these unsustainably large and growing federal deficits. Do I worry about that? Yes. But, I look to Japan as the model of an aging society and growing government deficits. Sure, there are plenty of differences (Japan is a high savings nation), but I still can’t get around the fact that slowing population growth and maturity of the U.S. economy would make growth harder to achieve going forward. Almost certainly, we can’t get back to the growth of the post-war baby boom generation. So given that, how do interest rates go up? Deficit-driven inflation? We haven’t really seen that in Japan, and even in the U.S. until Covid and Ukraine. So is the recent inflation really deficit-driven inflation? Or exogenous event-driven inflation? Maybe a combination of both.

This is not to say I don’t care about deficits. Of course it’s a problem, and we need to deal with it at some point. My opinion is just seeing things as an investor. I am just telling you why, as an investor, I am not yet concerned too much with the deficit and inflation.

2. Off The Beaten Path Investing – David Katunarić and Lawrence J. Goldstein

Goldstein: I started at Burnham when they had about 22 senior analysts following every industry in America, or so they thought. One day, after discovering the pink sheets (or actually, I found the pink sheets afterwards), I saw a list of trucking companies. It was in the Standard & Poor's transportation manual, which came out with weekly supplements to put in the looseleaf book. I got a list of every trucking company in the United States, and there must have been well over 50, maybe more, and every one of them had lower earnings or losses, except for four companies. Those four were Roadway Express, Denver Chicago Trucking, Merchant Fast Motor Lines and Overnite, spelled N-I-T-E. I called them first, and I ended up making a friend of J. Howard Cochran, the founder and president. At the beginning, he sent me a copy of his monthly financial statement. There were no rules against doing that. I remember they were printed in purple ink on a ditto machine. The first report he sent me was for the five months ended May. He had earned in those five months, per share, I remember, $1.86. He told me also that in the trucking business, the second half of the year is better than the first half. I said, "Let's see, five months $1.86, times 2 is over $3.60, and I'm missing a month and the second half is better, so it's got to be higher than that." The stock was $1.75 or $1.25 off it. I couldn't believe it. So I wrote a report and gave it to my boss, the head of research.

He said to me, and I can hear it to this day, "Listen, kids, this is an institutional research department. We don't write or recommend reports on dollar stocks." So I knew I was onto something. My boss was crazy. It ended up, by the way, that they earned almost $4 a share that year. I got to laugh, it's funny – I could buy the first share at $1.75, and I did. A number of years later, I think two decades later or less, Overnite sold out to, I think it was, the Southern Pacific Railway; they sold out for $100 million. This thing was worth $500,000 when I met them. So the pink sheets made sense as a place to look. Basically, what I came to do was to look left when everybody's looking right, look down when everybody's looking up, and find companies that are off the beaten path, overlooked or ignored by otherwise intelligent investors…
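To make the back-of-envelope arithmetic in that story concrete, here is a minimal sketch in Python. It uses only the figures from Goldstein's recollection ($1.86 of earnings per share over five months, a roughly $1.75 stock price, and earnings of almost $4 for the full year); the simple doubling and the straight-line annualization are our rough illustrations of his reasoning, not his exact method.

```python
# Rough sketch of the back-of-envelope math in Goldstein's Overnite story,
# using only the figures from his recollection above.

eps_5_months = 1.86   # earnings per share for the five months ended May
stock_price = 1.75    # roughly where he says he bought his first share

# His quick check: double the five-month figure (ignoring the sixth month
# and the seasonally stronger second half), which already tops $3.60.
quick_estimate = eps_5_months * 2

# A straight-line annualization (12/5ths of the five-month figure) is higher
# still, and the actual result, almost $4 a share, was higher again.
straight_line = eps_5_months * 12 / 5
actual_approx = 4.00

for label, eps in [("doubled", quick_estimate),
                   ("straight-line", straight_line),
                   ("actual (approx.)", actual_approx)]:
    print(f"{label:>16}: full-year EPS ~${eps:.2f}, "
          f"price/earnings at $1.75 ~ {stock_price / eps:.2f}x")
```

However you annualize it, the stock was quoted at well under one times a single year's earnings, which is why the price looked unbelievable to him.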

…Katunaric: What would you say, Larry, in these 40-some years that you've been managing Santa Monica Partners, how has your investing approach changed since then? What are some lessons that sparked the change?

Goldstein: It’s not changed at all, except that you don’t write to the SEC and ask for the 10-Ks and Qs and the proxy and have it take two weeks if you get it. Now you hit a keyboard and you get it all. That’s changed. The second thing is now there are people like you. There are a lot of people – I don’t mean you personally – who are on top of what’s called microcaps. So everybody’s searching for the goal. Obviously you’ve developed a business and you want to develop a bigger business. But that’s what happened. Competition that didn’t exist. When I did it, there was one firm that got big, Tweedy Browne. You know them? What happened to them was terrible. They got so big they had to buy ordinary stocks…

…Goldstein: When I bought Mastercard, it was not a huge company. When they went public, if I remember right, it was $39, $38, $37. I can’t remember the exact price, and it’s since split 10-for-1. So my cost is, I guess, $3 and change. I forget the exact split. I have to look it up. Let’s say it’s $10, $15 – but I think my cost is less than $15.

Katunaric: I saw somewhere that it was a hundred-bagger since the IPO. Maybe I read it last year. I think it was one of the best-performing ones, but I'm not sure.

Goldstein: I'll focus on that for a second. The reason I bought it was that in 1971, I went to my boss, Tubby Burnham, and I said, "There's a business that's going public, Madison Avenue." Madison Avenue is where all the advertising agencies were in New York, every one of them. The company that was going public was the second ad agency to go public. The first one was a company called Papert, Koenig, and Lois. They had been public for some period of time and the stock did okay. The second one was Batten, Barton, Durstine, and Osborn, which subsequently changed its name to BBD&O, and then (it's the same company) to Omnicom, which is the world's first- or second-largest advertising agency. Why did I want to buy it? I said to my boss, "Advertising companies are required if you have a consumer product to sell. It's a royalty company. They get a royalty on every new consumer product that's marketed to the world." That's what I thought it was. If you're going to sell a new widget, you want to advertise it. They get a cut of that. So, a great business. I said, "That's exactly what Mastercard is." Everything that anybody buys, they get a cut. By the way, there's no risk to their business. They don't make loans. Banks make loans. They get a cut. Banks have risk, but Mastercard, it's like every time you turn on the water, you get a free glass…

…I tell you, the biggest recommendation to me, and the biggest thing I don’t believe or understand is, Warren Buffett, he has never bought it, except for himself when he was a kid. He bought Oxy. I don’t know that much about Occidental, but there’s nothing better than TPL if you want to be in the oil business. They just own the stuff and you can take it out at your cost and pay them not only for that, but the right to get to the well and leave the well and for the water for fracking. If you run a hose or a pipeline, pay them. What better business is there than that? None.

Katunaric: I agree. You pitched me TPL extensively yesterday and the asset light nature of the business was really attractive.

3. Here’s How Trump Could Lose the Coming Trade War – Paul Krugman

All indications are that China’s era of torrid economic growth is behind it. For decades, Chinese growth was fueled mainly by two things: a rising working-age population and rapid productivity growth driven by borrowed technology. But the working-age population peaked around a decade ago and is now falling. And despite some impressive achievements, the overall rate of technological progress in China, which economists measure by looking at “total factor productivity,” appears to have slowed to a crawl…

…China, however, has built an economic system designed for the high-growth era — a system that suppresses consumer spending and encourages very high rates of investment.

This system was workable as long as supercharged economic growth created the need for ever more factories, office buildings and so on, so that high investment could find productive uses. But while an economy growing at, say, 9 percent a year can productively invest 40 percent of G.D.P., an economy growing at 3 percent can’t.
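To see the arithmetic behind that claim, here is a small, hedged sketch. It treats the ratio of the investment share of GDP to the growth rate (a crude incremental capital-output ratio) as the yardstick; the 9 percent, 40 percent, and 3 percent figures come from the excerpt, while the assumption that the ratio should stay roughly constant is ours, purely for illustration.

```python
# Rough illustration of why a 40%-of-GDP investment rate that made sense at
# 9% growth does not make sense at 3% growth. Uses the incremental
# capital-output ratio (ICOR) = investment share / GDP growth rate as a
# crude yardstick; the constant-ICOR assumption is only for illustration.

investment_share_high_growth = 0.40   # share of GDP invested (from the excerpt)
growth_high = 0.09                    # ~9% annual GDP growth
growth_low = 0.03                     # ~3% annual GDP growth

icor = investment_share_high_growth / growth_high
print(f"Implied ICOR in the high-growth era: {icor:.1f}")            # ~4.4

# If that ratio of capital to extra output held, 3% growth would justify
# an investment share of only about:
sustainable_share = icor * growth_low
print(f"Investment share consistent with 3% growth: {sustainable_share:.0%}")  # ~13%

# Keeping investment at 40% of GDP while growing at 3% instead implies an
# ICOR of about 13, i.e. far more capital per unit of extra output, which
# is another way of saying much of that investment is wasted.
print(f"ICOR implied by 40% investment at 3% growth: "
      f"{investment_share_high_growth / growth_low:.1f}")
```

In other words, under this rough yardstick, 3 percent growth justifies an investment share closer to 13 percent of GDP; holding it at 40 percent means ever more capital chasing ever less additional output.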

The answer seems obvious: redistribute income to households and reorient the economy away from investment toward consumption. But for whatever reason, China’s government seems unwilling to move in that direction…

…So what do you do if you have lots of capacity but your consumers can’t or won’t buy what you make? You try to export the problem, keeping the economy humming by running huge trade surpluses…

…China appears to be exporting close to $1 trillion more than it imports, and the trend is upward.

Hence the coming trade war. The rest of the world won’t passively accept Chinese surpluses on that scale…

…That’s why the Biden administration has been quietly pursuing a quite hard line on China, retaining Trump’s tariffs and trying to limit its progress in advanced technologies. It’s why the European Union has imposed high tariffs on electric vehicles made in China, which is probably only the beginning of expanded trade conflict…

…Trump’s insistence that tariffs don’t hurt consumers — even as businesses across America are planning to raise prices when his planned tariffs hit — strongly suggests that neither he nor anyone he listens to understands how global trade works. Not a good thing at a time of trade conflict.

4. Is the United States Going Broke? – Ben Carlson

There seem to be two extreme views when it comes to government debt levels.

One is the view that government debt doesn’t really matter all that much since we have the global reserve currency and the ability to print as much of that currency as we’d like.

The other view is that government debt levels are reaching a tipping point that will lead to calamity…

…It is true that U.S. government debt is enormous…

…Total government debt in the United States was around $23 trillion heading into the pandemic so debt levels are up 50% or so this decade alone.

It’s also true that the interest we pay on government debt has risen considerably because we’ve taken on so much and interest rates are so much higher than they were in the 2010s…

…But you can’t look at debt levels on their own. You have to think of them through the lens of a $30 trillion U.S. economy.
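As a quick illustration of what viewing debt "through the lens of a $30 trillion economy" means, here is a minimal sketch using only the round numbers in the excerpt (about $23 trillion of debt pre-pandemic, up roughly 50% since, against a roughly $30 trillion economy); the precise current figures will differ, so treat this as order-of-magnitude only.

```python
# Order-of-magnitude sketch of the debt figures in the excerpt.
# All inputs are the round numbers quoted above, not precise data.

debt_pre_pandemic = 23e12      # ~$23 trillion heading into the pandemic
growth_since = 0.50            # "up 50% or so this decade alone"
gdp = 30e12                    # "a $30 trillion U.S. economy"

debt_now = debt_pre_pandemic * (1 + growth_since)   # ~$34.5 trillion

print(f"Implied current debt: ${debt_now / 1e12:.1f} trillion")
print(f"Debt-to-GDP: {debt_now / gdp:.0%}")

# The same ratio logic applies to interest expense: a dollar figure that
# sounds enormous in isolation is more informative when scaled by GDP,
# which is the chart Carlson refers to next.
```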

Here is interest expense as a percentage of GDP:…

…It’s shot up considerably in recent years but it’s still below 1990s levels. The Fed cutting interest rates should help on the margins…

…Spending was 45% of GDP during the pandemic. That was obviously unsustainable but things are now back to normal…

…The thing you have to understand is the United States government does not operate like a household when it comes to debt. You pay your mortgage off over time and eventually retire that debt.

The government’s budget is not at all like a household budget. First of all, the government can print its own currency. That helps in a pinch and it’s the main reason our government can’t go broke. Inflation is the true constraint when it comes to politicians spending money.

As long as the economy is growing, debt should be growing too…

…I would be more worried if you told me government and consumer debt were down in the coming decades. That would mean something is seriously wrong with the economy.

Debt grows because assets grow (remember government debt is an asset in the form of bonds for investors). Debt grows because the economy grows. Income grows. Prices grow. So of course debt will rise. 

5. Wall Street’s Elites Are Piling Into a Massive AI Gamble – Neil Callanan, Gillian Tan, Tasos Vossos, Carmen Arroyo, and Immanual John Milton

While much of the speculative hype around AI has played out in the stock market so far, as seen in chipmaker Nvidia Corp.’s share price, the giddiness is spreading to the sober suits of debt finance and private equity.

Analysis by Bloomberg News estimates at least $1 trillion of spending is needed for the data centers, electricity supplies and communications networks that will power the attempt to deliver on AI’s promise to transform everything from medicine to customer service. Others reckon the total cost could be double that…

…Further proof of the "insatiable demand" for computing horsepower, according to real-estate broker Jones Lang LaSalle Inc., is the more than sevenfold increase over two years in construction work on US co-location centers, which lease out rack space to tech firms. Asking rents in those facilities have jumped as much as 37% in 12 months, the firm estimated in an August report.

All of this unbridled spending is revving up the issuance of both investment-grade debt and riskier leveraged loans, especially in the US, handily for private lenders and fee-starved investment bankers alike. Hedge funds are looking as well to profit from AI hysteria with novel types of debt structures.

It’s also opened up a new corner of the asset-backed securities market, where sales of debt backed by data centers have already jumped to a near-record $7.1 billion this year, according to data compiled by Bloomberg News. Chuck in fiber networks and other bits of kit, and it’ll be much higher. Matt Bissonette, who heads Guggenheim Securities’ business in this area, says the number of buyers for his data-center ABS products has roughly doubled in four years…

…While Blackstone hasn’t risked that kind of capital on construction before, developers of data centers can make stellar returns if all goes well. Property researcher Green Street reckons profit margins on London sites are about 65%.

Financiers are eager to back these grand projects because future occupants have usually pre-signed long leases, making them safer bets. Some banks are offering to lend as much as 70% or 80% of the cost and occasionally more when a lease is already signed, according to a person with knowledge of the matter…

…Lenders are more twitchy, however, about data centers explicitly earmarked for AI rather than more general purposes, according to a banker who works in the sector. Such deals can carry costlier debt and less leverage, he says, because the technology still has to prove its worth.

Separately, a senior partner at a leading private equity firm says he’s troubled by the emergence of speculative development, meaning construction takes place before a tenant has been found, as it’s hard to be sure of final demand. Some lawyers talk of “zombie projects” that may never be finished.

And not everyone believes that the “if you build it, they will come” approach is a surefire winner for those gambling on an era-changing AI breakthrough. Massachusetts Institute of Technology professor Daron Acemoglu says a lot of capital will be wasted.

Despite the misgivings, the appetite for deals from bankers and private lenders — especially for sites with blue-chip, signed-up occupants — is giving most data-center owners and developers a strong hand when pricing debt. A site leased long term by a tech giant can snag bank funding at a margin below two percentage points, says Brookland’s Hussain. Co-locators typically pay 2.5 percentage points or less, he adds.

“Recently, we raised €850 million ($907 million) in nine-year bonds at below 4% and refinanced and upsized our revolving credit facilities to $4.5 billion,” says Jordan Sadler, senior vice president at Digital Realty Trust Inc., a tech property firm that has signed joint ventures with Blackstone and others for almost $9 billion of hyperscale data-center developments…

…Across the Atlantic, one utility told the Federal Reserve Bank of Atlanta that electricity usage by data centers rose 17% in recent months. In Virginia, host to the world’s highest concentration of these sites, records for peak power demand were set six times in July, according to Dominion Energy Inc.

Trying to satisfy energy-devouring data centers means the utility sector’s capital spending is set to exceed $200 billion by next year, about double what it was a decade earlier. That would have stressed utility balance sheets, but a recent easing of how Moody’s Ratings views some of the industry’s riskier hybrid bonds — letting them be treated as half equity — has opened the floodgates to companies raising capital without being downgraded.
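To see why 50% equity treatment matters for a rating metric, here is a small hypothetical sketch; the balance-sheet figures are invented for illustration and are not drawn from any particular utility, and real rating-agency leverage calculations are more involved than this single ratio.

```python
# Hypothetical illustration of why 50% "equity credit" on hybrid bonds lets
# utilities raise capital without worsening a simple leverage ratio.
# All balance-sheet figures below are made up for illustration.

debt = 50.0      # existing debt, $bn (hypothetical)
equity = 50.0    # existing equity, $bn (hypothetical)
hybrid = 10.0    # new hybrid bond issue, $bn (hypothetical)

def debt_to_cap(d, e):
    """Simple debt / (debt + equity) leverage ratio."""
    return d / (d + e)

print(f"Before the issue: {debt_to_cap(debt, equity):.1%}")

# Treated entirely as debt, the hybrid pushes the ratio up...
print(f"Hybrid counted fully as debt: {debt_to_cap(debt + hybrid, equity):.1%}")

# ...but with 50% equity credit, half of it counts as equity, the ratio
# barely moves, and the issuer can raise capital without the metric
# tripping a downgrade threshold.
print(f"Hybrid with 50% equity credit: "
      f"{debt_to_cap(debt + 0.5 * hybrid, equity + 0.5 * hybrid):.1%}")
```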

Sales of these bonds have risen almost eightfold this year to $15 billion, data compiled by Bloomberg shows. Only issues by bulge-bracket banks match that.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. We currently have a vested interest in Wix. Holdings are subject to change at any time.