A Possible Scientific Explanation For Why Top-Down Control of Economies Is A Bad Idea

Economies are complex systems that exhibit unpredictable emergent behaviours.

Mitch Waldrop’s Complexity: The Emerging Science at the Edge of Order and Chaos, published in 1992, is one of the best books I’ve read in recent times. It describes the science behind complex adaptive systems and the work academics from numerous disciplines have done on the concept of emergence. I also think it contains a kernel of insight – and a possible scientific explanation – on why top-down control of economies is a bad idea.

Complexity and emergence

But first, what are complex adaptive systems? The following passages from Waldrop’s book are a neat summary of what they are:

“For example, every one of these questions refers to a system that is complex, in the sense that a great many independent agents are interacting with each other in a great many ways. Think of the quadrillions of chemically reacting proteins, lipids, and nucleic acids that make up a living cell, or the billions of interconnected neurons that make up the brain, or the millions of mutually interdependent individuals who make up a human society.

In every case, moreover, the very richness of these interactions allows the system as a whole to undergo spontaneous self-organization. Thus, people trying to satisfy their material needs unconsciously organize themselves into an economy through myriad individual acts of buying and selling; it happens without anyone being in charge or consciously planning it. The genes in a developing embryo organize themselves in one way to make a liver cell and in another way to make a muscle cell… In every case, groups of agents seeking mutual accommodation and self-consistency somehow manage to transcend themselves, acquiring collective properties such as life, thought, and purpose that they might never have possessed individually.

Furthermore, these complex, self-organizing systems are adaptive, in that they don’t just passively respond to events the way a rock might roll around in an earthquake. They actively try to turn whatever happens to their advantage. Thus, the human brain constantly organizes and reorganizes its billions of neural connections so as to learn from experience (sometimes, anyway)… the marketplace responds to changing tastes and lifestyles, immigration, technological developments, shifts in the price of raw materials, and a host of other factors.

Finally, every one of these complex, self-organizing, adaptive systems possesses a kind of dynamism that makes them qualitatively different from static objects such as computer chips or snowflakes, which are merely complicated. Complex systems are more spontaneous, more disorderly, more alive than that. At the same time, however, their peculiar dynamism is also a far cry from the weirdly unpredictable gyrations known as chaos. In the past two decades, chaos theory has shaken science to its foundations with the realization that very simple dynamical rules can give rise to extraordinarily intricate behavior; witness the endlessly detailed beauty of fractals, or the foaming turbulence of a river. And yet chaos by itself doesn’t explain the structure, the coherence, the self-organizing cohesiveness of complex systems.

Instead, all these complex systems have somehow acquired the ability to bring order and chaos into a special kind of balance. This balance point – often called the edge of chaos – is where the components of a system never quite lock into place, and yet never quite dissolve into turbulence, either. The edge of chaos is where life has enough stability to sustain itself and enough creativity to deserve the name of life. The edge of chaos is where new ideas and innovative genotypes are forever nibbling away at the edges of the status quo, and where even the most entrenched old guard will eventually be overthrown.”

Put simply, a complex adaptive system comprises many agents, each of which may be following only simple rules. But through the interactions between the agents, sophisticated outcomes spontaneously “emerge”, even when the agents were not instructed to produce these outcomes. This phenomenon is known as emergence. Waldrop’s book has passages that help shed more light on emergence, and also has an illuminating example of how an emergent behaviour takes shape:

“These agents might be molecules or neurons or species or consumers or even corporations. But whatever their nature, the agents were constantly organizing and reorganizing themselves into larger structures through the clash of mutual accommodation and mutual rivalry. Thus, molecules would form cells, neurons would form brains, species would form ecosystems, consumers and corporations would form economies, and so on. At each level, new emergent structures would form and engage in new emergent behaviors. Complexity, in other words, was really a science of emergence… 

…Cells make tissues, tissues make organs, organs make organisms, organisms make ecosystems – on and on. Indeed, thought Holland, that’s what this business of “emergence” was all about: building blocks at one level combining into new building blocks at a higher level. It seemed to be one of the fundamental organizing principles of the world. It certainly seemed to appear in every complex, adaptive system that you looked at…

…Arthur was fascinated by the thing. Reynolds had billed the program as an attempt to capture the essence of flocking behavior in birds, or herding behavior in sheep, or schooling behavior in fish. And as far as Arthur could tell, he had succeeded beautifully. Reynolds’ basic idea was to place a large collection of autonomous, birdlike agents—“boids”—into an onscreen environment full of walls and obstacles. Each boid followed three simple rules of behavior: 

1. It tried to maintain a minimum distance from other objects in the environment, including other boids.

2. It tried to match velocities with boids in its neighborhood.

3. It tried to move toward the perceived center of mass of boids in its neighborhood.

What was striking about these rules was that none of them said, “Form a flock.” Quite the opposite: the rules were entirely local, referring only to what an individual boid could see and do in its own vicinity. If a flock was going to form at all, it would have to do so from the bottom up, as an emergent phenomenon. And yet flocks did form, every time. Reynolds could start his simulation with boids scattered around the computer screen completely at random, and they would spontaneously collect themselves into a flock that could fly around obstacles in a very fluid and natural manner. Sometimes the flock would even break into subflocks that flowed around both sides of an obstacle, rejoining on the other side as if the boids had planned it all along. In one of the runs, in fact, a boid accidentally hit a pole, fluttered around for a moment as though stunned and lost—then darted forward to rejoin the flock as it moved on.”
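
The three rules above are simple enough to sketch in code. What follows is a minimal illustration in Python, not Reynolds’ original program; the neighbourhood radius, separation distance, and rule weights are arbitrary values chosen for the example. Note that, just as the passage emphasises, nothing in the code says “form a flock” – every rule is purely local:

```python
import random

# A minimal sketch of Reynolds' three boid rules. This is NOT Reynolds'
# original program; the neighbourhood radius, separation distance, and
# rule weights below are arbitrary values chosen for illustration.

def dist(a, b):
    """Euclidean distance between two 2-D points."""
    return ((a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2) ** 0.5

def step(positions, velocities, radius=10.0, sep_dist=2.0,
         w_sep=0.05, w_align=0.05, w_coh=0.005):
    """Advance every boid one time step using only local information."""
    new_velocities = []
    for i, ((px, py), (vx, vy)) in enumerate(zip(positions, velocities)):
        neighbours = [j for j in range(len(positions))
                      if j != i and dist(positions[i], positions[j]) < radius]
        for j in neighbours:
            nx, ny = positions[j]
            # Rule 1: steer away from neighbours that are too close.
            if dist((px, py), (nx, ny)) < sep_dist:
                vx += w_sep * (px - nx)
                vy += w_sep * (py - ny)
        if neighbours:
            n = len(neighbours)
            # Rule 2: match the average velocity of the neighbourhood.
            vx += w_align * (sum(velocities[j][0] for j in neighbours) / n - vx)
            vy += w_align * (sum(velocities[j][1] for j in neighbours) / n - vy)
            # Rule 3: steer toward the neighbourhood's centre of mass.
            vx += w_coh * (sum(positions[j][0] for j in neighbours) / n - px)
            vy += w_coh * (sum(positions[j][1] for j in neighbours) / n - py)
        new_velocities.append((vx, vy))
    new_positions = [(p[0] + v[0], p[1] + v[1])
                     for p, v in zip(positions, new_velocities)]
    return new_positions, new_velocities

# Scatter boids at random, as Reynolds did, and simply let the rules run.
random.seed(0)
positions = [(random.uniform(0, 20), random.uniform(0, 20)) for _ in range(30)]
velocities = [(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(30)]
for _ in range(50):
    positions, velocities = step(positions, velocities)
```

Run a simulation like this long enough and the boids clump and co-move even though no rule, and no central controller, ever told them to – which is the whole point of emergence.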

Emergence in the economy

In the first series of excerpts I shared from Waldrop’s book, it was hinted that an economy is a complex adaptive system. But this is not always true. Emergence is unlikely to happen in an economy with a very simple make-up. On the other hand, emergence is likely to occur in an economy in which the depth and variety of economic activity have increased over time. Here’s a relevant passage from Waldrop’s book:

“In fact, he argued, once you get beyond a certain threshold of complexity you can expect a kind of phase transition analogous to the ones he had found in his autocatalytic sets. Below that level of complexity you would find countries dependent upon just a few major industries, and their economies would tend to be fragile and stagnant. In that case, it wouldn’t matter how much investment got poured into the country. “If all you do is produce bananas, nothing will happen except that you produce more bananas.” But if a country ever managed to diversify and increase its complexity above the critical point, then you would expect it to undergo an explosive increase in growth and innovation – what some economists have called an “economic takeoff.””

This brings me to the topic behind the title and introduction of this article: Why top-down control of economies is a bad idea. An important aspect of emergence is that specific emergent phenomena in any particular complex adaptive system are inherently unpredictable. This applies to economies too. Given everything above, I think it stands to reason that any government that aims to exert top-down control over an economy that has grown in complexity would likely do a poor job. How can you control something well if you’re unable to predict its behaviour? 


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I do not have a vested interest in any companies mentioned. Holdings are subject to change at any time.

More Thoughts From American Technology Companies On AI

A vast collection of notable quotes on artificial intelligence, or AI, from the management teams of US-listed technology companies.

Nearly a month ago, I published What American Technology Companies Are Thinking About AI. In it, I shared commentary – given during earnings conference calls by the leaders of technology companies that I follow or have a vested interest in – on AI and how the technology could impact their industries and the business world writ large.

A few more technology companies I’m watching hosted earnings conference calls after the article was published. The leaders of these companies also had insights on AI that I think would be useful to share. Here they are, in no particular order:

Adobe (NASDAQ: ADBE)

Adobe’s management will be building foundation models as well as using generative AI as a co-pilot for users

Our generative AI strategy focuses on data, models and interfaces. Our rich datasets across creativity, documents and customer experiences enable us to train models on the highest quality assets. We will build foundation models in the categories where we have deep domain expertise, including imaging, vector, video, documents and marketing. We are bringing generative AI to life as a co-pilot across our incredible array of interfaces to deliver magic and productivity gains for a broader set of customers. 

Adobe’s management thinks its generative AI feature, Firefly, has multiple monetisation opportunities, but will only introduce specific pricing later this year; the focus right now is on driving broad adoption of Firefly

Our generative AI offerings represent additional customer value as well as multiple new monetization opportunities. First, Firefly will be available both as a stand-alone freemium offering for consumers as well as an enterprise offering announced last week. Second, copilot generative AI functionality within our flagship applications will drive higher ARPUs and retention. Third, subscription credit packs will be made available for customers who need to generate greater amounts of content. Fourth, we will offer developer communities access to Firefly APIs and allow enterprises the ability to create exclusive custom models with their proprietary content. And finally, the industry partnerships as well as Firefly represent exciting new top-of-funnel acquisition opportunities for Express, Creative Cloud and Document Cloud. Our priority for now is to get Firefly broadly adopted, and we will introduce specific pricing later this year.

Adobe is seeing outstanding customer demand for generative AI features

We’re really excited, if you can’t tell on the call, about Firefly and what this represents. The early customer and community response has been absolutely exhilarating for all of us. You heard us talk about over 0.5 billion assets that have already been generated. Generations from Photoshop were 80x higher than we had originally projected going into the beta and obviously, we feel really good about both the quality of the content being created and also the ability to scale the product to support that.

Adobe has built Firefly to be both commercially as well as socially safe for use

Third is that, and perhaps most importantly, we’ve also been able to — because of the way we share and are transparent about where we get our content, we can tell customers that their content generated with Firefly is commercially safe for use. Copyrights are not being violated. Diversity and inclusion is front and center. Harmful imagery is not being generated.

Adobe’s management believes that (1) marketing will become increasingly personalised, (2) the personalisation has to be done at scale, and (3) Adobe can help customers achieve the personalisation with the data that it has

I think if you look at Express and Firefly and also the Sensei GenAI services that we announced for Digital Experience, comes at a time when marketing is going through a big shift from sort of mass marketing to personalized marketing at scale. And for the personalization at scale, everything has to be personalized, whether it’s content or audiences, customer journeys. And that’s the unique advantage we have. We have the data within the audience — the Adobe Experience Platform with the real-time customer profiles. We then have the models that we’re working with like Firefly. And then we have the interfaces through the apps like Adobe Campaign, Adobe Experience Manager and so on. So we can put all of that together in a manner that’s really consistent with the data governance that people — that customers expect so that their data is used only in their context and use that to do personalized marketing at scale. So it really fits very well together.

Adobe’s management believes that content production will increase significantly in the next few years because of AI and this will lead to higher demand for more software-seats

And we’re sitting at a moment where companies are telling us that there’s a 5x increase in content production coming out in the next few — next couple of years. And you see a host of new media types coming out. And we see the opportunity here for both seat expansion as a result of this and also because of the value we’re adding into our products themselves, increase in ARPU as well.

DocuSign (NASDAQ: DOCU)

DocuSign’s management believes that generative AI can transform all aspects of the agreement workflow

In brief, we believe AI unlocks the true potential of the intelligent agreement category. We already have a strong track record leveraging sophisticated AI models, having built and shipped solutions based on earlier generations of AI. Generative AI can transform all aspects of agreement workflow, and we are uniquely positioned to capitalize on this opportunity. As an early example, we recently introduced a new limited-availability feature, agreement summarization. This new feature, which is enabled by our integration with Microsoft’s Azure OpenAI service and tuned with our own proprietary agreement model, uses AI to summarize a document’s critical components, giving signers a clear grasp of the most relevant information within their agreement, while respecting data security and privacy.

Some possible future launches of generative AI features by DocuSign include search capabilities across agreement libraries and edits of documents based on industry best practices

Future launches will include search across customer agreement libraries, extractions from agreements and proposed language and edits based on customer, industry and universal best practices.

DocuSign has been working with AI for several years, but management sees the introduction of generative AI as a great opportunity to drive significant improvements to the company’s software products

I’d add to that, that I think the biggest change in our road map beyond that clear focus and articulation on agreement workflow is really the advent of generative AI. We’ve been working on AI for several years. As you know, we have products like Insights that leverage earlier generations of AI models. But given the enormous change there, that’s a fantastic opportunity to really unlock the category. And so, we’re investing very heavily there. We released some new products, and we’ll release more next week at Momentum, but I’m sure we’ll talk more about AI during the call. 

DocuSign’s management sees AI technology as the biggest long-term driver of the company’s growth

So, I think we — overall, I would say, product innovation is going to be the biggest driver and unlocker of our medium- to long-term growth. We do believe that we have very credible low-hanging fruit from better execution on our self-serve and product-backed growth motion. And so, that’s a top priority to drive greater efficiency in the near to medium term. I think the AI impact is perhaps the biggest in the long term. And we are starting to ship products, as I alluded to, and we’ll announce more next week. But in terms of its overall impact on the business, I think it’s still behind the other two in the — in the near to medium term. But in terms of the long-term potential of our category of agreement workflow, I think it’s a massive unlock and a fantastic opportunity for DocuSign.

DocuSign’s management is currently monetising AI by bundling AI features with existing products in some cases, and charging for AI features as add-ons in others; management needs to learn more about how customers use the AI features before finalising pricing

In terms of monetization, I expect AI features to be both bundled as part of our baseline products, strengthening their functionality and value, as I suggested earlier. And in some cases, packaged as a separately charged add-on. We do both today. So, if you take our Insights product, which is really our AI-driven analytics product for CLM, we both have a stand-alone SKU. It’s sold separately as well as a premium bundle. I think, we’re going to need to learn a little bit more about how customers want to use this and what the key value drivers are before we finalize how we price the different features, but certainly mindful of wanting to capture the — deliver the most value and capture the most value for DocuSign, as we price it.

MongoDB (NASDAQ: MDB)

MongoDB’s management believes that AI will increase software development velocity and will enable more companies to launch more apps, leading to the speed of software development being even more important for companies

We believe AI will be the next frontier of development productivity — developer productivity and will likely lead to a step function increase in software development velocity. We know that most organizations have a huge backlog of projects they would like to take on but they just don’t have the development capacity to pursue. As developer productivity meaningfully improves, companies can dramatically increase their software ambitions and rapidly launch many more applications to transform their business. Consequently, the importance of development velocity to remain competitive will be even more pronounced. Said another way, if you are slow, then you are obsolete.

Companies are increasingly choosing MongoDB’s Atlas database service as the platform to build and run new AI apps

We are observing an emerging trend where customers are increasingly choosing Atlas as the platform to build and run new AI applications. For example, in Q1, more than 200 of the new Atlas customers were AI or ML companies. Well-financed start-ups like Hugging Face, [ Tekion ], One AI and [ Neura ] are examples of companies using MongoDB to help deliver the next wave of AI-powered applications to their customers.

MongoDB’s management believes that apps on legacy platforms will be replatformed to be AI-enabled, and those apps will need to migrate to MongoDB

We also believe that many existing applications will be replatformed to be AI enabled. This will be a compelling reason for customers to migrate from legacy technologies to MongoDB.

MongoDB’s management believes that in an increasingly AI-driven world, (1) AI will lead to more apps and more data storage demand for MongoDB; (2) developers will want to use modern databases like MongoDB to build; and (3) MongoDB can support wide use-cases, so it’s attractive to use MongoDB

First, we expect MongoDB to be a net beneficiary of AI, the reason being is that, as developer productivity increases, the volume of new applications will increase, which by definition will create new apps, which means more data stores, so driving more demand for MongoDB. Second, developers will be attracted to modern platforms like MongoDB because that’s the place where they can build these modern next-generation applications. And third, because of the breadth of our platform and the wide variety of use cases we support, that becomes even more of an impetus to use MongoDB. 

MongoDB’s management knows that AI requires vector databases, but thinks that AI still needs an operational datastore, which is where MongoDB excels in

The results that come from training an LLM against content are known as vector embeddings. And so content is assigned vectors and the vectors are stored in a database. These databases then facilitate searches when users query a large language model with the appropriate vector embeddings, and it’s essentially how a user search is matched to content from an LLM. The key point, though, is that you still need an operational data store to store the actual data. And there are some adjunct solutions out there that have come out that are bespoke solutions but are not tied to actually where the data resides, so it’s not the best developer experience. And I believe that, over time, people will gravitate to a more seamless and integrated platform that offers a compelling user experience…

..Again, for generating content that’s accurate in a performant way, you do need to use vector embeddings which are stored in a database. And you — but you also need to store the data and you want to be able to offer a very compelling and seamless developer experience and be able to offer that as part of a broader platform. I think what you’ve seen, Brent, is that there’s been other trends, things like graph and time series, where a lot of people are very excited about these kind of bespoke single-function technologies, but over time, they got subsumed into a broader platform because it didn’t make sense for customers to have all these bespoke solutions which added so much complexity to their data architecture. 
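
The mechanics described in the passage above – content assigned vector embeddings, a query matched to the nearest stored vector, and the actual data kept in an operational store alongside the vectors – can be sketched in a few lines. The example below is a toy illustration, not MongoDB’s implementation; the documents and three-dimensional “embeddings” are invented, whereas a real system would store vectors produced by a trained model:

```python
import math

# A toy illustration of vector-embedding lookup. The documents and their
# 3-dimensional "embeddings" below are made up for the example; a real
# system would use vectors produced by a trained embedding model.

def cosine_similarity(a, b):
    """Cosine of the angle between two vectors (1.0 = identical direction)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# The operational data store keeps the actual content AND its embedding,
# rather than splitting them across two bespoke systems.
documents = [
    {"text": "quarterly revenue report",  "embedding": [0.9, 0.1, 0.0]},
    {"text": "employee onboarding guide", "embedding": [0.1, 0.8, 0.2]},
    {"text": "product launch checklist",  "embedding": [0.2, 0.2, 0.9]},
]

def nearest(query_embedding, docs):
    """Return the stored document whose embedding best matches the query."""
    return max(docs,
               key=lambda d: cosine_similarity(query_embedding, d["embedding"]))

# A query whose (hypothetical) embedding points in roughly the same
# direction as the first document's embedding.
print(nearest([0.85, 0.15, 0.05], documents)["text"])  # quarterly revenue report
```

The design point in the quote is visible even in this sketch: because each record holds both the text and the vector, one query returns the actual content, with no second bespoke system to keep in sync.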

Okta (NASDAQ: OKTA)

Okta has been working with AI for a number of years and some of its products contain AI features

So when we look at our own business, one of our huge — we have AI in our products, and we have for a few years, whether it’s ThreatInsight on the workforce side or Security Center on the customer identity side, which look at our billions of authentications and use AI to make sure we defend other customers from like similar types of threats that have been prosecuted against various customers on the platform. 

Okta’s management thinks AI could be really useful for helping users auto-configure their Okta setup

One of the ideas that we’re working on that might be a typical use case of how someone like us could use AI is configuring Okta, setting the policy up for Okta across hundreds of applications on the workforce side or 10 or 20 applications on the customer identity side with various access policies and rules about who can access them and how they access them. It’d be pretty complicated to set up, but we’ve actually been prototyping using AI to auto-generate that configuration.

Okta’s management believes that AI will lead to higher demand for identity-use cases for the company

And then the other one we’re excited about is if you zoom out and you think this is a huge platform shift, it’s the next generation of technology. So that means that there’s going to be tons of new applications built with AI. It means that there’s going to be tons of new industries created and industries changed. And there’s going to be a login for all these things. You’re going to need to log on to these experiences. Sometimes it’s going to be machines. Sometimes it’s going to be users. That’s an identity problem, and we can help with that. So in a sense, we’re really going to be selling picks and shovels to the gold miners. 

Salesforce (NYSE: CRM)

Salesforce recently launched EinsteinGPT, a form of generative AI for customer relationship management

Last quarter, I told you of how our AI team is getting ready to launch EinsteinGPT, the world’s first generative AI for CRM. At Trailhead DX in March in front of thousands of trailblazers here in San Francisco, that’s exactly what we did. 

Salesforce announced SlackGPT, an AI assistant for users of the communication software Slack; management also believes that unleashing large language models within Slack can make the software incredibly valuable for users

We saw more of the incredible work of our AI team at our New York City World Tour this month when we demonstrated Slack GPT. Slack is a secure treasure trove of company data that generative AI can use to give every company and every employee their own powerful AI assistant, helping every employee be more productive in transforming the future of work. SlackGPT can leverage the power of generative AI, deliver instant conversation summaries, research tools and writing assistance directly in Slack. And you may never need to leave Slack to get a question answered. Slack is the perfect conversational interface for working with LLMs, which is why so many AI companies are Slack first and why OpenAI’s ChatGPT and Anthropic’s Claude can now use Slack as a native interface…

…I think folks know, I have — my neighbor Sam Altman is the CEO of OpenAI, and I went over to his house for dinner, and it was a great conversation as it always is with him. And he had — he said, “Oh, just hold on one second, Marc, I want to get my laptop.” And he brought his laptop out and gave me some demonstrations of advanced technologies that are not appropriate for the call. But I did notice that there was only one application that he was using on his laptop and that was Slack. And the powerful part about that was I realized that everything from day 1 at OpenAI have been in Slack. And as we kind of brainstorm and talked about — of course, he was paying a Slack user fee and on and on, and he’s a great Slack customer. We’ve done a video about them, it’s on YouTube. But I realize that taking an LLM and embedding it inside Slack, well, maybe Slack will wake up. I mean there is so much data in Slack, I wonder if it could tell him what are the opportunities in OpenAI. What are the conflicts, what are the conversations, what should be his prioritization. What is the big product that got repressed that he never knew about.

And I realized in my own version of Slack at Salesforce, I have over 95 million Slack messages, and these are all open messages. I’m not talking about closed messaging or direct messaging or secure messaging between employees. I’m talking about the open framework that’s going on inside Salesforce and with so many of our customers. And then I realized, wow, I think Slack could wake up, and it could become a tremendous asset with an LLM consuming all that data and driving it. And then, of course, the idea is that is a new version of Slack. Not only do you have the free version of Slack, not only do you have the per user version of Slack, but then you have the additional LLM version of Slack. 

Salesforce is working with luxury brand Gucci to augment its client advisers by building AI chat technology

A great example already deploying this technology is Gucci. We’re working with them to augment their client advisers by building AI chat technology that creates a Gucci-fied tone of service with an incredible new voice, amplifying brand storytelling and incremental sales as well. It’s an incredibly exciting vision for generative AI to transform what was customer service into now customer service, marketing and sales, all through augmenting Gucci employee capabilities using this amazing generative AI.

Salesforce’s management believes that Salesforce’s AI features can (1) help financial services companies improve the capabilities of their employees and (2) provide data-security for highly regulated companies when their data is used in AI models

But yesterday, there were many questions from my friend who I’m not going to give you his name because he’s one of the – the CEO of one of the largest and most important banks in the world. And I’ll just say that, of course, his primary focus is on productivity. He knows that he wants to make his bankers a lot more successful. He wants every banker to be able to rewrite a mortgage, but not every banker can, because writing the mortgage takes a lot of technical expertise. But as we showed him in the meeting through a combination of Tableau, which we demonstrated and Slack, which we demonstrated, and Salesforce’s Financial Services Cloud, which he has tens of thousands of users on, that banker understood that this would be incredible. But I also emphasize to him that LLMs, or large language models, they have a voracious appetite for data. They want every piece of data that they can consume. But through his regulatory standards, he cannot deliver all that data into the LLM because it becomes amalgamated. Today, he runs on Salesforce, and his data is secured down to the row and cell level.

Salesforce’s management believes that the technology sector experienced a “COVID super cycle” in 2020/2021 that made 2022 difficult for companies in the sector, but that the sector could see an acceleration in growth in the future from an “AI super cycle”

I just really think you have to look at 2020, 2021 was just this massive super cycle called the pandemic. I don’t know if you remember, but we had a pandemic a couple of years ago. And during that, we saw tech buying like we never saw. It was incredible and everybody surged on tech buying. So you’re really looking at comparisons against that huge mega cycle… 

…That’s also what gives me tremendous confidence going forward and that what we’re really seeing is that customers are absorbing the huge amounts of technology that they bought. And that is about to come, I believe, to a close. I can’t give you the exact date, and it’s going to be accelerated by this AI super cycle.

Salesforce is doing a lot of work on data security when it comes to developing its AI features

For example, so we are doing a lot of things at the basic security level, like we are really doing tenant-level isolation coupled with a zero-retention architecture at the LLM level. So the LLM doesn’t remember any of the data. Along with that, they — for them to use these use cases, they want to have — they have a lot of these compliances like GDPR, ISO, SOC, Quadrant, they want to ensure that those compliances are still valid, and we’re going to solve it for that. In addition, the big worry everybody has is people have heard about hallucinations, toxicity, bias, this is what we call model trust. We have a lot of innovation around how to ground the data on 360 data, which is a huge advantage we have. And we are able to do a lot of things at that level. And then the thing, which I think Marc hinted at, which is LLMs are not like a database. These intra-enterprise trust, even once you have an LLM, you can’t open the data to everybody in the company. So you need ability to do this — who can access this data, how is it doing both before the query and after the query, we have to build that.

Salesforce is importing 7 trillion reports into its Data Cloud to build AI features, and management believes this is a valuable trove of data

And by the way, with the Data Cloud, just in a month, we are importing more than 7 trillion reports into the data layer, which is a very powerful asset we have. So coupled with all of this is what they are looking for guidance on and how we think we can deliver significant value to our customers.

Salesforce’s management sees generative AI as a useful tool to help non-technical users write software

But you can also imagine, for example, even with Salesforce, the ability as we’re going to see in June, that many of our trailblazers are amazing low-code, no-code trailblazers, but soon they’ll have the ability to tap into our LLMs like ProGen and CodeGen that have the ability to code for them automatically. They aren’t coders. They didn’t graduate with computer science degrees.

The arc of progress that Salesforce’s management sees with AI: Predictive, then generative, then autonomous

So I think the way I see it is this: AI technologies are a continuum that is predictive, then generative, and the real long-term goal is autonomous. The initial version of the generative AI will be more in terms of assistance…

… And then I think the fully autonomous cases, for example, in our own internal use cases with our models, we are able to detect 60% of incidents and auto-remediate. That requires a little bit more fine-tuning and we’ll have to work with specific customers to get to that level of model performance. So I see this is just the start. The assistant model is the initial thing to build trust with a human in the loop and validate it. And then as the models get better and better, we’ll keep taking use cases where we can fully automate it.

AI has already improved the productivity of Salesforce’s software developers by at least 20% and management thinks the same productivity-boost can happen for Salesforce’s customers

But the other use cases, which we are going to see, and in fact, I have rolled out our own code elements in our engineering org and we are already seeing minimum 20% productivity…

…In some cases, up to 30%. Now a lot of our customers are asking for the same. We are going to roll out Einstein GPT for our developers in the ecosystem, which will help not only the local developers bridge the gap where there’s a talent gap but also reduce the cost of implementations for a lot of people. So there’s a lot of value.

Veeva Systems (NYSE: VEEV)

Veeva Systems recently announced an AI chatbot for field sales reps and management is not thinking about the chatbot’s monetisation at the moment

CRM Bot is an AI application for Vault CRM. You can think of it as ChatGPT for field teams…

…Yes, it’s really early right now. We’re focused on ensuring that we have the right product working with our customers. So that’s our focus right now. Let’s get the product right, and then we’ll get into more of the details on kind of the sizing and the opportunity there. But we’re excited overall about the opportunity we have in front of us …CRM bot will — that’s not an included product so that will have a license that will most likely be licensed by the user. So that will be net new. But as Brent mentioned, we’re focused on getting the product right and we don’t have pricing for that or sizing for that yet.

Veeva Systems’ management thinks that AI will not disrupt the company and will instead be a positive

Given the breadth and the nature of our industry cloud software, data, and services, AI will not be a major disruptor for our markets, rather it will be complementary and somewhat positive for Veeva in a few ways. We will develop focused AI applications where the technology is a good fit, such as CRM Bot for Vault CRM. The broadening use of AI will make our proprietary data assets, such as Link and Compass, more valuable over time because the data can be used in new ways. We will also make it easy for customers to connect their own AI applications with their Veeva applications, creating even more value from the Veeva ecosystem…

Veeva Systems’ management thinks that core systems of record will be needed even in the world of AI

I like our position as it relates to AI because we’re a core system of record. So that’s something you’re always going to need. I think that’s one thing that people should always understand. Core systems of record will be needed even in the world of AI. If I ask Brent, hey, Brent, do you think 10 years from now, you’ll need a financial system to manage your financials. He’s going to tell me, yes, I really need one, you can’t take it away. ChatGPT won’t do it for me, right? I’m making a joke there, but our customers have the same critical operational systems around drug safety, around clinical trials, around regulatory, around their quality processes. So those are always going to be needed.

Veeva Systems is focused on leveraging its proprietary data assets with AI to make them more valuable 

Now we are also building our data assets, and these are proprietary data assets, Link, Compass, and we’re building more data assets. Those will not be affected by AI, but AI will be able to leverage those assets and make them more valuable. So I think we’ll develop more — we’ll do basically three things. We’ll develop more applications over time. CRM Bot is the first. We got to get that right. We also will — our proprietary data will get more valuable.

Veeva Systems’ management wants to make it easy for customers to connect their own AI applications with the company’s software products

And the third thing we’ll do is make our applications fit very well when customers have their own proprietary AI applications. So especially the Vault platform, we’ll do a lot of work in there to make it fit really well with the other AI applications they have from other vendors or that they develop themselves, because it’s an open ecosystem, and that’s part of being Veeva. 


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have a vested interest in Adobe, DocuSign, MongoDB, Okta, Salesforce, and Veeva Systems. Holdings are subject to change at any time.

How Bad is Zoom’s Stock-Based Compensation?

On the surface, the rising stock-based compensation for Zoom looks bad. But looking under the hood, the situation is not as bad as it looks.

There seems to be a lot of concern surrounding Zoom’s rising stock-based compensation (SBC).

In its financial years 2021, 2022 and 2023, Zoom recorded SBC of US$275 million, US$477 million and US$1,285 million, respectively. FY2023 was perhaps the most worrying for investors as Zoom’s revenue essentially flat-lined while its SBC more than doubled.

But as mentioned in an earlier article, GAAP accounting is not very informative when it comes to SBC. When companies report SBC using GAAP accounting, they record the amount on the financial statements based on the share price at the time of the grant. A more informative way to look at SBC would be from the perspective of the actual number of shares given out during the year.

In FY2021, 2022 and 2023, Zoom issued 0.6 million, 1.8 million and 4 million restricted stock units (RSUs), respectively. From that point of view, it seems the dilution is not too bad. Zoom had 293 million shares outstanding as of 31 January 2023, so the 4 million RSUs issued resulted in only 1.4% more shares.

What about down the road?

The number of RSUs granted in FY2023 was 22.1 million, up from just 3.1 million a year before. The big jump in FY2023 was because the company decided to give a one-time boost to existing employees. 

However, this does not mean that Zoom’s dilution is going to be 22 million shares every year from now. The FY2023 grant was probably a one-off that will not recur, and the grants will vest over a period of three to four years.

If we divide the extra RSUs given in FY2023 by their 4-year vesting schedule, we can assume that around 8 million RSUs will vest each year. This will result in an annual dilution rate of 2.7% based on Zoom’s 293 million shares outstanding as of 31 January 2023.
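The dilution arithmetic above can be sketched in a few lines of Python. The figures come from the article itself; the even four-year vesting spread behind the 8 million-per-year estimate is an assumption:

```python
# Rough dilution arithmetic for Zoom's RSU grants, using figures
# quoted in the article. The ~8 million RSUs vesting per year is the
# article's estimate (an assumed even spread of grants over 4 years).

shares_outstanding = 293_000_000    # shares as of 31 January 2023
rsus_vesting_per_year = 8_000_000   # estimated annual vesting

annual_dilution = rsus_vesting_per_year / shares_outstanding
print(f"Annual dilution: {annual_dilution:.1%}")  # prints "Annual dilution: 2.7%"
```

The same calculation with FY2023's 4 million vested RSUs gives the 1.4% figure mentioned earlier.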

Bear in mind: Zoom guided for a weighted diluted share count of 308 million for FY2024. This diluted number includes 4.8 million unexercised options that were granted a number of years ago. Excluding these, the number of RSUs that vest will be around 10 million, which I believe is due to an accelerated vesting schedule this year.

Cashflow impact

Although SBC does not result in a cash outflow for companies, it does result in a larger outstanding share base and consequently, lower free cash flow per share.

But Zoom can offset that by buying back its shares. At its current share price of US$69, Zoom can buy back 8 million of its shares for around US$550 million. Excluding working capital changes, Zoom generated US$1.5 billion in free cash flow in FY2023. If it can sustain cash generation at this level, it can buy back all the stock it issues each year and still have around US$1 billion in annual free cash flow left over for shareholders.
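The buyback offset can be checked the same way. Again, the inputs are the article's figures, and sustaining free cash flow at FY2023's level is an assumption:

```python
# Can buybacks absorb the estimated annual dilution?
# Inputs are figures from the article; sustained free cash flow
# at FY2023's level is an assumption, not a forecast.

share_price = 69.0                  # US$ per share
shares_to_repurchase = 8_000_000    # offsets the estimated annual RSU vesting
free_cash_flow = 1_500_000_000      # US$, FY2023 ex working capital changes

buyback_cost = share_price * shares_to_repurchase   # US$552 million
leftover_fcf = free_cash_flow - buyback_cost        # roughly US$1 billion

print(f"Buyback cost: US${buyback_cost / 1e6:.0f} million")
print(f"Leftover free cash flow: US${leftover_fcf / 1e9:.2f} billion")
```

At US$552 million, the repurchase would consume a little over a third of that free cash flow, leaving roughly US$0.95 billion, which the article rounds to US$1 billion.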

We should also factor in that, at most companies, the RSU forfeiture rate is around 20% or more due to employee turnover, which means my estimate of 8 million RSUs vesting per year for Zoom could be an overestimate. In addition, Zoom reduced its headcount by 15% in February this year, which should lead to more RSU forfeitures and, hopefully, fewer grants in the future.

Not as bad as it looks

GAAP accounting does not always give a complete picture of the financial health of a business. In my view, SBC is one of the most significant flaws of GAAP accounting and investors need to look into the financial notes to better grasp the true impact of SBC.

Zoom’s SBC numbers seem high. But when zooming in (pun intended), the SBC is not as bad as it looks. In addition, with share prices so low, it is easy for management to offset dilution with repurchases at very good prices. However, investors should continue to monitor share dilution over time to ensure that management is fair to shareholders.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have a vested interest in Zoom. Holdings are subject to change at any time.

What American Technology Companies Are Thinking About AI

A vast collection of notable quotes on artificial intelligence, or AI, from the management teams of US-listed technology companies.

The way I see it, artificial intelligence (or AI) really leapt into the zeitgeist in late 2022 or early 2023 with the public introduction of DALL-E 2 and ChatGPT. Both are provided by OpenAI and are software products that use AI to generate art and writing, respectively (and often at astounding quality). Since then, developments in AI have progressed at a breathtaking pace.

Meanwhile, the latest earnings season for the US stock market is coming to its tail-end. I thought it would be useful to collate some of the interesting commentary I’ve come across in earnings conference calls, from the leaders of technology companies that I follow or have a vested interest in, on the topic of AI and how the technology could impact their industry and the business world writ large. Here they are, in no particular order:

Airbnb (NASDAQ: ABNB)

Airbnb’s management thinks AI is a massive platform shift

Well, why don’t I start, Justin, with AI. This is certainly the biggest revolution in tech since I came to Silicon Valley. It’s certainly as big of a platform shift as the Internet, and many people think it might be even bigger. 

Airbnb’s management thinks of foundational models as the highways and what they are interested in, is to build the cars on the highways, in other words, they are interested in tuning the model

And I’ll give you kind of a bit of an overview of how we think about AI. So all of this is going to be built on the base model. The base models, the large language models, think of those as GPT-4. Google has a couple of base models, Microsoft, Anthropic. These are like major infrastructure investments. Some of these models might cost tens of billions of dollars towards the compute power. And so think of that as essentially like building a highway. It’s a major infrastructure project. And we’re not going to do that. We’re not an infrastructure company. But we’re going to build the cars on the highway. In other words, we’re going to design the interface and the tuning of the model on top of AI, on top of the base model. So on top of the base model is the tuning of the model. And the tuning of the model is going to be based on the customer data you have.

Airbnb’s management thinks AI can be used to help the company learn more about its users and build a much better way to match accommodation options with the profile of a user

If you were to ask a question to ChatGPT, and if I were to ask a question to ChatGPT, we’re both going to get pretty much the same answer. And the reason both of us are going to get pretty close to the same answer is because ChatGPT doesn’t know that it’s between you and I, doesn’t know anything about us. Now this is totally fine for many questions, like how far is it from this destination to that destination. But it turns out that a lot of questions in travel aren’t really search questions. They’re matching questions. In other words, they’re questions where the answer depends on who you are and what your preferences are. So for example, I think that going forward, Airbnb is going to be pretty different. Instead of asking you questions like where are you going and when are you going, I want us to build a robust profile about you, learn more about you and ask you 2 bigger and more fundamental questions: who are you? And what do you want?

Airbnb’s management wants to use AI to build a global travel community and world-class personalised travel concierge

And ultimately, what I think Airbnb is building is not just a service or a product. But what we are in the largest sense is a global travel community. And the role of Airbnb in that travel community is to be the ultimate host. Think of us with AI as building the ultimate AI concierge that could understand you. And we could build these world-class interfaces, tune our model. Unlike most other travel companies, we know a lot more about our guests and hosts. This is partly why we’re investing in the Host Passport. We want to continue to learn more about people. And then our job is to match you to accommodations, other travel services and eventually things beyond travel. So that’s the big vision of where we’re going to go. I think it’s an incredibly expanding opportunity.

Airbnb’s management thinks that AI can help level the playing field in terms of the service Airbnb provides versus that of hotels

One of the strengths of Airbnb is that Airbnb’s offering is one of a kind. The problem with Airbnb is our service is also one of a kind. And so therefore, historically less consistent than a hotel. I think AI can level the playing field from a service perspective relative to hotels because hotels have front desk, Airbnb doesn’t. But we have literally millions of people staying on Airbnb every night. And imagine they call customer service. We have agents that have to adjudicate between 70 different user policies. Some of these are as many as 100 pages long. What AI is going to do is be able to give us better service, cheaper and faster by augmenting the agents. And I think this is going to be something that is a huge transformation. 

Airbnb’s management thinks that AI can help improve the productivity of its developers

The final thing I’ll say is developer productivity and productivity of our workforce generally. I think our employees could easily be, especially our developers, 30% more productive in the short to medium term, and this will allow significantly greater throughput through tools like GitHub’s Copilot. 

Alphabet (NASDAQ: GOOG)

Alphabet’s management thinks AI will unlock new experiences in Search as it evolves

As it evolves, we’ll unlock entirely new experiences in Search and beyond just as camera, voice and translation technologies have all opened entirely new categories of queries and exploration.

AI has been foundational for Alphabet’s digital advertising business for over a decade

AI has also been foundational to our ads business for over a decade. Products like Performance Max use the full power of Google’s AI to help advertisers find untapped and incremental conversion opportunities. 

Alphabet’s management is focused on making AI safe

And as we continue to bring AI to our products, our AI principles and the highest standards of information integrity remain at the core of all our work. As one example, our Perspective API helps to identify and reduce the amount of toxic text that language models train on, with significant benefits for information quality. This is designed to help ensure the safety of generative AI applications before they are released to the public.

Examples of Alphabet bringing generative AI to customers of its cloud computing service

We are bringing our generative AI advances to our cloud customers across our cloud portfolio. Our PaLM generative AI models and Vertex AI platform are helping Behavox to identify insider threats, Oxbotica to test its autonomous vehicles and Lightricks to quickly develop text-to-image features. In Workspace, our new generative AI features are making content creation and collaboration even easier for customers like Standard Industries and Lyft. This builds on our popular AI-powered Workspace tools, Smart Canvas and Translation Hub used by more than 9 million paying customers. Our product leadership also extends to data analytics, which provides customers the ability to consolidate their data and understand it better using AI. New advances in our data cloud enable Ulta Beauty to scale new digital and omnichannel experiences while focusing on customer loyalty; Shopify to bring better search results and personalization using AI; and Mercedes-Benz to bring new products to market more quickly. We have introduced generative AI to identify and prioritize cyber threats, automate security workflows and response and help scale cybersecurity teams. Our cloud cybersecurity products helped protect over 30,000 companies, including innovative brands like Broadcom and Europe’s Telepass.

The cost of computing when integrating LLMs (large language models) to Google Search is something Alphabet’s management has been thinking about 

On the cost side, we have always — cost of compute has always been a consideration for us. And if anything, I think it’s something we have developed extensive experience over many, many years. And so for us, it’s a nature of habit to constantly drive efficiencies in hardware, software and models across our fleet. And so this is not new. If anything, the sharper the technology curve is, we get excited by it, because I think we have built world-class capabilities in taking that and then driving down cost sequentially and then deploying it at scale across the world. So I think we’ll take all that into account in terms of how we drive innovation here, but I’m comfortable with how we’ll approach it.

Alphabet’s management does not seem concerned with any potential revenue-impact from integrating LLMs into Google’s core Search product

So first of all, throughout the years, as we have gone through many, many shifts in Search, and as we’ve evolved Search, I think we’ve always had a strong grounded approach in terms of how we evolve ads as well. And we do that in a way that makes sense and provide value to users. The fundamental drivers here are people are looking for relevant information. And in commercial categories, they find ads to be highly relevant and valuable. And so that’s what drives this virtuous cycle. And I don’t think the underpinnings change: users want relevant commercial information, they want choice in what they look at, and even in areas where we are summarizing and answering, users want choice. We care about sending traffic. Advertisers want to reach users. And so all those dynamics, I think, which have long served us well, remain. And as I said, we’ll be iterating and testing as we go. And I feel comfortable we’ll be able to drive innovation here like we’ve always done.

Amazon (NASDAQ: AMZN)

Amazon’s management thinks that the AI boom will drive significant growth in data consumption and products in the cloud

And I also think that there are a lot of folks that don’t realize the amount of nonconsumption right now that’s going to happen and be spent in the cloud with the advent of large language models and generative AI. I think so many customer experiences are going to be reinvented and invented that haven’t existed before. And that’s all going to be spent, in my opinion, on the cloud.

Amazon has been investing in machine learning for more than two decades, and has been investing large sums of capital to build its own LLMs for several years

I think when you think about machine learning, it’s useful to remember that we have had a pretty substantial investment in machine learning for 25-plus years in Amazon. It’s deeply ingrained in virtually everything we do. It fuels our personalized e-commerce recommendations. It drives the pick paths in our fulfillment centers. We have it in our Go stores. We have it in our Prime Air, our drones. It’s obviously in Alexa. And then AWS, we have 25-plus machine learning services where we have the broadest machine learning functionality and customer base by a fair bit. And so it is deeply ingrained in our heritage…

…We’ve been investing in building in our own large language models for several years, and we have a very large investment across the company. 

Amazon’s management decided to build chips – Trainium for training and Inferentia for inference – that have great price and performance because LLMs are going to run on compute, which depend on chips (particularly GPUs, or graphic processing units) and GPUs are scarce; Amazon’s management also thinks that a lot of machine learning training will be taking place on AWS

If you think about maybe the bottom layer here, is that all of the large language models are going to run on compute. And the key to that compute is going to be the chips that’s in that compute. And to date, I think a lot of the chips there, particularly GPUs, which are optimized for this type of workload, they’re expensive and they’re scarce. It’s hard to find enough capacity. And so in AWS, we’ve been working for several years on building customized machine learning chips, and we built a chip that’s specialized for training, machine learning training, which we call Trainium, and a chip that’s specialized for inference or the predictions that come from the model called Inferentia. The reality, by the way, is that most people are spending most of their time and money on the training. But as these models graduate to production, where they’re in the apps, all the spend is going to be in inference. So they both matter a lot. And if you look at — we just released our second versions of both Trainium and Inferentia. And the combination of price and performance that you can get from those chips is pretty differentiated and very significant. So we think that a lot of that machine learning training and inference will run on AWS.

Amazon’s management thinks that most companies that want to use AI are not interested to build their own foundational models because it takes a lot of resources; Amazon has the resources to build foundational models, and is providing the foundational models to customers who can then customise the models

And if you look at the really significant leading large language models, they take many years to build and many billions of dollars to build. And there will be a small number of companies that want to invest that time and money, and we’ll be one of them in Amazon. But most companies don’t. And so what most companies really want and what they tell AWS is that they’d like to use one of those foundational models and then have the ability to customize it for their own proprietary data and their own needs and customer experience. And they want to do it in a way where they don’t leak their unique IP to the broader generalized model. And that’s what Bedrock is, which we just announced a week ago or so. It’s a managed foundational model service where people can run foundational models from Amazon, which we’re exposing ourselves, which we call Titan. Or they can run it from leading large language model providers like AI21 and Anthropic and Stability AI. And they can run those models, take the baseline, customize them for their own purposes and then be able to run it with the same security and privacy and all the features they use for the rest of their applications in AWS. That’s very compelling for customers.

Every single one of Amazon’s businesses are built on top of LLMs

Every single one of our businesses inside Amazon are building on top of large language models to reinvent our customer experiences, and you’ll see it in every single one of our businesses, stores, advertising, devices, entertainment and devices, which was your specific question, is a good example of that.

ASML (NASDAQ: ASML)

ASML’s management sees that mature semiconductor technologies are actually needed even in AI systems

So I think this is something people underestimate how significant the demand in the mid-critical and the mature semiconductor space is. And it will just grow double digit, whether it’s automotive, whether it’s the energy transition, whether it’s just the entire industrial products area, where is the — well, those are the sensors that we actually need as an integral component of the AI systems. This is where the mid-critical and the mature semiconductor space is very important and needs to grow.

Block (NYSE: SQ)

Block’s management is focused on three technology trends, one of which is AI

The three trends we’re focused on: Number one is artificial intelligence; number two is open protocols; and number three is the global south. Consider how many times you’ve heard the term AI or GPT in the earnings calls just this quarter versus all quarters in history prior. This trend seems to be moving faster than anyone can comprehend or get a handle on. Everyone feels like they’re on their back foot and struggling to catch up. Utilizing machine learning is something we’ve always employed at Block, and the recent acceleration in availability of tools is something we’re eager to implement across all of our products and services. We see this first as a way to create efficiencies, both internally and for our customers. And we see many opportunities to apply these technologies to create entirely new features for our customers. More and more effort in the world will shift to creative endeavors as AI continues to automate mechanical tasks away.

Datadog (NASDAQ: DDOG)

Datadog’s management thinks AI can make software developers more productive in terms of generating more code; as a result, the complexity of a company’s technology will also increase, making observability and the troubleshooting of software products more important

First, from a market perspective, over the long term, we believe AI will significantly expand our opportunity in observability and beyond. We think massive improvements in developer productivity will allow individuals to write more applications and to do so faster than ever before. And as with past productivity increases, we think this will further shift value from writing code to observing, managing, fixing and securing live applications…

… Longer term, I think we can all glimpse at the future where productivity for everyone, including software engineers, increases dramatically. And the way we see that as a business is, our job is to help our customers absorb the complexity of the applications they’ve built so they can understand and modify them, run them, secure them. And we think that the more productivity there is, the more people can write in the same amount of time, the less they understand the software they produce, and the more they need us, the more value it sends our way. So that’s what makes us very confident in the long term here…

…And we — the way this has played out in the past typically is you just end up generating more stuff and more mess. So basically, if one person can produce 10x more, you end up with 10x more stuff and that person will still not understand everything they’ve produced. So the way we imagine the future is companies are going to deliver a lot more functionality to their users a lot faster. They’re going to solve a lot more problems in software. But they won’t have as tight an understanding from their engineering team as to what it is they’ve built and how they built it and what might break and what might be the corner cases that don’t work and things like that. And that’s consistent with what we can see people building with a copilot today and things like that.

Etsy (NASDAQ: ETSY)

Etsy’s management thinks that AI can greatly improve the search-experience for customers who are looking for specific products

We’ve been at the cutting edge of search technology for the past several years, and while we use large language models today, we couldn’t be more excited about the potential of newer large language models and generative AI to further accelerate the transformation of Etsy’s user experience. Even with all our enhancements, Etsy search today is still keyword-driven and text-based, and essentially the result is a grid with many thousands of listings. We’ve gotten better at reading the tea leaves, but it’s still a repetitive cycle of query, result, reformulation. In the future we expect search on Etsy to utilize more natural language and multimodal approaches. Rather than manipulating keywords, our search engines will enable us to ask the right question at the right time to show the buyer a curated set of results that can be so much better than it is today. We’re investigating additional search engine technologies to identify attributes of an item, multi-label learning models for instant search, graph neural networks and so much more, which will be used in combination with our other search engine technologies. It’s our belief that Etsy will benefit from generative AI and other advances in search technology as much or perhaps even more so than others…

When you run a search at Etsy, we already use multiple machine learning techniques. So I don’t think generative AI replaces everything we’re doing, but it’s another tool that will be really powerful. And there are times when having a conversation instead of entering a query and then getting a bunch of search results and then going back and reformulating your query and then getting a bunch of search results, that’s not always very satisfying. And being able to say, no, I meant more like this. How about this? I’d like something that has this style and have that feel like more of a conversation, I think that can be a better experience a lot of the time. And I think in particular for Etsy where we don’t have a catalog, it might be particularly powerful.

Fiverr (NYSE: FVRR) 

Fiverr’s management thinks that the proliferation of AI services will not diminish the demand for freelancers, but will bifurcate the fates of freelancers between those who embrace AI and those who don’t

We haven’t seen AI negatively impact our business. On the contrary, the categories we open to address AI-related services are booming. The number of AI-related gigs has increased over tenfold and buyer searches for AI have soared over 1,000% compared to 6 months ago, indicating a strong demand and validating our efforts to stay ahead of the curve in this rapidly evolving technological landscape. We are witnessing the increasing need for human skills to deploy and implement AI technologies, which we believe will enable greater productivity and improved quality of work when human talent is augmented by AI capabilities. In the long run, we don’t anticipate AI development to displace the need for human talent. We believe AI won’t replace our sellers; rather sellers using AI will outcompete those who don’t…

…In terms of your question about AI, you’re right, it’s very hard to understand what categories or how categories might be influenced. I think that there’s one principle that we’ve — that I’ve shared in my opening remarks, which I think is very important, and this is how we view this, which is that AI technology is not going to displace our sellers, but sellers who have a better grasp and better usage of AI are going to outcompete those who don’t. And this is not really different than any meaningful advancement within technology, and we’ve seen that in recent years. Every time when there’s a cool new technology or device or form factor that sellers need to become professional at, those who become professional first are those who are actually winning. And we’re seeing the same here. So I don’t think that this is a different case. It’s just different professions, which, by the way, is super exciting.

Fiverr’s management thinks that AI-produced work will still need a human touch

Furthermore, while AI-generated content can be well constructed, it is all based on existing human-created content. To generate novel and authentic content, human input remains vital. Additionally, verifying and editing the AI-generated content, which often contains inaccuracies, requires human expertise and effort. That’s why we have seen categories such as fact-checking or AI content editing flourish on our marketplace in recent months.

Mastercard (NYSE: MA)

Mastercard’s management thinks AI is a foundational technology for the company

For us we’ve been using AI for the better part of the last decade. So it’s embedded in a whole range of our products…

…So you’ll find it embedded in a range of our products, including generative AI. So we have used generative AI technology, particularly in creating data sets that allow us to compare and find threats in the cybersecurity space. You will find AI in our personalization products. So there’s a whole range of things that set us apart. We use this as foundational technology. And internally, you can see increasingly so, that generative AI might be a good solution for us when it comes to customer service propositions and so forth.

MercadoLibre (NASDAQ: MELI)

MercadoLibre is utilising AI within its products and services, in areas such as customer service and product discovery

In terms of AI, I think as most companies, we do see some very relevant short- to midterm positive impact in terms of engineering productivity. And we are also increasing the amount of work being done on what elements of the consumer-facing experiences we can deploy AI on. I think the focus right now is on some of the more obvious use cases: improving and streamlining customer service and interactions with reps, improving workflows for reps through AI-assisted workflow tools and then deploying AI to help a better search and discovery in terms of better finding products on our website and better understanding specific — specifications of products where existing LLMs are quite efficient. And then beyond that, I think there’s a lot of work going on, and we hope to come up with other innovative forms of AI that we can place into the consumer-facing experience, but the ones I just mentioned are the ones that we’re currently working on the most.

Meta Platforms (NASDAQ: META)

Meta’s work in AI has driven significant improvements in (a) the quality of content seen by users of its services and (b) the monetisation of its services

Our investment in recommendations and ranking systems has driven a lot of the results that we’re seeing today across our discovery engine, reels and ads. Along with surfacing content from friends and family, now more than 20% of content in your Facebook and Instagram Feeds are recommended by AI from people, groups or accounts that you don’t follow. Across all of Instagram, that’s about 40% of the content that you see. Since we launched Reels, AI recommendations have driven a more than 24% increase in time spent on Instagram. Our AI work is also improving monetization. Reels monetization efficiency is up over 30% on Instagram and over 40% on Facebook quarter-over-quarter. Daily revenue from Advantage+ shopping campaigns is up 7x in the last 6 months.

Meta’s management is focused on open-sourcing Meta’s AI models because they think going open-source will benefit the company in terms of it being able to make use of improvements to the models brought on by the open-source-community

Our approach to AI and our infrastructure has always been fairly open. We open source many of our state-of-the-art models, so people can experiment and build with them. This quarter, we released our LLaMA LLM to researchers. It has 65 billion parameters but outperforms larger models and has proven quite popular. We’ve also open sourced 3 other groundbreaking visual models along with their training data and model weights, Segment Anything, DINOv2 and our Animated Drawings tool, and we’ve gotten some positive feedback on all of those as well…

…And the reason why I think why we do this is that unlike some of the other companies in the space, we’re not selling a cloud computing service where we try to keep the different software infrastructure that we’re building proprietary. For us, it’s way better if the industry standardizes on the basic tools that we’re using, and therefore, we can benefit from the improvements that others make and others’ use of those tools can, in some cases, like Open Compute, drive down the costs of those things, which make our business more efficient, too. So I think to some degree, we’re just playing a different game on the infrastructure than companies like Google or Microsoft or Amazon, and that creates different incentives for us. So overall, I think that that’s going to lead us to do more work in terms of open sourcing some of the lower-level models and tools, but of course, a lot of the product work itself is going to be specific and integrated with the things that we do. So it’s not that everything we do is going to be open. Obviously, a bunch of this needs to be developed in a way that creates unique value for our products. But I think in terms of the basic models, I would expect us to be pushing and helping to build out an open ecosystem here, which I think is something that’s going to be important.

Meta’s management thinks the company now has enough computing infrastructure to do leading AI-related work after spending significant sums of money over the past few years to build that out

A couple of years ago, I asked our infra teams to put together ambitious plans to build out enough capacity to support not only our existing products but also enough buffer capacity for major new products as well. And this has been the main driver of our increased CapEx spending over the past couple of years. Now at this point, we are no longer behind in building out our AI infrastructure, and to the contrary, we now have the capacity to do leading work in this space at scale. 

Meta’s management is focused on using AI to improve its advertising services

We remain focused on continuing to improve ads ranking and measurement with our ongoing AI investments while also leveraging AI to power increased automation for advertisers through products like Advantage+ shopping, which continues to gain adoption and receive positive feedback from advertisers. These investments will help us develop and deploy privacy-enhancing technologies and build new innovative tools that make it easier for businesses to not only find the right audience for their ad but also optimize and eventually develop their ad creative.

Meta’s management thinks that generative AI can be a very useful tool for advertisers, but they’re still early in the stage of understanding what generative AI is really capable of

 Although there aren’t that many details that I’m going to share at this point, more of this will come in focus as we start shipping more of these things over the coming months. But I do think that there’s a big opportunity here. You asked specifically about advertisers, but I think it’s going to also help create more engaging experiences, which should create more engagement, and that, by itself, creates more opportunities for advertisers. But then I think that there’s a bunch of opportunities on the visual side to help advertisers create different creative. We don’t have the tools to do that over time, eventually making it. So we’ve always strived to just have an advertiser just be able to tell us what their objective is and then have us be able to do as much of the work as possible for them, and now being able to do more of the creative work there and ourselves for those who want that, I think, could be a very exciting opportunity…

…And then the third bucket is really around CapEx investments now to support gen AI. And this is an emerging opportunity for us. We’re still in the beginning stages of understanding the various applications and possible use cases. And I do think this may represent a significant investment opportunity for us that is earlier on the return curve relative to some of the other AI work that we’ve done. And it’s a little too early to say how this is going to impact our overall capital intensity in the near term.

Meta’s management also thinks that generative AI can be a very useful way for companies to have high-quality chatbots interacting with customers

I also think that there’s going to be a very interesting convergence between some of the AI agents in messaging and business messaging, where, right now, we see a lot of the places where business messaging is most successful are places where a lot of businesses can afford to basically have people answering a lot of questions for people and engaging with them in chat. And obviously, once you light up the ability for tens of millions of small businesses to have AI agents acting on their behalf, you’ll have way more businesses that can afford to have someone engaging in chat with customers.

Microsoft (NASDAQ: MSFT)

Microsoft’s management thinks there is a generational shift in online search happening now because of AI

As we look towards a future where chat becomes a new way for people to seek information, consumers have real choice in business model and modalities with Azure-powered chat entry points across Bing, Edge, Windows and OpenAI’s ChatGPT. We look forward to continuing this journey in what is a generational shift in the largest software category, search.

Because of Microsoft’s partnership with OpenAI, Microsoft Azure is now exposed to new AI-related workloads that it previously was not

Because of some of the work we’ve done in AI even in the last couple of quarters, we are now seeing conversations we never had, whether it’s coming through, you know, just OpenAI’s API, right, if you think about the consumer tech companies that are all spinning, essentially, i.e. the readers, because they have gone to OpenAI and are using their API. These were not customers of Azure at all. Second, even Azure OpenAI API customers are all new, and the workload conversations, whether it’s B2C conversations in financial services or drug discovery on another side, these are all new workloads that we really were not in the game in the past, whereas we now are.

Microsoft’s management has plans to monetise all the different AI-copilots that it is introducing to its various products

Overall, we do plan to monetize a separate set of meters across all of the tech stack, whether they’re consumption meters or per-user subscriptions. The Copilot that’s priced and out there is GitHub Copilot. That’s a good example of, incrementally, how we monetize; the prices that are out there, and others are to be priced because they’re in preview mode. But you can expect us to do what we’ve done with GitHub Copilot pretty much across the board.

Microsoft’s management expects the company to lower the cost of compute for AI workloads over time

And so we have many knobs that will continuously — continue to drive optimization across it. And you see it even in the — even for a given generation of a large model, where we started them through the cost footprint to where we end in the cost footprint in a period of a quarter changes. So you can expect us to do what we have done over the decade plus with the public cloud to bring the benefits of, I would say, continuous optimization of our COGS to a diverse set of workloads.

Microsoft’s management has not been waiting for AI-related regulations to show up; instead, they have treated the unintended consequences of new technology as a first-class concern from Day 1 and built those safeguards into the engineering process

So overall, we’ve taken the approach that we are not waiting for regulation to show up. We are taking an approach where the unintended consequences of any new technology is something that from day 1, we think about as first class and build into our engineering process, all the safeguards. So for example, in 2016 is when we put out the AI principles, we translated the AI principles into a set of internal standards that then are further translated into an implementation process that then we hold ourselves to internal audit essentially. So that’s the framework we have. We have a Chief AI Officer who is sort of responsible for both thinking of what the standards are and then the people who even help us internally audit our following of the process. And so we feel very, very good in terms of us being able to create trust in the systems we put out there. And so we will obviously engage with any regulation that comes up in any jurisdiction. But quite honestly, we think that the more there is any form of trust as a differentiated position in AI, I think we stand to gain from that.

Nvidia (NASDAQ: NVDA)

Cloud service providers (CSPs) are racing to deploy Nvidia’s chips for AI-related work

First, CSPs around the world are racing to deploy our flagship Hopper and Ampere architecture GPUs to meet the surge in interest from both enterprise and consumer AI applications for training and inference. Multiple CSPs announced the availability of H100 on their platforms, including private previews at Microsoft Azure, Google Cloud and Oracle Cloud Infrastructure, upcoming offerings at AWS and general availability at emerging GPU-specialized cloud providers like CoreWeave and Lambda. In addition to enterprise AI adoption, these CSPs are serving strong demand for H100 from generative AI pioneers.

Nvidia’s management sees consumer internet companies as being at the forefront of adopting AI

Second, consumer Internet companies are also at the forefront of adopting generative AI and deep-learning-based recommendation systems, driving strong growth. For example, Meta has now deployed its H100-powered Grand Teton AI supercomputer for its AI production and research teams.

Nvidia’s management is seeing companies in industries such as automotive, financial services, healthcare, and telecom adopt AI rapidly

Third, enterprise demand for AI and accelerated computing is strong. We are seeing momentum in verticals such as automotive, financial services, health care and telecom where AI and accelerated computing are quickly becoming integral to customers’ innovation road maps and competitive positioning. For example, Bloomberg announced it has a 50-billion-parameter model, BloombergGPT, to help with financial natural language processing tasks such as sentiment analysis, named entity recognition, news classification and question answering. Auto insurance company CCC Intelligent Solutions is using AI for estimating repairs. And AT&T is working with us on AI to improve fleet dispatches so their field technicians can better serve customers. Among other enterprise customers using NVIDIA AI are Deloitte for logistics and customer service, and Amgen for drug discovery and protein engineering.

Nvidia is making it easy for companies to deploy AI technology

And with the launch of DGX Cloud through our partnership with Microsoft Azure, Google Cloud and Oracle Cloud Infrastructure, we deliver the promise of NVIDIA DGX to customers from the cloud. Whether the customers deploy DGX on-prem or via DGX Cloud, they get access to NVIDIA AI software, including NVIDIA Base Command, end-to-end AI frameworks and pretrained models. We provide them with the blueprint for building and operating AI, spanning our expertise across systems, algorithms, data processing and training methods. We also announced NVIDIA AI Foundations, which are model foundry services available on DGX Cloud that enable businesses to build, refine and operate custom large language models and generative AI models trained with their own proprietary data created for unique domain-specific tasks. They include NVIDIA NeMo for large language models, NVIDIA Picasso for images, video and 3D, and NVIDIA BioNeMo for life sciences. Each service has 6 elements: pretrained models, frameworks for data processing and curation, proprietary knowledge-based vector databases, systems for fine-tuning, aligning and guard railing, optimized inference engines, and support from NVIDIA experts to help enterprises fine-tune models for their custom use cases.

Nvidia’s management thinks that the advent of AI will drive a shift towards accelerated computing in data centers

Now let me talk about the bigger picture and why the entire world’s data centers are moving toward accelerated computing. It’s been known for some time, and you’ve heard me talk about it, that accelerated computing is a full-stack problem — it is a full-stack challenge. But if you could successfully do it in a large number of application domains — that’s taken us 15 years — sufficiently that almost the entire data center’s major applications could be accelerated, you could reduce the amount of energy consumed and the amount of cost for a data center substantially, by an order of magnitude. It costs a lot of money to do it because you have to do all the software and everything and you have to build all the systems and so on and so forth, but we’ve been at it for 15 years.

And what happened is when generative AI came along, it triggered a killer app for this computing platform that’s been in preparation for some time. And so now we see ourselves in 2 simultaneous transitions. The world’s $1 trillion data center is nearly populated entirely by CPUs today. And I — $1 trillion, $250 billion a year, it’s growing of course. But over the last 4 years, call it $1 trillion worth of infrastructure installed, and it’s all completely based on CPUs and dumb NICs. It’s basically unaccelerated.

In the future, it’s fairly clear now with this — with generative AI becoming the primary workload of most of the world’s data centers generating information, it is very clear now that — and the fact that accelerated computing is so energy efficient, that the budget of a data center will shift very dramatically towards accelerated computing, and you’re seeing that now. We’re going through that moment right now as we speak, while the world’s data center CapEx budget is limited. But at the same time, we’re seeing incredible orders to retool the world’s data centers. And so I think you’re starting — you’re seeing the beginning of, call it, a 10-year transition to basically recycle or reclaim the world’s data centers and build it out as accelerated computing. You have a pretty dramatic shift in the spend of a data center from traditional computing and to accelerated computing with SmartNICs, smart switches, of course, GPUs and the workload is going to be predominantly generative AI…

…The second part is that generative AI is a large-scale problem, and it’s a data center scale problem. It’s another way of thinking that the computer is the data center or the data center is the computer. It’s not the chip. It’s the data center, and it’s never happened like us before. And in this particular environment, your networking operating system, your distributed computing engines, your understanding of the architecture of the networking gear, the switches and the computing systems, the computing fabric, that entire system is your computer, and that’s what you’re trying to operate. And so in order to get the best performance, you have to understand full stack and understand data center scale. And that’s what accelerated computing is.

Nvidia’s management thinks that the training of AI models will be an always-on process

 You’re never done with training. You’re always — every time you deploy, you’re collecting new data. When you collect new data, you train with the new data. And so you’re never done training. You’re never done producing and processing a vector database that augments the large language model. You’re never done with vectorizing all of the collected structured, unstructured data that you have. And so whether you’re building a recommender system, a large language model, a vector database, these are probably the 3 major applications of — the 3 core engines, if you will, of the future of computing as well as a bunch of other stuff. But obviously, these are very — 3 very important ones. They are always, always running.
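The "vector database that augments the large language model" Huang mentions is the retrieval-augmented generation pattern: documents are embedded as vectors, stored, and the nearest matches are retrieved to ground a model's answer, so fresh data can be used without retraining. Below is a minimal, purely illustrative sketch; the toy bag-of-words "embedding" stands in for the learned embeddings real systems use, and every name in it is ours, not Nvidia's.

```python
# Illustrative sketch of a vector store augmenting a model (RAG pattern).
# The bag-of-words "embedding" is a toy stand-in for learned embeddings.
import math
import re
from collections import Counter

def embed(text: str) -> Counter:
    # Toy embedding: token counts. Real systems use neural embeddings.
    return Counter(re.findall(r"[a-z0-9]+", text.lower()))

def cosine(a: Counter, b: Counter) -> float:
    # Cosine similarity between two sparse count vectors.
    dot = sum(a[t] * b[t] for t in a)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

class VectorStore:
    def __init__(self):
        self.items = []  # list of (embedding, original text)

    def add(self, text: str) -> None:
        # New data can be added at any time -- no retraining needed.
        self.items.append((embed(text), text))

    def query(self, question: str, k: int = 1) -> list:
        # Return the k most similar stored documents.
        q = embed(question)
        ranked = sorted(self.items, key=lambda it: cosine(q, it[0]), reverse=True)
        return [text for _, text in ranked[:k]]

store = VectorStore()
store.add("H100 GPUs are deployed for large-scale AI training.")
store.add("Payment fraud rules are updated weekly.")
print(store.query("Which chips are used for training models?")[0])
# -> H100 GPUs are deployed for large-scale AI training.
```

The "always running" property Huang describes falls out of the design: adding a document to the store immediately makes it retrievable, which is why the vectorizing of newly collected data never stops.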

When it comes to inference – or the generation of an output – there’s a lot more that goes into it than just the AI models

The other thing that’s important is these are models, but they’re connected ultimately to applications. And the applications could have image in, video out, video in, text out, image in, proteins out, text in, 3D out, video in, in the future, 3D graphics out. So the input and the output requires a lot of pre and postprocessing. The pre and postprocessing can’t be ignored. And this is one of the things that most of the specialized chip arguments fall apart. And it’s because the length — the model itself is only, call it, 25% of the data — of the overall processing of inference. The rest of it is about preprocessing, postprocessing, security, decoding, all kinds of things like that.
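Huang's point is that an inference service is a pipeline in which the model call is only the middle stage. The stub below is a hedged sketch of that shape, with illustrative placeholder functions of our own, not Nvidia's actual software:

```python
# Illustrative inference pipeline: pre-processing and post-processing
# bracket the model call, which per the quote is only ~25% of the work.

def preprocess(raw: str) -> list:
    # e.g. input validation, decoding, tokenisation
    return raw.strip().lower().split()

def run_model(tokens: list) -> list:
    # Stand-in for the neural network itself
    return [t.upper() for t in tokens]

def postprocess(tokens: list) -> str:
    # e.g. detokenisation, safety filtering, response formatting
    return " ".join(tokens)

def serve(raw: str) -> str:
    # End-to-end inference: the model is just one stage of several.
    return postprocess(run_model(preprocess(raw)))

print(serve("  Hello inference  "))  # -> HELLO INFERENCE
```

This is why Huang argues that specialized chips optimised only for the model step address a minority of the end-to-end cost.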

Paycom Software (NYSE: PAYC)

Paycom’s management thinks AI is definitely going to have a major impact in the payroll and HCM (human capital management) industry

I definitely think it’ll be relevant. You can use AI for multiple things. There are areas that you can use it for that are better than others. And they’re front-end things you can use it for direct to the client. There are back-end things that you can use it for that a client may never see. And so when you’re talking about AI, it has many uses, some of which is front end and some back end. And I don’t want to talk specifically about what exactly we’re using it for already internally and what our opportunities would be into the future. But in answer to your question, yes, I do think that over time, AI is going to be a thing in our industry.

PayPal (NASDAQ: PYPL)

PayPal has been working with AI (in fraud and risk management) for several years, and management thinks generative AI and other forms of AI will be useful in the online payments industry

For several years, we’ve been at the forefront of advanced forms of machine learning and AI to combat fraud and to implement our sophisticated risk management programs. With the new advances of generative AI, we will also be able to accelerate our productivity initiatives. We expect AI will enable us to meaningfully lower our costs for years to come. Furthermore, we believe that AI, combined with our unique scale and sets of data, will drive not only efficiencies, but will also drive a differentiated and unique set of value propositions for our merchants and consumers…

…And we are now beginning to experiment with the first generation of what we call AI-powered checkout, which looks at the full checkout experience, not just the PayPal checkout experience, but the full checkout experience for our merchants…

…There’s no question that AI is going to impact almost every function inside of PayPal, whether it be our front office, back office, marketing, legal, engineering, you name it. AI will have an impact and allow us to not just lower cost, but have higher performance and do things that is not about trade-offs. It’s about doing both in there.

Shopify (NASDAQ: SHOP)

Shopify’s management thinks the advent of AI makes a copilot for entrepreneurship possible

But now we are at the dawn of the AI era and the new capabilities that are unlocked by that are unprecedented. Shopify has the privilege of being amongst the companies with the best chances of using AI to help our customers. A copilot for entrepreneurship is now possible. Our main quest demands from us to build the best thing that is now possible, and that has just changed entirely.

Shopify recently launched an AI-powered shopping assistant that is powered by OpenAI’s ChatGPT

We also — you’re also seeing — we announced a couple of weeks ago, Shop at AI, which is what I think is the coolest shopping concierge on the planet, whereby you as a consumer can use Shop at AI and you can browse through hundreds of millions of products and you can say things like I want to have a barbecue and here’s the theme and it will suggest great products, and you can buy it right in line right through the shopping concierge.  

Shopify has been using AI to help its merchants write product descriptions so that merchants can better focus on taking care of their customers

For example, the task of writing product descriptions is now made meaningfully easier by injecting AI into that process. And what does that — the end result of that is merchants spend less time writing product descriptions and more time making beautiful products and communicating and engaging with their customers.

Taiwan Semiconductor Manufacturing Company (NYSE: TSM)

TSMC’s management sees demand in most end-markets as being mostly soft, but AI-related demand is growing

We observed the PC and smartphone market continue to be soft at the present time, while automotive demand is holding steady for TSMC and it is showing signs of softening into the second half of 2023. I’m talking about automotive. On the other hand, we have recently observed incremental upside in AI-related demand.

TSMC’s management thinks it’s a little too early to tell how big the semiconductor market can grow into because of AI, but they do see a positive trend

We certainly, we have observed an incremental increase in AI-related demand. It will also help the ongoing inventory digestion. The trend is very positive for TSMC. But today, if you ask me to quantitatively to say that how much of the amount increase or what is the dollar content in the server, it’s too early to say. It still continue to be developed. And ChatGPT right now reinforce the already stronger conviction that we have in HPC and AI as a structurally megatrend for TSMC’s business growth in the future. Whether this one has been included in our previous announcement is said that we have a 15% to 20% CAGR, the answer is probably partly yes, because of — for several, we have accelerated into our consideration. But this ChatGPT is a large language model is a new application. And we haven’t really have a kind of a number that put into our CAGR. But is definitely, as I said, it really reinforced our already strong conviction that HPC and AI will give us a much higher opportunities in the future…

…We did see some positive signs of the people getting much more attention to AI application, especially the ChatGPT’s area. However, as I said, quantitatively, we haven’t have enough data to summing it up to see what is the contribution and what kind of percentage to TSMC’s business. But we remain confident that this trend is definitely positive for TSMC.

TSMC’s management sees most of the AI work performed today as being focused on training and thinks it will flip to inference in the future – but either way, high-performance semiconductors will still be needed for AI-related work

Right now, most of the AI concentrate or focus on training. And in the future, it will be inference. But let me say that, no matter what kind of application, they need to use a very high-performance semiconductor component, and that actually is a TSMC’s advantage. So we expect that semiconductor content starting from a data center for [indiscernible] to device and edge device or those kind of things, put all together, they need a very high-speed computing with a very power-efficient one. And so we expect it will add to TSMC’s business a lot.

Tencent (NASDAQ: TCEHY)

Tencent is using AI to deliver more relevant ads to users of its services

We upgraded our machine learning advertising platform to deliver higher conversions for advertisers. For example, we help our advertisers dynamically feature their most relevant products inside their advertisements by applying our deep learning model to the standard product unit attributes we have aggregated within our SPU database. 

Tencent’s management thinks there will be a proliferation of AI models – both foundational as well as vertical – from both established companies as well as startups

So in terms of going forward, we do believe that number one, there’s going to be many models in the market going forward for the large companies, I think each one of us would have a foundation model. And the model will be supporting our own use cases as well as offer it to the market both on a 2C basis as well as on a 2B basis. And at the same time, there will be many start-ups, which will be creating their own models, some of them may be general foundation model. Some of them may be more industry and vertical models and they will be coming with new applications. I think overall, it’s going to be a very vibrant industry from a model availability perspective.

Tencent’s management thinks AI can help improve the quality of UGC (user-generated content)

In terms of the user-to-user interaction type of services like social network and short video network and games, long lead content, there will be — a lot of usages that helps to increase the quality of content, the efficiency at which the content are created as well as lowering the cost of content creation. And that will be net beneficiary to these applications. 

Tencent’s management thinks China’s government is supportive of innovation in AI

Now in terms of — you asked about regulation. I think the government’s general stance is that it’s supportive of innovation, but the industry has to be regulated. And I think this is not something that’s specific to China, even around the world. And you look at the U.S., there’s a lot of public discussion about having regulation and even the founder of OpenAI has been testifying and asking for regulation in the industry. So I think that is something which is necessary, but we felt under the right regulation and regulatory framework, then the government stance is supportive of innovation and the industry will actually have room for healthy growth.

Tesla (NASDAQ: TSLA)

Tesla’s management thinks data will be incredibly valuable when building out AI services, especially in self-driving

Regarding Autopilot and Full Self-Driving. We’ve now crossed over 150 million miles driven by Full Self-Driving beta, and this number is growing exponentially. I mean, this is a data advantage that really no one else has. Those who understand AI will understand the importance of data — of training data and how fundamental that is to achieving an incredible outcome. So yes, we’re also very focused on improving our neural net training capabilities, as it is one of the main limiting factors of achieving full autonomy. 

Tesla’s management thinks the company’s supercomputer project, Dojo, could significantly improve the cost of training AI models

So we’re continuing to simultaneously make significant purchases of NVIDIA GPUs and also putting a lot of effort into Dojo, which we believe has the potential for an order of magnitude improvement in the cost of training. 

The Trade Desk (NASDAQ: TTD)

Trade Desk’s management thinks that generative AI is only as good as the data that it has been trained on

ChatGPT is an amazing technology, but its usefulness is conditioned on the quality of the dataset it is pointed at. Regurgitating bad data, bad opinions or fake news, like AI-generated deepfakes, for example, will be a problem that all generative AI will likely be dealing with for decades to come. We believe many of the novel AI use cases in market today will face challenges with monetization and copyright and data integrity or truth and scale.

Trade Desk has very high-quality advertising data at scale (it’s handling 10 million ad requests per second) so management thinks that the company can excel by applying generative AI to its data

By contrast, we are so excited about our position in the advertising ecosystem when it comes to AI. We look at over 10 million ad requests every second. Those requests, in sum, represent a very robust and very unique dataset with incredible integrity. We can point generative AI at that dataset with confidence for years to come. We know that our size, our dataset size and integrity, our profitability and our team will make Koa and generative AI a promising part of our future.

Trade Desk’s management sees AI bringing positive impacts to many areas of the company’s business, such as generating code faster, generating creatives faster, and helping clients learn programmatic advertising faster

In the future, you’ll also hear us talk about other applications of AI in our business. These include generating code faster; changing the way customers understand and interact with their own data; generating new and more targeted creatives, especially for video and CTV; and using virtual assistance to shorten the learning curve that comes with the complicated world of programmatic advertising by optimizing the documentation process and making it more engaging.

Visa (NYSE: V)

Visa, which is in the digital payments industry, has a long history of working with AI and management sees AI as an important component of what the company does

I’ll just mention that we have a long history of developing and using predictive AI and deep learning. We were one of the pioneers of applied predictive AI. We have an enormous data set that we’ve architected to be utilized at scale by hundreds of different AI and ML services that people use all across Visa. We use it to run our company more effectively. We use it to serve our clients more effectively. And this will continue to be a big part of what we do.

Visa’s management thinks generative AI can take the company’s current AI services to the next level

As you transition to generative AI, this is where we see an opportunity to take our current AI services to the next level. We are, as a platform, experimenting with a lot of the new capabilities that are available. We’ve got people all over the company that are tinkering and dreaming and thinking and doing testing and figuring out ways that we could use generative AI to transform how we do what we do, which is deliver simple, safe and easy-to-use payment solutions. And we’re also spending a fair bit of time thinking about how generative AI will change the way that sellers sell, and the way we all buy and shop. So it’s a big area of opportunity that we’re looking at in many different ways across the company.

Wix (NASDAQ: WIX)

Wix’s management thinks AI can reduce a lot of friction for users in creating websites

First, our goal at Wix is to reduce friction. The easier it is for our users to build websites, the better Wix is. We have proven this many times before, through the development of software and products, including AI. As we make it easier for our users to achieve their goals, their satisfaction goes up, conversion goes up, user retention goes up, monetization goes up and the value of Wix grows…

…  Today, new emerging AI technologies create an even bigger opportunity to reduce friction in more areas that were almost impossible to solve a few years ago and further increase the value of our platform. We believe this opportunity will result in an increased addressable market and even more satisfied users. 

Wix’s management thinks that much more is needed to run e-commerce websites than just AI and even if AI can automate every layer, it is still very far into the future

The second important point is that there is a huge amount of complexity in software, even with websites, and it’s growing. Even if AI could code a fully functional e-commerce website, for example — which I believe we are still very far from — there is still a need for the site to be deployed to a server, to run the code, to make sure the code continues to work, to manage and maintain a database for when someone wants to buy something, to manage security, to ship the products, to partner with payment gateways, and many more things. So even if you have something that can build pages and content and code…you still need so much more. This gets to my third and final point, which is that even in the far future, if AI is able to automate all of these layers, it will have to disrupt a lot of the software industry, including database management, server management and cloud computing. I believe we are very far from that and that before then, there will be many more opportunities for Wix to leverage AI and create value for our users.

Zoom Video Communications (NASDAQ: ZM)

Zoom management’s approach to AI is federated, empowering, and responsible

We outlined our approach to AI, which is to drive forward solutions that are federated, empowering, and responsible. Federated means flexible and customizable to businesses’ unique scenarios and nomenclature. Empowering refers to building solutions that improve individual and team productivity as well as enhance the customer experience. And responsible means customer control of their data with an emphasis on privacy, security, trust and safety.

Zoom recently made a strategic investment in Anthropic and management will be integrating Anthropic’s AI assistant feature across Zoom’s product portfolio

Last week, we announced our strategic investment in Anthropic, an AI safety and research company working to build reliable, interpretable and steerable AI systems. Our partnership with Anthropic further boosts our federated approach to AI by allowing Anthropic’s AI assistant, Claude, to be integrated across Zoom’s entire platform. We plan to begin by layering Claude into our Contact Center portfolio, which includes Zoom Contact Center, Zoom Virtual Agent, and now in-beta Zoom Workforce Engagement Management. With Claude guiding agents towards trustworthy resolutions and empowering self-service for end users, companies will be able to take customer relationships to the next level.

Zoom’s management thinks that having AI models is important, but it’s even more important to fine-tune them based on proprietary data

Having said that, there are 2 things really important. One is the model, right? So OpenAI has a model, Anthropic and Facebook as well, Google and those companies. But the most important thing is how to leverage these models to fine tune based on your proprietary data, right? That is extremely important when it comes to collaboration, communication, right? Take a Zoom employee, for example. We have so many meetings, right, and talk about — every day, like our sales team use the Zoom call with the customers. We accumulated a lot of, let’s say, internal meeting data. How to fine tune the model with those data, it’s very important, right?

Examples of good AI use cases in Zoom’s platform

We also look at our core meeting platform, right, and meeting summary. It is extremely important, right? And we also have our team chat solution, and how to leverage that to compose a chat. Remember, last year, we also launched email and calendar as well. How do we leverage generative AI to understand the context, right, and bring all the information relevant to you and help you generate the message, right? When you send an e-mail back to customers or prospects, right, either a chat message or e-mail, we can leverage AI to generate it, right? I think there are a lot of areas. Even, let’s say, maybe you might be late to the meeting, right? 10 minutes later, you joined the meeting. You really want to understand what had happened, right? Can you get a quick summary of the past 10 minutes? Yes, with generative AI, you also can get that as well. 

Zoom’s management thinks there are multiple ways to monetise AI

I think in terms of how to monetize generative AI, I think first of all, take Zoom IQ for Sales, for example. That’s a new service to target the sales department. That AI technology is based on generative AI, right, so we can monetize it. And also some features, even before the generative AI popularity, we have a live transcription feature, right? And that’s not a free feature. It is a paid feature, right, behind the paywall, right? And also a lot of good features, right, take the Zoom meeting summary, for example, for enterprise customers… For those SMB customers who did not deploy Zoom One, they may not get those features, right? That’s another reason for us to monetize. I think there are multiple ways to monetize, yes.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have a vested interest in Alphabet, Amazon, ASML, Datadog, Etsy, Fiverr, Mastercard, MercadoLibre, Meta Platforms, Microsoft, Paycom Software, PayPal, Shopify, TSMC, Tencent, Tesla, The Trade Desk, Visa, Wix, Zoom. Holdings are subject to change at any time.

When Share Buybacks Lose Their Power

Apple’s share buybacks have greatly benefited shareholders in the past. But with share prices much higher, buybacks may be less powerful.

Share buybacks can be a powerful tool for companies to boost their future earnings per share. By buying back shares, a company’s future earnings can now be shared between fewer shares, boosting the amount each shareholder can get.

Take Apple for example. From 2016 to 2022, the iPhone maker’s net income increased by 118%, or around 14% annualised. That’s pretty impressive. But Apple’s earnings per share (EPS) outpaced net income growth by a big margin: EPS advanced by 193%, or 19.6% per year.

The gap exists because Apple used share buybacks to decrease its share count. Its outstanding share count dropped by around 30%, or an annualised rate of close to 5.7%, over the same period.

But the power of buybacks is very much dependent on the price at which they are conducted. If a company’s share price represents a high valuation, earnings per share growth from buybacks will be less, and vice versa.

Valuations matter

To see how buybacks lose their effectiveness when valuations rise, consider a simple illustration. Two companies, A and B, each earn a net profit of $100 every year and have 100 shares outstanding, giving both an earnings per share of $1. Suppose Company A’s share price is $10 while Company B’s is $20, and both use all of their Year 1 profits to buy back shares.

Company A would end the year with 90 shares outstanding and Company B with 95. From Year 2 onwards, Company A’s earnings per share will be $1.11, an 11% increase. Company B, on the other hand, only manages to increase its earnings per share to $1.053, or 5.3%.
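The arithmetic above can be sketched in a few lines of Python (using the hypothetical figures from the illustration, and assuming the full $100 profit is spent on the buyback):

```python
profit = 100.0   # annual net profit for both companies
shares = 100.0   # shares outstanding before the buyback

def eps_after_buyback(share_price):
    # Spend the entire $100 profit repurchasing shares at the given price,
    # then recompute earnings per share on the reduced share count.
    shares_repurchased = profit / share_price
    return profit / (shares - shares_repurchased)

print(round(eps_after_buyback(10), 3))  # Company A: 1.111 -> ~11% EPS lift
print(round(eps_after_buyback(20), 3))  # Company B: 1.053 -> ~5.3% EPS lift
```

The same dollar outlay retires twice as many shares at the lower price, which is the whole effect.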

Buybacks are clearly much more effective when the share price, and thus the valuation, is lower.

The case of Apple

As I mentioned earlier, Apple managed to decrease its share count by 30% over the last six years, or 5.7% per year. A 30% decrease in shares outstanding led to a 42% increase in EPS*. 

Apple was able to decrease its share count so significantly over those six years because its shares traded at relatively low valuations for much of that period, and because Apple devoted almost all of the free cash flow it generated over those six years to buybacks. The chart below shows Apple’s price-to-earnings multiples from 2016.

Source: TIKR


From 2016 to 2019, Apple’s trailing price-to-earnings (PE) ratio ranged from 10 to 20. But since then, the PE ratio has increased and now sits around 30.

But with its PE ratio now close to 30, the impact of Apple’s buybacks will not be as significant. If Apple continues to use 100% of its free cash flow to buy back shares, it will reduce its share count by only around 3.3% per year. Although that’s a respectable figure, it doesn’t come close to what Apple achieved in the six years prior. 

At an annual reduction rate of 3.3%, Apple’s share count will only fall by around 18% over six years, compared to the 30% seen from 2016 to 2022. This will increase Apple’s earnings per share by around 22% versus the actual 42% clocked in the past six years.
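Both projections follow from simple compounding. A quick sketch, using the article’s assumed annual reduction rates (5.7% historically, 3.3% going forward):

```python
def six_year_buyback_effect(annual_reduction):
    # Fraction of shares retired after six years of steady buybacks,
    # and the resulting lift in earnings per share (1 / remaining - 1).
    remaining = (1 - annual_reduction) ** 6
    return 1 - remaining, 1 / remaining - 1

fall, lift = six_year_buyback_effect(0.057)     # past six years
fall2, lift2 = six_year_buyback_effect(0.033)   # projected six years
print(round(fall * 100), round(lift * 100))     # 30 42
print(round(fall2 * 100), round(lift2 * 100))   # 18 22
```

Halving the annual buyback rate roughly halves the EPS lift over the period, which is why the valuation at which shares are repurchased matters so much.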

In closing

Apple is a great company that has rewarded shareholders many times over in the last few decades. In addition to growing its business, timely buybacks have contributed to the fast pace of Apple’s earnings-per-share growth. 

Although I believe Apple will likely continue to post stellar growth in the coming years with the growth of its services business and its potential in emerging markets, growth from buybacks may not be as powerful as it used to be.

When analysing the power of buybacks, shareholders should monitor the valuation of the stock and assess whether the buybacks are worthwhile for shareholders.

*Computed as 1/(1 - 0.3)


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have a vested interest in Apple. Holdings are subject to change at any time.

What Causes Stock Prices To Rise?

A company can be valued based on its future cash flows. Dividends, as cash flows to shareholders, should therefore drive stock valuations.

I recently wrote about why dividends are the ultimate driver of stock valuations. Legendary investor Warren Buffett once said: “Intrinsic value can be defined simply: it is the discounted value of the cash that can be taken out of a business during its remaining life.”

And dividends are ultimately the cash that is taken out from a business over time. As such, I consider the prospect of dividends as the true driver of stock valuations.

But what if a company will not pay out a dividend in my lifetime? 

Dividends in the future

Even though we may never receive a dividend from a stock, we should still be able to make a gain through stock price appreciation.

Let’s say a company will only start paying out $100 a share in dividends 100 years from now, and that its dividend per share will remain stable from then on. An investor who wants to earn a 10% return will be willing to pay $1,000 a share at that time ($100 divided by the 10% required return, valuing the dividend stream as a perpetuity).

But it is unlikely that anyone reading this will be alive 100 years from now. That doesn’t mean we can’t still make money from this stock.

In Year 99, an investor who wants to make a 10% return will be willing to pay $909 a share as they can sell it to another investor for $1000 in Year 100. That’s a 10% gain.

Similarly, an investor, knowing this, will be willing to pay $826 in Year 98, expecting that another buyer will pay $909 a year later. And on and on it goes.

Coming back to the present: an investor who wants to make a 10% annual return should be willing to pay around $0.07 a share today ($1,000 discounted at 10% for 100 years). Even though no investor today will hold the shares for 100 years, in a well-oiled financial system, the investor should be able to sell the stock at a higher price over time.
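The backward induction above can be sketched in a few lines (assuming, as in the example, a $100 perpetual dividend starting in Year 100 and a 10% required return):

```python
r = 0.10
value_in_year_100 = 100 / r   # a $100-a-year perpetuity is worth $1,000 at a 10% return

price = value_in_year_100
for year in range(99, -1, -1):
    price /= 1 + r            # each earlier investor pays ~10% less
    if year in (99, 98, 0):
        print(year, round(price, 2))
# 99 909.09
# 98 826.45
# 0 0.07
```

Each step is just one year of discounting, and chaining 100 of them together shrinks $1,000 down to about 7 cents today.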

But be warned

In the above example, I assumed that the financial markets are working smoothly and investors’ required rate of return remained constant at 10%. I also assumed that the dividend trajectory of the company is known. But reality is seldom like this.

The required rate of return may change depending on the risk-free rate, impacting what people will pay for the stock at different periods of time. In addition, uncertainty about the business may also lead to stock price fluctuations. Furthermore, there may even be mispricings because of misinformation or simply irrational behaviour of buyers and sellers of the stock. All of these things can lead to wildly fluctuating stock prices.

So even if you end up being correct about the company’s future dividends per share, the valuation trajectory you thought the company would follow may veer well off-course for long periods. The market may also demand a different rate of return from you, leading to the market’s “intrinsic value” of the stock differing from yours.

The picture below is a sketch by me (sorry I’m not an artist) that illustrates what may happen:

The smooth line is what your “intrinsic value” of the company looks like over time. But the zig-zag line is what may actually happen.

Bottom line

To recap, capital gains can be made even if a company doesn’t pay a dividend during our lifetime. But we have to be wary that capital gains may not happen smoothly.

Even if we are right about a stock’s future dividend profile, we must be able to hold the stock through volatile periods until the price eventually reaches, or rises above, our estimate of intrinsic value in order to earn our required rate of return.

You may also have noticed from the chart that occasionally stocks can go above your “intrinsic value” line (whatever rate of return you are using). If you bought in at these times, you are unlikely to make a return that meets your required rate.

To avoid this, we need to buy in at the right valuation and be patient enough to wait for market sentiment to converge to our intrinsic value over time to make a profit that meets our expectations. Patience and discipline are, hence, key to investment success. And of course, we also need to predict the dividend trajectory of the company somewhat accurately.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I do not have a vested interest in any stocks mentioned. Holdings are subject to change at any time.

How Do Changing Assumptions Impact Intrinsic Values?

Are stock price movements due to new information justified? Here’s one way to find out.

It is not uncommon to see stock prices gyrate wildly during earnings season. A small earnings beat and the stock goes up 10% or even 20%. An earnings miss and the stock is down double digits after hours.

Are these stock price movements justified? Has the intrinsic value of the stock really changed that much? In this article, I look at how a change in assumptions about a company’s cash flow can affect the intrinsic value of the stock.


When long-term assumptions are slashed

Let’s start by analysing a stock that has its long-term assumptions slashed. This should have the biggest impact on intrinsic value compared to just a near-term earnings miss.

Suppose Company A is expected to pay $1 a share in dividends every year for 10 years before it closes down and liquidates for $5 a share in Year 10. The liquidation value is paid out to shareholders as a special dividend, alongside the final $1 dividend, in Year 10. The table below shows the dividend schedule and the calculation of the intrinsic value of the stock today using a 10% discount rate.

Year | Dividend | Net present value
Now | $0.00 | $0.00
Year 1 | $1.00 | $0.91
Year 2 | $1.00 | $0.83
Year 3 | $1.00 | $0.75
Year 4 | $1.00 | $0.68
Year 5 | $1.00 | $0.62
Year 6 | $1.00 | $0.56
Year 7 | $1.00 | $0.51
Year 8 | $1.00 | $0.47
Year 9 | $1.00 | $0.42
Year 10 | $6.00 | $2.31
Sum | $15.00 | $8.07

The intrinsic value in this case is $8.07.
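This calculation is a plain discounted-cash-flow sum. A minimal sketch that reproduces the table:

```python
def intrinsic_value(dividends, rate=0.10):
    # dividends[t] is the payout received t years from now (index 0 = today).
    return sum(d / (1 + rate) ** t for t, d in enumerate(dividends))

# $1 for nine years, then $1 plus $5 liquidation value in Year 10.
base_case = [0.0] + [1.0] * 9 + [6.0]
print(round(intrinsic_value(base_case), 2))  # 8.07
```

The same helper works for every scenario that follows; only the dividend schedule changes.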

But what if expectations for Company A are slashed? The dividend schedule is now expected to drop 10% to 90 cents per share for the next 10 years. The liquidation value is also cut by 10% to $4.50. The table below illustrates the new dividend expectation and the new intrinsic value of the stock.

Year | Dividend | Net present value
Now | $0.00 | $0.00
Year 1 | $0.90 | $0.82
Year 2 | $0.90 | $0.74
Year 3 | $0.90 | $0.68
Year 4 | $0.90 | $0.61
Year 5 | $0.90 | $0.56
Year 6 | $0.90 | $0.51
Year 7 | $0.90 | $0.46
Year 8 | $0.90 | $0.42
Year 9 | $0.90 | $0.38
Year 10 | $5.40 | $2.08
Sum | $13.50 | $7.27

Understandably, the intrinsic value drops 10% to $7.27 as all future cash flows are now 10% less. In this case, if the stock was trading close to the initial $8.07 per share intrinsic value, then a 10% decline in the stock price can be considered justified.

When only short-term cash flows are impacted

But most of the time, expectations for a company should not change so drastically. An earnings miss may lead to expectations of lower dividends for the next couple of years but does not impact dividend projections for later years.

For instance, let’s say the dividend projection for Company A above is cut by 10% for Year 1 but returns to $1 per share in Year 2 onwards and the liquidation value at the end of Year 10 is still $5. The table shows the new expected dividend schedule and the intrinsic value of the stock.

Year | Dividend | Net present value
Now | $0.00 | $0.00
Year 1 | $0.90 | $0.82
Year 2 | $1.00 | $0.83
Year 3 | $1.00 | $0.75
Year 4 | $1.00 | $0.68
Year 5 | $1.00 | $0.62
Year 6 | $1.00 | $0.56
Year 7 | $1.00 | $0.51
Year 8 | $1.00 | $0.47
Year 9 | $1.00 | $0.42
Year 10 | $6.00 | $2.31
Sum | $14.90 | $7.98

In this case, the intrinsic value drops to $7.98 from $8.07. Only a small decline in the stock price is warranted if the stock was initially trading close to its $8.07 intrinsic value since the decline in intrinsic value is only minimal. 

Delaying cash flows to the shareholder

Expectations can also change about the timing of cash flows paid to shareholders. This will also impact the intrinsic value of a stock.

For the same company above, instead of dividends per share declining, the dividends are paid out one year later than expected. The table below shows the new expected dividend schedule and the present value of the cash flows.

Year | Dividend | Net present value
Now | $0.00 | $0.00
Year 1 | $0.00 | $0.00
Year 2 | $1.00 | $0.83
Year 3 | $1.00 | $0.75
Year 4 | $1.00 | $0.68
Year 5 | $1.00 | $0.62
Year 6 | $1.00 | $0.56
Year 7 | $1.00 | $0.51
Year 8 | $1.00 | $0.47
Year 9 | $1.00 | $0.42
Year 10 | $1.00 | $0.39
Year 11 | $6.00 | $2.10
Sum | $15.00 | $7.34

As you can see, delaying every cash flow by one year has a bigger impact on intrinsic value than the one-year dividend cut: the intrinsic value drops to $7.34 from $8.07, which is simply the original value discounted for one extra year (a fall of about 9%). But this is a pretty extreme example, as we have delayed all future cash flows by one year. In most cases, our expectations may not change so drastically. For instance, Year 1’s dividend may just be pushed to Year 2. The table below illustrates this new scenario.

Year | Dividend | Net present value
Now | $0.00 | $0.00
Year 1 | $0.00 | $0.00
Year 2 | $2.00 | $1.65
Year 3 | $1.00 | $0.75
Year 4 | $1.00 | $0.68
Year 5 | $1.00 | $0.62
Year 6 | $1.00 | $0.56
Year 7 | $1.00 | $0.51
Year 8 | $1.00 | $0.47
Year 9 | $1.00 | $0.42
Year 10 | $6.00 | $2.31
Sum | $15.00 | $7.99

In this case, the intrinsic value only drops by a few cents to $7.99.
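All four scenarios above can be verified with the same discounting helper. A sketch, using the article’s 10% discount rate:

```python
def intrinsic_value(dividends, rate=0.10):
    # dividends[t] is the payout received t years from now (index 0 = today).
    return sum(d / (1 + rate) ** t for t, d in enumerate(dividends))

base      = [0.0] + [1.0] * 9 + [6.0]            # original schedule: 8.07
cut_all   = [d * 0.9 for d in base]              # everything 10% lower: 7.27
cut_yr1   = [0.0, 0.9] + [1.0] * 8 + [6.0]       # only Year 1 cut: 7.98
delay_all = [0.0, 0.0] + [1.0] * 9 + [6.0]       # everything one year later: 7.34
shift_yr1 = [0.0, 0.0, 2.0] + [1.0] * 7 + [6.0]  # Year 1 paid in Year 2: 7.99

for schedule in (base, cut_all, cut_yr1, delay_all, shift_yr1):
    print(round(intrinsic_value(schedule), 2))
```

Note how the across-the-board changes (cutting or delaying every payment) move intrinsic value far more than the one-year changes do.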

Conclusion

A change in expectations for a company has an impact on intrinsic value. But unless the expectations have changed dramatically, the change in intrinsic value is usually small.

Fluctuations in stock prices are more often than not overreactions to new information that the market is prone to make. Most of the time, the new information does not change the expectations of a company drastically and the stock price movements can be considered unjustified. This is the case if the stock price is trading close to its original intrinsic value to begin with.

But bear in mind, this works both ways. Stock price pops can also be considered unjustified depending on the situation. As investors, we can use any mispricing of stocks to earn a good long-term return.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have no vested interest in any companies mentioned. Holdings are subject to change at any time.

How To Find The Intrinsic Value of a Stock At Different Points in Time

Intrinsic value is the sum of all future cash flows discounted to the present, but it also changes over the course of time.

A company’s intrinsic value is the value of the sum of future cash flows to the shareholder discounted to the present day. 

But the intrinsic value of a company is not static. It moves with time. The closer we get to the future cash flows, the more an investor should be willing to pay for the company.

In this article, I will run through (1) how to compute the intrinsic value of a company today, (2) how to plot the graph of the intrinsic value, and (3) what to do with intrinsic value charts.

How to calculate intrinsic value

Simply put, intrinsic value is the sum of all future cash flows discounted to the present. 

As shareholders of a company, the future cash flow is all future dividends and the proceeds we can collect when we eventually sell our shares in the company.

To keep things simple, we should assume that we are holding a company to perpetuity or till the business closes down. This will ensure we are not beholden to market conditions that influence our future cash flows through a sale. We, hence, only need to concern ourselves with future dividends.

To calculate intrinsic value, we need to predict the amount of the dividends we will collect and the timing of those dividends.

Once we figure that out, we can discount the dividends to the present day.

Let’s take a simple company that will pay $1 a share for 10 years before closing down. Upon closing, at the end of Year 10, the company pays an additional $5 a share in liquidation proceeds. Let’s assume we want a 10% return. The table below shows the dividend schedule, the value of each dividend discounted to the present day, and the company’s total intrinsic value now.

Year | Dividend | Net present value
Now | $0.00 | $0.00
Year 1 | $1.00 | $0.91
Year 2 | $1.00 | $0.83
Year 3 | $1.00 | $0.75
Year 4 | $1.00 | $0.68
Year 5 | $1.00 | $0.62
Year 6 | $1.00 | $0.56
Year 7 | $1.00 | $0.51
Year 8 | $1.00 | $0.47
Year 9 | $1.00 | $0.42
Year 10 | $6.00 | $2.31
Sum | $15.00 | $8.07

As you can see, we have calculated the net present value of each dividend based on how far in the future we will receive it. The formula for net present value is: Dividend / (1 + 10%)^(years away).

The intrinsic value is the sum of the net present value of all the dividends. The company in this situation has an intrinsic value of $8.07.

Intrinsic value moves

In the above example, we calculated the intrinsic value of the stock today. But intrinsic value moves with time. In a year, we will have collected $1 in dividends, which lowers the remaining intrinsic value. But at the same time, we will be closer to receiving the subsequent dividends. 

The table below shows the intrinsic value immediately after collecting our first dividend in year 1.

Year | Dividend | Net present value
Now | $0.00 | $0.00
Year 1 | $1.00 | $0.91
Year 2 | $1.00 | $0.83
Year 3 | $1.00 | $0.75
Year 4 | $1.00 | $0.68
Year 5 | $1.00 | $0.62
Year 6 | $1.00 | $0.56
Year 7 | $1.00 | $0.51
Year 8 | $1.00 | $0.47
Year 9 | $6.00 | $2.54
Sum | $14.00 | $7.88

There are a few things to take note of.

First, the sum of the remaining dividends left to be paid has dropped to $14 (from $15) as we have already collected $1 worth of dividends.

Second, the intrinsic value has now dropped to $7.88. 

We see that there are two main effects of time.

First, it allowed us to collect our dividend payment of $1, reducing the dividends left to be paid; that has a negative impact on the remaining intrinsic value of the stock. Second, we are now closer to receiving the future dividends; for instance, the big payout in Year 10 is now just nine years away.

The net effect is that the intrinsic value dropped to $7.88. We can repeat this exercise year after year and plot the company’s intrinsic value over time.

Notice that although intrinsic value has dropped, investors still earn a 10% rate of return once the dividend collected is included: $7.88 plus the $1 dividend equals $8.88, which is exactly 10% more than $8.07.
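As a check: one year of holding should deliver exactly the discount rate, since next year’s value plus the dividend collected equals this year’s value compounded at 10%. A sketch, reusing the schedules above:

```python
def intrinsic_value(dividends, rate=0.10):
    # dividends[t] is the payout received t years from now (index 0 = today).
    return sum(d / (1 + rate) ** t for t, d in enumerate(dividends))

v_now = intrinsic_value([0.0] + [1.0] * 9 + [6.0])  # 8.07 today
v_yr1 = intrinsic_value([0.0] + [1.0] * 8 + [6.0])  # 7.88 just after the Year 1 dividend

total_return = (1.0 + v_yr1) / v_now - 1  # dividend collected + new intrinsic value
print(round(total_return, 4))  # 0.1 -- exactly the 10% required return
```

This identity holds for any dividend schedule, which is why a falling intrinsic value is not a loss for the holder.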

When a stock doesn’t pay a dividend for years

Oftentimes a company may not pay a dividend for years. Think of Berkshire Hathaway, which has not paid a dividend in decades. 

Berkshire’s intrinsic value still moves with time as we get closer to the eventual dividend payments. In this scenario, the intrinsic value simply rises as we get closer to collecting, with no offsetting reduction from dividends paid out yet.

Take, for example, a company that will not pay a dividend for 10 years. After that, it distributes $1 per share for the next 10 years before closing down and paying $5 a share in liquidation value. 

Year | Dividend | Net present value
Now | $0.00 | $0.00
Year 1 | $0.00 | $0.00
Year 2 | $0.00 | $0.00
Year 3 | $0.00 | $0.00
Year 4 | $0.00 | $0.00
Year 5 | $0.00 | $0.00
Year 6 | $0.00 | $0.00
Year 7 | $0.00 | $0.00
Year 8 | $0.00 | $0.00
Year 9 | $0.00 | $0.00
Year 10 | $0.00 | $0.00
Year 11 | $1.00 | $0.35
Year 12 | $1.00 | $0.32
Year 13 | $1.00 | $0.29
Year 14 | $1.00 | $0.26
Year 15 | $1.00 | $0.24
Year 16 | $1.00 | $0.22
Year 17 | $1.00 | $0.20
Year 18 | $1.00 | $0.18
Year 19 | $1.00 | $0.16
Year 20 | $6.00 | $0.89
Sum | $15.00 | $3.11

The intrinsic value of such a stock is around $3.11 at present. But in a year’s time, as we get closer to future dividend payouts, the intrinsic value will rise. 

A simple way of thinking about it is that in a year’s time, the intrinsic value will have risen 10%, matching our 10% discount rate, or required rate of return. As such, the intrinsic value will be $3.42 in one year ($3.11 × 1.1). It will continue to rise 10% each year until we receive our first dividend payment in Year 11.

The intrinsic value curve will look like this for the first 10 years:

The intrinsic value is a smooth curve for stocks that do not yet pay a dividend.

Using intrinsic value charts

Intrinsic value charts can be useful in helping investors see whether a stock is under- or overvalued based on their required rate of return.

Andrew Brenton, CEO of Turtle Creek Asset Management, whose main fund has produced a 20% annualised return since 1998 (as of December 2022), uses his estimates of intrinsic value to make portfolio adjustments.

If a stock’s price rises above his estimate of intrinsic value, the stock will not be able to earn his required rate of return. In that case, he lowers his portfolio weighting of the stock, and vice versa.

While active management of the portfolio using this method can be rewarding as in the case of Turtle Creek, it is also fairly time-consuming.

Another way to use intrinsic value charts is to ensure you are getting a good entry price for your stock. If a stock trades at a price above your calculated intrinsic value, it may not be able to achieve your desired rate of return.

Final thoughts

Calculating the intrinsic value of a company can help investors achieve their return goals and ensure that they maintain discipline when investing in a company.

However, there are limitations. 

For one, intrinsic value calculations require an accurate projection of future payments to the shareholder. In many cases, such predictions are hard to make with accuracy and confidence; we simply have to rely on our best judgement.

We are also limited by the fact that we may not hold a stock in perpetuity, or until its natural end of life and liquidation. If we need to sell the stock prematurely, we are beholden to market conditions at the time of the sale.

It is also important to note that intrinsic value is not the same for everyone. I may be willing to attribute a higher intrinsic value to a company if my required rate of return is lower than yours. So each individual investor has to set his own target return to calculate intrinsic value.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I do not have a vested interest in any stocks mentioned. Holdings are subject to change at any time.

What’s Your Investing Edge?

What’s your investing edge? That’s the question many investors find themselves asking when building a personal portfolio. Here are some ways to gain an edge.

Warren Buffett probably has the most concise, and perhaps the best, explanation of how to value a stock. He said: “Intrinsic value can be defined simply: it is the discounted value of the cash that can be taken out of a business during its remaining life.”

This is how all stocks should theoretically be valued.  In a perfect market where cash flows are certain and discount rates remain constant, all stocks should provide the same rate of return. 

But this is not the case in the real world. Stocks produce varying returns, allowing investors to earn above-average returns. 

Active stock pickers have developed multiple techniques to try to obtain these above-average returns to beat the indexes. In this article, I’ll go through some investing styles, why they can produce above-average returns, and the pros and cons of each style.

Long-term growth investing

One of the more common approaches today is long-term growth investing. But why can long-term growth investing outperform the market?

The market underestimates the growth potential

One reason is that market participants may underestimate the pace or durability of the growth of a company. 

Investors may not be comfortable projecting that far into the future; they are often only willing to underwrite growth over the next few years and may assume that high growth fades beyond that.

While that is true for most companies, there are high-quality companies that are exceptions. If investors can find companies that beat the market’s expectations, they can achieve better-than-average returns when the growth materialises. The chart below illustrates how investors can potentially make market-beating returns.

Let’s say the average market’s required rate of return is 10%. The line at the bottom is what the market thinks the intrinsic value is based on a 10% required return. But the company exceeds the market’s expectations, resulting in the stock price following the middle line instead and a 15% annual return.
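The arithmetic behind that chart can be sketched quickly. In the sketch below, the starting price of $100 and the 10-year horizon are illustrative assumptions, not figures from the chart:

```python
# A stock priced for a 10% annual return versus a business that
# actually compounds value at 15% a year. Starting price ($100)
# and horizon (10 years) are illustrative assumptions.
start_price = 100.0
years = 10

market_expectation = start_price * 1.10 ** years  # the market's 10% path
realised_value = start_price * 1.15 ** years      # the business's 15% path

print(round(market_expectation, 2))  # 259.37
print(round(realised_value, 2))      # 404.56
```

An investor who bought at the market’s price and held while the stronger growth materialised would earn roughly 15% a year rather than 10%.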

The market underwrites a larger discount rate

Even if the market has high expectations for a company’s growth, the market may want a higher rate of return as the market is uncertain of the growth playing out. The market is only willing to pay a lower price for the business, thus creating an opportunity to earn higher returns.

The line below shows what investors can earn: more than the 10% return they would have received if the market had been more confident about the company.

Deep value stocks

Alternatively, another group of investors may prefer to invest in companies whose share prices are below their intrinsic values now. 

Rather than looking at future intrinsic values and waiting for the growth to play out, some investors simply opt to buy stocks trading below their intrinsic values and hope that the gap closes. The chart below illustrates how this works.

The black line is the intrinsic value of the company based on a 10% required return. The beginning of the red line is where the stock price is at. The red line is what investors hope will happen over time as the stock price closes the gap with its intrinsic value. Once the gap closes, investors then exit the position and hop on the next opportunity to repeat the process.

Pros and cons

All investing styles have their own pros and cons. 

  1. Underappreciated growth
    For long-term investing in companies with underappreciated growth prospects, investors need to be right about the future growth of the company. To do so, investors must have a keen understanding of the business’s background, growth potential, and competition, the probability that the growth plays out, and why the market may be underestimating it.

This requires in-depth knowledge of the company and requires conviction in the management team being able to execute better than the market expects of them.

  2. Underwriting larger discount rates
    For companies that the market has high hopes for but is only willing to underwrite a larger discount rate due to the uncertainty around the business, investors need to also have in-depth knowledge of the company and have more certainty than the market that the growth will eventually play out.
    Again, this may require a good grasp of the business fundamentals and the probability of the growth playing out.
  3. Undervalued companies
    Investors who invest in companies based on valuations being too low now also need a keen understanding of the business. Opportunities can arise from short-term misconceptions about a company, but investors must have a view of the company that is differentiated from the rest of the market’s.
    A near-term catalyst is often required for the market to realise the discrepancy. A catalyst can come in the form of dividend increases, or management unlocking shareholder value through spin-offs and the like. This style of investing often requires more hard work, as investors need to identify where the catalyst will come from. Absent a catalyst, the stock may remain undervalued for long periods, resulting in less-than-optimal returns. In addition, new opportunities need to be found after each exit.

What’s your edge?

Active fundamental investors can use many different styles to try to beat the market. While each style has its limitations, done correctly, all of these techniques can achieve market-beating returns over time.


Forget Profits or Free Cash Flow – Dividends Are What Really Matters!

Profits and free cash flow are nice metrics for a company to have. But they may be reinvested. What really matters is the cash that can eventually be distributed.

Investors often talk about profits and free cash flow. I’m no exception. If you look at the archive of articles on this blog, you will find that I have written about both of these subjects numerous times.

So why am I saying that profits and free cash flow are not what really matter and that dividends are what ultimately matters most?

Well, that’s because an asset should be valued based on the cash flow that the asset can produce for the asset holder. In the case of stocks, dividends are the only cash flow you receive as a long-term shareholder.

Business profits may not end up in our pockets 

Although profits or free cash flow that a business earns can theoretically be returned to the shareholder, the truth is that, more often than not, they aren’t. Companies may want to retain a portion or all of that cash flow for reinvestment in the business, acquisitions, or buybacks. 

Let’s take a look at a simple example.

Company A is a profitable business. It generates $1 in free cash flow in year one. The company does not want to pay a dividend. Instead, it reinvests that $1 to generate 10% more cash flows the subsequent year. It keeps reinvesting its profits each year for 5 years. Only after Year 5 does Company A decide that it will start to return all its free cash flow to shareholders as dividends. Its free cash flow per year stagnates after Year 5. Here is what Company A’s annual free cash flow and dividend per share look like:

Company B, on the other hand, produces $0 in free cash flow in Years 1 to 5. But in Year 6, it starts to generate $1.61 in free cash flow per share and pays all of that out as dividends each year. Like Company A, its growth stagnates after Year 5.

Here is what Company B’s annual free cash flow and dividend per share look like:

Which company is worth more? Neither. They are worth the same. That is because the cash flow received by the shareholders is equal.
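The equivalence is easy to check: both companies deliver the same dividend stream to shareholders, so discounting at any rate produces the same value. A minimal sketch, assuming a 10% discount rate and a 20-year horizon for illustration:

```python
# Dividends per share: nothing in years 1-5, then $1.61 a year
# from year 6 onwards (20-year horizon assumed for the sketch).
RATE = 0.10
dividends_a = [0.0] * 5 + [1.61] * 15  # Company A
dividends_b = [0.0] * 5 + [1.61] * 15  # Company B

def present_value(dividends, rate=RATE):
    # Discount each year's dividend back to today.
    return sum(d / (1 + rate) ** t for t, d in enumerate(dividends, start=1))

# Identical cash to shareholders means identical value, regardless
# of how the free cash flow looked internally along the way.
print(present_value(dividends_a) == present_value(dividends_b))  # True
```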

Free cash flow and profits do not reflect all costs

If the above example left you slightly confused, maybe you can think of it like this. A company may be generating free cash flow but use all of that cash to grow through acquisitions or to conduct share buybacks. Another company may be using its cash from operations to build more capacity to drive growth. The cash spent here is capital expenditure, which lowers free cash flow*.

The first company may appear to be generating a lot of free cash flow but that cash is being spent on buybacks and acquisitions. The second company has no free cash flow but that’s because its investments are deducted before calculating free cash flow. Both these companies end up with no cash that year that can be returned to shareholders even though one is generating free cash flow and the other one is not. The difference lies in where these expenses/investments are recorded.

Capital expenses are deducted in the calculation of free cash flow, but cash acquisitions of other companies and buybacks usually are not. Similarly, a company that is spending heavily on marketing for growth may show no operating cash flow at all, and consequently no free cash flow. Ultimately, it does not matter how the company invests or whether free cash flow appears on the financial statements. What really matters is how much cash the company can eventually return to shareholders as dividends, now or in the future.

Although it is true that dividends will eventually come from the free cash flow that a company produces, it is not always true that the free cash flow produced in any given year will lead to dividends.

A brief comment on buybacks

This discussion would not be complete without a short discussion on where buybacks fit into the grand scheme of things. Companies often declare that they have “returned” cash to shareholders through buybacks. 

However, this cash is only returned to shareholders who actually sell their stock to the company. What do long-term shareholders who do not sell their shares to the company get? They certainly do not receive any cash. 

I count buybacks as a form of investment that the company makes. Buybacks increase a company’s free cash flow per share by reducing the outstanding share count. Long-term shareholders benefit as future dividends are now split among fewer shares.
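A toy example of that per-share effect (all figures here are assumptions for illustration): suppose a company will eventually distribute $100 in total dividends and buys back 10 of its 100 shares.

```python
# The same pool of future dividends split across fewer shares.
total_future_dividends = 100.0
shares_before_buyback = 100
shares_after_buyback = 90  # the company repurchased 10 shares

per_share_before = total_future_dividends / shares_before_buyback
per_share_after = total_future_dividends / shares_after_buyback

print(round(per_share_before, 2))  # 1.0
print(round(per_share_after, 2))   # 1.11
```

Each remaining share’s claim on future dividends rises by about 11%, which is the sense in which a buyback is an investment rather than a cash return.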

Given this, I do not count buybacks as cash that is “returned” to the long-term shareholder. Instead, I count it as an investment that drives free cash flow per share growth, and eventually, dividend per share growth.

What ultimately matters to long-term shareholders is, hence, dividends. Dividends are the only cash flow that a long-term shareholder receives, and this is what should drive the value of the stock.

Final word

Don’t get me wrong. I’m not saying that investors should only invest in companies that are paying dividends. Far from it. I personally have a vested interest in many companies that currently don’t pay a dividend.

However, as a long-term shareholder, I’m cognizant of the fact that the value of the stock is dependent on the dividends that the company will pay eventually. Companies that don’t pay a dividend now or even in the near future can still be valuable if they ultimately start paying dividends.

And while cash flows and profits may not always result in dividends, they are the source from which dividends are ultimately paid. As such, it is still important to keep in mind the future cash-generative profile of a company that will ultimately lead to dividend payments.

*Free cash flow is usually calculated as operating cash flow minus any capital expenses such as the purchase of property, plant and equipment or capitalised software costs

