More Of The Latest Thoughts From American Technology Companies On AI (2024 Q1)

A collection of quotes on artificial intelligence, or AI, from the management teams of US-listed technology companies in the 2024 Q1 earnings season.

Last month, I published The Latest Thoughts From American Technology Companies On AI (2024 Q1). In it, I shared commentary from the 2024 first-quarter earnings conference calls of technology companies that I follow or have a vested interest in, where their leaders discussed AI and how the technology could impact their industries and the business world writ large.

A few more technology companies I’m watching hosted earnings conference calls for 2024’s first quarter after I prepared that article. The leaders of these companies also had insights on AI that I think would be useful to share. This is an ongoing series, and the older commentary can be found in my earlier articles.

Here they are, in no particular order:

Adobe (NASDAQ: ADBE)

Adobe’s management thinks that creativity is a human trait and that AI assists and amplifies human ingenuity without replacing it

Adobe’s highly differentiated approach to AI is rooted in the belief that creativity is a uniquely human trait and that AI has the power to assist and amplify human ingenuity and enhance productivity.

Adobe’s Firefly generative AI models within its Creative Cloud suite were trained on proprietary data; Adobe’s management has infused AI functionality into its flagship products within the Creative Cloud suite; management has built Adobe Express as an AI-first application; Firefly has generated over 9 billion images since its launch in March 2023 (was 6.5 billion in 2023 Q4); customers are excited about the commercial safety of Firefly; Firefly Services can create significantly more asset variations in a much shorter time, and the speed enables Adobe to monetise Firefly through the volume of content created; Firefly generations in May 2024 were the highest of any month to date; Firefly Services has started to see customer wins; Firefly Services allows users to build custom models and access APIs, and these are early in adoption, but customer interest is better than expected; early Firefly Services usage is on (1) creating multiple variations in the ideation process, (2) creating geography-based variations on ads, (3) assets for community engagement

In Creative Cloud, we’ve invested in training our Firefly family of creative generative AI models with a proprietary data set and delivering AI functionality within our flagship products, including Photoshop, Illustrator, Lightroom, and Premiere. We’re reimagining creativity for a broader set of customers by delivering Adobe Express as an AI-first application across the web and mobile surfaces. Since its debut in March 2023, Firefly has been used to generate over 9 billion images across Adobe creative tools…

…This week’s Design Made Easy event, which focused on Express for Business, was another big step forward for us. Companies of all sizes are excited about the integrated power and commercial safety of Firefly, the seamless workflows with Photoshop, Illustrator and Adobe Experience Cloud, and enterprise-grade brand controls that are now part of Express for Business, making it the optimal product for marketing, sales and HR teams to quickly and easily create visual content to share…

… Firefly Services can power the creation of thousands of asset variations in minutes instead of months and at a fraction of the cost. This allows us to monetize the volume of content being created through automation services. The increasing availability of Firefly in Creative Cloud, Express, Firefly Services and the web app is giving us opportunities to access more new users, provide more value to existing users and monetize content automation. These integrations are driving the acceleration of Firefly generations with May seeing the most generations of any month to date…

…On the models, we released Firefly Services. We’ve started to see some customer wins in Firefly Services. So they’re using it for variations, and these are the custom models that we’re creating as well as access to APIs. I would say that’s early in terms of the adoption, but the interest as customers say how they can ingest their data into our models as well as custom models, that’s really ahead of us, and we expect that to continue to grow in Q3 and Q4…

…In terms of what I would say we’re seeing usage of, I think the initial usage of Firefly Services in most companies was all around ideation, how can they create multiple variations of them and in the ideation process really just accelerate that ideation process? Most companies are then starting with as they’re putting it into production, how can they, with the brand assets and the brand guidelines that they have, do this in terms of the variations, whether they be geographic variations or they be just variations. I mean, if you take a step back also, every single ad company right now will tell you that the more variance that you provide, the better your chances are of appropriately getting an uplift for your media spend. So I would say that most companies are starting with creating these variations for geographies. The other one that we see a fair amount of is engaging with their communities. So when they want their communities to have assets that they have blessed for usage within community campaigns, that’s the other place where Firefly Services are being used. And a company has a community portal where the community can come in, take something and then post whether it’s on whatever social media site that you want. 

Adobe’s management has introduced Acrobat AI Assistant, an AI-powered tool for users to have conversations with their documents, within Adobe’s Document Cloud suite; Acrobat AI Assistant features are available as a standalone offer or as an add-on subscription to existing Adobe products; Acrobat AI Assistant for English documents was made generally available in April; management is seeing early success in adoption of Acrobat AI Assistant; Acrobat AI Assistant can be applied to document types beyond PDFs

In Document Cloud, we’re revolutionizing document productivity with Acrobat AI Assistant, an AI-powered conversational engine that can easily be deployed in minutes. This enhances the value of the trillions of PDFs, which hold a significant portion of the world’s information. Acrobat AI Assistant features are now available through an add-on subscription to all Reader and Acrobat enterprise and individual customers across desktop, web and mobile…

The introduction of Acrobat AI Assistant, made generally available in April for English documents, marks the beginning of a new era of innovation and efficiency for the approximately 3 trillion PDFs in the world. Acrobat AI Assistant is empowering everyone to shift from reading documents to having conversations with them in order to summarize documents, extract insights, compose presentations and share learnings. AI Assistant is available as a stand-alone offer for use in Reader and as an add-on to Acrobat Standard and Pro. We’re seeing early success driving adoption of AI Assistant as part of our commerce flows and remain optimistic about the long-term opportunities…

…Other business highlights include general availability of Acrobat AI Assistant support for document types beyond PDF, meeting transcripts and enterprise requirements. 

The Adobe Experience Platform, which is part of the Digital Experience segment, is on track to become a billion-dollar annual revenue business; management has released AEP (Adobe Experience Platform) AI Assistant to improve the productivity of marketing professionals; Adobe is the #1 digital experience platform; customer interest in, and adoption of, AEP AI Assistant is strong

At the end of May, we celebrated the 5-year anniversary of Adobe Experience Platform, which we conceived and built from scratch and which is on track to be the next billion-dollar business in our Digital Experience portfolio. We released AEP AI Assistant to enhance the productivity of marketing practitioners through generative AI while expanding access to native AEP applications…

When we introduced Adobe Experience Platform 5 years ago, it was a revolutionary approach to address customer data and journeys. Today, we’re the #1 digital experience platform and AEP with native apps is well on its way to becoming a billion-dollar business…

…We are excited by the customer interest and adoption of our latest innovations, including AEP AI Assistant, a generative AI-powered conversational interface that empowers practitioners to automate tasks, simulate outcomes and generate new audiences and journeys. For example, customers like General Motors and Hanesbrands have been working with AEP AI Assistant to boost productivity and accelerate time to value while democratizing access to AEP and apps across their organizations…

…When you think about the AEP AI Assistant, it’s doing a couple of things. One, it’s really making it easier for customers to deploy use cases. When you think of use cases that they have around, for example, generating audiences and running campaigns around those audiences, these are things today that require some data engineering. They require the ability to put these audiences together. So they require marketing and IT teams to work together. The AEP AI Assistant is making it much easier for marketers to be able to do it themselves and be able to deploy a lot more use cases.

Adobe’s management’s vision for Adobe Express is to make design easy; the launch of the new Adobe Express app in 2024 Q1 (FY2024 Q2) has been well received, with monthly active users doubling sequentially; management has been deeply integrating AI features into Adobe Express; cumulative exports from Adobe Express have increased by 80% year-on-year in 2024 Q1; management is building Adobe Express to be AI-first; management thinks Adobe Express is leveraging people’s need for AI

Our vision for Adobe Express is to provide a breakthrough application to make design easy for communicators worldwide, leveraging generative AI and decades of Adobe technology across web and mobile. Our launch of the all-new Express application on iOS and Android earlier this quarter is off to a strong start with monthly active users doubling quarter-over-quarter…

There’s a lot of buzz with Express here at Adobe coming off the event we just had earlier this week, but it’s really based on the fact that the innovation in Express is on a tear, right? A few months ago, we introduced an all-new Express for the web. This quarter, we introduced an all-new Express for mobile. We introduced Express for Business. We also now have, as we’ve just talked about, been more deeply integrating AI features, whether it’s for imaging generation or Generative Fill or text effects, character animation, design generations, more deeply into the flow for Express. And that combination has led to an incredible set of metrics over the last quarter, in particular, but building throughout the year. Express MAU is growing very quickly. We talked about in the script earlier that MAU on mobile has more than doubled quarter-over-quarter, which is fantastic to see. And cumulative exports, if you look at year-over-year, has grown by over 80%. So really feeling good about sort of the momentum we’re seeing…

Express that is now in market is built on a brand-new platform, right? And that brand-new platform lays the groundwork for the AI era. And this will be — Express will be the place that anyone can come and create through a combination of conversational and standard inputs. That’s the vision that we have. And I think it’s an opportunity for us to really leap forward in terms of what we can do on the web and mobile at Adobe…

Express is really being driven by sort of the need for AI and how people are able to describe what they want and get the final output. When David talked about exports, just to clarify, what that means is people who have successfully got what they want to get done, done. And that’s a key measure of how we are doing it, and AI is certainly facilitating and accelerating that.

Adobe GenStudio uses AI to help enterprises transform their content supply chain; enterprise customers view customer experience management and personalisation at scale as key investments to make

We’re now transforming the content supply chain for enterprises with Adobe GenStudio, enabling them to produce content at scale, leveraging generative AI through native integrations with Firefly Services and Adobe Express for Business. Enterprise customers, both B2C and B2B, view customer experience management and personalization at scale as key areas of differentiation, making it a priority investment for Chief Marketing Officers, Chief Information Officers and Chief Digital Officers.

Adobe’s management thinks the biggest opportunity in AI for Adobe is in interfaces, such as performing tasks faster, improving workflows, and so on; in AI interfaces, management is seeing significant usage in AI Assistant and Photoshop; management believes that (1) the real benefits from disruptive technologies such as AI come when people use interfaces to improve their work, and that (2) in the future, more people will be using these interfaces

I think the biggest opportunity for us and why we’re really excited about GenAI is in the interfaces because that’s the way people derive value, whether it’s in being able to complete their tasks faster, whether it’s be able to do new workflows. And I would say, in that particular space, Acrobat has really seen a significant amount of usage as it relates to AI Assistant and Photoshop…

… And so we’re always convinced that when you have this kind of disruptive technology, the real benefits come when people use interfaces to do whatever task they want to do quicker, faster and when it’s embedded into the workflows that they’re accustomed to because then there isn’t an inertia associated with using it…

And so net-net, I am absolutely betting on the fact that 5 years from now, there’ll be more people saying, “I’m using creative tools to accomplish what I want,” and there’ll be more marketers saying, “I can now, with the agility that I need, truly deliver a marketing campaign in an audience that’s incredibly more specific than I could in the past.” And that’s Adobe’s job to demonstrate how we are both leading in both those categories and to continue to innovate.

Adobe’s management’s primary focus for generative AI is still on user adoption and proliferation

From the very beginning, we’ve talked to you guys about our primary focus for generative AI is about user adoption and proliferation, right? And that has continued to be the primary thing on our mind.

Adobe’s management thinks there are different routes to monetise AI, such as winning new users, and getting higher ARPU (average revenue per user)

And to your point, there are many different ways that we can monetize this. First is as you think about the growth algorithms that we always have in our head, it always starts with, as Shantanu said, new users, right? And then it’s about getting more value to existing users at higher ARPU, right? So in the context of new users, first and foremost, we want to make sure that everything we’re doing with generative AI is embedded in our tools, starting with Express, right?

Adobe has seen strong growth in emerging markets because users need access to the cloud for all of the AI functionality

I mean I think in the prepared remarks, Dan also talked about the strength in emerging markets. And I think the beautiful part about AI is that since they need access to the cloud to get all of the AI functionality, emerging market growth has been really strong for us.

Adobe’s management thinks that they have hit a sweet spot with pricing for generative AI credits in Adobe’s subscription plans for imaging and vector work, but they will need to explore different plans for generative AI credits when it comes to video work

When we think about what we’ve done with imaging and video, we’ve done the right thing by making sure the higher-value paid plans that people don’t have to think about the amount of generative capability. And so there, the balance between for free and trialist users, they’re going to run into the generative capability limits and therefore, have to subscribe. But for the people who actually have imaging and vector needs, that they’re not constantly thinking about generative, I think we actually got it right. To your point, as we move to video, expect to see different plans because those plans will, by necessity, take into account the amount of work that’s required to do video generation. So you’re absolutely right as a sort of framework for you to think about it.

Adobe’s management thinks that there’s a lot of excitement now on AI infrastructure and chips, but the value of AI will need to turn to inference in order for all the investment in AI infrastructure and chips to make sense

It’s fair to say that the interest that exists right now from investors, as it relates to AI, is all associated with the infrastructure and chips and perhaps rightly so because that’s where everybody is creating these models. They’re all trying to train them. And there’s a lot of, I think, deserved excitement associated with that part of where we are in the evolution of generative AI. If the value of AI doesn’t turn to inference and how people are going to use it, then I would say all of that investment would not really reap the benefit in terms of where people are spending the money.

Adobe’s management thinks it doesn’t matter what AI model is used to generate content – DALL-E, Firefly, Midjourney, or more – because the content ultimately needs to be edited on Adobe’s software; management is building products on Firefly, but they are also happy to leverage third-party AI models

So Firefly might be better at something. Midjourney might be better at something else. DALL·E might do something else. And the key thing here is that, around this table, we get excited when models innovate. We get excited when Firefly does something amazing. We get excited when third-party models do something because our view, to Shantanu’s point, is that the more content that gets generated out of these models, the more content that needs to be edited, whether it’s color correction, tone matching, transitions, assembling clips or masking compositing images. And the reason for this is that this is not a game where there’s going to be one model. There’s — each model is going to have its own personality, what it generates, what it looks like, how fast it generates, how much it costs when it generates that, and to have some interface layer to help synthesize all of this is important. And so just sort of to note, we’ve said this before but I’ll say it again here, you will see us building our products and tools and services leveraging Firefly for sure, but you’ll also see us leveraging best-of-breed personalities from different models and integrate them all together.

An analyst thinks that, ultimately, generative AI is going to create more growth in Adobe’s category, just as the pivot to cloud did

[Analyst] Awesome, the message here is that GenAI is going to create more growth in the category. And Shantanu, you did that with the pivot to cloud. You grew the category, so here we go again.

DocuSign (NASDAQ: DOCU)

DocuSign Navigator is a new AI-powered product that allows users to store and manage their entire library of accumulated agreements, including non-DocuSign agreements

Second, DocuSign Navigator allows you to store, manage and analyze the customer’s entire library of accumulated agreements. This includes past agreements signed using DocuSign eSignature as well as non-DocuSign agreements. Navigator leverages AI to transform unstructured agreements into structured data, making it easy to find agreements, quickly access vital information, and gain valuable insights from agreements. 

DocuSign acquired Lexion, an AI-based agreements company, this May; management thinks Lexion can improve DocuSign’s agreement AI and legal workflow; the Lexion acquisition is not for revenue growth, but to integrate the AI technology into DocuSign’s products

AI is central to our platform vision, and we’re thrilled to welcome Lexion to the DocuSign family. Lexion is a proven leader in AI-based agreement technology, which significantly accelerates our IAM platform goals. We maintain a high bar for acquisitions, and Lexion stood out due to its sophisticated AI capabilities, compatible technology architecture, and promising commercial traction with excellent customer feedback, particularly in the legal community…

… With regard to capital allocation, we also closed the Lexion acquisition on May 31…

In terms of how it adds to DocuSign, I think overall, agreement AI, their extraction quantity and quality where we augment our platform. Another area where I think they’re really market-leading is in legal workflow. So workflow automation for lawyers, for example, if you’re ingesting a third-party agreement, how can you immediately use AI to assess the agreement, understand how terms may deviate from your standard templates and highlight language that you might want to propose as a counter that really accelerates productivity for legal teams. And they’ve done an excellent job with that. So overall, that’s how it fits in…

We’re not breaking it out just because of its size and materiality. It’s not material to revenue or op margin for us. The overarching message that I would like to send on Lexion is that the purchase of Lexion is about integrating the technology into the DocuSign IAM platform. That opportunity for us, we think, in the long term, can apply to the well over 1 million customers that we have.

MongoDB (NASDAQ: MDB)

MongoDB’s management wants to prioritise investments in using generative AI to modernise legacy relational applications; management has found that generative AI can help with analyzing existing code, converting existing code and building unit and functional tests, resulting in a 50% reduction in effort for app modernisation; management sees a growing list of customers across industries and geographies who want to participate; interest in modernising legacy relational applications is high, but it’s still early days for MongoDB

Second, we are more optimistic about the [ opti-tech ] to accelerate legacy app modernization using AI. This is a large segment of the market that has historically been hard to penetrate. We recently completed the first 2 GenAI powered modernization pilots, demonstrating we can use AI to meaningfully reduce the time, cost and risk of modernizing legacy relational applications. In particular, we see that AI can significantly help with analyzing existing code, converting existing code and building unit and functional tests. Based on our results from our early pilots, we believe that we may be able to reduce the effort needed for app modernization by approximately 50%. We have a growing list of customers across different industries and geos, who want to participate in this program. Consequently, we will be increasing our level of investment in this area…

…We have an existing Relational Migrator product that allows people to essentially migrate data from legacy relational databases and does the schema mapping for them. The one thing it does not do, which is the most cumbersome and tedious part of the migration is to auto generate or build application code. So when you go from a relational app to an app built on MongoDB, you still have to essentially rewrite the application code. And for many customers, that was the inhibitor for them to migrate more apps because that takes a lot of time and a lot of labor resources. So our app modernization effort is all about or using AI is all about now solving the third leg of that stool, which is being able to reduce the time and cost and effort of rewriting the app code, all the way from analyzing existing code, converting that code to new code and then also building the test suites, both unit tests and functional tests to be able to make sure the new app is obviously operating and functioning the way it should be…

…That’s why customers are getting more excited because the lower you reduce the cost for that migration or the switching costs, the more apps you can then, by definition, migrate. And so that is something that we are very excited about. I will caution you that it’s early days. You should not expect some inflection in the business because of this. 

MongoDB’s management wants to prioritise investments in building an ecosystem for customers to build AI-powered applications because management recognises that there are other critical elements in the AI tech stack beyond MongoDB’s document-based database; management has launched the MongoDB AI Application Program, or MAP, that combines cloud computing providers, model providers, and more; Accenture is the first global systems integrator to join MAP

Third, although still early in terms of customers building production-ready AI apps, we want to capitalize on our inherent technical advantages to become a key component of the emerging AI tech stack…

Recognizing there are other critical elements of the AI tech stack, we are leveraging partners to build an ecosystem that will make it easier for customers to build AI-powered applications. Earlier this month, we launched the MongoDB AI Application Program, or MAP, a first-of-its-kind collaboration that brings together all 3 hyperscalers, foundation model providers, generative AI frameworks, orchestration tools and industry-leading consultancies. With MAP, MongoDB offers customers reference architectures for different AI use cases, prebuilt integrations and expert professional services to help customers get started quickly. Today, we are announcing that Accenture is the first global systems integrator to join MAP and that it will establish a center of excellence focused on MongoDB projects. We will continue to expand the program through additional partnerships and deeper technical integrations.

MongoDB’s document-based database architecture is a meaningful differentiator in AI because AI use cases involve various types of data, which are incompatible with legacy databases; there was a customer who told management that if he were to design a database specifically for AI purposes, it would be exactly like MongoDB

Customers tell us that our document-based architecture is a powerful differentiator in an AI world, as the most powerful use cases rely on data of different types and structures such as text, image, audio and video. The flexibility required to handle a variety of different data structures is fundamentally at odds with legacy databases that rely on rigid schemas, which is what makes MongoDB’s document model such a good fit for these AI workloads…

…One customer told us if he had to build a database, it would be designed exactly like MongoDB and so for this new AI era. And so we feel really good about our position. 
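(To make the document-model point above a little more concrete, here is a minimal sketch I put together; it is not from MongoDB or any of its customers, and the connection string, collection, and field names are all hypothetical. It assumes the official pymongo driver and simply shows that two differently shaped records, one of them holding an embedding vector alongside text, can live in the same collection with no schema changes.)

```python
# Toy illustration of document-model flexibility: two documents with different
# shapes sit in the same collection without any schema migration.
from pymongo import MongoClient

client = MongoClient("mongodb://localhost:27017")  # hypothetical local instance
products = client["demo"]["catalog"]               # hypothetical database/collection

products.insert_many([
    {
        "sku": "A-100",
        "description": "Lightweight trail running shoe",
        "description_embedding": [0.12, -0.08, 0.44],  # truncated toy vector for semantic search
    },
    {
        "sku": "B-200",
        "image_url": "https://example.com/b200.jpg",
        "image_labels": ["boot", "leather"],
        "reviews": [{"rating": 5, "text": "Great fit"}],
    },
])
```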

Toyota Connected, an independent Toyota company focused on AI and data science, migrated to MongoDB Atlas after experiencing reliability issues with its original legacy database system; Toyota Connected now uses MongoDB Atlas for over 150 micro-services and will use Atlas as its database of choice for future AI needs

Toyota Connected, an independent Toyota company focused on innovation, AI, data science, and connected intelligence services, migrated to MongoDB Atlas after experiencing reliability issues with the original legacy database system. The team selected MongoDB Atlas for its ease of deployment, reliability and multi-cloud and multi-region capabilities. Toyota Connected now uses Atlas for over 150 micro-services. Their solution benefits from 99.99% uptime with Atlas as a platform for all data, including mission-critical vehicle telematics and location data needed for emergency response services. MongoDB is Toyota Connected’s database of choice for all future services as they explore vector and AI capabilities, knowing they’ll get the reliability and scalability they need to meet customer needs.

Novo Nordisk is using MongoDB Atlas Vector Search to power its generative AI efforts in producing drug development reports; Novo Nordisk switched from its original relational database when it wasn’t capable of handling complex data and lacked the flexibility to keep up with rapid feature development; reports that used to take Novo Nordisk 12 weeks to prepare can now be completed in 10 minutes with MongoDB Atlas Vector Search

By harnessing GenAI with MongoDB Atlas Vector Search, Novo Nordisk, one of the world’s leading health care companies, is dramatically accelerating how quickly it can get new medicines approved and delivered to patients. The team responsible for producing clinical study reports turned to Atlas when the original relational database wasn’t capable of handling complex data and lacked the flexibility needed to keep up with the rapid feature development. Now with GenAI and the MongoDB Atlas platform, Novo Nordisk gets the mission-critical assurances it needs to run highly regulated applications, enabling it to generate complete reports in 10 minutes rather than 12 weeks.
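(For readers curious what querying MongoDB Atlas Vector Search looks like in practice, below is a minimal sketch I wrote; it is not Novo Nordisk’s implementation, and the cluster URI, index name, field names, and query vector are all hypothetical. It assumes an Atlas cluster with a vector search index already defined and uses MongoDB’s $vectorSearch aggregation stage via the pymongo driver.)

```python
# Hypothetical sketch of a similarity query against MongoDB Atlas Vector Search.
# Index name, field names, and the query vector are illustrative only.
from pymongo import MongoClient

client = MongoClient("mongodb+srv://<user>:<password>@<cluster>.mongodb.net")  # placeholder Atlas URI
docs = client["clinical"]["study_documents"]  # hypothetical database/collection

query_vector = [0.02, -0.11, 0.37]  # in practice, produced by an embedding model

results = docs.aggregate([
    {
        "$vectorSearch": {
            "index": "document_embeddings",  # assumed Atlas vector search index name
            "path": "embedding",             # field that stores each document's vector
            "queryVector": query_vector,
            "numCandidates": 100,            # breadth of the approximate nearest-neighbour search
            "limit": 5,                      # number of top matches to return
        }
    },
    {"$project": {"title": 1, "score": {"$meta": "vectorSearchScore"}}},
])

for doc in results:
    print(doc)
```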

MongoDB’s management still sees MongoDB as well-positioned to be a key beneficiary when organisations embed AI into next-gen software applications

Our customers recognize that modernizing legacy applications is no longer optional in the age of AI, and are preparing for a multiyear journey to accomplish that goal. They see MongoDB as a key partner in that journey. We are well positioned to be a key beneficiary as organizations embed AI into the next generation of software applications that transform their business.

MongoDB’s management believes that MongoDB’s performance in 2024 Q1 was less upbeat than the cloud computing hyperscalers’ because the hyperscalers’ growth came primarily from reselling GPU (graphics processing unit) capacity for AI training, where demand is currently high, whereas MongoDB’s exposure is to AI apps in production, which it is not yet seeing at scale

In contrast to the hyperscalers, like we believe the bulk of their growth across all 3 hyperscalers was really spent on reselling GPU capacity because there’s a lot of demand for training models. We don’t see a lot of, at least today, a lot of AI apps in production. We see a lot of experimentation, but we’re not seeing AI apps in production at scale. And so I think that’s the delta between the results that the hyperscalers produce versus what we are seeing in our business.

MongoDB’s management thinks that AI is going to drive a step-fold increase in the number of apps and amount of software created, but it’s going to take time, although the process is happening

I think with AI, you’re going to see a stepfold increase in the number of apps and the amount of software that’s being built to run businesses, et cetera. But that’s going to take some time. As with any new adoption cycle, the adoption happens in what people commonly refer to as S curves. And I think we’re going through one of those S curves.

MongoDB’s management sees the possibility of customers’ desire to spend on AI crowding out other software spending, but does not think it is an excuse for MongoDB not meeting new business targets

Is AI essentially crowding out new business? We definitely think that that’s plausible. We definitely see development teams experimenting on AI projects. The technology is changing very, very quickly. But that being said, we don’t see that as a reason for us to not hit our new business targets. And as I said, even though we started slow, we almost caught up at the end of this quarter, and we feel really good about our new business opportunity for the rest of this year. So — so I don’t want to use that as an excuse for us not meeting our new business targets.

Okta (NASDAQ: OKTA)

A new product, Identity Threat Protection with Okta AI, is expected to become generally available soon

We’re also excited about the launch of Identity Threat Protection with Okta AI, which includes powerful features like Universal Logout, which makes it possible to automatically log users out of all of their critical apps when there is a security issue. Think of this as identity threat detection and response for Okta. We expect Identity Threat Protection to become generally available this summer.

Okta’s management does not expect the company’s new products – which includes governance, PAM, Identity Threat Protection with Okta AI, and Identity Security Posture Management – to have material impacts on the company’s financials in FY2025; of the new products, management thinks Identity Threat Protection with Okta AI and Identity Security Posture Management will make impacts first before PAM does

I wouldn’t expect for these newer things that are coming out like posture management or threat protection, I wouldn’t expect it in FY ’25 at all. I probably wouldn’t even think it would impact it in FY ’26 because we’re talking about a $2.5 billion business at this point. It takes a lot of money in any of these products to make a material difference to the overall numbers. So we’re setting these up for the long term…

…How we’re thinking about this internally is that the — I think it will mirror the order of broad enablement. So we’re broadly enabling people in the following order: governance is first, followed by a combination of posture management and identity threat protection, followed by privileged access. So we think that Identity Threat Protection with Okta AI and Identity Security Posture Management, that bundle could pretty quickly have as much of an impact as governance. And then we think the next sequential enablement in the next order of impact will probably be Privileged Access.

Okta’s management is currently not seeing companies changing their software spending plans because they want to invest in AI, although that might change in the future

[Question] There is a shift in the marketplace among the C-suite from fear about the economy to, gee, I need to focus on how I’m going to implement AI. And in that context, there’s uncertainty around the mechanics of what they need to do to secure AI within their organizations. And I guess my question to you is we’re hearing the pipelines of the VAR channels, particularly in security, are extremely robust into the back half of the year. But the uncertainty around AI decision is keeping people from implementing it. So how robust is the pipeline that you’re looking at? And are you, in fact, hearing that from your C-suite customers when you talk to them?

[Answer] What I’ve heard is everyone is figuring out how they can deploy this new wave of technology to their products and services and business and how they can use it for security and how they can use it for innovation. But they’re not at the stage where it’s broadly impacting other plans. It’s more of like a — their planning exercise at this point. I think that might change in the future.

Okta’s management thinks that more companies will invest in AI in the future, and this will be a tailwind for the company because more identity features will be needed; the current AI wave is not impacting spending on Okta at the moment, but might be a boon in the future

My bet is that they’re going to be building new apps. They’re going to be deploying more technology from vendors that are building apps with AI built in, which is going to — all of that’s going to lead to more identity. They’re going to have to log people into their new apps they build. They’re going to have to secure the privileged accounts that are running the infrastructure behind the new apps. They’re going to have to make sure that people in their workforce can get to the apps that are the latest, greatest AI-driven experiences for support or for other parts of the business. So I think that identity is one of these foundational things that’s going to be required whether it’s the AI wave, which is going to be really real and impactful and — or whether it’s whatever comes after that.

[Question] So not impacting spending today but might impact to help it in the future.

[Answer] Yes, yes. That’s how I see it.

Okta’s management sees 2 ways of monetising Okta AI: Through new products, and through making existing products better

Okta AI will be monetized through 2 ways. One will be new products like Identity Threat Protection with Okta AI; and the other way, it will be — it will just make products better. For example, the Identity Security Posture Management, it has a new capability that’s going to be added to that product that’s just going to make it smarter about how it detects service accounts. That Identity Security Posture Management scans a customer’s entire SaaS estate, and says, here are all the things you should look at. You should take — this account needs MFA. This other account is — probably has overly permissive permissions. The challenge there is how does the customer know which of those accounts are service accounts, so they can’t have human biometrics. And we added — we used some AI capability to add that to the scan. So that’s an example of just the product gets better versus Identity Threat Protection is like it’s a whole new product enabled by that.

Salesforce (NYSE: CRM)

Salesforce is managing 250 petabytes of customer data and management thinks this is going to be critical and positions Salesforce for success when Salesforce’s customers move into AI; management thinks that customer data is the critical success factor in AI, not AI models and UIs (user interfaces); management thinks most of the AI models that are being built today, both large and small, are just commodities and will not survive

We’re now managing more than 250 petabytes of data for our customers. This is going to be absolutely critical as they move into artificial intelligence…

…When you look at the power of AI, you realize the models and the UI are not the critical success factors. It’s not critical where the enterprise will transform. There are thousands of these models, some open source and some closed source models, some built with billions, some with just a few dollars, most of these will not survive. They’re just commodities now, and it’s not where the intelligence lies. And they don’t know anything about a company’s customer relationships. Each day, hundreds of petabytes of data are created that AI models can use for training and generating output. But the one thing that every enterprise needs to make AI work is their customer data as well as the metadata that describes the data, which provides the attributes and context the AI models need to generate accurate, relevant output. And customer data and metadata are the new gold for these enterprises…

…Not every company is as well positioned, as you know, for this artificial intelligence capability as Salesforce is because they just don’t have the data. They may say they have this capability or that capability, this user interface, that model, that whatever, all of these things are quite fungible and are expiring quickly as the technology rapidly moves forward. But the piece that will not expire is the data. The data is the permanent key aspect that, as we’ve said, even in our core marketing, it’s the gold for our customers and their ability to deliver our next capability in their own enterprises.

Salesforce’s management is seeing incredible momentum in Data Cloud, which is Salesforce’s fastest-growing organic and next billion-dollar cloud; Data Cloud’s momentum is powered by the need for customers to free their data from being trapped in thousands of apps and silos; the need to free their data is important if Salesforce’s customers want to embrace AI; Data Cloud was in 25% of Salesforce’s >$1 million deals in 2024 Q1; 2024 Q1 was the second quarter in a row when >1,000 Data Cloud customers were added; in 2024 Q1, 8 trillion records were ingested in Data Cloud, up 42% year-on-year, 2 quadrillion records were processed, up 217% year-on-year, and there were 1 trillion activations, up 33% year-on-year

Many of these customers have central business and customer data that exists outside of Salesforce that’s trapped in thousands of apps and silos. It’s disconnected. That’s why we’re seeing this incredible momentum with our Data Cloud, our fastest-growing organic, and our next billion-dollar cloud. It’s the first step to becoming an AI enterprise. Data Cloud gives every company a single source of truth and you can securely power AI insights and actions across the entire Customer 360.

Now let me tell you why I’m excited about Data Cloud and why it’s transforming our customers and how it’s preparing them for this next generation of artificial intelligence. Data Cloud was included in 25% of our $1 million-plus deals in the quarter. We added more than 1,000 Data Cloud customers for the second quarter in a row. 8 trillion records were ingested in the Data Cloud in the quarter, up 42% year-over-year, and we processed 2 quadrillion records. That’s a 217% increase compared to last year. Over 1 trillion activations drove customer engagement, which is a 33% increase year-over-year. This incredible growth of data in our system and the level of transactions that we’re able to deliver not just in the core system but especially in data cloud is preparing our customers for this next generation of AI.

Salesforce’s predictive AI, Einstein, is generating hundreds of billions of predictions daily; Salesforce is working with thousands of customers in generative AI use cases through the launch in 2024 Q1 of Einstein Copilot, Prompt Builder, and Einstein Studio; Salesforce has closed hundreds of Einstein Copilot deals since the product’s general availability (GA)

Einstein is generating hundreds of billions of predictions per day, trillions per week. Now we’re working with thousands of customers to power generative AI use cases with our Einstein Copilot, our Prompt Builder, our Einstein Studio, all of which went live in the first quarter, and we’ve closed hundreds of Copilot deals since this incredible technology has gone GA. And in just the last few months, we’re seeing Einstein Copilot develop higher levels of capability. We are absolutely delighted and could not be more excited about the success that we’re seeing with our customers with this great new capability.

Luxury fashion company Saks is using Salesforce’s Einstein 1 Platform in Data Cloud to create AI-powered personal experiences for customers

Saks, a leader in the luxury fashion market, part of Hudson’s Bay, went all-in on Salesforce in the quarter. CEO Marc Metrick is using AI to create more personal experiences for every customer touch point across their company. And with our Einstein 1 Platform in Data Cloud, Saks can unify and activate all its customer data to power trusted AI.

Salesforce is helping FedEx generate savings and accelerate top-line partly with the help of its AI solutions

The Salesforce data and app and AI capabilities generate expense savings. This is the core efficiency while growing and accelerating top line revenue. This is the effectiveness that we’re delivering for FedEx. This efficiency includes next best action for sellers, automated lead nurturing, Slack for workflow management, opportunity scoring, a virtual assistant, AI on unstructured data for delivering content to sales and customer service. And when we think about effectiveness, we see our Journey Builder delivering hyper personalization, integrating customer experiences across service, sales, marketing, the ability to tailor and deliver customer experiences based on a Customer 360 view. When we look at these incredible next generation of capability we’ve delivered at FedEx, gone now are these days of static business rules that leave customers dissatisfied, asking, “Do they not know that I’m a valued customer of FedEx?” Now FedEx has not only the power of the Customer 360 but the power of AI to unlock so much more commercial potential by conducting an orchestra of commercial functions that never played well together before.

Air India is using Data Cloud and Einstein across 550,000 service cases each month to improve its customer experience and deliver more personalised customer service

And with Data Cloud, Air India is unifying data across loyalty, reservations, flight systems and data warehouses. They have a single source of truth to handle more than 550,000 service cases each month. And now with Einstein, we’re automatically classifying and summarizing cases and sending that to the right agent who’d recommend the next steps and upgrading in high-value passenger experiences. Even when things happen like a flight delay, our system is able to immediately intervene and provide the right capability to the right customer at the right time. All of that frees up agents to deliver more personal service and create more personal relationships, a more profitable, a more productive, a more efficient Air India, a company that’s using AI to completely transform their capability.

Salesforce’s management is seeing good demand, driven by customers recognising the value of transforming their front-office operations with AI, but buying behaviour among customers is measured (similar to the past 2 years) with the exception of 2023 Q4

We’re seeing good demand as AI technology rapidly evolves and customers recognize the value of transforming into AI enterprises. CEOs and CIOs are excited about the opportunity with data and AI and how it can impact their front-office operations…

…We continue to see the measured buying behavior similar to what we experienced over the past 2 years and with the exception of Q4 where we saw stronger bookings. The momentum we saw in Q4 moderated in Q1 and we saw elongated deal cycles, deal compression and high levels of budget scrutiny.

Siemens used Einstein 1 Commerce to build and launch its AI-powered digital marketplace, named Accelerator Marketplace, in just 6 months

Siemens lacked a centralized destination for customers to easily choose the right products and buy on demand. To simplify the buying experience for customers, Siemens worked with Salesforce to develop and launch its Accelerator Marketplace, an AI-powered digital marketplace built on Einstein 1 Commerce, providing AI-generated product pages, smart recommendations and self-service ordering. And they did it all in just 6 months.

Salesforce is using AI internally with great results; Salesforce has integrated Einstein into Slack and Einstein has already answered 370,000 employee queries in a single quarter; Salesforce’s developers have saved 20,000 hours of coding through the use of AI tools

AI is not just for our customers. As part of our own transformation, we continue to adopt AI inside Salesforce. Under the leadership of our Chief People Officer Nathalie Scardino and our Chief Information Officer Juan Perez, we’ve integrated Einstein right into Slack, helping our employees schedule, plan and summarize meetings and answer employee questions. Einstein has already answered nearly 370,000 employee queries in a single quarter. In our engineering organization, our developers now save more than 20,000 hours of coding each month through the use of our AI tools.

Slack AI was launched in February and it provides recap, summaries and personalized search within Slack; >28 million Slack messages have been summarised by Salesforce’s customers since the launch of Slack AI

We also launched Slack AI in February, an amazing innovation that provides recap, summaries and personalized search right within Slack. I personally have been using it every day to get caught up on the conversations happening in every channel. And we’ve seen great traction with our customers with this product, and our customers have summarized over 28 million Slack messages since its launch in February.

The city of Los Angeles will use Salesforce’s Government Cloud and other solutions to integrate AI assistance into MyLA311, its system for residents to request city services

And in the public sector, the city of Los Angeles chose Salesforce to modernize how the city’s 4 million residents request city services using its MyLA311 system. The city will use government cloud and other Salesforce solutions to integrate AI assistance into MyLA311 and modernize its own constituent-facing services, giving residents more self-service options and improving service reliability and responsiveness.

Salesforce’s products for SMBs (small and medium businesses), Starter and Pro Suite, which both have AI built in, are building momentum; Salesforce added 2,300 new logos to the products in 2024 Q1

Our new offerings for small and medium businesses, Starter and Pro Suite, which are ready-to-use, simplified solutions, with AI built in, are building momentum. In Q1, we added another 2,300 new logos to these products. Since Starter’s launch last year, we’ve seen customers upgrade to our recently launched Pro Suite and even to our Enterprise and Unlimited editions.

Studies have shown that 75% of the value of generative AI use cases is in the front office of companies; Salesforce is the leader in front-office software, so management thinks this is why – with Data Cloud at the heart – the company is in a good position for growth going forward

We all saw the report from McKinsey, 75% of the value of Gen AI use cases is in the front office. And everybody knows Salesforce is the leader in front-office software. That’s our fundamental premise for our growth going forward. We’re already managing 250 petabytes of data and metadata that’s going to be used to generate this incredible level of intelligence and artificial intelligence capability to deliver for our customers a level of productivity and profitability they’ve just never been able to see before. And at the heart of that is going to be our Data Cloud. 

Salesforce’s management is focused on 2 things at the company: The ongoing financial transformation at Salesforce, and the use of AI

Look, we really are focused on two things in our company. One is this incredible financial transformation that we’ve all gone through with you in the last year. The second one is this incredible transformation to artificial intelligence, which is going to be based on data. 

Salesforce’s management thinks that the relative weakness seen in the software world currently is because of pull-forward in demand from COVID, and not because of crowding out by AI; management thinks AI is a growth driver for software companies

[Question] When we think about this measured buying environment, is there any sort of crowding effect around AI that’s impacting software in your view, meaning when you think about all these companies starting to gear up for this next platform shift, was it just the uncertainty of what they’re going to spend on over the next 6 to 12 months, holding them back perhaps on what their normal sort of pace of spending might be with you all or other enterprise software companies?

[Answer] As we entered the post-pandemic reality, we saw companies who had acquired so much software in that time looked to actually rationalize it, ingest it, integrate it, install it, update it. I mean it’s just a massive amount of software that was put in. And so every enterprise software company kind of has adjusted during end of this post-pandemic environment. So when you look at all of these companies, especially as you saw them report in the last 30 days, they’re all basically saying that same thing in different ways. When you take AI, that has to be our growth driver for future capabilities for these companies. 

Salesforce’s management sees the consumer AI world and the enterprise AI world as having very different needs for the kind of data they use for AI implementations, even though the model architectures are very similar; enterprise AI requires internal datasets from companies

It’s been pretty magical to use OpenAI over the last year, especially in the last release, when I’m really talking to it. And when I think about the incredible engineering effort that OpenAI has done, it’s pretty awesome. They’ve built a great UI. I love talking to the software. They have really strong algorithms or what we call Models, especially their new one, which is their 4o Model. And then they stole data from lots of companies like Time, Dow Jones, New York Times, Reddit. Now they’re all making good, doing agreements with all of us, saying, “We’re sorry,” and paying for it. And they took that data, they normalized it, they delivered a comprehensive data set that they train their model on…

…And then we’ve seen a lot of fast followers with the models. It could be open source models like Llama 3. It could be some proprietary models like Gemini from Google and others. Now there’s thousands and thousands of these models. And if you look on Hugging Face, everybody is a fast follower. And 6 months later, everybody is where everybody else was 6 months ago. And the data, well, a lot of these companies are all thinking they can rip off all this data, too, and they’re all having to pay that price. Okay, that’s the consumer world.

The enterprise world is a little different, right? We have great user interfaces, great apps, all kinds of great technology that our users are using, the millions and millions of users. Then we have the same models, in many cases, or maybe we’ve written some of our own models with our engineers. But then the third piece is the data. And that data is a little bit different. Because in the enterprise, how do you put together these large, fully normalized data sets to deliver this incredible capability, and that is where the magic is going to be. Because for all companies, including ours and others, who want to deploy generative AI internally, it’s not going to be Times Magazine that’s going to give you the intelligence, it’s going to be our customer data and your transaction history and how your company operates in your workflow and your metadata. And that idea that we can deliver another level of productivity for companies using that architecture is absolutely in front of us. But that idea that we have to do it with the right architecture, that also is in front of us. And I think that while we can say it’s a different kind of architecture, it’s still the same idea that we need a great UI, we need models, but we’re going to need very highly normalized and federated data. And that data needs to be stored somewhere, and it needs to come from somewhere. And that is going to be something that’s going to continue in perpetuity over time as these models and UIs are quite fungible. And we’ll be using different models and different UIs over the years, but we’ll be using the same deep data sources. And I think that is why, when I look at what Salesforce is doing, this is going to be critical for our customers.

Salesforce’s management has seen many instances where software vendors promise customers they can deliver AI magic, only for the customers to come up empty-handed because (1) the vendors did not put in the work – and are unable – to make the customers’ data AI-ready, and (2) there’s no proper UI that’s commonly accessed within the customer

Don’t think that there aren’t a lot of people walking into these companies saying, “Hey, you can do this. You can do that. You can do these other things”. We’ve seen a lot of that in the last 6 to 12 months, and then it turns out that you can’t. “Hey, I can make this happen. I can make that happen. I can pull a rabbit out of the hat in the enterprise for you by doing this, that and the other thing,” and then it doesn’t actually happen. And then what it turns out is you got to do a lot of the hard work to make this AI happen, and that starts with building highly normalized, large-scale, federated, highly available data sources. And then building on top of that the kind of capabilities to deliver it to our customers. I think a common story is, “Hey, oh, yes, I am a provider of a data lake or a data capability. And just by going to that, I’m going to be able to provide all your AI.” But then it turns out that no one in the enterprise actually uses that product. There is no UI that’s commonly accessed. That’s why I’m so excited that Salesforce has Sales Cloud and Service Cloud and Tableau and Slack and all of our amazing products that have these huge numbers of users that use these products every single day in a trusted, scalable way and then connecting that into this new capability.

Veeva Systems (NYSE: VEEV)

Veeva’s management’s strategy with generative AI is to enable customers and partners to develop generative AI solutions that work well with Veeva’s applications; generative AI applications require access to data and Veeva provides the access through solutions such as Direct Data API; Direct Data API provides data access 100 times faster than traditional APIs; management is seeing customers being appreciative of Veeva’s efforts to allow generative AI applications to work well with its own applications; management thinks that the generative AI applications its customers and partners will develop will be very specific; Veeva’s work on Direct Data API started about 2 years ago

In these early days as GenAI matures, our strategy is to enable our customers and partners to develop GenAI solutions that work well with Veeva applications through our AI Partner Program and powerful Vault Platform capabilities like the Vault Direct Data API. GenAI applications need access to accurate, secure, and timely data from Vault and our data applications. Released in April, our Direct Data API provides data access up to 100 times faster than traditional APIs…

…In general, customers are appreciative of our strategy to enable a broad range of GenAI use cases and experimentation through their own resources and our partner network…

…In terms of the AI strategy, our strategy is to really enable customers and their partners to develop AI applications because they’re going to be very specific AI applications, GenAI applications for very specific use cases whether it’s field information, pre-call planning, next best action, what have you. They’re going to be very specific applications. That innovation has to come from everywhere. And one of the things it needs is clean data. All of these AI applications need clean, concurrent, fast data. So one of the things we did — started about 2 years ago actually is put in a new API on the Vault platform called the Direct Data API, and that was just released this April. 

Veeva’s management has no plans to develop or acquire generative AI solutions currently, but are open to the idea as they observe how the technology evolves; Veeva’s applications do use AI technology, but not specifically generative AI; customers really trust Veeva, so management wants to move carefully when it comes to Veeva developing generative AI applications

We don’t have plans to develop or acquire GenAI solutions today, but that may change in the coming years as we see how GenAI technology evolves, and we determine which use cases can provide consistent value for the industry. In the meantime, we will continue to add advanced automation to our applications. Some, like TMF Bot and RIM Bot, use AI technology, but generally not GenAI…

… We have that trust. We have to continue to earn that trust. So we don’t really get into things that are too speculative. We definitely don’t overpromise. The trust is the most valuable thing we have. So we’ll be really targeted when we get into an AI application if we do. It will be an area where, hey, that’s a use case that we’re pretty sure that can be solved by GenAI, and there’s not a great partner to do it. Okay. Then we might step in because we do have that trusted position.

Veeva’s management lowered the company’s FY2025 revenue guidance slightly (was previously $2.725 billion – $2.74 billion) because of macro challenges and crowding-out from companies wanting to reallocate resources to AI; management is seeing some deferment of spending on core systems because customers are busy investing in AI, but the deferment creates pent-up demand and it’s not spending that has stopped

For fiscal year 2025, we now expect total revenue between $2.700 and $2.710 billion. This is a roughly $30 million reduction compared to our prior guidance, mostly in the services area. As we have said, the macro environment remains challenging as the industry continues to navigate inflation, higher interest rates, global conflicts, political instability, and the Inflation Reduction Act. There is also some disruption in large enterprises as they work through their plans for AI…

…A little more than a year ago, AI really burst upon the scene with GenAI…

…That caused a lot of pressure in our larger enterprises, on the IT department, “Hey, what are we going to do about GenAI? What’s our strategy as a large pharmaceutical company, biotech about AI?” And that would land in the IT department of these companies. Now for the smaller — our smaller SMB customers, doesn’t land so much. They have other things to think about, other more pertinent, very stressful things. But in the large companies, with tens of thousands of people, they’re looking for these operational efficiencies that they could potentially get through AI and they have a budget to kind of get ahead of that game. So that — by the word disruption, I meant that threw a competing priority into our customers, hey, we had some existing plans. Now this AI, we have to plan for what we’re going to do on that. Where are we going to spend on innovation, on experimentation? Who’s going to do that? What budget would we use, that type of thing. So some of that would take an impact onto us, which is core systems. Now those core systems, when we get that type of impact, it will delay a project, but it won’t stop it because these core systems are things you need. You can delay them, but all that does is create somewhat of a pent-up demand.


Disclaimer: The Good Investors is the personal investing blog of two simple guys who are passionate about educating Singaporeans about stock market investing. By using this Site, you specifically agree that none of the information provided constitutes financial, investment, or other professional advice. It is only intended to provide education. Speak with a professional before making important decisions about your money, your professional life, or even your personal life. I have a vested interest in Adobe, DocuSign, MongoDB, Okta, Salesforce, and Veeva Systems. Holdings are subject to change at any time.