2023-05-15 05:43:14 ET
Summary
- Consumer-facing AI is the craze, but people may not realize the costs associated with it and the consequent compression in margins.
- Industrial AI is already ingrained in large companies, which have not been as vocal about it as the consumer AI app companies.
- Beyond margin benefits at the computation level, industrial AI could deliver meaningful margin benefits because additional tasks come at limited additional cost.
- We think hyperscalers will soon need to evaluate the margin impact of launching consumer-facing AI solutions just to compete for consumer mindshare.
The launch of ChatGPT has led to fears of the world being taken over by AI.
The fury with which ChatGPT's competitors have been launched would make OpenAI feel like the sorcerer's apprentice: they seem to have awakened a power they cannot control!
Hyperbole aside, at HWI we have been working with machine intelligence since the start and feel compelled to demystify the hype around AI.
Understanding what AI is
The one question worth asking is: why did everyone wake up to AI only after the advent of ChatGPT? Were all the big tech companies asleep at the wheel?
AI, or artificial intelligence, refers to the programmatic ability to suggest, predict, or make decisions based on trends in data.
The launch of ChatGPT has brought AI to the masses in the form of a prompt window that can auto-generate human-like responses to questions. As a chatbot, ChatGPT's predecessor was ELIZA, developed in the 1960s by MIT professor Joseph Weizenbaum.
ELIZA brought to the fore the shallow nature of communication between humans and machines.
Named for the heroine of "My Fair Lady," ELIZA was perhaps the first instance of what today is known as a chatterbot program. Specifically, the ELIZA program simulated a conversation between a patient and a psychotherapist by using a person's responses to shape the computer's replies. Weizenbaum was shocked to discover that many users were taking his program seriously and were opening their hearts to it. The experience prompted him to think philosophically about the implications of artificial intelligence, and, later, to become a critic of it.
"The dependence on computers is merely the most recent -- and the most extreme -- example of how man relies on technology in order to escape the burden of acting as an independent agent," Weizenbaum told the journal. "It helps him avoid the task of giving meaning to his life, of deciding and pursuing what is truly valuable."
Source: MIT
ELIZA was quite basic in its approach: users were encouraged to talk to it, their input sentences were decomposed to look for keywords, and ELIZA responded according to a set of rules. Today's chatbots have far more computing power and memory at their disposal. ChatGPT, for one, leverages a large language model, which makes its responses of near-human quality, albeit somewhat error-prone.
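ELIZA's decompose-and-match approach can be sketched in a few lines of Python. The rules below are invented for illustration; they are not Weizenbaum's original DOCTOR script:

```python
import re

# A tiny ELIZA-style rule set: each entry pairs a keyword pattern with a
# response template. These rules are invented for illustration; they are
# not Weizenbaum's original script.
RULES = [
    (re.compile(r"\bI need (.+)", re.I), "Why do you need {0}?"),
    (re.compile(r"\bI am (.+)", re.I), "How long have you been {0}?"),
    (re.compile(r"\bmy (\w+)", re.I), "Tell me more about your {0}."),
]
DEFAULT_REPLY = "Please go on."

def eliza_reply(user_input: str) -> str:
    """Decompose the input, look for a keyword, and apply its rule."""
    for pattern, template in RULES:
        match = pattern.search(user_input)
        if match:
            return template.format(*match.groups())
    return DEFAULT_REPLY  # no keyword found: fall back to a stock phrase

print(eliza_reply("I am feeling lonely"))  # -> How long have you been feeling lonely?
print(eliza_reply("The weather is nice"))  # -> Please go on.
```

Everything beyond this pattern matching (context, memory, learned representations) is what separates ELIZA from a modern LLM-based chatbot.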
Coupled with the rise in loneliness, ChatGPT has been an instant hit.
ChatGPT, however, is a conversational AI application, i.e., a consumer application of AI.
Keep in mind that AI for business is different than AI for consumers, given their need for more accurate results, trusted data and governance tools.
Source: Q1 2023 IBM Earnings Call on Seeking Alpha
Going back to 2018, when Arvind Krishna was the SVP of Hybrid Cloud and Director of IBM Research, he made some interesting observations:
…AI has done great in speech to text and recognizing words that are spoken, and recognizing here is a cat, or in recognizing there might be a lesion on that particular skin tumor, so that means it may be melanoma or it's not…
But when you look under all this, this is all what I would use the term narrow AI, and that's not, I think, the term widely used out there. And narrow AI means AI that works for one task in one domain. Now that means, if you think through the millions of tasks humans do, you'll have to go train AI for every one of those millions. So, you have to go train for all of those. Getting AI to be sort of intelligent like humans is what's called general AI, and we believe we are decades away from that point…
..last year somebody did a survey at a pretty deeply technical conference on AI, and the audience put general AI at around 2075 rather than 2050.
... What’s happening with my data? Are you going to take that model and give it to everybody? Or are you going to tell me who it is going to? Or where my data itself might go, not just the model?
Source: IBM Presentation at the Credit Suisse Conference
In some sense, ChatGPT and most of the consumer AI tools out there are narrow AI tools. Thus, application AI (including narrow AI) and general AI are two different businesses, which entail separate economics.
In a previous article, we attempted to estimate the economics of a popular narrow AI play: ChatGPT.
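As a back-of-the-envelope illustration of why consumer AI economics are tricky, consider a toy model in which inference cost scales with all usage while revenue comes only from subscribers. Every figure below is a hypothetical assumption for illustration, not an estimate from that article:

```python
# Toy consumer-AI unit economics. All inputs are hypothetical
# assumptions for illustration, not anyone's actual figures.
queries_per_day = 25_000_000   # assumed total daily queries, free and paid
cost_per_query = 0.004         # assumed compute (inference) cost per query, USD
paying_users = 500_000         # assumed number of subscribers
price_per_month = 20.0         # assumed monthly subscription price, USD

monthly_compute_cost = queries_per_day * cost_per_query * 30
monthly_revenue = paying_users * price_per_month
gross_margin = (monthly_revenue - monthly_compute_cost) / monthly_revenue

print(f"Monthly compute cost: ${monthly_compute_cost:,.0f}")  # ~$3,000,000
print(f"Monthly revenue:      ${monthly_revenue:,.0f}")       # $10,000,000
print(f"Gross margin:         {gross_margin:.0%}")            # ~70%
```

Because compute cost grows with total queries (free users included) while revenue grows only with paying users, margins compress as free usage scales.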
In fact, NICE (NICE) came out with another set of earnings, with commentary that throws some light on the ground realities of LLM/generative AI adoption in business.
The first camp is those enterprises that are extremely protective of their brand and unwilling to even consider deploying generative AI to service their customers because of that concern. And the other camp is those that see the potential but experience significant disappointment following an initial use of out of the box generative AI.
Source: Q1 2023 NICE Earnings Call on Seeking Alpha
Furthermore, the company's management acknowledged that AI is predominantly a support mechanism and that the human touch is not going away.
I've been in this industry for almost 25 years now and I remember 25 years ago when I predicted that agents would disappear, and here we are today with a much higher number than a decade or 15 years ago. The reason for that is that eventually the number of transactions or interactions is actually going to continue to grow exponentially. It grew 100-fold in the last 15 years, the overall number of interactions over a variety of channels. And we see more of the simple transactions going to unattended channels and unattended forms, but the more complex interactions still need to be manned
Source: Q1 2023 NICE Earnings Call on Seeking Alpha
The change in tone, from AI and LLMs as the next world-conquering technology to AI as an assistant for sifting through large volumes of data, is not necessarily new.
Turning to the hyperscalers, we think most of them are trying to ride the AI wave through a variety of narratives, ranging from deflecting attention from the slowdown in their core business to justifying restructuring.
Amazon (AMZN) Web Services, or AWS
Amazon seems to be playing catch-up, big time. Instead of worrying about dwindling FCF, Andy Jassy seems to have been captivated by the lack of a B2C generative AI LLM.
...every single one of our businesses inside Amazon are building on top of large language models to reinvent our customer experiences, and you'll see it in every single one of our businesses, stores, advertising, devices, entertain.
...And to do that, it's difficult. It's across a lot of domains and it's a very broad surface area...
...And we've had a large language model underneath it, but we're building one that's much larger and much more generalized and capable. And I think that's going to really rapidly accelerate our vision of becoming the world's best personal assistant.
Our recent announcement on large language models and generative AI and the chips and managed services associated with them is another recent example. And in my opinion, few folks appreciate how much new cloud business will happen over the next several years from the pending deluge of machine learning that's coming.
Source: 1Q 2023 Amazon Earnings call on Seeking Alpha
We are not sure why Amazon now needs to rebuild its platforms on LLMs. Amazon has been using AI in its product recommendations and advertising business to great effect.
On the advertising side, we're continuing to buck wider advertising trends and deliver robust growth.
Source: 1Q 2023 Amazon Earnings call on Seeking Alpha
In the context of the broader macro environment, the AWS warning on customers wanting to optimize cloud spending may not be very negative.
Furthermore, the interdependence between the consumer business and AWS could also be muddying the waters for Amazon’s cloud business’ growth. (Amazon has been accused of using customer ideas to incubate new businesses for itself.) We continue to believe that the AWS business should be separated out from the consumer business to allow it to grow.
While AWS continues to be the largest player, a lack of focus could make its position vulnerable.
Microsoft (MSFT) Azure
Microsoft was an early backer of OpenAI and has benefitted a lot from the ChatGPT wave and the subsequent integration with Bing.
In March 2023, Scott Guthrie - Executive Vice President, Cloud + AI at Microsoft said:
Yes. I mean, AI learns with data. And so, it's -- one of the things you need to think about with AI is, as you provide data to that model and it learns, who owns that model, who owns that data? And that's why the trust is so important, I think in this AI age. And again, our promise is your data is your data. It's not our data. You get to monetize it. You get to control it. And the foundation models won't learn from your data. No. Your instance learns, but not the models that are shared by others. I think that promise and frankly the trust that we've earned over the years at Microsoft and you earn trust in drips and you can lose it in buckets.
Source: Morgan Stanley Technology, Media & Telecom Conference transcript on Seeking Alpha
In April 2023, Elon Musk, the new owner of Twitter, expressed his desire to sue Microsoft for having illegally used Twitter data to train its AI models.
The irony of the above situation aside, Microsoft is decidedly using its association with OpenAI to signal the company's AI readiness and ride the AI wave, which in turn creates a flywheel for Azure.
I mean in general the typical pattern we've seen going back many years is you either migrate a workload to the cloud or you build a new workload in the cloud. You then optimize it and then you reinvest the savings and you rinse and repeat with the next workload or the next use case.
Source: Morgan Stanley Technology, Media & Telecom Conference transcript on Seeking Alpha
Since AI is computationally intensive, it is likely to weigh on gross margins unless offset by higher pricing. Thus, in some sense, people understand that AI products will be priced higher: AI's perceived value proposition is being used to justify a premium.
Microsoft Cloud gross margin percentage increased roughly 2 points year-over-year to 72%, a point ahead of expectations driven by cloud engineering efficiencies. Excluding the impact of the change in accounting estimate for useful lives, Microsoft Cloud gross margin percentage decreased slightly, driven by lower Azure margin.
Source: Q3 2023 Microsoft Earnings call on Seeking Alpha
We think Microsoft is using AI to justify the increase in COGS (which, in an inflationary environment, is not unexpected), but the focus now will be on growing Azure volumes further, even at the cost of margins.
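The margin mechanics discussed above can be made concrete with a toy calculation. The figures below are hypothetical, chosen only to mirror a ~72% starting gross margin; they are not Microsoft's actual cost breakdown:

```python
# How incremental AI compute in COGS moves cloud gross margin. Figures
# are hypothetical, chosen only to mirror a ~72% starting gross margin;
# this is not Microsoft's actual cost breakdown.
revenue = 100.0     # assumed cloud revenue (index units)
cogs = 28.0         # assumed cost of revenue -> 72% gross margin
ai_compute = 4.0    # assumed incremental AI infrastructure cost

base_margin = (revenue - cogs) / revenue
new_margin = (revenue - cogs - ai_compute) / revenue

# Revenue (i.e., pricing) needed to restore the original margin on the
# higher cost base: revenue = cost / (1 - margin)
required_revenue = (cogs + ai_compute) / (1 - base_margin)
price_uplift = required_revenue / revenue - 1

print(f"Gross margin before AI costs:   {base_margin:.0%}")    # 72%
print(f"Gross margin after AI costs:    {new_margin:.0%}")     # 68%
print(f"Price uplift to restore margin: {price_uplift:.1%}")   # 14.3%
```

The point: even a modest AI cost layer shaves several margin points unless it is passed through as a double-digit price increase or diluted by volume growth.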
Google (GOOG) (GOOGL) Cloud, or GCP
In Q1 2023, GCP turned profitable. While Google has also been employing AI across its search products for a while, the company has also been impacted by the ChatGPT wave.
In March, we introduced our experimental conversational AI service called Bard.
Source: Q1 2023 Google Earnings call on Seeking Alpha
Additionally, GCP has not been untouched by the theme of customers optimizing cloud spending.
So, in terms of the cloud question, the point we are trying to underscore is there is uncertainty in the economic environment. And so we saw some headwind from slower growth of consumption with customers really looking to optimize their costs given that macro climate.
Source: Q1 2023 Google Earnings call on Seeking Alpha
We see a stark contrast with Microsoft: Azure saw margin weakness despite 25% growth, while Google Cloud has turned profitable. Azure's AI-related fixed costs are being added on top of its existing cost base, whereas for Google the AI cost is being spread across a growing cloud business.
We think Google Cloud is structurally better positioned for profitable growth and for contributing to the company's cash flows. In the same breath, we also remain worried about Google spending too much effort on its consumer AI offering.
IBM (IBM)
The focus for IBM remains on hybrid cloud:
- Red Hat consulting ARR reached $2 billion, growing at double digits
- Red Hat OpenShift ARR crossed $1 billion
- Overall Red Hat revenue quadrupled (from the time of the acquisition)
IBM, possibly a leader in AI back in the day, has also been caught up in the consumer AI craze.
While the company does distinguish between consumer and industrial AI, it has not been spared the need to advertise its achievements.
A lot of the excitement out there is on AI for consumers, and I think it's going to be a home run for those companies that are going down that path. But there's a lot of industries where people still worry about their data. They are careful about what's used to train the data.
So yes, the first training is a lot more expensive, maybe 4, 5 times. But then if you can do 100 tasks for almost no additional cost, you can see that the overall is a 4 to 5x benefit in terms of time to value and productivity.
We did a preview of AI for ITOps around a product called Ansible. We're calling it Project Wisdom is our internal code name, but it shows that you can get 60% to 80% of the code that our IT operations person might write, gets written by the AI. Well, that's a 3x to 4x productivity for that individual.
J.B. Hunt Transport use our Turbonomic AI-powered resourcing and automation engine to optimize their cloud environment. This helped reduce infrastructure refresh costs by 75%. This is a great example of the work we're doing in IT operations. In other areas such as customer care, we are automating hundreds of thousands of call center responses with AI with more than 90% accuracy and greater levels of customer satisfaction.
Source: Q1 2023 IBM Earnings call on Seeking Alpha
Interestingly, the AI use cases pertained to efficiency gains in coding and call center operations, which in some sense resonates with the commentary above: higher-value decision making will still need human intervention, with AI coming in handy as a reasonably effective assistant.
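IBM's train-once, reuse-everywhere arithmetic can be made concrete. The unit costs below are hypothetical; only the 4-5x first-training multiplier and the 100-task reuse come from the quote above:

```python
# Foundation-model economics per IBM's framing: the first training run is
# "maybe 4, 5 times" the cost of a single-task model, but each additional
# task is nearly free. Unit costs are hypothetical assumptions.
task_model_cost = 1.0      # assumed cost to train one narrow, single-task model
foundation_multiplier = 5  # first training ~5x more expensive (from the quote)
marginal_task_cost = 0.01  # assumed near-zero cost per additional task
tasks = 100                # "100 tasks for almost no additional cost"

narrow_total = tasks * task_model_cost
foundation_total = foundation_multiplier * task_model_cost + tasks * marginal_task_cost

print(f"100 separate narrow models: {narrow_total:.0f} cost units")
print(f"One foundation model:       {foundation_total:.0f} cost units")
print(f"Cost advantage:             {narrow_total / foundation_total:.1f}x")
```

Taken literally, the cost arithmetic implies an even larger advantage than the 4x-5x time-to-value benefit IBM cites, which is why the near-zero marginal task cost is the assumption that matters.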
IBM has used this entire AI phenomenon reasonably well to also justify another aspect of its business:
AI has gone through sort of 4 generations very quickly from expert systems in the '80s and '90s, to machine learning, through about 2010, deep learning for the last decade and now this era of large language models and generative AI.
Our employees are expecting higher wages. As we hire new people, they get hired in at higher wages than their similar colleagues were a year ago. We have to get a return on that back from our clients.
And that's how we should think about it, that we are willing to sort of feather it in so that our clients don't get a sticker shock right away. But over time, that is, I think, the nature of inflation, if I remember my college economics lessons.
Source: Q1 2023 IBM Earnings call on Seeking Alpha
The company had laid the basis right there: costs are creeping up, but we can't charge the customer right away. But then, as a technology company, the benefits of AI must be reaped. Ironically, just after Labor Day, the following news came out: IBM pauses some hiring amid plans to replace 7,800 jobs with AI. The devil is in the details: this 7,800 number is an expectation over the next five years.
Five years ago, Arvind Krishna said we were decades away from AI becoming pertinent enough to replace humans. Today, AI is automating low-end, repetitive jobs. So how does this 7,800 figure add up? Simple: AI is the perfect excuse to make the organization lean while claiming efficiency gains. Hence, in our opinion, it is the perfect time to announce restructuring and blame it on what the world loves today: AI.
Oracle (ORCL)
We have been bullish on Oracle since 2019. It is interesting how far behind Wall Street has been.
The level of surprise in the tone of this question was matched only by the excitement in the answer:
[Question] With the slowdown we're seeing across so many IaaS PaaS vendors over the last couple of quarters and especially this quarter, why has OCI Gen 2 held up so well?
[Answer] The other guys can't do that. They can build clusters, but they actually literally are physically building a new cluster. They're building new hardware. Our existing hardware, standard network, allows us to group these things together dynamically, these GPUs together dynamically, to attack AI problems. No one else can do that. So we have a lot of business, a lot of new AI companies coming to Oracle because we're the only ones who can run their workloads.
Source: Q3 2023 Oracle Earnings call on Seeking Alpha
We continue to like the technological differentiation that Oracle's cloud infrastructure brings to the table and believe that it lends itself well to AI use cases. The areas where we have concerns are:
- Focus on healthcare with Cerner: Yes, Larry Ellison wants to redefine healthcare and the mix of eccentricity and genius may achieve it. But at what cost? Does this impact the growth of the IaaS business? We hope not.
- Actual cloud growth: The technology is fine, but when do Oracle’s numbers begin to surprise us, benefitting the installed base of the company? Oracle has a large share of the database market, but when does the autonomous database start taking share more rapidly?
There are green shoots and we continue to like the story, but if execution does not match up to the technology narrative, we may want to rethink our bullish stance here.
Conclusion
AI has opened up a world of possibilities by capturing the imagination of the masses. It comes at a time when interest rates are at multi-year highs and are expected to fall, which would make the cost of capital for running AI less of a constraint. Until then, however, we think it is worth differentiating between the economics of consumer and industrial AI.
- Companies wanting to ride the AI wave purely on the consumer side (including those bolting consumer AI products onto their platforms) are ignoring the cost of inference workloads and are likely to see a margin impact.
- Companies with an industrial AI focus incorporated AI into their business models long ago, which should provide margin leverage as scale grows.
- The cloud business may see a bit of a pause due to the general macro sentiment, with hybrid becoming the preferred mode (remember finance 101: renting an asset over the long term tends to be more expensive than owning it).
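The rent-versus-own point can be illustrated with a toy cumulative-cost comparison (all figures hypothetical): renting avoids the upfront capex, but past a crossover year the cumulative rental bill exceeds the cost of ownership.

```python
# Rent-vs-own over time: renting (public cloud) avoids the upfront capex,
# but the cumulative bill eventually crosses the cost of ownership
# (on-prem / hybrid). All figures are hypothetical.
own_capex = 100.0         # assumed upfront hardware cost
own_opex_per_year = 10.0  # assumed annual power, space, and maintenance
rent_per_year = 40.0      # assumed annual cloud rental for equivalent capacity

def cumulative_costs(year: int) -> tuple[float, float]:
    """Cumulative (own, rent) cost after `year` years."""
    own = own_capex + own_opex_per_year * year
    rent = rent_per_year * year
    return own, rent

for year in range(1, 7):
    own, rent = cumulative_costs(year)
    cheaper = "own" if own < rent else "rent"
    print(f"Year {year}: own={own:.0f}, rent={rent:.0f} -> {cheaper} is cheaper so far")
```

With these assumptions, renting is cheaper through year 3 and owning wins from year 4 onward; the general point is that the claim holds only over a long enough horizon.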
As always, we would love to understand how readers see this market beyond the hype around AI.
Previous articles in this series:
- Masters Of Cloud, Part 1: The Players
- Masters Of Cloud, Part 2: The Evolving Landscape
- Masters Of Cloud, Part 3: The Dark Horses
- Masters Of Cloud, Part 4: The Post-COVID World
- Masters Of Cloud: Winter Is Coming
For further details see:
Masters Of Cloud: The Hype And Reality Of AI