All right. Good morning, everyone. Welcome to the next keynote fireside chat here at the 2023 Morgan Stanley TMT Conference. We're thrilled to have Ruth Porat with us, the CFO of Alphabet. Ruth, thank you so much for joining us again.
It's great to be here. Great to see everyone.
Let me do the safe harbor and all the important disclosures. Please note that all important disclosures, including personal holdings disclosures and Morgan Stanley disclosures, appear on the Morgan Stanley public website at www.morganstanley.com/researchdisclosures. They're also available at the registration desk.
Some of the statements that Ms. Porat may make today could be considered forward-looking. These statements involve a number of risks and uncertainties that could cause actual results to differ materially. Any forward-looking statements that Ms. Porat makes are based on assumptions as of today, and Alphabet undertakes no obligation to update them. Please refer to Alphabet's Form 10-K for a discussion of the risk factors that may impact actual results.
I will forgo your impressive bio and background. If anybody wants to know who Ruth is, you can Google her. It's a very extensive background, including being a Morgan Stanley alum. So it's great to see you.
Great. Thanks.
There is a lot to talk about from a macro and micro perspective. Maybe let's start with Alphabet's top priorities for 2023. And just to tackle it right away: what are the top priorities, and how does AI fit into them?
So the top priorities are, first, to invest for long-term growth. And then also to make sure we durably reengineer our cost structure. In terms of investing for long-term growth, the priorities are the same that we've talked about at this conference for quite some time. It's really behind Search, ads, YouTube, Cloud and Pixel.
And to your question, AI really underlies all of that and the investments we're making in AI research and delivery. In terms of durably reengineering our cost structure, we do have a number of work streams in flight to go right at that. And the main point is, we are looking to slow the rate of expense growth such that revenue growth exceeds expense growth.
And then more specifically to your question on AI, it is obviously an extremely exciting time for AI. And as Sundar indicated, we're on the verge of launching a number of LLM-based products and services, you can think of it as a portfolio. And I'm not going to get ahead of those sitting here today, but we do very much look forward to continuing to unveil what we're doing there.
And I think the main point, as I think about it, is all that we're doing with LLMs in these launches builds on many years of research and product work that we've done. We have amazing engineers, in particular, in AI with both Google Research and DeepMind. And you can see it in the product rollouts that we've had.
So reminiscing back to when I first joined Google in 2015, that's when Google first launched RankBrain. It was the first opportunity where we were really using AI to improve context around queries. And then you think of BERT and MUM and what we've done with LLMs to continue to advance what is Search.
And going through what we're doing, the same is true with ads. We've used AI to really help advertisers achieve and maximize their objectives around ROI. We've used it in Cloud. And so we're building on a decade of extraordinary work and, as I said, look forward to sharing more in the coming weeks to months.
Okay. I want to go into the AI discussion a little more, but maybe just to level set from a macro perspective. Can you give us any updates on what you're seeing in macro ad demand? Any verticals or categories that are potentially stronger or weaker than others, and sort of the state of the advertiser discussions in '23?
So after what was clearly an extraordinary acceleration in digital activity going into COVID, coming out of COVID, what we saw and what we've talked about is that the macro environment got more challenging. And in the fourth quarter, as well as earlier in the year, we talked about some of the pullbacks that we had seen as a result of the more challenging macro environment. In Search, there was some pullback. We described it as a kind of broader pullback affecting YouTube and network. And basically, as we've talked about for many years, what you see with Google Search is what you see in the broader economy.
To your question about what is the conversation with advertisers, it's as it's always been. What about ROI, what can we do with ROI? And that, again, is where you start with Search as this extraordinary cost-efficient way to achieve their objectives, measurable, and that's such an important part, therefore, of the dialogue.
But then to your prior question layering on AI and things like Performance Max where we can help them really optimize across the portfolio and solutions to maximize ROI. So continuing to have a lot of those conversations.
Okay. Let's talk about Search a little bit, in terms of the AI use cases and the ways Search could change. Investors right now have sort of taken the stance of bucketing certain companies into winners and losers within the AI discussion. Other companies are pushing out AI tools within Search and throughout the overall ecosystem. To your point, we're still waiting to learn about Bard and more tools to come with the APIs. Maybe just remind us, give us some more examples of how you're already using AI in Search. And then what are some new use cases in Search that you can see coming from the next applications of AI?
Yes, I love it. So we have, as I said, been using AI in Search for many, many years. And our view is that we don't launch products just because there's a new capability or technology available. We have billions of people around the globe who trust us, rely on us to deliver accurate, timely information. And so user testing is a large part of what we've always talked about and continue to do, and we remain very excited about the application, the utility and the take-up when we do launch. To your question, I think probably one of the most exciting new applications that really leverages large language models is everything we're doing with multimodal search.
And to give you a sense of what that is, say that tonight anyone here wants to plan a great dinner somewhere in San Francisco, and you've got your favorite Italian dish that you want, lasagna. You can take a screenshot or photo, combine it with text in Google Search around "near me," and you'll get a full listing of where you can go for that great lasagna dish that you've seen and want here in San Francisco.
And so it's this combination of visual search and all elements of it, leveraging MUM, that is so exciting about multimodal search. Google Search is the only search engine now offering multimodal search, and it builds off of Google Lens, which now has 12 billion queries per month and growing. We're really excited about the capability of Google Lens, and it's just one example of how we're continuing to advance what is Search and look forward to continuing to do so.
I feel like the Search discussion around AI is sort of part of the zeitgeist right now, given some other companies' products. But Google Services has a lot of other potential ways to use AI. So maybe just -- I know there's no announcement today, unless you're interested in making one. But talk to us about how you think about other uses of AI across the full Google Services suite of tools and products.
Yes. Good try, but I'll stick with the second part of your question. So you're absolutely right. It is across the full set of Google Services. So for example, within YouTube, we've been using AI for quite some time for a host of different things. One is as it relates to, again, advertisers and how you maximize ROI. It goes back to PMax, Performance Max. It also goes to the recommendations engine for users. It also goes to the very important work that we're doing around trust and safety and content moderation, our ability to quickly identify content that violates our policies and take it down. And we're continuing to look at opportunities to apply AI in ways that help creators generate content in their storytelling.
In the Pixel family, we're similarly using AI in a host of different applications. One of my favorites is in the camera: the ability to use, for example, something called Magic Eraser, which hopefully many of you have seen and used, the ability to take a photo and, if you don't actually want some of the clutter in the background, it will help you with AI identify it, take it out and put in a better background. And the same is true overall with respect to what we're doing in ads. So yes, a lot of applications.
On the advertising point, this is another common discussion on AI: the incrementality of revenue. You said on the fourth quarter earnings call that using advances in AI to drive new and better experiences for users in Search, as well as to deliver better measurement, higher ROI and tools for more compelling creative content for advertisers, is one of your big opportunities. Are there any statistics or ways you can help us quantify early signs of the incremental revenue you can generate from all these AI tools?
Well, probably because you're quoting us back from the earnings call, it starts with a lot of the functionality that we bring. And so with respect to advertising, there are two parts to it. One, what are we doing for advertisers? And whether it's through Smart Bidding or it's through Performance Max, we're helping advertisers really maximize what is the opportunity.
The second part is what we're doing with users, for users, in support of users. When you think about what users turn to Google for, they're coming because they either want to learn something, or they're trying to do a task, or they're trying to transact something commercially. And so, again, AI has been invaluable as we understand context so that we can be as responsive as possible, which then translates into activity.
So to give you an example, say you want to go buy something, or we each want to go buy something. You may care more about price, I may care about availability. I may want availability near me. And one of you may care more about rankings and reviews. And so what we're able to do with AI is better understand the context of a query and better deliver. We're able to help advertisers, again, with placement, with creative, all of which translates into additional revenue, and we're continuing to leverage what we're doing with AI to address both sides.
You mentioned, as we were talking in the green room, some of the work that we had done around Google Shopping. It's a great example. What we've done through Google Shopping, and the whole mission a lot of you have heard us talk about, is to open up to the broadest array of merchants, because that's good for merchants, it's good for users, it's good for the overall ecosystem. And that's the type of work that we're looking at. And again, it comes back to some of this context on queries; it's useful.
And how do you think about Google Cloud and GCP sort of fitting into this AI opportunity as well? Is that going to be an incremental competitive advantage? Like how does AI drive workloads to GCP?
Well, at the core of your question, we firmly believe that AI is transformative for each one of us in our businesses: our ability to connect more closely to customers, our ability to extract operating efficiency in our businesses, our ability to address some of the risk management requirements that we need. And so our Google Cloud team is very excited about the way they are able to work with customers across each one of those, and then really leveraging our infrastructure, which, again, goes back to your AI question, to better deliver.
So our AI infrastructure enables customers really to leverage TPUs, GPUs to really run what they need. We bring data analytics that's critical to their ability, again, on each of those dimensions. It's our cybersecurity capabilities. And so, yes, it is incremental to everything that we've already spoken about. And at its core, they too are leveraging the AI capabilities across Google. That's GCP.
And then similarly, within Workspace we've used AI in our collaborative docs and tools and Gmail for quite some time and look forward to, again, continuing to add to that with some of the LLM work that we've been doing.
Looking forward to those announcements. All those announcements, those AI tools, do come with a cost. We've been doing a lot of work trying to figure out the cost; I think Wall Street generally has. We have some estimates that say the incremental cost of the AI models can be 2x to 3x more expensive than paid search, in some cases even higher than that when you're going through a lot of aggressive training of these large models. First, I'd love your reaction to that. And then secondly, if that is the case, and it is more expensive than paid search to run these models, how should we think about the long-term margin profile of Alphabet, the OpEx structure? Just walk us through that a little bit.
So you're absolutely right, running Search with LLMs generally is more compute-intensive and more expensive. But the important point is this is not new for Google, and we've been asked this question for many, many years. The cost of compute has continued to increase, one, because of just the scale and growth of the Internet. And two, as I've already said, we've been using AI for a decade now. We introduced LLMs 4 years ago. So this is not new for us. And what we've been doing, and hopefully it is self-evident from results, is we have a number of teams that are very focused on every element of efficiency around compute and driving costs down as much as possible.
So to be more specific, within the infrastructure area, you've heard us talk about TPUs and the benefit that you can have with TPUs. We have a team that's focused on software optimization, so you can move workloads as efficiently as possible. We're looking at overall model efficiency where there are benefits there as well.
And then the other is the obvious point that LLMs don't need to be used for every query. So just optimizing actually how you think about applications. And so yes, the premise is right. Yes, we are focused on it. We've got a host of teams that are focused on cost efficiency around compute because it's an important part as we think about sort of long-term performance, and they have been for years.
Exactly, much like with all the other AI tools, it's not new.
It's not new. It's been a decade here. So yes, it's been a lot of work.
Great. What about CapEx? You talked about total 2023 CapEx being flat on a reported basis. We think the real estate CapEx is probably going to be down quite a bit. So we're sort of backing into infrastructure CapEx, up 20% plus for the year. Just sort of talk us through the drivers of CapEx this year. And how should we think about the CapEx and the capital intensity and the spend needed to sort of push more AI to more consumers across more products?
So CapEx for 2023 will generally be in line with 2022. And you're right, that is because real estate CapEx is down meaningfully and CapEx around our technical infrastructure is up meaningfully. That's for both data centers and servers, and it's primarily in support of Cloud and everything that we're doing in AI. And as I said, in general, that comes to ballpark in line with last year.
Okay. On the fourth quarter call, you talked about efficiency and expenses and sort of discipline. And you said you're focused on delivering sustainable financial value, and that obviously means that expense growth cannot run ahead of revenue growth. That is more than I remember you saying previously on these topics. So maybe walk us through: what are the few areas you're focused on to drive this more durable expense discipline? Where have you made the most progress? And when should we expect to see the impact in the P&L?
Well, it's an important truism. So it seemed valuable to just reiterate. What I tried to underscore on the call is that we are focused on durably reengineering our cost base. And we do have a number of work streams that are in flight. They include everything from looking at how we can apply AI internally to our processes as well as all of the efficiencies in technical infrastructure that you just asked about. That's one bucket.
The second big bucket is around vendors and everything we're doing around procurement, and how we can be more efficient in each element there. And then the third major bucket we call where and how we work. So each one of those are work streams that have been launched. The reason I wanted to stress "durably" is that change of substance takes time. So the work streams are in place; we expect they'll contribute more in '24 and then beyond. And our view is that these are critical elements that we're continuing to work on to deliver durable results as we look at what the overall investment requirements are.
Okay. Very helpful. Let's talk about Hardware. Sundar has spoken about how Hardware is one of the key long-term priorities for Alphabet. Why does Hardware matter? And I guess the second one is, you talked a little bit about Pixel, but how can your own hardware ecosystem potentially further AI advancement?
So when Sundar talks about Hardware, he's focused in particular on the Pixel family: phone, tablet and wearables. And the view is that we're able to really highlight the opportunities, the future of compute, by bringing together the best of AI, hardware and software. As an example of that, I already gave Magic Eraser, but you also have things around Translate.
So our view is that we're able to highlight what this is for the Android ecosystem, which is a benefit to the Android ecosystem in addition to the platform investments that we're making. And so it's in that context that we view it as important. As we tried to be clear on the call, we're investing meaningfully against the opportunity. And that's why we said there's a real focus on how we can drive incremental efficiency, whether that's through work with our supply chain or some of the location opportunities for the teams. But that's both the overall opportunity and the expense implication.
Okay. Let's shift to YouTube and strategic priorities. Maybe just to level set for everyone, what are the one or two key areas of investment and prioritization on YouTube when it comes to the user offering, the advertiser offering and the overall ecosystem?
Well, I'd just say, as you're asking, next year we can do this as a YouTube Short. So overall, YouTube is an extraordinary asset, with just billions of engaged people globally. And whenever we talk about billions of engaged people globally, we start with the creators. The creator ecosystem for YouTube is really special, and it's really because of the opportunity, as we've grown with them, to express themselves in so many different formats and to monetize across the platform. So whether it's long-form video, Shorts, podcasts, live streams, music, you name it, you've got the breadth of opportunity on YouTube. And that creator ecosystem creates its virtuous circle with users and advertisers.
In terms of the opportunities, there are a number of them. One is Shorts. We're super excited about the momentum in Shorts, up to 50 billion daily views at the end of last year, relative to 30 billion back in the spring, and just continued, really extraordinary strength. And we're continuing to work to build that out.
The other is Connected TV. Living room continues to be the fastest-growing screen, and we're excited it's another place where AI is relevant both for user experience but also for advertisers.
On the subscription side, it's an incredible opportunity. We're now at 80 million subscribers for YouTube Music and Premium, more than 5 million for YouTube TV, and really pleased with how the team has grown that. And then longer term, we're looking at making YouTube more shoppable. So we're excited about YouTube.
AI goes across all those.
It does, it does. And has for a while.
Yes. YouTube Shorts, 50 billion views. Can you give us an update on the monetization of Shorts and where you are? And maybe just a philosophical question: if you think about the potential ad revenue per hour on Shorts versus other forms of longer-form video content on YouTube, do you see those eventually reaching parity? Or do you think there's going to be a difference in what an hour of Shorts monetizes at versus other forms of content on YouTube?
So a lot of questions in there. So YouTube Shorts, we have been investing to really continue to build out what we're offering with YouTube Shorts and it's, again, another area where AI can be helpful to creators as they are continuing to develop content. And we look forward to rolling out more there.
In terms of your core question on monetization, we talked about the monetization plan some time ago, launched it last month, and the feedback has been very positive, and this goes to my prior point, which is creators on the YouTube platform have the benefit of billions of users globally and then these multiple formats that they really can draw from, and we think that's a real asset. But monetization is early days. The feedback is positive. You're absolutely right, there's a gap to the traditional long-form, and we're working to close that gap, but it's still very, very early days.
Okay. One of the other recent YouTube announcements was reaching a deal to distribute the NFL Sunday Ticket. What excites you most about adding the Sunday Ticket to YouTube? And how should we think about the ROI math around that investment?
So the ROI math is that we expect to generate an attractive ROI over the life of the deal. And we're focused on a couple of things. One is the opportunity to continue to extend and grow subscribers on YouTube TV. The second is the opportunity to offer a stand-alone premium channel subscription for the NFL Sunday Ticket, distinct from the YouTube TV subscription. And the third, which I think is really interesting, is the opportunity to offer our top YouTube creators exclusive NFL content and see what they can do. And so again, we see a number of different paths.
Okay. Excited to see those. One more on AI, on generative AI specifically, because that does seem like another potential unlock for YouTube. I know there's this discussion of how quickly these tools will develop and scale and be rolled out. When you talk to the YouTube teams and the engineers, how far away are we from having scaled creator and advertiser generative AI tools that can be rolled out and used by the hundreds of millions or billions of people on YouTube?
Well, I want to start, and again I keep going back to this, but we are building on years of using AI, to your question, within YouTube as well. I've already mentioned things like the recommendations engine or what we're doing with advertisers on PMax that's also benefiting YouTube. I think where you're going is what about generative AI for storytelling or other areas. And we do look forward to, as I said, launching our various applications of LLMs here, and the concept of storytelling is a valuable part of it.
One more on Cloud. The Cloud segment had mid to high 30s growth for the total segment in 2022, I think 37% growth. Just talk to us about the key areas of strategic differentiation for Cloud to drive outsized growth versus the other competitors in the public cloud space.
So we keep coming back to the same couple of key drivers. One is our infrastructure, our AI infrastructure, TPUs and GPUs, that lets customers run AI models. That's valuable. Our data analytics is extraordinary. Our cybersecurity has consistently been leading edge, and being able to add Mandiant into the family adds the ability to have early incident response capabilities. And so that whole suite is super valuable, as well as our industry solutions.
And when I'm with Cloud customers, to your question, a couple of things that get called out, and that you'll be hearing more of, are Vertex AI, which really is a lightweight, simple way for our enterprise customers to run AI models; they don't need an AI scientist internally, so it's super valuable. And one that we've been doing for quite some time, but which is topical given a lot of the discussion around conversational AI, is all of the work with our Contact Center AI, which provides extraordinary operating leverage for our enterprise customers. CSAT scores are up, costs are down, and it's really leveraging our Contact Center AI. Verizon has talked about this a bunch, as have many others. And so there are a number of examples where, again, we're leveraging AI on behalf of customers, and it's both within our infrastructure and our capabilities.
Okay. I want to close on capital allocation. In 2021, you bought back $50 billion of equity; 2022, almost $60 billion of share repurchases. Just remind us again sort of philosophically, how do you come up with these numbers? Like how do you sort of think about what is the right amount of capital to be returning to shareholders? And what about the debate between buybacks versus the dividend as you compete for capital?
Well, our framework for capital allocation is unchanged. It starts with what we need in the business to support the organic growth of the business, and it then goes to acquisitions or strategic investments. And then we're very pleased to have this third lever, which is return of capital. At this point, we continue to believe that the share repurchase program is the most effective return of capital to shareholders, and we're pleased to have done $60 billion last year, and that's both the As and the Cs; it was great to be able to have both of those in the program. To your question, we look at the form and size with the Board on a pretty regular basis and are pleased with the form and size as we have it. So that's the way we're looking at it.
Ruth, thank you very much. We're excited for all the announcements and everything to come out of AI.
Thank you.
Thank you.