2023-09-12 15:43:01 ET
Arista Networks, Inc. (ANET)
Piper Sandler Growth Frontiers Conference
September 12, 2023 12:30 PM ET
Company Participants
Ita Brennan - Chief Financial Officer
Liz Stine - Director, Investor Relations
Conference Call Participants
James Fish - Piper Sandler
Presentation
James Fish
All right, thanks everybody for joining us. We have the pleasure of having Ita and Liz from Arista with us. I will try to actually open it up to see if anyone has a question, but as a lot of people know, I tend to ask a lot. But thanks for joining us, ladies. Really appreciate you making the trip out. And Ita, congrats again on the early retirement announcement. Looking forward to working with you for the next couple of quarters. But before retiring, we're going to have to pester you a lot on this little topic called AI that everybody wants to know about.
As we think about AI sizing here, a lot of people look at your supply chain. Broadcom talks about $200 million last year going to $800 million this year for their AI chips. Obviously, there's a lag in the supply chain, but as we start thinking out over the next year or two, how should we think about that lag in the supply chain from your angle with those AI chips? And is there a way to think about how big AI could be in the next one to three years, kind of whatever time horizon you want to have us think about?
A - Ita Brennan
Yeah, so it's pretty hard to bridge from the Broadcom numbers. I mean, obviously, all of those chips don't come to us. If you think back to our analyst day last November, which is kind of pre-ChatGPT, et cetera, we were talking about AI then, right? So, the 7800 product set that we have is a good AI vehicle and is deployed in some AI use cases today. Not kind of the post-GPT backend discussions that we've been having more recently, but definitely there's some AI traffic that's traversing some of that equipment today.
The problem is we don't really know what's an AI use case and what's not, right? Because that chassis has been deployed in DCI, and it's deployed in spine switch situations as well, right? So that's one factor: it's not as simple as just counting them. I don't know how Broadcom is counting it, honestly, in terms of where those go. So firstly, they don't all come to us. Secondly, we've got lots of use cases that consume those chips. And then thirdly, you have the supply chain, and you have the fact that you're still living in a 52-week lead time world on silicon for now. So, you're definitely going to be dislocated, right?
So, I think it's hard for us to tie back to those numbers. I understand why everybody wants to do that, but it's not something that we can quantitatively bridge through to their numbers. The sizing of the market, I mean, the industry folks are trying to do that. I think the 650 Group sizes it at $2 billion to $3 billion in 2024, growing pretty aggressively beyond that. Again, I'm not sure that anybody really knows exactly what that's going to be. We think about it as: AI is going to underpin kind of investment and innovation around the network for the next three to five years, which is great, right? And we're fully engaged in that.
But in terms of being able to slot it and say this is exactly what's going to happen, that's not clear yet, right? I think we still live in a world where we've talked for a long time about 2024 being a kind of digestion, muted cloud year. I think that's still our view, and that underpins the comments made on the last call. And I don't know if there's anything about AI right now that changes that dynamic. We think it absolutely can be a driver for the future, but it's going to take some time to get there.
Question-and-Answer Session
Q - James Fish
Yes. So, we frequently talk about networking, especially high-end networking where you guys play, in terms of speeds and feeds, right? Does AI actually accelerate this 800-gig roadmap, understanding we're just talking about 400-gig real material deployments today? Or is that 400-gig system actually good enough to handle those AI-based workloads?
Ita Brennan
I mean, certainly 400-gig can be deployed into those use cases and is being deployed in those use cases today, right? But obviously you want the fastest speeds that you can have, right? If you're going to generate more and more data, you want more and more speed, so everybody's going to want 800-gig to go faster, right? But that in and of itself is not going to make it go faster, right? There's still a set of work that has to be done.
I mean, you've finally had silicon delivered, so we're working kind of on the systems now, then there's customer qualifications, et cetera. So, it does take time to get there. There's definitely an emphasis on moving those dates as fast as you can, but still from experience, it just takes time to actually get to a quality product that you can qualify and ship in volume.
James Fish
Yes. Is there a way to think about the capacity that's sort of required for the average CPU-based application versus the average GPU application today, let alone down the road? And do we have to worry at all about a substitution effect within your revenue base of CPU versus GPU, or CPU versus AI workloads, essentially?
Ita Brennan
And Liz, if you want to take the first part of that question then….
Liz Stine
I think from the capacity standpoint, if you look at CPU connections today, they're connected at 10, 25, maybe 100-gig at some of these larger customers, and all those CPUs don't necessarily talk at the exact same time either, right? So, there can be some oversubscription built into the network. But when you're talking about GPUs, these things are connecting at 200-gig, 400-gig today, and they want to go as quick as possible. They all have to speak at the same time. So, from a network capacity standpoint, there are much more demands on the network from a GPU processing standpoint, right?
So, when you're building out an architecture for an AI cluster, you're looking at a non-blocking architecture, which means that all nodes can talk at the same time at full capacity, or maybe even undersubscribed, right, so that you have the capacity in case of a failure. So, I think they're definitely driving much more demand on the network, even just from a raw GPU standpoint.
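The contrast Liz draws here, oversubscribed CPU fabrics versus non-blocking GPU fabrics, can be illustrated with some rough arithmetic. The port counts and speeds below are hypothetical examples, not Arista figures:

```python
# Illustrative sketch of leaf-switch oversubscription math.
# All port counts and speeds are assumptions for the example only.

def oversubscription_ratio(downlinks: int, downlink_gbps: int,
                           uplinks: int, uplink_gbps: int) -> float:
    """Ratio of server-facing bandwidth to fabric-facing bandwidth.
    1.0 means non-blocking; greater than 1.0 means oversubscribed."""
    return (downlinks * downlink_gbps) / (uplinks * uplink_gbps)

# Hypothetical CPU leaf: 48 servers at 25G, 6 uplinks at 100G.
# 2:1 oversubscription is often tolerable because CPUs rarely
# all transmit at the same time.
cpu_ratio = oversubscription_ratio(48, 25, 6, 100)

# Hypothetical GPU leaf: 32 GPUs at 400G, 32 uplinks at 400G.
# 1:1 (non-blocking) because GPUs exchange traffic simultaneously
# during collective operations like all-reduce.
gpu_ratio = oversubscription_ratio(32, 400, 32, 400)

print(cpu_ratio)  # 2.0
print(gpu_ratio)  # 1.0
```

The point of the sketch is the denominator: a GPU cluster needs as much fabric bandwidth as it has server bandwidth, which is why AI build-outs demand proportionally more networking.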
James Fish
So, I think the number one question I've gotten this quarter on you guys is, why won't this all be InfiniBand-based when you talk about AI workloads? So, what advantage does Arista Ethernet have over NVIDIA's InfiniBand, especially when you include some of the other pieces of the NVIDIA portfolio overall that can help the capacity and scalability of what they've got?
Ita Brennan
Yeah, I think that really comes down to scaling these AI clusters, right? So, historically, maybe InfiniBand had some technology advantages. It scaled to 40-gig faster. It had RDMA. If you look now, we have RDMA over Converged Ethernet, which has been deployed in many of these high-performance compute clusters for the last 10 years. If you look at the sheer technology rate, we're at 400-gig going to 800-gig, and 1.6-terabit is coming down the road too. So, I think technology-wise, Ethernet has taken what it wants from the InfiniBand world.
And then when you talk about scale and you think about scalable technologies, Ethernet is what's deployed today. It's multi-sourced, which is very important to these large cloud providers. It's multi-tenant, which is very important if you're going to take these GPU clusters and carve them up and service multiple customers. It's open standards-based. It's the same operational model that they have today. So, I think there are a lot of benefits as these AI clusters scale out. Then it goes back to how big is it really going to be, right? And I think that goes back to whether you believe AI is going to be as pervasive as everybody expects, where everybody's going to want their data center to be AI-enabled.
James Fish
So, you guys have a huge advantage here. The thesis early on was, we're going to focus in on cloud and cloud-based workloads, and where you're seeing a lot of, essentially, the picks-and-shovels deployment today is with those hyperscalers, including two of your largest customers. And the big question that I think is on everyone's mind is, does that networking mix actually change when you think about AI-related CapEx versus the traditional kind of CPU-related CapEx? So, how should we be thinking about that mix shift of the percentage going towards switches and networking gear in sort of this AI world?
Ita Brennan
Yeah, I mean, that's a question that we get all the time as well, right? And I think you don't have a lot of quantitative data; usually, you get this data from white papers that are published by customers over time, after they've had some experience and they've deployed some stuff in their network. We haven't really seen a ton of that yet. Again, the industry analysts are starting to talk about it. We always talked about networking being kind of high single digits, 10% of the overall data center spend, and they believe maybe with AI it's more like 15%, right? Those numbers are thrown out there. But again, I don't know that you've had real validation yet across real-life kind of large footprints where that's been measured on a consistent basis. But that's kind of where the industry guys are putting it.
James Fish
And as we think about the cloud and hyperscaler potential refresh, even some of your large enterprises, right? 2017, 2018. Those were a good couple of years for you guys, given the 100-gig cycle was really ramping then, and the hyperscalers have kind of moved towards five-year, six-year depreciation cycles, depending on which one you want to talk about. But why wouldn't we also need to have that refresh of those systems on top of potentially some of the AI spending that you could get? So, what's the balance going to be, or what's the feedback on the balance, between what they need to refresh versus what they kind of need to invest, and how does that impact you?
Ita Brennan
Yeah, I mean, we can't forget they grew triple digits last year, right? There's been a lot of spending that's kind of already happened last year, and they'll be big contributors again this year, right? And then after that, I think it just comes back to where their priorities are going to be, right? And that's going to be driven by: how much new investment do they need to make? How much of that is AI-driven, versus refreshing their existing footprint?
I mean, I think on Microsoft's call, they were clear that they're going to end up having to make sure they invest to support Azure in the meantime, while they're kind of building out this new AI footprint, right? So, I think you're going to see some mix of that. Exactly what it is, we'll have to wait and see, but they've been making big investments over the last year, and this year as well, into kind of their existing footprint.
James Fish
Got it. And circling back on AI, we frequently hear about you guys selling it externally. But on the internal side of things, as you're thinking about your own kind of costs, are there areas where you would be leveraging AI that you are currently testing or planning, kind of from the internal perspective?
Ita Brennan
Yeah, I mean, the biggest place that it would move the needle would be around all of the software development resources, maybe a little bit on the hardware side, where you're using it in kind of chip design, testing, that type of stuff, right? And I think Ken is looking to see kind of what's truly valuable in terms of deploying it in that area. That's where most of our large spend items are.
So that'll be the focus, I think, to begin with: if there's something that can be done to make the software team more efficient. We're always constrained in terms of the talent that we're able to hire and find at the level that he wants to have it, and if there's something that can make that more efficient, there are definitely a lot of things on the list to go work on. That's probably the most important place, but he hasn't kind of concluded how he wants to think about that or how he would deploy that yet.
James Fish
Yeah, and I think one of the surprising things, including for myself, admittedly, this year is actually the resilience of your enterprise install base, let alone capturing new customers. So, I guess, what's enabling you guys to actually gain share within the enterprise specifically? And then I've got a follow-up around campus after that.
Ita Brennan
Yeah, I mean, it's a couple of things, right? I think there's a refresh in the enterprise. We've always talked about kind of digitization, transformation, whatever you want to call it, but we are actually seeing enterprises more and more reliant on their IT infrastructure in a real way, where it impacts their business if it doesn't work, right? And that causes them to want to make an investment. And then we have an opportunity to go and show the differentiation that we bring: how much more reliable and how much easier it is to manage an Arista footprint, because of the single image across both campus and data center.
And because of the visibility and all the other things that we can provide, right? So, I think there's a pull from an enterprise perspective, and then we're more ready as well to go play in that market. Now CloudVision is way more fully featured than it was before. It's doing a very nice job of managing everything across an enterprise footprint, and customers are really seeing value in that, right?
So, I think it's the combination: the need is there, and then we're just more ready. We've got kind of campus, data center, routing. We're able to address it better from a sales perspective, and we're continuing to add salespeople to that. So, I think it's the right time for us to be more successful and more targeted in terms of those accounts.
James Fish
Yes. So back in May, we were chatting and you actually said campus was starting to become like the tip of the spear into organizations, I believe, at that time. So, what is actually enabling you guys to get in there, as there's just this question as to why Arista has the right to win in that campus environment? And do you feel you're still on pace for that, I believe it was, $750 million goal that you outlined at the last analyst day?
Ita Brennan
Yes, I think, look, the reason why we went into campus is the same reason why we went into the data center, right? Again, campuses have become more complex. Even if it's an office footprint, there's hybrid work, there are all the different ways that people want to access the network, et cetera. So that's way more complex. But it's not just offices anymore; it's retail, it's hospitals, it's colleges. Again, the level of complexity has just gone up.
And that's driving this need for a simpler approach to the network, so that it can be managed efficiently and the complexity doesn't break it, right? So, I think that's a big driver of it. It all comes back to the same thing: this high quality, and then being simplistic in a good way, right, in terms of being able to manage the network.
James Fish
Yes. So obviously, AI's been a big topic, but now as we start to turn the page towards the fall here, the question we're going to get is, what does 2024 look like for a lot of companies? You're not going to guide here, I don't think, but maybe you can start a little bit here at Piper this week. Walk us through the puts and takes we should be thinking about for calendar '24. You had mentioned a digestion period; potentially, what else should we be thinking about?
Ita Brennan
Yes, I think, look, we'll do more of this obviously in November at the Analyst Day. Jayshree has long had, probably since the analyst day last year, this goal of being able to grow at least 10% in a year when cloud is muted, or cloud is kind of in that down, more muted part of the cycle, right? So, we talked about this a little bit on the call, right at the end, but I mean, that is a goal that we're kind of starting to feel better about. And obviously, we'll talk more about the different ways that you can get there in November at the Analyst Day.
But one assumption obviously is that cloud is more muted, and then you're relying more on enterprise and some of the other parts of the business to drive the growth, right? But again, whichever one we pick, it's never going to be the perfect answer for what actually happens, right? So, we need to have multiple different ways to get there, and we'll come back and address those a little bit more at the Analyst Day. But I think that's something that's firming up in our minds as possible, even if you assume that cloud is muted.
James Fish
Is there anything to think about for, more so, the purchase commitments or inventory turns? Like, where do you actually see inventory turns as kind of a normalized state for you? Or where do purchase commitments in a normalized environment actually go to, seeing as the way you've pointed us to think about it is purchase commitments and inventory together?
Ita Brennan
Yes. So, the purchase commitments are coming down nicely, and they should, right? Because we want to step back out of the contract manufacturers' supply chain now that everything is returning to normal, right? In order to kind of make it all work, we had to step in and backstop, if you like, purchase commitments on pretty much the entire BOM, which we never normally do, right? So now we're kind of coming back out of that, and we'll be left with the key components that we have always buffered, that we buy directly. That's what you're seeing in that raw materials line on the balance sheet; a big chunk of that is silicon. And the size of that is going to be very much linked to what the lead times are for those parts, right?
And right now, that's still at 52 weeks. Does it get better? We'll have to see. But that will drive that number. So, the purchase commitments are coming down, and they need to continue to come down. We're very focused on doing that, making sure we're taking advantage of every lead time reduction that we can get.
On the other side, you have raw materials, and that's going to be driven by those lead times. And then, the finished goods days of inventory is still not back to where it was kind of pre-COVID. The number is bigger, but obviously, the business is much bigger too. So, as things get better, we'll probably add a little bit more there, just to get back to kind of the same days numbers. And then really, it'll be about the raw materials and where those lead times end up.
James Fish
There's about five minutes left. If anyone has a question, please feel free to raise your hand and I'll call on you. Otherwise, I can keep going.
Unidentified Analyst
Yeah, I had one. How hard is it for you guys to get chips these days? I know they're all pretty tight, so how long of a lead time? Any color you could give us on that.
Ita Brennan
Yeah, I think, for the key components, the key chips, it has always been a longer lead time, currently like 52 weeks. But there was predictability in terms of what you were going to get and when you were going to get it. And I think that's still the case, right? So, if we give enough visibility and they make commitments, they've been executing those commitments pretty well. The problem children were the small parts that nobody really expected and that we hadn't taken real action to try to resolve early on, because we didn't understand they were going to end up being kind of the golden screw. But that is much better. I mean, that's really coming back to normal.
Unidentified Analyst
Yes, one from me. So obviously, there's a clear focus on AI data center build-outs, and typically you see the GPUs deployed first. But is there a way to think about the timeline from GPU deployment to when you start needing to really accelerate the networking purchases? Or how should we think about the lag between GPUs being deployed and Arista starting to benefit from the explosion of traffic?
Ita Brennan
Yeah, I mean, simplistically you'd say, once you've deployed those GPUs, you're going to want to make sure you've got enough networking around them, because that investment is so significant relative to the network that you're going to want the network to be there and not be a bottleneck in any way. I think what's happening right now in terms of those first deployments that we're seeing, at least a good chunk of those are going in with InfiniBand, because that's kind of the bundle that you're buying.
So again, that's why I think it does take some time before we start to see AI be a big driver. Jayshree has talked about trials with customers now around infrastructure, pilots next year, and then you start to kind of ramp to volume beyond that.
James Fish
Maybe just keeping on that AI theme, and probably more for you, Liz, because I think most forget that you're an engineer at heart. Obviously, you guys are the picks and shovels of AI here, and actually the systems behind it all. But how should we think about potentially other AI-based products coming out of Arista, or leveraging AI in your products?
Liz Stine
Yes. I mean, I think if you look at our AVA, which came out of the Awake acquisition, right? So, this is our network detection and response product, and we actually have some AI and ML capabilities that we utilize in AVA to detect anomalies within the network. It utilizes all of the EOS data, the state data that we have from the architecture, and it's able to kind of process through what is normal activity, what is abnormal activity, what needs to be flagged, what needs to be remedied. And so, we're offering that today as AVA. I think we'll continue to work on those types of product lines and see what else we can do with the raw state data that EOS captures.
James Fish
Yes. Anything to call out in terms of sales cycles for this year? Could they actually start to get better in the back half of the year as we lap some of those tougher sales cycles? Are they picking up at all, or is there anything to think about as we go into '24?
Ita Brennan
In terms of seasonality or...
James Fish
Just in terms of customers sweating assets for longer, and those deals that you expected to close maybe getting pushed out potentially? Or is that starting to slow down, and are we actually seeing those deals, especially on the non-AI, non-hyperscaler side, actually close?
Ita Brennan
Yes. I don't know that we've seen any huge difference there. I mean, obviously, the lead time change is a big deal from a customer perspective, because instead of having to make decisions for 12 months from now or even longer, you're now kind of starting to get back to maybe like a six-month lead time, so you can make decisions in a closer horizon, right?
So, I think that's definitely a factor. Some customers hesitate to decide something for 12 months from now, and once those lead times get shorter, they don't have to. I think, yes, we'll lose some visibility, but from a customer perspective, I think it's really important, right? It gives them a lot more flexibility, whether that's the enterprise or the cloud, right? I mean, to have to wait 12 months to get a switch once you have a need, that's just too long. So, it really is good to start to see those lead times come down, just from a pure business perspective.
James Fish
And we're coming up on time here, and it's a little bit bittersweet for me. But obviously, you've announced that you're going to be stepping down, retiring here, and congrats again. I guess, why retire now? And given that you're involved in the CFO search, what is Arista looking for out of the next CFO?
Ita Brennan
I mean, why? Honestly, because I can, right? It's something I've been thinking about a little bit more. I've got a lot of family in Ireland and stuff, so it will just make that whole thing a lot easier. What are we looking for? I mean, again, you've met the team, you know the team, right? It's a very smart team, and it's a culture where everybody is humble and focused on the business. Finding somebody that fits that, that can kind of help and support the team going forward, I think that's the key, right? So, culture is a big part of the search for sure.
James Fish
Awesome. Well, thank you very much for making the trip out, and congrats again on your retirement and looking forward to the next big thing out of Arista.
Ita Brennan
Okay. Thank you very much. Thanks.
James Fish
Thanks, everyone.