Over The Edge

Digital Infrastructure in an Emerging AI World with Greg Cudahy, Global Leader of TMT and Global Industries AI Leader at EY

Episode Summary

Organizations need access to digital infrastructure and communications networks to leverage AI in a cost-effective way. In this episode, Bill sits down with Greg Cudahy, Global Leader of TMT and Global Industries AI Leader at EY, to discuss digital infrastructure in an emerging AI world.

Episode Notes

Organizations need access to digital infrastructure and communications networks to leverage AI in a cost-effective way. In this episode, Bill sits down with Greg Cudahy, Global Leader of TMT and Global Industries AI Leader at EY, to discuss digital infrastructure in an emerging AI world. The two talk about the judgment calls we will have to make about AI and how culture impacts AI use.

 

---------

Key Quotes:

“The people part of generative AI is every bit as important to get traction as the technology is.”

 “All the forecasts are saying that roughly 75 percent of the data over the next 10 years that we see, that's acted upon for AI or any other business use, is going to come from edge computing. It's a groundswell. This is a huge trend.”

“As edge computing rises, you're creating more points of entry for bad actors. So the importance of cyber has never been higher.”

“Moving all this data that AI requires means huge investment in communications networks, even more than we've got today…. People don't think about moving the data. They think about what I'm going to do with it. They don't think about where this gigantic swell of data is going to go…. Enabling communications infrastructure has got to be a priority for virtually every nation state right now.”

--------

Timestamps: 
(01:25) How Greg got started in tech

(09:38) AI shifting from feature/function to business outcome 

(14:35) Bespoke models at scale 

(19:00) What bandwidth do we allow AI to work within?

(32:59) How corporate and geographic cultures impact the use of AI 

(34:46) Navigating regulation and security concerns with AI at the edge 

(41:29) The need to invest in communications networks to move data

--------

Sponsor:

Over the Edge is brought to you by Dell Technologies to unlock the potential of your infrastructure with edge solutions. From hardware and software to data and operations, across your entire multi-cloud environment, we’re here to help you simplify your edge so you can generate more value. Learn more by visiting dell.com/edge for more information or click on the link in the show notes.

--------

Credits:

Over the Edge is hosted by Bill Pfeifer, and was created by Matt Trifiro and Ian Faison. Executive producers are Matt Trifiro, Ian Faison, Jon Libbey and Kyle Rusca. The show producer is Erin Stenhouse. The audio engineer is Brian Thomas. Additional production support from Elisabeth Plutko.

--------

Links:

Follow Bill on LinkedIn

Follow Greg on LinkedIn

Episode Transcription

Producer: [00:00:00] Hello and welcome to Over the Edge. This episode features an interview between Bill Pfeifer and Greg Cudahy, the global leader of TMT and global industries AI leader at EY. Greg is responsible for driving EY's portfolio of technology-related relationships into other sectors, blending emerging technology with business innovation to accelerate client innovation.

He also leads a 62,000-plus person team focused on helping clients increase competitiveness and deliver agile corporate strategies for growth. In this conversation, Greg discusses how his clients are thinking about AI and the necessary increase in focus on digital infrastructure and communications networks.

He also dives into how culture impacts the judgment calls that we'll have to make regarding AI use. But before we get into it, here's a brief word from our sponsor.

Ad read: Over the Edge is brought to you by Dell Technologies to unlock the potential of your infrastructure with edge solutions. From hardware and software to [00:01:00] data and operations, across your entire multi-cloud environment, we're here to help you simplify your edge so that you can generate more value. Learn more by visiting dell.com/edge for more information, or click on the link in the show notes.

Producer: And now please enjoy this interview between Bill Pfeifer and Greg Cudahy, the global leader of TMT and Global Industries AI leader at EY.

Bill Pfeifer: Greg, thanks so much for joining us today.

I cannot wait to have this conversation and find out more about your perspective and your views. You, you have a view into a wide swath of the market, and this should be a lot of fun. Just by way of introduction, can we talk a little bit about your history? How did you get started in tech? How did you get here?

Greg Cudahy: Sure, and I'm glad to be here, Bill. Appreciate the opportunity. I've been in tech forever, pretty much. I actually started at 16 as a programmer for a research institute to help fund myself through college, and did everything like [00:02:00] MISLFN flutter analysis and bird strike on canopy analysis, and included doing things like the space shuttle's fuel flow routine at 19.

So that was kind of an interesting thing. I got paid $4.25 an hour to do that, which, had I known then what I know now about those things... But it was a great intro to tech, and I came out of undergraduate and then went to work in hardware with Texas Instruments, and did sort of a career shift, went back to get my MBA, to do more of the business side in tech.

I've worked throughout the rest of my career, since I got my MBA in '88, really in professional services, but also software. So I've sort of been in and around this industry one way or another for a long, long time and seen a lot of change.

Bill Pfeifer: I can imagine. Now, today you lead a major division of EY.

Your team is responsible for something north of 10 percent of their overall business, which is amazing. So you have a really good cross-sectional view of what's going on globally across the [00:03:00] market, across a bunch of different markets, I would imagine. How have you seen customer requests, interactions, and needs change over the past couple of years, right?

We've seen so much advance with data, data management, artificial intelligence, move toward edge, all of this stuff. It's kind of all coming to a head right now. And I would imagine you're seeing more of that movement than most of us.

Greg Cudahy: Yeah, I think, you know, the part of the business that I've been involved with is, it's called TMT, it's tech, media, and telco, and we actually have 18 sub sectors inside of it, right?

So it includes the obvious things like software companies, semiconductor, hardware, but it also includes various different kinds of entertainment, like theme parks and cruise ships. It includes telecommunications, and you're right, so it's a broad swath. And it's kind of interesting because it's what our clients are seeing, but many of these companies that are TMT clients are also involved with all the other industries.

When you're working with a Microsoft or a [00:04:00] Dell or an NVIDIA, there's the work that we're doing directly with them, but there's also the work that we're doing with them in other industries. So it is kind of an interesting thing. And I think you can break it into a couple of things. Obviously there are industry-specific things; for example, advertising and network equipment companies have very different views as to things.

But I think if you look three years ago, it was coming out of, you know, sort of the hard lockdowns of the pandemic. It was really about reopening business and reaccelerating three years ago. Then the real big change has happened in about the last 12 to 15 months, with the obvious topic that everybody's talking about, which is generative AI.

And I think it's really interesting because a lot of the stuff we were doing coming out of, you know, those first year or two of the pandemic was industry specific, right? And it was a lot about getting back on the growth agenda. There were some who were looking, you know, like telcos in [00:05:00] particular, who were looking at cost cutting to create capital to go ahead and drive growth.

So it did matter, but now generative AI is across everything. The applications of it are very different, but I think it's really interesting, you know, some of these other things you had to sort of push clients to think about, the early adopters were in some cases few and far between. But I think what we've really seen now is everybody is looking at generative AI right now.

There is not a client that we have that's not looking at it. The interesting thing is some of them really know what they want to do with it, but a lot of them have budgets. We've walked into many clients and they say, you know, we've got, X millions of dollars to spend on AI. Well, what are you looking to achieve with it?

Well, we don't know. And so that's been a very interesting part. But the one common theme through all of it has been this notion of digital infrastructure. I don't think many companies, certainly outside of TMT, but even within TMT, were really clear on it. [00:06:00] The war for talent and things like that has always been around.

It'll continue to be. Talent's always going to be scarce, but infrastructure to actually run AI, and being able to do that at different parts, you know, in the cloud or at the edge, things like that, that particular thing is now what everybody is focused on. It's not only how do I get access to infrastructure for AI, but how do I actually make sure it's cost effective?

Because there are different cost profiles for doing it. So it's been quite a shift coming out of the pandemic, and then with this generative AI launch that's attracted everybody's attention. And it's real, and it's not something that's gone through a typical hype curve like we've seen with other technologies.

Bill Pfeifer: So it's interesting, you were saying before, going into the pandemic, companies were trying to cut costs to move in certain directions, but now everybody's interested in generative AI. Are companies trying to pare down their operations elsewhere? Are they doing net new investment? Is this just [00:07:00] a massive expansion? Where's that coming from?

Greg Cudahy: Yeah, that's a really good question, because some of the cash-flush businesses are just going fast for impact, right? And I think that the cash-flush businesses are more likely to use AI to go ahead and do things that are growth oriented, that are customer oriented, that may even be like product development, product design oriented.

It's really not a question of, are you going to use AI for one thing or another? It depends on the industry subsector and what you prioritize. So your whole point about cost cutting, if you look at some of the telcos, for example, telcos are huge companies. They throw off great dividends if you're an investor, but also it can be a very, very capital intensive business and AI is all about that capital.

So a lot of the work that we're doing, when you go into telcos, for example, is to help them sequence it. So their initial use of AI is actually freeing up cash flow so they can invest [00:08:00] that cash flow elsewhere. And it's mostly around the AI topic. It's not just AI, it's AI plus other things as well. But I think it's a really interesting question you've posed here: it actually varies. Even though they're all going the same direction in terms of the application of AI, how they do it and how they sequence it differs.

Sometimes sequencing is more important than timing, frankly, and sequencing, whether you need cash or whether you need market growth or whatever, it's very, very subsector dependent.

Bill Pfeifer: Wow. It really is going to be fascinating to see where this goes, because it's moving so fast and across so many companies. It's a giant experiment globally with what we're going to change our businesses into.

And where that goes is going to be fun to watch over the next couple of years. I would imagine it's going to settle out within the next five to ten, but until then, it's going to be a bumpy ride. So you mentioned NVIDIA a couple of minutes ago, and also Gen AI, two of the darlings of the tech world right now.

Everybody's kind of focused on them. [00:09:00] I think you were at GTC not too long ago.

Greg Cudahy: That's right.

Bill Pfeifer: Which I can imagine with their stock valuation and everything going on, that must have been unlike every other GTC before it. What was your highlight of that conference?

Greg Cudahy: Yeah, it's interesting. I could talk about the technical highlights of it, but actually we had a pretty large team there, and the big change, not just for GTC, but actually for the others who were involved in the infrastructure part of AI, is that, for the first time, we were seeing as many or more business execs and business-oriented folks there as tech-oriented and development-oriented folks.

These have typically been very heavy, tech-development-focused conferences, GTC being the most obvious one. Now, suddenly, there was standing room only for things like, how do you develop an AI business case? I don't think five years ago [00:10:00] you would have had more than 20 percent of the room filled for that kind of a topic.

Now, everybody is focusing on the business aspects of AI in general, but generative AI in specific. One of the reasons why I think the hype cycle, there's always a hype cycle, but why I think it's faster and the likely dip is going to be shallower, is that suddenly the gap between technology transformation and business transformation is shrinking close to zero.

You could see it by the change in the audience. And again, we could go to the tech aspects and all of that; there was a lot of great tech covered, but the really important part is that business and tech are coming together very rapidly on this topic. And it was very evident from GTC.

Bill Pfeifer: So did you notice, I mean, you were saying sessions like the business case of AI would be full.

Was there a major shift in the types of topics, more toward the business side, the less technical [00:11:00] side? Was it still as technical as it was, and the businesses got more technical?

Greg Cudahy: That's an interesting way to phrase the question, because it's not like there was a reduction in tech. I think that that's accelerating as well, right?

So there were a lot of hardcore tech folks, you know, really deep in the bits and bytes of things, but it was also at a scale that it hasn't been held at before. Now, it was virtual a few years before, but if you compare this to all the physical ones in the past, this was by far the biggest. So it wasn't as if you were trading one off for the other.

So, like you said, the topics were clearly more business oriented. I think it's a shift from feature/function to business outcome. And what we're finding is that what really makes success right now, whether it's in our industry, you know, professional services, or it's people applying this in corporate environments, is tech-savvy business people or business-savvy tech people. That sounds straightforward, but honestly, finding either of those in the market today is at such a premium.[00:12:00]

It's so hard to find. I think it's not only the topics that were there, but the people who were there. And I think there's a lot of people who are saying, how do I expand my own personal career by figuring out this beast that is generative AI, and what role I can play or how I can stretch myself.

So it was kind of a, not just topics, but a little bit of a human interest aspect from my perspective as well.

Bill Pfeifer: I like the point that you made a minute ago about generative AI shrinking the distance between the business and the technology, and helping the business absorb it, right? Helping the technology apply to the business, connecting the AI deeper into the business and interpreting results for the business and things like that, as opposed to just analyzing the data.

Hindsight as a service, if you will, right? Getting more into business-type outcomes, which is sort of fascinating. It makes sense that NVIDIA would, and I think that's a really good way to start to lead that, because they seem to be driving so much of this, that being the biggest driver of GPU sales right now, I would imagine.

Up until now, with edge deployments, in particular AI deployments, data management deployments, all kind of, you know, coming together as one, they've been very customized, very bespoke: here's the way I'm trying to solve this business problem for this segment, different per vertical, per customer, per use case,

even per location. And they were all snowflakes. As you're seeing more of these deployments, more of edge, more of AI, more of data generation and management, are you starting to see more commonality, more best practices, more, you know, central themes coming out or are they still snowflakes?

Greg Cudahy: Yeah, there's a lot to unpack in what you just teed up.

It's really interesting, because I think if one's worked in, like, robotic process automation [00:14:00] or some earlier versions of machine learning that I've been involved with, you tend to think about building these rule sets that you then create a library of. And I think in things like robotic process automation, you're almost definitively taking something that's standardized, and this ends up becoming a best practice.

And then it becomes easier and easier and easier and so forth. With AI, even though I think intellectually we understood how generative AI was going to impact operations, when you actually get to the individual clients, you find out that just because they're doing the same thing in the same industry, even mostly the same way, the difference with generative AI is, I use this term, the personality or the culture of the company affects what answers you want generative AI to give and what bandwidth you're willing to give it.

And then you add in edge, which is, you know, a lot of this has been analytical stuff, but the edge stuff is real [00:15:00] time use of basic AI. So it makes it incredibly complex. So where I think it is: I've sort of got this catchphrase, which is bespoke at scale, which is a little bit of an oxymoron. What you're trying to do is,

you're not going to do it the way, you know, a lot of the old 90s ERP companies said: well, you know, I built it on the first client, then I do two more, and then for everybody after the third customer, do it this way because it's best practice. That has gone the way of the dodo bird, I think, largely, and not just for generative AI.

So what we're trying to figure out is how we create sort of a deli menu of different capabilities, and then we have a fairly thick client configuration layer. And the last point I'll make, 'cause I know I've gone a little bit long on this, is that with AI, it's not like the old view of enterprise software.

You build it, you define it, then you wait for an update. So you're creating the [00:16:00] ability to have dynamic updating of your operating systems, right? That takes a lot of focus and understanding about what you want generative AI to do or not do, implied in the beginning of your question, right? What do I want to actually let it do?

Do I want to let it change my manufacturing run in my manufacturing line in real time? Maybe I do, within some parameters. What's really important is that every company is going to make a different decision on how they use that. Therefore, I don't see this becoming, hey, it's best practice, this is the standard for the network equipment industry, or this is the standard for advertising.

It's going to be more, yeah, we have some rule sets we can use there, but what I call the configuration layer is conceptual as opposed to literal; the way a company uses that, the way it applies it, is going to be very different, dependent on their leadership and their strategic intent.

Overall, that's why I really think business and technology or the [00:17:00] gap between the two transformations is shrinking to zero.

Bill Pfeifer: So funny that you were saying bespoke at scale, because that was one of the promises of AI, right? It'll let us do mass customization. And you know, you can go into a retail establishment and order just the shirt that you want,

that's customized to your fit and things like that. And now AI itself is customization at scale, mass customization, bespoke at scale. You're still seeing reasonable amounts of that customization, right? Bespoke at scale. Do you think that's going to change? Is it going to come toward the middle as companies start to evolve?

I mean, you were talking about changing your manufacturing line on the fly automatically, which is fantastic and amazing, but terrifying. What if it makes the wrong decision and it destroys my manufacturing line? So, you know, you have to generate some [00:18:00] trust there. And if you're not already fully data driven and don't really know where the guardrails are within which it can work automatically, then that's going to be just outright terrifying.

But as you develop some experience with that, maybe you'll move more commonly toward that. Is that, again, is that likely to come toward the center, or is it going to be like that always?

Greg Cudahy: Yeah, I like the way you've characterized this, because the really brutal, frank answer is we don't know yet. Directionally, absolutely.

You're absolutely right that we will start getting to a point where there are levels of, you know, AI in general, but generative AI in specific, that we will allow. And what this is going to be, and there's not a lot of discussion about this yet, though I think there will be an increasing amount of discussion, is: what bandwidth do we allow our generative AI to work within?

And I think what you're going to find is there's going to be a lot of learning, a lot of creation, and certain processes will [00:19:00] converge rather rapidly into almost an industry standard. But once it starts touching, am I trying to take market share? Am I trying to maximize profits by trying to go ahead and generate cash flow, you know, with tax considerations and all that? There, again, you just won't get to an industry standard, because those are judgment calls.

And I think that's the other thing: everybody's worried about what AI will replace. Let's face it, there's judgment that AI cannot execute on its own. AI does not do a great job of evaluating risks. It can only give you really fast analysis of something. It can't tell you whether you should take the risk or not.

So if we go to the examples that we're talking about, when you're doing something that is analytical and there's lots of time, you know, I can go ahead and have generative AI come back to me with recommendations and things like that, and I can consider them. But if you're doing something that's operational, I'll go back to the manufacturing line, a great edge [00:20:00] example.

You may allow it to shift and change what the machine is doing within certain proven parameters. Generative AI may identify something new, and this is, we're actually doing this with a client as we speak, that's why I'm speaking with some level of detail: the generative AI may actually say, here's a way to improve this performance right here, but you may not allow it to actually do that until it's been elevated to someone above that line level, maybe within the plant, maybe within the network of plants, to say, okay, we're going to authorize this.

This also means we're going to have real hybrid models between edge computing, on-premise computing, and public cloud. And you're going to have different kinds of generative AI in there, some that have got to work really, really fast, on the manufacturing line, obviously, but then you've got things where maybe I'm going to distribute that learning from that particular plant to my other 200 plants around the world.

Once again, you've got to decide, who do you trust? Do you trust the generative AI to do that on its own? [00:21:00] Probably not initially, especially with hallucination rates still not close to zero. But you're going to go ahead and eventually allow more and more over time. So, the part we don't know is, we know it's going to happen, so there's going to be some converging on these practices.

But how it will manifest itself and how it will change, I don't think anybody can reliably say that for most business processes.

Bill Pfeifer: And you mentioned hallucination rates, which is another interesting topic, right? It's a funny thing. People started freaking out when they saw generative AI make mistakes. Well, people make mistakes.

We wanted the AI to be like people. Guess what? It is, right? And you know, we're working with large language models, so it can tell you the different types of women's shoes and which wine goes best with dinner, and write the answer in the form of a haiku. And so, until we get down to small language models or focused models that are, you know, trained [00:22:00] within the constraints of a manufacturing product line.

Okay, now when it hallucinates, it's going to be at least very, very close to the right answer, because it's not trained on, you know, all of the pop song lyrics of the last 20 years, which is a little scary. Are you starting to see more focused models coming out, or are they still reasonably standard?

I mean, you can't cost-consciously train your own large language model, but then we get into smaller language models or hybrid models or something like that.

Greg Cudahy: Right, you're spot on with the inference there, no pun intended. The fact is large language models are needed for a lot of these consumer types of applications, honestly.

And they're quite expensive, right? And there's the ability to update them with the latest data as well: you see with many of these popular platforms, this data is through, you know, December, I'm making the dates up, but December '21, this one's through January of '23 and all [00:23:00] that, and to keep up to speed, you've got to actually

refresh and renew those. In a lot of the enterprise applications that I'm talking about, the ones that will actually transform companies, it will be, well, what counts as a large language model is, I guess, in the eye of the beholder, but it will be what I call more medium language models, and in some cases what you'd call small language models. It depends on what it's for.

And I think there's not only the question of the initial source data, but let's take that manufacturing example again. You know, say we're producing industrial equipment. We learn something, and we share it through the network, and we've already scanned for all the manufacturing learnings that are out in the broader world through these large language models.

Now I'm taking it back into my business, I find some innovations, and do I actually want to share that back outside? In most of our clients' cases, they don't; they want to take in everything they can. I think there will be some industry initiatives to start sharing these, [00:24:00] but these are strategic advantages when they learn something in manufacturing or in customer service or in broader supply chain things or in product design.

When you get these new things that your generative AI is helping you create, do you want to share them back? In some cases, yeah, because I want people to learn about my product. But do I want people to know how I do that in my product? You know, your moats are relatively short in terms of, you know, they're months now instead of years.

But do you want to give that back? That's a real big question. And that's why it's so important that one designs these things not only looking at the intended outcome, but at what the unforeseen consequences of some of these things are. And I have to say, that's one of the things that our clients actually are spending more time with: what could go wrong, as opposed to what will go right. And the what-will-go-right is

absolutely mission critical, but it's also [00:25:00] making sure that I'm not doing something, sharing out strategic information that I don't want shared out, or at least not at a time that will disadvantage me. Those are hard decisions to make. It comes back to, I can't make those decisions for you. That's about leadership.

Bill Pfeifer: And the question of whether you share your results with the rest of the world, and how, becomes a much bigger, it's a confusing question. And I'm looking at hyperscalers starting to make their own processors and their own AI accelerators, which I presume is to run their models. So it becomes more insular, more, I don't want to say locked in, but maybe.

And then we have Meta, which just open sourced Llama 2 and gave it away. So, you know, on the one hand, we're seeing things closing down and becoming more customized per where you're running it and who is running it. And then on the other hand, we're

So it's, it's more of a, more of a long term large scale type commitment, which is, which is fascinating. And the motivations for one versus the other, nobody likes lock in, but then you give stuff away and it's given away. That's going to be an interesting conversation. I would imagine that's going to drive a lot of of your business as well, right?

Greg Cudahy: I think the interesting thing, Bill, about what you're saying there about open source is that for the technology itself, there's going to be more of a bias towards open source. But, you know, is that definitive? I don't know, but that would be my expectation just looking at some historical trends. The application of the technology is another matter, because you're actually using it, and we're using it in our own business, on our own business, for insight generation where we could put

you know, hundreds of people just looking at this, at what AI can do really fast [00:27:00] for us. And those insights that we learn about ourselves, those insights that, you know, manufacturers may learn about themselves or retailers or whatever: the real question is, do you have control of what you do and don't release to others?

And I think that's going to be the defining point. On the technology itself, I have a different view on open source than on open sourcing of, say, aviation insights if I'm an aerospace designer. That, I think, would be the battleground, because that's the part, by the way, that really still requires scarce human capacity.

And a lot of people are worried about what happens, how many people is this going to displace? Well, it's going to displace the rote. It's not going to displace judgment and creativity and risk taking and all those things. So I think those will become bigger priorities for companies. With AI, if you're a leader in it right now, [00:28:00] you're going to have a big advantage over your competitors, especially if you get access to the infrastructure.

But I'm not so sure that it doesn't soon become table stakes, soon being, you know, five to ten years. The real differentiator is going to be the kind of people who can take advantage of it and make the decisions on the questions you're raising right now.

Bill Pfeifer: Sure. So getting a little more tactical, I guess: when you have a project that's AI deploying at the edge with some data collection, data management, do your customers typically ask for

an edge deployment, help with an edge deployment? Do they ask for help with an AI deployment? Do they ask for a business result? I'm always fascinated and still trying to understand how these things come about. What level of this are customers thinking about and asking for?

Greg Cudahy: Yeah, it's a very insightful question, right?

Because customers are coming to us in very different [00:29:00] ways. Really, many of the larger ones come in with, we want to do X with generative AI. And I know this isn't all about AI, but it's true of some other spaces as well, beyond AI. And what we've really tried to do is say, hey, look, let's understand your business strategy first.

Okay, right? And usually when you're doing a lot of tech work, it's like, this is a backbone system, it's about storing data. Yes, we need to know what data you need, but how you're actually going to do your business strategy, it's, why are you asking me that? Well, actually, with AI it's really important that we get down to what you're actually looking to achieve, as opposed to saying, hey, I think I know AI can do this.

I hope that distinction is kind of clear, because, and we call it outcome centricity, there are many ways to go ahead and deal with these, and frankly, some of the things that are small applications of generative AI, honestly, can [00:30:00] be done much more simply than some of these things. So we differentiate what we call TAOs, transformative AI opportunities,

from sort of point applications. And a lot of the point application stuff is good to learn from, but do you really need all that power to solve something that, frankly, you could handle much less expensively with a couple of simple rules and some robotic process automation? Yeah. So I would say, to your question, clients are coming to us saying, I've got a business outcome. The most thoughtful of those clients are saying, here's what I'm looking for, what can I do? Because they also know that infrastructure is still a limitation. So some of them can't wait

The most, the most thoughtful of those clients are I'm looking for. What can I do? Because they also know that infrastructure is still a limitation. So some of them can't wait. for that business problem to be solved until they have infrastructure. So it's, it's not as simple as everybody comes with one thing or even one, one or two methods is the industry norm way to do.

Bill Pfeifer: And what sort of timelines do you deal with for projects? Like, [00:31:00] how long from ideation? If someone came to you today and said, you know, I need help with a business outcome, what does that look like in terms of the engagement, typically?

Greg Cudahy: Yeah, it ties really nicely to what you were talking about before. We have a variety of proofs of concept that are proofs of concept,

and I really don't like "proof of concept." The concepts work. It's really proof of application, right? The technology works; do you have the infrastructure? Who knows? But for the ones that are fairly focused, we can come in and do an assessment, be up and running, using AI tools themselves to help develop the AI.

We can be in there in a matter of weeks, and then actually be in production in a matter of months, if it's a fairly narrow-scope thing. On the other hand, we're working with a major tech firm and we're looking at a three-to-five-year effort for them to completely AI-enable every single part of their business.

It still starts out with some small things, because you've got [00:32:00] some doubting Thomases, right? So why don't we go ahead and apply this to something that's a current problem, and as part of the change management process, show people what it feels like to have generative AI working on your behalf? I do think these can be quick.

What actually takes the longest, if you have the infrastructure, is getting people to agree on what you want done. And, to your earlier questions, what are you going to allow generative AI to do? And there are corporate cultures, and I will say there are even geographic cultures.

Some cultures and some industries are not really willing to go ahead and take their hands off the wheel, even if it would be better to do so. We've got a client in Asia. We've actually been able to show that we can allocate inventory, reallocating incredibly fast. Well, for their, you know, midsize business customers, they're comfortable:

okay, we'll let the system do that. But if you're going to make a big change for one of their tier one customers, [00:33:00] that has to go through senior executives. They have to contact the customer and things like that. Well, guess what? Ironically, it means your mid-scale customers are getting faster decisions

than those that you want to treat as better customers. So that's what I mean by this notion of culture affecting how you deploy the technology. The people part of generative AI is every bit as important to get traction as the technology is.

Bill Pfeifer: That's an interesting point to consider, but yeah, I guess it's not just learning how to use it, but learning how to trust it, and where to trust it, and how much to, and what the priorities are, right?

Right. Yeah, and I had thought about that as related to business culture, but not the different business cultures geographically, which is a whole different layer on top. So closer to home, just top of mind for me right now, I spent the past week working on some security training. That's what I happen to be doing this week.[00:34:00]

There's a big chunk of that that's focused on regulations: the levels, you know, the recovery times that are required, and the fines you can get if you don't meet them, and all this other stuff. But it's a very heavily regulated space and there's so much momentum, right? New requirements coming into place, industries and nations and just best practices and regulations and things like that.

What are you seeing in the edge and AI space? Are you seeing much regulation? Is it still too new? Is it coming?

Greg Cudahy: Yep, yep. I think I'd break that into two parts, just the overall topic of security and cyber. Let's start out with that, and it is related to the edge point. We'll hold off on AI for a second. The whole security aspect: first of all, all the forecasters are saying that roughly 75 percent of the data over the next 10 years that we see, that's acted upon for AI or any other business [00:35:00] use, is going to come from edge computing.

It's a groundswell. This is a huge trend. The interesting thing about what you brought up about the cyber training is that as edge computing rises, you're creating more points of entry for bad actors. So the importance of cyber has never been higher, because when you take AI, which can make decisions happen at the speed of light, it can also make problems happen at the speed of light and give people access at the speed of light.

This is an incredibly important area; cyber is a huge focus, and it's only going to get bigger. And then you add the geopolitical things, which aren't part of our podcast today. But the fact is geopolitical tensions drive security concerns, full stop, right? So that's the first half.

The second part of your question, which is the regulatory aspects: it's so different depending on which government we're working with, right? And you're [00:36:00] even seeing right now, just on the privacy aspects, a lot of countries are now looking at, okay, I want to use public cloud, but I want commitments that the data stays only within our national boundaries.

That's obvious for government applications themselves, but even for consumer ones. And so now this raises it to where technology is now a strategic priority of nation states. It's always been somewhat, but it's really important right now. And so therefore, if you look at the regulatory environment, there's some of this: you know, the U.S. tends to regulate after something has been tried; the Europeans, and these are broad statements, tended to regulate in advance. I think with that ability to regulate in advance of things, the technology is moving so fast, it's hard to anticipate it. And so we're seeing very disparate regulatory environments.

So it's the two coming together. You've got cyber concerns, which are certainly a national security concern, certainly an [00:37:00] economic concern. And then you have regulators trying to deal with that through regulation, but regulation is moving far slower than the technology is. So I don't think there's any definitive answer, but the path of travel is that you're going to have to be able to work with the regulators as you make big-scale changes with these technologies in most countries. Now, the notion of public-private partnership is going to continue to accelerate, from our perspective, because there's too much concern about what happens if these technologies are let go on their own.

Is it a valid concern? I don't know the answer, but I do know that the idea of public-private partnerships, dealing with the regulatory and the security issues together, is just going to be a rising trend, and I think exponentially so.

Bill Pfeifer: Okay. And you touched on something there that we haven't touched on before in this podcast, which is geopolitics.

And boy, from an AI perspective and an edge computing perspective and global [00:38:00] implementations, there's some war, there's some rising tensions, there's economic uncertainty, there's political uncertainty globally, and public-private partnerships, but then what about when the private part is in a different nation state than you're in?

How in the world do you, how do you handle doing business cross-border with all of that stuff going on?

Greg Cudahy: Well, there are a couple of perspectives on it. One is just your cross-border point. And you get all this media discussion of globality, and I'll stay away from all the hype and all that, but I'm in Palo Alto today, and I'm at breakfast, and there are at least, in the breakfast room, 12 or 15 different nationalities all coming to, you know, to Palo Alto to work with different companies.

And so the cross-border collaboration in terms of the technologists themselves still remains pretty good. But the regulatory side that you brought up [00:39:00] earlier can either impede that or it can lift that. And I think it's almost like we're seeing a lot of accelerator, brake, accelerator, brake, accelerator, brake, because if you over-regulate, you don't get innovation.

If you don't regulate enough, you get innovation that has unintended consequences, especially when you add the geopolitical part. So I don't think there's any recipe that works. I do think one thing, to the greatest extent possible, when one is trying to achieve something, is getting all the parties in the room early; not dealing with the regulators

only after you've gone down the path is really important. And I'll give you a great example of how many leaders of state now consider technology. If you've ever been to Viva Tech, which is held in Paris, that's a very large European-based conference. And Macron, you know, the head of France, he's there. He's a keynote frequently.

He's a keynote frequently. [00:40:00] When did we expect nation state leaders? To be at tech conferences. He's not just there, he's walking the halls, seeing the latest technology. And I think you have this geopolitical thing. Technology is now in the top three of almost every government's agenda. Right? So, so I do think.

figuring out a better way to do these things, as you've raised here, is interesting. It's almost like there needs to be a little bit more of a cookbook for how regulators, technologists, and industry work together, and that includes financing as well, right? There's really not a playbook on how to do that the right way today.

Bill Pfeifer: And it connects into your comments about Gen AI bringing the business and the technology closer. Maybe we're also bringing the politics and the business and the technology closer, either with it or because of it or for it. And we'll see where that takes us.

Greg Cudahy: Yep. One add-on thing I wanted to say: we've [00:41:00] talked a lot about AI.

We've talked a lot about digital infrastructure. The one thing we haven't talked about is the communications side of things. And moving all this data that AI requires means huge investment in communications networks, even more than we've got today. So to your point about working with the regulators in the nation states: in the vast majority of countries, it's the regulators and it's the legislative side that are allowing you to develop that capacity that is required to make all these things work in the first place.

People don't think about moving the data. They think about what they're going to do with it, with those decisions. They don't think about where this gigantic swell of data volume is going to go, even more than we've seen today. So I think you've raised a really good point on this in terms of the regulators, not just for those pieces we're talking about with AI itself.

Enabling communications infrastructure has got to be a priority for virtually every nation state right now. [00:42:00]

Bill Pfeifer: Mm hmm. That actually touches on an interesting point that I was exploring not too long ago in my role at Dell, which was the sustainability aspects of edge computing, and moving the data takes a surprising amount of power.

And so it ends up improving the sustainability story, just having edge computing. I was thinking, you know, distributing all that computing has to be a sustainability nightmare. What in the world? How are we doing this? And it's not, necessarily. It's kind of a fascinating balance that you can strike. And it just adds so much depth to the conversation.

There is so much more that we could talk about. This was fantastic, Greg. Thank you so much for the time, the perspective, and just sharing your points of view. This was a really fun conversation. How can folks keep up with the latest that you're up to and learn more about you?

Greg Cudahy: Well, you know, you can certainly follow me on LinkedIn.

I post a lot of the things that we're doing and seeing with clients. But I also think, you know, check [00:43:00] out EY.com, where you can see some of our latest things, especially around the AI topic. You'll find a lot of our thought leadership there. And honestly, I'm always happy to, you know, chat with anybody if they've got something of interest on these topics.

So don't hesitate to reach out to me directly as well.

Bill Pfeifer: Love it. Well, thank you again so much for the time, Greg. This was wonderful.

Ad read: That does it for this episode of Over the Edge. If you're enjoying the show, please leave a rating and a review and tell a friend. Over the Edge is made possible through the generous sponsorship of our partners at Dell Technologies.

Simplify your edge so you can generate more value. Learn more by visiting dell.com/edge.