Over The Edge

Global Implementation of Edge AI Technologies with Paul Savill, Global Practice Leader of Networking and Edge Compute at Kyndryl

Episode Summary

In this episode, Bill sits down with Paul Savill, Global Practice Leader of Networking and Edge Compute at Kyndryl, a company that builds and modernizes the world's critical technology systems. Paul provides insights on the cultural challenges between IT and OT teams, the implementation of edge computing across industries, and innovative use cases including robotics in retail, AI deployment, and private 5G networks.

Episode Notes

In this episode, Bill sits down with Paul Savill, Global Practice Leader of Networking and Edge Compute at Kyndryl, a company that builds and modernizes the world's critical technology systems. Paul provides insights on the cultural challenges between IT and OT teams, the implementation of edge computing across industries, and innovative use cases including robotics in retail, AI deployment, and private 5G networks.
--------

Key Quotes:

“To really leverage AI, you have to start with a data foundation. Are you collecting enough information? Is the information that you're collecting correct? Have you got good data integrity? And do you have enough of it over time?” 

“The value of private 5G networks lies in empowering enterprises to innovate within their own environments.”

“The biggest cultural challenge is bridging the gap between IT and OT—it’s a dynamic of trust and risk.”

--------

Timestamps: 

(01:11) How Paul got started in tech 

(05:43) Challenges in IT and OT integration

(15:08) Industry use cases and innovations

(16:45) Bitcoin mining and edge computing

(21:10) AI models and custom solutions

(26:33) Global trends and private 5G

--------

Sponsor:

Edge solutions are unlocking data-driven insights for leading organizations. With Dell Technologies, you can capitalize on your edge by leveraging the broadest portfolio of purpose-built edge hardware, software and services. Leverage AI where you need it; simplify your edge; and protect your edge to generate competitive advantage within your industry. Capitalize on your edge today with Dell Technologies.

--------

Credits:

Over the Edge is hosted by Bill Pfeifer, and was created by Matt Trifiro and Ian Faison. Executive producers are Matt Trifiro, Ian Faison, Jon Libbey and Kyle Rusca. The show producer is Erin Stenhouse. The audio engineer is Brian Thomas. Additional production support from Elisabeth Plutko.

--------

Links:

Follow Paul on LinkedIn

Follow Bill on LinkedIn

Episode Transcription

Producer: [00:00:00] Hello and welcome to Over the Edge. This episode features an interview between Bill Pfeifer and Paul Savill, global practice leader of networking and edge compute at Kyndryl, a company that builds and modernizes the world's critical technology systems. Paul provides insights on the cultural challenges between IT and OT teams.

He also talks about the implementation of edge computing across industries and dives into innovative use cases, including robotics in retail, AI deployment, and private 5G networks. But before we get into it, here's a brief word from our sponsor.

Ad: Edge solutions are unlocking data driven insights for leading organizations.

With Dell Technologies, you can capitalize on your edge by leveraging the broadest portfolio of purpose built edge hardware, software, and services. Leverage AI where you need it. Simplify your edge. And protect your edge to generate competitive advantage within your industry. Capitalize on your [00:01:00] edge today with Dell Technologies.

Producer: And now, please enjoy this interview between Bill Pfeifer and Paul Savill, Global Practice Leader of Networking and Edge Compute at Kyndryl.

Bill: Paul, thanks so much for joining us today. This should be a really fun conversation. We've got a lot to talk about today. So starting out kind of way back in time, how did you get started in technology?

What brought you here? 

Paul: Yeah, well, you know, I grew up in rural Oklahoma and I had a neighbor who was into ham radio. And he started kind of getting me interested in that because, you know, I lived way out in the boonies. And so, you know, the idea of being able to talk to people all over the world through radio sounded like a lot of fun to me, so I really got into that and did do that as I was growing up, all through high school, and got my general class license.

That kind of led me into electrical engineering and communications in college. So I got an electrical engineering degree, and my [00:02:00] emphasis was around, you know, different types of networking and communications technologies. Then that led to me working for a telecom company and moving toward technology more and more. I started in engineering there and ran a bunch of different groups around network planning, network migrations, and architecture. And so that's how I started down this whole path.

Bill: Kind of makes sense. Kind of makes sense. And looking career wise, you started out in sort of the core telco products, doing like big communication type stuff.

And then you moved into data, and now what's happening with data at the edge, which is an interesting series of leaps. It's kind of a big jump. What drove that? What took you to the edge?

Paul: Yeah. So it is kind of interesting, I think, because, you know, at the telecom companies, I did most every kind of technical job that there is to do there.

You know, I worked in the [00:03:00] field, I worked in engineering, I did a lot of architectural work, I was in operations. But in the later part of my career, I started shifting over to product management and to more business types of roles. And as part of that transition, there was a group that got moved under me, which was our technology consulting group. We did professional services work, we did designs on different types of technologies for customers. And that was really what first got me started moving toward the edge, because right around that time there were new technologies like Kubernetes and containerization and orchestration technologies around containers that started emerging and maturing.

And customers were really interested in how they could leverage that, so in that group we started working with those customers on engagements around design and deployment that would combine [00:04:00] those containerization and orchestration techniques with network solutions to deliver the edge computing services that these companies would need. And so that's kind of how I started making the transition into technology and IT.

Bill: It makes sense. And really, it sounds like customers were having a hard time moving. And so you were helping them move from the start, which is, I mean, kind of amazing, right? In general, I think most people figure out that process and then eventually get around toward helping other people do it.

But you were starting with the customer in mind. 

Paul: Yeah, that's right. It was really a great experience for me, just being hands on with the technology. Some of the young people in IT now kind of take for granted how they can use distributed compute and how cloud compute works.

But back in those days, that was breaking new ground, [00:05:00] really figuring these things out, taking stuff that was still very early in its development and release, tying it all together, and figuring out how to make it work. It was a great learning experience. And it also had a lot of tie-in to networking, because the idea of managing a lot of very deeply distributed compute and networking it all together was a really big networking challenge.

And that's why it all kind of came together for me with that technology group and my experience in in networking and telecom. 

Bill: Nice, love it. So, along with that, right, that's the technology side of it, but edge computing is really about taking IT technology and moving it out to OT type spaces, right, putting it out where the users of those downstream things are, the logistics and the retail and the factory floors and things like that.

So, let's say there's a huge culture difference between [00:06:00] IT and OT, and being in the practice of building out edge computing, you're the one who's actually moving that technology across those cultural lines. 

Paul: Yes. 

Bill: You have to work with both sides. How do you do that? What sorts of challenges have you come across?

Paul: Yeah, that really is a huge challenge in the industries right now. And, in fact, as you were saying that, it reminds me of a conversation that I had with the CIO of a large steel company who was literally asking us that same question. He was basically describing how he was having trouble leveraging all of the information that's getting generated now from OT in many of their manufacturing facilities around the world, and getting access to it as the kind of head of IT to make performance improvements, to learn from it, and to make operational suggestions for improvements. [00:07:00] And, you know, the reason why it's a huge cultural difference, I think, is because in a lot of companies there's this divide between the operations teams and the kind of more corporate IT organizations.

When you think about a large factory, for example, there is always an operational plant manager who is over that plant and really owns the performance of it in the end. They own the P&L of the plant, and that is the person where the buck stops if there's anything that fails, that goes down, that breaks, that blows up, that doesn't make money, that harms the company.

And so when you have somebody from corporate kind of coming in, you know, landing from the sky into your plant that you know inside and out, and suggesting that you open things up and bring in new technologies that are going to allow the outside world, or corporate, to have deeper visibility into what's going [00:08:00] on, culturally it's kind of threatening to allow that to happen. It's also scary for a plant manager, because you feel like you've got your plant under control and it's running and you're hitting your marks, and this is a risk, you know, to let somebody else come in and start dropping in technology that you're not familiar with or not comfortable with.

That kind of dynamic is really what, what we see happening right now out across the world. 

Bill: So one of the things that I've come across in the past in my IT background, right, was the idea of uptime, where you're committed to three nines, four nines, five nines of uptime. In the past, I've heard things like, if you're committed to three nines of uptime and you deliver five nines of uptime, you're not pushing fast enough, you're not taking enough risks, you're being too conservative, things like that, which is a very non-OT sort of mindset.

Paul: Yeah.

Bill: And the idea of, you know, I'm only going [00:09:00] to give you three nines of uptime for your factory systems? Whoa, what? That's not my only source of downtime, and that's not an acceptable level of downtime. So I can imagine that's indicative, that's kind of the linchpin of the sort of conflict that you would have.

And is IT adapting more toward OT expectations? Is OT becoming more flexible? How do we get to that middle ground?

Paul: You know, I do think we're getting there. I mean, I have seen it personally in a number of companies, and the reason is just that the value that can be gained is way too much for the company for the two sides not to start figuring it out and figuring out what's acceptable.

And I've got total sympathy for an operational plant manager saying, there's no way I'm going to move from five nines of uptime to three nines of uptime. And in most cases, they're right about that. I mean, that's a huge amount of money for them, and so the IT people have to understand that [00:10:00] and adapt themselves.

But the operational people also have to adapt. We've got use cases and deployments that we've done where the operational savings and the operational safety measures just improved dramatically whenever we start to deploy these types of technologies like edge compute.

Bill: And digging into those operational safety mechanisms and more the people side of things, right?

We've got worker safety projects, like connected worker and things like that, where you're helping maintain the safety of workers, ensuring that they're wearing their protective equipment properly and following proper safety procedures. But that's really getting into people having to adapt to the technology, individual people who did not before. How well do workers accept that new tech? Is it really intuitive, where you can just install [00:11:00] it and everything's good and it does nothing but help them? Do they have to do additional things? Do they resist it? What are you seeing from the actual individuals that have to adapt to this?

Paul: Yeah, you know, I think the individuals generally adapt to it pretty well. I mean, come on, we live in a world where we're all on computers now, on laptops, and we've got our smartphones with us. And so the devices that they're being asked to carry as we start deploying this stuff into large manufacturing facilities are really familiar technology to them and work very much like, you know, smartphone technology.

I think the area where the resistance is, is just that they have to learn new ways of doing things. So take a large plant, like a petrochemical plant. In the past, because they didn't have a network with edge compute deployed in the facility, or a wireless network in that facility that reached all their [00:12:00] locations, you know, a person would have to go out and carry documentation with them to go fix something that may be broken up on a tower that they've got to climb up on.

And if they didn't have the thing that they needed, or they had a question, well, then they had to come back down, go into the ops room, and, you know, pull it up on the computers there or file different paperwork. You'd be surprised how paper intensive a lot of these operations can be.

Still are. But now, whenever we deploy these technologies, people can just do that over a small smart device, and it does require them to learn new things and new processes. But really, once they start picking it up, it's a big improvement for them. The interesting thing that I've learned in working with these folks also is that for some of the biggest safety gains to be had, people don't need to do anything. They don't need to change any of their behaviors, because the biggest safety improvements [00:13:00] really come from a very simple thing: just having people moving around the facility less.

There's a direct correlation between safety incidents and how much time people spend walking around an environment that has hazards. And so you can cut that down by half just by eliminating their travel time, because they're not going back to the ops room to look something up; they've got information at their fingertips and they can solve problems right there on the spot without having to go find somebody else to help.

That's not really even requiring a change in behavior. It just comes from the fact that they're not moving around so much.

Bill: That's a neat point. Even just making it more efficient, faster. Yeah. And it's an interesting point about everybody carrying cell phones, right? So having connected workers adapt to having technology with them.

Google and Apple have already trained us to do that. We're used [00:14:00] to having a little device that does most of our remembering and communicating for us. So yeah, I guess they would be more frustrated if they didn't have that than if they do.

Paul: Yeah. People don't realize that in heavy industry environments, cell phones don't work in a large portion of the facilities because of all that metal that's around, and because usually these are places that are in remote locations and don't have good connectivity. And Wi-Fi, you know, when you're talking about a plant that is several square miles in footprint, Wi-Fi certainly isn't going to be deployed across something like that.

Bill: That's just a big mental adaptation in this day and age. Like, what do you mean my cell phone won't work and there's no Wi-Fi? Like, how do you communicate?

Paul: People like you and me that sit around with great coverage kind of take it for granted, but going into working environments where we don't have it, that really is the situation out in the field.

I mean, think about mining, for instance, those places are always in very remote locations with no coverage. And that's part of what [00:15:00] creates the problem with really leveraging technology to its fullest potential. 

Bill: So that was mostly kind of a manufacturing conversation so far, but you work across a lot of industries.

In which industries, in addition to manufacturing, I presume, are you seeing the most activity around the edge? Which ones have the most interesting use cases that you've come across? What's fun? 

Paul: Yeah. Yeah. Well, well, first of all, I'd say they're all fun. I really love this space. It's a, it's one of the reasons, you know, why I do the job that I do is because I've run across so many interesting and fun use cases.

But I'd say that the two areas that are really on the forefront, the places where the biggest gains are to be had, are in kind of heavy industries. Manufacturing falls into that, petrochemical falls into that, mining, harbors, mass transit, for example. And I think also retail; there are a lot of great retail use cases [00:16:00] around edge compute.

Bill: So we chatted a little bit when we were prepping for this about work that you've done around Bitcoin miners. That's an interesting one that most people don't really understand, haven't had experience with. I haven't had any conversations about Bitcoin mining, frankly. And I mean, that's, it's, it's huge, right?

I've seen global energy usage numbers for the telecom industry and for the data center industry, and they always say, asterisk, this doesn't include Bitcoin mining. So it's big enough that it shows up relative to global telecom and global data center. Ooh, okay. Can you tell us a little bit about what you've seen in that space?

Paul: Yeah, I will. But I'll tell you what, I'll give you another couple of interesting examples and then go back to the Bitcoin one, because the whole reason we ran into the Bitcoin one was that we were doing some of these other things, and it'll make more sense if I start with them. So, just some fun examples. For instance, [00:17:00] I worked with a large retail customer who was experimenting with deploying robotics in their stores.

And they were trying to run the robotics out of the cloud, I mean, I'm sorry, the control systems out of the closest hyperscaler cloud location, and then using the network to tie it together. But basically they couldn't get it to work, because the robots in the store needed something like a five millisecond latency turnaround, that was the max, or maybe it was 5 or 10 milliseconds, from input to response to command hitting. So, of course, you don't want robotics moving around a store without really solid control systems on them, right?

And so we were working with the company on putting compute much closer to those stores. The problem is that putting that much compute in every single store was [00:18:00] really cost prohibitive. Putting an instance of edge compute locally in the market, one that could serve several stores with that kind of latency performance, was really the trick that helped solve the problem for the company.
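
For readers who want to see that tradeoff in concrete terms, here is a minimal, hypothetical sketch of the latency-budget math behind a placement decision like that. The site names, network numbers, and processing time are illustrative assumptions; only the rough 5-10 millisecond budget comes from Paul's description.

```python
# Hypothetical latency-budget check for robot control placement.
ROUND_TRIP_BUDGET_MS = 10.0  # robots needed roughly a 5-10 ms response window

# Assumed one-way network latency from a store to each candidate compute site.
candidate_sites_ms = {
    "server in every store": 0.5,
    "metro edge site serving several stores": 2.0,
    "nearest hyperscaler cloud region": 18.0,
}

PROCESSING_MS = 3.0  # assumed time for the control logic itself

for site, one_way_ms in candidate_sites_ms.items():
    total_ms = 2 * one_way_ms + PROCESSING_MS  # request out, response back, plus compute
    verdict = "fits budget" if total_ms <= ROUND_TRIP_BUDGET_MS else "too slow"
    print(f"{site}: ~{total_ms:.1f} ms round trip -> {verdict}")
```

Under these made-up numbers, only the in-store and shared metro-edge options fit the budget, which mirrors the "one edge site serving several stores" design Paul describes.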

So that's one example I thought was a really fun one to work on. Another one, early on, was with a company in the restaurant business that owns a chain of small restaurants around the country. They're also very technology forward, and even a few years ago they were already experimenting with AI and machine learning and doing really cutting edge stuff, like putting cameras in the drive-thru and doing video analytics on customers to try to predict how much more food they needed to start working on even before somebody got up to the window to place their order, because this company is really about optimizing operations, about [00:19:00] efficiency and speed, and about supporting customers with very high performance, very quickly. They were basically turning the whole back side of their restaurants into little mini computer rooms, and they reached out to me and asked if we had methodologies, or knew of methodologies, to hang server racks from the ceiling tiles in there, because that's how bad it was getting.

And so that's another fun example of how we used edge compute in a more centralized location to start offloading some of that and still provide the latency and responsiveness needed to support that restaurant location. The funny thing about Bitcoin operators, and how we wound up working with some of them, is that when we had that compute deployed at various locations around an area, just like everything else, there are times of the day or times of the week when there are high loads on it, and times when there are low loads on the compute [00:20:00] that you've got in these locations. And so we started running into Bitcoin miners who were interested in whether they could cut a deal with us on using that compute during the low-volume times,

basically maximizing the use of it and getting access to compute at a discounted rate, you know, because they were doing it during the low times. So, I mean, there are so many creative models out there for how this technology can be used. You think about what's going on with containerization and the orchestration of that, and the ability to turn things on and off and allocate compute resources on the fly at any location. All of that stuff is kind of coming together to enable these types of use cases.
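
A simple, hypothetical sketch of the scheduling idea behind that arrangement: find the hours when an edge site's own workload leaves capacity idle, and treat those windows as sellable. The utilization numbers and the 30% threshold below are invented for illustration, not data from any Kyndryl deployment.

```python
# Hypothetical idle-window finder for an edge site's spare capacity.
hourly_utilization = [
    0.15, 0.12, 0.10, 0.11, 0.14, 0.25, 0.45, 0.70,
    0.85, 0.90, 0.88, 0.86, 0.84, 0.87, 0.89, 0.91,
    0.80, 0.65, 0.50, 0.40, 0.35, 0.28, 0.20, 0.17,
]  # fraction of capacity in use for each hour of a typical day

IDLE_THRESHOLD = 0.30  # below this, spare capacity could be offered at a discount

idle_hours = [hour for hour, util in enumerate(hourly_utilization)
              if util < IDLE_THRESHOLD]
print("Hours with spare capacity to offer:", idle_hours)
```

In practice, the containerized workloads Paul mentions would be spun up and torn down against windows like these; that orchestration layer is beyond this sketch.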

Bill: Boy, that's pretty cool. Yeah, I love that idea of working across any of your customers that are interested and have idle compute time, wherever it sits. That's very [00:21:00] clever. So, moving toward the AI models. It's interesting to see how many different pre-trained models are becoming available out there, right?

Intel has them available. NVIDIA has them available. Amazon has them available. Seems like everybody has their own set of pre-trained models. And yet we still talk about training models. Are you seeing your customers by and large building their own models from scratch, or just refining existing pre-trained models, or just using off-the-shelf models? Like, are you seeing a need for custom AI, or can we start to use what's already been built?

Paul: I think it depends on the customer. Certainly it's custom AI for our customers who are large multinational global companies that [00:22:00] have operations spread around the world and are very mature in their IT.

They're very mature in terms of their plant operations and what they're trying to run, and generally the work that we have with them is really customized; there's not a lot that they're going to take off the shelf. Now, there are a lot of those types of models that can be used for things like supporting employees, and they're pretty common among many different companies. And, you know, we use some of those tools in our own company, like the ones that Microsoft makes. In the engagements where customers are asking us to come in and help them figure out how to deploy AI in their company, they don't need so much help from us for those more common AIs that are being built by Microsoft and NVIDIA. Where they really need our help is whenever they're doing the custom work.

Bill: Okay. So customers are still by and large using some of those at least, but the [00:23:00] places where you get involved, it's not the easy stuff. That makes sense. Okay. Yep. And then, from a different tack, how much are you seeing single-data-source AI and how much are you seeing something more comprehensive? Like, how much is it an AI model looking at parts on a line or looking at workers for safety, and how much is it looking at, we have all of these lines running and we have workers doing all of these things, and so draw a conclusion about the larger, kind of global, environment that they're working in?

Paul: Yeah. Yeah. That's a really good question, because a lot of people don't understand that when we're deploying these AIs, they are very, very data hungry if they're really going to work effectively.

They have to be able to consume tons of data that is all around the issue they're trying to optimize, even peripheral things around that issue. [00:24:00] And that creates a problem, and that problem is that corporations have to have a really solid data foundation in order to leverage this technology.

A lot of times they have the impression that, oh, okay, well, you know, I collect this and that, and I'll just start repeating it, and we want an AI to start looking at it and trying to draw correlations. You know, if you've just got 50 data points that you're looking at, and you have them measured every second over the last three years, that's not even enough data for an AI to be effective. What we really do with customers is help them understand that to really leverage AI, you have to start with a data foundation. Are you collecting enough information? Is the information that you're collecting correct? Have you got good data integrity?

And do you have enough of it over time? You know, can you reliably collect it on an ongoing basis so the AI can continue to learn and keep optimizing? That's a big data structuring, data foundation issue that enterprises need [00:25:00] to address first.
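
As a concrete illustration of those four questions (enough information, correctness, integrity, and history), here is a minimal, hypothetical data-foundation audit in Python. The thresholds, the column conventions, and the pandas-based approach are assumptions made for illustration, not a description of how Kyndryl actually runs these assessments.

```python
# Hypothetical data-foundation audit: enough history, enough signals, clean enough data?
import pandas as pd

def audit_data_foundation(df: pd.DataFrame, timestamp_col: str,
                          min_years: float = 2.0, min_signals: int = 50) -> dict:
    ts = pd.to_datetime(df[timestamp_col])
    report = {
        "history_years": (ts.max() - ts.min()).days / 365.25,   # enough of it over time?
        "signal_count": df.shape[1] - 1,                         # how much are we collecting?
        "missing_ratio": float(df.isna().mean().mean()),         # basic integrity check
        "duplicate_rows": int(df.duplicated().sum()),            # another integrity check
    }
    report["enough_history"] = report["history_years"] >= min_years
    report["enough_signals"] = report["signal_count"] >= min_signals
    return report

# Example usage with a hypothetical sensor export:
# readings = pd.read_csv("plant_sensors.csv")
# print(audit_data_foundation(readings, timestamp_col="timestamp"))
```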

Bill: Along those same lines, like shared data, shared models, shared visions, you're the global practice lead.

So global really starts to have meaning when you're looking at projects globally. Yes. So part of your job is to see emerging technologies and changing market trends in one area and extrapolate how you can use that to help clients in a different industry, in a different geo, in a different space, of a different size, whatever.

Yes. What are you seeing that's kind of percolating through time as we see those advances, as we see those changes, from that global view, looking across how you apply all of these things from one place to another, right? You're probably seeing an overall trend that many of us don't, because we don't have that high-level view.

What does that look like to you? 

Paul: That's a really good question too, because that's one of the funnest parts of my job, you know, being able to look across the world and see [00:26:00] what things are emerging and percolating. And certainly AI is one of those things that is emerging, and there are interesting things happening with it in different places of the world.

But a lot of people like to talk about that, so I'll actually start with an example of something that's a little more down to earth and then move into how that ties into what we see going on now with AI. One of those technologies that we saw start from early infancy, and then start to spread, and that we helped spread around the world, is private 5G deployments and private wireless deployments. And to be clear, you know, I'm not talking about private Wi-Fi; that's something that's been around for decades. But the private wireless space has just recently opened up over the last few years, as some governments have recognized that [00:27:00] spectrum is a valuable thing for their country

to be able to put in other entities' hands than just the mobile network operators'. Because spectrum is being used to support different types of technologies so extensively now, if the control of all spectrum is completely in the hands of the mobile network operators, and they are the ones that have to do the innovation, then there are, you know, only a handful of wireless operators, and that kind of chokes the innovation down into a few companies.

By putting it in the hands of enterprises, giving them some of that spectrum or allowing them to completely control it, to deploy these really reliable LTE and 5G networks in their facilities, where they have control over it and they can manage it and innovate on it with new operational technology,

that really supports a more [00:28:00] rapid advancement of some of these next-gen, Industry 4.0 revolution types of things that companies need to do to optimize their plants and really build modern plants that are much more automated, run much more efficiently, and consume much less power.

That's an area that the United States really led early on in opening up CBRS spectrum and making it easier for enterprises to work with wireless carriers and deploy these types of networks. That philosophy the U.S. took in opening that up, we have seen start to spread to other governments, and many other countries have started to follow suit. And as they've done that, we've kind of taken that message and those technology solutions on the road.

And where there are countries that have more open and liberal spectrum policies, we have really created a part of our business that helps [00:29:00] enterprises in those countries adopt those technologies and leverage them in their operations. And so I think that's a really neat example of how Kyndryl has played a role in taking a technology trend that started in one part of the world and helping plant it and expand it in other parts of the world.

But, you know, AI is really the thing that has just got everybody's attention now, of course, and the benefits that can be driven from it are pretty dramatic. And the thing that I like about my job, and about what we do at Kyndryl, is that I run the global network and edge practice for the company, but I have a peer who does the same thing in the security area, another who does the same thing in artificial intelligence and data analytics, and another who has the same role in our cloud services, for instance. And so, [00:30:00] when it comes to deploying these types of solutions in customer environments effectively, the customer isn't deploying AI just for the sake of deploying AI.

They're looking for a business outcome that they're trying to achieve. And that generally requires all of us to work together across these different technology disciplines to put together a complete solution that solves the real business outcome the customer is looking for. And that's really the fun thing that we're doing.

And a big part of what makes my job fun is being able to see how it all ties together, to see how the different parts that we have specialists in come into a customer environment, look at what they're trying to achieve, and put a combined technology solution together that gets them across the line. And that's what we're seeing happening right now around the world: those types of engagements expanding.

Bill: Mm hmm. Yeah. You were listing your peers and I was thinking, [00:31:00] boy, edge and network, but then also security that kind of needs to be baked in. That's right. Data and AI kind of need to be baked in, and cloud is probably going to be baked in, and, like, wow, you guys must be joined at the hip or have a lot of review cycles or something, because

Paul: Yeah.

And we've been tracking this trend too. As a company, every month I look at our reports that summarize all of the deal opportunities that we're working on, and we now track which of those deal opportunities have which practices involved in the deal. We track progress in terms of what those stats looked like a couple of years back, three years back, versus now, and we continue on a steady trend of customers less and less just buying point solutions from us, versus coming to us for more complete solutions that span many practices.

Three years ago, it was much more frequent that a customer might come and say, hey, I need help deploying a network, we're designing a network that does this. But over time it's gotten more and more to where we are providing these integrated technology solutions that go across the stack and really drive much better value for them.

And we've had this conversation with third-party industry analysts, too. Actually, we had one speak at our leadership conference a few months back who brought in stats talking about how more and more enterprises are buying in this manner as opposed to buying bits and pieces from different companies.

Bill: That makes a lot of sense. And it kind of connects back to the AI model conversation. They can get off-the-shelf models, but they don't need you for that. They can just install some servers or install a network, but they don't need you for that. And the stuff that you get called for is going to be the much more integrated stuff: edge with security, with AI, with a little bit of cloud thrown in.

And then, how do we make all of that stuff work together [00:33:00] into one set of results that actually means something? That's where the magic happens. Kind of cool that you get to work right in the middle of that and see all of it. That's amazing.

Paul: Yes, that is. That's what I live for every day.

Bill: Yeah, no doubt.

So what gaps do you see right now between the art of the possible, what your customers wish they could do, and what people actually can achieve today and are achieving today? 

Paul: Yeah, I think I would tie that back to the issue that I brought up earlier around the data foundation. That's just so important to meeting customers' expectations. Before you go in and deploy one of these AI solutions and hope to achieve some business goal that you're shooting for, it is well worth the time to really be thoughtful about how you pull that data together.

You know, it reminds me of a quote from Einstein that I love: if you told me I had an hour to solve a problem, I would spend 59 minutes understanding the [00:34:00] problem and one minute solving the problem. And that's really the approach that we've got to take, and that we've seen work, in pursuing these types of things with customers.

Bill: So back to the human side of the edge. We talked about some of the cool things that you can do and some of the places that you're doing it. Good stuff. I'm hearing more and more about the silver tsunami, the group of, you know, the cadre of baby boomers in particular that are just starting to retire. And everyone's like, oh no, what, what happens when they all retire?

Are you seeing more and more of these projects as we can do things more efficiently and make more money? Are you seeing things like, we can't find enough people that are trained, or we're trying to capture the expertise while it's still there? Are you seeing any evidence of this yet, or is it still just in the newspapers?

Paul: I have seen situations where our customers have struggled with [00:35:00] losing their good technology talent and are just not able to pursue projects because of that. That's actually one of the reasons why they come to Kyndryl and hire us to help: that competition for skilled talent, and some of the older talent that's moving out and retiring.

In terms of how AI plays into that, I can't remember who it was, but I remember hearing somebody once say at a conference that it's not so much that AI is going to take people's jobs, it's that the people who really learn to use AI are going to become much more valuable and more productive.

And I kind of buy into that. I do see that. I mean, we're deploying AI in our own organization. For instance, we are building out an AIOps platform that we're using to more efficiently run our customers' IT estates and their technology platforms. We call it [00:36:00] Kyndryl Bridge, by the way, and it's a great tool. It's still under development, but it's being used with well over a thousand of our customers today, and it's really being used to gain new insights into how we can improve operations for them and save our customers money.

But, you know, just this morning, actually, I was on a call with one of our ops teams that was supporting one of our customers. They do a regular standup call where they look at everything and how things are running, and they have these tasks of what they need to do, where we're having problems, and what we need to go address proactively and such.

And now that we have Kyndryl Bridge, we're using AI to analyze much more data in a much shorter amount of time. You know, I'm seeing it happening live in our own company. I mean, just in our call this morning, part of the operations call now is to pull up the dashboards in [00:37:00] our AIOps tool and go through and look at what it is recommending, because it looks at all this massive amount of information and actually makes recommendations to our operations people about things that look off that they should go investigate. It'll spot problems and make recommendations like, you know, there's a new patch out that somehow got missed on this particular range of equipment, and you need to do a new software upload to address it.

Yeah, it'll say things like, one of the ones that we saw this morning was that it identified one function that had an unusual number of trouble tickets originate for it over the course of the last week. And that stood out because that had never happened before. So even though the device was working fine right now, it flagged that to say, hey, that's an [00:38:00] anomaly, and it might be a good idea for somebody to go check that out and understand what's going on.
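
A toy version of the kind of flag Paul is describing might compare a device's ticket count this week against its own history and raise an alert when the jump is statistically unusual. The counts and the three-sigma rule below are illustrative assumptions, not how Kyndryl Bridge actually works.

```python
# Hypothetical anomaly flag on weekly trouble-ticket counts for one device.
from statistics import mean, stdev

weekly_ticket_counts = [2, 1, 3, 2, 2, 1, 2, 14]  # last value = this week

history, current = weekly_ticket_counts[:-1], weekly_ticket_counts[-1]
mu, sigma = mean(history), stdev(history)

if sigma > 0 and (current - mu) / sigma > 3:
    print(f"Anomaly: {current} tickets this week vs. a typical {mu:.1f};"
          " worth a proactive look even though the device seems fine.")
```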

So what's happening there in that example is not that we're out just replacing people; people are just becoming much more productive and effective in their jobs as a result of having information that they've never had available to them before. And as people get more and more used to what AI is and how it works, what it will do and what it won't do, and how to engage and interact with it, I think it's going to make all of our lives and jobs much, much more enjoyable, really.

Bill: Yeah. That agent sounds like the epitome of it, right? Rather than having someone going system by system through your global estate, looking for patch levels and watching for trends in tickets and things like that, just have the system do [00:39:00] that and have the smart, capable people doing smart, capable things rather than drudgery. I would like to not have to do that stuff anymore, please. That would be lovely. Cool. Paul, thank you so much for your time today. This was a really fun conversation. We covered a whole lot of ground. How can people find you online and keep up with your latest work?

Paul: Yeah, well, I'm on LinkedIn and I am pretty active there. A lot of times you'll see me posting things about new customers that we've been doing fun things with, or new technologies that we've been experimenting with. So that's really the best place to look for me and connect with me.

Bill: Fantastic.

All right. Thank you so much. I had a good time. I hope you had a good time, and I hope our listeners enjoyed listening as well.

Paul: Same here, Bill. Thank you so much.

Ad: Capitalize on your edge to generate more value from your data. Learn more at [00:40:00] dell.com/edge.