Over The Edge

Building Solutions for the Edge with Dan Cummins, Dell Technologies Fellow / VP Edge Computing

Episode Summary

This episode of Over the Edge features an interview between Bill Pfeifer and Dan Cummins, Dell Technologies Fellow / VP of Edge Computing. The two sit down to discuss the challenges and constraints at the edge, and the associated obstacles to building edge solutions. Dan dives into NativeEdge, a new edge operations software platform that centralizes deployment and management of edge infrastructure, sharing the thought process behind its creation.

Episode Notes

This episode of Over the Edge features an interview between Bill Pfeifer and Dan Cummins, Dell Technologies Fellow / VP of Edge Computing. The two sit down to discuss the challenges and constraints at the edge, and the associated obstacles to building edge solutions. Dan dives into NativeEdge, a new edge operations software platform that centralizes deployment and management of edge infrastructure, sharing the thought process behind its creation.

---------

Key Quotes:

“Solving for the edge is really solving for distributed computing. And then you need to understand the constraints on each of those locations based on the vertical that you're in.”

“The thing that most excites me is the ability to federate your processing at these edge locations, right? I mean, the only way the world is going to scale is by federating this processing where the data is being generated.”

--------

Show Timestamps:

(01:10) How Dan got started in technology 

(03:19) Dan’s journey to Dell 

(05:54) What prompted Dan’s move into edge technologies 

(08:24) The evolution of the edge and the need for a horizontal platform 

(17:53) Cloud-out versus edge-in 

(21:18) The process of building NativeEdge 

(25:58) The connection between edge and multicloud 

(27:52) Challenges with orchestrating across the edge 

(33:29) Challenges the edge still needs to evolve to address

(36:17) Silicon diversity and hardware at the edge/core

(39:10) Possibilities that Dan is excited about at the edge 

(42:05) Edge’s impact on business and society 

--------

Sponsor:

Over the Edge is brought to you by Dell Technologies to unlock the potential of your infrastructure with edge solutions. From hardware and software to data and operations, across your entire multi-cloud environment, we're here to help you simplify your edge so you can generate more value. Learn more by visiting dell.com/edge or click on the link in the show notes.

--------

Links:

Follow Bill Pfeifer on LinkedIn

Connect with Dan Cummins on LinkedIn

You can learn more about NativeEdge through these resources: NativeEdge website, NativeEdge Vlog, and NativeEdge webinar
 

Episode Transcription

Narrator: [00:00:00] Hello and welcome to Over the Edge. This episode features an interview between Bill Pfeifer and Dan Cummins, Dell Technologies Fellow and VP of Edge Computing. Dan has spent time understanding the challenges that customers face at the edge and working to create a viable solution. He dives into the process and thoughts behind building NativeEdge, an edge operations software platform from Dell built to manage edge devices across use cases and verticals.

He and Bill discuss the current challenges, the opportunities, and the future of edge technologies. But before we get into it, here's a brief word from our sponsors. Over the Edge is brought to you by Dell Technologies to unlock the potential of your infrastructure with edge solutions. From hardware and software to data and operations, across your entire multi-cloud environment, we're here to help you succeed.

Narrator 2: Simplify your edge so that you can generate more value. Learn more by visiting dell.com/edge or click [00:01:00] on the link in the show notes. And now please enjoy this interview between Bill Pfeifer and Dan Cummins, Dell Technologies Fellow and VP of Edge Computing.

Bill Pfeifer: So Dan, thanks for joining us and welcome to the podcast.

Dan Cummins: Thanks, Bill. Thanks for having me. This should be a fun conversation.

Bill Pfeifer: Always best to start with a little bit of history. So can you tell us a little bit about how you got started in technology?

Dan Cummins: So how did my career get started? Technology internships. Growing up, my mother worked for Pfizer Pharmaceuticals.

Dan Cummins: My father was in the service, in the Navy in particular. I spent many summers working as an intern for Pfizer Pharmaceuticals, developing Fortran programs on a VAX cluster for a statistician. Then later, I was awarded a position on a contract called the Darwin Program from the University of Rhode Island College of Engineering through the Naval Undersea Warfare Center.

Dan Cummins: So I worked 20 hours a week while I [00:02:00] was in school, working at the Naval Undersea Warfare Center, developing defense electronics, tactical warfare systems, and things like that. And shortly after school, I took a full-time job there and then ended up migrating north to Boston, again working in defense electronics before moving over to Digital Equipment Corporation on the commercial side.

Dan Cummins: And then it kind of just took off from there.

Bill Pfeifer: So I love it. You actually started as an intern at the Naval Undersea Warfare Center. I actually spent a summer interning at the Naval Air Warfare Center outside of Philadelphia, which is kind of fun. So what kind of technologies did you work on at the Undersea Warfare Center?

Bill Pfeifer: And what did you learn from it?

Dan Cummins: Well, I worked mostly in embedded systems, single-board computers. I remember I was developing data acquisition systems and control systems on single-board [00:03:00] computers using an embedded operating system called pSOS. But that was combined with user-level application software on a Unix system running X Windows, where I would develop display software and things like that.

Dan Cummins: So it was mostly embedded systems and Unix software.

Bill Pfeifer: Okay. And then from there you got into overall data center hardware, architecture, design. What brought you into Dell? What was that path?

Dan Cummins: Oh, the path to Dell. So through the early part of my career, I kind of stayed with defense electronics until finally I got out of defense electronics.

Dan Cummins: I went to go work for a company called Digital Equipment Corporation, which is another famous company. And there I was writing device drivers for laptops. I worked for the portable division that was later acquired by Compaq. But then after that, I kind of moved to a startup with a bunch of people that I used to work with at Digital Equipment Corporation.

Dan Cummins: And there we were [00:04:00] developing consumer-grade, whole-home video and audio products, right? The big innovation there was a multi-room DVR product. Then later I went to a company called SeaChange, which was on the other side of video delivery, on the data center side, where Comcast was their largest customer, developing clustered media systems that were delivering video on demand over cable networks.

Dan Cummins: So I worked there for probably about seven years. And we got into all sorts of things there, whether it was storage systems or clustering systems. A little-known fact: I worked on the team that gave Mellanox their very first purchase order for InfiniBand. So that's kind of interesting.

Dan Cummins: From there, I knew some of the folks that I worked with at Digital and at SeaChange had gone over to EMC. And they gave me a call and basically said, hey, we have some challenges over here and we could use your leadership; could you [00:05:00] come over and help us? SeaChange at the time was going to stay where it was; its video delivery was really going to stay over broadband cable networks.

Dan Cummins: And, you know, I could clearly see that the future was going to be HTTP over-the-top media streaming, which is dominant today, right? So it was very easy for me at that time to move over to EMC and apply all of my knowledge to help EMC transform their midrange storage business, starting with VNX2 and their multi-core transformation.

Dan Cummins: So I led all of that, but that's what got me over into EMC and datacenter. And then from there, obviously Dell acquired EMC.

Bill Pfeifer: So you were doing streaming video way back in the day before it was streaming video. And that was kind of the distributed view of things. And then you came to Dell and did some datacenter type storage.

Bill Pfeifer: And now you're working in the edge. What prompted that move? I mean, what's the evolution of data centers out to [00:06:00] the edge, and what have you learned along the way?

Dan Cummins: Well, the thing that really prompted it and why I said, yeah, absolutely, is because I saw the opportunity in the growth of the data that's happening at the edge and knew that there were opportunities for Dell to play a larger part there.

Dan Cummins: I knew that there was a massive opportunity here. I also knew that, given my experience and my career history working in consumer, commercial, defense electronics, and other technology areas, I was a really good fit, right? Because I had experience in many of the technology areas that edge touches.

Dan Cummins: Matter of fact, when I was working for the Naval Undersea Warfare Center, we were, you know, developing systems using artificial intelligence and machine learning as early as 1999. So it comes full circle. And now if you ask, okay, what's important today? Well, a primary motivator is computer vision, which is AI inferencing.

Dan Cummins: And the proliferation of all the IoT sensors at the [00:07:00] edge, generating that data so that you can generate value near the point of data generation, right?

Bill Pfeifer: Yeah, it's kind of funny how often in technology we see these infrastructure cycles and process cycles, and it feels like the edge is bigger than just one of those cycles, but kind of ties into several of them.

Bill Pfeifer: It's interesting to see all the things that come together there.

Dan Cummins: Yeah, it's kind of like the same kind of technology patterns. I mean, the technology changes, but the same fundamental computer science concepts kind of remain constant, right? I think the only thing that has really changed or has evolved is everything around AI, right?

Dan Cummins: Which is nice to see. But fundamentally, from a systems perspective, you're solving kind of the same problems over and over again, just with better technology.

Bill Pfeifer: So speaking of solving problems: right now, you're the chief architect of NativeEdge here at Dell, which just got announced a couple of days ago.

Bill Pfeifer: Can you talk us [00:08:00] through the evolution of the edge, how that pointed you to the need for a horizontal platform, and where that fits in that whole solution set?

Dan Cummins: Ah, yeah. So, well, there was a lot of research that basically pointed to the fact that there needed to be, you know, a control point, and that there needs to be consolidation of application runtimes

Dan Cummins: And other services in order to be successful and to reduce the complexity at the edge. So the strategy when I first came in was really to develop verticalized solutions, to have Dell package ISV software with hardware and deliver that into the different edge verticals. And then secondly, to optimize our infrastructure portfolio to better meet the needs of these edge solutions.

Dan Cummins: When I took a look at the initial research, I started delving into a lot of just kind of understanding the different solutions and the challenges that customers faced [00:09:00] in each of these verticals. And there were some common themes. So as I was looking at this, I came up with kind of these eight constraints at the edge.

Dan Cummins: Security, simplified management, connectivity, cost, multicloud, scale. Scale is much different at the edge. And then some internal things, like OEM readiness: if Dell is going to enter the edge, we wanted to be able to OEM the products as well as help customers. And we realized, given all these constraints and challenges, that because you are outside the walls of an IT data center, things were different at the edge.

Dan Cummins: Because of where the infrastructure is deployed, in remote locations, and the type of personas that are out there, we saw the fragmentation: the same solution could be implemented 10 or 20 different ways. The person who buys it, the person who installs it, the person who maintains it, they could all be different people.

Dan Cummins: So we know that it's highly fragmented, it's highly complex. We also know that we've [00:10:00] got a great portfolio of products and services that we can apply to help customers with their outcomes. But we really needed a software platform that tied it all together to simplify all these use cases. And so that's kind of how it came about.

Bill Pfeifer: So I know you had a lot of customer conversations, and at least one of those customers was kind of pointing toward... you could build the software that's the overlay that pulls all of these things together. How many of those conversations were customers, like, on point with understanding these constraints and the need for that horizontal platform?

Bill Pfeifer: And how much were they looking for ways to solve their problems? How much do our customers really understand the direction that we're headed now? And how much is the market still trying to figure out?

Dan Cummins: Well, it's kind of funny. It's kind of one of those implicit things. So it depends on... well, let me back up.

Dan Cummins: Let me say this. In every [00:11:00] single industry vertical that we've seen today, there is a simpler approach to the horizontal platform, but it's proprietary. For example, in manufacturing, there are companies out there that have orchestration platforms; they don't go as far as the edge goes, right? And they're not tackling all the things, but they exist.

Dan Cummins: They exist in one form or the other, and it points to a real need, a real challenge. I remember talking to one of our major oil customers, and I brought them through the entire set of constraints and the story behind NativeEdge, and I said, this is how we can simplify, secure, and lifecycle-manage your infrastructure.

Dan Cummins: This is how you can focus on your outcomes, right? And we can take all that off your plate. And when I went through that, it was a drop-the-mic moment. They're like, that's absolutely what we need. Because there wasn't a full horizontal platform in the market that combines secure lifecycle management of the infrastructure with a full multicloud-integrated, [00:12:00] orchestrated software platform that can deliver those complex solutions across the edge, the core, and the cloud.

Dan Cummins: They were building their own.

Bill Pfeifer: So it's a problem that's sort of being solved out there in the market, but in kind of dribs and drabs and smaller pieces and corner cases, and not really something that works across all of the different solutions that customers need. So they end up with multiple of them, I presume.

Bill Pfeifer: Or across all the verticals.

Dan Cummins: A lot of them are very vertical-specific. Like, you'd find certain orchestration use cases and lifecycle management use cases that are specific to retail or specific to manufacturing or specific to automotive. And there's different technologies that are born out of all of these.

Dan Cummins: Like, a good example would be automotive, right? The connected car today gets secure over-the-air updates. The Department of Homeland Security came out with a recommendation, and the next thing you know, the industry [00:13:00] responds with a secure software protocol for delivering over-the-air updates to vehicles.

Dan Cummins: Wow. That's an example of an edge specific technology that was born out of the need to orchestrate edge compute. Because, you know, if you think about it, solving for the edge is really solving for distributed computing. And then you need to understand the constraints on each of those locations based on the vertical that you're in.

Dan Cummins: Like, your far edge network or your OT network is different. Manufacturing may have an OT network; in a hospital, right, they've got kind of a patient network. There's different constraints that you have to deal with, different protocols that you have to deal with, different processes that you have to deal with.

Dan Cummins: You have to understand all those. So there was no one single platform that kind of solved broadly across all of these with an open ecosystem and applying edge specific technology to solve for these constraints. A great example I like [00:14:00] to use is how we've completely modernized day zero and day one.

Dan Cummins: Meaning that, since we own the supply chain, we have the ability to securely initialize our devices in manufacturing such that we can cryptographically transfer ownership, prove, and attest these devices when they're powered on. One of the challenges you have at the edge is that infrastructure is deployed outside the walls of an IT data center, so it can be tampered with.

Dan Cummins: Intel was on the right track back in 2016 when they came out with something called Secure Device Onboard. That kind of went part of the way, right, to help solve this problem. It was later given to the FIDO Alliance, and the FIDO Alliance extended it and added late binding to it. Well, that's the way that we're deploying infrastructure from behind firewalls in remote locations, combining that with zero-touch provisioning.

Dan Cummins: It enables a seamless, secure experience for delivering and [00:15:00] provisioning infrastructure at scale in parallel outside the walls of an IT data center. Now it just so happens that this technology can also apply to IT. However, the IT techniques they use today for infrastructure provisioning don't necessarily fit at the edge.

Dan Cummins: They don't have the same security. They don't have the same scalability. They don't have the same ease of use. So that's a good example. And so NativeEdge really solves for edge constraints first. It's what we call an edge-in solution versus a cloud-out solution.

Dan Cummins: Because the other thing that we find in the industry is that people who are developing these horizontal-type solutions really take an IT-first or a cloud-first approach. And it's like fitting a square peg through a round hole, right? It just doesn't fit. But NativeEdge is different, and the reason it's different is because it's leveraging edge-first techniques to solve for these constraints.
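To make that onboarding discussion a bit more concrete: voucher-style onboarding schemes in the spirit of FIDO Device Onboard hinge on one check, verifying that the ownership credential a device presents was signed by a trusted party before the service accepts the device. The snippet below is a deliberately simplified sketch of that single check using Python's cryptography library; it is not the FDO protocol or any NativeEdge code, and the function and variable names are invented for illustration.

```python
# Simplified illustration of a voucher signature check (hypothetical names,
# NOT the FIDO Device Onboard protocol or NativeEdge code).
from cryptography.exceptions import InvalidSignature
from cryptography.hazmat.primitives import hashes, serialization
from cryptography.hazmat.primitives.asymmetric import ec

def voucher_is_trusted(manufacturer_pub_pem: bytes,
                       voucher_payload: bytes,
                       voucher_signature: bytes) -> bool:
    """Return True only if the voucher was signed by the manufacturer's key."""
    pub_key = serialization.load_pem_public_key(manufacturer_pub_pem)
    try:
        # ECDSA over SHA-256; a forged voucher or tampered payload fails here.
        pub_key.verify(voucher_signature, voucher_payload,
                       ec.ECDSA(hashes.SHA256()))
        return True
    except InvalidSignature:
        return False
```

In a real zero-touch flow this check would sit inside a larger protocol (attestation, late binding to the owner, transfer of ownership), but the trust decision ultimately rests on signature verification along these lines.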

Bill Pfeifer: Can you talk a little bit more about why that [00:16:00] cloud-out approach doesn't fit for the edge versus an edge-in, to get a little more granular about what those differences are?

Dan Cummins: Yeah. So, well, first off, a cloud-out approach assumes that somebody is managing the infrastructure behind the scenes. At the edge, you have infrastructure that is deployed outside of a cloud data center or outside of a data center, in a remote location or in a location that doesn't have people with IT skill sets.

Dan Cummins: And so if you think about how do you service that, how do you provision it? Those are different, because you're constrained by operational constraints. You're constrained by skill-set constraints. You're constrained by environmental constraints: dust, moisture, vibration, temperature. You've got operational constraints, like in terms of who can operate on it.

Dan Cummins: So that's just an example. So in order to solve for those, you need to really be thinking about [00:17:00] who's going to be touching the hardware. How do you deliver the hardware? How do you upgrade the hardware? How do you orchestrate the software? With people with little to no IT skill sets, in places that may not be accessible.

Dan Cummins: Right, which is very different than a data center, and it's very different than a hyperscaler. So that's from kind of a lifecycle management perspective. From a cloud-out management perspective, a lot of what the hyperscalers are doing is extending their reach down into these edge locations, where they're standing up their clouds, and they're kind of monetizing it through their cloud.

Dan Cummins: That can get very expensive. And some of the challenges that we see with our customers is that they're looking to avoid the cost of centralization. So that's another reason they want more compute and more delivery of compute and services outside of the core data center or outside of the cloud, closer to where the data is generated.

Bill Pfeifer: So kind of ballparking the cloud-out approach, [00:18:00] one of the problems that you're seeing is that the cloud loosely assumes that compute is there. Which is kind of one of the fundamental tenets of cloud, right? If you need more compute within your cloud space, just provision more compute. You don't have to think about the hardware.

Bill Pfeifer: But then on the edge, we're talking about hundreds, thousands, tens of thousands of locations where you have to put the compute, and they don't necessarily have an answer for that built into their solution.

Dan Cummins: Yeah, and it needs to be managed, and even supply chain, like how do I deliver it? You know, Dell has supply depots all around the world and we can get, you know, infrastructure out to you pretty quickly, but you nailed it.

Dan Cummins: So there's a cloud-like software experience. Like, I want, you know, elastic compute or elastic storage and things like that. So they're kind of reducing opex that way. We can stand up those cloud stacks at the edge, but what you can't do, and what's different, is how do you actually service and manage those estates?

Dan Cummins: That's what's different.

Bill Pfeifer: So you've talked through a whole [00:19:00] series of problems at the edge that are not simple to solve, and some of them we talked a little bit about how you solve them, but how do you get that all together into a net new operations platform from scratch? Can you talk us through the process?

Bill Pfeifer: I mean, I'm sure there were some iterations and some failures along the way, probably some funny stories, maybe some inappropriate stories, don't tell those. But can you talk us through what that process of building it looked like? How did you decide what was working and what wasn't working? How did you iterate on that?

Bill Pfeifer: And how did you pull it all together into one thing?

Dan Cummins: Yeah, well, first off, we're still building it. Okay? So this is the first release; we have a full North Star vision that we need to get to implementing, right? Implementing in a large company, trying to beat the innovator's dilemma, is a challenge, however.

Dan Cummins: Myself and a few others on the leadership team have done this multiple times before within a big company, so we have proven leadership. But [00:20:00] to answer the question about developing this from the ground up or from scratch, it's not completely from scratch. The concepts and the architecture were built not by just laying something down on pieces of paper or into Slack.

Dan Cummins: We actually sat down and we thought about and had the conversations with the customers. And we found out what their real problems were and what was common across all these industry verticals. We mapped that to where can Dell add value. And that's how we arrived at, Hey, we need to do this horizontal platform approach.

Dan Cummins: So then it was just a look at the constraints, and I ended up hiring a team of architects that I work with very closely, and we put together the high-level architecture before we even hired the development teams, and convinced ourselves by introducing that to customers in calls, where it got wide-scale acceptance.

Dan Cummins: After that, we started hiring development teams, and we started with a hello-world-type application. But the very first thing that we went [00:21:00] after was secure onboarding. So as we were developing the first version of NativeEdge, we knew that the toughest challenge we would have was that FIDO device onboarding, because it required us to modify manufacturing processes and also engage other parts of Dell to develop other services that Dell could offer.

Dan Cummins: And also change the ordering and the manufacturing process. I mean, that's a big deal. So that took a bulk of our time, and that's no small feat. So anybody that's looking to develop a secure onboarding strategy will have to go through the same thing, but we've done it; we focused on that. Also, for the first release, as we were building teams up, of course, we've got a global, worldwide team.

Dan Cummins: We've got teams in Singapore, we've got teams in India, we've got teams in Israel, we've got teams in the United States, right? So it's a global team. So of course, there's always some challenges getting everybody aligned; even though we have an agile development model, it's still [00:22:00] very challenging across these time zones.

Dan Cummins: Right, to keep everybody aligned, because your most senior people aren't necessarily there day to day to help guide. So that was always a challenge. But one thing we knew was that the two major pieces of the platform were, one, secure lifecycle management and secure operations, and the second was really around service orchestration.

Dan Cummins: The ability to take an outcome, like overall equipment effectiveness or one of your computer vision use cases, and simplify the creation of that solution, and then be able to break that solution down and orchestrate the piece parts across the edge, the core, and the clouds. And it has to be multicloud, right?

Dan Cummins: Not just one cloud. I mentioned before that we were vendor-neutral and cloud-neutral. In order to orchestrate these outcomes and reduce this fragmentation and siloing, we wanted an open system that could orchestrate across these locations. And [00:23:00] it's going to take us a long time to develop all of those plugins and orchestrations across these environments.

Dan Cummins: So we went out and we said, okay, where can we find a company, either to partner with or to acquire, that could help us facilitate or accelerate this vision? And of course, it was well known that we went and purchased a company called Cloudify. We also purchased another, smaller company called Myst, to bring that talent and that expertise into our team.

Dan Cummins: As well as code that we could leverage to accelerate NativeEdge's development. So that was kind of the process, right? It still continues today. One of the advantages of a company like Dell is the massive resources that we do have, our global presence, and the ability to tap into this talent that's all around the world.

Dan Cummins: Pretty exciting.

Bill Pfeifer: Very cool. A glimpse into Dan's brain and how it works. So you mentioned multicloud there, just kind of briefly. What's the connection between edge and [00:24:00] multicloud? Are they tightly connected, loosely connected? How do they connect to one another? What does that look like?

Dan Cummins: Well, many of your modern applications are going to be microservices, cloud based type applications or leveraging services that are in the cloud.

Dan Cummins: I would say today, if you're looking at AI ML, most of the training today is still done in the clouds, right? They've got all of your services available that make it very simple to train models. And then they also have inferencing servers and frameworks that allow you to push those models off to the edge.

Dan Cummins: Now, not everybody is going to be using just one cloud. They're going to be using services or deploying applications to multiple clouds. One interesting stat that we have is that with all the customers that we talk to on average, they have an investment in all three major hyperscalers. AWS, Google, and Azure, and they're using best of breed services for their outcomes.

Dan Cummins: So a lot of customers have [00:25:00] solutions and services that are based on all three, or parts of all three. So that's the connection to the edge, right? For anything that's long-term post-processing or batch processing, there's a lot of services in the cloud today. Now, some customers are also using some of the extended services from the cloud, whether that's Azure or AWS Outposts, for example, but each of the clouds also offers things like managed Kubernetes services or stacks that can be deployed in a local data center.

Dan Cummins: And so you would need to, you have to have a system that can integrate into each of these cloud services or cloud locations. Right, whether it's public cloud or private cloud. So that's just it. I mean, it's kind of black and white, right? We need to be able to orchestrate across all of those environments.

Bill Pfeifer: What are some of the challenges with that? I mean, orchestrating across the edge, which has all these different constraints and hundreds, thousands, [00:26:00] or more locations, and clouds that are owned by different companies and have different services available. How do you think in terms of orchestrating across all of those different places and types of operational methodologies?

Dan Cummins: Well, everything's API-driven these days, and usually there's an API gateway. You know, so if I'm talking to a vSphere cluster, I could talk through vCenter, as long as I have the credentials; or I can have an API token from AWS and use eksctl, for example, to talk through its API. There are pretty well-known ways to set up VPCs between your cloud and your location, but mostly it's going to be orchestrating through their APIs into those environments.

Dan Cummins: You know, take a look at Kubernetes, for example, right? Whether it's a managed Kubernetes or a bare-metal Kubernetes, in a data center or even an edge location, there's one way to orchestrate Kubernetes. That's the nice thing about Kubernetes, kind of democratizing compute. [00:27:00] You can extend it using Kubernetes operators.
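As a rough sketch of that point, the example below applies one Kubernetes Deployment manifest to several clusters through the standard Python client; whether a kubeconfig context points at a managed cloud cluster, a vSphere-hosted cluster, or bare metal at an edge site, the calls are the same. The context names are hypothetical, and this is not NativeEdge's orchestration code.

```python
# One manifest, many clusters: the Kubernetes API is the same everywhere.
from kubernetes import client, config

def deploy_everywhere(manifest: dict, contexts: list[str]) -> None:
    """Apply the same Deployment to every kubeconfig context given."""
    for ctx in contexts:
        # ctx might name an EKS cluster, a data-center cluster, or an edge site.
        api_client = config.new_client_from_config(context=ctx)
        apps = client.AppsV1Api(api_client)
        apps.create_namespaced_deployment(namespace="default", body=manifest)

# Hypothetical usage:
# deploy_everywhere(inference_manifest,
#                   ["eks-us-east-1", "vsphere-core-dc", "edge-store-042"])
```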

Dan Cummins: It's just a lot of work, right? But in general, everything is orchestrated across these clouds. Now, the complexity is how do I simplify an outcome. Let's talk about an outcome, right, an end-to-end application. So if I were talking about a computer vision application, pick one, right?

Dan Cummins: There's going to be an inferencing server that needs to get deployed at the edge. There's going to be, more than likely, a cloud-based service for your training. And there could be some other services in the data center. So how do you describe that in a declarative way and then have that orchestrated?

Dan Cummins: So our vision here was to simplify all of that to a concept called blueprints. So I can create a solution blueprint and I can specify all of the cloud services that I need. I can specify the applications and the runtimes that I need. [00:28:00] I can specify the SLAs for those applications. I can target them to the specific nodes or locations to deploy.

Dan Cummins: But that recipe, right, that deployment recipe is a templated pattern that I can then use and deploy across this distributed computing estate, which includes, well, computing locations at the edge, the data center or colo, and the cloud, and that's where the complexity is, right? But orchestrating into clouds or from clouds to edge, I mean, those are kind of well-known things today. Somebody who can pull it all together using some sort of workflow engine that allows you to recover from failures, right, and manage that complexity to really make it simple, that's the key.
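To picture what a declarative, reusable blueprint along these lines might look like, here is a hypothetical, much-simplified sketch in Python; the structure and field names are invented for illustration and are not NativeEdge's actual blueprint format.

```python
# Hypothetical sketch of a "solution blueprint": one declarative recipe that
# names the services, workloads, SLAs, and target locations for an outcome.
from dataclasses import dataclass, field

@dataclass
class Workload:
    name: str
    runtime: str                      # e.g. "container" or "vm"
    target: str                       # edge site, data center, or cloud region
    sla: dict = field(default_factory=dict)

@dataclass
class Blueprint:
    outcome: str
    cloud_services: list[str]
    workloads: list[Workload]

# A computer-vision outcome spanning edge, core, and cloud.
vision_blueprint = Blueprint(
    outcome="defect-detection",
    cloud_services=["object-storage", "model-training"],
    workloads=[
        Workload("inference-server", "container", "edge-site-factory-07",
                 sla={"latency_ms": 50}),
        Workload("label-store", "container", "core-datacenter"),
    ],
)
```

An orchestrator would walk a recipe like this, call the right APIs for each target, and re-run or roll back steps through its workflow engine when something fails.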

Dan Cummins: And that's what we're doing with NativeEdge.

Bill Pfeifer: So, by addressing the constraints at the edge and then adding these API connections on top of it, you've effectively integrated edge into the multicloud, [00:29:00] and the North Star vision, it sounds like, loosely solves a lot of the challenges of multicloud; it actually delivers real multicloud, as opposed to we have one cloud and another cloud, which is sort of multiple clouds, but not multicloud.

Dan Cummins: We talk about being able to orchestrate services across multiple clouds, right? So we can certainly do that. What we can also do is orchestrate the deployment of a cloud stack, a private cloud stack, so it becomes very powerful. So now you put the power into the system integrator's hands, or the customer's or development team's hands, where they have the ability to compose different solutions for their businesses or their customers, and to offer those and operate those simply.

Bill Pfeifer: But it sounds like, I mean, more than just deploying services across multiple clouds, you're using blueprints to define, here's what my workload should do, and then deploying the workload, which is more what businesses care about, right? They don't necessarily care about one [00:30:00] cloud versus another cloud versus an edge location, where your compute sits.

Bill Pfeifer: They just want their stuff to work the way they intended it to work. And we're kind of moving toward that, which is really cool. It's an impressive challenge to solve through a platform, but it makes sense, right? A horizontal platform reaches across into all the things that it touches.

Dan Cummins: In my career, I have a saying, right? Do more with less, easily. That's how you're going to win. And Dell is in a perfect position to do that for our customers at the edge. If you think about it, we have a fundamental right to win the edge, right? We own the supply chain. We're number one in almost every category.

Dan Cummins: We know how to service both consumer. As well as commercial enterprise customers and customers want to be in business with companies like Dell that have their back, right, that have the, the ability to be able to help them long term and not only with their edge complexity, but when it's other parts of their business as well.[00:31:00]

Dan Cummins: I mean, these are the reasons why it makes sense that a company like Dell will be very successful here at the edge.

Bill Pfeifer: So, what other problems or gaps do you see in Edge technology that need to be tackled still? Either through this platform, through other platforms. What do you see as a fundamental challenge that the Edge needs to evolve to address?

Dan Cummins: The biggest challenge I think right now still is many of the applications that are out there today are still monolithic Windows applications, and customers need to go through that digital transformation, both hardware and software, so that they have the flexibility. And what we find out is that customers are not transforming their existing software, right?

Dan Cummins: But they're writing it new. And so I think there's still a challenge out there today. But from a technology perspective, I think, obviously, the impact of gen AI, and maturity [00:32:00] in silicon diversity and edge-optimized infrastructure. The number of computing devices continues to grow at an exponential scale.

Dan Cummins: My prediction is that, with the growth of IoT devices and edge locations that are servicing IoT devices, the compute at these locations in aggregate is pretty shortly going to outnumber the devices that are in the data center and the clouds. So you talk about scale. So really, if you think about the constraints at the edge, power constraints, right, the environmental constraints, right, for edge-optimized infrastructure, getting smaller, more capable devices and pushing

Dan Cummins: inferencing out further and further to the IoT devices, that maturing is still going to happen, so there's plenty of opportunity there. And on the flip side, generative AI and large language models: how do you apply those, and prompt engineering, to simplify the operational overhead for your solutions?

Dan Cummins: And it's very complex to create a blueprint, for example, but I can use an [00:33:00] AI to help facilitate creating that blueprint and ultimately deploying that. Right? There's lots of opportunities for technology. I mean, it's, it's kind of wide and broad, right? I mean, there's advancement in like private wireless and 5G.

Dan Cummins: It still needs to proliferate. So telecommunications or private wireless, there's quite a bit, actually, that needs to mature.

Bill Pfeifer: You mentioned silicon diversity, and that's kind of a fascinating conversation, right? With all the virtualization, everything moved onto x86 hardware, so it was nice and simple. But now we're talking about silicon diversity, where we're going to get hyper-specialized chips that do a particular thing really, really well at very low power or very high speed or both.

Bill Pfeifer: And AI specific chips and things like that, so this idea of pushing all of this compute out to the edge is manageable right now because it's x86 and you may have some GPUs to accelerate your AI or something like that, [00:34:00] but it's relatively simple, straightforward, consistent hardware. Then when we get silicon diversity in there, do you think we're going to see an explosion of diversity of hardware out at the edge as well as in the core?

Bill Pfeifer: Or is it going to be relatively more constrained to specialized locations like the core?

Dan Cummins: Oh no, absolutely. It's going to be a diversification of, you know, hardware specialization as more and more locations need to be automated or sensor data is collected. You know, it's kind of funny. So, you know, the general purpose processor, it was predicted.

Dan Cummins: I remember I was at a USENIX conference back in, I think it was like 2014 or something like that, where there was a professor from, I think it was Stanford, right? He got up and he predicted Moore's law is dead, right? And the reason Moore's law is dead is because of the end of Dennard scaling: you can't effectively scale power with transistor count to power up all those transistors.

Dan Cummins: Right. Too much [00:35:00] bleed-through and such. Yeah, I mean, I can't have a refrigerator-sized power supply, you know, to scale my transistors, with the heat and the cost of that power. And that is true at the edge. So what that's leading to is, you know, a move away from general-purpose processors; chip manufacturers are moving to a central processor but with specialized offloads that are much more efficient in their energy usage for those functions.

Dan Cummins: Like, whether it's a DMA engine or a GPU, it doesn't require the same amount of power as a general-purpose processing pipeline. So the prediction there was, okay, all the chip vendors are going to start competing on accelerators around the main CPU, and we're seeing that today. It absolutely came true.

Dan Cummins: So if you look at the Arm SoCs, right? It's core processors surrounded by a bunch of accelerators, right? So you're seeing that today, all to optimize [00:36:00] power, right? Because you know that power can't scale with the transistors. But there's some tricks that people much smarter than I are doing to attempt to continue along the lines of Moore's-law growth, right?

Dan Cummins: So at the edge, where you need much less power and you need more performance, you're seeing the rise of this SoC-type processing architecture for these reasons. And the proliferation of more and more devices and sensors, and the need to process data near the point of generation, is also fueling the need for this silicon diversity, right?

Dan Cummins: We're seeing it, uh, RISC architectures now, ARM architectures, and of course the x86 architectures trying to address and go down to this space at the edge.

Bill Pfeifer: Okay. So that's kind of brushing up against this next question, but looking forward, what excites you about the possibilities that the edge will generate?

Dan Cummins: The thing that most excites me is the [00:37:00] ability to federate your processing at these edge locations, right? I mean, the only way the world is going to scale is by federating this processing where the data is being generated. The other thing that really excites me, another thing that we haven't talked about yet, is really the data activation or the data management that has to happen. Most edge workloads are streaming workloads, but somebody who's operating a business or an edge outcome needs a combination of both IT workloads and edge workloads.

Dan Cummins: And you're going to need data management. And if you're going to deal with this scale, where all this data is being generated by these devices, you want to reduce the cost of centralization. I don't want to have to pull all this stuff up into a cloud or a data center in order to process it. I want the ability to locate where my data is, federated at my edge sites, and to be able to intelligently distribute compute to where that data exists and generate value from that data there. That way I can avoid [00:38:00] the infrastructure sprawl cost, right?

Dan Cummins: And the cost of centralization; that's what really excites me. It's a combination of, say, a data management platform that's using a distributed ledger, right, with an MLOps platform, right? Because now I can understand where all of the data that I need for training is at the edge, and then, rather than dragging all that up into a cloud,

Dan Cummins: I can federate training to those locations, because now we know how to find the data. So tracking and finding data, transforming this data, and moving this data to where it makes sense, to reduce that cost of centralization and to improve your economics for operating on that data, I think, is what probably most excites me.
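As a concrete illustration of federating training to where the data lives, the sketch below is plain-vanilla federated averaging in NumPy: each site refines a shared model on its own data and only the updated weights travel back for aggregation. This is a generic, textbook technique, not a Dell or NativeEdge MLOps implementation.

```python
# Federated averaging sketch: raw data never leaves the edge sites.
import numpy as np

def local_update(weights: np.ndarray, X: np.ndarray, y: np.ndarray,
                 lr: float = 0.01, epochs: int = 5) -> np.ndarray:
    """One site refines the shared linear model on its local data."""
    w = weights.copy()
    for _ in range(epochs):
        grad = X.T @ (X @ w - y) / len(y)   # least-squares gradient
        w -= lr * grad
    return w

def federated_round(global_w: np.ndarray,
                    sites: list[tuple[np.ndarray, np.ndarray]]) -> np.ndarray:
    """Each site trains locally; only weights are aggregated centrally."""
    updates = [local_update(global_w, X, y) for X, y in sites]
    sizes = np.array([len(y) for _, y in sites], dtype=float)
    return np.average(updates, axis=0, weights=sizes)  # weight by data volume
```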

Bill Pfeifer: So the thing that kind of tweaks me about the edge is it's where so many things are coming together, right? Most of our conversation was about how do we get compute to the edge and how do we get workloads to the compute, wherever that [00:39:00] happens to sit. But then there's also the whole AI conversation, the whole data management conversation, and that's not to say anything of the application development and the business transformation that has to happen to use all of that data and let the automation happen. It's a fascinating space.

Dan Cummins: Yeah, I find it kind of like a layered cake, right? So first you have to have your secured infrastructure management, lifecycle management, and then you have to have your orchestration, right? Your service orchestration across these edge computing locations, to be able to manage that, you know, estate.

Dan Cummins: And then now you can start to add additional value across these estates, right? To further deepen the value that you can generate at the edge.

Bill Pfeifer: And what do you see that doing in terms of impact on business and society as a whole, near term and long term?

Dan Cummins: Oh, near term? Well, certainly the federation of compute will enable faster time to value for businesses, for sure. Inferencing for computer vision use cases is a great example, right? Improving the [00:40:00] quality, the ability to deliver products with higher quality to customers faster, is really what it comes down to; that's one such example. But it will also hasten the rate of innovation, because anytime you can federate computing or scale computing, you're going to simplify a lot of processes.

Dan Cummins: You're going to generate a lot of value at the edge. Therefore, you'll be able to innovate faster. So an acceleration in the evolution of technologies, products, and services is what we're going to see.

Bill Pfeifer: Yeah, I guess that's what it's really all about: how we innovate faster and how we drive the world forward, help the world drive forward.

Dan Cummins: What is it that Michael Dell says? He says, um,

Bill Pfeifer: that our mission is to drive human progress.

Dan Cummins: So if you think about it, it's true to Michael Dell's slogan, right? That we're a company here to drive human progress, right? Because that's exactly what it's going to do. Yep.

Bill Pfeifer: Kind of fascinating that it all comes into one conversation like this.

Bill Pfeifer: I love [00:41:00] it.

Dan Cummins: Well, that's, yeah, kind of what it's all about.

Bill Pfeifer: Yep. It's a big conversation, but interesting. So how can people find you online and learn more about your work?

Dan Cummins: Well, people can always look me up on LinkedIn. So that's probably the best way to get a hold of me. Right. Cool.

Bill Pfeifer: All right. Love it. Thanks so much for joining us today.

Bill Pfeifer: And I appreciate the time and the glimpses into your mind and your processes. You've got some interesting challenges ahead of you, but it sounds like you've laid the right foundations.

Dan Cummins: Yeah, thanks, Bill. Thanks for having me.

Narrator 2: That does it for this episode of Over the Edge. If you're enjoying the show, please leave a rating and a review and tell a friend.

Over the Edge is made possible through the generous sponsorship of our partners at Dell Technologies. Simplify your edge so you can generate more value. Learn more by visiting dell.com/edge.