How does the Tour de France collect data and provide real-time stats to viewers? Shahid Ahmed, EVP for New Ventures and Innovations at NTT, argues that the work they do to track the Tour de France is the ultimate edge use case. He also dives into small AI models and localized data at the edge, as well as educating for the future of jobs in tech.
--------
“ I think [tracking the Tour de France] is one of the most fascinating edge use cases that I've seen in a while…. We’ve got cyclists going at north of 50 miles an hour, 60 miles an hour sometimes, and they're moving across a variety of different towns, villages, mountains, always a challenge from a network coverage perspective. It's a very difficult equation just from that movement part.”
“If you tune in to NBC, which covers the Tour de France, or USA Network, which is the sister network, you'll always see the live coverage of stats…. Live data is a must have and it has to have very small latency, meaning it can't be five seconds later when the cyclist has gone down or finished across the line.”
Timestamps:
(01:32) How Shahid got started in tech
(04:55) Innovations in 5G
(09:06) Challenges and innovations in the Tour de France
(18:01) Edge AI and industrial applications
(27:38) Lessons from a global role and regulatory challenges
(34:03) Teaching Industrial IoT and preparing the workforce
--------
Edge solutions are unlocking data-driven insights for leading organizations. With Dell Technologies, you can capitalize on your edge by leveraging the broadest portfolio of purpose-built edge hardware, software and services. Leverage AI where you need it; simplify your edge; and protect your edge to generate competitive advantage within your industry. Capitalize on your edge today with Dell Technologies.
--------
Over the Edge is hosted by Bill Pfeifer, and was created by Matt Trifiro and Ian Faison. Executive producers are Matt Trifiro, Ian Faison, Jon Libbey and Kyle Rusca. The show producer is Erin Stenhouse. The audio engineer is Brian Thomas. Additional production support from Elisabeth Plutko.
--------
Follow Shahid on LinkedIn
Follow Bill on LinkedIn
Producer: [00:00:00] Hello and welcome to Over the Edge. This episode features an interview between Bill Pfeifer and Shahid Ahmed, EVP for New Ventures and Innovations at NTT, a provider of mobile, infrastructure, networking, applications, and consulting services. NTT has a close partnership with the Tour de France, and Shahid dives into their experience tracking the race and providing real-time stats to viewers, discussing how they overcame some major limiting factors on the road.
He also discusses small AI models and localized data at the edge, as well as educating for the future of jobs in tech. But before we get into it, here's a brief word from our sponsor.
Ad: Edge solutions are unlocking data driven insights for leading organizations. With Dell Technologies, you can capitalize on your edge by leveraging the broadest portfolio of purpose built edge hardware, software, and services.[00:01:00]
Leverage AI where you need it. Simplify your edge and protect your edge to generate competitive advantage within your industry. Capitalize on your edge today with Dell Technologies.
Producer: And now, please enjoy this interview between Bill Pfeifer and Shahid Ahmed, EVP for New Ventures and Innovations at NTT.
Bill: Shahid, thanks so much for joining us today. This should be a fun conversation. We're gonna talk about sports, globalization, teaching, and whatever else seems fun along the way.
Shahid: Awesome. Great to be here.
Bill: Thanks. Lots to cover. So let's jump straight in with a little bit of a history lesson. Can you tell us how you got started in technology?
Shahid: Wow, that takes me way back, decades ago, but let me try to jog my own memory here. It's been that long. You know, I started at Sprint. Do you remember that company, Sprint, out of Kansas City? I actually started with an even [00:02:00] smaller company that was a predecessor of Sprint, called Centel. They were out of Chicago, and one year they decided they were going to go buy some spectrum and deploy cellular all over the U.S. So I started there and got involved with wireless very, very early on.
Literally buying land for putting up towers, doing zoning and construction. So I got involved with that quite early on, spent about four years there, and in my last year at Sprint rolled out a technology called CDPD, cellular digital packet data. Basically, what that meant was we put an IP address in every single device, so that you could literally ping it or connect to it over an IP network. That [00:03:00] started a whole revolution as we know it, right? It led us to things like GPRS, 3G, 4G, 5G as we know it, and all kinds of IoT applications. But that's kind of where I got started. Then I moved to Accenture, went 20 years there, but don't hold that against me.
Then I went on to PwC, a very short stint there, did a bunch of stuff, including Rand digital, and ultimately decided, you know, it's time to try something new. I ended up starting a couple of different companies: one of them was blockchain and the other one was 5G. Life happens; a lot of learnings, a lot of mistakes. We ended up not doing much with those two companies or ideas. The ideas were great, but I think the timing, as always, wasn't quite there. We [00:04:00] ended up shutting the businesses down and returning the money to our investors. Then I moved to NTT and took some of the ideas I learned from the blockchain and 5G businesses we started and applied them here. So that's kind of the summary of who I am and where I came from.
Bill: It's an interesting list of companies. I like the story about starting out at Sprint; it's funny to think about how much of running a cellular network is property management, just managing the leases, getting the places for all of those towers, right? I just think of my phone, and of course there's service on the other end of it, and if there's not, I'm mad. But that's a whole set of work, probably a whole department, and fascinating stuff. So today you're the EVP for New Ventures and Innovations at NTT, which is an interestingly squishy title. I imagine that lets you touch a lot of things. What does that mean to you? What [00:05:00] does that look like? What do you do with that?
Shahid: Well, we kept it sufficiently ambiguous and vague so I can do a lot of things. But basically it means we launch new products and services, get them to some level of scale, and then roll them out to a variety of different operating companies and countries. We try to incubate these ideas, one of which was private 5G.
We like to think that about three years ago it was sort of a category of its own. There weren't too many players at that time launching products and services around private 5G. And so, you know, when we launched it, we knew we didn't want this to be a cellular-operating-model type of product. We wanted it to be a more CIO-centric solution or product [00:06:00] set. You've got the carriers' cellular model, where they have a public network, they deploy, as you mentioned, towers near the businesses or factories or airports, and they say, hey, we've got you the 5G coverage you need, here's your private 5G solution. But if you really dig into this, and specifically within the CIO framework, where they have very specific enterprise requirements, they want to build their own 5G network that is tightly integrated into their LAN, WLAN, all those networking infrastructures they already have in place.
So when we built this and thought about it, we really had the CIO in mind and what they are looking for. But it was one of those things where we had to educate the market a bit, and even the CIOs didn't [00:07:00] know what they wanted, right? So we had to work with our customers, our suppliers, and other ecosystem partners like device manufacturers to all come together and really build this market for everyone.
We've seen quite a bit of progress over the last two, three years. Devices have become abundantly available, to where you can deploy a private 5G network and actually do the things you wanted to do in the first place. So we've come a long way.
Bill: Yeah, it's kind of an interesting challenge. A lot of technology right now feels like: I want to go out and buy a car, and I go to a vendor and they give me a great engine and a fantastic transmission and wish me luck. But I still have to build the rest of it myself: source all the parts, make sure everything's compatible. I want a car; I want to sit down, turn the key, and go to the grocery store. And, you know, that's [00:08:00] getting all of those pieces together. Boy, the ecosystems are getting wider and wider and more and more integrated, and asking customers to figure out all of those things is just amazing and painful and terrifying, with all those integrations.
Shahid: Even take electric cars, Bill, and you need a whole ecosystem around them of chargers; you've got battery manufacturers, you've got storage. I mean, it's not just building an electric car. Even in the U.S., we are still way short of electric chargers across the country. In fact, in Europe, we know electric car sales are plunging, and one of the main reasons is that the ecosystem is not quite there yet, including, but not limited to, charging stations across Europe.
Bill: And the power grid to back it up. That's a [00:09:00] whole different conversation, the power grid and how much demand we want to put on it. But that's taking us a little far off track, even if it's a fun conversation. So: the Tour de France. NTT, and your team specifically, support what you refer to as the ultimate edge use case, and you're doing a lot of the tracking, a lot of the measuring, a lot of the instrumentation kind of work.
It's a really interesting set of challenges, and stuff that most people probably don't think about and will never have to hit. But it's interesting to hear about that far-off-in-the-corner use case, because it comes back toward the more mainline use cases, and you go, oh yeah, okay, I can solve my much less restrictive problem in that space. Can you tell us a little bit about what you're doing for the Tour de France and how you got involved with it? [00:10:00]
Shahid: So we've been involved with the Tour de France for the last 10 years. It's an awesome partnership, not only because they allow us to actually bike behind the professionals. Believe me, they smoke us, and all we see is their backs for like five seconds.
Bill: I would hope so.
Shahid: They're well far away from us as soon as we start. But look, I think it's one of the most fascinating edge use cases that I've seen in a while. There are three reasons for it. One, you've got sensors deployed in the bike itself that do all kinds of measurements, not just the cyclist's biometrics; you've got sensors all across the bikes.
We know everything from the power on your left foot and right foot to the braking speeds. We know the location, [00:11:00] obviously the acceleration, all kinds of metrics coming from the bike. The second thing is it's moving, constantly moving. When we talk about edge use cases, most of them are pretty fixed: inside the factory, inside the hospital, airports, mining operations, things are relatively fixed. Here we've got cyclists going at north of 50 miles an hour, 60 miles an hour sometimes, and they're moving across a variety of different towns, villages, mountains, always a challenge from a network coverage perspective. It's a very difficult equation just from that movement part: they're constantly moving, and never in a constant coverage area. So that's the second challenge. The third is the data, and there's a lot of data. How do you make sense of that data? [00:12:00] How do you make it useful and insightful, not only for the teams themselves, but for the viewers? So, you know, if you tune in to NBC, which covers the Tour de France, or USA Network, which is the sister network, you'll always see live coverage of stats, right?
It's not very useful for an audience watching TV just to see pictures of cyclists moving; they want to see stats, right? You've got to make it compelling for the audience to keep watching that clip, or keep watching that stage live. So live data is a must-have, and it has to have very small latency, meaning it can't be five seconds after the cyclist has gone down or finished across the line that you see the data coming back.
It's just [00:13:00] not going to make the cut, especially for the producer or director of that clip. So if you think about these three challenges, it produces a monumental technology lift that we have to make every year, starting with all the data we collect. We've built a very robust AI model over the last 10 years that can, with almost 99 percent accuracy, predict who's going to be up front, what the lineup is going to be, what the ranking is going to be, based on not only previous data but also the current positioning of each cyclist.
It's incredible. We need that kind of AI model in place because of the connectivity, which was my second point earlier: you don't have robust connectivity. You're going through different [00:14:00] towns, and there's usually no cellular coverage in those areas. Even if there is, they lose connection with our helicopter, you know, that's always hovering above.
If the helicopter moves away because there's a mountain nearby, then you're left with a peer-to-peer network, with motorcyclists going in between different cyclists and capturing the data. We have to have a model that can take all this data. We even take video images from the motorcyclist, who's got a cameraman with a camera on his shoulder, always looking at different cyclists. From those images we're very accurately able to tell the distances and who is actually in which place. We take that data, feed it into the AI model, and it accurately spits out all the stats, even if we've lost complete connection [00:15:00] with the cyclists and the sensors on their bicycles.
So it's an incredible challenge for all of us, but it involves AI, it involves IoT, it involves some really cool networking technology, and we have to bring it all together to build this edge solution.
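The fallback Shahid describes, keeping the stats alive when a rider's sensors drop out, can be illustrated with simple dead reckoning: estimate each rider from their last known position and speed, then rank the estimates. This is only a toy sketch; NTT's actual model fuses many more signals (video, peer riders, course profile), and every name and number below is invented for illustration.

```python
# Toy dead-reckoning fallback for stale rider telemetry (illustrative only).

def estimate_position(last_km, last_speed_kmh, seconds_since_fix):
    """Extrapolate distance covered (km) from the last good sensor fix."""
    return last_km + last_speed_kmh * (seconds_since_fix / 3600.0)

def predicted_ranking(riders, now_s):
    """Rank riders by estimated distance, even when some feeds are stale."""
    estimates = {
        name: estimate_position(r["km"], r["speed_kmh"], now_s - r["last_fix_s"])
        for name, r in riders.items()
    }
    return sorted(estimates, key=estimates.get, reverse=True)

riders = {
    "rider_a": {"km": 120.0, "speed_kmh": 52.0, "last_fix_s": 90.0},  # fresh fix
    "rider_b": {"km": 120.4, "speed_kmh": 30.0, "last_fix_s": 60.0},  # stale fix
}
print(predicted_ranking(riders, now_s=120.0))  # -> ['rider_b', 'rider_a']
```

Even with rider_b's feed 60 seconds stale, the estimate keeps advancing them along the course, so the on-screen ranking does not freeze with the signal.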
Bill: And I guess you need all of that data constantly, because most of the people watching aren't cyclists and don't understand the nuances of what's going on. Otherwise you're just watching people on bikes. What makes it a race is the competition: you know, here's where we're starting to see this person slip; ooh, watch for something coming next. And I hadn't thought of using AI to parse the video that you're collecting from helicopters and motorcyclists and things like that.
I was assuming you'd just instrument the bikes, but I guess you can't even do that very heavily, [00:16:00] because, you know, you hear about how much these bikes cost because they're shaving off portions of an ounce and making them just a little bit more aerodynamic. And now you want to stick instrumentation and batteries and things like that on them. Oh, we'll just pull power from the pedals? No, no, now you're making the riders tired. The constraints that you're dealing with are just amazing. So you have to have these super light, super low power sensors, but still keep them connected all the time so you can pull all this data, because people are watching for the data and the storytelling of the data. I guess that's really what the commentators are working from.
Shahid: Absolutely. And yeah, you can lift these bikes with your pinky. I mean, literally, a Specialized S-Works: you can lift it up with the back of your pinky. It's that light.
We don't want to burden these bikes with all kinds of instrumentation, so we try to limit it to one single sensor that does all kinds of things, [00:17:00] but you lose connection with those sensors all the time. That's why the importance of an AI model comes into play: it starts to look at other sources of data coming from different parts of the race, and with it all brought together, it's able to predict what's happening and then spit out the actual metrics accordingly. Some of them, you know, 99.9 percent accurate.
Bill: That's amazing. And I guess you can put a lot more heavy instrumentation on the motorcycles and the helicopters and drones and things like that, because they're not the bike, and you can't really touch the bike much.
Shahid: Exactly, yeah.
Bill: That's a fascinating challenge. So what lessons have you learned from it? I have to imagine that after working with that tight a set of constraints, you go to manufacturing or energy or something like that, and they say, we have this crazy problem, and you go, [00:18:00] we already solved that.
Shahid: Yeah. You know, actually, born out of this whole Tour de France architecture came this idea of Edge AI, and this is our new product that we recently announced. The idea is very simple: do AI at the edge. What that means is that at the edge you just don't have all the compute resources, all the storage resources; you don't even have cloud connections, because the networking capabilities are restrictive, right? In a factory, generally, you want to keep all data local; you don't want anything going out into the public. So what we're talking about is algorithms and AI models that support use cases like safety, efficiency, things like [00:19:00] security.
You want to be able to do those very specific tasks in real time. They have to be actionable. And what that means is you can't have a large language model that has static data from 2017, or 2019 for that matter. This has to be real-time data that has a time series bound to it, right? There's data coming in from conveyor belts, from thermostats all around the factory [00:20:00] environment or mining operation, and you need to take action right away. So that's the kind of world we're talking about when we talk about Edge AI. It has to be a small AI footprint, very task-specific. Instead of a 20-billion-parameter AI model, much like a ChatGPT-4, we're talking about 20 parameters, or 30, at most 50, able to parse a very specific real-time data series and take action right away.
I'll give you an example. You have a video camera, right? We've seen all kinds of machine vision examples where a machine vision camera is able to tell how many people are walking in, or whether you have your hard hat on or not. It's able to do what I call singular tasks: count the number of people, count how many people have hats on. But what if you wanted to take this video camera, that's 4K or 8K, and have it look at all kinds of events inside [00:21:00] the factory, or a mining operation, or an airport for that matter? It should be able to tell simultaneously whether the doors are open, whether people are wearing hats, how many people have walked in, and who has walked in, and be able to process all this data in real time, in a way that can't have a long latency cycle. Meaning, if somebody leaves the door open, you want to raise the alarm right away; the red light should be blinking immediately. Or if for some reason someone obstructs an AGV path with a piece of equipment, you want to be able to alert that person immediately, you know, over the radio: hey, you can't put this thing here, there's an AGV about to come in and plow right through. So [00:22:00] those are the kinds of use cases we're contemplating. I mean, there are a bunch of other ones. Another one, which we've actually implemented with one of our customers: you've got a big factory, 1 million square feet or more.
And you've got a bunch of thermostats all around the factory. Think about it, right? You've got 300, 400 thermostats to cover the whole thing. You've got so many HVAC systems, and each one of them has some sort of thermostat to regulate it. But what if you wanted that factory to be set at 70 degrees across all of its thermostats? Believe me, it's easy to say, but it's a huge, monumental task. Literally, there's a guy out there with an iPad, and he's got each one of the 300.
Bill: That's the thermostat guy. That's my job, setting the temperature every day.
Shahid: He's got an app that looks much like a DJ equalizer, [00:23:00] where he's literally going up and down trying to manage each one of the thermostats, just to get some ambient temperature that he guesses will get him to 70 degrees, right? And companies are doing this (countries too, but companies especially) because they want to reach certain energy levels to meet their sustainability goals, right? Their power consumption, their energy consumption, so they can meet their carbon goals and so forth. So it's actually a very important use case that could either save companies a lot or cost them a lot. And with AI, especially running on the edge, you can do that flawlessly. It can run a linear programming, multilinear programming algorithm and very quickly tell you how to get to 70 degrees, by sending [00:24:00] commands to each one of the thermostats to adjust itself based on local temperature, heat emission from machinery nearby, and so forth. It's a compelling use case, all done by a very small AI model running on the edge.
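As a toy stand-in for the optimization Shahid describes (a real system would solve a linear or multilinear program over coupled zones), here is a sketch in which each thermostat's local reading is its command plus a fixed heat bias from nearby machinery, and a proportional feedback loop walks every command toward the 70-degree target. Zone names and bias values are invented for illustration.

```python
# Toy feedback loop for factory-wide setpoint tuning (illustrative only).

TARGET_F = 70.0

def tune(heat_bias, iterations=20):
    """Find per-zone commands so each zone's reading converges on TARGET_F.

    heat_bias maps zone -> degrees added by nearby machinery; the simulated
    sensor reading is simply command + bias.
    """
    commands = {zone: TARGET_F for zone in heat_bias}  # naive starting guess
    for _ in range(iterations):
        for zone, bias in heat_bias.items():
            reading = commands[zone] + bias               # simulated sensor
            commands[zone] += 0.5 * (TARGET_F - reading)  # proportional step
    return commands

heat_bias = {"press_shop": 6.0, "warehouse": -2.0, "assembly": 1.5}
for zone, cmd in sorted(tune(heat_bias).items()):
    print(f"{zone}: command {cmd:.1f}F -> reads {cmd + heat_bias[zone]:.1f}F")
```

The point of the sketch matches the use case: the system, rather than a person with an iPad, discovers that the press shop must be commanded below 70 because nearby machinery adds heat.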
Bill: So as you build those small models and you try to strip them down (again, not large language models, right? You don't want to know what kind of wine to have with fish for dinner; you just want to know, is the worker's safety equipment on properly, and is anything blocking the aisleways), is it more efficient to have a series of discretely tuned AI models? Here's a picture of a worker: run it through the worker-safety AI. Here's a picture of an empty aisleway: run it through the is-the-aisleway-safe model; is there anything blocking the machinery that's going to come through? Or does it make more sense to have one model that runs, like, the factory, or [00:25:00] some major chunk of it? Where's the break point there?
Shahid: I think the break point is very dependent on the use case itself. But generally speaking, you know, we've seen a lot of AI models that are huge large language models requiring a lot of compute power, a lot of storage, a lot of cloud.
Bill: That's certainly what's in vogue right now.
Shahid: Yes, but I think often we forget about some very specific things that AI can do in a factory-floor type of environment, or a hospital, airport, mining operation, oil rig, what have you. Those are very compelling, very specific, actually revenue-bearing opportunities, and they can provide huge [00:26:00] value to the end user, or to the use case itself, and it doesn't require a heavy lift in terms of building out a model. You do need a base AI model, no doubt about it, but you don't need to build something so huge, with such a huge data set.
So I think the inflection point is where we see a lot of use cases that require real-time, time-series-type actionable AI insights. To me, that's the turning point, inflection point, whatever you want to call it, and we're beginning to see that. One other thing: a side benefit of all of this is that AI has so far been pretty much relegated to companies that have a lot of capital, a lot of investment funding, to do the [00:27:00] things they need to do to help their large employee base.
But what about the blue-collar worker? Have they been given the same access to AI? The answer is actually no; they haven't seen AI in the same way that you and I have experienced it over the last two, three years. So I think a digital divide, if you want to put it that way, is beginning to take shape here, between the haves and the have-nots; we're beginning to see that.
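The singular-task checks from the camera example earlier (door open, hard hats, AGV path) can be sketched as a handful of cheap rules running over each frame's structured detections, with alerts raised immediately on the device. The detector itself is out of scope here, and all field names and messages are invented for illustration.

```python
# Toy per-frame rule checks over detector output (illustrative only).

def check_frame(frame):
    """Run several cheap, task-specific checks on one frame's detections."""
    alerts = []
    if frame.get("door_open"):
        alerts.append("ALERT: door left open")
    no_hat = [p for p in frame.get("people", []) if not p["hard_hat"]]
    if no_hat:
        alerts.append(f"ALERT: {len(no_hat)} worker(s) without hard hat")
    if frame.get("agv_path_blocked"):
        alerts.append("ALERT: AGV path obstructed")
    return alerts

frame = {
    "door_open": True,
    "people": [{"id": 1, "hard_hat": True}, {"id": 2, "hard_hat": False}],
    "agv_path_blocked": False,
}
for alert in check_frame(frame):
    print(alert)  # fires immediately, no round trip to a cloud model
```

Each rule is independent and trivially cheap, which is the appeal of running many small task-specific checks at the edge instead of one large general model.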
Bill: So, jumping tracks a little bit.
Shahid: Yeah.
Bill: You work in a global role, which means you're working across different economies, different regulatory environments, different regional preferences. You know, you've got economies that have lots of money and some that have less. When you're talking about manufacturing in low-[00:28:00]cost regions, you can't have an expensive solution, because they'll just brute-force it with more people. So how much do you have to localize your offerings? And what do you see across those markets that leads you to think, here's an opportunity where we should create and incubate a new solution, a new product, that we can then deploy at wide scale?
Shahid: Yeah, great question, Bill. For me, it's the proverbial think globally, act locally. I mean, it's almost a cliché these days, but it is kind of a guiding principle. While we build a global platform for generally all of NTT DATA, we tend to work very locally in our markets. And by that I mean we have very different regulatory frameworks. [00:29:00] As an example, picking on just telecom: even in the U.S., our telecom policies are very different than European Union telecom policy. And if you dig even deeper, German telecom policy is very different than the French.
Bill: Well, and you had mentioned something about Chicago, that the data had to be processed in Chicago. I hadn't heard about something like that in the U.S.; that was fascinating. I've only heard of it country by country.
Shahid: Yeah, it's in Chicago. I believe the city council is going to vote, they will be voting, on whether data center providers can take their data outside their data center, and when I say outside, I mean outside Cook County. They'd be required to keep it in the county; forget about the Chicago area, or Illinois, or the U.S. [00:30:00]
Bill: Local.
Shahid: That's hyper-local.
There's some truth to this, by the way. I mean, in Europe today, the regulators are thinking about this quite seriously, and have been, right? There are a lot of laws out there where you can't actually take data out. There's GDPR, of course. In California we have CCPA, and there are all kinds of regulations; each state is now thinking about how it wants to treat data as AI starts to take shape. So look, regulation aside, there's also compliance. And if you take even compliance away from the equation, then you have the customers themselves: there are language challenges, there are customer requirements that are very specific to that country, there are labor laws. So, yeah, it's always a challenge. We try to [00:31:00] build a base horizontal global platform, but really, in the end, we have to meet the requirements of that country's specific area and that customer.
Bill: Right. So, building the Edge AI platform that you mentioned earlier: the idea of the edge is pulling technology out and putting it as local as possible, so you're moving the data as little as possible and processing it in place as much as possible. But now you add on all these different regulations, cost restrictions and data-movement restrictions, where you can move it within a country, or just within a city, or, you know, whatever. So to make it cost-effective, you want it globally consistent and centrally managed and all that good stuff, but then to make it okay with the regulators, you have to account for all of those local things. And even just keeping track of all the security requirements, the regulatory requirements, [00:32:00] the data-movement requirements, all that good stuff. That is an interesting challenge.
Shahid: Yeah, those are very interesting. I think one thing Edge AI has going for it, at least in our products, is that we try to keep everything local inside that edge hardware solution we have in place. And we've got a pretty interesting edge solution: ample storage, great compute power, and on the processor side, you know, Intel-based and NVIDIA-based options. We're not talking about GPUs that cost a lot and take a lot of power; you don't need all that to run the AI models I'm talking about. So a lot of those issues kind of negate themselves when we do everything locally inside that hardware box. That's [00:33:00] something we've had in mind. Look, we don't want any of this data going out; that way, we don't have to deal with all these issues related to regulatory compliance, sovereignty, all that.
Bill: And I would imagine that the hardware used to be the limiting factor for stuff like this, but now it's gotten so powerful, so durable, that it's more about how you get it out to every place. And the telecoms like NTT have been doing that 5G property management, the cellular property management piece, so you know how to do small installations in many, many, many places. That's a big chunk of the battle right there: having that expertise and the number of people who can go out and install it, and then centrally figuring out all the regulatory environments and things like that. But you've probably had to do that for years too. So I [00:34:00] think you're really well positioned for this; it's kind of amazing. So, jumping tracks again, if I may: on top of your day job, you're also a visiting professor teaching Industrial Internet of Things at the McCormick School of Engineering at Northwestern University. What took you to teaching, and what took you to that course?
Shahid: I've been associated with Northwestern, particularly McCormick, for a while; I obviously graduated from there. But, you know, one of the big challenges the school had, the McCormick School of Engineering, was that as industrial companies were beginning to shift from hardware to software, and into robotics, they needed to re-skill and bring in new talent: not only a new skill set, but a talent pool [00:35:00] who were thinking more like software engineers and less like industrial hardware engineers. So the idea was, let's build a course that can essentially bring in concepts of software engineering, data structures, and networking, along with the fundamentals of industrial engineering, all blended together.
That led us to the Industrial Internet of Things course. It's really all about making sure that leaders currently working in these industrial companies, who are looking to advance their careers in software, or who just want to learn a little bit more about how software can help the industrial processes inside their plants, [00:36:00] in any industry, or any graduate student who wants to take a course on industrial manufacturing with very much a software inclination, can get all those concepts in one single course. So yeah, that was the design principle behind the course. It's always sold out, not because of me, but mainly because of the ideas we're bringing to the table for all of these students.
Bill: It's a great topic for sure, but it's also a really fast moving topic, right? What we're trying to do in industrial IoT is changing so fast, I would imagine you've got to keep updating that course just to stay current, let alone be more forward leaning. Because by the time students get out, get their jobs, you know, get established and are ready to actually use that information, some amount of time has passed, and they'd have to move really fast [00:37:00] to not get stale with that knowledge. How do you stay up to date to keep them current? What skills are going to be relevant for the generation that's in school today? And how do you train them in advance?
Shahid: Yeah, let me address that last question. I think that is the biggest challenge right now: all universities are facing the pace of change in technology.
We all know it; we see it every day, right? Almost every day, there's some announcement out there. And if you keep track of any AI development, your inbox is completely filled with all kinds of stuff, and at the end of the night, you're trying to figure out what just happened today.
And so as an employer, I'm always looking not only for a smart, very agile student who can potentially join our [00:38:00] company, but also somebody who has a good handle on the industry. And I'm not looking for, you know, five years of experience in understanding AI, for that matter, or product development. I know they just graduated from college; I don't expect anything like that. I do expect them to have some grasp of what is happening in the world. And it's a challenge for the universities to not only teach them basic fundamental academic concepts. Even economics has pretty much changed these days, right? You can't teach the old Keynesian economic theory anymore, right?
There's a lot more happening, even with, you know, how the digitization of currencies is happening all over the world, right? And so, [00:39:00] even for an economist graduating out of the Kellogg School of Management at Northwestern, a Federal Reserve hiring manager will say, okay, tell me a little bit about CBDC, central bank digital currency. You know, what are your thoughts on that?
And if the student doesn't know anything about it, you've got a real problem. So how does even the economics teacher, forget about a tech teacher like me, teach about the economics of crypto technology? And how can an open technology like crypto help serve a Federal Reserve's, a central bank's, objectives, right?
So it's a challenge for a lot of universities across different fields, not just industrial engineering or technology or electrical engineering; economics, [00:40:00] finance, I think it's a challenge across the board. And I think we need universities to have a closer relationship with industry companies so they can bring these kinds of expertise into the class.
Bill: It's an interesting challenge to solve, and it's kind of cool. I mean, I'm sure we've all seen the stats that, you know, something like 50 percent of the jobs kids will have in 30 to 40 years haven't even been invented yet. And I think it's higher than that; I think it's more like 70 percent of the jobs they're going to have in 30 years that don't exist yet.
And so how do you train for that thing? Which thing? I don't know yet. We don't know. But it's probably also a pretty amazing opportunity for you, because you get to be in the middle of it and do as much learning as [00:41:00] your students do, just to keep up, stay current, and try to stay far enough ahead to be useful to them.
And I would imagine that they do a lot more back and forth than we did in university in the past. It used to be: here is the knowledge, I give it to you. Now it's more a conversation about what comes next, much more exploration of the topic, I would expect, I would hope, I would anticipate. But this has been a fantastic conversation, and we are coming up on time here.
So Shahid, how can people find you online and keep up with everything you're up to?
Shahid: Yeah, I'm pretty active on LinkedIn, so please feel free to find me there, connect with me, or send me a quick note. Happy to field questions. And I'm always on the road, so if there's a conference where you come across my name, feel free to [00:42:00] stop me afterwards.
Bill: Fantastic. I loved the conversation. Thank you so much for the time and the perspective. And I hope you had fun as well.
Shahid: It was awesome, Bill. Thank you so much.
Ad: Capitalize on your edge to generate more value from your data. Learn more at dell.com/edge.