Over The Edge

The Evolution and Future of Data Centers with John Bonczek, Chief Revenue Officer of fifteenfortyseven Critical Systems Realty

Episode Summary

How have data centers evolved, and how will we adapt to meet modern data storage demands? In this episode, Bill sits down with John Bonczek, Chief Revenue Officer of fifteenfortyseven Critical Systems Realty, a leading developer of highly interconnected, custom-designed data centers. They discuss the impact of AI on power and cooling requirements and the growing interest in edge deployments. John also highlights the importance of planning, customer requirements, and the challenges of building data centers to meet modern demands.

Episode Notes

How have data centers evolved, and how will we adapt to meet modern data storage demands? In this episode, Bill sits down with John Bonczek, Chief Revenue Officer of fifteenfortyseven Critical Systems Realty, a leading developer of highly interconnected, custom-designed data centers.

They discuss the impact of AI on power and cooling requirements and the growing interest in edge deployments. John also highlights the importance of planning, customer requirements, and the challenges of building data centers to meet modern demands.

--------

Key Quotes:

“And I believe everyone in the space, including AI and hyperscalers, are planning further ahead as well. Partially because you have to, but you look at the consumption of some of these AI companies that are coming in and gobbling up all of the available inventory out there that meets their needs, it's just causing a lack of inventory to be available.”

“There's going to be a next wave from AI, more of the inference applications that are more of the edge applications that require lower latency and more real time compute and learning.”

--------

Timestamps: 

(01:45) How John got started in tech 

(06:45) Data centers and edge deployments 

(11:13) Challenges in modern data centers

(20:29) The role of AI

(31:01) Power and sustainability in data centers

(35:50) Nomad Futurists and the future workforce 

--------

Sponsor:

Edge solutions are unlocking data-driven insights for leading organizations. With Dell Technologies, you can capitalize on your edge by leveraging the broadest portfolio of purpose-built edge hardware, software and services. Leverage AI where you need it; simplify your edge; and protect your edge to generate competitive advantage within your industry. Capitalize on your edge today with Dell Technologies.

--------

Credits:

Over the Edge is hosted by Bill Pfeifer, and was created by Matt Trifiro and Ian Faison. Executive producers are Matt Trifiro, Ian Faison, Jon Libbey and Kyle Rusca. The show producer is Erin Stenhouse. The audio engineer is Brian Thomas. Additional production support from Elisabeth Plutko.

--------

Links:

Follow John on LinkedIn

Follow Bill on LinkedIn

Episode Transcription

Producer: [00:00:00] Hello and welcome to Over the Edge. This episode features an interview between Bill Pfeifer and John Bonczek, Chief Revenue Officer of 1547 Critical Systems Realty, a leading developer of highly interconnected, custom-designed data centers. John and Bill discuss the impact of AI on power and cooling requirements and the growing interest in edge deployments.

John also highlights the challenges of building data centers to meet modern demands. But before we get into it, here's a brief word from our sponsor.

Ad read: Edge solutions are unlocking data driven insights for leading organizations. With Dell Technologies, you can capitalize on your edge by leveraging the broadest portfolio of purpose built edge hardware, software, and services.

Leverage AI where you need it. Simplify your edge and protect your edge to generate competitive advantage within your industry. Capitalize on your [00:01:00] edge today with Dell Technologies.

Producer: And now please enjoy this interview between Bill Pfeifer and John Bonczek, Chief Revenue Officer of 1547 Critical Systems Realty.

Bill: John, thanks so much for joining us today. This is going to be a rollicking good time. We're going to be more data center focused than usual. I started out in data centers, I love talking about them, and we've been avoiding them because we tend to talk about edge computing. But there is a heavy data center component to edge computing.

So let's start with a little bit of background. Who's the man behind the man? How did you get started in technology? What brought you here?

John: Oh, boy. So, I mean, we could probably go back further than I usually do if you want to talk about how I got started in technology. But I was a teenager when I actually started working during the summers.

For a company that was basically New York Telephone, through its iterations of [00:02:00] different Bell names, including NYNEX. I was working summer jobs, get this, in the Pubcom department, and I worked in an analysis department as a, let's call it a summer intern. It was basically a paid summer job.

I started working there from when I was probably a sophomore in high school through college. And yeah, it's sort of a family industry thing. My father was in telecom his whole career and worked for many years in Pubcom. For those out there who might be too young to know, Pubcom means pay phones.

I worked in an analysis department doing daily reports on pay phone outages and troubles. And I always have to mention the most interesting report I ran during that time: the cherry bomb report. Fireworks were a very popular thing in New York and New Jersey, and there was a report that basically said how many telephones had been blown up the day before and were out of service.

It was weird.

Bill: So that just makes me think: my kids [00:03:00] are 14. I'm sure they have never used a pay phone. I want to take them out, find a pay phone, and have them call their grandma and just talk to her. Like, we're talking, on a pay phone!

John: Like, I honestly don't know if you could find one and use a quarter anymore. But, you know, there was the coin collection, and people were blowing them up.

Actually, no, they were blowing them up more for fun than for the money. I guess a pay phone would hold a couple of hundred bucks in change, but yeah, it's a lost technology. I guess there are still some of them out there; I don't know if we can still put in a quarter and make a call. But after college, my first job in this industry, I guess you could call it, was working in sales at AT&T. Then, believe it or not, I moved over more into the operations and engineering side, working for a couple of small CLECs, and ultimately I got an introduction to some folks at Telx and started working there in 2003, starting out basically with the [00:04:00] sales and sales engineering

groups there. I was with Telx from the time we were just a single location at 60 Hudson Street through to upwards of 30 locations by the time Digital Realty acquired Telx, and I stayed on there for another year. But what I learned through that evolution was more of the edge colocation and interconnection space, the carrier hotel environment, some bigger data center footprint.

And when I say bigger, it's all relative; you know, back then a few hundred kW was a big footprint. But I learned a lot. That's really where I cut my teeth in the data center space and learned quite a bit about the importance of edge interconnection and carrier hotel space.

Then one of the Telx co-founders, Todd Raymond, became a co-founder of 1547, along with a couple of other buddies in the industry who started this up. They started it up around 2012 [00:05:00] and got their start with the first couple of facilities, and I joined up with them in 2017 and have been here since. Seems like a blink, seems like yesterday; I guess the older I get, the faster time seems to go. But it's a little over seven years now that I've been here with them, growing and evolving this portfolio, and, you know, edge really is the theme of what we do.

So I think the topic is perfect.

Bill: And you've seen some incredible evolutions. You were talking about working for a telco where a decent-size deployment was a couple hundred kilowatts, and now, on the big side, that's two racks. Yeah, exactly. Yeah.

John: Even just on the interconnection side, which is where we focus a lot, network gear that used to be, you know, sub-two kW per cabinet is now, you know, 17 to 20 kW per cabinet.

Yeah, we still see some passive equipment coming in for fiber [00:06:00] cross connections and things of that nature, and some lower-density gear to support network applications. But even on the network side, things have densified. Today's network gear, from an infrastructure perspective, its space and power and cooling requirements look like what a high-density data center environment was 10 years ago.

And now today's high density data center environment, that's a whole nother animal, which I'm sure we'll get into a bit.

Bill: Yeah. Yeah. It looks like the wrong side of an old power plant. My goodness. So, as I said in the intro there, we're going to be pretty data center focused in this conversation. I don't think we've had this much of a data-center-specific conversation before.

But as we get into the edge and look at the bigger side of edge, or the more advanced side of edge, we kind of brush the edges of tiered computing. Some of it's going to be on device or on site, then some of it's going to go to, you know, a small regional data center, then to an aggregation data center, and then to a big centralized data center or cloud or something like [00:07:00] that. All of those tiers of compute and storage and networking are pretty heavy deployments in some cases. And even if it's just a couple of racks, you still have to build it out like it's a data center.

You have to cool it and power it and all that stuff. So what does edge mean to you in your world?

John: Yeah, I'll say there are two different layers to the edge in my world. When we look at data center space, from our perspective, we look at it very simply. It's sort of like a three-ring circle, where the innermost ring, the bullseye of it, we look at as the carrier hotels.

That's if we're looking at edge meaning network edge: the closest point to the eyeballs, the closest point of interconnection or aggregation where the hyperscalers are going to meet each other, meet the peering exchanges, meet the networks, and meet the folks who are getting content out to the eyeballs.

We own and operate what I'll say now is several, what I'll call carrier [00:08:00] hotels in our portfolio. But then we also play in the space just in that middle ring, the outer ring being the hyperscale data centers and the AI data centers that need massive capacity now, several hundred megawatts to gigawatts or multiple gigawatts, as we're hearing from some of the providers and developers. We're not playing in that space, but that space is interesting to us, because there's edge that would be adjacent to those. Part of our thesis is that where those clusters of massive data center investments are taking place, there is going to be an increase in content that needs to be moved and delivered out to the eyeballs.

And what you need from there are some of these more edge facilities. The edge facilities that we play in are more like our Orangeburg, New York facility, which today is a 24 megawatt facility that we have now designed and gotten approval to move forward with an expansion to more than double that. We're going to be a near 60 megawatt data center.

We'll call it a [00:09:00] campus, even though it's technically going to be one building. Just a few years ago, if you were talking about a 60 megawatt data center, that was pretty big. It's not that big anymore. That's the design for what I'll call edge enterprise colocation data center footprint.

That would mean, you know, a megawatt to several megawatts of space and power for applications that are serving a geography and need to be within that geography. They're not network applications that need to sit in a carrier hotel, but they're applications that will benefit from the proximity, or the tethering back into those carrier hotels, into that bullseye, but also from serving that general market.

It could be trading applications, media applications, financial analytics, customer behavior type applications, the ones that are not necessarily ultra low latency, a term that we don't use that much today; it was a big buzzword 10 or so years ago. They benefit more [00:10:00] from lower-latency network access to the carrier hotels and end users.

Bill: We talk a lot about the edge, and, you know, there are the tech trends of centralized versus decentralized. I've seen it with servers, right? Push the servers right out to the users, into their local wiring closets. No, pull them into the core data center, which is in the same building but was, you know, quote unquote centralized.

And most of the data centers that I built back in my day were on premises that the company I worked for at the time owned. We were just, you know, in suburbia, in an office park. We weren't close to an internet trunk line. We weren't sitting right next to our own nuclear power plant where we leased all the power, nothing like that. But it was good enough, and it was where all of our people were. Now that starts to look like distributed, like an edge data center; you know, at the time, that was hardcore centralized.

But racks are getting denser and heavier. It blows my mind to think that even just [00:11:00] the weight of the equipment in the racks will now break the older raised floors that I was working with. So as we look at those on-prem, quote unquote centralized data centers right now, customers are going to have to rerun power and rebuild cooling and rebuild their flooring; everything about it has to be rebuilt. Are you actually seeing customers building new on-prem data centers or rehabbing their old ones? Are they moving to colos just so that floor, power, cooling, all of that infrastructure stuff is taken care of? What are you seeing customers do?

John: What I'm seeing them do is, if there are applications that they are not willing to or cannot move to the cloud, they are moving out to data centers like the one I mentioned in Orangeburg, ours or other providers' data center space, to offload the management of that infrastructure.

[00:12:00] I think everything that you just mentioned, from power densities to floor loads to cooling requirements and all of the evolution thereof, is causing them to move their data centers out. When I look at our customers and the data center footprints that they have within our buildings, and in this one in particular, they range anywhere from one megawatt to our largest tenant at about five megawatts today.

They'll consider that their own data center, but it's off prem, so they don't have to worry about anything. Everything inside the PDU belongs to them; everything outside it belongs to us. So we can manage, own, and operate that to an SLA that meets their standards. For providers like us, and others out there, the SLAs we talk about are table stakes. And I think an enterprise would be hard-pressed to say that they were managing their own on-prem data center to the same level of SLA as a provider that focuses on mechanical and electrical infrastructure alone, the core of what we do. Holding us responsible for it, we [00:13:00] do a better job than they would on their own prem.

Bill: Yeah, it's crazy expensive to have that much redundancy built in and test everything and have the specialists and oh boy, okay, especially the way they're built now.

John: The ongoing maintenance of it, I mean, the number of maintenance agreements we have to have, the management of fuel and fuel delivery, fuel cleaning, there's so much.

You know, it might seem simple on the surface, space, power, physical cross connectivity, cooling, but there is a lot to managing it: fire suppression systems, all of those things, ongoing maintenance. There's stuff happening in our data centers every day even when there's not an incident.

And if there is, and you're designed for it, then there's even more taking place. So it's a lot. I think just offloading the stuff that's not the core competency of that enterprise takes the strain off of having the staff for it; otherwise, you're still managing outside agreements to handle your on-prem equipment.

And then there's the end-of-life part. That's no fun: end of life for batteries and cooling systems. You know, you're rebuilding equipment within the data center as it gets to be 20 years old. We talked about it, I think it was before we started: the older we get, the faster time goes.

I've been a part of standing up new data centers, and the next thing you know, they're 15, 20 years old and you're replacing gear. It happens fast.

Bill: Yeah, yeah. Well, at this point, I would imagine that these days data centers don't last 15, 20 years like they used to, because of floor load and heating and cooling.

You know, what, 10 years ago a standard rack was pulling five kilowatts, and now the hyperscalers are building toward like 130 kilowatts per rack. That's incredible. How do you cool that without going liquid? You know, there's hot aisle, cold aisle, but now they're totally segregated, so you can't stand in the hot aisle because it's too hot. Just mind-blowing stuff.

That's got to be in a colo.

John: And I think the short answer to the first question, [00:15:00] maybe it was a bit rhetorical, is that you don't do it without liquid cooling. I mean, you have to, and that's coming on. And I think anybody who is standing up and talking about being the expert in direct liquid cooling today, I don't want to say they're lying, but we're all learning about it together.

At recent conferences I've been at, it was kind of refreshing to hear a lot of people in leadership and infrastructure design and development positions across this industry saying: we are working with our customers and learning together how we're going to deliver. It's not even a question of what technology; the technology is, you know, direct liquid cooling and CDUs, in-row CDUs and in-rack CDUs. And of course, we're talking about cooling distribution units, which, you know, I'm not the technical guy, of course, but I know enough to be dangerous and talk at a high level.

And, you know, but now are they and I'll see to use in rack cd use. And of course, we're talking about cooling distribution units, which, you know, my understanding of, and I'm not the technical guy, of course, but you know, I know enough to be dangerous in the talk at a high level. In a good way, we're using the technology that we typically have in our facilities, chilled closed loop systems, but instead of pumping them through air handlers, we're going to be pumping them through the cabinets and delivering that chilled water directly [00:16:00] to the chips.

And what does that do in terms of, you know, SLAs and life expectancy on equipment, and how you manage issues? You've got water running everywhere, liquid running everywhere, more potential for different types of issues that you didn't have before. I mean, you had them to some extent, but we got pretty good at not having leaks, having corridors where all our piping ran, and limiting how much water ever came near the data center.

And now we're going to be pumping it through every single cabinet in the data center. It changes things quite a bit. What could go wrong?
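
[Editor's note: a rough sense of the cooling math behind this exchange. The sketch below uses the standard sensible-heat balance for air; the temperature rise and the rack sizes are illustrative assumptions, not 1547 or Dell figures.]

    # Airflow needed to carry a rack's heat load with air alone.
    # Sensible heat balance: P = rho * cp * V * dT  =>  V = P / (rho * cp * dT)
    RHO_AIR = 1.2          # kg/m^3, air density near sea level (assumed)
    CP_AIR = 1005.0        # J/(kg*K), specific heat of air
    CFM_PER_M3S = 2118.88  # cubic feet per minute in one m^3/s

    def required_airflow_cfm(rack_kw, delta_t_c=15.0):
        """Volumetric airflow (CFM) to remove rack_kw of heat at a delta_t_c rise."""
        v_m3s = (rack_kw * 1000.0) / (RHO_AIR * CP_AIR * delta_t_c)
        return v_m3s * CFM_PER_M3S

    for kw in (5, 20, 130):
        print(f"{kw:>4} kW rack -> ~{required_airflow_cfm(kw):,.0f} CFM")

At roughly 590 CFM, a 5 kW rack is routine to cool with air; pushing roughly 15,000 CFM through a single 130 kW rack is not practical, which is why the chilled-water loop is moving from the air handlers into the cabinets.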

Bill: It's fine. Yeah, what could go wrong? A couple years ago, I started to hear about edge devices being liquid cooled, and I laughed, because that's ridiculous.

You know, you get hardened devices. You don't need liquid cooling. Now we're talking about liquid cooling edge devices, which is kind of incredible. You know, like one, two, three servers and liquid cooling out there in the field. Ooh, okay.

John: Yeah, it's amazing, and we'll see where we end up here. I'm getting older and thinking more about, you know, where retirement is, but I'm still excited to see what the next [00:17:00] 10 years brings.

We were talking about immersion cooling for a while, and I feel like we just now skipped over immersion. But I was talking to some folks recently who said, you watch it, immersion is going to come back once there's more acceptance and adoption, once they just perfect that technology and people aren't steering away from it because it's messy. You're still going to be able to get higher densities, apparently, out of immersion.

And I don't know if we need to educate the folks listening, but with immersion, we're talking about actually immersing your servers into a different type of liquid, a dielectric liquid, more like a mineral oil, for lack of a better term, that's going to cool your gear.

You're still essentially water cooled, because you're running water through the immersion gear to cool the immersion liquid. But the densities that you'll be able to get, and apparently the lifespan of the equipment, because it's somewhat protected by the immersion fluid... The downside is nobody wanting to [00:18:00] maintain it, because every time you had to replace a server, you're pulling it out of this liquid, and it gets everywhere.

Bill: and it's not water.

It's some kind of oil or something that's nonreactive. Well, Microsoft did an experiment where they built a data center in a container, filled it with nitrogen, a nonreactive gas, submerged it in the ocean, and ran it for a couple of years. And apparently there was much lower wear and tear on the servers, because they stayed really cool.

They weren't surrounded by oxygen. No one touched them. It was just fantastic from a maintenance perspective. I wonder about the new immersion.

John: I also thought it was the craziest thing when I heard that, but yes, of course I followed that story, and that's amazing to me. You know, seeing the barnacles and everything growing on your data center over time, that's insane to me. But there was also a company out there who was building data centers basically on ships, on barges, and, you know, the idea there was [00:19:00] an unlimited supply of chilled ocean water to run through your facility, which is amazing to me.

At the time it seemed pretty crazy to me, putting your data center on a boat. I mean, basically docked barges, that's essentially what they were. But now that we're running water through them anyway, it becomes a bit more feasible. Why not put them on or near the water to give them full access to that type of cooling?

Bill: Yeah. Really kind of blurs the line.

John: Some people say we're warming the ocean now with the water that we're pumping back in. So I don't know if that's an environmental issue, but, you know, I'm sure it will be to somebody.

Bill: as long as we don't warm individual spots too much. Maybe it's not too bad, I don't know.

But if it sits there and we warm one spot, then I'm sure it's gonna change the whole ecology of the area. That'll be an interesting conversation too. Yeah. I mean, is that an edge deployment? Is that a central deployment? They're probably pretty big deployments. I don't know. I don't even know what to call them anymore, so [00:20:00] it all blurs together.

It's very interesting. So then we get to the logical question: AI. AI is using tons of power and just exacerbating, you know, the speed of the evolution of all of this. How is that impacting the way you see designing, managing, building data centers?

John: It's a great question. And now, while we don't necessarily play in what I'll call the hyperscale or AI footprint space, you know, we're building more of the edge locations.

But we are seeing an impact. It might not be from AI directly yet; we expect to see... well, I take that back. I think we are seeing an uptick in network connectivity through some of the facilities that we operate because of what the hyperscalers are doing with AI. We are seeing a direct increase in business from them for their network fiber hubs.

Things like that were always in these carrier hotels; they're now just growing in capacity. And I believe [00:21:00] AI is driving some of that from the content that it's creating. I think there's going to be a next wave from AI, more of the inference applications that are more of the edge applications that require

lower latency and more real-time compute and learning. I compare AI to some of what we're seeing that's not necessarily called AI: some of our enterprises on the financial side are running, you know, forms of machine learning. That's using the same technology, the same type of NVIDIA stacks, where, you know, they're requiring the same level of power densities in cabinets.

So, whether you call it AI or machine learning, the technology that they're using, I mean, it's available to them, and they're using it in their stack. We have deployments in a, I'll say, now-traditional data center that we just built within the last few years that's able to do, you know, on average, 17 to 20-some-odd kW per cabinet.

In some cases, those cabinets are only about one-fifth occupied, because they're already pulling all the power. [00:22:00] So we're working with some of these existing tenants to potentially retrofit their environments. But the hard thing to think about, and I don't know if anybody's got this nailed yet, is how you build a data center to spec. Take the expansion at our facility in Orangeburg, where we're essentially going to build another 30 megawatt building.

At least that's what we're designing for today. It's a 230,000 square foot building, and we may only need 50,000 square feet of that to use all 30 megawatts, based on the densities we're seeing from a couple of the customers we're talking to. So, you know, then the next question is: do you build a structure sized to accommodate lower-density applications, or to potentially get more power down the road, or do you build to a higher-density spec?

For the most part, our design all the way up to and through the chilled water loops that we're putting in is not really impacted; it's just the amount of capacity that we'll be able to get into a physical structure. If we know we're getting X amount of megawatts to a particular building, [00:23:00] we can build our mission-critical infrastructure, or MEP, from there to meet that 30 megawatts, in this case.

But then it's just a matter of: am I going to end up with 150,000 square feet of empty space because we're putting it into a very high-density, direct liquid cooled environment that only needs, you know, a few thousand square feet?

Bill: And then all of a sudden you're renting out office space.

John: Yeah, yeah. Or, you know, setting up some very cool amenities for our tenants.

Bowling alleys. Golf simulators and whatnot, an indoor pool. And it's so warm.

Actually, I mean, there are probably some very cool, interesting things that could be done like that out there. I don't know, but it's on our minds, though probably not as much as it is for the folks who are building out the hundreds of megawatts and up to [00:24:00] gigawatts. What's the ratio?

It was a pretty predictable and standard formula that did change over time. I can give you the evolution right in one building of ours: 15 to 20,000 square feet of space that, not that long ago, eight or 10 years ago, we were building out for two megawatts. Then, you know, we increased it to about three megawatts, and the latest deployment in a similar size is four megawatts.

That's to manage an average amount of power and cooling per rack. But now, for the next build for us, you know, I'm talking to a couple of different tenants.

They could put four megawatts in just a couple thousand square feet, if that. I mean, when you're talking about 200 kW per cabinet, and you're averaging, what, 25 to 30 square feet per cabinet? Do the math. You don't need a whole heck of a lot of space. So how do you build that on spec? I mean, I think we have to stop basically at the white space and not really [00:25:00] deploy too much in particular, even down to whether we're putting in air handlers, until you have a customer agreement.

And I think part of the difference here is stopping deployment at getting that powered shell up, with some of your, you know, your generators and UPSs and all of that. But from there, how you're going to distribute the power and cooling is really going to be dependent on that end user. And, you know, a lot of folks are going to be stopping there until they actually have a customer agreement in place that specifies what the design looks like.

Otherwise, you're going to have to charge them for standard builds, and then any augments to it are just going to cost more. So you're going to have to either pass those along or eat them, and I don't think anybody's going to be willing to eat them.

Bill: Yeah. Yeah. I guess the state of the art right now is an educated guess, which is not really a great business model.

John: It's not, but the good news is folks are planning further ahead.

Bill: And that was going to be my next question. You were talking about, you know, planning for what the customer wants. So with AI especially, but also just the general growth of [00:26:00] technology and the evolution of change, how much more of this is going into the core of a business, as opposed to something you could bolt on, and it's fine, it's nice to have? How is that changing customer planning cycles, designs, what's their horizon for all of this stuff? How far out are they looking?

John: Yeah. And I think, you know, especially, I mean, we still have the regular enterprise customers out there who are categorized as enterprise.

And I think the pool of enterprise has gotten smaller in terms of the verticals that they're in; you know, you're talking financial and healthcare, maybe some media, reason being, they're the ones that still have applications that they need to keep in house, and in house meaning in a data center environment that they control, like within a space that we operate.

But there are a couple of things that I see driving them to plan further ahead. And I believe everyone in the space, including AI and hyperscalers, are planning further ahead as well, partially because you have to. But you look at the consumption of some of these AI [00:27:00] companies that are coming in and gobbling up all of the available inventory out there that meets their needs.

It's just causing a lack of inventory to be available. Maybe five years ago, I was answering RFPs for what I'll call the enterprises, where they were looking for occupancy within the next six months, which means I already had to have it built, or it had to be at least in the final stages of build.

So we could commission and deliver, which means I already knew how we were building it, what the spec looked like. You were answering that RFP based on what your design was. Now I find myself answering them based on what my design will or could be. And you are seeing folks now planning their next three to five years on the enterprise side.

Here's what I'm going to need in the next three to five. I literally just answered one where the first delivery date is early 2027 for the first tranche of capacity in a staged deployment. So that's good. You know, I already have [00:28:00] plans for a build in the market that you're looking for. We know what our footprint looks like.

We know what utility power we're going to get. And basically, I'm at the point right now where I have enough time that, instead of answering their questions, you know, what is your resiliency, what is your design, are you Tier III, are you N+1 concurrently maintainable, I can answer: what do you want me to be?

Because that's another thing that we're seeing as well that we didn't talk about yet: we're seeing a lot of resiliency requirements being lowered. Rather than a design that is inherently redundant, I'm getting requests for N, requests of, you know, I don't need generators, I'd rather save the money on the deployment and just have N on UPS.

And no generators

Bill: can just fail over to a different data center or a cloud or something like that.

John: Yeah, exactly. Or, yeah, they're redundant locations. I really think some folks are just waking up to the fact that if you're redundant in your data centers, and you're [00:29:00] properly redundant,

what is the typical utility uptime in each market that you're going into? Maybe you're playing the odds, and the chances of separate utilities in two markets going down at the same time and impacting you, and, you know, you have an SLA anyway. The chances are you're going to have an event in any data center you go into. I mean, I'm in the space, but I have to say it's not if, it's when. But usually, thankfully, events are very short in time frame.
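
[Editor's note: the "playing the odds" reasoning, sketched with illustrative numbers. Assuming two markets whose utilities fail independently, each at a hypothetical 99.9% uptime (not a real utility statistic), the expected overlap of outages is tiny, which is the case some tenants are making for N at each site rather than N+1 or 2N.]

    # Expected simultaneous utility downtime across two redundant sites,
    # assuming independent failures; the 99.9% uptimes are assumed figures.
    SECONDS_PER_YEAR = 365 * 24 * 3600

    def both_down_seconds_per_year(uptime_a, uptime_b):
        return (1 - uptime_a) * (1 - uptime_b) * SECONDS_PER_YEAR

    print((1 - 0.999) * SECONDS_PER_YEAR)            # one market: ~8.8 hours/year down
    print(both_down_seconds_per_year(0.999, 0.999))  # both at once: ~31.5 seconds/year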

Typically, you're designed to absorb one side going down, whatever it is; it's why you're N+1, that's why you're concurrently maintainable, or 2N, or N+2, or whatever the requirement is that you're building out to. But we're seeing some folks back off on that, because obviously the cost on a per-kilowatt basis that we lease the space at is based on our cost of design, and we have a return profile: how long is it going to take me to make the proper amount of profit on

this space over what period of time? [00:30:00] That's what drives my rate per kW. And there's also the fact that supply and demand impacts market rates as well. But eliminating generators and half the UPSs, and maybe only doing one side of distribution, you have half the PDUs, all that. That's a lot of power

Bill: and equipment.

And yeah, that's a whole lot. Now, you mentioned utility power in there, which is a whole fascinating topic right now, especially, you know, across the US, right? California was just asking people not to charge their EVs between four and seven because demand was too high, and yet they want to electrify the whole fleet, and we're trying to build more data centers and put AI in everything, but we can't build new data centers because the power grids can't support it.

So, all of this stuff going on. It makes sense that you're looking now three or more years out so that the utilities can plan their power. But then what are you seeing in terms of capacity? Do you have to look that far out to get utility power? Is [00:31:00] that far enough out to get utility power? What's the state of the power world?

John: It's very market specific. In some markets, there's utility power down the road, and it's just, okay, it's going to take us a year to get it here on site. In others, it requires a new substation build. And in some cases, we're putting substations on site of our own and just taking transmission from the utility.

That's the biggest component in site selection, the first question you ask when an opportunity crosses your plate: you know, there's this data center available here, and how much power does it have? How much power can you get? Where are the conversations with the utility? Where is the commit letter?

That's number one priority.

Bill: And that takes us into green power, sustainable power, solar, wind, things like that. I saw something not too long ago where people were complaining that they couldn't get sustainable power for their homes because it's all allocated to hyperscale data centers, [00:32:00] and it was brought out like it was a bad thing.

And I was thinking, but they have guaranteed buyers for it, which will increase the supply, because they know they're not oversupplied if it's already all allocated. Like, that's amazing. It's a lot of power, but green power, what does that do for a data center? Right? Then the wind stops blowing and your data center crashes.

Ooh, how fun. How do you accommodate that kind of variable power supply, or do you?

John: I mean, I'd say we do and we don't. Everywhere we're building, we're still dependent on the utility. And when we talk about what the utility is using and what we're paying for, it's the credits: you're getting green power because of the percentage of green power that they have feeding their grid. But it's feeding their grid.

Basically, what you're getting is just, you know, a mix. You're not necessarily getting your power directly from the solar farms, or directly from the hydro, or directly from the wind farm. For us, that's just what's feeding [00:33:00] their grid, and that's what's coming to us. So if the wind stops blowing, it's not going to directly impact us.

It's just impacting the amount of power that they're being fed from that part of their generation. And I think the interesting thing about green power: it's still a very important topic, and I think it will continue to be. I think the evolution of the definition of green power will change over time, and we'll see what is actually included in that.

But it seemed, not but two or three years ago, a much more important topic for every end customer looking for data center space, where it was a requirement that you have some commitment to some form of green power. I answered an RFP about two years ago that required on-site, renewable, sustainable energy generation. That's gone away.

It's more of an "I'd like to have." With all of the demand that we're seeing in the space, it's: I'm going to take the power that I can get, where I can get it. [00:34:00] And what I'm getting at, with what's considered green and sustainable, and things that I'm hearing at conferences: if we continue to see this hockey-stick increase in demand for power, I don't know how you don't look to nuclear more.

Well, I mean, I shouldn't say "more"; there are more than one or two folks looking to nuclear today, and I think you'll see more of that in discussions and actual deployments in the very near future. It's just, I know that doesn't necessarily check the box for green and sustainable, because of the concerns about the waste and the containment.

And I guess technically it's not; there is a shelf life, right? So it's not necessarily sustainable. But is it renewable? I don't know. That's where I think there's got to be flexibility in the definition.

Bill: Yeah, it kind of depends on who's giving the definition. Some people say nuclear is green and sustainable, and some people say it's not.

Yeah. But it's better than coal. I think we can agree on that.

John: I think we can. Yeah. And in, you know, some markets [00:35:00] we might be in, they're still burning a lot of diesel fuel to provide power to the grid.

Bill: So jumping tracks a little bit. You're involved with a group called the Nomad Futurists. Can you tell us a little bit about that? I assume that's more of a passion project than a work thing and kind of interesting.

John: Yeah, it's an organization I really support because of what they're doing. And the theme is kind of something that I was poking fun at for a number of years.

I've been on the data center side of this industry, sorry, I talked about starting out in telecom, right? But I've been in the data center space since 2003. And when I was going to conferences back then, I was the young guy at the conference. When I look around the conferences today, I'm not one of the old guys yet.

And there's just

Bill: the conferences get older,

John: The conferences get older. You know, there's more gray hair when you look [00:36:00] around the room. And most of the young people you see in the industry are in certain parts of the industry, on the marketing and events side, or the PR side. It's interesting to me when you look at the core of the folks that I'm meeting with; I have friends that I've worked with at the same companies, and now they've just moved on.

They're buyers and developers and users. The industry is getting a lot older, and, you know, where are the young people in this space, when what we do is so critical to everything that happens around us? Every one of these devices that you hold in your hand, what we're doing here today, requires a data center, requires compute.

So where is the content in the schools, whether it's trade schools or universities? What Nomad Futurist is doing is they're out working with colleges and universities and high schools and young people to educate them on the data center space. And the beautiful thing about the data center space as well is [00:37:00] there is a lot of back office. Our company, we're a data center provider.

And when you go to our office, most of the people there are technically in the data center space. You know, we have finance folks and human resource folks, but it is good for them to know and understand what the data center world is. So I've participated with them on some in-person and some virtual meetings.

I've given virtual tours of our data centers to classrooms of college students, to help them learn about what you can study in school if you're interested in the data center world. Now, I'm not the technical person, we talk a bit technically here, right? But you can go out and get a business degree and then enter the data center space, and understand what it means from a real estate and investment perspective, from a sales and marketing perspective.

But we're trying to interest young people in the data center space so that we bring some more young people into this industry, so we all, God willing, just, you [00:38:00] know, get to retire knowing that we're leaving this space in good hands.

Bill: Yeah. What's the reception that you get when you introduce the idea and walk them through a data center and talk about the possibilities?

John: It's great. It's great because, you know, the light bulb goes off when they realize that they use data centers every day. Multiple times a day. All day. I mean, when do you not see a young person with a device in their hands? Either, you know, just communicating with somebody, or very likely creating or consuming content.

You're using data centers. You're big drivers of what we do and the demand for what we build. And when they understand that, that the cloud isn't this mysterious place in the sky, that it's actually physical and tangible, and all your stuff actually sits here, you know. We kind of walk through what a communication might look like, or, if you're creating a TikTok, how it lives somewhere on a physical device, multiple physical devices, in order for you to have access to it.

Here's how your [00:39:00] network provider gets to a physical fiber cable, that gets to a data center, that gets to wherever that content is sitting and living, to get it back to you. And it has to happen just like that.

Bill: Yeah. The best kept secret. The cloud is actually a place

John: People build it. I won't mention the specific person by name, but somebody hosted a financial TV show many years ago, when Equinix was going public, and was bashing the stock, because everything's moving to the cloud.

We don't need Equinix, we don't need data centers, it's all moving to the cloud. And I go, my goodness, here are people that are actually giving out financial advice who have no idea that the cloud is actually a real, physical, tangible thing. How do you not know this? What's common knowledge to us, and I talk to my own kids and my wife,

is not nearly common knowledge to most people out there. And why should it be? I mean, I live and breathe it every day, and I understand how our end users are using our space, [00:40:00] and it's just lost on people who are using it every day that they're actually using these facilities that we build and operate.

Bill: That's fantastic. Huh. Okay. Good stuff. John, that was a super fast time. It has flown by and we should probably wrap up here. So how can people keep up with you online? Keep up with your work. Maybe even connect into the Nomad Futurists.

John: Yeah, absolutely. I mean, of course, I'm on LinkedIn and anybody can message me and connect to me through LinkedIn.

For 1547, visit our website; info comes directly to my inbox. I'm a pretty accessible person. And, you know, Nomad Futurist is the same. They're on every social media platform, including LinkedIn, which is where I work with them and track them mostly. So yeah, check us out.

Bill: Fantastic. Love it. Thanks so much for the time, John.

This was a great conversation.

John: Pleasure. Thank you. Really enjoyed it.

Ad read: Capitalize on your edge [00:41:00] to generate more value from your data. Learn more at dell dot com slash edge.