The Wild West of Data Centers: Energy and water use top concerns
Data center in Haymarket, Virginia. Credit: Hugh Kenny.
By Anthony Flint, December 18, 2025
It’s safe to say that the proliferation of data centers was one of the biggest stories of 2025, prompting concerns about land use, energy and water consumption, and carbon emissions. The massive facilities, driven by the rapidly increasing use of artificial intelligence, are sprouting up across the US with what critics say is little oversight or long-term understanding of their impacts.
“There is no system of planning for the land use, for the energy consumption, for the water consumption, or the larger impacts on land, agricultural [and forest] land, historic, scenic, and cultural resources, biodiversity,” said Chris Miller, president of the Piedmont Environmental Council, who has been tracking the explosion of data centers in northern Virginia, on the latest episode of the Land Matters podcast.
“There’s no assessment being made, and to the extent that there’s project-level review, there’s a lot of discussion about eliminating most of that to streamline this process. There is no aggregate assessment, and that’s what’s terrifying. We have local land use decisions being made without any information about the larger aggregate impacts in the locality and then beyond.”
Miller appeared on the show alongside Lincoln Institute staff writer Jon Gorey, author of the article Data Drain: The Land and Water Impacts of Data Centers, published earlier this year, and Mary Ann Dickinson, policy director for Land and Water at the Lincoln Institute, who is overseeing research on water use by the massive facilities. All three participated in a two-day workshop earlier this year at the Lincoln Institute’s Land Policy Conference: Responsive and Equitable Digitalization in Land Policy.
There is no federal registration requirement for data centers, and owners can be secretive about their locations for security reasons and competitive advantage. But according to the industry database Data Center Map, there are at least 4,000 data centers across the US, with hundreds more on the way.
A third of US data centers are in just three states, with Virginia leading the way followed by Texas and California. Several metropolitan regions have become hubs for the facilities, including northern Virginia, Dallas, Chicago, and Phoenix.
Data centers housing computer servers, data storage systems, and networking equipment, as well as the power and cooling systems that keep them running, have become necessary for high-velocity computing tasks. According to the Pew Research Center, “whenever you send an email, stream a movie or TV show, save a family photo to ‘the cloud,’ or ask a chatbot a question, you’re interacting with a data center.”
The facilities use a staggering amount of power; a single large data center can gobble up as much power as a small city. The tech companies initially promised to use clean energy, but with so much demand, they are tapping fossil fuels like gas and coal, and in some instances even considering nuclear power.
Despite their outsized impacts, data centers are largely being fast-tracked, in many cases overwhelming local community concerns. They’re getting tax breaks and other incentives to build with breathtaking speed, alongside a major PR effort that includes television ads touting the benefits of data centers, such as the jobs they provide in areas that have been struggling economically.
Listen to the show here or subscribe to Land Matters on Apple Podcasts, Spotify, Stitcher, YouTube, or wherever you listen to podcasts.
Further Reading
Supersized Data Centers Are Coming. See How They Will Transform America | The Washington Post
Thirsty for Power and Water, AI-Crunching Data Centers Sprout Across the West | Bill Lane Center for the American West
Project Profile: Reimagining US Data Centers to Better Serve the Planet in San Jose | Urban Land Magazine
A Sustainable Future for Data Centers | Harvard John A. Paulson School of Engineering and Applied Sciences
New Mexico Data Center Project Could Emit More Greenhouse Gases Than Its Two Largest Cities | Governing magazine
Anthony Flint is a senior fellow at the Lincoln Institute of Land Policy, host of the Land Matters podcast, and a contributing editor of Land Lines.
Transcript
Anthony Flint: Welcome back to the Land Matters Podcast. I’m your host, Anthony Flint. I think it’s safe to say that the proliferation of data centers was one of the biggest stories of 2025, and at the end of the day, it’s a land use story braided together with energy, the grid, power generation, the environment, carbon emissions, and economic development – and, the other big story of the year, to be sure, artificial intelligence, which is driving the need for these massive facilities.
There’s no federal registration requirement for data centers, and sometimes owners can be quite secretive about their locations for security reasons and competitive advantage. According to the industry database Data Center Map, there are at least 4,000 data centers across the US. Some would say that number is closer to 5,000, but unquestionably, there are hundreds more on the way.
A third of US data centers are in just three states, with Virginia leading the way, followed by Texas and California. Several metropolitan regions have become hubs for these facilities, including Northern Virginia, Dallas, Chicago, and Phoenix, and the sites tend to get added onto: half of the data centers currently being built are part of a preexisting large cluster, according to the International Energy Agency.
These are massive buildings housing computer servers, data storage systems, and networking equipment, as well as the power and cooling systems that keep them running. That’s according to the Pew Research Center, which points out that whenever you send an email, stream a movie or TV show, save a family photo to the cloud, or ask a chatbot a question, you’re interacting with a data center. They use a lot of power, which the tech companies initially promised would be clean energy, but now, with so much demand, they’re turning largely to fossil fuels like gas and even coal, and in some cases, considering nuclear power.
A single large data center can gobble up as much power as a small city, and they’re largely being fast-tracked, in many cases, overwhelming local community concerns. They’re getting tax breaks and other incentives to build with breathtaking speed, and there’s a major PR effort underway to accentuate the positive. You may have seen some of those television ads touting the benefits of data centers, including in areas that have been struggling economically.
To help make sense of all of this, I’m joined by three special guests: Jon Gorey, author of the article Data Drain: The Land and Water Impacts of Data Centers, published earlier this year in Land Lines magazine; Mary Ann Dickinson, Policy Director for Land and Water at the Lincoln Institute; and Chris Miller, President of the Piedmont Environmental Council, who’s been tracking the explosion of data centers in Northern Virginia.
Well, thank you all for being here on Land Matters, and Jon, let me start with you. You’ve had a lot of experience writing about real estate and land use and energy and the environment. Have you seen anything quite like this? What’s going on out there? What were your takeaways after reporting your story?
Jon Gorey: Sure. Thank you, Anthony, for having me, and it’s great to be here with you and Mary Ann, and Chris too. I think what has surprised me the most is the scale and the pace of this data center explosion and the AI adoption that’s feeding it. When I was writing the story, I looked around the Boston area to see if there was a data center that I could visit in person to do some on-the-ground reporting.
It turns out we have a bunch of them, but they’re mostly from 10, 20 years ago. They’re pretty small. They’re well-integrated into our built environment. They’re just tucked into one section of an office building or something next to a grocery store. They’re doing less intensive tasks like storing our emails or cell phone photos on the cloud. The data centers being built now to support AI are just exponentially larger and more resource-intensive.
For example, Meta is planning a 715,000-square-foot data center outside the capital of Wyoming, which is over 16 acres of building footprint by itself, not even counting the grounds around it. That one facility will use more electricity than every home in Wyoming combined. That’s astonishing. The governor there touted it as a win for the natural gas industry locally, so they’re not necessarily going to supply all that energy with renewables. Then there’s just the pace of it. Between 2018 and 2021, the number of US data centers doubled, and then it doubled again by 2024.
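[A quick check on that footprint, using the standard conversion of 43,560 square feet per acre:
\[ \frac{715{,}000\ \text{ft}^2}{43{,}560\ \text{ft}^2/\text{acre}} \approx 16.4\ \text{acres}, \]
consistent with the “over 16 acres” figure Gorey cites.]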
In 2023, when most people were maybe only hearing about ChatGPT for the first time, US data centers were already using as much electricity as the entire country of Ireland. That’s poised to double or triple by 2028. It’s happening extremely fast, and they are extremely big. One of the big takeaways from the research, I think, was how this creates this huge cost-benefit mismatch between localities and broader regions like in Loudoun County, Virginia, which I’m sure Chris can talk about.
The tax revenue from data centers, that’s a benefit to county residents. They don’t have to shoulder as much of the bills for schools and other local services. But the electricity and the water and the infrastructure and the environmental costs associated with those data centers are more dispersed. They’re spread out across the entire utility service area in the form of higher water rates, higher electric rates, more pollution. That’s a real discrepancy, and it’s happening pretty much anywhere one of these major data centers goes up.
Anthony Flint: Mary Ann Dickinson, let’s zoom in on how much water these data centers require. I was surprised by that. In addition to all the power they use, I want to ask you, first of all, why do they need so much water, and where is it coming from? In places like the Southwest, water is such a precious resource that’s needed for agriculture and people. It seems like there’s a lot more work to be done to make this even plausibly sustainable.
Mary Ann Dickinson: Well, water is the issue of the day right now. We’ve heard lots of data center discussion about energy; that’s primarily been the focus of a lot of media reporting during 2025. Water is now emerging as an issue that is dwarfing a lot of local utility systems. Data centers use massive amounts of water, anywhere between 3 and 5 million gallons a day, and it’s primarily, to answer your question, for cooling. It’s a much larger draw than that of most large industrial water users in a community water system.
The concern is that if the data centers are tying into local water utilities, which they prefer because of the affordability and the reliability and the treatment of the supply, that can easily swamp a utility system that is not accustomed to that continuous, constant draw. These large hyperscale data centers that are now being built can use hundreds of millions of gallons yearly. That’s equivalent to the water usage of a medium-sized city.
To Jon’s point, if you look at how much water is being consumed by a data center in very water-scarce areas, in the West in particular, you wonder where that water is going to come from. Is it going to come from groundwater? Is it going to come from surface water supplies? How is that water going to be managed and basically replaced back into the natural systems, like rivers, from which it might be withdrawn? The Colorado River, of course, is a prime example of an over-allocated river system.
What is all this water going for? Yes, it’s going for cooling and humidification in the data centers; that’s what they’re calling direct use. But there’s also indirect use, which is the water it takes to generate the electricity that supplies the data center. The data center energy loads are serious, and Chris can talk about the grid issues as well, but a lot of that water is actually indirectly used to generate electricity, as well as directly used to cool those chips.
This indirect use can be substantial: it can be equivalent to about half a gallon per kilowatt-hour, which adds up to a fair amount of water just for providing that electricity. What we’re seeing is that the average hyperscale data center uses about half a million gallons of water a day. That’s a lot of water to come from a local community water system. It’s a concern, especially in water-scarce regions where water is already so short that farmers are being asked to fallow fields. How is the data center water load going to be accommodated within these water systems?
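[To make the indirect figure concrete, a rough back-of-the-envelope sketch: a hypothetical 100-megawatt facility running around the clock, at the half-gallon-per-kilowatt-hour rate Dickinson cites, would account for
\[ 100{,}000\ \text{kW} \times 24\ \text{h/day} \times 0.5\ \text{gal/kWh} = 1{,}200{,}000\ \text{gal/day} \]
of indirect water use alone, more than double the half-million-gallon daily figure she cites for direct use.]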
The irony is the data centers are going into these water-scarce regions. There was a Bloomberg report that showed that, actually, water-scarce regions were the most popular locations for these data centers because they were proximate to areas of immediate use. That, of course, means California, it means Texas and Phoenix, Arizona, those states that are already struggling with providing water to their regular customers.
It’s a dilemma, and it’s one that we want to look at a lot more closely to help protect the community water systems and give them the right questions to ask when the data center comes to town and wants to locate there, and help them abate the financial risk that might be associated with the data center that maybe comes and then goes, leaving them with a stranded asset.
These are all complex issues. The tax issues tie into the water issues, because the water utility system and the impacts to that system might not be covered by whatever tax revenues are coming in. As sizable as they might be, they still might not be enough to cover infrastructure costs that would otherwise be assessed to the utility ratepayers. We’re seeing this on the energy side. We’re seeing electric rates go up. At the same time, we know these data centers are necessary, given what we’re now doing as a society in terms of AI and digital computing.
We just have to figure out the way to most sustainably deal with it. We’re working with technical experts, folks from the Los Alamos National Lab, and we’re talking with them about the opportunities for using recycled water, using other options that are not going to be quite as water-consumptive.
Anthony Flint: Yes, we can talk more about that later in the show. Different approaches, like using gray water or recycled water, sound like a promising idea, because at the end of the day, there’s only so much water, right? Chris Miller, from the Piedmont Environmental Council, you pointed out, in Jon’s story, that roughly two-thirds of the world’s internet traffic essentially passes through Northern Virginia, and the region already hosts the densest concentration of data centers anywhere in the world. What’s been the impact on farmland, energy, water use, carbon emissions, everything? Walk us through what it’s like to be in such a hot spot.
Chris Miller: The current estimate is that Virginia has over 800 data centers. It’s a little hard to know because some of them are dark facilities, so not all of them are mappable, but the ones we’ve been able to map, that’s what we’re approaching. For land use junkies, there’s about 360 million square feet of built, approved, or in-the-pipeline applications for data centers in the state. That’s a lot of footprint. The closest comparison I could make that seemed reasonable: all of Northern Virginia has about 150 million square feet of commercial retail space.
We are looking at a future where just the footprint of the buildings is pretty extraordinary. We have sites that are one building, one gigawatt, almost a million square feet, 80 feet high. You just have to think about that: one gigawatt is the amount of power that a nuclear reactor can produce at peak load. We’re building those kinds of buildings on about 100 to 150 acres. Those are not particularly large parcels of land, yet they carry an extraordinary power density of electricity demand, which is just hard to wrap your head around.
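[Taking Miller’s round numbers at face value, a one-gigawatt building of roughly a million square feet implies
\[ \frac{1{,}000{,}000\ \text{kW}}{1{,}000{,}000\ \text{ft}^2} = 1\ \text{kW per square foot} \]
of electricity demand across the entire footprint.]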
The current estimate in Virginia for the aggregate peak load increase in electricity demand exclusively from data centers is about 50 gigawatts in the next 20 years. That’ll be a tripling of the existing system. Now, more and more, the utilities, the grid regulators, and the grid monitor for PJM, which is a large regional transmission organization that runs from Chicago all the way to North Carolina, are saying the existing system is near a breaking point, maybe in the next three years. If all the demand came online, you would have brownouts and blackouts throughout the system. That’s pretty serious. It’s a reflection of the general problem, which is that there is no system of planning for the land use, for the energy consumption, for the water consumption, or for the larger impacts on land: agricultural and forestal land, historic, scenic, and cultural resources, biodiversity sites. There’s no assessment being made.
To the extent that there’s project-level review, there’s a lot of discussion about eliminating most of that to streamline this process. There is no aggregate assessment. That’s what’s terrifying. We have local land use decisions being made without any information about the larger aggregate impacts in the locality and then beyond. Then the state and federal governments are issuing permits without having really evaluated the combined effect of all this change.
I think that’s the way we’re looking at it. Change is inevitable. Change is coming. We should be doing it in a way that’s better than the way we’ve done it before, not worse. We need to do it in a way that starts with an honest assessment of the scale and scope, the aggregate impacts, and then applies the ingenuity and creativity of both the tech industry and the larger economy to minimize the impact that this has on communities and the natural resources on which we all depend.
It’s getting to the point of being very serious. Virginia is water-constrained. It doesn’t have that reputation, but our water supply systems are all straining to meet current demand. The only assessment we have on the effect of future peak load from data centers is by the Interstate Commission on the Potomac River Basin, which manages the water supply for the Washington metropolitan region in five states.
Their conclusion is that, in the foreseeable future, by 2040, we reach a point where consumption exceeds supply. Think about that. We’re moving forward with [facilities] even as they create a shortage of water supply in the nation’s capital. It’s being done without any oversight or direction. The work of the Lincoln Institute and groups like PEC is actually essential, because the governmental entities are paralyzed. They’re paralyzed by a lack of policy structure, and they’re also paralyzed by politics, which is caught up in the perception that this is the next economic opportunity, one that funds the needs of the community.
The fact is, the impacts may outweigh the benefits. We have to buckle down and realize this is the future. How do we help state, local, and federal governments build decision models that take into account the enormous scale and scope of the industry, and figure out how to fix the broken systems and make them better than they were before? I think that’s what all of us have been working on over the last five years.
Anthony Flint: It really is extraordinary, for those of us in the world of land use and regulations. We’ve heard a lot about the abundance agenda and how the US is making it more difficult to build things and infrastructure. Whether it’s clean energy or a solar farm or a wind farm, they have to go through a lot of hoops. Housing, same way. And here you have, not just any land use, but an incredibly impactful land use that is seemingly not getting any of that oversight or being made to go through those hoops.
Chris Miller: They are certainly cutting corners. Jon mentioned the facility outside of Boston. What did you say, 150 acres? We have a site adjacent to the Manassas National Battlefield Park, which is part of the national park system, called the Prince William Digital Gateway, which is an aggregation of 2,100 acres with plans for 27 million square feet of data centers with a projected energy demand of up to 7.5 gigawatts. The total base load supply of nuclear energy available in Virginia right now is just a little bit over 3 gigawatts.
The entire offshore wind development project at Dominion, which is 80% complete and has been big and controversial, is 2.5 gigawatts. The two biggest sources of base load supply aren’t sufficient to meet 24/7 demand from a land use proposal on 2,100 acres, 27 million square feet, that was made without assessing the energy impact, the supply of water, or the impact of infrastructure on natural, cultural, and historic resources, one of which is hallowed ground. It’s a place where two significant Civil War battles were fought. It’s extraordinary.
What’s even more extraordinary is to have public officials, senators, congressmen, members of agencies say, “We’re not sure what the federal next steps [are].” These are projects that have interstate effects on power, on water, on air quality. We haven’t talked about that, but one of the plans that’s been hatched by the industry is to rely on onsite generation and take advantage of the backup generation that they’ve built out. They have to provide 100% backup generation onsite for their peak load, and 90% of that is diesel, without significant air quality controls.
We have found permits for 12.4 gigawatts of diesel in Northern Virginia. That would bust the ozone and PM2.5 regulatory standards for public health if they operated together. It’s being discussed by the Department of Environmental Quality in Virginia as a backup strategy for meeting power demand so that data centers can operate without restriction. These are choices that are being proposed without any modeling, without any monitoring, and without any assessment of whether those impacts are in conflict with other public policy goals, like human health. Terrifying.
We are at a breaking point. I have to say that the grassroots response is a pox upon all your houses. That was reflected in the 2025 elections that Virginia just went through: a tidal wave of change in the General Assembly and statewide offices, and data centers and energy costs were very, very high on the list of concerns for voters.
Anthony Flint: I want to ask all three of you this question, but Jon, let me start with you. Is there any way to make a more sustainable data center?
Jon Gorey: Yes, there are some good examples here and there. It is, in some cases, in their best interest to use less electricity, and it’ll be less expensive for them to use less water. Google, for its part, has been more transparent than some companies in its environmental reporting. They compare their water use to the amount used to irrigate golf courses, which comes across as not a great comparison, because golf courses are not a terrific use of water either.
They do admit that last year, in 2024, they used about 8.1 billion gallons of water in their data centers, the ones that they own, a 28% increase over the year before, and that 14% of that was in severely water-stressed regions. Another 14% was in regions of medium stress. One of their data centers, in Council Bluffs, Iowa, consumed over a billion gallons of water by itself. They also have data centers, like in Denmark and Germany, that use barely a million gallons over the course of a year.
I don’t know if those are just very small ones, but I know they and Microsoft and other companies are developing alternatives. There’s immersion cooling, where instead of using evaporative water cooling to cool off the entire room that the servers are in, you can basically dunk the chips and servers in a synthetic oil that conducts heat but not electricity. It’s more expensive to do, but it’s completely possible. There are methods. There’s maybe some hope there that they will continue to do that more.
Mary Ann Dickinson: Immersion cooling, which you’ve just mentioned, is certainly an option now, but what we’re hearing is that it’s not going to be an option in the future: because of the increasing power density of chips, they are going to need direct liquid cooling, period, and immersion cooling is not going to work. That’s the frightening part of the whole water story: however much or little water is being used now, it’s going to pale against the water that’s going to be used in the next 5 to 10 years by the new generation of data centers and the new chips that they’ll be using.
The funny thing about the golf course analogy is that, in the West, a lot of those golf courses are irrigated with recycled water. As Chris knows, it also recharges back into groundwater; it is not lost as consumptive loss. That’s the issue: to make these sustainable, we’re really going to need to examine the water cooling systems, what the evaporative loss is, what the discharge is to sewer systems, what the potential is for recycled water. There are going to be a whole lot of questions that we’re going to ask, but we’re not getting any data.
Only a third of the data centers nationally even report their energy and water use. The transparency issue is becoming a serious problem. Many communities are being asked to sign NDAs; they can’t even share with their citizens how much energy and water a data center is using. It is a little bit of a challenge to try and figure out the path going forward. It’s all about economics, as Chris knows. It’s all about what can be afforded.
In the work we’re doing at the Lincoln Institute, we would like to suggest as many sustainable options from the water perspective as possible, but they’re going to have to be paid for somewhere. That is the big question. Data centers need to pay.
Chris Miller: I think we’re entering a [time] where innovation is necessary and has to be encouraged, and where we’re facing a crisis just short of what we saw with the collapse of the banking system in 2008 and 2009, when no one was really paying attention to the aggregate, system-wide failures. Somebody had to step up and say it’s broken. In the case of the mortgage crisis, it was actually 49 states coming to a court, saying, “We have to have a settlement so that we can rework all these mortgages and settle out the accounts and rebuild the system from the ground up.”
I think that’s the same place we’re at. We have to have a group of states get together and say, “We are going to rebuild the decision model that we use for this new economy.” It’s not going away. Any gains in efficiency are going to be offset by the expansion in demand for data; that’s been the trend for the last 15 years. We have to deal with the scale and the scope of the issue. I’ll give you just one example.
Dominion Energy has published aggregated contract totals of 47.1 gigawatts of demand that they have to meet. Their estimate of the CapEx to do that ranges from $141 billion to $271 billion, depending on whether they comply with the goals of the Virginia Clean Economy Act and move toward decommissioning and replacing existing fossil fuel generation with cleaner sources. That range is not the issue. It’s the bottom line, which is roughly $150 billion to $300 billion in CapEx in one state for energy infrastructure. That’s enormous. We need a better process than a case-by-case review of individual projects.
The State Corporation Commission does not maintain a central database of the transmission and generation projects it approves. The state DEQ does not have a central database for water basin supply and demand. Nor does the DEQ have a database of all of the permits in a model that shows what the impacts of backup generation would be if they all turned on at the same time in a brownout or blackout scenario. That failure to do systems analysis desperately needs to be addressed, and it’s not going to be done by this administration at the federal level.
It’s going to take state governments working together to build new systems-level decision tools, informed by the expertise of places like the Lincoln Institute, so that they’re looking at this as a large-scale systemic process. We build it out in a way that’s rational, that takes into account the impacts on people and on communities and on land, and that fairly distributes the cost back to the industry that’s triggering the demand.
This industry is uniquely able to charge the whole globe for the use of certain parts of America as the base of its infrastructure. We should be working very hard on a cost allocation model and an assignment of costs to the data center industry that can recapture the economic value and pay themselves back from the whole globe. There’s no reason for the ratepayers of Virginia or Massachusetts or Arizona or Oregon to be subsidizing the seven largest corporations in the world, with [capital expenditures] of over $22 trillion. It’s unfair, it’s un-American, it’s undemocratic.
We have to stand up to what’s happening and realize how big it is and realize it’s a threat to our way of life, our system of land use and natural resource allocation, and, frankly, democracy itself.
Anthony Flint: I want to bring this to a conclusion, although certainly there are many more issues we could talk about, but I want to look at the end user in a way and whether we as individuals can do anything about using AI, for example. I was talking with Jon, journalist-to-journalist, about this. I want to turn to you, Jon, on this question. Should we be trying not to use AI, and is that even possible?
Jon Gorey: The more I researched this piece, the more adamant I became that I shouldn’t be using it where possible. Not that that’s going to make any difference, but to me, it felt like I don’t really want to be a part of it. I expect there are legitimate and valuable use cases for AI in science and technology, but I am pretty shocked by how cavalier people I know, my friends and family, have been in embracing it.
Part of that is that tech companies are forcing it on us because they’ve invested in it. They’re like, “Hey, we spent all this money on this, you’ve got to use it.” It takes some legwork to remove the Google Assist from your Google searches or to get Microsoft Copilot to just leave you alone. I feel like that’s like its ancestor Clippy, the paperclip from Microsoft Office back in the day.
Here’s something that galls me more in a broader sense. I don’t know if we want to get into it, but I’m an amateur musician. I say amateur because it’s already very difficult to make any money in the arts. There’s a YouTube channel with 35 million subscribers that simply plays AI-generated videos of AI-generated music, which is twice as many subscribers as Olivia Rodrigo has and 20 times as many as Gracie Abrams. Both of them are huge pop stars who sell out basketball arenas. It astounds me, and I don’t know why people are enjoying just artificially created things. I get the novelty of it, but I, for one, am trying to avoid stuff like that.
Chris Miller: We were having a debate about this issue this week on a series of forums. The reality is there’s stuff that each of us can do to significantly reduce our data load. It takes a little bit of effort. Most of us are storing two or three times what we need to, literally copies of things that we already have. There’s an efficiency-of-storage exercise that takes time, and that’s why we don’t do it. And there’s using devices appropriately.
If you can watch a broadcast television show and not stream it, that’s a significant reduction in load, actually. Ironically, we’ve gone from broadcast through the air, which has very little energy involved, to streaming on fiber optics and cable, and then wireless, which is incredibly resource-intensive. We’re getting less efficient in some ways in the way we use some of these technologies, but there are things we can do.
The trend in history has been that those gains don’t actually change overall demand. I think we need to be careful, as we think about all the things we can do as individuals, not to lose sight of the need for the aggregate response, the society-wide response, which is that this industry needs to check itself, but it also needs to have proper oversight. The notion that somehow they’re holier than the rest of us is totally unsustainable.
We have to treat them as the next gold rush, the next offshore drilling opportunity, and understand that what they are doing is globally impactful: setting us back on the overall need to address climate change and energy consumption, and threatening the basic systems for water, land, and air quality that are the basis of human life. If those aren’t a big enough threat, then we’re in big trouble.
Anthony Flint: Mary Ann, how about the last word?
Mary Ann Dickinson: When I looked it up and saw that every Google search I do, which is AI-backed these days, uses half a liter of water, each one, and you think about the billions of searches that happen across the globe, this is a frightening issue. I’m not sure our individual actions are going to make that big a difference in AI demand, but what we can require, in the siting of these facilities, is that they not disrupt local sustainability and resiliency efforts. That, I think, is what we want to focus on at the Lincoln Institute: helping communities do that.
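[As an illustration only: assuming, hypothetically, five billion AI-backed searches a day at the half-liter figure Dickinson cites, the total would be
\[ 5{,}000{,}000{,}000 \times 0.5\ \text{L} = 2.5\ \text{billion liters per day} \approx 660\ \text{million gallons per day}. \]
The search-volume number here is a round assumption for scale, not a measured figure.]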
Anthony Flint: Jon Gorey, Mary Ann Dickinson, and Chris Miller, thank you for this great conversation on the Land Matters Podcast. You can read Jon Gorey’s article, Data Drain, online at our website, lincolninst.edu. Just look for Land Lines magazine in the navigation. On social media, the handle is @landpolicy. Don’t forget to rate, share, and subscribe to the Land Matters Podcast. For now, I’m Anthony Flint signing off until next time.