
Data Drain: The Land and Water Impacts of the AI Boom

By Jon Gorey, October 17, 2025

A faint hum rises from deep within a vast, dimly lit tomb, whose occupant devours energy and water with a voracious, inhuman appetite. The beige, boxy data center is a kind of vampire: pale, immortal, thirsty. Shielded from sunlight, active all night. And, like a vampire, at least according to folklore, it can only enter a place where it has been invited in.

In states and counties across the United States, lawmakers aren't just opening the door to these metaphorical mechanical monsters. They are actively luring them in, with tax breaks and other incentives, eager to collect new municipal revenue and claim a share of the explosive growth surrounding artificial intelligence.

That may sound hyperbolic, but data centers really do devour resources. A midsize data center consumes as much water as a small city, while the largest require up to 18.9 million liters of water every day, as much as a city of 50,000 people.

It also takes a staggering amount of electricity to power and cool the rows of servers. A conventional data center, such as the cloud storage behind the work documents we use every day or streaming video, consumes as much electricity as 10,000 to 25,000 homes, according to the International Energy Agency. But a newer, AI-focused "hyperscale" data center can use as much energy as 100,000 homes or more. Meta's Hyperion data center in Louisiana, for example, is expected to consume more than twice as much energy as the entire city of New Orleans once completed. Another Meta data center, planned in Wyoming, will use more electricity than every home in the state combined.

And of course, unlike actual clouds, data centers require land. A lot of it. Some of the largest data centers being built today will cover hundreds of hectares with impervious steel, concrete, and paved surfaces, land no longer available for farming, nature, or housing, and will also require new transmission-line corridors and other associated infrastructure.

Yet data centers have been part of our built landscape for more than a decade; many are tucked away in unassuming office parks, quietly processing our web searches and storing our cell phone photos. So why the sudden concern? Artificial intelligence tools trained on large language models, such as OpenAI's ChatGPT, use exponentially more computing power than traditional cloud services. And the biggest technology companies, including Amazon, Meta, Google, and Microsoft, are making rapid, massive investments in AI.

Between 2018 and 2021, the number of US data centers more than doubled, and, driven by AI investments, that number has already doubled again. Early in the AI boom, in 2023, US data centers consumed 176 terawatt-hours of electricity, roughly as much as the entire nation of Ireland (whose power grid already runs near full capacity, pushing data centers to rely on polluting off-grid generators), and that consumption is expected to double or even triple as soon as 2028.

This rapid proliferation can put enormous pressure on local and regional resources, burdens that many host communities don't fully account for, or aren't prepared to bear.

"The demand for data centers and processing has just exploded exponentially because of AI," says Kim Rueben, former senior adviser on tax systems at the Lincoln Institute of Land Policy. Virginia and Texas, she explains, have long offered tax incentives to attract new data centers, and "other states are jumping on the bandwagon" in hopes of seeing economic growth and new tax revenue.

But at a Land Policy and Digitalization conference convened by the Lincoln Institute last spring, Rueben compared the extractive nature of data centers to coal mines. "I don't think places recognize all of the costs," she says.

Yes, Virginia, the Data Are Real

At the conference, Chris Miller, president of the Piedmont Environmental Council (PEC), explained how roughly two-thirds of the world's internet traffic passes through Northern Virginia. The region is already home to the densest concentration of data centers anywhere in the world, with some 300 facilities in just a handful of counties. Dozens more are planned or under development, poised to consume the region's available farmland, energy, and water, drawn by a state incentive that lets companies save more than $130 million in sales and use taxes each year.

Despite the statewide tax break, data centers contribute significantly to local coffers. In Loudoun County, which has more than 2.5 million square meters of occupied data center space, officials expect total property tax revenue collected from local data centers in fiscal year 2025 to approach $900 million, nearly as much as the county's entire operating budget. The share of revenue derived from data centers has grown so large that the county's board of supervisors is considering adjusting the tax rate to avoid relying so heavily on a single source.

Existing and planned data centers in Northern Virginia. The state has been dubbed "the data center capital of the world." Credit: Piedmont Environmental Council.

While many communities see data centers as an economic boon because of the tax revenue, the facilities themselves are not big long-term job creators. Most of the jobs they generate are rooted in building the data centers, not in their ongoing operation, and are therefore largely temporary.

Decades ago, PEC supported some data center development in Northern Virginia, says Julie Bolthouse, director of land use at PEC. But the industry has changed drastically since then. When AOL was headquartered in what is now known as Data Center Alley, for example, the company's data center was a small part of a larger campus that "had walking trails around it, tennis courts, basketball courts … at its height, the company had 5,300 employees on site," Bolthouse recounts. The facility has been demolished, and three large data centers are being built in its place. "There's a big fence around it for security, so it's now completely walled off from the community, and it's only going to employ about 100 to 150 people on that same land. That's the difference."

Utility use has also become "massive," Bolthouse adds. "Each of those buildings uses the energy equivalent of a city, so there are huge consequences for our communities' electric infrastructure. All the transmission lines that have to be built, the eminent domain to acquire land for those transmission lines, all the energy infrastructure, the gas plants, the pipelines carrying the gas, the associated air pollution, the climate impacts tied to all of it."

Across Northern Virginia, each of the thousands of on-site diesel generators, each the size of a railcar, gives off diesel fumes, creating air quality problems. "I don't know of any other land use that uses as many generators as a data center," Bolthouse says. And while those generators are officially classified as emergency backup power, data centers can use them to "meet demand" for 50 hours at a stretch, she adds. "Locally, the air is heavily polluted. It's particulate matter and NOx [nitrogen oxides], which affects children's lung development, can trigger asthma, and can exacerbate heart disease and other underlying conditions in older adults."

And then there's the question of water.

"Like a Giant Soda Straw"

A study by the Houston Advanced Research Center (HARC) and the University of Houston found that data centers in Texas will use about 185.5 billion liters of water in 2025, and up to 1.5 trillion liters by 2030. That would be equivalent to dropping the water level of the largest reservoir in the US, the 63,500-hectare Lake Mead, by more than 4.88 meters in a single year.

Anyone who has accidentally left a phone out in the rain, or dropped one in a puddle, might wonder what a building full of delicate, expensive electronics could possibly want with millions of liters of water. It's largely for cooling. Running on electric current, servers can get very hot, and evaporative cooling of the server room is one of the simplest, cheapest ways to keep chips from overheating and failing.

That means, however, that the water isn't simply used for cooling and then discharged as treatable wastewater: much of it evaporates in the process. Poof.

"Even if they're using reclaimed or recycled water, that water no longer returns to the base flow of rivers and streams," Bolthouse says. "There are ecological impacts and supply issues. Somebody is always downstream in the watershed." Washington, DC, for example, will still lose water supply if Northern Virginia data centers use recycled or reclaimed water, because that water won't make it back to the Potomac River. Evaporative cooling also leaves behind high concentrations of salts and other contaminants, she adds, creating water quality problems.

There are ways to cool data centers that use less water, including closed-loop water systems, which require more electricity, and immersion cooling, in which servers are submerged in a bath of liquid, such as a synthetic oil, that conducts heat but not electricity. Immersion cooling also allows servers to be packed more densely, but it isn't yet widely used, largely because of cost.

Ironically, it can be hard to confirm specific data about data centers. Given the proprietary nature of AI technology, and perhaps the potential for public backlash, many companies aren't forthcoming about how much water their data centers consume. Google, for its part, reported using more than 18.9 billion liters of water across its data centers in 2023, with 31 percent of its freshwater withdrawals coming from watersheds with medium or high water scarcity.

A 2023 study by the University of California, Riverside, estimated that an AI chat session of about 20 queries consumes up to a bottle of fresh water. That amount can vary by platform, with more sophisticated models demanding more water, while other estimates suggest it may be closer to a few tablespoons per query.

"But what goes unrecognized, from a natural systems perspective, is that all water is local," says Peter Colohan, director of program innovation and partnerships at the Lincoln Institute, who helped create the Internet of Water. "It's a small amount of water for a couple of queries, but it's all drawn from a watershed near that data center. That's thousands and thousands of liters of water extracted from a single place because people all over the world are running their AI queries," he says.

"Wherever they choose to put a data center, it's like a giant soda straw sucking water out of that watershed," Colohan continues. "And when you take water from a place, you have to reduce demand or replenish the water in that same place: there's no other solution. In at least some cases, the major data center developers have begun to recognize this problem and are actively engaged in replenishing water where it's needed."

Siting data centers in cooler, wetter regions can help reduce how much water they use and the impact of freshwater withdrawals. Yet about two-thirds of data centers built since 2022 have been located in water-stressed regions, according to an analysis by Bloomberg News, including hot, dry climates like Arizona.

A warm-water cooling system at a Sandia Labs data center in Albuquerque, New Mexico. The data center earned LEED Gold certification for efficiency in 2020. Credit: Bret Latter/Sandia Labs via Flickr CC.

It's not just cooling server rooms and chips that consumes water. Nearly half the electricity used by US data centers today comes from fossil fuel power plants, which themselves use a great deal of water as they heat steam to spin enormous turbines.

And what about the millions of microchips processing all that information? By the time they reach a data center, each chip has already consumed thousands of liters of water. Manufacturing these small, powerful computing components requires treated "ultrapure" water to rinse away silicon residue without damaging the chips. It takes about 5.6 liters of tap water to produce 3.8 liters of ultrapure water, and a typical chip fabrication plant uses about 37.8 million liters of ultrapure water daily, according to the World Economic Forum, the equivalent of 33,000 American homes.


As communities weigh the benefits and risks of data center development, we consumers might consider our own role in the growth of data centers, and whether our use of AI is worth the water, energy, and land it devours.

There could be important uses of artificial intelligence, if it can be harnessed, say, to solve complex problems or to improve the efficiency of water systems and power grids.

There are also other clearly superfluous uses. One YouTube channel with 35 million subscribers, for example, features AI-generated music videos … of AI-generated songs. MIT Technology Review estimates that, unlike simple text queries, using AI to create video content consumes an extreme amount of resources: making a single five-second AI-generated video uses almost as much electricity as running a microwave nonstop for more than an hour.

Data center advocates tend to point out that Americans use more water each year watering golf courses (more than 1.89 trillion liters) and lawns (more than 7.57 trillion liters) than AI data centers do. But that argument rings hollow: the United States has a well-known obsession with green lawns that doesn't help either. The solution, water experts say, lies in water conservation and consumer education, not in weighing one form of waste against another.

Prioritizing a Limited Resource

Even a small data center can place an immense, concentrated burden on local infrastructure and natural resources. In Newton County, Georgia, a Meta data center that opened in 2018 uses 1.89 million liters of water per day, 10 percent of the entire county's water use. And given Georgia's cheap energy and generous state tax breaks, Newton County keeps granting applications for new data center permits, some of which would use up to 22.71 million liters of water per day, more than double what the entire county consumes today.

The intense demands data centers place on regional resources complicate local decision-making. Communities and regional water officials need to enter data center discussions early, with a coordinated, holistic understanding of existing resources and the potential impacts on the power grid and the watershed, says Mary Ann Dickinson, policy director for land and water at the Lincoln Institute. "We would like to help communities make smarter decisions about data centers by helping them analyze and plan for the potential impacts on their community structures and systems."

"Water tends to be one of the last things people think about, so we're really pushing for early engagement, among other things," says John Hernon, strategic development manager at Thames Water in the UK. "When you think about data centers, it's not just about the speed you'll get, it's not just about making sure there's plenty of power available; you need to make sure water is considered as early as possible … up front, and not as an afterthought."

Despite its soggy reputation, London doesn't get much rain compared with the north of the UK: less than 635 mm a year, on average, or about half of what falls on New York City. Yet because so much growth is centered on London, Thames Water's service area hosts about 80 percent of the UK's data centers, Hernon adds, with roughly 100 more proposed.

What's more, water use peaks during the hottest, driest times of year, when the utility has the least capacity to meet additional demand. "That's why we talk about restricting, reducing, or objecting to [data centers]," Hernon says. "It's not because we don't like them. To be clear, we need them too. AI is going to be a huge help for our call center … which means we can put more people on fixing leaks and managing our networks proactively."

Keeping the Lights On

One way for data centers to use less water is to rely more on air-cooling technology, but that takes more energy, which in turn can increase water use indirectly, depending on the power source. Meanwhile, regional grids are already struggling to meet the demand of these power-hungry facilities, with hundreds more in the pipeline. "A lot of these projects have been announced, but it's not clear what energy source can come online fast enough to power them," says Kelly T. Sanders, an associate professor of engineering at the University of Southern California.

The government wants American tech companies to build their AI data centers on domestic soil, not just for economic reasons but for national security. Yet even as the Trump administration seems to grasp the enormous energy demands data centers will place on the grid, it has actively quashed new wind energy projects, such as Revolution Wind off the coast of Rhode Island.

The National Renewable Energy Laboratory (NREL) created this map overlaying transmission lines and data center locations to "help visualize the overlap and simplify planning." Credit: NREL.gov.

Other carbon-free alternatives, such as small modular reactors (SMRs) and geothermal energy, have bipartisan support, Sanders says. "But the problem is that even if you start building an SMR today, it's a 10-year process," she adds. "The sources we can count on fastest are wind, solar, and batteries. But in the last six months we've lost many of the incentives for clean energy, and there's been a war waged on wind. Wind projects that are already built and paid for are being canceled. And I find that peculiar, because that's the electricity that would soon be ready to go onto the grid in some of these regions that are really congested."

Data centers are among the reasons ratepayers across the country have seen their electric bills rise at twice the rate of inflation over the past year. Part of that stems from the new infrastructure data centers will require, such as new power plants, transmission lines, and other investments. Those costs, along with ongoing grid maintenance and upgrades, are typically shared among all electric customers in a service area through charges added to utility bills.

That creates at least two problems. While the tax revenue from a new data center benefits only the host community, the entire electric service area has to pay for the associated infrastructure. And if a utility makes that major investment but the data center eventually closes, or needs far less electricity than projected, it's ratepayers who foot the bill, not the data center.

Some tech companies are securing their own clean power independent of the grid: Microsoft, for example, signed a 20-year agreement to buy power directly from the Three Mile Island nuclear plant. But that approach isn't ideal either, Sanders says. "These data centers are still going to use transmission lines and all the grid assets, but if they're not buying their electricity from the utility, they're not paying for all that infrastructure through their bills," she adds.

Beyond generating new power, Sanders explains, there are strategies for squeezing more capacity out of the existing grid. "One is good old-fashioned energy efficiency, and data centers themselves have every incentive to try to make their processes more efficient," she says. AI itself could also help improve grid performance. "We can use artificial intelligence to gain more insight into how power flows across the grid, so we can optimize that flow, which can give us more capacity than we would otherwise have," Sanders adds.

Another strategy is making the grid more flexible. Most of the time, in most US regions, only about 40 percent of the grid's total capacity is in use, Sanders explains. "We build grid capacity so it can handle demand on the hottest day … and that's where we worry about these big data center loads," she says. But a coordinated network of batteries, including those in people's homes and electric vehicles, can add flexibility and stabilize the grid during periods of peak demand. In July, California's Pacific Gas and Electric Company (PG&E) ran the largest-ever test of its statewide "virtual power plant," using residential batteries to supply 535 megawatts of power to the grid for two full hours at sunset.

With some intentional, coordinated planning ("it's not going to happen naturally," Sanders says), it may be possible to add more capacity without requiring much new generation, if data centers can shift workloads away from peak hours and invest in large-scale backup batteries: "There's a scenario where these data centers can be good citizens of the grid and add more flexibility."

Confronting the Trade-Offs with Land Policy

As demand for data centers grows, the search for suitable sites will force communities to confront a host of unenviable choices among water, energy, land, money, health, and climate. "Integrated land use planning, with sustainable land, water, and energy practices, is the only way we can sustainably achieve the virtuous cycle needed to reap the benefits of AI and the economic growth that comes with it," Colohan says.

Using natural gas to meet the anticipated electricity load of Texas data centers, for example, would require 50 times more water than using solar power, according to the HARC study, and 1,000 times more water than wind. But while powering new data centers with wind farms would consume the least water, it would also require the most land: four times more than solar generation and 42 times more than natural gas.

Absent a flood of new clean energy, most data centers are adding large amounts of greenhouse gases to our collective emissions, at a moment when the science demands we cut them drastically to limit the worst impacts of climate change. In August, Louisiana regulators approved plans to build three new gas power plants to meet the expected electricity demand of Meta's Hyperion AI data center.

As cities and counties compete with one another to attract data centers, host communities will reap the tax benefits, but the costs (intense water demand, higher electric bills, and air pollution from backup generators) will be spread across the region, including areas that will never see any new tax revenue.

That's one reason data center permitting needs more state oversight, Bolthouse says. "The only approval they really have to get is from the locality, and the locality isn't considering the regional impacts," she adds. PEC is also pushing for ratepayer protections and sustainability commitments. "We want to make sure we're encouraging the most efficient and sustainable practices within the industry, and requiring mitigation when impacts can't be avoided."

Too close for comfort? A data center abuts homes in Loudoun County, Virginia. Credit: Hugh Kenny via the Piedmont Environmental Council.

PEC and others are also pressing for greater transparency from the industry. "Very often, data centers arrive with nondisclosure agreements," Bolthouse says. "They withhold so much information about water and energy use, air quality impacts, emissions; none of that is disclosed, so communities don't really know what they're getting into."

"Communities need to be educated about what they're facing and what their trade-offs are when they let a data center in," Colohan says. "What is the true cost of a data center? And then, how do you turn that true cost into a benefit through integrated land policy?"

Rueben says she understands the desire to capitalize on a growing industry, especially in communities experiencing population loss. But instead of competing with one another to attract data centers, she adds, communities should be having broader conversations about job growth and economic development strategies, accounting for the true costs and trade-offs these facilities represent, and asking companies to provide more guarantees and detailed plans.

"Getting data center operators to explain how they'll run their facilities more efficiently and where they'll get their water, rather than just assuming they have priority access to water and energy systems," she says, "is a shift in perspective that we need government officials to make."


Jon Gorey is a staff writer at the Lincoln Institute of Land Policy.

Lead image: data center facilities in Prince William County, Virginia. The county has 59 data centers in operation or under construction. Credit: Hugh Kenny via the Piedmont Environmental Council.

Webinarios

Nature-Based Solutions: Wet Architecture for Climate Resilience  

Marzo 24, 2026 | 12:00 p.m. - 1:00 p.m. (EDT, UTC-4)

Offered in inglés

As climate change accelerates and sea levels continue to rise, communities are being forced to rethink long-standing assumptions about land, development, and risk. In this webinar, architect and author Weston Wright will introduce the concept of wet architecture—an approach to design and planning that accepts water as a permanent condition and explores how we might live more productively with it.

Drawing on ideas from his book More Water Less Land New Architecture, Wright will examine how the relationship between land and water has shaped cities, policies, and development patterns, and why many of those frameworks are increasingly misaligned with climate realities. Rather than focusing on resistance or retreat alone, the talk will consider adaptive strategies that accommodate flooding, tides, and sea level rise, raising important questions about land use, coastal development, and long-term resilience.

Through examples from around the world, Wright connects architectural thinking with broader conversations about land policy, governance, and climate adaptation, offering a grounded, forward-looking perspective on how design, planning, and policy can evolve together in an increasingly water-defined future.


Speakers

Weston Wright

Principal, Weston Wright Architects


Detalles

Fecha(s)
Marzo 24, 2026
Time
12:00 p.m. - 1:00 p.m. (EDT, UTC-4)
Registration Deadline
March 24, 2026 12:50 PM
Idioma
inglés

Registrar

Registration ends on March 24, 2026 12:50 PM.


Palabras clave

mitigación climática, planificación, agua

Cultivating Change

Irrigated agriculture in the U.S. Southwest and northwest Mexico faces a future where water supplies will not only be reduced, but also less reliable and more expensive. In a region where irrigated agriculture uses nearly 75 percent of the water supply in the Colorado River Basin, occupies more than four million acres of land, and provides food for local and global markets, the impact of reduced water supplies for farmers—in some regions, as much as 40 percent over the next century—will be far-reaching. The Babbitt Center is focused on improving water resiliency and regional sustainability through efforts with rural, urban, and Tribal agricultural stakeholders.


Palabras clave

agua, planificación hídrica

Stormwater poses challenges to most cities, but what if rain could be treated as a valuable resource instead of a problem?

This 27-minute documentary case study charts Philadelphia’s bold attempt to capture billions of gallons of stormwater with green infrastructure. With limited resources and competing demands from constituents and city officials, Philadelphia’s Green City, Clean Waters initiative promised to mitigate water pollution while bringing much-needed investments to neighborhoods. A story of political will, coalition building, innovation, and unforeseen challenges: what can be learned from Philadelphia’s massive undertaking?

The Wild West of Data Centers: Energy and water use top concerns

By Anthony Flint, December 18, 2025

It’s safe to say that the proliferation of data centers was one of the biggest stories of 2025, prompting concerns about land use, energy and water consumption, and carbon emissions. The massive facilities, driven by the rapidly increasing use of artificial intelligence, are sprouting up across the US with what critics say is little oversight or long-term understanding of their impacts.

“There is no system of planning for the land use, for the energy consumption, for the water consumption, or the larger impacts on land, agricultural, (forest) land, historic, scenic, and cultural resources, biodiversity,” said Chris Miller, president of the Piedmont Environmental Council, who has been tracking the explosion of data centers in northern Virginia, on the latest episode of the Land Matters podcast.

“There’s no assessment being made, and to the extent that there’s project-level review, there’s a lot of discussion about eliminating most of that to streamline this process. There is no aggregate assessment, and that’s what’s terrifying. We have local land use decisions being made without any information about the larger aggregate impacts in the locality and then beyond.”

Miller appeared on the show alongside Lincoln Institute staff writer Jon Gorey, author of the article Data Drain: The Land and Water Impacts of Data Centers, published earlier this year, and Mary Ann Dickinson, policy director for Land and Water at the Lincoln Institute, who is overseeing research on water use by the massive facilities. All three participated in a two-day workshop earlier this year at the Lincoln Institute’s Land Policy Conference: Responsive and Equitable Digitalization in Land Policy.

There is no federal registration requirement for data centers, and owners can be secretive about their locations for security reasons and competitive advantage. But according to the industry database Data Center Map, there are at least 4,000 data centers across the US, with hundreds more on the way.

A third of US data centers are in just three states, with Virginia leading the way followed by Texas and California. Several metropolitan regions have become hubs for the facilities, including northern Virginia, Dallas, Chicago, and Phoenix.

Data centers housing computer servers, data storage systems, and networking equipment, as well as the power and cooling systems that keep them running, have become necessary for high-velocity computing tasks. According to the Pew Research Center, “whenever you send an email, stream a movie or TV show, save a family photo to ‘the cloud’ or ask a chatbot a question, you’re interacting with a data center.”

The facilities use a staggering amount of power; a single large data center can gobble up as much power as a small city. The tech companies initially promised to use clean energy, but with so much demand, they are tapping fossil fuels like gas and coal, and in some instances even considering nuclear power.

Despite their outsized impacts, data centers are largely being fast-tracked, in many cases overwhelming local community concerns. They’re getting tax breaks and other incentives to build with breathtaking speed, alongside a major PR effort that includes television ads touting the benefits of data centers for the jobs they provide, in areas that have been struggling economically.

Listen to the show here or subscribe to Land Matters on Apple Podcasts, Spotify, Stitcher, YouTube, or wherever you listen to podcasts.

 


Further Reading

Supersized Data Centers Are Coming. See How They Will Transform America | The Washington Post

Thirsty for Power and Water, AI-Crunching Data Centers Sprout Across the West | Bill Lane Center for the American West

Project Profile: Reimagining US Data Centers to Better Serve the Planet in San Jose | Urban Land Magazine

A Sustainable Future for Data Centers | Harvard John A. Paulson School of Engineering and Applied Sciences

New Mexico Data Center Project Could Emit More Greenhouse Gases Than Its Two Largest Cities | Governing magazine

  


Anthony Flint is a senior fellow at the Lincoln Institute of Land Policy, host of the Land Matters podcast, and a contributing editor of Land Lines. 


Transcript

Anthony Flint: Welcome back to the Land Matters Podcast. I’m your host, Anthony Flint. I think it’s safe to say that the proliferation of data centers was one of the biggest stories of 2025, and at the end of the day, it’s a land use story braided together with energy, the grid, power generation, the environment, carbon emissions, and economic development – and, the other big story of the year, to be sure, artificial intelligence, which is driving the need for these massive facilities.

There’s no federal registration requirement for data centers, and sometimes owners can be quite secretive about their locations for security reasons and competitive advantage. According to the industry database Data Center Map, there are at least 4,000 data centers across the US. Some would say that number is closer to 5,000, but unquestionably, there are hundreds more on the way.

A third of US data centers are in just three states, with Virginia leading the way, followed by Texas and California. Several metropolitan regions have become hubs for these facilities, including Northern Virginia, Dallas, Chicago, and Phoenix, and the sites tend to get added onto: half of the data centers currently being built are part of a preexisting large cluster, according to the International Energy Agency.

These are massive buildings housing computer servers, data storage systems, and networking equipment, as well as the power and cooling systems that keep them running. That’s according to the Pew Research Center, which points out that whenever you send an email, stream a movie or TV show, save a family photo to the cloud, or ask a chatbot a question, you’re interacting with a data center. They use a lot of power, which the tech companies initially promised would be clean energy, but now, with so much demand, they’re turning largely to fossil fuels like gas and even coal, and in some cases, considering nuclear power.

A single large data center can gobble up as much power as a small city, and they’re largely being fast-tracked, in many cases, overwhelming local community concerns. They’re getting tax breaks and other incentives to build with breathtaking speed, and there’s a major PR effort underway to accentuate the positive. You may have seen some of those television ads touting the benefits of data centers, including in areas that have been struggling economically.

To help make sense of all of this, I’m joined by three special guests, Jon Gorey, author of the article Data Drain: The Land and Water Impacts of Data Centers, published earlier this year at Land Lines Magazine; Mary Ann Dickinson, Policy Director for Land and Water at the Lincoln Institute; and Chris Miller, President of the Piedmont Environmental Council, who’s been tracking the explosion of data centers in Northern Virginia.

Well, thank you all for being here on Land Matters, and Jon, let me start with you. You’ve had a lot of experience writing about real estate and land use and energy and the environment. Have you seen anything quite like this? What’s going on out there? What were your takeaways after reporting your story?

Jon Gorey: Sure. Thank you, Anthony, for having me, and it’s great to be here with you and Mary Ann, and Chris too. I think what has surprised me the most is the scale and the pace of this data center explosion and the AI adoption that’s feeding it. When I was writing the story, I looked around the Boston area to see if there was a data center that I could visit in person to do some on-the-ground reporting.

It turns out we have a bunch of them, but they’re mostly from 10, 20 years ago. They’re pretty small. They’re well-integrated into our built environment. They’re just tucked into one section of an office building or something next to a grocery store. They’re doing less intensive tasks like storing our emails or cell phone photos on the cloud. The data centers being built now to support AI are just exponentially larger and more resource-intensive.

For example, Meta is planning a 715,000-square-foot data center outside the capital of Wyoming, which is over 16 acres of building footprint by itself, not even counting the grounds around it. That will itself use more electricity than every home in Wyoming combined. That’s astonishing. The governor there touted it as a win for the natural gas industry locally. They’re not necessarily going to supply all that energy with renewables. Then there’s just the pace of it. Between 2018 and 2021, the number of US data centers doubled, and then it doubled again by 2024.

In 2023, when most people were maybe only hearing about ChatGPT for the first time, US data centers were already using as much electricity as the entire country of Ireland. That’s poised to double or triple by 2028. It’s happening extremely fast, and they are extremely big. One of the big takeaways from the research, I think, was how this creates this huge cost-benefit mismatch between localities and broader regions like in Loudoun County, Virginia, which I’m sure Chris can talk about.

The tax revenue from data centers, that’s a benefit to county residents. They don’t have to shoulder as much of the bills for schools and other local services. The electricity and the water and the infrastructure and the environmental costs associated with those data centers are more dispersed. They’re spread out across the entire utilities service area with higher rates for water, higher electric rates, more pollution. That’s a real discrepancy and it’s happening pretty much anywhere one of these major data centers goes up.

Anthony Flint: Mary Ann Dickinson, let’s zoom in on how much water these data centers require. I was surprised by that. In addition to all the power they use, I want to ask you, first of all, why do they need so much water, and where is it coming from? In places like the Southwest, water is such a precious resource that’s needed for agriculture and people. It seems like there’s a lot more work to be done to make this even plausibly sustainable.

Mary Ann Dickinson: Well, water is the issue of the day right now. We’ve heard lots of data center discussion about energy. That’s primarily been the focus of a lot of media reporting during 2025. Water is now emerging as an issue that is dwarfing a lot of local utility systems. Data centers use massive amounts of water, anywhere between 3 and 5 million gallons a day. To answer your question, it’s primarily for cooling. It’s a much larger draw than that of most large industrial water users in a community water system.

The concern is that if the data centers are tying into local water utilities, which they prefer because of the affordability and the reliability and the treatment of the supply, that can easily swamp a utility system that is not accustomed to that continuous, constant draw. These large hyperscale data centers that are now being built can use hundreds of millions of gallons yearly. That’s equivalent to the water usage of a medium-sized city.

To Jon’s point, if you look at how much water that is being consumed by a data center in very water-scarce areas in the West in particular, you wonder where that water is going to come from. Is it going to come from groundwater? Is it going to come from surface water supplies? How is that water going to be managed and basically replaced back into the natural systems, like rivers, from which it might be being withdrawn? Colorado River, of course, being a prime example of an over-allocated river system.

What is all this water going for? Yes, it’s going for cooling, humidification in the data centers, it’s what they’re calling direct use, but there’s also indirect use, which is the water that it takes to generate the electricity that supplies the data center. The data center energy loads are serious, and Chris can talk about the grid issues as well, but a lot of that water is actually indirectly used to generate electricity, as well as directly used to cool those chips.

This indirect use can be substantial, equivalent to about half a gallon per kilowatt-hour. That can be a fair amount of water just for providing that electricity. What we’re seeing is that the average hyperscale data center uses about half a million gallons of water a day. That’s a lot of water to come from a local community water system. It’s a concern, especially in water-scarce regions where water is already so short that farmers are being asked to fallow fields. How is the data center water load going to be accommodated within these water systems?

The irony is that the data centers are going into these water-scarce regions. There was a Bloomberg report showing that water-scarce regions were actually the most popular locations for these data centers because they were proximate to areas of immediate use. That, of course, means California, Texas, and Phoenix, Arizona: states that are already struggling to provide water to their regular customers.

It’s a dilemma, and it’s one that we want to look at a lot more closely to help protect the community water systems and give them the right questions to ask when the data center comes to town and wants to locate there, and help them abate the financial risk that might be associated with the data center that maybe comes and then goes, leaving them with a stranded asset.

These are all complex issues. The tax issues tie into the water issues because the water utility system, and impacts to that system, might not be covered by whatever tax revenues are coming in. As sizable as they might be, they still might not be enough to cover infrastructure costs that would otherwise be assessed to the utility ratepayers. We’re seeing this on the energy side. We’re seeing electric rates go up. At the same time, we know these data centers are necessary given what we as a society are now doing in terms of AI and digital computing.

We just have to figure out the way to most sustainably deal with it. We’re working with technical experts, folks from the Los Alamos National Lab, and we’re talking with them about the opportunities for using recycled water, using other options that are not going to be quite as water-consumptive.

Anthony Flint: Yes, we can talk more about that later in the show — different approaches, using gray water or recycled water, sounds like a promising idea because at the end of the day, there’s only so much water, right? Chris Miller, from the Piedmont Environmental Council, you pointed out, in Jon’s story, that roughly two-thirds of the world’s internet traffic essentially passes through Northern Virginia, and the region already hosts the densest concentration of data centers anywhere in the world. What’s been the impact on farmland, energy, water use, carbon emissions, everything? Walk us through what it’s like to be in such a hot spot.

Chris Miller: The current estimate is that Virginia has over 800 data centers. It’s a little hard to know because some of them are dark facilities, so not all of them are mappable, but that’s the number the ones we’ve been able to map are approaching. For land use junkies, there’s about 360 million square feet of built, approved, or in-the-pipeline applications for data centers in the state. That’s a lot of footprint. The closest comparison I could make that seemed reasonable is that all of Northern Virginia has about 150 million square feet of commercial retail space.

We are looking at a future where just the footprint of the buildings is pretty extraordinary. We have sites that are one building, one gigawatt, almost a million square feet, 80 feet high. You just have to think about that. That’s the amount of power that a nuclear reactor can produce at peak load. We’re building those kinds of buildings on about 100 acres, 150 acres. Not particularly large parcels of land with extraordinary power density of electricity demand, which is just hard to wrap your head around.

The current estimate in Virginia for aggregate peak load demand increase in electricity exclusively from data centers is about 50 gigawatts in the next 20 years. That’ll be a tripling of the existing system. Now, more and more, the utilities, grid regulators, and the grid monitor for PJM, a large regional transmission organization that runs from Chicago all the way to North Carolina, are sounding the alarm.

As Anthony said, the existing system is near its breaking point, maybe in the next three years. If all the demand came online, you would have brownouts and blackouts throughout the system. That’s pretty serious. It’s a reflection of the general problem, which is that there is no system of planning for the land use, for the energy consumption, for the water consumption, or for the larger impacts on land: agricultural and forestal land, historic, scenic, and cultural resources, biodiversity sites. There’s no assessment being made.

To the extent that there’s project-level review, there’s a lot of discussion about eliminating most of that to streamline this process. There is no aggregate assessment. That’s what’s terrifying. We have local land use decisions being made without any information about the larger aggregate impacts in the locality and then beyond. Then the state and federal governments are issuing permits without having really evaluated the combined effect of all this change.

I think that’s the way we’re looking at it. Change is inevitable. Change is coming. We should be doing it in a way that’s better than the way we’ve done it before, not worse. We need to do it in a way that is basically an honest assessment of the scale and scope, the aggregate impacts, and then apply the ingenuity and creativity of both the tech industry and the larger economy to minimize the impact that this has on communities and the natural resources on which we all depend.

It’s getting to the point of being very serious. Virginia is water-constrained. It doesn’t have that reputation, but our water supply systems are all straining to meet current demand. The only assessment we have of the effect of future peak load from data centers is by the Interstate Commission on the Potomac River Basin, which manages the water supply for the Washington metropolitan region across five states.

Their conclusion is that, in the foreseeable future, by 2040, we reach a point where consumption exceeds supply. Think about that. We’re moving forward with [facilities] even as they create a shortage of water supply in the nation’s capital. It’s being done without any oversight or direction. The work of the Lincoln Institute and groups like PEC is actually essential because the governmental entities are paralyzed. Paralyzed by a lack of policy structure, they’re also paralyzed by politics, caught between the perception that this is the next economic opportunity, which funds the needs of the community.

The fact is, the impacts may outweigh the benefits. We have to buckle down and realize this is the future. How do we help state, local, federal government to build decision models that take into account the enormous scale and scope of the industry and figure out how to fix the broken systems and make them better than they were before? I think that’s what all of us have been working on over the last five years.

Anthony Flint: It really is extraordinary, for those of us in the world of land use and regulation. We’ve heard a lot about the abundance agenda and how the US is making it more difficult to build things and infrastructure. Whether it’s clean energy or a solar farm or a wind farm, they have to go through a lot of hoops. Housing, same way. Here you have this, not just any land use but an incredibly impactful one, that is seemingly not getting any of that oversight or being made to go through those hoops.

Chris Miller: They are certainly cutting corners. Jon mentioned the facility outside of Boston. What did you say, 150 acres? We have a site adjacent to the Manassas National Battlefield Park, which is part of the national park system, called the Prince William Digital Gateway, which is an aggregation of 2100 acres with plans for 27 million square feet of data centers with a projected energy demand of up to 7.5 gigawatts. The total base load supply of nuclear energy available in Virginia right now is just a little bit over 3 gigawatts.

The entire offshore wind development project at Dominion is 80% complete, but what’s big and controversial is 2.5 gigawatts. The two biggest sources of base load supply aren’t sufficient to meet 24/7 demand from a land use proposal on 2100 acres, 27 million square feet, that was made without assessing the energy impact, the supply of water, or the impact of infrastructure on natural, cultural, and historic resources, one of which is hallowed ground. It’s a place where two significant Civil War battles were fought. It’s extraordinary.

What’s even more extraordinary is to have public officials, senators, congressmen, members of agencies say, “We’re not sure what the federal next steps [are].” These are projects that have interstate effects on power, on water, on air quality. We haven’t talked about that, but one of the plans that’s been hatched by the industry is to rely on onsite generation and take advantage of the backup generation that they’ve built out. They have to provide 100% backup generation onsite for their peak load. They’ve built 90% of that in diesel without significant air quality controls.

We have found permits for 12.4 gigawatts of diesel in Northern Virginia. That would bust the ozone and PM2.5 regulatory standards for public health if they operated together. It’s being discussed by the Department of Environmental Quality in Virginia as a backup strategy for meeting power demand so that data centers can operate without restriction. These are choices that are being proposed without any modeling, without any monitoring, and without any assessment of whether those impacts are in conflict with other public policy goals, like human health. Terrifying.

We are at a breaking point. I have to say that the grassroots response is a pox upon all your houses. That was reflected in the 2025 elections that Virginia just went through: a tidal wave of change in the General Assembly and statewide offices, with data centers and energy costs very, very high on the list of concerns for voters.

Anthony Flint: I want to ask all three of you this question, but Jon, let me start with you. Is there any way to make a more sustainable data center?

Jon Gorey: Yes, there are some good examples here and there. It is, in some cases, in their best interest to use less electricity. It’ll be less expensive for them to use less water. Google, for its part, has been more transparent than some companies in its environmental report. They compare their water use to the amount used to irrigate golf courses, which comes across as not a great comparison, because golf courses are not a terrific use of water either.

They do admit that last year, in 2024, they used about 8.1 billion gallons of water in the data centers they own, a 28% increase over the year before, and that 14% of it was in severely water-stressed regions. Another 14% was in medium-stressed regions. One of their data centers, in Council Bluffs, Iowa, consumed over a billion gallons of water by itself. They also have data centers, in Denmark and Germany, for example, that use barely a million gallons over the course of a year.

I don’t know if those are just very small ones, but I know they and Microsoft and other companies are developing … there’s immersive cooling, where instead of using evaporative water cooling to cool off the entire room that the servers are in, you can basically dunk the chips and servers in a synthetic oil that conducts heat but not electricity. It’s more expensive to do, but it’s completely possible. There are methods. There’s maybe some hope there that they will continue to do that more.

Mary Ann Dickinson: Immersive cooling, which you’ve just mentioned, is certainly an option now, but what we’re hearing is that it’s not going to be an option in the future: because of the increasing power density of chips, they are going to need direct liquid cooling, period, and immersive cooling is not going to work. The frightening part of the whole water story is that however much or little water is being used now, it’s going to pale against the water that’s going to be used in the next 5 to 10 years by the new generation of data centers and the new chips they’ll be using.

The funny thing about the golf course analogy is that, in the West, a lot of those golf courses are irrigated with recycled water. As Chris knows, it also recharges back into groundwater. It is not lost as consumptive loss. That’s the issue: to make these sustainable, we’re going to need to really examine the water cooling systems, what the evaporative loss is, what the discharge is to sewer systems, what the potential is for recycled water. There are a whole lot of questions that we’re going to ask, but we’re not getting any data.

Only a third of data centers nationally even report their energy and water use. The transparency issue is becoming a serious problem. Many communities are being asked to sign NDAs; they can’t even share with their citizens how much energy and water a data center is using. It is a little bit of a challenge to try to figure out the path going forward. It’s all about economics, as Chris knows. It’s all about what can be afforded.

The work we’re doing at the Lincoln Institute, we would like to suggest as many sustainable options from the water perspective as possible, but they’re going to have to be paid for somewhere. That is the big question. Data centers need to pay.

Chris Miller: I think we’re entering a [time] where innovation is necessary. It has to be encouraged. We’re facing a crisis just short of what we saw with the collapse of the banking system in 2008 and 2009, when no one was really paying attention to the aggregate system-wide failures. Somebody had to step up and say it’s broken. In the case of the mortgage crisis, it was actually 49 states coming to a court, saying, “We have to have a settlement so that we can rework all these mortgages and settle out the accounts and rebuild the system from the ground up.”

I think that’s the same place we’re at. We have to have a group of states get together and say, “We are going to rebuild the decision model that we use for this new economy.” It’s not going away. Any gains in efficiency are going to be offset by the expansion of demand for data. That’s been the trend for the last 15 years. We have to deal with the scale and the scope of the issue. I’ll give you just one example.

Dominion Energy has published an aggregate of contracts totaling 47.1 gigawatts of demand that they have to meet. Their estimate of the CapEx to do that ranges from $141 billion to $271 billion, depending on whether they comply with the goals of the Virginia Clean Economy Act and move toward decommissioning and replacing existing fossil fuel generation with cleaner sources. That range is not the issue. It’s the bottom line, which is $150 to $300 billion in CapEx in one state for energy infrastructure. That’s enormous. We need a better process than a case-by-case review of individual projects.

The State Corporation Commission does not maintain a central database of the transmission and generation projects it approves. The state DEQ does not have a central database for water basin supply and demand. The state DEQ does not have a database of all of the permits in a model that shows what the impacts of backup generation would be if they all turned on at the same time in a brownout or blackout scenario. That failure to do systems analysis desperately needs to be addressed. It’s not going to be done by this administration at the federal level.

It’s going to take state governments working together to build new systems decision tools that are informed by the expertise of places like the Lincoln Institute, so that they’re looking at this as a large-scale systemic process. We can then build it out in a way that’s rational, that takes into account the impacts on people, on communities, and on land, and that fairly distributes the cost back to the industry that’s triggering the demand.

This industry is uniquely able to charge the whole globe for the use of certain parts of America as the base of its infrastructure. We should be working very hard on a cost allocation model and an assignment of costs to the data center industry, which can recapture the economic value and pay itself back from the whole globe. There’s no reason for the ratepayers of Virginia or Massachusetts or Arizona or Oregon to be subsidizing the seven largest corporations in the world, with [capital expenditures] of over $22 trillion. It’s unfair, it’s un-American, it’s undemocratic.

We have to stand up to what’s happening and realize how big it is and realize it’s a threat to our way of life, our system of land use and natural resource allocation and frankly, democracy itself.

Anthony Flint: I want to bring this to a conclusion, although certainly there are many more issues we could talk about, but I want to look at the end user in a way and whether we as individuals can do anything about using AI, for example. I was talking with Jon, journalist-to-journalist, about this. I want to turn to you, Jon, on this question. Should we be trying not to use AI, and is that even possible?

Jon Gorey: The more I researched this piece, the more adamant I became that I shouldn’t be using it where possible. Not that that’s going to make any difference, but to me, it felt like I don’t really want to be a part of it. I expect there are legitimate and valuable use cases for AI in science and technology, but I am pretty shocked by how cavalier people I know, my friends and family, have been in embracing it.

Part of that is that tech companies are forcing it on us because they’ve invested in it. They’re like, “Hey, we spent all this money on this, you’ve got to use it.” It takes some legwork to remove the Google Assist from your Google searches or to get Microsoft Copilot to just leave you alone. It feels like its ancestor Clippy, the paperclip from Microsoft Office back in the day.

Here’s something that galls me even more, in a broader sense. I don’t know if we want to get into it, but I’m an amateur musician. I’m an amateur because it’s already very difficult to make any money in the arts. There’s a YouTube channel with 35 million subscribers that simply plays AI-generated videos of AI-generated music. That’s twice as many subscribers as Olivia Rodrigo has and 20 times as many as Gracie Abrams, both of them huge pop stars who sell out basketball arenas. It astounds me, and I don’t know why people are enjoying artificially created things like that. I get the novelty of it, but I, for one, am trying to avoid stuff like that.

Chris Miller: We were having a debate about this issue this week on a series of forums. The reality is there’s stuff that each of us can do to significantly reduce our data load. It takes a little bit of effort. Most of us are storing two or three times what we need to, literally copies of things that we already have. There’s an efficiency of storage thing that takes time, and that’s why we don’t do it. There’s the use of devices appropriately.

If you can watch a broadcast television show and not stream it, that’s a significant reduction in load, actually. Ironically, we’ve gone from broadcast through the air, which has very little energy involved, to streaming on fiber optics and cable, and then wireless, which is incredibly resource-intensive. We’re getting less efficient in some ways in the way we use some of these technologies, but there are things we can do.

The trend in history has been that it doesn’t actually change overall demand. I think we need to be careful, as we think about all the things we can do as individuals, not to lose sight of the need for the aggregate, society-wide response: this industry needs to check itself, but it also needs proper oversight. The notion that somehow they’re holier than the rest of us is totally unsustainable.

We have to treat them as the next gold rush, the next offshore drilling opportunity, and understand that what they are doing is globally impactful, setting us back in terms of the overall needs to address climate change and the consumption of energy, and threatens our basic systems for water, land, air quality that are the basis of human life. If those aren’t a big enough threat, then we’re in big trouble.

Anthony Flint: Mary Ann, how about the last word?

Mary Ann Dickinson: When I looked up and saw that every Google search I do, which is AI-backed these days, uses half a liter of water, each one, and you think about the billions of searches that happen across the globe, this is a frightening issue. I’m not sure our individual actions are going to make that big a difference in the AI demand, but what we can require is, in the siting of these facilities, that they not disrupt local sustainability and resiliency efforts. That’s, I think, what we want to focus on at the Lincoln Institute. It’s helping communities do that.

Anthony Flint: Jon Gorey, Mary Ann Dickinson, and Chris Miller, thank you for this great conversation on the Land Matters Podcast. You can read Jon Gorey’s article, Data Drain, online at our website, lincolninst.edu. Just look for Land Lines magazine in the navigation. On social media, the handle is @landpolicy. Don’t forget to rate, share, and subscribe to the Land Matters Podcast. For now, I’m Anthony Flint signing off until next time.


Planning for a Just Transition in the California Delta

By Jon Gorey, December 15, 2025

Some 50 miles inland from the iconic San Francisco Bay—east of the Golden Gate Bridge, beyond the Berkeley Hills and Mount Diablo—is the lesser-known California Delta, more than 1,100 square miles of lowlands and estuaries near the city of Stockton, at the confluence of the Sacramento and San Joaquin rivers.

Those two waterways alone drain about half of California, and much of that water gets pumped southward and westward to more populous areas of the state. Almost all the land in the delta—98 percent, much of it farmland—has been reclaimed since the 19th century with the help of hundreds of miles of levees and channels that drained what was once an inland sea during the wet winter months.

However, those drained wetlands, deprived of their natural sogginess, have been subsiding for decades as the peaty soil gets exposed to oxygen. “When you dry those out and make them terrestrial, they subside, the land elevation sinks,” says Brett Milligan, professor of landscape architecture and environmental design at the University of California, Davis. Despite its inland setting, “you have many places in the delta that are up to 20 or 25 feet below sea level.”

As sea levels rise, tidal saltwater intrusion from San Francisco Bay is increasingly a problem—especially during droughts and the summer dry season, when less freshwater drains from the rivers to push back against rising tidal flows. Higher sea levels also put added strain on protective levees as the delta behind them sinks, increasing the risk of failure.

An increase in salinity creates a lot of problems—for agriculture, for the ecosystem, and for the drinking water supply of millions of Californians. “We have one of the largest water infrastructure systems in the world,” Milligan says, largely focused on moving water from the wetter northern parts of the state to the more arid southern regions—“and the delta is sort of that switching point from north to south.”

An aerial photo of fields, roads, and rivers.
The California Delta covers 1,100 square miles at the confluence of the Sacramento and San Joaquin rivers. Credit: Freshwater Trust via USGS.

This tangle of interconnected issues is why the delta is often regarded as a “wicked problem,” Milligan says. “There are so many factors involved. It’s very complex; conditions are also changing quite fast.” Climate change is exacerbating nearly every challenge facing the delta: Tides are getting higher. Droughts are getting more frequent and more intense. Winter snowpack in the uplands would once have held back freshwater long into the spring, but it now melts earlier, and more precipitation falls as rain rather than snow to begin with.

That variety of factors makes the problem more complex, but it also means there are multiple ways of looking at—and perhaps addressing—the overarching issue of salinity in the delta. To help the delta community discuss and better understand some of the available solutions, Milligan and colleagues are conducting a series of participatory scenario planning workshops focused on salinity management as part of a four-year, multi-campus University of California project called Just Transitions in the Delta.

Exploring Multiple Futures to ‘Liberate the Present’ 

Scenario planning is a type of collective visioning process that invites community members to imagine and evaluate a set of specific, possible futures. It’s an inherently participatory process, but Milligan is foregrounding that idea of inclusion and equity, intentionally seeking out voices who don’t typically have a seat at the decision-making table.

By engaging dozens of people from across the delta’s diverse population—from farmers to Indigenous tribal members to residents of communities bearing a disproportionate burden of environmental pollution—Milligan hopes to build a broader understanding of the adaptation strategies available, and what tradeoffs each one presents. “We were really interested in trying to explore, within a context where people are often at odds, could this type of scenario planning around salinity management options be a way to build trust and mutual understanding?” he says.

The project is now in its third year, and Milligan and his colleagues presented their progress at the Lincoln Institute of Land Policy’s Consortium for Scenario Planning conference in 2025. (Registration is now open for the 2026 conference, to be held February 4–6, 2026, in Salt Lake City, Utah.)

So far, Milligan’s team has conducted more than half a dozen workshops with well over 100 total participants—including two main public workshops in 2024 and 2025, as well as smaller sessions requested by Indigenous groups and vulnerable communities—with the goal of first deciding upon the suite of scenarios to be included, then designing and refining them.

An aerial image of several people scattered around a large, wood-floored room, reading signs at a scenario planning workshop. The sign in the foreground reads, "What delta? What future?"
Participants in a scenario planning event held by the University of California, Davis as part of the multicampus Just Transitions in the Delta project. Credit: Courtesy of Brett Milligan.

“The first thing we did was a lot of outreach and interviews,” Milligan says, to determine and design the six main scenarios to be considered. The questions ranged from what people valued most about the delta, to which salinity management practices they wanted the team to explore, to who else ought to be included in adaptation discussions. Notably, Milligan says, 83 percent of respondents felt that past decision-making in the delta had not been equitable.

Using feedback from those interviews, the team designed a set of six scenarios for evaluation, which continue to be refined as workshops yield more feedback, and created an immersive, interactive exhibition of scenario narratives and maps ahead of the second full public workshop.

The first scenario is simply “Business as Usual,” which extrapolates current trends into the future as a sort of baseline from which to compare other adaptation measures. The second scenario models the Delta Conveyance Project, a long-discussed, partially permitted 40-mile water supply tunnel that could be built beneath the delta. The controversial tunnel is not particularly popular among many residents, Milligan explains, “but a lot of people wanted us to model that, to compare it to the other options.”

The third and fourth scenarios are nature-based restoration solutions. The “Eco Machine” approach would use strategically placed green infrastructure to reduce salinity intrusion and create recreational and ecological benefits. The “New Green Watershed,” meanwhile, is more ambitious in scope, phasing in green infrastructure across the entire region, along with carbon banking, land repatriation to Indigenous communities, and wet soil agriculture (such as rice farming) to reverse land subsidence and transition the delta to a regenerative green economy.

“That was driven by tribal input asking us to think about the delta more holistically,” Milligan says. “A lot of people are concerned about flooding, and interested in what can be done upstream in terms of land management, better fire stewardship, restoration of meadows, and things like that, that will influence when and how water comes down,” he says. “Could you reinvent the delta in a way that’s more sustainable and make that economically viable?”

The last two scenarios focus on more traditional infrastructure, but implemented and managed in smarter ways. “Bolster and Fortify” models how major engineering investments in the delta’s gray infrastructure—such as barriers, operable gates, and augmented levees—could reduce salinity and protect subsided land from levee breaches. “Calling on Reserves” focuses on operating upstream dams and reservoirs differently—allowing more water out when necessary to push back against tidal intrusion, for example—combined with statewide investments in increased water efficiency and storage.

A map of the California Delta. The base map is dark brown, with planned levee fortifications outlined in red, yellow, orange, blue, and purple.
A map from the “Bolster and Fortify” scenario of the Just Transitions in the Delta project shows where different plans have prioritized levee fortifications in the region. Credit: University of California.

In the large public workshops, participants have so far ranked the two nature-based solutions most favorably (with the tunnel and business-as-usual scenarios battling it out for last place).

Those workshops also sought input on how each scenario ought to be assessed. The team is now using hydrodynamic and other modeling methods to evaluate and score each scenario according to six criteria participants selected: water quality and flow, ecological restoration, Indigenous sovereignty, environmental justice, recreation, and economy. A final public workshop in 2026 will present the fully modeled and scored scenarios, and ask participants to rank their preferences.

“What I find most useful about scenario planning is exploring multiple futures as a way to kind of liberate the present and how we think about futures. There’s not just one way the world can be,” Milligan says. He notes that people seem to be more open to understanding other people’s perspectives in the context of specific scenarios.

Encouragingly, post-workshop surveys have confirmed that participants feel the process has been useful. “We get very positive feedback from people saying they felt heard,” Milligan says. But voicing opinions is not the only reason people are attending the workshops; many have said they specifically came to hear what others had to say. “I’ve never heard that in my 12 years working in the delta,” he says.

“People are showing up because they’re curious about how other people experience this and think about this, which was a goal for our project—can we foster that kind of learning space? It seems that many people are coming to these because they want to learn; they want to understand other ways of how it can be.”


Jon Gorey is a staff writer at the Lincoln Institute of Land Policy.

Lead image: Middle River Bridge near Discovery Bay in the California Delta. Credit: toddarbini via iStock/Getty Images Plus.

Data Drain: The Land and Water Impacts of the AI Boom

By Jon Gorey, October 17, 2025

A low hum emerges from within a vast, dimly lit tomb, whose occupant devours energy and water with a voracious, inhuman appetite. The beige, boxy data center is a vampire of sorts—pallid, immortal, thirsty. Sheltered from sunlight, active all night. And much like a vampire, at least according to folkloric tradition, it can only enter a place if it’s been invited inside.

In states and counties across the US, lawmakers aren’t just opening the door for these metaphorical, mechanical monsters. They’re actively luring them in, with tax breaks and other incentives, eager to lay claim to new municipal revenues and a piece of the explosive growth surrounding artificial intelligence.

That may sound hyperbolic, but data centers truly are resource-ravenous. Even a mid-sized data center consumes as much water as a small town, while larger ones require up to 5 million gallons of water every day—as much as a city of 50,000 people.

Powering and cooling their rows of server stacks also takes an astonishing amount of electricity. A conventional data center—think cloud storage for your work documents or streaming videos—draws as much electricity as 10,000 to 25,000 households, according to the International Energy Agency. But a newer, AI-focused “hyperscale” data center can use as much power as 100,000 homes or more. Meta’s Hyperion data center in Louisiana, for example, is expected to draw more than twice the power of the entire city of New Orleans once completed. Another Meta data center planned in Wyoming will use more electricity than every home in the state combined.

And of course, unlike actual clouds, data centers require land. Lots of it. Some of the largest data centers being built today will cover hundreds of acres with impermeable steel, concrete, and paved surfaces—land that will no longer be available for farmland, nature, or housing—and require new transmission line corridors and other associated infrastructure as well.

Data centers have been part of our built landscape for over a decade, however—many of them tucked into unassuming office parks, quietly processing our web searches and storing our cellphone photos. So why the sudden concern? Artificial intelligence tools trained with large language models, such as OpenAI’s ChatGPT, use far more computing power than traditional cloud services. And the largest technology companies, including Amazon, Meta, Google, and Microsoft, are investing quickly and heavily in AI.

The number of US data centers more than doubled between 2018 and 2021 and, fueled by investments in AI, that number has already doubled again. Early in the AI boom, in 2023, US data centers consumed 176 terawatt-hours of electricity, roughly as much as the entire nation of Ireland (whose electric grid is itself nearly maxed out, prompting data centers there to use polluting off-grid generators), and that’s expected to double or even triple as soon as 2028.
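
For a sense of scale, the growth rate implied by that projection can be worked out from the article's own figures. The short Python sketch below is purely illustrative: it compounds the 176 TWh baseline over the five years from 2023 to 2028; the per-year percentages are derived here, not stated in the source.

```python
# Implied compound annual growth rate if US data-center electricity use
# doubles or triples between 2023 (176 TWh) and 2028, per the projection above.

BASE_TWH = 176   # US data-center consumption in 2023
YEARS = 5        # 2023 -> 2028

for multiple in (2, 3):
    cagr = multiple ** (1 / YEARS) - 1   # compound annual growth rate
    print(f"{multiple}x by 2028: {BASE_TWH * multiple} TWh, "
          f"~{cagr:.1%} growth per year")
```

Doubling works out to roughly 15 percent growth every year; tripling, to roughly 25 percent.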

This rapid proliferation can put an enormous strain on local and regional resources—burdens that many host communities are not fully accounting for or prepared to meet.

“Demand for data centers and processing has just exploded exponentially because of AI,” says Kim Rueben, former senior fiscal systems advisor at the Lincoln Institute of Land Policy. Virginia and Texas have long had tax incentives in place to attract new data centers, and “other states are jumping on the bandwagon,” she says, hoping to see economic growth and new tax revenues.

But at a Land Policy and Digitalization conference convened by the Lincoln Institute last spring, Rueben likened the extractive nature of data centers to coal mines. “I don’t think places are acknowledging all the costs,” she says.

Yes, Virginia, There Is a Data Clause

At that conference, Chris Miller, executive director of the Piedmont Environmental Council, explained how roughly two-thirds of the world’s internet traffic passes through Northern Virginia. The region already hosts the densest concentration of data centers anywhere in the world, with about 300 facilities in just a handful of counties. Dozens more are planned or in development, ready to consume the region’s available farmland, energy, and water, enticed by a statewide incentive that saves companies more than $130 million in sales and use taxes each year.

Despite the state-level tax break, data centers make significant contributions to local coffers. In Loudoun County, which has over 27 million square feet of existing data center space, officials expect the total real and personal property tax revenues collected from local data centers in fiscal year 2025 to approach $900 million, nearly as much as the county’s entire operating budget. The share of revenue derived from data centers has grown so lopsided that the county’s board of supervisors is considering adjusting the tax rate, so as not to be so reliant on a single source.

Existing and planned data centers in Northern Virginia. The state has been dubbed “the data center capital of the world.” Credit: Piedmont Environmental Council.

While many communities see data centers as an economic boon due to that tax revenue, the facilities themselves are not powerful long-term job engines. Most of the jobs they create are rooted in their construction, not their ongoing operation, and thus are largely temporary.

Decades ago, PEC supported some of the data center development in Northern Virginia, says Julie Bolthouse, PEC’s director of land policy. But the industry has changed dramatically since then. When AOL had its headquarters in what’s known as Data Center Alley, for example, the company’s data center was a small part of a larger campus, “which had pedestrian trails around it, tennis courts, basketball courts … at its peak, it had 5,300 employees on that site,” Bolthouse says. The campus has since been demolished, and three large data center facilities are being built on the site. “There’s a big fence around it for security purposes, so it’s totally isolated from the community now, and it is only going to employ about 100 to 150 people on the same piece of land. That’s the difference.”

The facilities have also gotten “massive,” Bolthouse adds. “Each one of those buildings is using as much as a city’s worth of power, so that power infrastructure is having a huge impact on our communities. All the transmission lines that have to be built, the eminent domain used to get the land for those transmission lines, all of the energy infrastructure, gas plants, pipelines that deliver the gas, the air pollution associated with that, the climate impacts of all of that.”

Across Northern Virginia, on-site diesel generators—thousands of them, each the size of a rail car—spew diesel fumes, creating air quality issues. “No other land use that I know of uses as many generators as a data center does,” Bolthouse says. And while such generators are officially classified as emergency backup power, data centers are permitted to run them for “demand response” for 50 hours at a time, she adds. “That’s a lot of air pollution locally. That’s particulate matter and NOx [nitrogen oxides], which impacts growing lungs of children, can add cases of asthma, and can exacerbate heart disease and other underlying diseases in the elderly.”

And then there’s the water issue.

‘Like a Giant Soda Straw’

A study by the Houston Advanced Research Center (HARC) and University of Houston found that data centers in Texas will use 49 billion gallons of water in 2025, and as much as 399 billion gallons in 2030. That would be equivalent to drawing down the largest reservoir in the US—157,000-acre Lake Mead—by more than 16 feet in a year.

Anyone who’s accidentally left their phone out in the rain or dropped it in a puddle might wonder what a building full of expensive, delicate electronics could want with millions of gallons of water. It’s largely for cooling purposes. Coursing with electrical current, server stacks can get very hot, and evaporative room cooling is among the simplest and cheapest ways to keep the chips from getting overheated and damaged.

What that means, however, is that the water isn’t just used for cooling and then discharged as treatable wastewater; much of it evaporates in the process—poof.

“Even if they’re using reclaimed or recycled water, that water is no longer going back into the base flow of the rivers and streams,” Bolthouse says. “That has ecological impacts as well as supply issues. Everybody is upstream from someone else.” Washington, DC, for example, will still lose water supply if Northern Virginia data centers use recycled or reclaimed water, because that water won’t make it back into the Potomac River. Evaporative cooling also leaves behind high concentrations of salts and other contaminants, she adds, creating water quality issues.

There are less water-intensive ways to cool data centers, including closed-loop water systems, which require more electricity, and immersion cooling, in which servers are submerged in a bath of liquid, such as a synthetic oil, that conducts heat but not electricity. Immersion cooling allows for a denser installation of servers as well, but is not yet widely used, largely due to cost.

Ironically, it can be hard to confirm specific data about data centers. Given the proprietary nature of AI technology and, perhaps, the potential for public backlash, many companies are less than forthcoming about how much water their data centers consume. Google, for its part, reported using more than 5 billion gallons of water across all its data centers in 2023, with 31 percent of its freshwater withdrawals coming from watersheds with medium or high water scarcity.

A 2023 study by the University of California Riverside estimated that an AI chat session of 20 or so queries uses up to a bottle of freshwater. That amount can vary depending on the platform, with more sophisticated models demanding larger volumes of water, while other estimates suggest it could be closer to a few spoonfuls per query.

“But what goes unacknowledged, from a natural systems perspective, is that all water is local,” says Peter Colohan, director of partnerships and program innovation at the Lincoln Institute, who helped create the Internet of Water. “It’s a small amount of water for a few queries, but it’s all being taken from one basin where that data center is located—that’s thousands and thousands of gallons of water being drawn from one place from people doing their AI queries from all over the world,” he says.

“Wherever they choose to put a data center, it is like a giant soda straw sucking water out of that basin,” Colohan continues. “And when you take water from a place, you have to reduce demand or put water back in that same place, there’s no other solution. In some cases, at least, major data center developers have begun to recognize this problem and are actively engaging in water replenishment where it counts.”

Locating data centers in cooler, wetter regions can help reduce the amount of water they use and the impact of their freshwater withdrawals. And yet roughly two-thirds of the data centers built since 2022 have been located in water-stressed regions, according to a Bloomberg News analysis, including hot, dry climates like Arizona.

The warm water-cooling system at a Sandia Labs data center in Albuquerque, New Mexico. The data center earned LEED Gold certification for efficiency in 2020. Credit: Bret Latter/Sandia Labs via Flickr CC.

It’s not just cooling the server rooms and chips that consumes water. About half of the electricity currently used by US data centers comes from fossil fuel power plants, which themselves use a lot of water, boiling it into steam to turn their massive turbines.

And the millions of microchips processing all that information? By the time they reach a data center, each chip has already consumed thousands of gallons of water. Manufacturing these tiny, powerful computing components requires “ultrapure” treated water to rinse off silicon residue without damaging the chips. It takes about 1.5 gallons of tap water to produce a gallon of ultrapure water, and the typical chip factory uses about 10 million gallons of ultrapure water each day, according to the World Economic Forum—as much as 33,000 US households.
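
The household comparison above can be sanity-checked with simple arithmetic. In this sketch, the roughly 300 gallons-per-household-per-day figure is an assumption implied by the article's 33,000-household comparison, not a number given in the source.

```python
# Back-of-the-envelope check of the chip-factory water figures above.
# ASSUMPTION (not from the article): ~300 gallons of water per US household
# per day, the rate implied by the 33,000-household comparison.

ULTRAPURE_PER_DAY = 10_000_000   # gallons of ultrapure water per day
TAP_PER_ULTRAPURE = 1.5          # gallons of tap water per gallon of ultrapure
HOUSEHOLD_PER_DAY = 300          # assumed gallons per US household per day

tap_needed = ULTRAPURE_PER_DAY * TAP_PER_ULTRAPURE    # tap water consumed daily
households = ULTRAPURE_PER_DAY / HOUSEHOLD_PER_DAY    # equivalent households

print(f"Tap water required: {tap_needed:,.0f} gallons/day")
print(f"Equivalent households: {households:,.0f}")
```

At the 1.5-to-1 ratio, a typical factory's 10 million gallons of ultrapure water consumes about 15 million gallons of tap water each day, and the 33,000-household equivalence checks out.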


As communities weigh the benefits and risks of data center development, we as consumers might also consider our own role in that growth, and whether our use of AI is worth the price of the water, power, and land it devours.

There could be important uses for artificial intelligence—if it can be harnessed to solve complex problems, for instance, or to improve the efficiency of water systems and electric grids.

There are clearly superfluous uses, too. A YouTube channel with 35 million subscribers, for example, features AI-generated music videos … of AI-generated songs. The MIT Technology Review estimates that, unlike simple text queries, using AI to create video content is extremely resource-heavy: Making a five-second AI-generated video uses about as much electricity as running a microwave nonstop for over an hour.

Data center defenders tend to point out that Americans use more water each year to irrigate golf courses (more than 500 billion gallons) and lawns (over 2 trillion gallons) than AI data centers do. But that argument rings hollow: America has a well-documented addiction to green grass that is also not serving us well. The solution, water experts say, lies in water conservation and consumer education, not in comparing one wasteful use to another.

Putting a Finite Resource First

Even a small data center can place an immense, concentrated burden on local infrastructure and natural resources. In Newton County, Georgia, a Meta data center that opened in 2018 uses 500,000 gallons of water per day—10 percent of the entire county’s water consumption. And given Georgia’s cheap power and generous state tax breaks, Newton County continues to field requests for new data center permits—some of which would use up to 6 million gallons of water per day, more than doubling what the entire county currently consumes.

The intense demands that data centers place on regional resources make for complicated decision-making at the local level. Communities and regional water officials must engage in discussions about data centers early on, and with a coordinated, holistic understanding of existing resources and potential impacts on the energy grid and the watershed, says Mary Ann Dickinson, policy director for land and water at the Lincoln Institute. “We would like to help communities make smarter decisions about data centers, helping them analyze and plan for the potential impacts to their community structures and systems.”

“Water is often one of the last things that gets thought about, so one of the things that we’re really promoting is early engagement,” says John Hernon, strategic development manager at Thames Water in the UK. “So when you’re thinking about data centers, it’s not just about the speed you’re going to get, it’s not just about making sure there’s a lot of power available—we need to make sure that water is factored in at the earliest possible thinking … at the forefront, rather than an afterthought.”

Despite its damp reputation, London doesn’t receive a whole lot of rainfall compared to the northern UK—less than 25 inches a year, on average, or roughly half of what falls in New York City. Yet because so much growth is centered on London, the Thames Water service area holds about 80 percent of the UK’s data centers, Hernon says, and another 100 or so are proposed.

What’s more, their water usage peaks during the hottest, driest times of the year, when the utility can least accommodate the extra demand. “That’s why we talk about restricting or reducing or objecting to [data centers],” Hernon says. “It’s not because we don’t like them. We absolutely get it, we need them ourselves. AI will massively help our call center … which means we can have more people out fixing leaks and proactively managing our networks.”

Keeping the Lights On

One way for data centers to use less water is to rely more heavily on air-cooling technology, but this requires more energy—which may in turn increase water use indirectly, depending on the power source. What’s more, regional grids are already struggling to meet the demand of these power-hungry facilities, and there are hundreds more in the works. “A lot of these projects have been announced, but it’s not clear what can come on fast enough to power them,” says Kelly T. Sanders, associate professor of engineering at the University of Southern California.

The government wants US technology companies to build their AI data centers domestically—not just for economic reasons, but for national security purposes as well. But even as the Trump administration appears to understand the enormous energy demands data centers will place on the electric grid, it has actively squashed new wind power projects, such as Revolution Wind off the coast of Rhode Island.

NREL (the National Renewable Energy Laboratory) created this overlay map of transmission lines and data center locations to “help visualize the overlap and simplify co-system planning.” Credit: NREL.gov.

Other carbon-free alternatives like small modular reactors (SMRs) and geothermal energy have bipartisan support, Sanders says. “But the problem is, even if you put shovels in the ground for an SMR today, it’s going to take 10 years,” she says. “The things that we can do the fastest are wind, solar, and batteries. But in the last six months we’ve lost a lot of the incentives for clean energy, and there’s an all-out war on wind. Wind projects that are already built, already paid for, are being canceled. And to me, that’s peculiar, because that’s electricity that would be ready to go out on the grid soon, in some of these regions that are really congested.”

Data centers are among the reasons ratepayers nationwide have seen their electric bills increase at twice the rate of inflation in the past year. Part of that is the new infrastructure data centers will require, such as new power plants, transmission lines, and other investments. Those costs, as well as ongoing grid maintenance and upgrades, are typically shared by all electric customers in a service area through charges added to utility bills.

This creates at least two issues. First, while the tax revenues of a new data center benefit only the host community, the entire electric service area must pay for the associated infrastructure. Second, if a utility makes that huge investment but the data center eventually closes or needs much less electricity than projected, it’s the ratepayers who will foot the bill, not the data center.

Some tech companies are securing their own clean power independent of the grid—Microsoft, for example, signed a 20-year agreement to purchase energy directly from the Three Mile Island nuclear plant. But that approach isn’t ideal either, Sanders says. “These data centers are still going to use transmission lines and all those grid assets, but if they’re not buying the electricity from the utility, they’re not paying for all that infrastructure through their rate bills,” she says.

Aside from generating new power, Sanders says, there are strategies to squeeze more capacity from the existing grid. “One is good old energy efficiency, and the data centers themselves have all of the incentives aligned to try to make their processes more efficient,” she says. AI itself could potentially also help enhance grid performance. “We can use artificial intelligence to give us more information about how power is flowing through the grid, and so we can optimize that power flow, which can give us more capacity than we would have otherwise,” Sanders says.

Another strategy is to make the grid more flexible. Most of the time, and in most regions of the US, we use only about 40 percent of the grid's total capacity, Sanders says, give or take. “We build the capacity of the grid to meet the hottest day … and that’s where we worry about these large data center loads,” she says. A coordinated network of batteries, however—including in people’s homes and EVs—can add flexibility and stabilize the grid during times of peak demand. In July, California’s Pacific Gas and Electric Company (PG&E) conducted the largest-ever test of its statewide “virtual power plant,” using residential batteries to supply 535 megawatts of power to the grid for two full hours at sundown.

With some intentional, coordinated planning—”it’s not just going to happen naturally,” Sanders says—it may be possible to add more capacity without requiring a lot of new generation if data centers can reduce their workloads during peak times and invest in large-scale battery backups: “There is a world in which these data centers can actually be good grid actors, where they can add more flexibility to the grid.”

Confronting Trade-Offs With Land Policy

As the demand for data centers grows, finding suitable locations for these facilities will force communities to confront myriad and imperfect trade-offs between water, energy, land, money, health, and climate. “Integrated land use planning, with sustainable land, water, and energy practices, is the only way we can sustainably achieve the virtuous circle needed to reap the benefits of AI and the economic growth associated with it,” Colohan says.

For example, using natural gas to meet the anticipated electricity load of Texas data centers would require 50 times more water than using solar generation, according to the HARC study, and 1,000 times more water than wind. But while powering new data centers with wind farms would consume the least water, it would also require the most land—four times as much land as solar, and 42 times as much as natural gas.

Absent an avalanche of new, clean power, most data centers are adding copious amounts of greenhouse gases to our collective emissions, at a time when science demands we cut them sharply to limit the worst impacts of climate change. Louisiana regulators in August approved plans to build three new gas power plants to offset the expected electricity demand from Meta’s Hyperion AI data center.

As towns and counties compete with one another to attract data centers, the host communities will reap the tax benefits, while the costs—the intense water demand, the higher electricity bills, the air pollution from backup generators—will be dispersed more regionally, including to areas that won’t see any new tax revenue.

That’s one reason data center permitting needs more state oversight, Bolthouse says. “The only approval that they really have to get is from the locality, and the locality is not looking at the regional impacts,” she says. PEC is also pushing for ratepayer protections and sustainability commitments. “We want to make sure we’re encouraging the most efficient and sustainable practices within the industry, and that we’re requiring mitigation when impacts can’t be avoided.”

Too close for comfort? A data center abuts homes in Loudoun County, Virginia. Credit: Hugh Kenny via Piedmont Environmental Council.

PEC and others are also pressing for greater transparency from the industry. “Very often, data centers are coming in with non-disclosure agreements,” Bolthouse says. “They’re hiding a lot of information about water usage, energy usage, air quality impacts, emissions—none of that information is disclosed, and so communities don’t really know what they’re getting into.”

“We need communities to be educated about what they’re facing, and what their trade-offs are when they let in a data center,” Colohan says. “What is the cost—the true cost—of a data center? And then how do you turn that true cost into a benefit through integrated land policy?”

Rueben says she understands the desire, especially in communities experiencing population loss, to tap into a growing industry. But rather than competing with each other to attract data centers, she says, communities ought to be having broader conversations about job growth and economic development strategies, factoring in the true costs and trade-offs these facilities present, and asking the companies to provide more guarantees and detailed plans.

“Forcing data center operators to explain how they’re going to run the facility more efficiently, and where they’re going to get their water from—and not just assuming that they have first access to the water and energy systems,” she says, “is a shift in perspective that we kind of need government officials to make.”


Jon Gorey is a staff writer at the Lincoln Institute of Land Policy.

Lead image: Data center facilities in Prince William County, Virginia. The county has 59 data centers in operation or under construction. Credit: Hugh Kenny via Piedmont Environmental Council.

Request for Proposals

Exploratory Scenario Planning to Address Water Resilience in Latin America and the Caribbean

Application deadline: November 13, 2025 at 11:59 PM

The Lincoln Institute invites community-based partner organizations in Latin America or the Caribbean to apply to cohost an exploratory scenario planning (XSP) workshop on water resilience in 2026. Selected partners will work with the Lincoln Institute’s Consortium for Scenario Planning to design and deliver a locally grounded workshop that engages stakeholders in exploring a pressing water-related challenge through an immersive, participatory process. The XSP workshop will focus on understanding the impacts of local issues within a specific place or region, exploring multiple plausible futures, and identifying strategies for responding to uncertainty and building long-term water resilience.

The submission guidelines are also available in Spanish.

An FAQ document is also available.


