Topic: Land Use and Zoning

Data Drain: The Land and Water Impacts of the AI Boom

By Jon Gorey, October 17, 2025

A faint hum emanates from deep within a vast, dimly lit tomb, whose occupant devours energy and water with a voracious, inhuman appetite. The beige, boxy data center is a kind of vampire: pale, immortal, thirsty. Shielded from sunlight, active all night. And, like a vampire, at least according to folklore, it can only enter a place where it has been invited.

In states and counties across the United States, lawmakers aren't just opening the door to these metaphorical mechanical monsters. They are actively luring them in with tax breaks and other incentives, eager to collect new municipal revenue and claim a share of the explosive growth surrounding artificial intelligence.

That may sound hyperbolic, but data centers truly devour resources. A midsized data center consumes as much water as a small city, while the largest require up to 18.9 million liters of water every day, as much as a city of 50,000 people.

It also takes a staggering amount of electricity to power and cool the rows of servers. A conventional data center, the kind providing cloud storage for our everyday work documents or streaming video, uses as much electricity as 10,000 to 25,000 homes, according to the International Energy Agency. But a newer, AI-focused "hyperscale" data center can use as much energy as 100,000 homes or more. Meta's Hyperion data center in Louisiana, for example, is expected to consume more than twice as much energy as the entire city of New Orleans once completed. Another Meta data center planned in Wyoming will use more electricity than every home in the state combined.

And of course, unlike actual clouds, data centers require land. A lot of it. Some of the largest data centers being built today will cover hundreds of hectares with impermeable steel, concrete, and paved surfaces (land no longer available for farming, nature, or housing) and will also require new transmission line corridors and other associated infrastructure.

Data centers have been part of our built landscape for more than a decade, however; many are tucked away in unassuming office parks, quietly processing our web searches and storing our cell phone photos. So why the sudden concern? Artificial intelligence tools trained on large language models, such as OpenAI's ChatGPT, use exponentially more computing power than traditional cloud services. And the biggest tech companies, including Amazon, Meta, Google, and Microsoft, are making swift and sizable investments in AI.

Between 2018 and 2021, the number of US data centers more than doubled, and, fueled by AI investments, that number has already doubled again. Early in the AI boom, in 2023, US data centers consumed 176 terawatt-hours of electricity, about as much as the entire nation of Ireland (whose electric grid already runs near full capacity, prompting data centers there to rely on polluting off-grid generators), and that consumption is expected to double or even triple as soon as 2028.

This rapid proliferation can put enormous strain on local and regional resources, burdens that many host communities don't fully account for or aren't prepared to absorb.

"The demand for data centers and processing has just exploded exponentially because of AI," says Kim Rueben, former senior adviser for tax systems at the Lincoln Institute of Land Policy. Virginia and Texas have long offered tax incentives to attract new data centers, she explains, and "other states are jumping on the bandwagon" hoping to see economic growth and new tax revenue.

But at a Land Policy and Digitalization conference convened by the Lincoln Institute last spring, Rueben compared the extractive nature of data centers to coal mines. "I don't think places recognize all of the costs," she says.

Yes, Virginia, the Data Is Real

At the conference, Chris Miller, executive director of the Piedmont Environmental Council (PEC), explained how roughly two-thirds of the world's internet traffic passes through Northern Virginia. The region is already home to the densest concentration of data centers anywhere in the world, with some 300 facilities in just a handful of counties. Dozens more are planned or under development, poised to consume the region's available farmland, energy, and water, drawn by a state incentive that saves companies more than $130 million in sales and use taxes each year.

Despite the state-level tax break, data centers contribute significantly to local coffers. In Loudoun County, which has more than 2.5 million square meters of occupied data center space, officials expect total property tax revenue collected from local data centers in fiscal year 2025 to approach $900 million, nearly as much as the county's entire operating budget. The share of revenue derived from data centers has grown so large that the county's board of supervisors is considering adjusting the tax rate to avoid depending so heavily on a single source.

Existing and planned data centers in Northern Virginia. The state has been dubbed "the data center capital of the world." Credit: Piedmont Environmental Council.

While many communities view data centers as an economic boon because of the tax revenue, the facilities themselves aren't major long-term job creators. Most of the jobs they generate are rooted in constructing the data centers, not in their ongoing operation, and are therefore largely temporary.

Decades ago, PEC supported some data center development in Northern Virginia, says Julie Bolthouse, PEC's director of land use policy. But the industry has changed dramatically since then. For example, when AOL was headquartered in what is now known as Data Center Alley, the company's data center was a small part of a larger campus that "had walking trails around it, tennis courts, basketball courts … at its peak, the company had 5,300 employees on site," Bolthouse says. That facility has been demolished, and three large data centers are being built in its place. "There's a big fence around it for security, so it's now fully walled off from the community, and it's only going to employ 100 to 150 people on that same land. That's the difference."

Utility use has also become "massive," Bolthouse adds. "Each one of those buildings is using a city's worth of energy, so there are huge consequences for the electric infrastructure in our communities. All the transmission lines that have to be built, the eminent domain to acquire the land for those transmission lines, all the energy infrastructure, the gas plants, the pipelines carrying the gas, the associated air pollution, the climate impacts tied to all of the above."

Across Northern Virginia, each of the thousands of on-site diesel generators, each the size of a railroad car, gives off diesel fumes, creating air quality problems. "I don't know of any other land use that uses as many generators as a data center," Bolthouse says. And while those generators are officially classified as emergency backup power, data centers can run them to "meet demand" for 50 hours at a time, she adds. "Locally, the air is heavily polluted. It's particulate matter and NOx [nitrogen oxides], which affects children's lung development, can trigger asthma, and exacerbates heart disease and other underlying conditions in older adults."

And then there's the question of water.

"Like a Giant Soda Straw"

A study by the Houston Advanced Research Center (HARC) and the University of Houston found that data centers in Texas will use some 185.5 billion liters of water in 2025, and up to 1.5 trillion liters by 2030. That's the equivalent of lowering the water level of the largest reservoir in the US (the 63,500-hectare Lake Mead) by more than 4.88 meters in a single year.

Anyone who has accidentally left their phone out in the rain or dropped it in a puddle might wonder what a building full of delicate, expensive electronics could possibly want with millions of liters of water. It's largely for cooling. Surging with electric current, servers can get very hot, and evaporative room cooling is one of the simplest and cheapest ways to keep chips from overheating and getting damaged.

But that means the water isn't simply used for cooling and then discharged as treatable wastewater: much of it evaporates in the process. Poof.

"Even if they use reclaimed or recycled water, that water is no longer returning to the base flow of rivers and streams," Bolthouse says. "There are ecological impacts and supply issues. Somebody is always downstream in the watershed." Washington, DC, for example, will still lose water supply if Northern Virginia's data centers use recycled or reclaimed water, because that water won't return to the Potomac River. Evaporative cooling also leaves behind high concentrations of salts and other contaminants, she adds, creating water quality problems.

There are ways to cool data centers that use less water, including closed-loop water systems, which require more electricity, and immersion cooling, in which servers are submerged in a bath of liquid, such as a synthetic oil, that conducts heat but not electricity. Immersion cooling also allows servers to be packed more densely, but it's not yet widely used, largely because of cost.

Ironically enough, it can be hard to confirm data about data centers. Given the proprietary nature of AI technology, and perhaps the potential for public backlash, many companies aren't very forthcoming about how much water their data centers consume. Google, for its part, reported using more than 18.9 billion liters of water across its data centers in 2023, with 31 percent of its freshwater withdrawals coming from watersheds with medium or high water scarcity.

A 2023 study by the University of California, Riverside, estimated that an AI chat session of about 20 queries consumes up to a bottle's worth of fresh water. That amount can vary by platform, with more sophisticated models demanding more water, while other estimates suggest the figure may be closer to a few teaspoons per query.
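Those two estimates are closer than they might sound. A quick sketch, assuming a 500 mL bottle and a 5 mL teaspoon (neither size appears in the article), shows they land in the same range:

```python
# Rough comparison of the two per-query water estimates cited above.
# Assumed sizes: a 500 mL water bottle and a 5 mL teaspoon; both are
# illustrative assumptions, not figures from the studies themselves.

bottle_ml = 500
queries_per_session = 20
per_query_bottle_ml = bottle_ml / queries_per_session  # 25.0 mL per query

teaspoon_ml = 5
per_query_teaspoons_ml = 3 * teaspoon_ml  # "a few teaspoons" ~ 15 mL

print(per_query_bottle_ml, per_query_teaspoons_ml)  # 25.0 15
```

In other words, the "bottle per chat" and "teaspoons per query" framings differ by well under an order of magnitude.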

"But what goes unrecognized, from a natural systems perspective, is that all water is local," says Peter Colohan, director of program innovation and partnerships at the Lincoln Institute, who helped create the Internet of Water. "It's a small amount of water for a couple of queries, but it's all drawn from a watershed near that data center: thousands and thousands of liters of water extracted from a single place, because people all over the world are making their AI queries," he says.

"Wherever they choose to put a data center, it's like a giant soda straw sucking water out of that watershed," Colohan continues. "And when you take water from a place, you have to reduce demand or replenish the water in that same place: there's no other solution. In some cases, at least, major data center developers have begun to recognize this problem and are actively engaged in replenishing water where it's needed."

Siting data centers in cooler, wetter regions can help reduce how much water they use and the impact of their freshwater withdrawals. Yet about two-thirds of the data centers built since 2022 have been located in water-stressed regions, according to an analysis by Bloomberg News, including hot, dry climates like Arizona's.

A warm-water cooling system at a Sandia Labs data center in Albuquerque, New Mexico. The data center earned LEED Gold certification for efficiency in 2020. Credit: Bret Latter/Sandia Labs via Flickr CC.

It's not just cooling server rooms and chips that consumes water. Nearly half the electricity used by US data centers today comes from fossil fuel power plants, which themselves use a lot of water as they heat steam to spin massive turbines.

And what about the millions of microchips processing all that information? By the time they reach a data center, each chip has already consumed thousands of liters of water. Manufacturing these small but powerful computing components requires treated "ultrapure" water to rinse away silicon residue without damaging the chips. It takes about 5.6 liters of tap water to produce 3.8 liters of ultrapure water, and a typical chip fabrication plant uses about 37.8 million liters of ultrapure water a day, according to the World Economic Forum, equivalent to the water use of 33,000 US households.
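A back-of-envelope calculation from those figures (5.6 L of tap water per 3.8 L of ultrapure water; 37.8 million L of ultrapure water a day) shows how much raw tap water a fab actually draws:

```python
# Back-of-envelope check of the chip fabrication figures cited above.
# Inputs are the article's numbers; the derived values below follow
# directly from them.

TAP_PER_ULTRAPURE_L = 5.6 / 3.8        # tap liters per ultrapure liter
FAB_ULTRAPURE_L_PER_DAY = 37_800_000   # daily ultrapure demand, liters

tap_l_per_day = FAB_ULTRAPURE_L_PER_DAY * TAP_PER_ULTRAPURE_L
print(f"Tap water drawn daily: {tap_l_per_day / 1e6:.1f} million liters")

# Implied per-household daily use, if 37.8M liters equals 33,000 homes:
print(f"Implied household use: {FAB_ULTRAPURE_L_PER_DAY / 33_000:.0f} L/day")
```

That works out to roughly 55.7 million liters of tap water per day per fab, and an implied household figure of about 1,145 liters a day, consistent with typical US household consumption.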


As communities weigh the benefits and risks of data center development, we consumers might also consider our own role in data center growth, and whether our use of AI is worth the water, energy, and land it devours.

There could be important uses for artificial intelligence, if it can be harnessed, say, to solve complex problems or to improve the efficiency of water systems and electric grids.

Other uses are clearly more frivolous. One YouTube channel with 35 million subscribers, for example, features AI-generated music videos … of AI-generated songs. MIT Technology Review estimates that, unlike simple text queries, using AI to create video content consumes an extreme amount of resources: making a single five-second AI-generated video uses almost as much electricity as running a microwave nonstop for more than an hour.

Data center advocates tend to point out that Americans use more water each year watering golf courses (more than 1.89 trillion liters) and lawns (more than 7.57 trillion liters) than AI data centers do. But that argument rings hollow: America's well-known obsession with green grass isn't helping either. The answer, water experts say, lies in water conservation and consumer education, not in comparing one form of waste to another.


Prioritizing a Limited Resource

Even a small data center can place an immense, concentrated burden on local infrastructure and natural resources. In Newton County, Georgia, a Meta data center that opened in 2018 uses 1.89 million liters of water per day, 10 percent of the entire county's water consumption. And given Georgia's cheap energy and generous state tax breaks, Newton County continues to grant new data center permit applications, some of which would use up to 22.71 million liters of water per day, more than double what the entire county currently consumes.

The intense demands data centers place on regional resources complicate local decision-making. Communities and regional water officials need to enter data center discussions early, with a coordinated, holistic understanding of existing resources and the potential impacts on the energy grid and watershed, says Mary Ann Dickinson, the Lincoln Institute's director of land and water policy. "We'd like to help communities make smarter decisions about data centers by helping them analyze and plan for the potential impacts on their community structures and systems."

"Water is often one of the last things people think about, so we're really pushing for early engagement, among other things," says John Hernon, strategic development manager at Thames Water in the United Kingdom. "When you think about data centers, it's not just about the speed you'll get, it's not just about making sure there's plenty of power available; you need to make sure water is considered as early as possible … up front, and not as an afterthought."

Despite its rainy reputation, London doesn't get much rainfall compared to the north of the UK: less than 635 mm a year, on average, or about half of what falls on New York City. Yet because so much growth is centered on London, Thames Water's service area is home to about 80 percent of the UK's data centers, Hernon adds, with roughly 100 more proposed.

What's more, water use peaks during the hottest, driest times of year, when the utility has the least capacity to meet additional demand. "That's why we talk about restricting, reducing, or objecting to [data centers]," Hernon says. "It's not because we don't like them. Make no mistake, we need them too. AI is going to be a huge help for our call center … which means we can put more people on fixing leaks and proactively managing our networks."

Keeping the Lights On

One way for data centers to use less water is to rely more on air-cooling technology, but that requires more energy, which in turn can indirectly increase water use, depending on the power source. Meanwhile, regional grids are already struggling to meet the demand of these power-hungry facilities, with hundreds more in the pipeline. "A lot of these projects have been announced, but it's unclear what kind of power source can come online fast enough to feed them," says Kelly T. Sanders, an associate professor of engineering at the University of Southern California.

The government wants American tech companies to build their AI data centers on domestic soil, for reasons of national security as well as economics. But even as the Trump administration seems to grasp the enormous energy demands data centers will place on the electric grid, it has actively quashed new wind energy projects, such as Revolution Wind off the coast of Rhode Island.

The National Renewable Energy Laboratory (NREL) created this map overlaying transmission lines and data center locations to "help visualize overlap and streamline ecosystem planning." Credit: NREL.gov.

Other carbon-free alternatives, such as small modular reactors (SMRs) and geothermal energy, have bipartisan support, Sanders says. "But the problem is, even if you started building an SMR today, it's a 10-year process," she adds. "The sources we can count on fastest are wind, solar, and batteries. But in the last six months we've lost many of the incentives for clean energy, and there's been a war on wind. Wind projects that are already built and paid for are being canceled. And I find that peculiar, because that's the electricity that would soonest be ready to go onto the grid, in some of these regions that are really congested."

Data centers are among the reasons ratepayers across the country have seen their electric bills rise at twice the rate of inflation over the past year. Part of that stems from the new infrastructure data centers will require, such as new power plants, transmission lines, and other investments. Those costs, along with ongoing grid maintenance and upgrades, are typically shared among all electric customers in a service area through charges added to utility bills.

This creates at least two problems. While the tax revenue from a new data center benefits only the host community, the entire electric service area has to pay for the associated infrastructure. And if a utility makes that big investment but the data center eventually closes, or ends up needing far less electricity than projected, it's ratepayers who foot the bill, not the data center.

Some tech companies are securing their own clean power independent of the grid: Microsoft, for example, signed a 20-year agreement to buy power directly from the Three Mile Island nuclear plant. But that approach isn't ideal either, Sanders says. "These data centers are still going to use transmission lines and all the grid assets, but if they're not buying their electricity from the utility, they're not paying for all that infrastructure through their bills," she adds.

Beyond generating new power, Sanders explains, there are strategies for squeezing more capacity out of the existing grid. "One is good old-fashioned energy efficiency, and data centers themselves have every incentive to try to make their processes more efficient," she says. AI itself could also help improve grid performance. "We can use artificial intelligence to get more insight into how power flows across the grid, so we can optimize that power flow, which can give us more capacity than we'd otherwise have," Sanders adds.

Another strategy is to make the grid more flexible. Most of the time, in most regions of the US, we only use about 40 percent of the grid's total capacity, Sanders explains, roughly speaking. "We build grid capacity so it can handle demand on the hottest day … and that's where we get worried about these big data center loads," she says. But a coordinated network of batteries, including those in people's homes and electric vehicles, can add flexibility and stabilize the grid during times of peak demand. In July, California's Pacific Gas and Electric Company (PG&E) conducted the largest-ever test of its statewide "virtual power plant," using residential batteries to deliver 535 megawatts of power to the grid for two full hours at sunset.
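To put that test in perspective, a minimal sketch of the arithmetic, assuming a typical home battery discharges at about 5 kW (an assumed rating, not a figure from the article):

```python
# Sizing the PG&E virtual power plant test described above: 535 MW
# sustained for two hours. The 5 kW per-battery discharge rate below
# is an assumed typical figure for a residential battery.

power_mw = 535
duration_h = 2
energy_mwh = power_mw * duration_h
print(f"Energy delivered: {energy_mwh} MWh")  # 1070 MWh

batteries = power_mw * 1_000 / 5  # batteries needed at 5 kW each
print(f"~{batteries:,.0f} participating batteries at 5 kW")
```

Under those assumptions, the test would represent roughly 100,000 coordinated home batteries delivering over a gigawatt-hour of evening energy.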

With some intentional, coordinated planning ("it's not going to happen naturally," Sanders says), it may be possible to add more capacity without requiring huge amounts of new generation, if data centers can reduce their workloads during peak hours and invest in large-scale battery backup: "There's a scenario where these data centers can be a good actor with respect to the grid and add more flexibility."

Confronting the Trade-Offs with Land Policy

As demand for data centers grows, the search for suitable locations will force communities to confront a host of unenviable choices among water, energy, land, money, health, and climate. "Integrated land use planning, with sustainable land, water, and energy practices, is the only way we can sustainably achieve the virtuous cycle needed to reap the benefits of AI and the associated economic growth," Colohan says.

For example, using natural gas to meet the anticipated electric load of Texas data centers would require 50 times more water than using solar power, according to the HARC study, and 1,000 times more water than wind. But while powering new data centers with wind farms would consume the least water, it would also require the most land: four times more land than solar generation and 42 times more than natural gas.
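Those ratios can be normalized against a common baseline to make the trade-off easier to see. A small sketch, using only the relative multipliers quoted above (the study's absolute water and land quantities are not given here):

```python
# Normalizing the HARC trade-off ratios quoted above, relative to
# natural gas. Only ratios appear in the article, so these values
# are unitless: water use and land use per unit of generation,
# each expressed as a multiple of the natural gas figure.

water_vs_gas = {"gas": 1.0, "solar": 1 / 50, "wind": 1 / 1000}
land_vs_gas = {"gas": 1.0, "wind": 42.0, "solar": 42.0 / 4}  # wind = 4x solar

for source in ("gas", "solar", "wind"):
    print(f"{source:>5}: water {water_vs_gas[source]:.3f}x, "
          f"land {land_vs_gas[source]:.1f}x")
```

The inversion is stark: wind uses a thousandth of the water but 42 times the land, with solar sitting between the two on both axes.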

Absent a flood of new clean energy, most data centers are adding large amounts of greenhouse gases to our collective emissions, at a time when science demands we cut them drastically to limit the worst impacts of climate change. In August, Louisiana regulators approved plans to build three new gas power plants to offset the expected electricity demand of Meta's Hyperion AI data center.

As cities and counties compete with one another to attract data centers, host communities will reap the tax benefits, but the costs (the intense water demands, higher electric bills, and air pollution from backup generators) will be spread across the region, including to areas that won't see any new tax revenue.

That's one reason data center permitting needs more state oversight, Bolthouse says. "The only approval they really have to get is from the locality, and the locality isn't considering the regional impacts," she adds. PEC is also pushing for ratepayer protections and sustainability commitments. "We want to make sure we're encouraging the most efficient and sustainable practices within the industry, and requiring mitigation when impacts can't be avoided."

Too close for comfort? A data center abuts homes in Loudoun County, Virginia. Credit: Hugh Kenny via the Piedmont Environmental Council.

PEC and others are also pushing for greater transparency from the industry. "Very often, data centers come in with nondisclosure agreements," Bolthouse says. "They withhold a lot of information about water and energy use, air quality impacts, emissions; none of that information is disclosed, so communities don't really know what they're getting into."

"Communities need to be educated about what they're facing and what their trade-offs are when they let in a data center," Colohan says. "What is the true cost of a data center? And then, how do you turn that true cost into a benefit through integrated land policy?"

Rueben says she understands the desire to capitalize on a growing industry, especially in communities experiencing population loss. But rather than competing with one another to attract data centers, she adds, communities should be having broader conversations about job growth and economic development strategies, weighing the true costs and trade-offs these facilities represent, and asking companies to provide more guarantees and detailed plans.

"Getting data center operators to explain how they'll run facilities more efficiently and where their water will come from, rather than just assuming they get priority access to water and power systems," she says, "is a shift in perspective we need government officials to make."


Jon Gorey is a staff writer at the Lincoln Institute of Land Policy.

Lead image: data center facilities in Prince William County, Virginia. The county has 59 data centers in operation or under construction. Credit: Hugh Kenny via the Piedmont Environmental Council.

A(lready) D(esigned for) U

Preapproved design plans for accessory dwelling units, or ADUs, can help accelerate new housing in established neighborhoods.
By Jon Gorey, February 3, 2026

We have a severe affordable housing shortage in the United States — an urgent need for millions of additional homes. But exacerbating that housing shortage is a housing mismatch.

In much of the US, existing residential neighborhoods — the places where people already like to live, near their jobs, friends, and family members, and that are already served by utilities, transit, and other infrastructure — are overwhelmingly, and often exclusively, composed of single-family homes. While a four-bedroom Colonial can make good sense for a high-income family of five, it shouldn’t be the only housing option available in a community, given the kaleidoscopic variety of humanity and its households, from aging seniors to young adults to single parents.

“We’re going to have more people over the age of 65 than under 18 in the next decade,” says Rodney Harrell, AARP’s vice president of family, home, and community. The organization has a long history of advocating for better housing conditions and options for seniors. “People want to be near grocery stores, parks, libraries, transportation options — things that make them feel connected. But one of the challenges is that people want to stay in their existing neighborhoods, and there aren’t enough options there.”

Adding new housing options to existing communities, however, routinely elicits complaints about changes to the “neighborhood character.” This loaded phrase can contain exclusionary attitudes and bad-faith arguments within its ample ambiguity, but it can also be a response to dubious development decisions. A homeowner in a neighborhood of century-old Craftsman bungalows may understandably be put off by the idea of a sleek new seven-story steel and concrete building on the corner.

Therein lies the appeal of the humble accessory dwelling unit, or ADU — more colloquially known as an in-law apartment, carriage house, secondary suite, or casita, among other aliases. By converting a garage, attic, or basement to a separate apartment, or adding a small, detached cottage to a backyard, homeowners can create an extra space for family members or a small rental property that helps generate income. At the same time, they help increase the supply of affordable and accessible housing options in their neighborhood — without a dramatic impact to the local aesthetic. And making it easier for homeowners to do that can help communities everywhere address the local and national housing crunch.

Over the past decade, many cities and some states have relaxed decades-old restrictions on ADUs. California, for example, legalized ADUs on all single-family lots in 2017; a few years later, the nearly 27,000 ADUs permitted statewide in 2023 represented a 20-fold increase over 2016, and comprised more than 20 percent of all new housing permitted. In 2024, Los Angeles alone granted permits for more than 6,000 ADUs.

That’s not enough to singlehandedly solve California’s housing crisis — no one step is. But it’s certainly one piece of the puzzle, and a solution that many communities can get behind.

Still, making it legal to build an ADU at all is just the first hurdle. Making it easier for someone to accomplish is the next step — one that cities can assist in by removing unnecessary barriers.

For example, to encourage and accelerate the adoption of ADUs, many cities across the US and Canada have begun offering residents access to preapproved design plans for detached ADUs — complete technical schematics that have already been reviewed by building officials.

“The system can be a little bit stacked against the local homeowner who wants to be able to do this,” Harrell says. Between site reviews, utility plans, and architectural approvals, “there are so many things that you have to go through that you’re doing for the first time,” he adds. “Having these preapproved designs takes away one of those barriers. It says, ‘You don’t have to be a designer, or have enough money to hire one. Here are some designs that can work.’”

Preapproved ADU Plans in California

Los Angeles offers residents a growing catalog of preapproved ADU plans, including a standard one-bedroom architectural plan commissioned by the city, called the YOU-ADU (pictured), that any resident of Los Angeles can use for free.

Dozens of other plans are also preapproved, but require a modest licensing fee paid to their respective architects, most of whom can also be hired for site-specific consultations.

While a preapproved ADU plan already meets certain city codes (e.g. building, fire, and energy regulations), and thus can advance through the plan-check and permitting process more quickly than a custom design, it doesn’t mean a homeowner can just plop one in their backyard with no questions asked. There are still site-specific approvals required, such as land use or stormwater reviews.

But using a preapproved plan can shave weeks or even months off the process, and offers predictability for both homeowners and local officials. The efficiency of a standard design can also create cost savings.

“Custom plans not only take more time and money to design, they’re much more complex to deliver in the field,” says Whitney Hill, co-founder and chief executive of SnapADU in Southern California, whose standard design plans have been selected for preapproval in multiple cities around San Diego.

All of that drives up prices, she adds, noting that a fully custom ADU typically costs $30,000 to $50,000 more to build than a standard one of the same size and bed-and-bath count. “On the other hand, plans that we have built before have already been vetted for real-world constraints; we know we can build them efficiently.”

Hill says that faster permitting times on standard designs can also translate to lower costs. “Building an ADU in 12 months versus 18 months is far more economical from an overhead cost perspective for us,” she says. “We share that savings with the homeowner.”

Even when using a preapproved plan, homeowners should still be prepared for site-specific costs and work, she notes. “It’s critical to understand your site’s topography, existing utility locations, and existing utility loads,” she says. Some projects may require water service upgrades to accommodate an additional bathroom, for example, or an upgraded electrical panel—both of which can be costly.

But one of the biggest benefits to using a standard design, Hill says, is the predictability. “Build costs for an existing floor plan are available before you even kick off your own project,” Hill says, “[which] is great for homeowners who are trying to stick to a specific budget.”

Seattle’s ADUniverse

While Washington State recently passed legislation requiring cities to allow four homes on all residential lots (and six units near transit), Seattle began embracing ADUs over a decade ago, loosening some local restrictions that stood in the way of their adoption, such as minimum lot sizes. “That was an important first step, and a viable one, because land use regulations are what the city most directly controls,” says Nicolas Welch, senior planner in Seattle’s Office of Planning and Community Development.

Still, most homeowners have little if any experience with housing development, so the idea of hiring an architect and applying for permits to build a backyard cottage can feel overwhelming — even before the considerable cost involved. Seattle soon decided it should do more than simply improve its regulations, and developed a resource-rich website called ADUniverse.

“The site was meant to provide all the resources that a homeowner might need in one place with better, clearer information for folks who are basically trying to take on development for the first time without a background in it,” Welch says. “Offering some preapproved designs was one component of that, as well as letting them look up their property to see what’s actually feasible on their lot.”

The city invited architects to submit their ADU designs and then had a jury select 10 plans — out of about 150 submissions — to get preapproved by the building department. In the five years since, Welch says, “Some 350 permits have been issued for the preapproved designs,” or roughly 10 percent of all ADUs approved in that time; the city now permits an average of about 900 new ADUs per year.

“On the one hand, it’s a very small number in a city and county that has a shortage of hundreds of thousands of units, so I do think it’s important to right-size the expectations,” Welch says. “It’s very small and incremental. But it’s also hundreds of units that now exist, and that people are living in.”

Using a preapproved plan noticeably speeds up the early permitting process, Welch says: “If you don’t have something weird going on, like you’re on a very steep slope or you’re removing a gigantic tree or something, then you’ll get your permit in two to six weeks, rather than three or four months.”

While celebrating Washington’s statewide dissolution of single-family exclusive zoning, Seattle’s Director of Planning and Community Development, Rico Quirindongo, acknowledges that such a sea change in policy can also hasten gentrification pressure by opening up a new market.

“The challenge of gentrification in cities — and Seattle is no exception — is that an upzone happens, property values go up, property taxes go up, and then low- and middle-income families do not see the benefit of the upzone, they only feel the burden,” Quirindongo explains. “An opportunistic developer says, ‘I can buy you out for 10 percent over asking, and then you don’t have to worry about this anymore, you can go live somewhere else.’ That is how we have seen the Central District, a traditionally African American community here in Seattle, go from 75 percent Black families to 10 percent Black families over the last 20 to 25 years,” he says.

Easing the process, and cost, of building an ADU provides an “opportunity for homeowners to be a part of the development opportunity, where they’re building generational wealth,” he says. Whether a homeowner uses an ADU to generate long-term rental income or to house an aging relative or grown children, it can help them stay in their neighborhood and share in the benefits of local growth. “They are building a multi-generational campus that is their house and property. And you’re creating infill, missing middle housing, that is consistent with the context and feel of historic neighborhoods.”

Still, even if future rental income from an ADU might offset the cost of a construction or home equity loan, building one typically requires significant upfront investment. So Quirindongo helped devise a unique pilot program intended to open up the opportunity to more lower-income residents. Here’s how it works:

1. Selected homeowners (the pilot will begin with 10 parcels) will enter a partnership with the city and a developer, who will take out a 12-year ground lease on a portion of the homeowner’s lot.
2. In the first two years, the developer builds two detached ADUs in the homeowner’s backyard, at no cost to the homeowner.
3. The developer then rents out and manages both ADUs for 10 years. The developer keeps about half of the rental income, while the other half is split: a portion provides monthly revenue to the homeowner, while the rest is deposited in a set-aside account.
4. At the end of those 10 years of rental operation, the 12-year ground lease expires, and there’s enough money in that account to buy out the developer’s remaining interest and make them whole, so the homeowner ends up with two ADUs on their property, which they can continue to rent out or convey with the property should they sell their home. “Over that period of time, the homeowner builds up enough money in that account to buy out the partner, so they own those units outright after that 12-year period,” Quirindongo explains.
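As an illustrative sketch, the cash-flow mechanics of the steps above can be simulated with assumed numbers. The article specifies only the structure (the developer keeps roughly half the rent, and the remainder is split between homeowner income and a buyout set-aside), so the rent and the exact split fractions below are hypothetical:

```python
# Illustrative sketch of the Seattle pilot's cash-flow mechanics.
# The program's documented structure: the developer keeps about half
# the rent; the remainder is split between monthly homeowner income
# and a set-aside account that funds the eventual buyout. Every
# number below (rent, split fractions) is an assumption for
# demonstration, not a figure from the pilot itself.

MONTHLY_RENT_PER_ADU = 2_000   # assumed market rent per unit
UNITS = 2                      # the pilot builds two detached ADUs
RENTAL_YEARS = 10              # years the developer rents them out

DEVELOPER_SHARE = 0.50         # "about half" goes to the developer
HOMEOWNER_SHARE = 0.25         # assumed share of the other half
SET_ASIDE_SHARE = 0.25         # assumed share accumulating toward the buyout

def simulate(rent=MONTHLY_RENT_PER_ADU, units=UNITS, years=RENTAL_YEARS):
    """Accumulate the set-aside and homeowner income month by month."""
    set_aside = homeowner_income = 0.0
    for _ in range(years * 12):
        gross = rent * units
        set_aside += gross * SET_ASIDE_SHARE
        homeowner_income += gross * HOMEOWNER_SHARE
    return set_aside, homeowner_income

set_aside, income = simulate()
print(f"Set-aside after {RENTAL_YEARS} years: ${set_aside:,.0f}")
print(f"Homeowner income over that period: ${income:,.0f}")
```

Under these assumed inputs, both the set-aside account and the homeowner’s cumulative income reach $120,000 over the 10 rental years; the pilot’s actual splits and buyout terms would determine the real figures.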

Preapproved ADU Plans in Oregon

Beyond creating unobtrusive infill housing, ADUs are, almost by definition, small — and thus inherently more affordable than most new single-family homes, which averaged 2,405 square feet in the third quarter of 2025.

In Oregon, Portland’s Residential Infill Project has yielded more than 1,400 new permits for ADUs and missing middle housing in single-family neighborhoods, comprising almost half of new development in the city from 2022 to 2024, even as other construction lagged. Just as importantly, the project capped building sizes in an effort to encourage more small homes instead of fewer large houses — and that has demonstrably improved affordability. In 2023 and 2024, sale prices of new missing middle homes averaged $250,000 to $300,000 less than those of new single-family houses in the same Portland neighborhoods, largely due to their smaller sizes.

In a heartening example of municipal collaboration, Portland was able to borrow and tweak a preapproved plan from the city of Eugene, Oregon—the Joel, shown here—to offer its own residents a set of similar preapproved ADU plans.

Preapproved ADUs in Louisville, Kentucky

AARP published its first model ADU ordinance over two decades ago. Since then, the organization has helped a number of cities, including Louisville, Kentucky, to re-legalize ADUs by right locally, and helped communities hold contests to create free architectural plans for residents.

Louisville invited architects to submit their designs, and then purchased the rights to three preapproved ADU plans, which it offers for free to all residents.

Rodney Harrell, of AARP, says ADUs can enhance freedom for seniors by giving them more and better options in the places they already live. “What I love is that it’s a solution that gives more options to people who want to be in the communities that work best for them,” he says.

“I’ve talked to so many people who are stuck,” Harrell says. “They’ve got a house, and at some point it may have been their dream home, but now it’s become a nightmare. They’ve got too many stairs. Maybe it’s too big and their spouse passed, and they can’t afford it anymore.”

A senior who can no longer manage the stairs in their house can stay in the community they love by building a fully accessible, universally designed ADU in their backyard, he explains, and renting out the main house. “That gives you more freedom,” he says. “If you want to stay in your main house and have a caregiver stay in the ADU, that also gives you more freedom. Or maybe you just need a little bit of money to be able to afford to stay in your house, and maybe you’re able to rent out the ADU and stay in your main house.”

And Beyond

In Seattle, Welch says the city’s efforts to legalize ADUs in single-family neighborhoods helped pave the way for more middle housing (duplexes, triplexes, and fourplexes). “The sky didn’t fall, and so then state legislators felt more emboldened and empowered,” he says.

Many other cities and states across the US and Canada are now embracing ADUs as well, and providing design plans, guidance, and “ADU lookbooks” for residents interested in building one. Here’s a look at just a few preapproved designs offered in cities around North America.

There are dozens more examples across the country, and many cities continue to add new designs to their lists of approved plans. It’s merely one step in the right direction—but it’s a step nonetheless.

“People can be scared of things that are different,” AARP’s Harrell says. “But one thing that always gets me is that the ADU is really an old form of housing in a lot of the country. It’s just that we’re re-legalizing it. We’re making it able to be built again, and up to standards and codes of the modern day. So we shouldn’t put unnecessary barriers in place.”

The Wild West of Data Centers: Energy and water use top concerns


By Anthony Flint, December 18, 2025

It’s safe to say that the proliferation of data centers was one of the biggest stories of 2025, prompting concerns about land use, energy and water consumption, and carbon emissions. The massive facilities, driven by the rapidly increasing use of artificial intelligence, are sprouting up across the US with what critics say is little oversight or long-term understanding of their impacts.

“There is no system of planning for the land use, for the energy consumption, for the water consumption, or the larger impacts on land, agricultural, (forest) land, historic, scenic, and cultural resources, biodiversity,” said Chris Miller, president of the Piedmont Environmental Council, who has been tracking the explosion of data centers in northern Virginia, on the latest episode of the Land Matters podcast.

“There’s no assessment being made, and to the extent that there’s project-level review, there’s a lot of discussion about eliminating most of that to streamline this process. There is no aggregate assessment, and that’s what’s terrifying. We have local land use decisions being made without any information about the larger aggregate impacts in the locality and then beyond.”

Miller appeared on the show alongside Lincoln Institute staff writer Jon Gorey, author of the article Data Drain: The Land and Water Impacts of Data Centers, published earlier this year, and Mary Ann Dickinson, policy director for Land and Water at the Lincoln Institute, who is overseeing research on water use by the massive facilities. All three participated in a two-day workshop earlier this year at the Lincoln Institute’s Land Policy Conference: Responsive and Equitable Digitalization in Land Policy.

There is no federal registration requirement for data centers, and owners can be secretive about their locations for security reasons and competitive advantage. But according to the industry database Data Center Map, there are at least 4,000 data centers across the US, with hundreds more on the way.

A third of US data centers are in just three states, with Virginia leading the way followed by Texas and California. Several metropolitan regions have become hubs for the facilities, including northern Virginia, Dallas, Chicago, and Phoenix.

Data centers, which house computer servers, data storage systems, and networking equipment, as well as the power and cooling systems that keep them running, have become necessary for high-velocity computing tasks. According to the Pew Research Center, “whenever you send an email, stream a movie or TV show, save a family photo to ‘the cloud’ or ask a chatbot a question, you’re interacting with a data center.”

The facilities use a staggering amount of power; a single large data center can gobble up as much power as a small city. The tech companies initially promised to use clean energy, but with so much demand, they are tapping fossil fuels like gas and coal, and in some instances even considering nuclear power.

Despite their outsized impacts, data centers are largely being fast-tracked, in many cases overwhelming local community concerns. They’re getting tax breaks and other incentives to build with breathtaking speed, alongside a major PR effort that includes television ads touting the benefits of data centers for the jobs they provide, in areas that have been struggling economically.

Listen to the show here or subscribe to Land Matters on Apple Podcasts, Spotify, Stitcher, YouTube, or wherever you listen to podcasts.

 


Further Reading

Supersized Data Centers Are Coming. See How They Will Transform America | The Washington Post

Thirsty for Power and Water, AI-Crunching Data Centers Sprout Across the West | Bill Lane Center for the American West

Project Profile: Reimagining US Data Centers to Better Serve the Planet in San Jose | Urban Land Magazine

A Sustainable Future for Data Centers | Harvard John A. Paulson School of Engineering and Applied Sciences

New Mexico Data Center Project Could Emit More Greenhouse Gases Than Its Two Largest Cities | Governing magazine

  


Anthony Flint is a senior fellow at the Lincoln Institute of Land Policy, host of the Land Matters podcast, and a contributing editor of Land Lines. 


Transcript

Anthony Flint: Welcome back to the Land Matters Podcast. I’m your host, Anthony Flint. I think it’s safe to say that the proliferation of data centers was one of the biggest stories of 2025, and at the end of the day, it’s a land use story braided together with energy, the grid, power generation, the environment, carbon emissions, and economic development – and, the other big story of the year, to be sure, artificial intelligence, which is driving the need for these massive facilities.

There’s no federal registration requirement for data centers, and sometimes owners can be quite secretive about their locations for security reasons and competitive advantage. According to the industry database Data Center Map, there are at least 4,000 data centers across the US. Some would say that number is closer to 5,000, but unquestionably, there are hundreds more on the way.

A third of US data centers are in just three states, with Virginia leading the way, followed by Texas and California. Several metropolitan regions have become hubs for these facilities, including Northern Virginia, Dallas, Chicago, and Phoenix, and the sites tend to grow by accretion: half of the data centers currently being built are part of a preexisting large cluster, according to the International Energy Agency.

These are massive buildings housing computer servers, data storage systems, and networking equipment, as well as the power and cooling systems that keep them running. That’s according to the Pew Research Center, which points out that whenever you send an email, stream a movie or TV show, save a family photo to the cloud, or ask a chatbot a question, you’re interacting with a data center. They use a lot of power, which the tech companies initially promised would be clean energy, but now, with so much demand, they’re turning largely to fossil fuels like gas and even coal, and in some cases, considering nuclear power.

A single large data center can gobble up as much power as a small city, and they’re largely being fast-tracked, in many cases, overwhelming local community concerns. They’re getting tax breaks and other incentives to build with breathtaking speed, and there’s a major PR effort underway to accentuate the positive. You may have seen some of those television ads touting the benefits of data centers, including in areas that have been struggling economically.

To help make sense of all of this, I’m joined by three special guests: Jon Gorey, author of the article Data Drain: The Land and Water Impacts of Data Centers, published earlier this year in Land Lines magazine; Mary Ann Dickinson, Policy Director for Land and Water at the Lincoln Institute; and Chris Miller, President of the Piedmont Environmental Council, who’s been tracking the explosion of data centers in Northern Virginia.

Well, thank you all for being here on Land Matters, and Jon, let me start with you. You’ve had a lot of experience writing about real estate and land use and energy and the environment. Have you seen anything quite like this? What’s going on out there? What were your takeaways after reporting your story?

Jon Gorey: Sure. Thank you, Anthony, for having me, and it’s great to be here with you and Mary Ann, and Chris too. I think what has surprised me the most is the scale and the pace of this data center explosion and the AI adoption that’s feeding it. When I was writing the story, I looked around the Boston area to see if there was a data center that I could visit in person to do some on-the-ground reporting.

It turns out we have a bunch of them, but they’re mostly from 10, 20 years ago. They’re pretty small. They’re well-integrated into our built environment. They’re just tucked into one section of an office building or something next to a grocery store. They’re doing less intensive tasks like storing our emails or cell phone photos on the cloud. The data centers being built now to support AI are just exponentially larger and more resource-intensive.

For example, Meta is planning a 715,000-square-foot data center outside the capital of Wyoming, which is over 16 acres of building footprint by itself, not even counting the grounds around it. That will itself use more electricity than every home in Wyoming combined. That’s astonishing. The governor there touted it as a win for the natural gas industry locally. They’re not necessarily going to supply all that energy with renewables. Then there’s just the pace of it. Between 2018 and 2021, the number of US data centers doubled, and then it doubled again by 2024.

In 2023, when most people were maybe only hearing about ChatGPT for the first time, US data centers were already using as much electricity as the entire country of Ireland. That’s poised to double or triple by 2028. It’s happening extremely fast, and they are extremely big. One of the big takeaways from the research, I think, was how this creates this huge cost-benefit mismatch between localities and broader regions like in Loudoun County, Virginia, which I’m sure Chris can talk about.

The tax revenue from data centers, that’s a benefit to county residents. They don’t have to shoulder as much of the bills for schools and other local services. The electricity and the water and the infrastructure and the environmental costs associated with those data centers are more dispersed. They’re spread out across the entire utilities service area with higher rates for water, higher electric rates, more pollution. That’s a real discrepancy and it’s happening pretty much anywhere one of these major data centers goes up.

Anthony Flint: Mary Ann Dickinson, let’s zoom in on how much water these data centers require. I was surprised by that. In addition to all the power they use, I want to ask you, first of all, why do they need so much water, and where is it coming from? In places like the Southwest, water is such a precious resource that’s needed for agriculture and people. It seems like there’s a lot more work to be done to make this even plausibly sustainable.

Mary Ann Dickinson: Well, water is the issue of the day right now. We’ve heard lots of data center discussion about energy; that’s primarily been the focus of a lot of media reporting during 2025. Water is now emerging as an issue that is dwarfing a lot of local utility systems. Data centers use massive amounts of water, anywhere between 3 and 5 million gallons a day, and it’s primarily, to answer your question, for cooling. It’s a much larger draw than most large industrial water users place on a community water system.

The concern is that if the data centers are tying into local water utilities, which they prefer because of the affordability and the reliability and the treatment of the supply, that can easily swamp a utility system that is not accustomed to that continuous, constant draw. These large hyperscale data centers that are now being built can use hundreds of millions of gallons yearly. That’s equivalent to the water usage of a medium-sized city.

To Jon’s point, if you look at how much water that is being consumed by a data center in very water-scarce areas in the West in particular, you wonder where that water is going to come from. Is it going to come from groundwater? Is it going to come from surface water supplies? How is that water going to be managed and basically replaced back into the natural systems, like rivers, from which it might be being withdrawn? Colorado River, of course, being a prime example of an over-allocated river system.

What is all this water going for? Yes, it’s going for cooling, humidification in the data centers, it’s what they’re calling direct use, but there’s also indirect use, which is the water that it takes to generate the electricity that supplies the data center. The data center energy loads are serious, and Chris can talk about the grid issues as well, but a lot of that water is actually indirectly used to generate electricity, as well as directly used to cool those chips.

This indirect use can be substantial, equivalent to about half a gallon per kilowatt-hour. That can be a fair amount of water just for providing that electricity. What we’re seeing is the average hyperscale data center uses about half a million gallons of water a day. That’s a lot of water to come from a local community water system. It’s a concern, especially in water-scarce regions where water is already so short that farmers are being asked to fallow fields. How is the data center water load going to be accommodated within these water systems?
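Dickinson’s per-kilowatt-hour figure lends itself to a quick back-of-envelope calculation. In the sketch below, the half-gallon-per-kWh rate and the half-million-gallon daily direct use come from the interview; the 100-megawatt continuous load is an assumed, illustrative value:

```python
# Back-of-envelope estimate of a hyperscale data center's daily
# water footprint. The 0.5 gal/kWh indirect rate and the 500,000
# gal/day direct (cooling) figure are cited in the interview; the
# 100 MW continuous electrical load is an assumption chosen only
# to illustrate the arithmetic.

POWER_MW = 100                 # assumed continuous electrical load
INDIRECT_GAL_PER_KWH = 0.5     # water used to generate each kWh
DIRECT_GAL_PER_DAY = 500_000   # on-site cooling and humidification

kwh_per_day = POWER_MW * 1_000 * 24          # MW -> kW, times 24 hours
indirect_gal = kwh_per_day * INDIRECT_GAL_PER_KWH
total_gal = indirect_gal + DIRECT_GAL_PER_DAY

print(f"Indirect (generation) water: {indirect_gal:,.0f} gal/day")
print(f"Total daily footprint:       {total_gal:,.0f} gal/day")
```

At that assumed load, the indirect water used to generate the electricity (1.2 million gallons a day) is more than double the facility’s direct cooling draw, which is Dickinson’s point: the visible on-site use is only part of the footprint.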

The irony is that the data centers are going into these water-scarce regions. There was a Bloomberg report that showed that water-scarce regions were actually the most popular locations for these data centers, because they were proximate to areas of immediate use. That, of course, means California, it means Texas and Phoenix, Arizona, those states that are already struggling with providing water to their regular customers.

It’s a dilemma, and it’s one that we want to look at a lot more closely to help protect the community water systems and give them the right questions to ask when the data center comes to town and wants to locate there, and help them abate the financial risk that might be associated with the data center that maybe comes and then goes, leaving them with a stranded asset.

These are all complex issues. The tax issues tie into the water issues, because the impacts to the water utility system might not be covered by whatever tax revenues are coming in. As sizable as those revenues might be, they still might not be enough to cover infrastructure costs that would otherwise be assessed to the utility ratepayers. We’re seeing this on the energy side: electric rates are going up. At the same time, we know these data centers are necessary, given what we’re now doing as a society in terms of AI and digital computing.

We just have to figure out the way to most sustainably deal with it. We’re working with technical experts, folks from the Los Alamos National Lab, and we’re talking with them about the opportunities for using recycled water, using other options that are not going to be quite as water-consumptive.

Anthony Flint: Yes, we can talk more about that later in the show — different approaches, using gray water or recycled water, sounds like a promising idea because at the end of the day, there’s only so much water, right? Chris Miller, from the Piedmont Environmental Council, you pointed out, in Jon’s story, that roughly two-thirds of the world’s internet traffic essentially passes through Northern Virginia, and the region already hosts the densest concentration of data centers anywhere in the world. What’s been the impact on farmland, energy, water use, carbon emissions, everything? Walk us through what it’s like to be in such a hot spot.

Chris Miller: The current estimate is that Virginia has over 800 data centers. It’s a little hard to know, because some of them are dark facilities, so not all of them are mappable, but that’s what we’re approaching with the ones we’ve been able to map. For land use junkies, there’s about 360 million square feet of built, approved, or in-the-pipeline applications for data centers in the state. That’s a lot of footprint. The closest comparison I could make that seemed reasonable was that all of Northern Virginia has about 150 million square feet of commercial retail space.

We are looking at a future where just the footprint of the buildings is pretty extraordinary. We have sites that are one building, one gigawatt, almost a million square feet, 80 feet high. You just have to think about that. That’s the amount of power that a nuclear reactor can produce at peak load. We’re building those kinds of buildings on about 100 acres, 150 acres. Not particularly large parcels of land with extraordinary power density of electricity demand, which is just hard to wrap your head around.

The current estimate in Virginia for aggregate peak load demand increase in electricity exclusively from data centers is about 50 gigawatts in the next 20 years. That’ll be a tripling of the existing system. Now, more and more, the warnings are coming from the utilities, grid regulators, and the grid monitor for PJM, which is a large regional transmission organization that runs from Chicago all the way to North Carolina.

As Anthony said, the existing system is near a breaking point, maybe in the next three years. If all the demand came online, you would have brownouts and blackouts throughout the system. That’s pretty serious. It’s a reflection of the general problem, which is that there is no system of planning for the land use, for the energy consumption, for the water consumption, or for the larger impacts on agricultural and forestal land, historic, scenic, and cultural resources, and biodiversity sites. There’s no assessment being made.

To the extent that there’s project-level review, there’s a lot of discussion about eliminating most of that to streamline this process. There is no aggregate assessment. That’s what’s terrifying. We have local land use decisions being made without any information about the larger aggregate impacts in the locality and then beyond. Then the state and federal governments are issuing permits without having really evaluated the combined effect of all this change.

I think that’s the way we’re looking at it. Change is inevitable. Change is coming. We should be doing it in a way that’s better than the way we’ve done it before, not worse. We need to do it in a way that starts with an honest assessment of the scale and scope, the aggregate impacts, and then applies the ingenuity and creativity of both the tech industry and the larger economy to minimize the impact that this has on communities and the natural resources on which we all depend.

It’s getting to the point of being very serious. Virginia is water-constrained. It doesn’t have that reputation, but our water supply systems are all straining to meet current demand. The only assessment we have on the effect of future peak load from data centers is by the Interstate Commission on the Potomac River Basin, which manages the water supply for the Washington metropolitan region in five states.

Their conclusion is that in the foreseeable future, by 2040, we reach a point where consumption exceeds supply. Think about that. We’re moving forward with [facilities] as they create a shortage of water supply in the nation’s capital. It’s being done without any oversight or direction. The work of the Lincoln Institute and groups like PEC is actually essential, because the governmental entities are paralyzed. Paralyzed by a lack of policy structure, they’re also paralyzed by politics, caught up in the perception that this is the next economic opportunity, one that funds the needs of the community.

The fact is, the impacts may outweigh the benefits. We have to buckle down and realize this is the future. How do we help state, local, federal government to build decision models that take into account the enormous scale and scope of the industry and figure out how to fix the broken systems and make them better than they were before? I think that’s what all of us have been working on over the last five years.

Anthony Flint: It really is extraordinary, for those of us in the world of land use and regulations. We’ve heard a lot about the abundance agenda and how the US is making it more difficult to build things and infrastructure. Whether it’s clean energy or a solar farm or a wind farm, they have to go through a lot of hoops. Housing, same way. And here you have not just any land use, but an incredibly impactful land use, that is seemingly not getting any of that oversight or being made to go through those hoops.

Chris Miller: They are certainly cutting corners. Jon mentioned the facility outside of Boston. What did you say, 150 acres? We have a site adjacent to the Manassas National Battlefield Park, which is part of the national park system, called the Prince William Digital Gateway, which is an aggregation of 2100 acres with plans for 27 million square feet of data centers with a projected energy demand of up to 7.5 gigawatts. The total base load supply of nuclear energy available in Virginia right now is just a little bit over 3 gigawatts.

Dominion’s entire offshore wind development project, which is 80% complete and has been big and controversial, is 2.5 gigawatts. The two biggest sources of base load supply aren’t sufficient to meet 24/7 demand from a land use proposal on 2100 acres, 27 million square feet, that was made without assessing the energy impact, the supply of water, or the impact of infrastructure on natural, cultural, and historic resources, one of which is hallowed ground. It’s a place where two significant Civil War battles were fought. It’s extraordinary.

What’s even more extraordinary is to have public officials, senators, congressmen, members of agencies say, “We’re not sure what the federal next steps [are].” These are projects that have interstate effects on power, on water, on air quality. We haven’t talked about that, but one of the plans that’s been hatched by the industry is to rely on onsite generation, taking advantage of the backup generation that they’ve built out. They have to provide 100% backup generation onsite for their peak load. About 90% of that is diesel, without significant air quality controls.

We have found permits for 12.4 gigawatts of diesel in Northern Virginia. That would bust the ozone and PM2.5 regulatory standards for public health if they operated together. It’s being discussed by the Department of Environmental Quality in Virginia as a backup strategy for meeting power demand so that data centers can operate without restriction. These are choices that are being proposed without any modeling, without any monitoring, and without any assessment of whether those impacts are in conflict with other public policy goals, like human health. Terrifying.

We are at a breaking point. I have to say that the grassroots response is a pox upon all your houses. That was reflected in the 2025 elections that Virginia just went through: a tidal wave of change in the General Assembly and statewide offices, with data centers and energy costs very, very high on the list of concerns for voters.

Anthony Flint: I want to ask all three of you this question, but Jon, let me start with you. Is there any way to make a more sustainable data center?

Jon Gorey: Yes, there are some good examples here and there. It is, in some cases, in their best interest to use less electricity. It’ll be less expensive for them to use less water. Google, for its part, has been more transparent than some companies in its environmental report. They compare their water use to the amount used to irrigate golf courses, which does come across as not a great comparison, because golf courses are not a terrific use of water either.

They do admit that last year, 2024, they used about 8.1 billion gallons of water in their data centers, the ones that they own, a 28% increase over the year before, and 14% of that was in severely water-stressed regions. Another 14% was in medium-stress regions. One of their data centers in Council Bluffs, Iowa, consumed over a billion gallons of water by itself. They also have data centers, like in Denmark and Germany, that use barely a million gallons over the course of a year.

I don’t know if those are just very small ones, but I know they and Microsoft and other companies are developing … there’s immersive cooling, where instead of using evaporative water cooling to cool off the entire room that the servers are in, you can basically dunk the chips and servers in a synthetic oil that conducts heat but not electricity. It’s more expensive to do, but it’s completely possible. There are methods. There’s maybe some hope there that they will continue to do that more.

Mary Ann Dickinson: Immersive cooling, which you’ve just mentioned, is certainly an option now, but what we’re hearing is that it’s not going to be an option in the future; because of the increasing power density of chips, they are going to need direct liquid cooling, period, and immersive cooling is not going to work. That’s the frightening part of the whole water story: however much or little water is being used now, it’s going to pale against the water that’s going to be used in the next 5 to 10 years by the new generation of data centers and the new chips that they’ll be using.

The funny thing about the golf course analogy is that, in the West, a lot of those golf courses are irrigated with recycled water. As Chris knows, it also recharges back into groundwater. It is not lost as consumptive loss. That’s the issue: really, to make these sustainable, we’re going to need to examine the water cooling systems, what the evaporative loss is, what the discharge is to sewer systems, what the potential is for recycled water. There are a whole lot of questions that we’re going to ask, but we’re not getting any data.

Only a third of the data centers nationally even report their energy and water use. The transparency issue is becoming a serious problem. Many communities are being asked to sign NDAs. They can’t even share with their citizens how much energy and water a data center is using. It is a little bit of a challenge to try and figure out the path going forward. It’s all about economics, as Chris knows. It’s all about what can be afforded.

The work we’re doing at the Lincoln Institute, we would like to suggest as many sustainable options from the water perspective as possible, but they’re going to have to be paid for somewhere. That is the big question. Data centers need to pay.

Chris Miller: I think we’re entering a [time] where innovation is necessary. It has to be encouraged. It’s like the crisis, just short of what we saw with the collapse of the banking system in 2008–2009, where no one was really paying attention to the aggregate system-wide failures. Somebody had to step up and say it’s broken. In the case of the mortgage crisis, it was actually 49 states coming to a court, saying, “We have to have a settlement so that we can rework all these mortgages and settle out the accounts and rebuild the system from the ground up.”

I think that’s the same place we’re at. We have to have a group of states get together and say, “We are going to rebuild the decision model that we use for this new economy.” It’s not going away. Any gains in efficiency are going to be offset by the expansion of demand for data. That’s been the trend for the last 15 years. We have to deal with the scale and the scope of the issue. I’ll give you just one example.

Dominion Energy has published aggregated contracts totaling 47.1 gigawatts of demand that they have to meet. Their estimate of the CapEx to do that ranges from $141 billion to $271 billion, depending on whether they comply with the goals of the Virginia Clean Economy Act and move toward decommissioning and replacing existing fossil fuel generation with cleaner sources. That range is not the issue. It’s the bottom line, which is roughly $150 billion to $300 billion in CapEx in one state for energy infrastructure. That’s enormous. We need a better process than a case-by-case review of individual projects.

The State Corporation Commission does not maintain a central database of the transmission and generation projects it approves. The state DEQ does not have a central database for water basin supply and demand. The state DEQ does not have a database of all of the permits in a model that shows what the impacts of backup generation would be if they all turned on at the same time in a brownout or blackout scenario. That failure to do systems analysis desperately needs to be addressed. It’s not going to be done by this administration at the federal level.

It’s going to take state governments working together to build new systems-level decision tools that are informed by the expertise of places like the Lincoln Institute, so that they’re looking at this as a large-scale systemic process. We build it out in a way that’s rational, that takes into account the impacts on people and on communities and on land, and does it in a way that fairly distributes the cost back to the industry that’s triggering the demand.

This industry is uniquely able to charge the whole globe for the use of certain parts of America as the base of its infrastructure. We should be working very hard on a cost allocation model and an assignment of cost to the data center industry that can recapture the economic value and pay themselves back from the whole globe. There’s no reason for the rate payers of Virginia or Massachusetts or Arizona or Oregon to be subsidizing the seven largest corporations in the world, with [capital expenditures] of over $22 trillion. It’s unfair, it’s un-American, it’s undemocratic.

We have to stand up to what’s happening and realize how big it is and realize it’s a threat to our way of life, our system of land use and natural resource allocation and frankly, democracy itself.

Anthony Flint: I want to bring this to a conclusion, although certainly there are many more issues we could talk about, but I want to look at the end user in a way and whether we as individuals can do anything about using AI, for example. I was talking with Jon, journalist-to-journalist, about this. I want to turn to you, Jon, on this question. Should we be trying not to use AI, and is that even possible?

Jon Gorey: The more I researched this piece, the more adamant I became that I shouldn’t be using it where possible. Not that that’s going to make any difference, but to me, it felt like I don’t really want to be a part of it. I expect there are legitimate and valuable use cases for AI in science and technology, but I am pretty shocked by how cavalier people I know, my friends and family, have been in embracing it.

Part of that is that tech companies are forcing it on us because they’ve invested in it. They’re like, “Hey, we spent all this money on this, you’ve got to use it.” It takes some legwork to remove the Google Assist from your Google searches or to get Microsoft Copilot to just leave you alone. I feel like it’s like its ancestor Clippy, the paperclip from Microsoft Office back in the day.

Here’s something that galls me more in a broader sense. I don’t know if we want to get into it, but I’m an amateur musician. I’m an amateur because it’s already very difficult to make any money in the arts. There’s a YouTube channel with 35 million subscribers that simply plays AI-generated videos of AI-generated music, which is twice as many subscribers as Olivia Rodrigo has and 20 times as many as Gracie Abrams. Both of them are huge pop stars who sell out basketball arenas. It astounds me, and I don’t know why people are enjoying just artificially created things. I get the novelty of it, but I, for one, am trying to avoid stuff like that.

Chris Miller: We were having a debate about this issue this week on a series of forums. The reality is there’s stuff that each of us can do to significantly reduce our data load. It takes a little bit of effort. Most of us are storing two or three times what we need to, literally copies of things that we already have. There’s an efficiency of storage thing that takes time, and that’s why we don’t do it. There’s the use of devices appropriately.

If you can watch a broadcast television show and not stream it, that’s a significant reduction in load, actually. Ironically, we’ve gone from broadcast through the air, which has very little energy involved, to streaming on fiber optics and cable, and then wireless, which is incredibly resource-intensive. We’re getting less efficient in some ways in the way we use some of these technologies, but there are things we can do.

The trend in history has been that those individual savings don’t actually change overall demand. I think we need to be careful, as we think about all the things we can do as individuals, not to lose sight of the need for the aggregate response, the society-wide response, which is that this industry needs to check itself, but it also needs to have proper oversight. The notion that somehow they’re holier than the rest of us is totally unsustainable.

We have to treat them as the next gold rush, the next offshore drilling opportunity, and understand that what they are doing is globally impactful, setting us back in terms of the overall needs to address climate change and the consumption of energy, and threatens our basic systems for water, land, air quality that are the basis of human life. If those aren’t a big enough threat, then we’re in big trouble.

Anthony Flint: Mary Ann, how about the last word?

Mary Ann Dickinson: When I looked up and saw that every Google search I do, which is AI-backed these days, uses half a liter of water, each one, and you think about the billions of searches that happen across the globe, this is a frightening issue. I’m not sure our individual actions are going to make that big a difference in the AI demand, but what we can require is, in the siting of these facilities, that they not disrupt local sustainability and resiliency efforts. That’s, I think, what we want to focus on at the Lincoln Institute. It’s helping communities do that.

Anthony Flint: Jon Gorey, Mary Ann Dickinson, and Chris Miller, thank you for this great conversation on the Land Matters Podcast. You can read Jon Gorey’s article, Data Drain, online at our website, lincolninst.edu. Just look for Land Lines magazine in the navigation. On social media, the handle is @landpolicy. Don’t forget to rate, share, and subscribe to the Land Matters Podcast. For now, I’m Anthony Flint signing off until next time.


Building Vibrant Communities: Municipal Government Workers Get a Boost

By Anthony Flint, November 4, 2025


It’s a tough time to be working in government right now—long hours, modest pay, and lots of tumult in the body politic.

While this is especially true at the moment for employees in the federal government, a new program offered by Claremont Lincoln University and the Lincoln Institute of Land Policy aims to give public employees in municipal government a boost.

Over the last year, 150 planners, community development specialists, and other professionals in municipal government have participated in the Lincoln Vibrant Communities fellowship, a 24-week curriculum combining in-person and online education, expert coaching, and advanced leadership training.

The idea is to build capacity at the local level so those professionals can have greater impact in the communities they serve, on everything from affordable housing to greenspace preservation and revitalizing Main Streets, said Stephanie Varnon-Hughes, executive dean of academic affairs at Claremont Lincoln University.

“All of us can Google or go to seminars or read texts or access knowledge on our own, but this program is about the transformative, transferable leadership skills it takes for you to use that knowledge and use that technical experience to facilitate endeavors to bring about the change that you need in your community,” she said on the latest episode of the Land Matters podcast.

“These leadership skills can be measured and modeled and sustained. We can surround you with the abilities and the resources to change the way that you move through the world and collaborate with other people working on similar issues for long-term success,” she said.

Lincoln Vibrant Communities fellows can use the training to implement some of the ideas and policy recommendations that the Lincoln Institute has developed, like setting up a community land trust (CLT) for permanently affordable housing, said Lincoln Institute President and CEO George W. “Mac” McCarthy, who joined Varnon-Hughes on the show.

“They’re the ones who find a way to find the answers in land and to manifest those answers to actually address the challenges we care about,” he said. “It’s this cadre of community problem solvers that are now all connected and networked together all across the country.”

The support is critical right now, McCarthy said, given estimates of a shortage of a half-million government workers, and amid a flurry of retirements from veteran public employees who tend to take a lot of institutional memory with them.

The Lincoln Institute has a long tradition of supporting local government, beginning in earnest in 1974, when David C. Lincoln, son of founder John C. Lincoln, established the Lincoln Institute as a stand-alone entity emerging from the original Lincoln Foundation. The organization made its mark developing computer-assisted assessment tools to help in the administration of property tax systems, and has since supported city planners, land conservation advocates, and public finance professionals experimenting with innovations such as the land value tax.

In the later stages of his philanthropic career, David Lincoln established a new model for university education, Claremont Lincoln University, a fully accredited non-profit institution offering a Bachelor of Arts in Organizational Leadership, as well as master’s degrees and graduate certificates. The guiding mission is to bridge theory and practice to mobilize leaders in the public sector.

Municipal employees engage in the Lincoln Vibrant Communities fellowship, a roughly six-month program of advanced leadership training and expert coaching, either as individuals or as part of teams working on projects in cities, towns, and regions across the US.

McCarthy and Varnon-Hughes joined the Land Matters podcast after returning from Denver last month for a leadership summit where some of the first graduates of the program had an opportunity to share experiences and celebrate their accomplishments. Denver Mayor Mike Johnston joined the group, underscoring how technical expertise will be much needed as the city launches complex projects, such as building affordable housing on publicly owned land.

More information about Claremont Lincoln University and the Lincoln Vibrant Communities fellowship program is available at https://www.claremontlincoln.edu.

Listen to the show here or subscribe to Land Matters on Apple Podcasts, Spotify, Stitcher, YouTube, or wherever you listen to podcasts.



Further Reading

Bridging Theory and Practice | Land Lines

Lincoln Institute Invests $1 Million in Scholarships for Future Leaders | Land Lines 

Denver Land Trust Fights Displacement Whether It Owns the Land or Not | Shelterforce 

New Lincoln Institute Resources Explore How Community Land Trusts Make Housing More Affordable | Land Lines

Accelerating Community Investment: Bringing New Partners to the Community Investment Ecosystem | Cityscapes



Anthony Flint is a senior fellow at the Lincoln Institute of Land Policy, host of the Land Matters podcast, and a contributing editor of Land Lines. 

Data Drain: The Land and Water Impacts of the AI Boom

By Jon Gorey, October 17, 2025

A low hum emerges from within a vast, dimly lit tomb, whose occupant devours energy and water with a voracious, inhuman appetite. The beige, boxy data center is a vampire of sorts—pallid, immortal, thirsty. Sheltered from sunlight, active all night. And much like a vampire, at least according to folkloric tradition, it can only enter a place if it’s been invited inside.

In states and counties across the US, lawmakers aren’t just opening the door for these metaphorical, mechanical monsters. They’re actively luring them in, with tax breaks and other incentives, eager to lay claim to new municipal revenues and a piece of the explosive growth surrounding artificial intelligence.

That may sound hyperbolic, but data centers truly are resource-ravenous. Even a mid-sized data center consumes as much water as a small town, while larger ones require up to 5 million gallons of water every day—as much as a city of 50,000 people.

Powering and cooling their rows of server stacks also takes an astonishing amount of electricity. A conventional data center—think cloud storage for your work documents or streaming videos—draws as much electricity as 10,000 to 25,000 households, according to the International Energy Agency. But a newer, AI-focused “hyperscale” data center can use as much power as 100,000 homes or more. Meta’s Hyperion data center in Louisiana, for example, is expected to draw more than twice the power of the entire city of New Orleans once completed. Another Meta data center planned in Wyoming will use more electricity than every home in the state combined.

And of course, unlike actual clouds, data centers require land. Lots of it. Some of the largest data centers being built today will cover hundreds of acres with impermeable steel, concrete, and paved surfaces—land that will no longer be available for farmland, nature, or housing—and require new transmission line corridors and other associated infrastructure as well.

Data centers have been part of our built landscape for over a decade, however—many of them tucked into unassuming office parks, quietly processing our web searches and storing our cellphone photos. So why the sudden concern? Artificial intelligence tools trained with large language models, such as OpenAI’s ChatGPT, use exponentially more computing power than traditional cloud services. And the largest technology companies, including Amazon, Meta, Google, and Microsoft, are investing quickly and heavily in AI.

The number of US data centers more than doubled between 2018 and 2021 and, fueled by investments in AI, that number has already doubled again. Early in the AI boom, in 2023, US data centers consumed 176 terawatt-hours of electricity, roughly as much as the entire nation of Ireland (whose electric grid is itself nearly maxed out, prompting data centers there to use polluting off-grid generators), and that’s expected to double or even triple as soon as 2028.

This rapid proliferation can put an enormous strain on local and regional resources—burdens that many host communities are not fully accounting for or prepared to meet.

“Demand for data centers and processing has just exploded exponentially because of AI,” says Kim Rueben, former senior fiscal systems advisor at the Lincoln Institute of Land Policy. Virginia and Texas have long had tax incentives in place to attract new data centers, and “other states are jumping on the bandwagon,” she says, hoping to see economic growth and new tax revenues.

But at a Land Policy and Digitalization conference convened by the Lincoln Institute last spring, Rueben likened the extractive nature of data centers to coal mines. “I don’t think places are acknowledging all the costs,” she says.

Yes, Virginia, There Is a Data Clause

At that conference, Chris Miller, executive director of the Piedmont Environmental Council, explained how roughly two-thirds of the world’s internet traffic passes through Northern Virginia. The region already hosts the densest concentration of data centers anywhere in the world, with about 300 facilities in just a handful of counties. Dozens more are planned or in development, ready to consume the region’s available farmland, energy, and water, enticed by a statewide incentive that saves companies more than $130 million in sales and use taxes each year.

Despite the state-level tax break, the data centers make significant contributions to local coffers. In Loudoun County, which has over 27 million square feet of existing data center space, officials expect the total real and personal property tax revenues collected from local data centers in fiscal year 2025 to approach $900 million, nearly as much as the county’s entire operating budget. The proportion of revenue derived from data centers has grown so lopsided that the county’s board of supervisors is considering adjusting the tax rate, so as not to be so reliant on a single source.

Existing and planned data centers in Northern Virginia. The state has been dubbed “the data center capital of the world.” Credit: Piedmont Environmental Council.

While many communities see data centers as an economic boon due to that tax revenue, the facilities themselves are not powerful long-term job engines. Most of the jobs they create are rooted in their construction, not their ongoing operation, and thus are largely temporary.

Decades ago, PEC supported some of the data center development in Northern Virginia, says Julie Bolthouse, PEC’s director of land policy. But the industry has changed dramatically since then. When AOL had its headquarters in what’s known as Data Center Alley, for example, the company’s data center was a small part of a larger campus, “which had pedestrian trails around it, tennis courts, basketball courts … at its peak, it had 5,300 employees on that site,” Bolthouse says. The campus has since been demolished, and three large data center facilities are being built on the site. “There’s a big fence around it for security purposes, so it’s totally isolated from the community now, and it is only going to employ about 100 to 150 people on the same piece of land. That’s the difference.”

The facilities have also gotten “massive,” Bolthouse adds. “Each one of those buildings is using as much as a city’s worth of power, so that power infrastructure is having a huge impact on our communities. All the transmission lines that have to be built, the eminent domain used to get the land for those transmission lines, all of the energy infrastructure, gas plants, pipelines that deliver the gas, the air pollution associated with that, the climate impacts of all of that.”

Across Northern Virginia, on-site diesel generators—thousands of them, each the size of a rail car—spew diesel fumes, creating air quality issues. “No other land use that I know of uses as many generators as a data center does,” Bolthouse says. And while such generators are officially classified as emergency backup power, data centers are permitted to run them for “demand response” for 50 hours at a time, she adds. “That’s a lot of air pollution locally. That’s particulate matter and NOx [nitrogen oxides], which impacts growing lungs of children, can add cases of asthma, and can exacerbate heart disease and other underlying diseases in the elderly.”

And then there’s the water issue.

‘Like a Giant Soda Straw’

A study by the Houston Advanced Research Center (HARC) and University of Houston found that data centers in Texas will use 49 billion gallons of water in 2025, and as much as 399 billion gallons in 2030. That would be equivalent to drawing down the largest reservoir in the US—157,000-acre Lake Mead—by more than 16 feet in a year.

Anyone who’s accidentally left their phone out in the rain or dropped it in a puddle might wonder what a building full of expensive, delicate electronics could want with millions of gallons of water. It’s largely for cooling purposes. Coursing with electrical current, server stacks can get very hot, and evaporative room cooling is among the simplest and cheapest ways to keep the chips from getting overheated and damaged.

What that means, however, is that the water isn’t just used for cooling and then discharged as treatable wastewater; much of it evaporates in the process—poof.

“Even if they’re using reclaimed or recycled water, that water is no longer going back into the base flow of the rivers and streams,” Bolthouse says. “That has ecological impacts as well as supply issues. Everybody is upstream from someone else.” Washington, DC, for example, will still lose water supply if Northern Virginia data centers use recycled or reclaimed water, because that water won’t make it back into the Potomac River. Evaporative cooling also leaves behind high concentrations of salts and other contaminants, she adds, creating water quality issues.

There are less water-intensive ways to cool data centers, including closed-loop water systems, which require more electricity, and immersion cooling, in which servers are submerged in a bath of liquid, such as a synthetic oil, that conducts heat but not electricity. Immersion cooling allows for a denser installation of servers as well, but is not yet widely used, largely due to cost.

Ironically, it can be hard to confirm specific data about data centers. Given the proprietary nature of AI technology and, perhaps, the potential for public backlash, many companies are less than forthcoming about how much water their data centers consume. Google, for its part, reported using more than 5 billion gallons of water across all its data centers in 2023, with 31 percent of its freshwater withdrawals coming from watersheds with medium or high water scarcity.

A 2023 study by the University of California Riverside estimated that an AI chat session of 20 or so queries uses up to a bottle of freshwater. That amount can vary depending on the platform, with more sophisticated models demanding larger volumes of water, while other estimates suggest it could be closer to a few spoonfuls per query.

“But what goes unacknowledged, from a natural systems perspective, is that all water is local,” says Peter Colohan, director of partnerships and program innovation at the Lincoln Institute, who helped create the Internet of Water. “It’s a small amount of water for a few queries, but it’s all being taken from one basin where that data center is located—that’s thousands and thousands of gallons of water being drawn from one place from people doing their AI queries from all over the world,” he says.

“Wherever they choose to put a data center, it is like a giant soda straw sucking water out of that basin,” Colohan continues. “And when you take water from a place, you have to reduce demand or put water back in that same place; there’s no other solution.” In some cases, at least, major data center developers have begun to recognize this problem and are actively engaging in water replenishment where it counts.

Locating data centers in cooler, wetter regions can help reduce the amount of water they use and the impact of their freshwater withdrawals. And yet roughly two-thirds of the data centers built since 2022 have been located in water-stressed regions, according to a Bloomberg News analysis, including hot, dry climates like Arizona.

The warm water-cooling system at a Sandia Labs data center in Albuquerque, New Mexico. The data center earned LEED Gold certification for efficiency in 2020. Credit: Bret Latter/Sandia Labs via Flickr CC.

It’s not just cooling the server rooms and chips that consumes water. About half of the electricity currently used by US data centers comes from fossil fuel power plants, which themselves use a lot of water as they boil it into steam to turn their massive turbines.

And the millions of microchips processing all that information? By the time they reach a data center, each chip has already consumed thousands of gallons of water. Manufacturing these tiny, powerful computing components requires “ultrapure” treated water to rinse off silicon residue without damaging the chips. It takes about 1.5 gallons of tap water to produce a gallon of ultrapure water, and the typical chip factory uses about 10 million gallons of ultrapure water each day, according to the World Economic Forum—as much water as 33,000 US households use.


As communities weigh the benefits and risks of data center development, we as consumers might also consider our own role in the industry’s growth, and whether our use of AI is worth the price of the water, power, and land it devours.

There could be important uses for artificial intelligence—if it can be harnessed to solve complex problems, for instance, or to improve the efficiency of water systems and electric grids.

There are clearly superfluous uses, too. A YouTube channel with 35 million subscribers, for example, features AI-generated music videos … of AI-generated songs. The MIT Technology Review estimates that, unlike simple text queries, using AI to create video content is extremely resource-heavy: Making a five-second AI-generated video uses about as much electricity as running a microwave nonstop for over an hour.

Data center defenders tend to point out that Americans use more water each year to irrigate golf courses (more than 500 billion gallons) and lawns (over 2 trillion gallons) than AI data centers do. But that argument rings hollow: America has a well-documented addiction to green grass that also isn’t serving us well. The solution, water experts say, lies in water conservation and consumer education, not in comparing one wasteful use to another.


 

Putting a Finite Resource First

Even a small data center can place an immense, concentrated burden on local infrastructure and natural resources. In Newton County, Georgia, a Meta data center that opened in 2018 uses 500,000 gallons of water per day—10 percent of the entire county’s water consumption. And given Georgia’s cheap power and generous state tax breaks, Newton County continues to field requests for new data center permits—some of which would use up to 6 million gallons of water per day, more than doubling what the entire county currently consumes.

The intense demands that data centers place on regional resources make for complicated decision-making at the local level. Communities and regional water officials must engage in discussions about data centers early on, and with a coordinated, holistic understanding of existing resources and potential impacts on the energy grid and the watershed, says Mary Ann Dickinson, policy director for land and water at the Lincoln Institute. “We would like to help communities make smarter decisions about data centers, helping them analyze and plan for the potential impacts to their community structures and systems.”

“Water is often one of the last things that gets thought about, so one of the things that we’re really promoting is early engagement,” says John Hernon, strategic development manager at Thames Water in the UK. “So when you’re thinking about data centers, it’s not just about the speed you’re going to get, it’s not just about making sure there’s a lot of power available—we need to make sure that water is factored in at the earliest possible thinking … at the forefront, rather than an afterthought.”

Despite its damp reputation, London doesn’t receive a whole lot of rainfall compared to the northern UK—less than 25 inches a year, on average, or roughly half of what falls in New York City. Yet because so much growth is centered on London, the Thames Water service area holds about 80 percent of the UK’s data centers, Hernon says, and another 100 or so are proposed.

What’s more, their water usage peaks during the hottest, driest times of the year, when the utility can least accommodate the extra demand. “That’s why we talk about restricting or reducing or objecting to [data centers],” Hernon says. “It’s not because we don’t like them. We absolutely get it, we need them ourselves. AI will massively help our call center … which means we can have more people out fixing leaks and proactively managing our networks.”

Keeping the Lights On

One way for data centers to use less water is to rely more heavily on air-cooling technology, but this requires more energy—which may in turn increase water use indirectly, depending on the power source. What’s more, regional grids are already struggling to meet the demand of these power-hungry facilities, and hundreds more are in the works. “A lot of these projects have been announced, but it’s not clear what can come on fast enough to power them,” says Kelly T. Sanders, associate professor of engineering at the University of Southern California.

The government wants US technology companies to build their AI data centers domestically—not just for economic reasons, but for national security purposes as well. But even as the Trump administration appears to understand the enormous energy demands data centers will place on the electric grid, it has actively squashed new wind power projects, such as Revolution Wind off the coast of Rhode Island.

NREL (the National Renewable Energy Laboratory) created this overlay map of transmission lines and data center locations to “help visualize the overlap and simplify co-system planning.” Credit: NREL.gov.

Other carbon-free alternatives like small modular reactors (SMRs) and geothermal energy have bipartisan support, Sanders says. “But the problem is, even if you put shovels in the ground for an SMR today, it’s going to take 10 years,” she says. “The things that we can do the fastest are wind, solar, and batteries. But in the last six months we’ve lost a lot of the incentives for clean energy, and there’s an all-out war on wind. Wind projects that are already built, already paid for, are being canceled. And to me, that’s peculiar, because that’s electricity that would be ready to go out on the grid soon, in some of these regions that are really congested.”

Data centers are among the reasons ratepayers nationwide have seen their electric bills increase at twice the rate of inflation in the past year. Part of that increase pays for the new infrastructure data centers will require, such as new power plants, transmission lines, and other investments. Those costs, along with ongoing grid maintenance and upgrades, are typically shared by all electric customers in a service area through charges added to utility bills.

This creates at least two issues. First, the tax revenues of a new data center benefit only the host community, while the entire electric service area must pay for the associated infrastructure. Second, if a utility makes that huge investment but the data center eventually closes or needs much less electricity than projected, it’s the ratepayers who will foot the bill, not the data center.

Some tech companies are securing their own clean power independent of the grid—Microsoft, for example, signed a 20-year agreement to purchase energy directly from the Three Mile Island nuclear plant. But that approach isn’t ideal either, Sanders says. “These data centers are still going to use transmission lines and all those grid assets, but if they’re not buying the electricity from the utility, they’re not paying for all that infrastructure through their rate bills,” she says.

Aside from generating new power, Sanders says, there are strategies to squeeze more capacity from the existing grid. “One is good old energy efficiency, and the data centers themselves have all of the incentives aligned to try to make their processes more efficient,” she says. AI itself could potentially also help enhance grid performance. “We can use artificial intelligence to give us more information about how power is flowing through the grid, and so we can optimize that power flow, which can give us more capacity than we would have otherwise,” Sanders says.

Another strategy is to make the grid more flexible. Most of the time, and in most regions of the US, we only use about 40 percent of the grid’s total capacity, Sanders says, give or take. “We build the capacity of the grid to meet the hottest day … and that’s where we worry about these large data center loads,” she says. A coordinated network of batteries, however—including in people’s homes and EVs—can add flexibility and stabilize the grid during times of peak demand. In July, California’s Pacific Gas and Electric Company (PG&E) conducted the largest-ever test of its statewide “virtual power plant,” using residential batteries to supply 535 megawatts of power to the grid for two full hours at sundown.

With some intentional, coordinated planning—”it’s not just going to happen naturally,” Sanders says—it may be possible to add more capacity without requiring a lot of new generation if data centers can reduce their workloads during peak times and invest in large-scale battery backups: “There is a world in which these data centers can actually be good grid actors, where they can add more flexibility to the grid.”

Confronting Trade-Offs With Land Policy

As the demand for data centers grows, finding suitable locations for these facilities will force communities to confront myriad and imperfect trade-offs between water, energy, land, money, health, and climate. “Integrated land use planning, with sustainable land, water, and energy practices, is the only way we can sustainably achieve the virtuous circle needed to reap the benefits of AI and the economic growth associated with it,” Colohan says.

For example, using natural gas to meet the anticipated electricity load of Texas data centers would require 50 times more water than using solar generation, according to the HARC study, and 1,000 times more water than wind. But while powering new data centers with wind farms would consume the least water, it would also require the most land—four times as much land as solar, and 42 times as much as natural gas.

Absent an avalanche of new, clean power, most data centers are adding copious amounts of greenhouse gases to our collective emissions, at a time when science demands we cut them sharply to limit the worst impacts of climate change. Louisiana regulators in August approved plans to build three new gas power plants to meet the expected electricity demand from Meta’s Hyperion AI data center.

As towns and counties compete with one another to attract data centers, the host communities will reap the tax benefits while the costs—the intense water demand, the higher electricity bills, the air pollution from backup generators—will be dispersed more regionally, including to areas that won’t see any new tax revenue.

That’s one reason data center permitting needs more state oversight, Bolthouse says. “The only approval that they really have to get is from the locality, and the locality is not looking at the regional impacts,” she says. PEC is also pushing for ratepayer protections and sustainability commitments. “We want to make sure we’re encouraging the most efficient and sustainable practices within the industry, and that we’re requiring mitigation when impacts can’t be avoided.”

Too close for comfort? A data center abuts homes in Loudoun County, Virginia. Credit: Hugh Kenny via Piedmont Environmental Council.

PEC and others are also pressing for greater transparency from the industry. “Very often, data centers are coming in with non-disclosure agreements,” Bolthouse says. “They’re hiding a lot of information about water usage, energy usage, air quality impacts, emissions—none of that information is disclosed, and so communities don’t really know what they’re getting into.”

“We need communities to be educated about what they’re facing, and what their trade-offs are when they let in a data center,” Colohan says. “What is the cost—the true cost—of a data center? And then how do you turn that true cost into a benefit through integrated land policy?”

Rueben says she understands the desire, especially in communities experiencing population loss, to tap into a growing industry. But rather than competing with each other to attract data centers, she says, communities ought to be having broader conversations about job growth and economic development strategies, factoring in the true costs and trade-offs these facilities present, and asking the companies to provide more guarantees and detailed plans.

“Forcing data center operators to explain how they’re going to run the facility more efficiently, and where they’re going to get their water from—and not just assuming that they have first access to the water and energy systems,” she says, “is a shift in perspective that we kind of need government officials to make.”


Jon Gorey is a staff writer at the Lincoln Institute of Land Policy.

Lead image: Data center facilities in Prince William County, Virginia. The county has 59 data centers in operation or under construction. Credit: Hugh Kenny via Piedmont Environmental Council.

Coming to Terms with Density: An Urban Planning Concept in the Spotlight 

By Anthony Flint, September 15, 2025
 

It’s an urban planning concept that sounds extra wonky, but it is critical in any discussion of affordable housing, land use, and real estate development: density.

In this episode of the Land Matters podcast, two practitioners in architecture and urban design shed some light on what density is all about, on the ground, in cities and towns trying to add more housing supply. 

The occasion is the revival of a Lincoln Institute resource called Visualizing Density, which was pushed live this month at lincolninst.edu after extensive renovations and updates. It’s a visual guide to density based on a library of aerial images of buildings, blocks, and neighborhoods taken by photographer Alex MacLean, originally published (and still available) as a book by Julie Campoli. 

It’s a very timely clearinghouse, as communities across the country work to address affordable housing, primarily by reforming zoning and land use regulations to allow more multifamily housing development—generally less pricey than the detached single-family homes that have dominated the landscape. 

Residential density is understood to be the number of homes within a defined area of land, in the US most often expressed as dwelling units per acre. A typical suburban single-family subdivision might be just two units per acre; a more urban neighborhood, like Boston’s Back Bay, has a density of about 60 units per acre. 

Demographic trends suggest that future homeowners and renters will prefer greater density in the form of multifamily housing and mixed-use development, said David Dixon, a vice president at Stantec, a global professional services firm providing sustainable engineering, architecture, and environmental consulting services. Over the next 20 years, the vast majority of households will continue to be professionals without kids, he said, and will not be interested in big detached single-family homes.  

Instead they seek “places to walk to, places to find amenity, places to run into friends, places to enjoy community,” he said. “The number one correlation that you find for folks under the age of 35, which is when most of us move for a job, is not wanting to be auto-dependent. They are flocking to the same mixed-use, walkable, higher-density, amenitized, community-rich places that the housing market wants to build … Demand and imperative have come together. It’s a perfect storm to support density going forward.” 

Tensions often arise, however, when new, higher density is proposed for existing neighborhoods, on vacant lots or other redevelopment sites. Tim Love, principal and founder of the architecture firm Utile, and a professor at Harvard University’s Graduate School of Design, said he’s seen the wariness from established residents as he helps cities and towns comply with the MBTA Communities Act, a Massachusetts state law that requires zoning districts near transit stations that allow a density of at least 15 units per acre. 

Some towns have rebelled against the law, which is one of several state zoning reform initiatives across the US designed to increase housing supply, ultimately to help bring prices down. 

Many neighbors are skeptical because they associate multifamily density with large apartment buildings of 100 or 200 units, Love said. But most don’t realize there is an array of so-called “gentle density” development opportunities—buildings of 12 to 20 units that have the potential to blend in more seamlessly with many streetscapes. 

“If we look at the logic of the real estate market, discovering over the last 15, 20 years that the corridor-accessed apartment building at 120 and 200 units-plus optimizes the building code to maximize returns, there is a smaller ‘missing middle’ type that I’ve become maybe a little bit obsessed about, which is the 12-unit single-stair building,” said Love, who conducted a geospatial analysis that revealed 5,000 sites in the Boston area that were perfect for a 12-unit building. 

“Five thousand times twelve is a lot of housing,” Love said. “If we came up with 5,000 sites within walking distance of a transit stop, that’s a pretty good story to get out and a good place to start.” 

Another dilemma of density is that while big increases in multifamily housing supply theoretically should have a downward impact on prices, many individual dense development projects in hot housing markets are often quite expensive. Dixon, who is currently writing a book about density and Main Streets, said the way to combat gentrification associated with density is to require a portion of units to be affordable, and to capture increases in the value of urban land to create more affordability. 

“If we have policies in place so that value doesn’t all go to the [owners of the] underlying land and we can tap those premiums, that is a way to finance affordable housing,” he said. “In other words, when we use density to create places that are more valuable because they can be walkable, mixed-use, lively, community-rich, amenitized, all these good things, we … owe it to ourselves to tap some of that value to create affordability so that everybody can live there.” 

Visualizing Density can be found at the Lincoln Institute website at https://www.lincolninst.edu/data/visualizing-density/. 

Listen to the show here or subscribe to Land Matters on Apple Podcasts, Spotify, Stitcher, YouTube, or wherever you listen to podcasts.

 


Further reading 

Visualizing Density | Lincoln Institute

What Does 15 Units Per Acre Look Like? A StoryMap Exploring Street-Level Density | Land Lines

Why We Need Walkable Density for Cities to Thrive | Public Square

The Density Conundrum: Bringing the 15-Minute City to Texas | Urban Land

The Density Dilemma: Appeal and Obstacles for Compact and Transit Oriented Development | Anthony Flint

 


Anthony Flint is a senior fellow at the Lincoln Institute of Land Policy, host of the Land Matters podcast, and a contributing editor of Land Lines. 

Graduate Student Fellowships

2025–2026 UNED–Lincoln Institute Master’s Fellowship Program

Submission Deadline: October 10, 2025 at 11:59 PM

The Lincoln Institute of Land Policy and the Universidad Nacional de Educación a Distancia (UNED) offer the Master in Land Policy and Sustainable Urban Development (máster en Políticas de Suelo y Desarrollo Urbano Sostenible), an online academic program in Spanish that uniquely brings together the legal frameworks and tools that underpin urban planning with fiscal, environmental, and participatory instruments, all from an international and comparative perspective.

The master’s program is aimed especially at graduate students and other degree holders interested in urban policy from a legal, environmental, and participatory perspective, as well as at public officials. Participants will receive the theoretical and technical training to lead the implementation of measures that enable the sustainable transformation of cities.

Regular enrollment period: September 8 to November 28, 2025

The master’s program begins in January 2026. The exact date will be announced before November 28, 2025.

The Lincoln Institute will award fellowships that partially cover the cost of the master’s program for selected applicants.

Fellowship terms: 

  • Fellows must hold a bachelor’s degree from an academic or higher education institution. 
  • Fellowship funds have no cash value and will cover only 40 percent of the total cost of the program. 
  • Fellows must pay the first tuition installment, which represents 60 percent of the total cost of the master’s program. 
  • Fellows must remain in good academic standing or they will forfeit the award. 

The fellowship award is contingent on the applicant’s formal admission to the UNED–Lincoln Institute master’s program. 

If selected, fellows will receive virtual assistance with the admission process of the Universidad Nacional de Educación a Distancia (UNED), which requires an online application and a copy of the academic transcript for undergraduate and/or graduate studies. 

Applicants who do not receive the Lincoln Institute’s partial fellowship may apply for the financial aid that UNED offers once they have enrolled in the master’s program. 

Application deadline: October 10, 2025, 11:59 PM Boston, MA, USA (UTC-5) 

Results announced: October 22, 2025 


Details

Submission Deadline
October 10, 2025 at 11:59 PM

Keywords

Climate Mitigation, Development, Dispute Resolution, Environmental Management, Exclusionary Zoning, Favela, Henry George, Informal Land Markets, Infrastructure, Land Market Regulation, Land Speculation, Land Use, Land Use Planning, Land Value, Land Value Taxation, Land-Based Tax, Local Government, Mediation, Municipal Fiscal Health, Planning, Property Taxation, Public Finance, Public Policy, Regulatory Regimes, Resilience, Reuse of Urban Land, Urban Development, Urbanism, Value Capture

Webinar and Event Recordings

Land Use and Transportation Scenario Planning in Greater Boston

October 16, 2025 | 12:00 p.m. - 1:00 p.m. (EDT, UTC-4)

Offered in English

Watch the Recording


The Consortium for Scenario Planning is hosting a peer exchange featuring Sarah Philbrick and Conor Gately of the Metropolitan Area Planning Council (MAPC), who will discuss their summer 2025 project: four land use scenarios, run through a travel demand model, to understand how different transit-oriented development (TOD) strategies affect greenhouse gas (GHG) emissions in Greater Boston.

Local and regional planners, metropolitan planning organizations (MPOs), professionals, and community members interested in learning more about land use and transportation planning and how TOD strategies impact GHG emissions are invited to tune in to this webinar. Simultaneous English-Spanish translation will be available via Zoom. If you would like to use the translation service, please join the webinar five minutes early.


Speakers

Sarah Philbrick

Research Manager, MAPC

Conor Gately

Senior Land Use and Transportation Analyst, MAPC


Details

Date
October 16, 2025
Time
12:00 p.m. - 1:00 p.m. (EDT, UTC-4)
Registration Period
August 19, 2025 - October 16, 2025
Language
English

Keywords

Infrastructure, Land Use, Land Use Planning, Pollution, Scenario Planning, Transit-Oriented Development