Topic: City and Regional Planning

Data Drain: The Land and Water Impacts of the AI Boom

By Jon Gorey, October 17, 2025

A low hum emerges from within a vast, dimly lit tomb, whose occupant devours energy and water with a voracious, inhuman appetite. The beige, boxy data center is a vampire of sorts—pallid, immortal, thirsty. Sheltered from sunlight, active all night. And much like a vampire, at least according to folkloric tradition, it can only enter a place if it’s been invited inside.

In states and counties across the US, lawmakers aren’t just opening the door for these metaphorical, mechanical monsters. They’re actively luring them in, with tax breaks and other incentives, eager to lay claim to new municipal revenues and a piece of the explosive growth surrounding artificial intelligence.

That may sound hyperbolic, but data centers truly are resource-ravenous. Even a mid-sized data center consumes as much water as a small town, while larger ones require up to 5 million gallons of water every day—as much as a city of 50,000 people.

Powering and cooling their rows of server stacks also takes an astonishing amount of electricity. A conventional data center—think cloud storage for your work documents or streaming videos—draws as much electricity as 10,000 to 25,000 households, according to the International Energy Agency. But a newer, AI-focused “hyperscale” data center can use as much power as 100,000 homes or more. Meta’s Hyperion data center in Louisiana, for example, is expected to draw more than twice the power of the entire city of New Orleans once completed. Another Meta data center planned in Wyoming will use more electricity than every home in the state combined.

And of course, unlike actual clouds, data centers require land. Lots of it. Some of the largest data centers being built today will cover hundreds of acres with impermeable steel, concrete, and paved surfaces—land that will no longer be available for farmland, nature, or housing—and require new transmission line corridors and other associated infrastructure as well.

Data centers have been part of our built landscape for over a decade, however—many of them tucked into unassuming office parks, quietly processing our web searches and storing our cellphone photos. So why the sudden concern? Artificial intelligence tools built on large language models, such as OpenAI’s ChatGPT, use vastly more computing power than traditional cloud services. And the largest technology companies, including Amazon, Meta, Google, and Microsoft, are investing quickly and heavily in AI.

The number of US data centers more than doubled between 2018 and 2021 and, fueled by investments in AI, has already doubled again. Early in the AI boom, in 2023, US data centers consumed 176 terawatt-hours of electricity, several times the annual consumption of Ireland (whose electric grid is itself nearly maxed out, prompting data centers there to use polluting off-grid generators), and that figure is expected to double or even triple as soon as 2028.

This rapid proliferation can put an enormous strain on local and regional resources—burdens that many host communities are not fully accounting for or prepared to meet.

“Demand for data centers and processing has just exploded exponentially because of AI,” says Kim Rueben, former senior fiscal systems advisor at the Lincoln Institute of Land Policy. Virginia and Texas have long had tax incentives in place to attract new data centers, and “other states are jumping on the bandwagon,” she says, hoping to see economic growth and new tax revenues.

But at a Land Policy and Digitalization conference convened by the Lincoln Institute last spring, Rueben likened the extractive nature of data centers to coal mines. “I don’t think places are acknowledging all the costs,” she says.

Yes, Virginia, There Is a Data Clause

At that conference, Chris Miller, executive director of the Piedmont Environmental Council, explained how roughly two-thirds of the world’s internet traffic passes through Northern Virginia. The region already hosts the densest concentration of data centers anywhere in the world, with about 300 facilities in just a handful of counties. Dozens more are planned or in development, ready to consume the region’s available farmland, energy, and water, enticed by a statewide incentive that saves companies more than $130 million in sales and use taxes each year.

Despite the state-level tax break, the data centers make significant contributions to local coffers. In Loudoun County, which has over 27 million square feet of existing data center space, officials expect the total real estate and personal property tax revenues collected from local data centers in fiscal year 2025 to approach $900 million, nearly as much as the county’s entire operating budget. The proportion of revenue derived from data centers has grown so lopsided that the county’s board of supervisors is considering adjusting the tax rate, so as not to be so reliant on a single source.

Existing and planned data centers in Northern Virginia. The state has been dubbed “the data center capital of the world.” Credit: Piedmont Environmental Council.

While many communities see data centers as an economic boon due to that tax revenue, the facilities themselves are not powerful long-term job engines. Most of the jobs they create are rooted in their construction, not their ongoing operation, and thus are largely temporary.

Decades ago, PEC supported some of the data center development in Northern Virginia, says Julie Bolthouse, PEC’s director of land policy. But the industry has changed dramatically since then. When AOL had its headquarters in what’s known as Data Center Alley, for example, the company’s data center was a small part of a larger campus, “which had pedestrian trails around it, tennis courts, basketball courts … at its peak, it had 5,300 employees on that site,” Bolthouse says. The campus has since been demolished, and three large data center facilities are being built on the site. “There’s a big fence around it for security purposes, so it’s totally isolated from the community now, and it is only going to employ about 100 to 150 people on the same piece of land. That’s the difference.”

The facilities have also gotten “massive,” Bolthouse adds. “Each one of those buildings is using as much as a city’s worth of power, so that power infrastructure is having a huge impact on our communities. All the transmission lines that have to be built, the eminent domain used to get the land for those transmission lines, all of the energy infrastructure, gas plants, pipelines that deliver the gas, the air pollution associated with that, the climate impacts of all of that.”

Across Northern Virginia, on-site diesel generators—thousands of them, each the size of a rail car—spew diesel fumes, creating air quality issues. “No other land use that I know of uses as many generators as a data center does,” Bolthouse says. And while such generators are officially classified as emergency backup power, data centers are permitted to run them for “demand response” for 50 hours at a time, she adds. “That’s a lot of air pollution locally. That’s particulate matter and NOx [nitrogen oxides], which impacts growing lungs of children, can add cases of asthma, and can exacerbate heart disease and other underlying diseases in the elderly.”

And then there’s the water issue.

‘Like a Giant Soda Straw’

A study by the Houston Advanced Research Center (HARC) and the University of Houston found that data centers in Texas will use 49 billion gallons of water in 2025, and as much as 399 billion gallons in 2030. That would be equivalent to drawing down the largest reservoir in the US—157,000-acre Lake Mead—by more than 16 feet in a year.

Anyone who’s accidentally left their phone out in the rain or dropped it in a puddle might wonder what a building full of expensive, delicate electronics could want with millions of gallons of water. It’s largely for cooling purposes. Coursing with electrical current, server stacks can get very hot, and evaporative cooling is among the simplest and cheapest ways to keep the chips from overheating.

What that means, however, is that the water isn’t just used for cooling and then discharged as treatable wastewater; much of it evaporates in the process—poof.

“Even if they’re using reclaimed or recycled water, that water is no longer going back into the base flow of the rivers and streams,” Bolthouse says. “That has ecological impacts as well as supply issues. Everybody is upstream from someone else.” Washington, DC, for example, will still lose water supply if Northern Virginia data centers use recycled or reclaimed water, because that water won’t make it back into the Potomac River. Evaporative cooling also leaves behind high concentrations of salts and other contaminants, she adds, creating water quality issues.

There are less water-intensive ways to cool data centers, including closed-loop water systems, which require more electricity, and immersion cooling, in which servers are submerged in a bath of liquid, such as a synthetic oil, that conducts heat but not electricity. Immersion cooling allows for a denser installation of servers as well, but is not yet widely used, largely due to cost.

Ironically, it can be hard to confirm specific data about data centers. Given the proprietary nature of AI technology and, perhaps, the potential for public backlash, many companies are less than forthcoming about how much water their data centers consume. Google, for its part, reported using more than 5 billion gallons of water across all its data centers in 2023, with 31 percent of its freshwater withdrawals coming from watersheds with medium or high water scarcity.

A 2023 study by the University of California, Riverside, estimated that an AI chat session of 20 or so queries uses up to a bottle of freshwater. That amount varies by platform, with more sophisticated models demanding larger volumes of water; other estimates suggest it could be closer to a few spoonfuls per query.

“But what goes unacknowledged, from a natural systems perspective, is that all water is local,” says Peter Colohan, director of partnerships and program innovation at the Lincoln Institute, who helped create the Internet of Water. “It’s a small amount of water for a few queries, but it’s all being taken from one basin where that data center is located—that’s thousands and thousands of gallons of water being drawn from one place from people doing their AI queries from all over the world,” he says.

“Wherever they choose to put a data center, it is like a giant soda straw sucking water out of that basin,” Colohan continues. “And when you take water from a place, you have to reduce demand or put water back in that same place, there’s no other solution.” In some cases, at least, major data center developers have begun to recognize this problem and are actively engaging in water replenishment where it counts.

Locating data centers in cooler, wetter regions can help reduce the amount of water they use and the impact of their freshwater withdrawals. And yet roughly two-thirds of the data centers built since 2022 have been located in water-stressed regions, according to a Bloomberg News analysis, including hot, dry climates like Arizona.

The warm water-cooling system at a Sandia Labs data center in Albuquerque, New Mexico. The data center earned LEED Gold certification for efficiency in 2020. Credit: Bret Latter/Sandia Labs via Flickr CC.

It’s not just cooling the server rooms and chips that consumes water. About half of the electricity currently used by US data centers comes from fossil fuel power plants, which themselves use a lot of water to produce the steam that turns their massive turbines.

And the millions of microchips processing all that information? By the time they reach a data center, each chip has already consumed thousands of gallons of water. Manufacturing these tiny, powerful computing components requires “ultrapure” treated water to rinse off silicon residue without damaging the chips. It takes about 1.5 gallons of tap water to produce a gallon of ultrapure water, and the typical chip factory uses about 10 million gallons of ultrapure water each day, according to the World Economic Forum—as much water as 33,000 US households use.
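For a sense of scale, those chip-fab figures can be checked with quick arithmetic. A minimal sketch, in which the average household’s daily water use (roughly 300 gallons, in line with common EPA estimates) is an assumption rather than a number from the sources above:

```python
# Back-of-the-envelope check of the chip fab water figures cited above.
# ASSUMPTION: an average US household uses ~300 gallons of water per day.

UPW_PER_DAY = 10_000_000       # gallons of ultrapure water per fab per day
TAP_PER_UPW = 1.5              # gallons of tap water per gallon of ultrapure
HOUSEHOLD_GAL_PER_DAY = 300    # assumed average US household use

tap_water_per_day = UPW_PER_DAY * TAP_PER_UPW              # 15 million gallons
household_equivalent = UPW_PER_DAY / HOUSEHOLD_GAL_PER_DAY

print(f"Tap water drawn per fab per day: {tap_water_per_day:,.0f} gallons")
print(f"Household equivalent: ~{household_equivalent:,.0f} households")
```

At 300 gallons per household per day, 10 million gallons works out to about 33,000 households, consistent with the World Economic Forum figure cited above.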


As communities weigh the benefits and risks of data center development, consumers might also consider their own role in the growth of these facilities, and whether their use of AI is worth the price of the water, power, and land it devours.

There could be important uses for artificial intelligence—if it can be harnessed to solve complex problems, for instance, or to improve the efficiency of water systems and electric grids.

There are clearly superfluous uses, too. A YouTube channel with 35 million subscribers, for example, features AI-generated music videos … of AI-generated songs. The MIT Technology Review estimates that, unlike simple text queries, using AI to create video content is extremely resource-heavy: Making a five-second AI-generated video uses about as much electricity as running a microwave nonstop for over an hour.
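To put that microwave comparison in rough numbers, here is a minimal sketch; the microwave’s power draw is an assumed typical value, not a figure from the MIT Technology Review estimate:

```python
# Rough energy scale implied by the microwave comparison above.
# ASSUMPTION: a typical microwave draws about 1.1 kW.

MICROWAVE_KW = 1.1        # assumed typical microwave power draw
RUNTIME_HOURS = 1.0       # "nonstop for over an hour"
CLIP_SECONDS = 5          # length of one AI-generated video clip

kwh_per_clip = MICROWAVE_KW * RUNTIME_HOURS
clips_per_minute = 60 / CLIP_SECONDS

print(f"~{kwh_per_clip:.1f} kWh per {CLIP_SECONDS}-second clip")
print(f"~{kwh_per_clip * clips_per_minute:.0f} kWh per minute of AI video")
```

On those assumptions, a single minute of AI-generated footage implies more than a dozen kilowatt-hours, roughly what a typical US home uses in half a day.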

Data center defenders tend to point out that Americans use more water each year to irrigate golf courses (more than 500 billion gallons) and lawns (over 2 trillion gallons) than AI data centers use. But that argument rings hollow: America has a well-documented addiction to green grass that is also not serving us well. The solution, water experts say, lies in water conservation and consumer education, not in comparing one wasteful use to another.


 

Putting a Finite Resource First

Even a small data center can place an immense, concentrated burden on local infrastructure and natural resources. In Newton County, Georgia, a Meta data center that opened in 2018 uses 500,000 gallons of water per day—10 percent of the entire county’s water consumption. And given Georgia’s cheap power and generous state tax breaks, Newton County continues to field requests for new data center permits—some of which would use up to 6 million gallons of water per day, more than doubling what the entire county currently consumes.

The intense demands that data centers place on regional resources make for complicated decision-making at the local level. Communities and regional water officials must engage in discussions about data centers early on, and with a coordinated, holistic understanding of existing resources and potential impacts on the energy grid and the watershed, says Mary Ann Dickinson, policy director for land and water at the Lincoln Institute. “We would like to help communities make smarter decisions about data centers, helping them analyze and plan for the potential impacts to their community structures and systems.”

“Water is often one of the last things that gets thought about, so one of the things that we’re really promoting is early engagement,” says John Hernon, strategic development manager at Thames Water in the UK. “So when you’re thinking about data centers, it’s not just about the speed you’re going to get, it’s not just about making sure there’s a lot of power available—we need to make sure that water is factored in at the earliest possible thinking … at the forefront, rather than an afterthought.”

Despite its damp reputation, London doesn’t receive much rainfall compared to the northern UK: less than 25 inches a year, on average, or roughly half of what falls in New York City. Yet because so much growth is centered on London, the Thames Water service area holds about 80 percent of the UK’s data centers, Hernon says, and another 100 or so are proposed.

What’s more, their water usage peaks during the hottest, driest times of the year, when the utility can least accommodate the extra demand. “That’s why we talk about restricting or reducing or objecting to [data centers],” Hernon says. “It’s not because we don’t like them. We absolutely get it, we need them ourselves. AI will massively help our call center … which means we can have more people out fixing leaks and proactively managing our networks.”

Keeping the Lights On

One way for data centers to use less water is to rely more heavily on air-cooling technology, but this requires more energy, which may in turn increase water use indirectly, depending on the power source. What’s more, regional grids are already struggling to meet the demand of these power-hungry facilities, and there are hundreds more in the works. “A lot of these projects have been announced, but it’s not clear what can come on fast enough to power them,” says Kelly T. Sanders, associate professor of engineering at the University of Southern California.

The government wants US technology companies to build their AI data centers domestically—not just for economic reasons, but for national security purposes as well. But even as the Trump administration appears to understand the enormous energy demands data centers will place on the electric grid, it has actively quashed new wind power projects, such as Revolution Wind off the coast of Rhode Island.

NREL (the National Renewable Energy Laboratory) created this overlay map of transmission lines and data center locations to “help visualize the overlap and simplify co-system planning.” Credit: NREL.gov.

Other carbon-free alternatives like small modular reactors (SMRs) and geothermal energy have bipartisan support, Sanders says. “But the problem is, even if you put shovels in the ground for an SMR today, it’s going to take 10 years,” she says. “The things that we can do the fastest are wind, solar, and batteries. But in the last six months we’ve lost a lot of the incentives for clean energy, and there’s an all-out war on wind. Wind projects that are already built, already paid for, are being canceled. And to me, that’s peculiar, because that’s electricity that would be ready to go out on the grid soon, in some of these regions that are really congested.”

Data centers are among the reasons ratepayers nationwide have seen their electric bills increase at twice the rate of inflation in the past year. Part of that is the new infrastructure data centers require, such as new power plants, transmission lines, and other investments. Those costs, as well as ongoing grid maintenance and upgrades, are typically shared by all electric customers in a service area, through charges added to utility bills.

This creates at least two issues. First, while the tax revenues of a new data center benefit only the host community, the entire electric service area must pay for the associated infrastructure. Second, if a utility makes that huge investment but the data center eventually closes or needs much less electricity than projected, it’s the ratepayers who will foot the bill, not the data center.

Some tech companies are securing their own clean power independent of the grid—Microsoft, for example, signed a 20-year agreement to purchase energy directly from the Three Mile Island nuclear plant. But that approach isn’t ideal either, Sanders says. “These data centers are still going to use transmission lines and all those grid assets, but if they’re not buying the electricity from the utility, they’re not paying for all that infrastructure through their rate bills,” she says.

Aside from generating new power, Sanders says, there are strategies to squeeze more capacity from the existing grid. “One is good old energy efficiency, and the data centers themselves have all of the incentives aligned to try to make their processes more efficient,” she says. AI itself could potentially also help enhance grid performance. “We can use artificial intelligence to give us more information about how power is flowing through the grid, and so we can optimize that power flow, which can give us more capacity than we would have otherwise,” Sanders says.

Another strategy is to make the grid more flexible. Most of the time, and in most regions of the US, we only use about 40 percent of the grid’s total capacity, Sanders says, give or take. “We build the capacity of the grid to meet the hottest day … and that’s where we worry about these large data center loads,” she says. A coordinated network of batteries, however—including in people’s homes and EVs—can add flexibility and stabilize the grid during times of peak demand. In July, California’s Pacific Gas and Electric Company (PG&E) conducted the largest-ever test of its “virtual power plant,” using residential batteries to supply 535 megawatts of power to the grid for two full hours at sundown.
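The arithmetic behind that test hints at the scale distributed batteries can reach. A minimal sketch; the per-battery discharge rate and capacity below are assumptions roughly in line with common residential batteries, not figures from PG&E:

```python
# Rough scale of the virtual power plant test described above.
# ASSUMPTIONS: each home battery can discharge ~5 kW and stores ~13.5 kWh
# of usable energy (roughly in line with common residential units).

VPP_OUTPUT_MW = 535    # peak output reported for the PG&E test
DURATION_H = 2         # hours sustained
BATTERY_KW = 5.0       # assumed per-battery discharge rate
BATTERY_KWH = 13.5     # assumed per-battery usable capacity

energy_delivered_mwh = VPP_OUTPUT_MW * DURATION_H        # 1,070 MWh
batteries_needed = VPP_OUTPUT_MW * 1_000 / BATTERY_KW    # ~107,000 homes
energy_per_battery = BATTERY_KW * DURATION_H             # 10 kWh, within capacity

print(f"Energy delivered: {energy_delivered_mwh:,.0f} MWh")
print(f"Participating batteries needed: ~{batteries_needed:,.0f}")
print(f"Drawn from each battery: {energy_per_battery:.0f} kWh")
```

Under those assumptions, sustaining 535 megawatts for two hours takes on the order of 100,000 participating home batteries, each contributing about 10 kilowatt-hours, well within a typical unit’s capacity.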

With some intentional, coordinated planning—”it’s not just going to happen naturally,” Sanders says—it may be possible to add more capacity without requiring a lot of new generation if data centers can reduce their workloads during peak times and invest in large-scale battery backups: “There is a world in which these data centers can actually be good grid actors, where they can add more flexibility to the grid.”

Confronting Trade-Offs With Land Policy

As the demand for data centers grows, finding suitable locations for these facilities will force communities to confront myriad and imperfect trade-offs between water, energy, land, money, health, and climate. “Integrated land use planning, with sustainable land, water, and energy practices, is the only way we can sustainably achieve the virtuous circle needed to reap the benefits of AI and the economic growth associated with it,” Colohan says.

For example, using natural gas to meet the anticipated electricity load of Texas data centers would require 50 times more water than using solar generation, according to the HARC study, and 1,000 times more water than wind. But while powering new data centers with wind farms would consume the least water, it would also require the most land—four times as much land as solar, and 42 times as much as natural gas.
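Expressed as ratios, those HARC multipliers make the trade-off concrete. A minimal sketch that normalizes water and land requirements to natural gas (the ratios come from the factors cited above; the study’s absolute quantities are not reproduced here):

```python
# Water and land requirements relative to natural gas (= 1.0), derived
# from the multipliers cited above: gas uses 50x solar's water and
# 1,000x wind's; wind uses 42x gas's land and 4x solar's.

water = {"natural gas": 1.0, "solar": 1 / 50, "wind": 1 / 1000}
land = {"natural gas": 1.0, "solar": 42.0 / 4, "wind": 42.0}

for source in ("natural gas", "solar", "wind"):
    print(f"{source:>11}: water x{water[source]:.3f}, land x{land[source]:.1f}")
# Wind needs the least water but the most land; gas is the reverse.
```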

Absent an avalanche of new, clean power, most data centers are adding copious amounts of greenhouse gases to our collective emissions, at a time when science demands we cut them sharply to limit the worst impacts of climate change. Louisiana regulators in August approved plans to build three new gas power plants to meet the expected electricity demand from Meta’s Hyperion AI data center.

As towns and counties compete with one another to attract data centers, the host communities reap the tax benefits while the costs—the intense water demand, the higher electricity bills, the air pollution from backup generators—are dispersed more regionally, including to areas that won’t see any new tax revenue.

That’s one reason data center permitting needs more state oversight, Bolthouse says. “The only approval that they really have to get is from the locality, and the locality is not looking at the regional impacts,” she says. PEC is also pushing for ratepayer protections and sustainability commitments. “We want to make sure we’re encouraging the most efficient and sustainable practices within the industry, and that we’re requiring mitigation when impacts can’t be avoided.”

Too close for comfort? A data center abuts homes in Loudoun County, Virginia. Credit: Hugh Kenny via Piedmont Environmental Council.

PEC and others are also pressing for greater transparency from the industry. “Very often, data centers are coming in with non-disclosure agreements,” Bolthouse says. “They’re hiding a lot of information about water usage, energy usage, air quality impacts, emissions—none of that information is disclosed, and so communities don’t really know what they’re getting into.”

“We need communities to be educated about what they’re facing, and what their trade-offs are when they let in a data center,” Colohan says. “What is the cost—the true cost—of a data center? And then how do you turn that true cost into a benefit through integrated land policy?”

Rueben says she understands the desire, especially in communities experiencing population loss, to tap into a growing industry. But rather than competing with each other to attract data centers, she says, communities ought to be having broader conversations about job growth and economic development strategies, factoring in the true costs and trade-offs these facilities present, and asking the companies to provide more guarantees and detailed plans.

“Forcing data center operators to explain how they’re going to run the facility more efficiently, and where they’re going to get their water from—and not just assuming that they have first access to the water and energy systems,” she says, “is a shift in perspective that we kind of need government officials to make.”


Jon Gorey is a staff writer at the Lincoln Institute of Land Policy.

Lead image: Data center facilities in Prince William County, Virginia. The county has 59 data centers in operation or under construction. Credit: Hugh Kenny via Piedmont Environmental Council.

Requests for Proposals

Exploratory Scenario Planning to Address Water Resilience in Latin America and the Caribbean

Submission Deadline: November 13, 2025 at 11:59 PM

The Lincoln Institute invites community-based partner organizations in Latin America or the Caribbean to apply to cohost an exploratory scenario planning (XSP) workshop on water resilience in 2026. Selected partners will work with the Lincoln Institute’s Consortium for Scenario Planning to design and deliver a locally grounded workshop that engages stakeholders in exploring a pressing water-related challenge through an immersive, participatory process. The XSP workshop will focus on understanding the impacts of local issues within a specific place or region, exploring multiple plausible futures, and identifying strategies to address uncertainty and build long-term water resilience.

The submission guidelines are also available in Spanish.



Requests for Proposals

Exploratory Scenario Planning to Address Water Resilience in Latin America and the Caribbean

Submission Deadline: November 13, 2025 at 11:59 PM

The Lincoln Institute invites applications from community partners in Latin America or the Caribbean interested in cohosting an exploratory scenario planning (XSP) workshop on water resilience in 2026. Selected partners will work with the Lincoln Institute’s Consortium for Scenario Planning to design and deliver a locally based workshop that engages stakeholders in investigating a pressing water challenge through a participatory, immersive process. The XSP workshop will focus on understanding the consequences of local issues in a specific place or region, exploring multiple plausible futures, and identifying strategies to respond to uncertainty and build long-term water resilience.

The application guide is also available in Portuguese.



Coming to Terms with Density: An Urban Planning Concept in the Spotlight 

By Anthony Flint, September 15, 2025
 

It’s an urban planning concept that sounds extra wonky, but it is critical in any discussion of affordable housing, land use, and real estate development: density.

In this episode of the Land Matters podcast, two practitioners in architecture and urban design shed some light on what density is all about, on the ground, in cities and towns trying to add more housing supply. 

The occasion is the revival of a Lincoln Institute resource called Visualizing Density, which was pushed live this month at lincolninst.edu after extensive renovations and updates. It’s a visual guide to density based on a library of aerial images of buildings, blocks, and neighborhoods taken by photographer Alex MacLean, originally published (and still available) as a book by Julie Campoli.

It’s a timely clearinghouse, as communities across the country work to address housing affordability, primarily by reforming zoning and land use regulations to allow more multifamily housing development—generally less pricey than the detached single-family homes that have dominated the landscape.

Residential density is understood to be the number of homes within a defined area of land, in the US most often expressed as dwelling units per acre. A typical suburban single-family subdivision might be just two units per acre; a more urban neighborhood, like Boston’s Back Bay, has a density of about 60 units per acre. 
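Because density is simply homes divided by land area, those two examples are easy to make concrete. A minimal sketch, in which the parcel sizes are hypothetical values chosen to reproduce the densities cited above:

```python
# Residential density expressed as dwelling units per acre, using the
# two examples from the paragraph above. Parcel sizes are hypothetical.

def units_per_acre(dwelling_units: int, acres: float) -> float:
    """Residential density: homes per acre of land."""
    return dwelling_units / acres

# A typical suburban subdivision: e.g., 100 homes spread across 50 acres.
print(units_per_acre(100, 50))   # 2.0 units per acre

# A Back Bay-style neighborhood: e.g., 240 homes on a 4-acre block.
print(units_per_acre(240, 4))    # 60.0 units per acre
```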

Demographic trends suggest that future homeowners and renters will prefer greater density in the form of multifamily housing and mixed-use development, said David Dixon, a vice president at Stantec, a global professional services firm providing sustainable engineering, architecture, and environmental consulting services. Over the next 20 years, the vast majority of households will continue to be made up of professionals without kids, he said, and will not be interested in big detached single-family homes.

Instead they seek “places to walk to, places to find amenity, places to run into friends, places to enjoy community,” he said. “The number one correlation that you find for folks under the age of 35, which is when most of us move for a job, is not wanting to be auto-dependent. They are flocking to the same mixed-use, walkable, higher-density, amenitized, community-rich places that the housing market wants to build … Demand and imperative have come together. It’s a perfect storm to support density going forward.” 

Tensions often arise, however, when new, higher density is proposed for existing neighborhoods, on vacant lots or other redevelopment sites. Tim Love, principal and founder of the architecture firm Utile, and a professor at Harvard University’s Graduate School of Design, said he’s seen the wariness of established residents as he helps cities and towns comply with the MBTA Communities Act, a Massachusetts state law that requires communities to zone districts near transit stations for multifamily housing at an allowable density of at least 15 units per acre.

Some towns have rebelled against the law, which is one of several state zoning reform initiatives across the US designed to increase housing supply, ultimately to help bring prices down. 

Many neighbors are skeptical because they associate multifamily density with large apartment buildings of 100 or 200 units, Love said. But most don’t realize there is an array of so-called “gentle density” development opportunities for buildings of 12 to 20 units that have the potential to blend in more seamlessly with many streetscapes.

“If we look at the logic of the real estate market, discovering over the last 15, 20 years that the corridor-accessed apartment building at 120 and 200 units-plus optimizes the building code to maximize returns, there is a smaller ‘missing middle’ type that I’ve become maybe a little bit obsessed about, which is the 12-unit single-stair building,” said Love, who conducted a geospatial analysis that revealed 5,000 sites in the Boston area that were perfect for a 12-unit building. 

“Five thousand times twelve is a lot of housing,” Love said. “If we came up with 5,000 sites within walking distance of a transit stop, that’s a pretty good story to get out and a good place to start.” 

Another dilemma of density is that while big increases in multifamily housing supply theoretically should have a downward impact on prices, many individual dense development projects in hot housing markets are often quite expensive. Dixon, who is currently writing a book about density and Main Streets, said the way to combat gentrification associated with density is to require a portion of units to be affordable, and to capture increases in the value of urban land to create more affordability. 

“If we have policies in place so that value doesn’t all go to the [owners of the] underlying land and we can tap those premiums, that is a way to finance affordable housing,” he said. “In other words, when we use density to create places that are more valuable because they can be walkable, mixed-use, lively, community-rich, amenitized, all these good things, we … owe it to ourselves to tap some of that value to create affordability so that everybody can live there.” 

Visualizing Density can be found at the Lincoln Institute website at https://www.lincolninst.edu/data/visualizing-density/. 

Listen to the show here or subscribe to Land Matters on Apple Podcasts, Spotify, Stitcher, YouTube, or wherever you listen to podcasts.

 


Further reading 

Visualizing Density | Lincoln Institute

What Does 15 Units Per Acre Look Like? A StoryMap Exploring Street-Level Density | Land Lines

Why We Need Walkable Density for Cities to Thrive | Public Square

The Density Conundrum: Bringing the 15-Minute City to Texas | Urban Land

The Density Dilemma: Appeal and Obstacles for Compact and Transit Oriented Development | Anthony Flint

 


Anthony Flint is a senior fellow at the Lincoln Institute of Land Policy, host of the Land Matters podcast, and a contributing editor of Land Lines.