Topic: Technology and Tools

City Tech

Chattanooga’s Big Gig
By Rob Walker, October 1, 2015

Universal high-speed Internet access is a popular dream these days—everyone from the president to Google, Inc., has embraced it. And the tech press is full of testy critiques wondering why typical broadband speeds in the United States lag so far behind those in, say, South Korea.

Just five years ago, this wasn’t such a hot topic. Back then, the discussion—and action—wasn’t led by the federal government or the private sector. The first movers were a number of diverse but forward-thinking municipalities: cities and towns like Chattanooga, Tennessee; Lafayette, Louisiana; Sandy, Oregon; and Opelika, Alabama.

Motives and solutions varied, of course. But as high-speed connectivity is increasingly recognized as crucial civic infrastructure, Chattanooga makes for a useful case study. Its journey to self-proclaimed “Gig City” status—referring to the availability of Internet connections with 1 gigabit-per-second data transfer speeds, up to 200 times faster than the typical broadband speed available to many Americans—started with a visionary municipal initiative and was built upon through thoughtful public and private coordination. Most recently, this effort has even begun to show tangible effects on city planning and development, particularly in the form of an in-progress reimagining of a long-sleepy downtown core. In short, Chattanooga is starting to answer a vital question: once a city has world-class Internet access, what do you actually do with it?

The story begins more than a decade ago, when Chattanooga’s city-owned electric utility, EPB, was planning a major upgrade to its power grid. Its CEO, Harold DePriest, argued for a plan that involved deploying fiber-optic cable that could also be used for Internet access. After clearing local regulatory hurdles, the new system was built out by 2010, and every EPB power customer in the Chattanooga area—meaning pretty much every home or business—had gigabit access. But you had to pay for it, just like electricity. And the early pricing for the fastest access was about $350 a month.

“They had very, very few takers,” recalls Ken Hays, president of The Enterprise Center, a nonprofit that since 2014 has focused, at the behest of local elected officials, on strategizing around what Chattanoogans call “the gig.” The head of Lamp Post Group, a successful local tech-focused venture firm, made a point of signing up immediately, Hays continues. But on a citywide level, “we didn’t have the excitement” that talk of gig-level access generates today. And in 2010, he adds, “there weren’t many good case studies out there.”

But broader change was afoot. The announcement of Google Fiber—the Internet search giant’s foray into building out high-speed online infrastructure—sparked new interest. And in 2013, Jenny Toomey, a Ford Foundation director focused on Internet rights, helped organize a summit of sorts where officials from municipalities like Chattanooga, Lafayette, and elsewhere could meet and compare notes. “It was still pretty nascent at the time,” recalls Lincoln Institute President and CEO George W. McCarthy, an economist who was then director of metropolitan opportunity at the Ford Foundation. But that summit, he continues, helped spark new conversations about how such initiatives can make cities more competitive and more equitable, and less reliant on the purely private-sector solutions we often assume are more efficient than government. “And over the course of two years since, this issue has just exploded,” he says.

In fact, that summit turned out to be the rare event that actually spawned a new organization: Next Century Cities, founded in 2014, now has more than 100 member municipalities. They share best practices around an agenda that treats high-speed Internet access as a fundamental, nonpartisan infrastructure issue that communities can and should control and shape.

Against this backdrop, Chattanooga was taking steps to demonstrate how “the gig” could be leveraged. The Lamp Post Group had moved into downtown space, and superlative Internet access was just a starting point for the young, tech-savvy workers and entrepreneurs it wanted to attract. “If we don’t have housing, if we don’t have open space, if we don’t have cool coffee shops—they’re going to go to cities that have all that,” says Kim White, president and CEO of nonprofit development organization River City Company.

Starting in 2013, a city-center plan and market study conducted by River City proposed strategies to enhance walkability, bikeability, green space, and—especially—housing options. More than 600 people participated in the subsequent planning process, which ultimately targeted 22 buildings for revitalization (or demolition). Today, half of those are being redeveloped, says White, and more than $400 million has been invested downtown; in the next year and a half, 1,500 apartments will be added to the downtown market, plus new student housing and hotel beds. The city has provided tax incentives, some of which are designed to keep a certain percentage of the new housing stock affordable. The city has also invested $2.8 million in a downtown park that’s a “key” part of the plan, White continues, to “have areas where people can come together and enjoy public space.” One of the apartment projects, the Tomorrow Building, will offer “micro-units” and a street-level restaurant. “I don’t think we would have attracted these kinds of businesses and younger people coming to look,” without the gig/tech spark, White concludes. “It put us on the map.”

The gig also inspired a city-backed initiative to identify core development strategies, which led the Enterprise Center to champion a downtown “innovation district,” says Hays. Its centerpiece involves making over a 10-story office building into The Edney Innovation Center, featuring co-working spaces as well as the headquarters of local business incubator CO.LAB. The University of Tennessee at Chattanooga has a project involving a 3D printer lab in the innovation district, and even the downtown branch of the Chattanooga Public Library has been made over to include a tech-centric education space.

EPB, whose original fiber-optic vision set the Gig City idea in motion, has long since figured out more workable pricing schemes—gig access now starts at about $70 a month—and drawn more than 70,000 customers. More recently, it has also offered qualified low-income residents 100-megabit access, which is still much faster than most broadband in the U.S., for $27 a month. And its efforts to expand into underserved areas adjacent to Chattanooga have become an important component of broader efforts to challenge regulations in many states, from Texas to Minnesota to Washington, that effectively restrict municipalities from building their own high-speed access solutions.

In short, a lot has changed—in Chattanooga and in other cities and towns that have pushed for Internet infrastructure that the private sector wasn’t providing. “Most of this work right now is happening at the local level,” says Deb Socia, who heads Next Century Cities. “It’s mayors and city managers and CIOs taking the steps to figure out what their city needs.” The implications for crucial civic issues from education to health care to security are still playing out. And precisely because the thinking and planning is happening on a municipal level, it won’t be driven solely by market considerations that favor what’s profitable instead of what’s possible. “The beauty of it is,” McCarthy summarizes, “it’s a both/and argument.”

Rob Walker (robwalker.net) is a contributor to Design Observer and The New York Times.

Communications Technology and Settlement Patterns

Benjamin Chinitz and Thomas Horan, September 1, 1996

In four years, there will be a fresh count of Americans. The 2000 Census will reveal how many of us there are, who we are in terms of race, nativity, income, family size and occupation, what kind of housing we occupy, where we live and where we work.

All these numbers, but especially the latter two, will reflect what is happening to what planners and social scientists call settlement patterns. The Census will show how people and jobs are distributed regionally between North and South and East and West; within regions between metropolitan and non-metropolitan areas; and within metropolitan areas between cities and suburbs.

Settlement patterns have been transformed radically in the twentieth century (see graph 1). On a regional basis, the trend has been from East to West and North to South. In the decade between 1980 and 1990, for example, three states in the West and South accounted for 50 percent of the nation’s population growth: California, Florida and Texas.

Within all regions, the trend has been toward ever larger metropolitan agglomerations. By 1990, metropolitan areas of 1,000,000 or more accounted for 50 percent of the nation’s population. Within metropolitan areas, cities grew faster than suburbs at the beginning of the century, but by the 1950s the trend was sharply in favor of the suburbs, which now account for more than half of the nation’s population.

Will the 2000 Census confirm the continuation of these trends? What stakes do we have in the outcome? Quite a few. We worry about trends that erode the economic base of cities because we are concerned about job opportunities for the poor who are committed, by choice or circumstance, to live in the city. We are also concerned about the health of the tax base, which affects the capacity of the local government to deal with the needs of all its residents.

We also worry about land use patterns in the suburbs which both require and increase auto-dependency. This trend in turn leads to more auto travel, aggravates congestion, pollutes the air, and complicates our international relations because of our heavy dependence on imported oil.

We are in the throes of a revolution comparable in scope to the revolution in transportation technology that heavily influenced settlement patterns in the nineteenth and twentieth centuries. The transportation revolution, from ships and trains to cars and planes, made it possible for both workers and their employers to have a wider choice of locations.

The pace of the revolution in data processing and communications, which began slowly in the middle of the twentieth century, has quickened rapidly in recent years. We speak of a post-industrial information economy. By that we mean that information constitutes an ever-increasing share of the Gross National Product, both as “input” to the production of other goods and services and as “output” in the form of entertainment and related activities.

Household Location Decisions

How will settlement patterns be affected by the transition to an information economy? Let us first consider the worker’s choice of a residential location. In classical urban economics, this choice is seen as a “trade-off” between the merits of a particular place in terms of quality of life and the cost of commuting to work. As the transportation revolution reduced the time and money costs of commuting, more and more workers were able to afford to locate in what they considered an attractive suburb that offered the lifestyle they preferred: a private home with a lawn, good schools, parks and open space, shopping facilities, and friendly neighbors.
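The classical trade-off described above can be sketched as a simple scoring model. All of the dollar figures below are invented for illustration; nothing here comes from the article itself:

```python
def location_score(amenity_value: float, monthly_commute_cost: float,
                   commute_hours: float, hourly_time_value: float) -> float:
    """Classical urban-economics trade-off: a household weighs a place's
    quality of life (amenity_value, dollars per month) against the money
    and time costs of commuting from it."""
    return amenity_value - monthly_commute_cost - commute_hours * hourly_time_value

# A close-in city location versus an attractive but distant suburb:
city = location_score(amenity_value=900, monthly_commute_cost=50,
                      commute_hours=10, hourly_time_value=20)    # 650
suburb = location_score(amenity_value=1200, monthly_commute_cost=150,
                        commute_hours=40, hourly_time_value=20)  # 250

# Cheaper, faster travel (the transportation revolution) halves the
# suburb's commute burden, and the suburb overtakes the city:
suburb_after = location_score(1200, 75, 20, 20)                  # 725
```

The same mechanism the article goes on to describe for the communications revolution operates here: lower the cost of distance, and outlying places win more location decisions.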

The New York Times of July 14, 1996, reports that because of the revolution in communications and data processing, accompanied by company downsizing, as many as 40 million people work at least part time at home, with about 8,000 home-based businesses starting daily.

Logic suggests that some of this new-found workplace freedom will manifest itself in location choices that favor places considered desirable, be they in the farther reaches of suburbia, exurbia, or rural America. On the other hand, if these dispersed self-employed workers end up commuting less, their freedom may not “cost” the society more in terms of congestion and pollution.

Business Location Decisions

What about the conventional company and its location decisions? Like the household, the company does a “balancing” act when it chooses a location. From the perspective of product distribution, Place A might be preferred. From the perspective of the inputs of materials, Place B might be ideal. From the point of view of labor costs, Place C might be best. For tax purposes and related “public” issues, Place D might be most beneficial.

If the entire company has to be in one place, then compromise is inevitable. But if the communications revolution permits the “dis-integration” of the company via the physical separation of functions or the “outsourcing” of particular functions, then what used to be one location decision becomes a multiplicity of decisions, each component responding to a compelling argument for a particular place.
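The contrast between the “balancing act” and “dis-integration” can be made concrete with a small sketch; the place scores below are invented for illustration:

```python
# Each corporate function scores the candidate places A-D; higher is better.
scores = {
    "distribution": {"A": 9, "B": 5, "C": 4, "D": 6},
    "materials":    {"A": 5, "B": 9, "C": 5, "D": 4},
    "labor":        {"A": 4, "B": 5, "C": 9, "D": 5},
    "tax":          {"A": 5, "B": 4, "C": 5, "D": 9},
}

def single_site(scores: dict) -> str:
    """One location for the whole company: the best overall compromise."""
    places = next(iter(scores.values())).keys()
    return max(places, key=lambda p: sum(fn[p] for fn in scores.values()))

def split_sites(scores: dict) -> dict:
    """A 'dis-integrated' company: each function picks its own best place."""
    return {fn: max(opts, key=opts.get) for fn, opts in scores.items()}

# single_site(scores) returns the one compromise place;
# split_sites(scores) gives each function its ideal place.
```

Once functions can be separated physically, one compromise decision becomes four uncompromised ones, which is exactly the multiplicity of decisions the paragraph describes.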

The classic example is the “front” office of a bank or insurance company in the midst of a congested city center with the “back” office in a rural area in another region or even in another country.

Settlement Trends

How these changes in household and business location choices will ultimately affect settlement patterns in metropolitan America was the subject of a major study by the Office of Technology Assessment (OTA), an agency that served the U.S. Congress for many decades but was abolished by the Congress in 1995. The summary chapter in The Technological Reshaping of Metropolitan America states that “technology is connecting economic activities, enabling them to be physically farther apart, reducing the competitive advantage of high-cost, congested urban locations, and allowing people and businesses more (but not total) freedom to choose where they will live and work.”

But OTA concludes that “the new wave of information technologies will not prove to be the salvation of a rural U.S. economy that has undergone decades of population and job loss as its natural resource-based economy has shrunk.” Rather, most economic activity will locate in large and medium-sized metropolitan areas (see graph 2).

“Technological change . . . threatens the economic well-being of many central and inner cities, and older suburbs of metropolitan areas,” the report continues. Overall, the trends suggest that these places will find it hard to compete without economic development policies designed to offset their competitive disadvantages.

In short, the OTA expects that, the communications revolution notwithstanding, the 2000 Census will report a continuation of the trends manifested throughout the latter half of the twentieth century. The favored locus of activity in both residential and business terms will be the outer suburbs of metropolitan areas. Given our concerns with the adverse effects of prevailing settlement patterns, the challenge to land policy is greater than ever.

______________

Benjamin Chinitz is an urban economist who served as director of research at the Lincoln Institute from 1987 to 1990. He continues to serve as a faculty associate at the Institute and as visiting professor in urban and regional planning at Florida Atlantic University.

Thomas Horan is director of Applied Social and Policy Research at Claremont Graduate School in Claremont, CA.

Report from the President

Improving Access to Land and Tax Data
Gregory K. Ingram, January 1, 2010

A major tragedy of empirical work is the low ratio of analysis to data, in part due to the lack of publicly available datasets. Many data collectors are reluctant to share data with other researchers until they have harvested all its new insights. Accordingly, researchers often collect new data because they cannot access existing information.

A new initiative of the Lincoln Institute is to compile data relevant to the analysis of land and tax policy, make it available on our Web site, and encourage new research. Three very different datasets are currently available, and a fourth is under development.

Significant Features of the Property Tax. This database title refers to the well-known publication, Significant Features of Fiscal Federalism, produced by the Advisory Commission on Intergovernmental Relations, which between 1959 and 1996 reported on the relationships among local, state, and national levels of government. This online and interactive database, produced and continually updated in partnership with the George Washington Institute of Public Policy, presents property tax data for all 50 states.

Great care is taken to ensure that data reported across jurisdictions are comparable and similarly defined. Users may access property tax information and data online in standard tables or create new downloadable tables containing the specific data they seek. Unlike many interactive databases, Significant Features also includes many text entries in its tables that explain, for example, how each state categorizes property, defines taxable value, and restricts or caps rates and assessments.

Land and Property Values in the U.S. These more traditional tabular files contain numeric data on the values and rents of residential properties in the United States. The national ratio of rents to prices for the stock of all owner-occupied housing is available quarterly from 1960 to the present. National indices of prices and values of housing (land inclusive of structures), land, and structures are available quarterly from 1975 to the present and annually from 1930 to the present. For 46 metropolitan areas, quarterly indices of prices and values of single-family, owner-occupied housing (land inclusive of structures), land, and structures are available from 1985 to the present.
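As a pointer to how the headline series is defined: the ratio of rents to prices is simply the implicit annual rent of the owner-occupied stock divided by its value. A minimal sketch, with example numbers that are invented rather than drawn from the dataset:

```python
def rent_to_price_ratio(annual_rent: float, price: float) -> float:
    """Ratio of rents to prices: the implicit annual rent a home would
    command, divided by its market value."""
    if price <= 0:
        raise ValueError("price must be positive")
    return annual_rent / price

# A home valued at $250,000 with an implicit rent of $1,250 per month:
ratio = rent_to_price_ratio(annual_rent=1250 * 12, price=250_000)  # 0.06
```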

The implicit rents of owner-occupied housing, the value of structures, and the value of residential land are rarely observed directly, and therefore are estimated using techniques that are explained on the Web site. These data were created and are updated by Morris A. Davis, a fellow at the Lincoln Institute and faculty member at the University of Wisconsin School of Business, Department of Real Estate and Land Economics.

University Real Estate Development Cases. Many university real estate development projects involve the expansion of facilities, the upgrading of neighboring properties, and long-term investment in real estate. Such projects are often controversial when they displace current residents and businesses or transform neighborhoods. As part of the Lincoln Institute’s research on town-gown issues, this database presents quantitative and qualitative information on 897 projects that are outside traditional campus boundaries. These cases provide a useful composite picture of recent university real estate activities.

Digital Maps of Urban Spatial Extension. Visiting fellow Shlomo Angel is examining the spatial growth of a sample of global cities and has created a set of digital maps derived from satellite data and historic sources. Focusing on measures of developed versus undeveloped land, the maps form the basis for several Lincoln Institute working papers on the spatial growth of cities over time. The maps will exist as digital files that can be downloaded and analyzed by others who want to pursue related work.

These datasets are the Lincoln Institute’s first steps toward increasing the availability of data to researchers, analysts, policy makers, and concerned citizens with an interest in land policy and taxation. The information is freely accessible on the Tools and Resources section of the Institute Web site at www.lincolninst.edu.

Tracking Progress

PolicyMap Democratizes Data Analysis
Alex Ulam, October 1, 2015

Housing prices are spiraling upward in many areas of the United States, limiting Americans’ ability to save and driving gentrification in once-affordable neighborhoods. But as with many public policy challenges, it is not always possible to tell at a glance where the most serious problems lie. That became very clear to Helen Campbell, an analyst in Los Angeles’s Housing and Community Investment Department, one Friday afternoon in July. An information request from the mayor’s office led her to discover that a large share of renters in Los Angeles’s San Fernando Valley bore the highest rental housing cost burden in the entire country, a condition the U.S. Department of Housing and Urban Development (HUD) defines as households spending more than 30 percent of their income on rent.
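HUD’s 30 percent rule translates directly into code; in this minimal sketch, the 30 percent threshold is the only figure taken from the article:

```python
def is_cost_burdened(monthly_housing_cost: float, monthly_gross_income: float,
                     threshold: float = 0.30) -> bool:
    """HUD's rule of thumb: a household is cost-burdened when housing
    costs exceed 30 percent of gross income."""
    if monthly_gross_income <= 0:
        raise ValueError("income must be positive")
    return monthly_housing_cost / monthly_gross_income > threshold

def burdened_share(households) -> float:
    """Share of (housing_cost, income) pairs that are cost-burdened."""
    flags = [is_cost_burdened(cost, income) for cost, income in households]
    return sum(flags) / len(flags)

# A renter paying $1,400 out of $3,500 in monthly income is burdened (40%).
```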

Los Angeles officials knew there were areas where homeowners and renters struggled to afford housing, says Campbell, but they had no idea how severe the situation was or even where it was most acute. The mayor’s office needed reliable data on this troubling trend in order to make the case for preserving the HOME Investment Partnerships Program, the largest federal block grant program for affordable housing. Congress is currently considering a Senate bill that would eliminate the program.

Had Campbell used conventional Geographic Information System (GIS) software, analyzing the city’s housing costs would have taken far too long. But she was able to pull the information quickly with a few simple queries in PolicyMap, a unique web-based tool that is changing how planning data are acquired and presented. “If we hadn’t had PolicyMap, we simply would have declined the request,” Campbell says. “It would have taken us too long to do the work.”

When Campbell ran her query in PolicyMap, she found that the 29th congressional district, part of which lies within the city of Los Angeles, ranked first among the country’s 435 congressional districts in rental housing cost burden and third in homeowner cost burden. For the 29th district, which includes a large portion of the San Fernando Valley, that statistic means 62.9 percent of renters and just over 50 percent of homeowners were cost-burdened. “We thought South L.A. or Northeast L.A. would have a higher rental cost burden, but it’s actually the Valley,” says Campbell.

Public Data for All

Since its launch in 2007, PolicyMap has become the largest geographic database on the web, and it is the public information resource most widely used by financial institutions, universities, nonprofits, and some 2,500 government agencies. The online tool now carries more than 37,000 indicators, in categories ranging from crime to grocery store access, and it greatly eases access to public data. Last year, the site drew 434,000 unique visitors. Most of the data housed in PolicyMap is free, but paid subscriptions unlock proprietary data from several vendors. Overall, PolicyMap’s mapping tools are easy to use and have helped democratize data analysis, putting it within reach of local governments and nonprofits, which generally lack the resources to hire teams of GIS specialists. The site can help anyone involved in public policy who lacks independent access to digitized data, helping to bridge the digital divide.

One of the site’s most notable features is its ability to display different types of indicators simultaneously, such as federal Superfund cleanup sites, neighborhood income levels, or housing developments financed with low-income housing tax credits. That capability can support current planning initiatives such as the Obama administration’s Promise Zone and Choice Neighborhoods programs, which require interagency collaboration and emphasize coordinating different types of investments in low-income areas.

PolicyMap also lets users track the effectiveness of specific programs over a given period, helping them build on successes or cut losses down the road. Although government money is distributed mainly by formula, there has been a marked increase in competitive grant programs that require progress reports and data documenting need in detail. When it comes to competitive grants, “the cities that have better data and submit the most polished proposals are obviously going to have an advantage over the others,” says Lincoln Institute President and CEO George W. McCarthy.

The Starting Point

PolicyMap is the brainchild of The Reinvestment Fund (TRF), a Philadelphia-based Community Development Financial Institution (CDFI) that manages $839 million in capital and invests in low-income people and neighborhoods. The organization finances a broad range of community building blocks, such as affordable housing, childcare centers, and grocery stores. PolicyMap grew out of TRF’s need to track, on the ground, how these community programs were performing.

In the early 2000s, TRF began exploring ways to organize and understand the impact of its own investments. “We were trying to decide where to make investments over time,” says PolicyMap President Maggie McCullough, then a researcher in TRF’s Policy Department. “We also wanted to know what kind of impact we were having and how we were changing the markets we worked in.”

In 2005, the state of Pennsylvania hired TRF to collect and organize a large amount of data on housing prices, foreclosures, and incomes. The goal of the project was to enable officials to think more strategically about how to deploy the state’s housing dollars across its jurisdiction. But even with a contract worth nearly $200,000, there were limits to what TRF could do. The data and maps were delivered in a fixed format on disc. “After we handed over the disc,” McCullough says, “I remember thinking it would be like a printed report: it would sit on a shelf and never be updated.”

That realization inspired McCullough and others at TRF to imagine building a web-based mapping platform that would allow the data to be updated and let users upload their own datasets. To develop PolicyMap, McCullough drew on her experience as a pioneer in designing public information web portals. In the 1990s, she was part of the team that built the first website for the U.S. Department of Housing and Urban Development (HUD). “My experience [at] HUD made me realize that if someone who isn’t a researcher needs or wants to understand data, we have to make it easy to understand,” McCullough says. “We had to give data indicators plain names and simple descriptions, just as we had to do for HUD’s programs.”

McCullough wanted PolicyMap to serve the entire country, unlike other data initiatives focused on local geographies. When PolicyMap launched in 2007, “there really wasn’t any GIS online,” McCullough explains. “You could get driving directions or find a local restaurant with Google Maps, but most GIS software was locked up in desktop computers. We wanted to create something the public could access easily, over the web.”

The first dataset TRF loaded into PolicyMap, in 2007, consisted of reports filed under the Home Mortgage Disclosure Act (HMDA), the government’s most important data source for detecting predatory and discriminatory lending. At the time, the housing bubble was bursting, and government officials and law enforcement were searching desperately for ways to contain the unfolding crisis; the first place they would look for information was the HMDA data. But the HMDA data were not organized in a GIS-friendly format, which made certain kinds of searches extremely difficult. For example, if a GIS-savvy researcher wanted to focus on a section of Detroit suspected of harboring a large number of high-cost loans, there was no online tool available to extract the HMDA data for that particular area.

PolicyMap’s early success in displaying public data helped attract major paying clients, such as the Federal Reserve Board in Washington, D.C., which at the time was responsible for collecting the HMDA data. In addition to loading all of the HMDA data for mapping and making it publicly available, McCullough’s team built a custom PolicyMap reporting tool for the Federal Reserve that allowed its staff to extract HMDA data for any locality of interest. “We made it easier [for the Federal Reserve] to access its own data,” McCullough says.

A Level Playing Field

Large lenders and real estate investors typically subscribe to toolkits that can run to six figures for access to services offering proprietary information, such as property valuation reports and detailed market research. But many community organizations and local governments cannot afford to license these data. And even if they could pay for such expensive subscriptions, many would lack the staff or the GIS capacity to use them in interactive maps.

Consider, for example, NeighborWorks, a national network of 240 community organizations that does not have a GIS specialist. Harry Segal, a management and planning specialist at NeighborWorks America, says PolicyMap has changed the equation for his network by giving it access to data and mapping tools it otherwise could not afford. “Any developer, public or private, looking to start work in a new neighborhood has to court the powers that be and demonstrate an understanding of local market conditions,” Segal says. “For nonprofits, compiling this kind of data is much harder.” Without PolicyMap, he says, “the juice is almost not worth the squeeze.”

NeighborWorks’ PolicyMap subscription, which costs $5,000 a year, provides access to this kind of proprietary data and lets members of the organization query different sections of a map for information on a variety of indicators, such as the average income of residents in a given neighborhood or the level of high-cost mortgages issued in the area. This ability to analyze at different geographic scales empowers local community groups trying to secure funding or call attention to predatory lending in their neighborhoods. “We have a couple of organizations in upstate New York. If you look up statistics for that region, they’ll be skewed by the dominance of New York City,” Segal says. “But with PolicyMap, we can pull data by census tract or division.”

Algunas agencias municipales tampoco tienen capacidad para diseñar o mantener los tipos de bases de datos a los que ahora pueden acceder por medio de una suscripción a PolicyMap. “Soy la única persona aquí que sabe de SIG”, dice Sara Eaves, analista de planificación y política para la Autoridad de Vivienda de San Antonio. Agrega que PolicyMap permite a muchas personas de su oficina realizar tareas que de otra manera exigirían una capacitación especializada. Con su suscripción a PolicyMap, la Autoridad de Vivienda de San Antonio también puede publicar datos sobre escuelas, tasas de vacancia residencial, niveles de ingreso en los barrios y otras informaciones que un residente municipal podría considerar al decidir dónde comprar una casa o alquilar un apartamento. “Podríamos mantener bases de datos similares en nuestra agencia, pero no tenemos los recursos. PolicyMap nos permitió poner mapas interactivos en nuestro sitio web, con lo cual no sólo disponemos de información internamente sino que también la ponemos al alcance del público en general”.

Streamlining the Process for Cities and Community Groups

Many policy analysts use traditional GIS software such as Esri alongside the simplified GIS tools available in PolicyMap. Campbell, of Los Angeles's Housing + Community Investment Department, says that Esri offers forecasting capabilities and certain types of complex analysis that aren't possible with PolicyMap. But she notes that PolicyMap saves her time and makes it easier to explain her research to nonspecialists. "I like PolicyMap because it's based on hard data and is irrefutable," she says, whereas Esri includes predictions about the future. "Sometimes, when you deliver a community analysis report with Esri data, there is too much information to digest. There will be information for 2005, 2010, and 2015. But for the 2020 information there is a formula for how they created the forecast, which may not be necessary and may be wrong."

PolicyMap is also flexible enough to respond to users' changing needs. As data requirements have grown larger and more complex, longtime PolicyMap customers have asked for new tools to help improve efficiency. For example, Melissa Long, deputy director of Philadelphia's Office of Housing and Community Development, had been using PolicyMap to display aggregated, simplified census data. But several years ago she realized that her agency needed more comprehensive analytic tools in order to apply for a growing number of competitive grants.

"We needed a lot of neighborhood demographic information, and to know what types of city programs were being implemented," says Long, noting that the city data available through PolicyMap has improved coordination among city agencies as well as the city's positioning when applying for competitive grants.

Long says the tools PolicyMap developed for Philadelphia will let the city track its progress on a Choice Neighborhoods Implementation Grant, which supports local strategies for neighborhoods struggling with public or HUD-assisted housing. "The grant covers a five-year period. If we see that our neighborhood stabilization proposal isn't working," she says, "we can make midcourse corrections to the grant."

The ability to analyze different types of data simultaneously also lets researchers trace the shared benefits of a particular investment. Philadelphia, for example, runs two separate programs for cleaning and greening vacant lots. PolicyMap allows users to view the lots rehabilitated under both programs at once and study whether they have improved quality of life in the surrounding neighborhoods. Philadelphia's contract with PolicyMap has made it possible to overlay data from multiple studies, such as one from the University of Pennsylvania's Wharton School showing that property values rose 17 percent on average around beautified lots, and another showing that gun crimes declined significantly in those areas. A third co-benefit is the hundreds of summer jobs needed to maintain the rehabilitated lots. "You can't look at housing alone," says Long. You have to consider "everything else that is happening in a neighborhood."

One of PolicyMap's most popular analytic tools is the Market Value Analysis (MVA), which TRF developed for Philadelphia and has since replicated in roughly 18 other cities. The MVA gauges the strength of different areas of a city by displaying color-coded map sections with assigned values, ranging from "Distressed" up to "Regional Choice," the highest classification. The classification is produced with a technique called cluster analysis, which evaluates census tracts according to groups of indicators such as home sale activity, vacancy rates, and foreclosures. Clicking on any section of the map brings up a table with the data used to determine that specific area's classification. Neighborhoods classified as Regional Choice, says McCullough, generally have strong sales, low vacancy rates, and a mix of home owners and renters.
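The cluster analysis named here as the basis of the MVA can be illustrated with a toy example. The sketch below runs a bare-bones k-means over two invented indicators (median sale price and vacancy rate) for hypothetical census tracts; it is not TRF's actual MVA methodology, and the figures are made up for illustration:

```python
# Toy cluster analysis in the spirit of a Market Value Analysis:
# group census tracts by indicator vectors (median sale price in
# $1,000s, vacancy rate in percent). Bare-bones k-means sketch,
# not TRF's actual methodology.

def kmeans(points, k, iters=20):
    """Cluster 2-D points; returns (centroids, labels)."""
    centroids = [list(p) for p in points[:k]]  # deterministic init
    labels = [0] * len(points)
    for _ in range(iters):
        # Assignment step: nearest centroid by squared distance.
        for i, (x, y) in enumerate(points):
            labels[i] = min(
                range(k),
                key=lambda c: (x - centroids[c][0]) ** 2
                            + (y - centroids[c][1]) ** 2,
            )
        # Update step: move each centroid to the mean of its members.
        for c in range(k):
            members = [p for p, lab in zip(points, labels) if lab == c]
            if members:
                centroids[c] = [
                    sum(x for x, _ in members) / len(members),
                    sum(y for _, y in members) / len(members),
                ]
    return centroids, labels

# Hypothetical tracts: (median sale price $K, vacancy rate %).
tracts = [(250, 3), (240, 4), (60, 22), (55, 25), (230, 5), (70, 20)]
centroids, labels = kmeans(tracts, k=2)
# Tracts sharing a label fall in the same market-strength cluster,
# loosely analogous to "Regional Choice"-like vs. "Distressed"-like.
print(labels)
```

In a real MVA, many more indicators and tracts are involved and the clusters are then ranked and named by analysts; the mechanic of grouping similar tracts, however, is the same.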

These MVAs give government agencies and nonprofit organizations the information they need to address area-specific problems, says McCarthy of the Lincoln Institute. "You always want the best possible return on an investment of public money," he says. "In really distressed neighborhoods, that may mean investing in large-scale demolition to speed the reuse of properties. In a transitional neighborhood, you might buy abandoned houses and fix them up."

The Road Ahead

The PolicyMap team frequently releases new indexes and tools on the heels of court decisions and agency rulings. Last July, for example, McCullough and her team released the Racially and Ethnically Concentrated Areas of Poverty (RCAP/ECAP) index, used to identify U.S. census tracts with large proportions of nonwhite individuals and people living below the poverty line. McCullough says her team anticipated the Supreme Court's June ruling on "disparate impact" in housing practices and, several months in advance, began building the index to help individuals and organizations understand the issues surrounding the court's decision. "The timing was perfect," she says. "When [the Supreme Court ruling] came down, we were ready to go."

There are still major datasets PolicyMap lacks that McCullough would like to have in order to help researchers better understand the critical issues facing the country. For example, she has always wanted to incorporate national foreclosure data as part of PolicyMap's effort to track the factors that influence home sale prices, but comprehensive, authoritative foreclosure datasets are hard to find, and licensing foreclosure data from private providers is prohibitively expensive. PolicyMap customers have also expressed interest in access to credit scores, which are very difficult data to obtain. "We couldn't even get permission from the credit agencies to license their data," McCullough says. "And if we could get their data, it would be grouped at a very large geographic scale, the state level."

Meanwhile, PolicyMap will receive one of its largest data deliveries in October, with the first installment of a project tentatively titled "State of the Nation's Land," funded by the Lincoln Institute. It will comprise a collection of 18 enormous databases from 150 different government agencies, covering criteria such as highly contaminated sites, public investments in land, flood zones, and zoning information.

The Lincoln Institute project aims to help government agencies do their jobs better and to give ordinary citizens tools they can use to hold elected officials accountable. It should shed more light on some of the country's most complex problems, such as the persistence of poverty in certain areas, or reverse redlining, in which minority consumers are targeted for loans on unfavorable terms. Ultimately, as with the discovery that the San Fernando Valley is the most expensive place in the country to live relative to residents' local incomes, some of the most interesting facts and trends cannot be anticipated; they will be uncovered as researchers learn to navigate PolicyMap.

"Every time I use PolicyMap, I start to see different things," says McCarthy. "There is a whole process of discovery that opens up, and it is very illuminating."

Alex Ulam is a journalist who focuses on architecture, landscape architecture, and urban planning and housing issues.

Habitat Conservation Plans

A New Tool to Resolve Land Use Conflicts
Timothy Beatley, September 1, 1995

As sprawling, low-density development patterns consume thousands of acres of natural habitat, the force of urban growth is increasingly bumping up against the need to protect biodiversity. The fastest growing states and regions in the South and West are also those with high numbers of endemic species, and species endangered or threatened with extinction.

One tool that has emerged for reconciling species-development conflicts is the habitat conservation plan (HCP). Authorized under Section 10 of the federal Endangered Species Act (ESA), HCPs allow for limited “take” of listed species in exchange for certain measures to protect and restore habitat. These plans vary in their geographical scope from a single parcel or landowner to large areas involving many landowners and multiple governmental jurisdictions.

The HCP mechanism grew out of a controversy over development plans on San Bruno Mountain in the Bay Area of California that threatened several species of butterflies, including the federally listed mission blue. A collaborative planning process generated a biological study of the butterflies’ habitat needs and a conservation plan that allowed some development in designated nodes while setting aside about 87 percent of the butterfly habitat as permanent open space. The HCP also included a funding component, procedures for carefully monitoring development and minimizing its impact, and a long-term program of habitat restoration.

The positive experience of San Bruno led to a 1982 amendment to the ESA specifically allowing HCPs. Since then, their use has grown slowly but steadily. About 40 plans have been approved by the U.S. Fish and Wildlife Service, and another 150 are in progress, most of them initiated in the last five years.

The Typical HCP Process

Regional habitat conservation plans usually follow a similar process. They start with the formation of a steering committee with representation from the environmental community, landowners and developers, local governments, and state and federal resource management agencies, among others. Frequently, consultants are hired to prepare background biological and land use studies as well as the actual plan and accompanying environmental documentation. The content of these plans can vary substantially depending on the species and potential threats at issue, but most create habitat preserves through fee-simple acquisition or land dedication. Plans also include provisions for habitat management, ecological restoration, and research and monitoring. Much of the deliberation in preparing a plan centers on how much habitat must be preserved, the boundaries and configuration of proposed preserves, how funds will be generated to finance the plan, and which entities or organizations will have management responsibility for the protected habitat once secured.

While the HCP process has encountered problems, the experience to date suggests it can be a viable and constructive mechanism for resolving species-development conflicts. For the development community, the stick of the ESA brings them to the table and keeps them there, since without a strong plan any development might be jeopardized. For the environmental community, the plan represents a way to generate funds to acquire habitat that would be difficult to raise otherwise. The HCP process thus provides a useful pressure valve under the ESA: a tool that provides flexibility in what is frequently criticized as an overly rigid and inflexible law.

Successes and Concerns

From the perspective of preserving biodiversity, the plans, even those not officially adopted or approved, have led to the acquisition of important habitat. The Coachella Valley HCP in California sets aside three preserves totaling nearly 17,000 acres of desert habitat to protect the fringe-toed lizard. Other plans preserve biologically rich hardwood hammocks in the Florida Keys, desert tortoise habitat in Nevada, and forested habitat for the northern spotted owl in California. The ambitious Balcones Canyonlands Conservation Plan in Austin, Texas, would protect more than 75,000 acres of land, including a newly created 46,000-acre national wildlife refuge. Though this plan has encountered political and financial obstacles, more than 20,000 acres have already been secured.

One of the key concerns about HCPs is the effectiveness of their conservation strategies, especially whether the amount of habitat set aside is sufficient to ensure the survival of threatened species. The long-term ecological viability of preserves is another problem, because many will become mere “postage stamps” surrounded by development. These concerns suggest that more habitat should be protected, that preserves should be configured in larger, regional blocks, and that plans should seek to protect multiple rather than single species within broad ecosystem functions. The Balcones example suggests a positive direction for future HCPs in its emphasis on a regional, multi-species approach, including endangered migratory songbirds, cave-adapted invertebrates and plant species.

Another criticism of HCPs is that they have failed to change the ways we allow development to occur because they generally accept the current pattern of low-density sprawl and wasteful land consumption. In addition, it often takes four or five years before a plan can be prepared and approved. Even given that seemingly long timeframe, plans are often based on limited biological knowledge.

One of the most difficult issues in the HCP process is funding. Habitat acquisition in fast-urbanizing areas is extremely expensive. The Coachella Valley plan cost $25 million; the Balcones plan could cost more than $200 million. Most plans are funded through a combination of federal, state and local funds, with some private funding. At the local level, the plans usually impose a mitigation fee assessed on new development in habitat areas, ranging from a few hundred dollars per acre to $1,950 per acre in the case of the Stephens' kangaroo rat HCP in Southern California.

Ideas for future funding sources include the creation of habitat acquisition revolving funds (similar to state revolving funds for financing local sewage treatment plant construction) and the use of special taxing districts designed to capture land value increases of property located adjacent to habitat preserves. Greater reliance needs to be placed on less expensive alternatives than fee-simple acquisition, such as transfers of development rights, tradable conservation credits, mandatory clustering and other development controls.

The Future of HCPs

The considerable progress in habitat conservation made through this mechanism to balance development and conservation could be halted if current proposals in Congress to substantially weaken ESA prevail. Clearly it is the “teeth” of ESA that gets opposing parties to the bargaining table. Without a strong ESA, there will be little reason to expect this form of collaborative habitat conservation to occur.

The experience to date suggests that flexibility does exist under current law, and that the problems encountered with HCPs require some fine-tuning. The challenge is to make the HCP process an even more effective tool for conserving biodiversity. At the same time, if habitat conservation is incorporated into local comprehensive plans, then new development can be steered away from important habitat areas and public investment decisions can minimize potential species-development conflicts.

Timothy Beatley is chair of the Department of Urban and Environmental Planning in the School of Architecture at the University of Virginia and the author of Habitat Conservation Planning: Endangered Species and Urban Growth, University of Texas Press, 1994. He spoke at the Institute’s May 1995 meeting of the Land Conservation in New England Study Group.

Additional information in printed newsletter:
Map: Balcones Canyonlands, Austin, Texas. Source: Adapted from maps by Butler/EH&A Team, City of Austin Environmental and Conservation Services, Balcones Canyonlands Conservation Plan, Preapplication Draft, Austin, 1992

President's Message

The Evolution of Computer-Assisted Planning Tools
Gregory K. Ingram, April 1, 2012

The use of computer models in land use and transportation planning and in the analysis of urban housing markets has a long and varied history. A pioneering application of a large-scale computer model linking land use and urban transportation was the 1960 Chicago Area Transportation Study. That study used a spatially disaggregated model that included a detailed transport network and covered the classic urban transportation planning steps of land use, trip generation, mode choice, and network assignment.

A highly influential model with a more analytic approach to predicting land use patterns was the one formulated by Ira Lowry in 1964 for Pittsburgh, which used economic base theory to distribute export-oriented economic activity. The distribution of residences and locally serving employment within the metropolitan area was then used to derive commuting and shopping travel patterns.

In the early 1970s, more attention turned to spatially disaggregated models of urban housing markets, such as the Urban Institute Housing Model (which represented changes in the housing market over a decade) and the National Bureau of Economic Research's Urban Simulation Model (a microanalytic model that projected annually the behavior of the members of 85,000 households by workplace and residential location). Both models were used to analyze the impact of housing allowance programs and were applied more in policy analysis than in planning.

In the late 1970s, emphasis shifted to the development and application of sketch planning models, especially for transportation. Although these models were still spatially disaggregated, they used tens (rather than hundreds) of traffic zones, and transport networks were represented in less detail. These models were adapted to represent transport-related outcomes beyond network flows, such as vehicle emissions, population exposure to air pollution, vehicle miles traveled, and energy consumption. In the 1980s, these smaller models migrated from mainframes to personal computers, which eased their application. Data requirements remained substantial, but many of the models made more systematic use of the spatially disaggregated census data that was available, which facilitated transferring and calibrating models across locations.

Over the past two decades, the arrival of geographic information systems (GIS) and the development of three-dimensional data visualization programs have been transforming the way computers are used in planning. GIS-compatible data are now available from satellites, census sources, and government agencies. Local municipalities have quickly adapted, combining their cadastral data with information on crime, transportation, and demographics, and these municipal data files are often available on the Internet. While the availability of GIS data has clearly increased, the wide variety of formats, definitions, and coverage can make it challenging to combine information from different sources into a unified dataset for a metropolitan region.

The use of three-dimensional visualizations of spatially disaggregated data has transformed the presentation of data and model results. These techniques, such as 3D maps at the metropolitan scale and the ability to "fly" through a street or neighborhood at the project scale, facilitate public consultation. They also make it much easier for nonspecialists to understand and participate in the process and to interpret the results of alternative planning scenarios.

Along with these advances in data and their presentation, planning software is now easier to use and increasingly available on open source platforms. Although the code for many early computer-assisted planning tools has been publicly available, using those tools has generally required advanced programming skills. As more of these tools appear in user-friendly formats and are integrated with other modules, using computational methods to compare and contrast alternative development scenarios will become ever more accessible. Indeed, many planning agencies can now use scenario planning tools to produce alternative possible futures that provide a basis for debate and public consultation, with the aim of identifying which outcomes are desirable and which should be avoided.

As reported in this issue of Land Lines, the Lincoln Institute supports the use of various planning tools to investigate and evaluate the effectiveness of policies intended to improve land development outcomes.
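Outcome measures like these are typically simple post-processing of zone-level vehicle miles traveled (VMT). A hypothetical sketch, with invented per-mile factors rather than values from any calibrated model:

```python
# Sketch-planning style post-processing: derive aggregate outcomes
# from zone-level vehicle miles traveled (VMT). The factors below
# are invented for illustration, not from any calibrated model.

EMISSION_G_PER_MILE = 400.0   # hypothetical CO2 grams per vehicle-mile
ENERGY_KWH_PER_MILE = 1.2     # hypothetical energy use per vehicle-mile

# Daily VMT by traffic zone (hypothetical figures).
zone_vmt = {"zone_1": 120_000, "zone_2": 85_000, "zone_3": 40_000}

total_vmt = sum(zone_vmt.values())
co2_tonnes = total_vmt * EMISSION_G_PER_MILE / 1_000_000
energy_mwh = total_vmt * ENERGY_KWH_PER_MILE / 1_000

print(f"daily VMT: {total_vmt:,}")
print(f"CO2: {co2_tonnes:.1f} tonnes; energy: {energy_mwh:.0f} MWh")
```

The same pattern applies per zone, which is what lets these models report population exposure to pollution at a spatially disaggregated level.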

Charting Progress

PolicyMap Democratizes Data Analysis
By Alex Ulam, October 1, 2015

Housing costs are spiraling upward in many areas throughout the United States, cutting down on the ability of Americans to save and leading to the gentrification of formerly affordable neighborhoods. However, as with many public policy challenges, it is not always immediately apparent where problems are the most acute. This became clear to Helen Campbell, an analyst in Los Angeles’s Housing + Community Investment Department, late on a Friday afternoon in July. An information request from the mayor’s office led her to discover that a large part of the San Fernando Valley in L.A. was home to the nation’s highest rental cost burden, which the U.S. Department of Housing and Urban Development (HUD) defines as a situation where families are paying more than 30 percent of their income on renting a home.
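HUD's cost-burden threshold is a simple ratio test, which a short sketch can make concrete. The household figures and helper functions below are hypothetical, for illustration only:

```python
# Share of renter households that are cost-burdened under the HUD
# definition: housing costs exceed 30 percent of household income.
# Hypothetical illustration; these are not HUD's actual data fields.

def is_cost_burdened(annual_income, monthly_rent):
    """True if yearly rent exceeds 30% of annual income."""
    return (monthly_rent * 12) > 0.30 * annual_income

def cost_burden_rate(households):
    """households: list of (annual_income, monthly_rent) tuples."""
    burdened = sum(1 for inc, rent in households if is_cost_burdened(inc, rent))
    return burdened / len(households)

# A toy sample of renter households.
sample = [
    (30_000, 1_000),   # 40% of income on rent -> burdened
    (60_000, 1_200),   # 24% -> not burdened
    (45_000, 1_500),   # 40% -> burdened
    (80_000, 1_800),   # 27% -> not burdened
]
print(f"{cost_burden_rate(sample):.0%} of sample renters are cost-burdened")
```

Computing this citywide, by congressional district, is exactly the aggregation Campbell needed PolicyMap to do on short notice.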

Los Angeles officials knew they had areas where home owners and renters were struggling to pay for housing, says Campbell, but they had no idea how severe the situation was or even where it was most pronounced. The mayor’s office needed authoritative data on this troubling trend for a lobbying effort to preserve the HOME Investment Partnerships Program (HOME), the largest federal block grant program for affordable housing. Currently, in Washington, DC, lawmakers are considering a Senate bill that would eviscerate the program.

If Campbell had used conventional geographic information system (GIS) software, it would have taken her an inordinate amount of time to analyze the city's housing cost burden. But she was able to access the necessary information quickly by typing several simple queries into PolicyMap—a unique web-based software program that is changing the way that planning data is gathered and displayed. "If we didn't have PolicyMap, we simply would have said no to the request," Campbell says. "It would have taken too many hours to do the work."

When Campbell ran her PolicyMap search, she discovered that the 29th Congressional District, part of which is situated within the city of Los Angeles, was, out of all of the 435 congressional districts in the country, number one in terms of rental cost burden and number three in terms of home owner cost burden. Those rankings for the 29th Congressional District, which includes a large part of the San Fernando Valley, translate into 62.9 percent of renters and slightly more than 50 percent of home owners there suffering from a housing cost burden. “We thought that South L.A. or Northeast L.A. would have higher rent burdens, but you have Valley as being the higher rent burden,” Campbell says.

Public Data for All

Since its launch in 2007, PolicyMap has grown into the largest geographic database on the web, and become the go-to public information resource for financial institutions, universities, nonprofits, and close to 2,500 government agencies. The online tool currently has more than 37,000 indicators, in categories ranging from crime to grocery store access, making the world of public data significantly easier to parse. Last year, the site had 434,000 unique visitors. Most of the data housed on PolicyMap is free, but proprietary data is available from various providers through paid subscriptions. Overall, PolicyMap's easy-to-use mapping tools have helped democratize data analysis by making the process relatively affordable for nonprofits and local governments, which usually don't have the resources to hire teams of GIS specialists. The site can help anyone in the public policy world avoid getting stuck on the wrong side of the widening digital divide.

One of the website’s most notable attributes is its capacity to simultaneously display various types of indicators, such as Superfund Sites, neighborhood income levels, or developments built with low-income housing tax credits. That capacity can facilitate contemporary planning initiatives, like the Obama administration’s Promise Zone or Choice Neighborhood programs, which require interagency collaboration and emphasize coordination of various types of investments in underserved areas.

PolicyMap also allows users to chart the effectiveness of particular programs over a period of time, helping them reap rewards or cut their losses down the road. Although government money is primarily doled out through formulas, there has been a marked increase in competitive grant programs that require progress reports and data that details evidence of needs. When it comes to competitive grants, according to Lincoln Institute President and CEO George W. McCarthy, “cities that have better data, and put together more polished proposals, are obviously going to have an advantage over those that don’t.”

The Starting Point

PolicyMap is the brainchild of The Reinvestment Fund (TRF), a Philadelphia-based Community Development Financial Institution (CDFI), which has $839 million in capital under management, and which invests in low-wealth people and neighborhoods. The organization finances a wide array of community building blocks, such as affordable housing developments, daycare centers, and grocery stores. PolicyMap was born out of TRF’s need to track how these community programs were working on the ground.

In the early 2000s, TRF began exploring ways to map and understand the impact of its own investments. “We were looking at where we were making investments over time,” says PolicyMap President Maggie McCullough, who was then a researcher with TRF’s Policy Department. “We also wanted to know what kind of impact we were making—how we had changed the markets in which we were working.”

In 2005, the state of Pennsylvania hired TRF to collect and map a vast amount of data on housing prices, foreclosures, and incomes. The project’s goal was to enable officials to think more strategically about how state money was being spent on housing throughout the state. But even with a contract worth almost $200,000, there were limitations to what TRF could do. The data and maps were trapped in a fixed format on a disk. “After we handed the disk over,” McCullough says, “I remember thinking that it was going to be like a paper report that sat on a shelf and was never going to get updated.”

That epiphany inspired McCullough and others at TRF to brainstorm on how to build a dynamic web-based mapping platform—one that would allow datasets to be refreshed and enable users to upload their own databases. In developing PolicyMap, McCullough was able to draw on her background as one of the pioneers in designing web portals for public information. In the 1990s, she was part of the team that built the U.S. Department of Housing and Urban Development’s (HUD) initial web presence. “My experience [at] HUD made me realize that if people [other than] researchers needed or wanted to understand data, we had to make it easier to understand,” says McCullough. “We had to give data indicators common names and simple descriptions, just like we had to give HUD programs common names.”

McCullough wanted PolicyMap to serve the entire country, unlike other data initiatives that focused on local geographies. Upon PolicyMap’s launch in 2007, “there really wasn’t any online GIS,” McCullough explains. “You could get driving directions and find a local restaurant with Google Maps, but a lot of that GIS software was locked on desktops. We wanted to create something that the public could access simply, over the web.”

The first dataset that TRF loaded onto PolicyMap in 2007 consisted of reports from the Home Mortgage Disclosure Act (HMDA), the government's most important data source for spotting predatory and discriminatory lending. At the time, the housing bubble was bursting, and officials from government and law enforcement were scrambling to get a grip on the burgeoning crisis; the HMDA data was one of the first places they would look for information. But HMDA data wasn't arranged in a GIS-friendly format, making certain types of searches extremely difficult. For example, if a researcher with a background in GIS wanted to zero in on a section of Detroit where she suspected there might have been a flood of high-cost loans, there was no online tool available to extract the HMDA data for that particular area.
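Extracting the records for one tract is conceptually a simple filter; the frustration was that no online tool exposed it. A minimal sketch of such a query over HMDA-style records (the field names and tract codes here are invented, not the actual HMDA file layout):

```python
# Pull high-cost loan records for one census tract from a flat file
# of HMDA-style records. Field names and tract codes are invented
# for illustration; the real HMDA register uses a different layout.
# Under HMDA, a reported rate spread generally flags a higher-priced
# loan, which is the convention this toy filter assumes.

loans = [
    {"census_tract": "26163.5301", "rate_spread": 4.5,  "amount": 120_000},
    {"census_tract": "26163.5301", "rate_spread": None, "amount": 90_000},
    {"census_tract": "26163.5412", "rate_spread": 5.1,  "amount": 150_000},
    {"census_tract": "26163.5301", "rate_spread": 6.0,  "amount": 80_000},
]

def high_cost_loans(records, tract):
    """Loans with a reported rate spread in one census tract."""
    return [
        r for r in records
        if r["census_tract"] == tract and r["rate_spread"] is not None
    ]

detroit_tract = high_cost_loans(loans, "26163.5301")
print(len(detroit_tract))  # count of high-cost loans in this tract
```

What PolicyMap added was exactly this kind of geographic filtering, done interactively on a map rather than by hand against raw files.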

PolicyMap’s initial success making data publicly available helped attract prominent paying customers—including the Federal Reserve Board in Washington, DC, which was in charge of collecting the HMDA data at the time. In addition to loading all of the HMDA data for mapping purposes and making it available to the general public, McCullough’s team custom-built a reporting tool within PolicyMap for the Fed that enabled its staffers to pull out HMDA data for any locale they wanted to study. Says McCullough, “We had made it easier for [The Fed] to access its own data.”

Leveling the Playing Field

Big lenders and real estate investors typically pay six-figure subscription fees for services that provide proprietary information such as property evaluation reports and in-depth market research. But many community-based organizations and local governments can’t afford to buy such licensed data. And even if they could afford expensive subscriptions, many lack the staff or GIS capability to display that data on interactive maps.

Take NeighborWorks, a national network of 240 community-based organizations that doesn’t have a GIS specialist on staff. Harry Segal, a performance and planning specialist at NeighborWorks America, says that PolicyMap has changed the equation for his network by giving them access to data and mapping tools that they couldn’t otherwise afford. “Any developer, public or private, trying to move into a new neighborhood has to court the powers that be and demonstrate an understanding of local market conditions,” Segal says. “It’s much more difficult for nonprofit organizations to compile this sort of data.” Without PolicyMap, he says, “the juice almost isn’t worth the squeeze.”

NeighborWorks’ PolicyMap subscription, which costs $5,000 per year, provides access to this kind of proprietary data and allows the organization’s members to query different sections of a map for information on a variety of indicators such as the average income of residents within a certain neighborhood and the level of high-cost mortgages that have been made there. This ability to look at different geographic scales empowers local community groups that are trying to access funding or call attention to predatory lending in their neighborhoods. “We have a couple of organizations in upstate New York. If you are looking at statistics on that region, they are going to be heavily skewed by New York City,” Segal says. “But with PolicyMap, we can pull up data by census tract or block group.”
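Segal’s point about skew is easy to see with a toy calculation: a region-wide average is dominated by its most populous areas, while tract-level figures preserve the local story. The sketch below uses invented numbers and is purely illustrative.

```python
# Hypothetical tracts: two large city tracts dominate two small upstate
# tracts in any region-wide, household-weighted figure.
tracts = [
    {"tract": "NYC-A",     "households": 900_000, "median_income": 75_000},
    {"tract": "NYC-B",     "households": 800_000, "median_income": 70_000},
    {"tract": "Upstate-A", "households": 40_000,  "median_income": 38_000},
    {"tract": "Upstate-B", "households": 35_000,  "median_income": 36_000},
]

total_hh = sum(t["households"] for t in tracts)
# Household-weighted blend of the tract medians: skewed toward the city.
blended = sum(t["median_income"] * t["households"] for t in tracts) / total_hh
print(f"region-wide income: ${blended:,.0f}")

# The tract-level view tells a different story for upstate areas.
for t in tracts:
    print(t["tract"], t["median_income"])
```

The blended figure lands above $70,000 even though the upstate tracts sit below $40,000, which is the distortion that pulling data "by census tract or block group" avoids.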

Some city agencies also lack the capability to design or maintain the types of databases that they can now get through a PolicyMap subscription. “I am the only person here who has GIS capabilities,” says Sara Eaves, a planning and policy analyst for the San Antonio Housing Authority. She adds that PolicyMap allows many people in her office to perform tasks that would otherwise require specialized training. Through its PolicyMap subscription, the San Antonio Housing Authority also makes publicly available data about schools, residential vacancy rates, neighborhood income levels, and other information that a city resident might want to consider when deciding where to buy a house or rent an apartment. “We could maintain similar databases in-house, but we don’t have the resources. PolicyMap has allowed us to put interactive maps on our website, which is making the information available not just internally, but to the general public as well.”

Streamlining the Process for Cities and Community Groups

Many policy analysts use both full-blown GIS software, such as Esri, and the simplified GIS tools available on PolicyMap. Campbell from the Los Angeles Department of Housing + Community Development says that Esri offers the ability to do forecasts and run certain types of complex analyses that are not possible with PolicyMap. But she notes that PolicyMap saves her time and makes it easier to explain her research to laypeople. “I like PolicyMap because it is just based on facts and it is irrefutable,” she says, whereas Esri contains predictions about the future. “Sometimes, when you hand someone a community analysis report with Esri data, it may be too much information for them to digest. There will be 2005, 2010, and 2015 information. But for the 2020 information, there is a formula for how they created that forecast, which we may not need, and which may be wrong.”

PolicyMap is also flexible enough to respond to users’ changing needs. As data requirements have become larger and more complex, long-time PolicyMap customers have requested new tools to help improve efficiency. For instance, Melissa Long, the deputy director of Philadelphia’s Office of Housing and Community Development, had been using PolicyMap to display aggregated and cleaned-up census data. But several years ago, she realized that her agency needed more comprehensive analytic tools in order to apply for the increasing number of grants that are being awarded on a competitive basis.

“We needed a lot of neighborhood demographic information, and we needed to know what types of city programs were being deployed,” Long says, noting that having city data available on PolicyMap has improved the coordination among different city agencies and better positioned the city to apply for competitive grants.

Long says the tools that PolicyMap has developed for Philadelphia will enable the city to monitor its progress while implementing a Choice Neighborhoods Implementation Grant, which supports locally driven strategies to address struggling neighborhoods with distressed public or HUD-assisted housing. “The grant covers a five-year period. If we look and see that our neighborhood stabilization proposal is not working,” she says, “then we can make midterm grant corrections.”

Being able to map different types of data simultaneously also lets researchers chart the co-benefits from a particular investment. For example, two different programs in Philadelphia involve cleaning up and greening vacant lots. PolicyMap lets users see the lots rehabilitated by both programs simultaneously, and study whether they have improved the quality of life in surrounding neighborhoods. Philadelphia’s contract with PolicyMap has made it possible to overlay data from multiple studies—such as one from the University of Pennsylvania’s Wharton School that showed how real estate values rose 17 percent on average around the cleaned-up lots, and another that showed how gun crime dropped significantly in the areas around them. A third co-benefit is the several hundred summer jobs that are tied to keeping the rehabilitated lots in good shape. “You cannot just look at housing alone,” Long says. One has to consider “all the other things going on in a neighborhood.”

One of PolicyMap’s most popular analytic tools is the Market Value Analysis (MVA), which TRF developed for Philadelphia and has replicated in about 18 other cities. MVAs evaluate the strength of different areas of a city by looking at color-coded sections of a map that denote assigned values, ranging from “Distressed” to “Regional Choice,” the highest rating. The rankings are established using a technique called cluster analysis, which evaluates census blocks based on groups of indicators, such as home sale activity, vacancy rates, and foreclosures. When you click on any section of the map, a table pops up to reveal the data used to determine the ranking for that specific area. The Regional Choice neighborhoods, McCullough says, are generally defined by strong sales values, low vacancy rates, and a mixture of home owners and renters.
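The cluster-analysis step McCullough describes can be sketched in miniature: group census blocks by similarity across a few indicators, then rank the resulting clusters. The indicator values, initial centroids, and category labels below are all hypothetical; TRF’s actual MVA methodology uses many more indicators and a more careful procedure.

```python
# Minimal k-means sketch of an MVA-style cluster analysis.
# All data and labels are invented for illustration.

def kmeans(points, centroids, iterations=10):
    """Plain k-means: assign each point to its nearest centroid, then
    move each centroid to the mean of its assigned points."""
    clusters = [[] for _ in centroids]
    for _ in range(iterations):
        clusters = [[] for _ in centroids]
        for p in points:
            dists = [sum((a - b) ** 2 for a, b in zip(p, c)) for c in centroids]
            clusters[dists.index(min(dists))].append(p)
        centroids = [
            tuple(sum(dim) / len(cluster) for dim in zip(*cluster)) if cluster else c
            for cluster, c in zip(clusters, centroids)
        ]
    return centroids, clusters

# Each census block: (median sale price in $1,000s, vacancy %, foreclosure %)
blocks = [
    (320, 2, 0.5), (290, 3, 0.8),   # strong market
    (150, 9, 3.0), (140, 11, 3.5),  # middle market
    (60, 25, 8.0), (55, 28, 9.0),   # weak market
]
centroids, clusters = kmeans(blocks, centroids=[blocks[0], blocks[2], blocks[4]])

# Rank clusters by mean sale price and attach MVA-style labels.
labels = ["Regional Choice", "Transitional", "Distressed"]
order = sorted(range(len(centroids)), key=lambda i: -centroids[i][0])
for rank, i in enumerate(order):
    print(labels[rank], clusters[i])
```

In production systems, a standardized (z-scored) indicator matrix and a library implementation such as scikit-learn’s `KMeans` would replace this hand-rolled loop, but the grouping idea is the same.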

Those MVAs provide government agencies and nonprofits the information they need to address an area’s specific problems, says the Lincoln Institute’s McCarthy. “You want to get the best bang for your buck from public money,” he says. “In the really terrible neighborhoods, that might mean investing in large-scale demolition to accelerate the reuse of properties. In a transitional neighborhood, you might want to acquire abandoned homes and fix them up.”

The Road Ahead

The PolicyMap team often releases new indices and new tools right on the heels of court decisions and agency rulings. This past July, for instance, McCullough and her team released the Racially and Ethnically Concentrated Areas of Poverty (RCAP/ECAP) index, which is used to identify U.S. Census tracts that have both a high proportion of nonwhite individuals and people living below the poverty line. McCullough says that her team anticipated the Supreme Court’s ruling in June on “disparate impacts” in housing practices and, several months earlier, had started developing the index to help individuals and organizations understand the issues related to the court’s decision. “The timing was great,” she says. “When [the Supreme Court decision] happened, we were ready to go.”
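The basic logic of such an index is a joint threshold test on two tract-level rates. The sketch below is a hypothetical reconstruction: the 50 percent and 40 percent cutoffs echo HUD’s commonly cited R/ECAP definition, but they should be treated as illustrative, not as PolicyMap’s actual formula, and the tract records are invented.

```python
# Hypothetical R/ECAP-style screen over census tract records.
# Thresholds are assumptions, not PolicyMap's published methodology.

def flag_rcap_ecap(tracts, minority_share=0.50, poverty_share=0.40):
    """Return IDs of tracts whose nonwhite share AND poverty share
    both exceed the given thresholds."""
    return [
        t["id"]
        for t in tracts
        if t["nonwhite"] / t["population"] > minority_share
        and t["below_poverty"] / t["population"] > poverty_share
    ]

tracts = [
    {"id": "42101-0001", "population": 4000, "nonwhite": 3200, "below_poverty": 1900},
    {"id": "42101-0002", "population": 5000, "nonwhite": 1500, "below_poverty": 2300},
    {"id": "42101-0003", "population": 3500, "nonwhite": 2100, "below_poverty": 1200},
]
print(flag_rcap_ecap(tracts))  # → ['42101-0001']
```

Only the first tract clears both bars: the second is high-poverty but not majority-nonwhite, and the third is majority-nonwhite but below the poverty cutoff.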

PolicyMap is still missing major data sets that McCullough would like to upload, to help researchers get a better picture of critical issues facing the country. For example, McCullough says that she has long wanted to incorporate national foreclosure data as part of PolicyMap’s efforts to track factors influencing home sale prices, but it’s difficult to find comprehensive and authoritative foreclosure data sets. Plus, it’s still prohibitively expensive to purchase licenses for the foreclosure data from private vendors. PolicyMap clients have also expressed interest in accessing credit scores—some of the most difficult data to obtain. “We couldn’t even get permission from the credit-score agencies to license the data,” McCullough says. “And if we were going to get it from them, it would be aggregated at a high geography, [like] at a statewide level.”

Meanwhile, PolicyMap will get one of its biggest-ever data resources this coming October, with the first segment of a project tentatively titled “State of the Nation’s Land,” subsidized by the Lincoln Institute. “State of the Nation’s Land” will include a collection of 18 huge databases from 150 different government agencies, covering such areas as heavily polluted sites, public investments in land, flood zones, and zoning information.

The Lincoln Institute project is intended to help government agencies do their jobs better and provide average citizens with tools they can use to hold their elected officials more accountable. It should also shed more light on some of our country’s most vexing problems, like the persistence of poverty in certain areas or reverse redlining, in which minority consumers are targeted for loans on unfavorable terms. Ultimately, however—as with the discovery that the San Fernando Valley is in fact the most unaffordable place to live in the country relative to local residents’ income—we cannot even anticipate some of the most interesting facts and trends that will be unearthed in the future, as more researchers get savvy about navigating PolicyMap.

“Every time I get into PolicyMap, I start looking at new things,” says McCarthy. “There is a whole process of discovery that I go through, and it’s very illuminating.”

Alex Ulam is a journalist who focuses on architecture, landscape architecture, urban planning issues, and housing.

Land and Biodiversity Conservation

A Leadership Dialogue
James N. Levitt, July 1, 2002

The Lincoln Institute, with the Land Trust Alliance and the National Park Service Conservation Study Institute, is working with two dozen senior conservation practitioners from public, private, nonprofit and academic organizations across the nation to consider the grand challenges facing the North American land and biodiversity conservation community in the twenty-first century. The conservationists, who shared ideas electronically for several months prior to their March 2002 meeting in Cambridge, explored emerging and needed conservation innovations that may prove commensurate with the challenges. Organized by James Levitt of Harvard’s Kennedy School of Government, Armando Carbonell of the Lincoln Institute and Fara Courtney, an environmental consultant based in Gloucester, Massachusetts, the group exchanged ideas through presentations, case studies and working groups. E.O. Wilson, the distinguished author and biodiversity scholar at Harvard University, addressed the session and participated in the discussions. This article presents several highlights of that leadership dialogue on conservation in the twenty-first century (C21).

“We have entered the twenty-first century, the century of the environment. The question of the century is, how can we best shift to a culture of permanence, both for ourselves and for the biosphere that sustains us?” E.O. Wilson

The Historical Context

Ask almost any American concerned with natural resources, “How and when did we start practicing conservation in this country?” In most cases, the response involves the role of the federal government at the turn of the twentieth century under President Teddy Roosevelt. While Roosevelt and his close associate Gifford Pinchot do stand as giants in the history of conservation in this nation, the record shows that Americans have a remarkable tradition of conservation that stretches back at least to the early days of the Republic.

Individuals and organizations in the private, nonprofit, public and academic sectors have throughout our history brought landmark conservation innovations to life, and they continue to do so. They have focused their attention on sites that span the urban-rural continuum, from city parks to remote wildernesses. In the context of repeated waves of immigration and population growth, a chain of stunning technological advances and a pattern of long-term economic growth, American conservation innovators have acted creatively and often with considerable passion to protect and manage natural and scenic wonders, working landscapes, native wildlife and recreational open space for their own benefit, for the benefit of the public at large, and for the benefit of future generations.

Consider the history of the land trust movement. Thomas Jefferson set an early precedent for private and nongovernmental protection of natural beauty in America. In 1773, three years before he penned the Declaration of American Independence, Jefferson purchased a parcel of land known as Natural Bridge near the Blue Ridge Mountains. He treasured the parcel throughout his adult life, inviting writers, painters and dignitaries to visit the site and record its wonders. By 1815 he wrote to William Caruthers to say that he held Natural Bridge “to some degree as a public trust, and would on no consideration permit the bridge to be injured, defaced or blocked from public view.”

Some 60 years after Jefferson’s death, Charles Eliot, son of the president of Harvard University and a protégé of Frederick Law Olmsted, took another historic step toward the nongovernmental protection of open space. He proposed the formation of a private association to hold parcels of land for the enjoyment of the citizens of Massachusetts, particularly the less affluent residents of Boston who needed an escape from the “poisonous” atmosphere of the crowded city so closely associated with the technological progress and demographic turmoil of the Gilded Age. With a charter from the Commonwealth granted in 1891, that organization, now known as The Trustees of Reservations, became the first statewide nongovernmental land trust.

Eliot’s innovation has proved to be truly outstanding, a landmark conservation innovation that meets all the criteria for outstanding innovations in the public interest set out by the Innovations in American Government program at Harvard’s Kennedy School of Government. The notion behind the Trustees has proved to be novel in conception, measurably effective, significant in addressing an important issue of public concern, and transferable to a large number of organizations around the world. Furthermore, and critically important in the field of conservation, Eliot’s innovation has demonstrated its ability to endure and remain vibrant after more than a century. The Trustees’ current director of land conservation, Wesley Ward, emphasizes that nongovernmental conservation organizations will continue to be called upon in the twenty-first century to “provide leadership by identifying challenges, advocating effective responses and providing relevant models of conservation and stewardship.”

The Lincoln Institute played an important role in the resurgent growth of the land trust movement in the early 1980s, when it focused its resources as an academic institution on how an exchange of information among several dozen land trusts in the U.S. might strengthen conservation standards and practices throughout the entire land conservation community. Jean Hocker, at that time organizing the Jackson Hole Land Trust, remembers well the early discussions convened at Lincoln House by Boston-area lawyer Kingsbury Browne. She explains that emerging from those deliberations was the idea that “we ought to form a new organization called the Land Trust Exchange that could help us all do our jobs better.” Hocker moved to Washington, DC, in 1987 to run the group, which became known as the Land Trust Alliance (LTA). Under her leadership, the organization led the land trust movement into a period of rapid growth and enduring achievement. In 2002, there are more than 1,200 local and regional land trusts in the U.S. that have helped to protect more than six million acres of open space. Furthermore, the LTA’s annual Rally is now a high point of the year for more than a thousand land conservation volunteers and professionals spread across the continent and beyond who convene to share their best ideas.

The Trustees’ long history of conservation innovation and achievement is paralleled by the histories of many other public, nonprofit, academic and private sector organizations represented by C21 participants. Nora Mitchell and Michael Soukup of the National Park Service underscore the significance of America’s creation of the world’s first national park at Yellowstone in 1872, an innovation of worldwide significance that was in part the brainchild of two private railroad entrepreneurs, Jay Cooke and Frederick Billings. Laura Johnson, president of the Massachusetts Audubon Society, takes justifiable pride in the achievement of her organization’s “Founding Mothers,” two women who established the nation’s oldest continuously operating Audubon society in 1896 and catalyzed the campaign that led to the signing of the first international migratory bird treaty. Robert Cook, director of Harvard’s Arnold Arboretum, explains the pivotal role of that institution in the emergence of American forestry policy as far back as the 1870s. And Keith Ross of the New England Forestry Foundation, who spearheaded the precedent-setting effort concluded in 2000 to place a conservation easement on more than 760,000 acres of forest land owned by the Pingree family in Maine, emphasizes that the family’s private forest stewardship practices date back to the 1840s.

Complex Conservation Challenges

Notwithstanding the conservation community’s collective record of achievement, the land and biodiversity conservationists at the C21 meeting foresee grand challenges of extraordinary complexity and difficulty in the coming 50 to 100 years. In the context of expected growth in North American and world populations, changes in demographic patterns and ongoing technological development, as well as systemic changes in climate and other earth systems, they express deep concern regarding myriad potential changes on the landscape. These may include the accelerating loss of open space; intensified landscape fragmentation; further degradation of wildlife habitat; alarming declines in the viability of a wide range of biological species; and potentially significant stresses to earth systems that provide essential ecosystem services. Will Rogers, president of the Trust for Public Land, notes, “from a conservation viewpoint, the pace of growth and development is rapidly running us out of time.”

The concern of many C21 participants regarding the potential impact of growing human populations starts with the straightforward projection of the U.S. Census Bureau that the population of the U.S. will grow from some 280 million Americans in 2000 to about 400 million by 2050. Beyond the numbers, it is critical for conservation planners to understand that the diversity of the American population is forecast to change significantly, with particularly strong growth in the ranks of Hispanic Americans and Asian Americans. Jamie Hoyte, an authority on conservation and diversity at Harvard University, explains, “one of the most significant challenges we face is broadening and diversifying the community of conservation-minded citizens. Those who advocate for conservation must do so in a way that speaks to people of all backgrounds and races, demonstrating an understanding of the needs of a broad range of people.” Robert Perschel of the Wilderness Society expands on the idea, advising that we need to “enter into a new dialogue with the American people … to touch the hearts and spirits and wisdom of our citizenry.”

C21 participants also pointed out that new conservation initiatives are likely to be launched in the context of continuing economic growth and personal affluence. For perspective, note that real U.S. gross domestic product (GDP) grew more than five-fold between 1950 and 2000, and many economists expect to see comparable growth in coming decades. To protect open space and biodiversity in the midst of such great affluence, conservationists will need to leverage the nation’s economic power. According to Chip Collins of The Forestland Group, “North America’s economic growth has helped fuel the loss of biodiversity. At the same time, North America has led the world in the development and implementation of conservation strategies in large measure because of the extraordinary growth and vigor of its economy. One of the great challenges will be to manage this seeming dichotomy by effectively harnessing the private sector and redirecting its immense capital power base toward constructive conservation initiatives. The private sector, in stride with its nonprofit, public and academic counterparts, must be a full and constructive partner.”

As in the past, new and increasingly powerful technologies are likely to continue to proliferate. While offering considerable social and economic benefits, the new technologies may also be closely associated with large-scale environmental disturbances. In the past half-century, for example, the spread of interstate highways has effectively stimulated the American economy but has also been associated with pervasive environmental disruptions such as urban and rural landscape fragmentation, the creation of unhealthy air quality conditions, and the generation of significant volumes of gases associated with global climate change. Similarly, more recently introduced networked technologies, such as the Internet and advanced wireless communications networks, appear to be enabling continued net migration of Americans to formerly remote and highly environmentally sensitive locations across the continent. Technology-related change is not limited to the U.S., of course. Larry Morris of the Quebec-Labrador Foundation explains that new communications and transportation networks are influencing where and how people live worldwide, from Atlantic Canada to the Middle East.

Biodiversity scientists E.O. Wilson of Harvard and Leonard Krishtalka of the University of Kansas point out that while emerging technologies may be associated with environmental disruptions in coming decades, the same technologies are also proving critical to advancing our understanding of the diversity of life on earth. Krishtalka explains that “researchers are now learning how to harness the vast store of authoritative biodiversity information in natural history museums worldwide (about three billion specimens of animals and plants) and integrate it with other earth systems data for predictive modeling of environmental phenomena.” Such a predictive model was recently built by researchers in Kansas, California and Mexico to examine the fate of a wide variety of Mexican species under a range of global warming scenarios. The outcome of this and similar studies should be particularly useful to organizations striving to prioritize land and habitat protection opportunities in ecosystems throughout the western hemisphere that may be facing significant disruption in the future.

In sum, despite remarkable progress, conservationists are in no position or mood to rest. John Berry of the National Fish and Wildlife Foundation advises, “if our standard is that of the ancient Greeks, that is, to leave our nation ‘not only not less, but richer and more bountiful than it was transmitted to us,’ then we have not yet earned the laurel crown.”

A New Generation of Conservation Innovators

Inspired by the precedents set by creative American conservationists in the nineteenth and twentieth centuries, twenty-first century conservation practitioners are highly motivated to identify and implement new initiatives commensurate with the complex challenges of our day. C21 participants expressed interest in a wide variety of areas ripe for game-changing innovation, including the following.

Winning Hearts and Minds

Bill Weeks of The Nature Conservancy emphasizes that “the grandest challenge is to complete the task of getting the overwhelming majority of the public to care and act and vote like they care.” Rand Wentworth of the Land Trust Alliance agrees that conservationists should use the “tremendous power” of mass marketing to help create a national mandate for land conservation. Clare Swanger of the Taos Land Trust adds that the insight of mass marketers, but also of people living on the land, should be employed in such an effort. The outstanding question facing these conservationists is how to leverage modern marketing tools in a truly historic fashion. The aim would be to put together an effort comparable to the highly effective antismoking campaign of the last several decades, so as to build sustained momentum for the long-term protection and stewardship of “land for life.”

Building the Green Matrix

Addressing the multiple problems of open space consumption, loss of working landscapes, habitat fragmentation and biodiversity decline is a job that no single sector can tackle alone. Larry Selzer, president of the Conservation Fund and a proponent of smart conservation that balances economic returns with environmental principles, explains that effective action will require the cooperative efforts of landowners, policy makers and a wide diversity of individuals working across sectors. Furthermore, as Charles H.W. Foster of Harvard’s Kennedy School of Government points out, effective conservation efforts are at least as likely to take place at local and regional levels as at federal and international ones. Just how “green matrix” landscapes and organizational structures can be effectively assembled and maintained over the long term remains an area for thorough exploration and experimentation. Among other C21 participants, Peter Stein of the Lyme Timber Company, Jay Espy of the Maine Coast Heritage Trust, and Ian Bowles of the Kennedy School and the Moore Foundation are actively advancing the evolving art of assembling protected landscapes where economic and conservation goals can be pursued simultaneously.

Following Through with Stewardship

Achieving long-term conservation goals, of course, requires that once protection is gained for a given area, a well-crafted stewardship plan (and in some cases an environmental restoration plan) must be conceived, agreed to by the relevant parties, and then implemented. Getting this done has proved to be neither simple nor easy. Financing and organizing such stewardship efforts is too often overlooked during intense, short-fused campaigns to protect given parcels of land. Bringing a new level of attention and expertise to land and habitat stewardship and restoration efforts will be an ongoing challenge for the conservation community, particularly as its portfolio of protected lands grows in coming decades.

Fortunately, conservationists can point to some forward-looking stewardship efforts now underway. For example, Ralph Grossi, president of the American Farmland Trust, notes that the 2002 Farm Bill will provide significant levels of funding for USDA-sponsored stewardship efforts on agricultural lands. Similarly, Jaime Pinkham, a member of the Nez Perce Tribe in Idaho, offers eloquent testimony about how tribes can work with local, federal and other authorities to restore keystone species to entire ecosystems, as was accomplished with the gray wolf in the Northern Rockies. Still, there is room for a great deal of progress and innovation in this area.

Synthesizing Conservation Science

Conservation scientists E.O. Wilson, Leonard Krishtalka and Douglas Causey all underscore the argument that very significant progress can be made in the coming century to build large-scale syntheses in conservation biology and ecology. Wilson is particularly emphatic about the need to catalog all living species, a global work-in-progress that is only about 10 percent complete. The All Species Foundation that Wilson helped to form proposes to “complete the censusing of all the plants, animals and micro-organisms in the world in 25 years.” “Is this a pipe dream?” asks Wilson rhetorically. “No way,” he answers. “It is megascience backed by the same sort of technology drivers as the Human Genome Project. The important thing is to see the exploration of the biosphere as a crucial task.”

Gaining a comprehensive understanding of the biosphere and the ability to predict ecosystem outcomes under a variety of possible futures is indeed a grand challenge for conservation scientists. Kathy Fallon Lambert of the Hubbard Brook Research Foundation adds, “a complementary challenge is to find clear and concise ways to explain significant field and laboratory research findings to the general public and to key decision makers so that they can carry out policy debates with the best available scientific information.”

From our vantage point at the commencement of this century we cannot accurately predict just what future generations, 50 or 100 years from now, will judge to be our generation’s most significant conservation innovations, comparable to earlier creations of the world’s first statewide land trust or national park. We do know, however, that we face significant and complex conservation challenges, and our ideas for powerful innovation will only yield results if we act on them with great personal and organizational energy and intensity. There is no argument that the best time to begin such efforts is now.

James N. Levitt is director of the Internet and Conservation Project, Taubman Center for State and Local Government, Kennedy School of Government, Harvard University. His research focuses on the potentially constructive and disruptive impacts of new communications and transportation networks on land use and the practice of conservation, as well as opportunities for landmark conservation innovation in the twenty-first century.

Progress Toward Value-Based Taxation of Real Property in Lithuania

Kestutis Sabaliauskas and Albina Aleksiene, October 1, 2002

The Republic of Lithuania, which declared its independence from the USSR in 1990, is the largest and the southernmost of the Baltic countries, with a total area of 65,300 sq. km. and a population of 3.5 million. Although the other Baltic countries introduced market value-based land taxes earlier, Lithuania anticipates that its up-to-date real property information system and administration network, managed by the State Land Cadastre and Register (SLCR), will speed its implementation. The SLCR has been assigned the task of valuing property for taxation, and will utilize its computerized real property information system of land and building data for this purpose.

Tax systems in Lithuania, established early in the post-Soviet period, are gradually being reformed to accommodate development of democratic institutions and market economies, and to advance negotiations for entry into the European Union. The Lithuanian Governmental Action Program for 2001-2004 identified the introduction of market value-based taxes on land and buildings as a priority, contemplating an expanded tax base and a greater role for local government in fiscal decision-making.

Taxes on Land and Buildings

Currently there are two national taxes: a 1.5 percent land tax paid by landowners and a 1.0 percent property tax on the value of property (excluding land) paid by corporations and other legal entities. The tax proceeds are returned to the municipalities, where in 2001 they provided on average just over 8 percent of municipal budgets. The revenue from the property tax was nearly 10 times higher than the revenue from the land tax, and has increased annually, representing 2.3 percent of national budget revenues. Neither tax has a market value base at present, although some market elements have been introduced gradually in the land tax base.
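The two flat-rate taxes described above are straightforward to compute. The following sketch applies the article's rates to invented parcel and building values; the values themselves and the function names are illustrative only.

```python
# The two current national taxes in Lithuania, per the article:
# a 1.5 percent land tax (paid by landowners) and a 1.0 percent
# property tax on value excluding land (paid by legal entities).
# The example values below are invented for illustration.

LAND_TAX_RATE = 0.015      # 1.5 percent of land value
PROPERTY_TAX_RATE = 0.010  # 1.0 percent of property value, land excluded

def land_tax(land_value: float) -> float:
    """Annual land tax due on a parcel."""
    return land_value * LAND_TAX_RATE

def property_tax(building_value: float) -> float:
    """Annual property tax due on buildings (land excluded)."""
    return building_value * PROPERTY_TAX_RATE

# A company owning land worth 100,000 and a building worth 400,000
# (arbitrary currency units) would owe:
total_due = land_tax(100_000) + property_tax(400_000)  # 1,500 + 4,000 = 5,500
```

Note that because neither base yet reflects market value, the figures fed into such a calculation today come from normative rather than market valuations.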

Development of the Mass Valuation System

Information Systems

Lithuania initiated development of computer-based real property data 10 years ago. Since the establishment of the SLCR in 1997, a fully computerized Real Property Registration System has linked land parcels, buildings, and cadastre and register data into one unified system. The computer network covers the entire country and links counties and districts to the central databank, so that computerized registration of real property can take place in any branch office or client service bureau throughout the country. Analysis of the data permits monitoring of changes in the real property market, statistical analysis, and utilization of computer-assisted mass appraisal techniques. Figure 1 illustrates the current operation of the Real Property Register and flows of information on real property.

As of August 2002 nearly 4.7 million properties were registered, including more than 1 million land parcels, 615,000 buildings, 1.6 million auxiliary buildings, 950,000 flats and premises, and 395,000 engineering constructions. The central databank is expected eventually to register 6 million properties, including 2.3 million land parcels and 3.7 million buildings of different types.

Sales Data

The SLCR has been collecting real property sales information since 1998, and there are a sufficient number of transactions of flats, garages and land parcels to support mass valuation modeling based on market principles. The SLCR has created a databank of real property sales, and when a new real property unit is formed, it is inventoried and described in the Real Property Cadastre and all property rights are registered in the Real Property Register. At the conclusion of a transaction, a new owner registers the ownership in the register, but the data in the cadastre are not changed. When the transaction is registered the sale price indicated in the purchase-and-sale agreement is recorded into the database, allowing the price information to be supplemented by descriptive (cadastral) attributes. Table 1 shows the number of property sales registered during 1998-2001.
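The link between the sales databank and cadastral attributes is what makes market-based mass valuation possible: recorded prices become the dependent variable of a model, and cadastral attributes become its predictors. The sketch below is a deliberately minimal illustration of that idea, fitting a one-attribute price model by ordinary least squares and applying it to unsold properties; the sale figures, the single attribute (floor area), and the function names are all invented, and a real CAMA system would use many attributes and more robust estimation.

```python
# A minimal sketch of computer-assisted mass appraisal (CAMA):
# fit a price model to registered sales, then value unsold properties.
# Sales data and the single attribute (floor area) are hypothetical.

def fit_simple_model(sales):
    """Ordinary least squares for price = a + b * area (one attribute)."""
    n = len(sales)
    mean_area = sum(area for area, _ in sales) / n
    mean_price = sum(price for _, price in sales) / n
    cov = sum((a - mean_area) * (p - mean_price) for a, p in sales)
    var = sum((a - mean_area) ** 2 for a, _ in sales)
    b = cov / var
    a = mean_price - b * mean_area
    return a, b

def mass_appraise(model, areas):
    """Value every property in one pass using the fitted model."""
    a, b = model
    return [a + b * area for area in areas]

# Hypothetical registered sales: (floor area in sq. m., sale price)
sales = [(40, 42_000), (55, 56_000), (70, 71_500), (90, 88_000)]
model = fit_simple_model(sales)
values = mass_appraise(model, [50, 60, 80])  # unsold flats to be valued
```

The efficiency gain over individual appraisal comes from the second step: once the model is fitted, valuing the entire registered stock is a single computational pass.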

Mass Valuation Pilot Project

To prepare for the implementation of value-based real property taxation, the Ministry of Finance assigned to SLCR the task of undertaking a pilot project using mass valuation techniques. The results will be presented to the Ministry of Finance and other interested state institutions.

SLCR’s objectives are to complete the development of a real property mass valuation system to accomplish the following goals:

  • introduce data analysis and mass valuation technologies into practice;
  • prepare property mass valuation methods corresponding to Lithuanian conditions;
  • train specialists to carry out mass valuation; and
  • propose improvements to the real property database and adaptations for purposes of mass valuation.

At the conclusion of the project, SLCR will be able to analyze various possibilities for introducing a computer-assisted mass appraisal (CAMA) system in Lithuania, and to prepare proposals regarding ad valorem property tax administration and relevant institutional infrastructure development. The project involves 40 property valuers from both SLCR’s central and branch offices, who have been trained by specialists within SLCR and by international experts, including the Lincoln Institute, the Organisation for Economic Co-operation and Development (OECD), Swedesurvey and the Finnish National Land Survey.

Progress of Project Implementation

Property valuations have been nearly completed in the 11 municipalities selected as demonstration projects, one located in the territory of each SLCR branch office. The experience gained from these pilot projects will be valuable in extending the valuation throughout the entire country. Table 2 summarizes the progress made by SLCR and the Ministry of Finance in completing various steps in the implementation of the mass appraisal system.

Kestutis Sabaliauskas is director general of the State Land Cadastre and Register (SLCR) of Lithuania and Albina Aleksiene is chief of the Market Data Analysis Group.

A History of SLCR and Lincoln Institute Collaboration

The Lincoln Institute and the Lithuanian State Land Cadastre and Register (SLCR) have been collaborating on a series of seminars and research studies since 1997, in preparation for the introduction of market value-based taxation of real property in this Baltic country. A May 2001 Land Lines article, “Market Value-Based Taxation of Real Property,” reported on a weeklong course presented in February 2001 at the Lincoln Institute for a group of senior-level public officials from Lithuania. Participants included representatives from Parliament, the Prime Minister’s office and the Ministry of Finance; the United Nations Development Program provided financial support for the program. Their visit was important both in developing knowledge of real property taxation systems and in creating a working group of representatives from different governmental institutions who were eager to cooperate in establishing an up-to-date taxation system in Lithuania.

In November 2001, the Institute conducted a follow-up series of programs on market value-based taxation in Vilnius for representatives from institutions including the Government of the Republic of Lithuania, several ministries, the Tax Inspectorate, the Association of Municipalities, and the Lithuanian Association of Property Valuers. A second seminar, “Value-Based Taxation of Real Property in the Baltic Countries: A Comparative Review,” drew participants from Estonia, Latvia and Lithuania to discuss the progress of property tax reforms and to share experiences in undertaking mass valuations. A third seminar, organized in cooperation with the Committee of Budget and Finance of the Lithuanian Parliament, attracted many members of Parliament and top-level governmental officials involved in shaping tax policy. It addressed policy considerations in introducing a real property tax based on market value; the challenges and benefits of value-based taxation; and ways of implementing an efficient real property tax acceptable to the general public in Lithuania. Over 100 representatives of various institutions of Lithuania and the Baltic States attended one or more of these November seminars.

In May 2002 a faculty group organized and sponsored by the Lincoln Institute visited Lithuania for another series of meetings and briefings organized by SLCR to explore effective approaches to implementing a value-based real property tax system. SLCR staff presented extensive information on its activities and readiness to perform mass valuations at central headquarters as well as local offices, where most property valuers work. One outcome of the May meetings is the development of an educational program on mass valuation using Lithuania as a case study, which may be valuable to other countries in economic transition. This case will be presented during the next collaborative program to be held in Vilnius in 2003.

Lincoln Institute faculty participating in these programs are Joan Youngman, senior fellow and chairman of the Institute’s Department of Valuation and Taxation; Jane Malme, fellow of the Lincoln Institute; Richard Almy and Robert Gloudemans, partners in Almy, Gloudemans, Jacobs & Denne, Phoenix; and John Charman, consultant valuation surveyor, London.

Report from the President

The Evolution of Computer-Based Planning Tools
Gregory K. Ingram, April 1, 2012

The use of computer models in the planning of land use and transportation and for the analysis of urban housing markets has a long and variable history. One pioneering application of a large-scale computer model that linked land use and urban transportation was the 1960 Chicago Area Transportation Study. It used a spatially disaggregated model that included a detailed transportation network and embodied the classic land use, trip generation, modal choice, and network assignment steps of urban transport planning.

Applying a more analytic approach to predicting land use patterns, an influential model formulated by Ira Lowry for Pittsburgh in 1964 used economic base theory to distribute export-oriented economic activity. This was followed by the allocation of residences and population-serving employment within the metropolitan area to derive work and shopping trip patterns.
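The logic of a Lowry-type model can be compressed into a few lines: given ("basic") export employment, households locate in proportion to jobs, service jobs locate in proportion to households, and iterating captures the economic-base multiplier. The sketch below is a toy illustration of that structure, not a reconstruction of Lowry's actual 1964 model; the ratios, zone figures, and function name are all invented, and the real model allocated activity spatially with gravity-type accessibility terms omitted here.

```python
# A toy Lowry-type allocation (a sketch of economic base theory, not
# Lowry's 1964 Pittsburgh model). Basic (export) jobs per zone are given;
# households follow jobs, service jobs follow households, and iteration
# converges because hh_per_job * service_per_hh < 1. All numbers invented.

def lowry_sketch(basic_jobs, hh_per_job=1.5, service_per_hh=0.3,
                 iterations=50):
    """Iterate the economic-base multiplier until allocations settle."""
    service = [0.0] * len(basic_jobs)
    for _ in range(iterations):
        # Households locate in proportion to total (basic + service) jobs.
        households = [hh_per_job * (b + s)
                      for b, s in zip(basic_jobs, service)]
        # Population-serving employment locates in proportion to households.
        service = [service_per_hh * h for h in households]
    return households, service

# Two hypothetical zones with 1,000 and 500 export jobs respectively.
households, service = lowry_sketch([1000, 500])
```

At the fixed point each zone's households equal hh_per_job * basic / (1 - hh_per_job * service_per_hh), which is the multiplier effect the iteration converges to.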

More attention to spatially disaggregated models of urban housing markets followed in the early 1970s in the form of the Urban Institute Housing Model (representing decadal housing market changes) and the National Bureau of Economic Research Urban Simulation Model (a microanalytic model annually projecting the behavior of 85,000 households identified by workplace and residential locations). Both models were used to analyze the impact of housing allowance programs and were applied more for policy analysis than planning.

In the late 1970s, the focus turned to the development and application of sketch planning models, particularly in transportation. While these models were still spatially disaggregated, they used tens instead of hundreds of traffic zones, and transport networks were represented in less detail. Such models were adapted to represent transport-related outcomes beyond network flows, including vehicular emissions, exposure of populations to air pollution, vehicle miles of travel, and energy consumption. These smaller models migrated from mainframe computers to personal computers in the 1980s, easing their application. Their data needs were still great, but many of them made more systematic use of available spatially disaggregated census data, aiding the transfer and calibration of models among locations.

In the past two decades, the advent of geographic information systems (GIS) and the development of software to visually display data in three dimensions have been transforming the use of computers in planning. GIS-compatible data are now available from satellites, census sources, and government agencies. Local municipalities have moved rapidly to combine their data on property records with data on crime, transport, and demographics, and such municipal data files are often available on the web. While the availability of GIS data has clearly increased, variations in formats, definitions, and coverage can make it challenging to combine information from different sources into a unified data set for a metropolitan region.

The use of three-dimensional displays of spatially disaggregated data has transformed the presentation of data and model results. These techniques, including 3D maps at the metropolitan level and the ability to “fly through” a street or neighborhood at the project level, facilitate community consultation. They also make it much easier for nonspecialists to understand and participate in the process and interpret the results of alternative planning scenarios.

Along with the advances in data and its presentation, computer software has become easier to use and more widely available on open source platforms. While the codes of many earlier computer-based planning tools have been available in the public domain, using them generally has required high-level programming skills. As more of these tools are presented in user-friendly formats and integrated with other modules, the use of computer-based methods to compare and contrast alternative development scenarios will be more accessible than ever. Indeed, many planning agencies are now able to use scenario planning tools to produce alternative possible futures that provide a foundation for discussions and public consultations to identify which outcomes are desirable and which are to be avoided.

As reported elsewhere in this issue of Land Lines, the Lincoln Institute is supporting the use of various types of planning tools for research and evaluation on the effectiveness of policies intended to improve land development outcomes.