A more up-to-date version of this article is available as part of Chapter 5 of the book Perspectivas urbanas: Temas críticos en políticas de suelo de América Latina.
Land value is determined primarily by external factors, chiefly changes occurring in the surrounding neighborhood or in other parts of the city, rather than by the direct actions of landowners. This observation holds especially for small parcels, whose form or type of occupancy does not generate externalities powerful enough to feed back into their own value. A small parcel generally has no significant influence on the external factors that could affect its own value. Large urban projects (grandes proyectos urbanos, or GPUs), by contrast, do influence those factors, as well as the value of the land that supports them. This is the basis of the Lincoln Institute's interest in the topic.
For the analysis of GPUs we propose two perspectives that complement and contrast with others that have tended to dominate this debate. The first centers on the idea that GPUs can be a stimulating force that drives immediate urban changes capable of affecting land values and, consequently, land uses, whether across large areas or throughout an entire city-region. This perspective focuses on urban design or urbanism and gives priority to the physical, aesthetic, and symbolic dimensions of large urban projects. The second, focused on the regulatory framework, seeks to understand the land value increments generated by the development and execution of these projects as a potential mechanism for self-financing and economic viability, and analyzes the role of GPUs in repurposing certain parcels or areas of the city. Both perspectives call for a more comprehensive reading that takes in the diversity and complexity of the projects, their relationship to the City Plan, the kind of regulatory framework they require, the roles of the public and private sectors in their management and financing, and land taxation and fiscal policy, among other factors.
Large projects are nothing new in Latin America. In the early twentieth century, many cities were shaped by public-private programs that involved outside actors (national and international) and complex financial structures. Some projects had the potential to act as catalysts of urban processes capable of transforming their surroundings or even the city as a whole, while also accentuating preexisting socio-spatial polarization. Projects were frequently imposed over existing regulations, calling the prevailing urban planning strategies into question. Large urban development firms and public utility companies (English, Canadian, French, and others) coordinated the provision of services with complex real estate operations in nearly all of Latin America's major cities.
Today, large projects seek to intervene in especially sensitive areas in order to redirect urban processes and create new urban identities at the symbolic level. They also attempt to create new economic areas (at times, territorial enclaves) able to foster environments that are sheltered from urban violence and poverty and more attractive to domestic or international private investment. In describing the rationale for these programs, their advocates stress their instrumental role in strategic planning, their supposed contribution to urban productivity, and their effectiveness in strengthening the city's competitiveness.
In a context of transformations driven by globalization, economic reform, deregulation, and new approaches to urban management, it is not surprising that these programs have been the target of great controversy. Their scale and complexity tend to spur new social movements, redefine economic opportunities, call into question urban development frameworks and land use regulations, overwhelm municipal coffers, and enlarge political arenas, all of which alters the role of urban interest groups. Added to this array of factors is the long time frame required to execute these large urban projects, which usually exceeds the terms of municipal governments and the limits of their territorial authority. This reality poses additional management challenges and has generated enormous controversy in public and academic debate.
The Lincoln Institute's contribution to this debate is to highlight the land component in the structure of these large projects, specifically the processes associated with urban land management and the mechanisms for recovering or mobilizing land value increments for the benefit of the community. This article is part of a larger, ongoing effort to systematize recent Latin American experience with GPUs and to analyze the relevant issues.
A Wide Range of Projects
As in other parts of the world, large urban projects in Latin America span a wide range of activities: the recovery of historic centers (Old Havana or Lima), the renewal of run-down central areas (São Paulo or Montevideo), the reconfiguration of ports and waterfronts (Puerto Madero in Buenos Aires or Ribera Norte in Concepción, Chile), the reuse of former airports or disused industrial zones (the Tamanduatehy corridor in Santo André, Brazil, or the Cerrillos airport in Santiago, Chile), expansion zones (Santa Fe in Mexico City or the former Panama Canal zone), and neighborhood or housing upgrading projects (Nuevo Usme in Bogotá or Favela Bairro in Rio de Janeiro), among others.
Land management is a key component of all these projects, and it presents diverse sets of conditions (Lungo 2004; forthcoming). A common feature is that the projects are managed by government authorities as part of a city plan or project, even when they involve private participation in various respects. For this reason, purely private programs such as shopping centers and gated communities fall into a different category of development project and are not included in this discussion.
Scale and Complexity
In terms of land area or the amount of investment, what is the minimum scale threshold for an urban intervention to qualify as a GPU? The answer depends on the size of the city, its economy, its social structure, and other factors, all of which help define the project's complexity. In Latin America, these projects typically combine large scale with a complex set of actors holding key roles in land policy and management, including representatives of the different levels of government (national, provincial, and municipal), as well as private entities and community leaders from the affected area. Even relatively small upgrading projects often display extraordinary complexity in their land readjustment component.
There are obviously huge differences between a project put forward by one or a few owners of a large area (such as ParLatino, a site of abandoned industrial facilities in São Paulo) and one that requires the cooperation of many owners of small parcels. The latter calls for a complex series of actions capable of generating synergies or enough external economies to make each action economically viable. Most projects fall between these two extremes and frequently require the prior acquisition of rights to smaller parcels by a few agents in order to centralize control over the type and management of the development.
For the analysis and design of GPUs in Latin America, it is essential that the institutional arrangement in charge of managing the project be able to adequately incorporate and coordinate this scale and complexity. In some cases government corporations have been created that operate autonomously (as in Puerto Madero) or as special public agencies attached to central or municipal governments (as with the housing program under way in the city of Rosario, Argentina, or the Nuevo Usme program in Bogotá). The failed project to build a new airport for Mexico City is compelling evidence of the negative consequences of not getting this fundamental aspect of GPUs right.
The Relationship between GPUs and the City Plan
What is the point of developing large urban projects when there is no comprehensive urban development plan or integrated social vision? There are situations in which the execution of GPUs can stimulate, improve, or strengthen the City Plan, but in practice many of these projects are launched without any plan at all. One of the main criticisms leveled at GPUs is that they become instruments for excluding citizen participation from decisions about what is expected or supposed to be part of an integrated urban project, as would normally be laid out in a city's master plan or land use plan.
All of this makes for an interesting debate within the framework of urban policy in Latin America, since urban planning itself has been accused of fostering processes of gentrification and exclusion. Some authors have concluded that urban planning has been one cause, if not the main cause, of the excesses of the social segregation typical of Latin American cities; in this context, the recent popularity of GPUs can be seen as an elite reaction to redemocratization and participatory urban planning. For others, GPUs are an advanced (and harmful) expression of traditional urban planning, a product of planning's failures or inefficiencies, while still others regard them as "the lesser evil," because they at least guarantee that something gets done somewhere in the city.
With respect to their relationship to a City Plan, GPUs face multiple challenges. For example, they can stimulate the preparation of a City Plan where none exists, help modify traditional plans, or, when neither is feasible, do what we might call "navigating through the urban fog." In any case, land management emerges as an essential factor for both the plan and the projects, because it goes to the critical issue of the regulatory framework governing land uses in the city and its expansion area.
Regulatory Framework
The preferred regulatory solution would be a two-part intervention: on the one hand, maintaining general regulations for the whole city while modifying conventional criteria so that they are flexible enough to absorb the constant changes occurring in urban areas; and on the other, allowing specific regulations for particular projects, while avoiding regulatory frameworks that run counter to the objectives set out in the City Plan. The "Urban Operations," an ingenious, project-specific instrument created under Brazilian urban law (the City Statute of 2001), have been widely used to meet these dual needs: in the city of São Paulo alone, 16 such operations have been implemented. Another version of this instrument is the "partial plan," a provision intended to readjust large tracts of land that is included in Colombia's equally innovative Law 388 of 1997.
Again, in practice we observe that seemingly arbitrary exceptions are made and that regulatory restrictions are frequently ignored. The point here is that none of these regulations undergoes an assessment of its socioeconomic and environmental value, so a significant part of their justification is lost. Given the financial and fiscal fragility of Latin American cities, there is practically no capacity to debate publicly the requests made by GPU proponents. The absence of institutional mechanisms that would lend transparency to these negotiations increases their venality, insofar as it leaves room for other, less prosaic legal challenges.
Public or Private Management and Financing
What is the desirable mix of public and private participation in the management of these projects? To guarantee the public sector's role in managing a large urban project, land use must be controlled and regulated, although questions remain unresolved, such as how much control should be established and which specific components of land property rights should be controlled. The ambiguity of the courts and the uncertainty that accompanies the development of GPUs often lead to public frustration over unforeseen outcomes that favor private interests. The heart of the problem lies in striking an appropriate balance between effective ex ante controls (formulation, negotiation, and design of GPUs) and ex post controls (implementation, management, operation, and effects) over land uses and rights. In Latin American experience with GPUs, there is often an enormous gap between the original promises and the actual results.
In recent years there seems to be confusion between the usefulness and viability of the public-private partnerships formed in many countries to carry out specific projects or programs and the much broader proposition of privatizing the management of urban development in general. When the private sector has absolute control over the land, however, it becomes much harder for these projects to contribute to socially sustainable urban development, even though in many cases they generate substantial tax revenues for the city (Polese and Stren 2000).
The preferred system of public management should rest on the broadest possible social participation and should incorporate the private sector in the financing and execution of these projects. The large urban interventions that contribute most to the development of the city are based on public management of the land.
Land Value Increments
There is consensus about the potential of the land value increments generated by large urban projects. Disagreements arise when it comes to assessing the actual amount of these increments, deciding whether they should be redistributed, and if so, determining how and for whose benefit, in both social and territorial terms. Here again we face the "public-private" conundrum, since this redistribution formula often leads to the appropriation of public resources by the private sector.
One way to measure the success of the public management of these projects could be the land value increment itself, as a resource that can be mobilized to help finance the GPU or transferred to other parts of the city. Acceptable estimates of these increments are rarely available, however. Even in the Puerto Madero project in Buenos Aires, considered a success, no assessment has yet been made of the increases in land value associated with properties within the project itself or in neighboring areas. As a result, conversations about possible redistribution have not gone very far.
GPUs conceived as instruments for achieving certain strategic urban goals are usually deemed successful when they are executed according to plan. Yet questions about the extent to which those goals were actually achieved never receive complete answers and are often conveniently "forgotten." The hypothesis that seems to fit Latin American experience with GPUs best is that this apparent lack of interest in the goals has little to do with any technical inability to trace the source of the value increment transparently; rather, it stems from the need to conceal the role of public management in facilitating the private sector's capture of the value created, or in supporting the transfer of public resources to that sector through the construction of the project.
This is not a matter of feigning ignorance or minimizing the challenges involved in advancing our understanding of how land value increments are formed and in measuring their size and circulation. We know there are many obstacles, stemming from complicated land rights, the persistent shortcomings or failures of cadastres and property registries, and the lack of georeferenced historical series of property values. Even the smallest plan must distinguish between the value increment generated by the project itself and that generated by urban externalities, which almost always exist regardless of the project's scale, as well as among different sources and rhythms of valorization, and so on. Some studies have measured and assessed the value increments associated with development, but the technical obstacles seem less important than the lack of political interest in understanding how these projects are managed.
The distribution of the value created may favor its use on the project site itself or in its immediate urban surroundings. This idea rests on the need to finance a specific project within the area, to compensate for the negative impacts generated, or even to fund actions such as relocating precarious housing settled on the site or nearby that is considered to harm the image of the large project. Given the socioeconomic conditions typical of most Latin American cities, it is not difficult to understand that the preferred allocation of the recovered value increments would be social projects in other parts of the city, such as housing developments. In fact, a significant share of the land value increment generated is precisely the result of removing the negative externalities produced by the presence of low-income families in the area. Needless to say, this strategy provokes divergent positions.
Better laws and instruments are undoubtedly needed to manage the advantages and risks involved in mobilizing land value increments for social purposes and in the gentrification of the area through the displacement of the poor. Despite the lack of empirical studies, there is reason to believe that some compensatory transfers within the city could end up being counterproductive. For example, the resulting differences in land price increases and in residential social segregation may give rise to higher social costs, to which additional public resources will have to be devoted in the future (Smolka and Furtado 2001).
Positive and Negative Impacts
The negative impacts caused by large urban projects often overshadow their positive impacts in all their variety. The challenge is how to reduce the negative impacts produced by this type of urban intervention. It quickly becomes clear that, directly or indirectly, how the land is managed is critical to understanding the effects of large interventions on the city's development, on urban planning and regulation, on socio-spatial segregation, on the environment, and on urban culture. Scale and complexity play different roles depending on the type of impact: scale weighs more heavily on urban design and environmental impacts, while complexity matters more for social impacts and urban politics.
As noted earlier, the gentrification that often results from these projects drives the displacement of the existing, usually poor, population from the area of the new project. Gentrification, however, is a complex phenomenon that requires further analysis of its own negative aspects, as well as of how it might help raise living standards. Rather than simply mitigating undesirable negative impacts, it may be more useful to focus on better managing the processes that generate them.
Depending on how urban development is managed, on the role of the public sector, and on the existing level of citizen participation, any GPU can have positive or negative effects. We have stressed the fundamental role of land management and of the land value increments associated with these projects. GPUs cannot be analyzed in isolation from the overall development of the city. Likewise, the land component must be assessed in relation to the combination of scale and complexity appropriate to each project.
About the Authors
Mario Lungo is a professor and researcher at the Universidad Centroamericana (UCA José Simeón Cañas) in San Salvador, El Salvador. He previously served as executive director of the Oficina de Planificación del Área Metropolitana de San Salvador.
Martim O. Smolka is Senior Fellow, codirector of the Department of International Studies, and director of the Program on Latin America and the Caribbean at the Lincoln Institute.
References
Lungo, Mario, ed. 2004. Grandes proyectos urbanos (Large urban projects). San Salvador: Universidad Centroamericana José Simeón Cañas.
Lungo, Mario. Forthcoming. Grandes proyectos urbanos: Una revisión de casos latinoamericanos (Large urban projects: A review of Latin American cases). San Salvador: Universidad Centroamericana José Simeón Cañas.
Polese, Mario, and Richard Stren. 2000. The social sustainability of cities. Toronto: University of Toronto Press.
Smolka, Martim, and Fernanda Furtado. 2001. Recuperación de plusvalías en América Latina (Value capture in Latin America). Santiago, Chile: EURE Libros.
Governments have long recognized the need to preserve certain open space lands because of their importance in producing public goods and services such as food, fiber, recreation and natural hazard mitigation, or because they possess important geological or biological features.
New impetus for open space preservation results from the desire to counteract the effects of declining urban cores, suburban sprawl, and the socioeconomic and land use changes now encroaching on high-amenity rural areas. The growing use of habitat conservation plans for reconciling environmental and economic objectives also draws attention to the values of open space, especially in comparison to alternative land uses.
It is likely that most decisions about open space preservation will be made at the local level, due in part to the general trend of devolution of governmental responsibility (with accompanying fiscal responsibility), as well as an increase in the institutional capacity and activism of local land conservation trusts. Since local governments are heavily dependent on the property tax for operating revenue, the fiscal and economic implications of open space preservation decisions are paramount. Conservationists are frequently called upon to demonstrate to local communities the economic value of preserving open space.
While much has been written about the economic value of the environment in general and of open space in particular, the literature is segregated by discipline or methodology. It is therefore difficult to assess the economic value of open space comprehensively. It is even more difficult to apply what is known in a public policy context, where open space holds significant non-monetary value.
Concepts of Value and Public Goods
Like all natural ecosystems, open space provides a variety of functions that satisfy human needs. However, attempting to assign monetary values to these functions presents several challenges. First, open space typically provides several functions simultaneously. Second, different types of value are measured by different methodologies and expressed in different units. Converting to a standard unit (such as dollars) involves subjective judgments and is not always feasible. Third, values are often not additive, and “double counting” is an ever-present problem. Finally, some would argue that it is morally wrong to try to value something that is by definition invaluable. At a minimum, they say, open space will always possess intangible values that are above and beyond any calculation of monetary values.
Open space often plays an important role in the provision of “public goods.” Public goods are nonexcludable: once they are produced it is impossible or very costly to exclude anyone from using them. They are also nonconsumptive: one person’s enjoyment of the good does not diminish its availability for others. The limited ability of producers to exclude potential users typically precludes the development of market allocation systems for public goods. As a result, easily observed measures of value, like those expressed through market prices, do not exist. Yet land use and resource management decisions imply tradeoffs between marketed and non-marketed goods and services, making it difficult to compare relative values and, through tradeoffs, arrive at socially optimal decisions.
Use and Nonuse Values
Much of the economic value associated with open space activities like recreation can be examined as use value and nonuse value. Use value results from current use of the resource, including consumptive uses (e.g., hunting and fishing), nonconsumptive uses (e.g., hiking, camping, boating and nature photography) and indirect uses (e.g., reading books or watching televised programs about wildlife).
Activities directly or indirectly associated with open space may provide an important source of revenue for businesses and state and local governments. For example, hunting and fishing license fees are a major source of funding for state wildlife agencies. Less direct but perhaps more important from an overall economic perspective are expenditures related to nonconsumptive open space activities that also have income and job multiplier effects and often occur in rural areas with limited commercial potential.
The economic implications of use and nonuse values across society can be very large, and many economists agree that these values should be considered in open space decisionmaking. Measuring use and nonuse values is difficult, however, due to the lack of markets and market prices and the existence of administratively set, quasi-market prices such as hunting and fishing license fees. To arrive at socially meaningful estimates of value for many nonmarket resources, economists use the concept of consumer surplus, or the amount above actual market price that a buyer would theoretically be willing to pay to enjoy a good or service.
Two methods are used to first estimate the demand curve for the resource: contingent valuation or travel cost methods. In the first, a hypothetical market is created in a survey and respondents are asked what they would be willing to pay for some defined activity or resource. In the second, the cost of travel to a site is viewed as an entry or admission price, and a demand curve is derived from observing visitation from various origins with different travel costs. While still controversial, these methods have been used in numerous studies to estimate the willingness to pay in addition to actual expenses for various recreational activities (see chart 1), as well as for nonuse values such as maintaining populations of certain endangered species or preserving unique bird habitats.
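To make the travel cost logic concrete, the sketch below fits a demand curve to hypothetical visitation data from origin zones with different travel costs and computes consumer surplus as the area under that curve above the (free) entry price. The zone data and the simple linear functional form are illustrative assumptions, not results from any study cited here.

```python
# A minimal travel-cost sketch with made-up data: visitation rates from
# origin zones with different travel costs are used to fit a linear demand
# curve, and consumer surplus is the area under that curve above the
# current (zero) entry price.

import numpy as np

# Hypothetical origin zones: travel cost per trip ($) and trips per 1,000 residents
travel_cost = np.array([5.0, 15.0, 25.0, 40.0, 60.0])
trips_per_1000 = np.array([90.0, 70.0, 55.0, 30.0, 10.0])

# Fit trips = a + b * cost (b should be negative: higher cost, fewer trips)
b, a = np.polyfit(travel_cost, trips_per_1000, 1)

choke_price = -a / b        # cost at which predicted visits fall to zero
current_price = 0.0         # assume free entry to the open space site

# Consumer surplus per 1,000 residents: triangle under the linear demand
# curve between the current price and the choke price.
trips_at_current = a + b * current_price
consumer_surplus = 0.5 * (choke_price - current_price) * trips_at_current

print(f"estimated choke price: ${choke_price:.2f} per trip")
print(f"consumer surplus: ${consumer_surplus:,.0f} per 1,000 residents")
```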
Several types of nonuse values consider the possibility for future use. Option value represents an individual’s willingness to pay to maintain the option of utilizing a resource in the future. Existence value represents an individual’s willingness to pay to ensure that some resource exists, which may be motivated by the desire to bequeath the resource to future generations.
Measuring the Economic Value of Open Space
As a result of decreased intergovernmental transfers of financial aid and increasing citizen resistance to taxes, local officials now scrutinize the fiscal consequences of land use decisions more than ever before. The primary analytic tool available to policymakers for this purpose is fiscal impact analysis, a formal comparison of the public costs and revenues associated with growth within a particular local governmental unit. Fiscal impact analysis is utilized frequently in large communities experiencing growth pressures on the metropolitan fringe, and it is being applied to open space preservation.
A review of fiscal impact studies by Robert Burchell and David Listokin concludes that generally residential development does not pay its own way. They found that nonresidential development does pay for itself, but is a magnet for residential development, and that open space falls at the break-even point. A study of eleven towns by the Southern New England Forest Consortium shows that on a strictly financial basis the cost of providing public services is more than twice as high for residential development as for commercial development or open space (see chart 2).
Care must be taken when evaluating the results of fiscal impact analyses for several reasons: the choices of methodology and assumptions greatly influence the findings; specific circumstances vary quite widely from community to community; and fiscal impact analyses do not address secondary or long-term impacts. Nevertheless, fiscal impact analysis is a powerful and increasingly sophisticated planning tool for making decisions about land use alternatives at the community level.
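The core of such a comparison is a ratio of service costs to revenues by land use. The fragment below illustrates that arithmetic with invented dollar figures; actual studies such as those cited above allocate real municipal budgets and tax rolls among land use categories.

```python
# A small illustrative sketch of the ratio at the heart of a fiscal impact
# (cost of community services) comparison. The dollar figures are invented.

land_uses = {
    # land use: (annual revenue to the town, annual cost of services it demands)
    "residential": (1_000_000, 1_150_000),
    "commercial":  (  400_000,   180_000),
    "open space":  (   50_000,    20_000),
}

for use, (revenue, cost) in land_uses.items():
    ratio = cost / revenue   # dollars of service cost per dollar of revenue
    print(f"{use:12s}  ${ratio:.2f} in services per $1.00 of revenue")

# A ratio above 1.00 means the land use does not "pay its own way"; in the
# studies cited above residential land is typically above 1.00, while
# commercial land and open space are typically well below it.
```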
The most direct measure of the economic value of open space is its real estate market value: the cash price that an informed and willing buyer pays an informed and willing seller in an open and competitive market. In rural areas, where highest and best use of land (i.e., most profitable use) is as open space, one can examine market transactions. In urban or urbanizing regions, however, where highest and best use (as determined by the market) has usually been development, the open space value of land must be separated from its development value, especially when land is placed under a conservation easement.
Open space may also affect the surrounding land market, creating an enhancement value. Casual observers find evidence of enhancement value in real estate advertisements that feature proximity to open space amenities, and it is explicitly recognized by federal income tax law governing the valuation of conservation easements. A number of empirical studies have shown that proximity to preserved open space enhances property values, particularly if the open space is not intensively developed for recreation purposes and if it is carefully integrated with the neighborhood. Enhancement value is important to the local property tax base because it offsets the reduced tax contribution of the open space itself, which is usually tax-exempt or taxed at a low rate.
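Empirical studies of this kind typically rely on hedonic regression, in which sale prices are regressed on property characteristics plus a measure of proximity to open space. The sketch below runs that regression on simulated data, so the estimated premium is purely illustrative.

```python
# A hedged sketch of the hedonic approach: regress sale prices on property
# characteristics plus distance to preserved open space, and read the
# enhancement value off the distance coefficient. Data are simulated.

import numpy as np

rng = np.random.default_rng(0)
n = 500

sqft = rng.normal(1800, 400, n)                 # living area (sq ft)
dist_to_open_space = rng.uniform(0.1, 3.0, n)   # miles to nearest preserved parcel

# Simulated "true" relationship: price falls $15,000 per mile from open space
price = 80_000 + 90 * sqft - 15_000 * dist_to_open_space + rng.normal(0, 20_000, n)

# Ordinary least squares
X = np.column_stack([np.ones(n), sqft, dist_to_open_space])
beta, *_ = np.linalg.lstsq(X, price, rcond=None)

print(f"estimated premium per mile closer to open space: ${-beta[2]:,.0f}")
```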
Open space possesses natural system value when it provides direct benefits to human society through such processes as ground water storage, climate moderation, flood control, storm damage prevention, and air and water pollution abatement. It is possible to assign a monetary value to such benefits by calculating the cost of the damages that would result if the benefits were not provided, or if public expenditures were required to build infrastructure to replace the functions of the natural systems.
An example of this approach is the Charles River Basin in Massachusetts, where 8,500 acres of wetlands were acquired and preserved as a natural valley storage area for flood control for a cost of $10 million. An alternative proposal to construct dams and levees to accomplish the same goal would have cost $100 million. In another study, the Minnesota Department of Natural Resources calculated that the cost of replacing the natural floodwater storage function of wetlands would be $300 per acre foot.
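A stylized version of this avoided-cost arithmetic is shown below; the acreage and storage depth are hypothetical, and only the $300 per acre-foot replacement figure comes from the Minnesota study just cited.

```python
# A toy calculation in the spirit of the replacement cost examples above.
# Acreage and storage depth are hypothetical assumptions.

acres_of_wetland = 2_000          # hypothetical preserved wetland area
storage_depth_ft = 1.5            # hypothetical average floodwater storage depth
replacement_cost_per_acre_foot = 300   # from the Minnesota study cited above

acre_feet_stored = acres_of_wetland * storage_depth_ft
natural_system_value = acre_feet_stored * replacement_cost_per_acre_foot

print(f"floodwater storage: {acre_feet_stored:,.0f} acre-feet")
print(f"avoided replacement cost: ${natural_system_value:,.0f}")
```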
Lands valued for open space are seldom idle, but rather are part of a working landscape vital to the production of goods and services that are valued and exchanged in markets. Often, the production value resulting from these lands is direct and readily measured, as is the case in crops from farms and orchards, animal products from pasture and grazing lands, and wood products from forests. The economic returns from production accrue directly to the landowner and often determine current and future land use alternatives.
Open space lands may also play a less direct but nonetheless important production role for market-valued goods that depend in part on functions provided by private lands. Examples are the role of privately owned wetlands in fish and shellfish production and the role of private lands in supplying habitat for wild game. In addition to providing market-valued goods and services, direct and indirect production from open space lands supports jobs that are valuable to local, regional and national economies.
Conclusions
It will never be possible to calculate the economic value of open space completely, nor should that be the goal. Certain intangible values lose significance when attempts are made to quantify them. Indeed, to incorporate into the real estate market the public values of open space without also developing a means of capturing those values for the public benefit would be counterproductive for conservation purposes.
Land use decisions ranging from the allocation of scarce conservation budgets to the property rights debate will be better informed if there is a more comprehensive understanding of the economic value of open space. Methods for determining and comparing value vary widely in level of sophistication and reliability. Some are based on long-established professional standards, while others continue to evolve. Given the inherent subjectivity of the term, any discussion of value must include a variety of disciplines, methodologies and approaches. The greatest benefit may be in prompting reassessment of the “conventional wisdom” about the economic consequences of development and conservation.
Charles J. Fausold is a fellow at the Lincoln Institute of Land Policy. Robert J. Lilieholm is an associate professor at Utah State University and a former visiting fellow at the Lincoln Institute. With partial support from the Boston Foundation Fund for the Preservation of Wildlife and Natural Areas they are reviewing and synthesizing existing information to develop a useful framework for considering the economic value of open space.
Carlos Morales-Schechinger joined IHS, the Institute for Housing and Urban Development Studies of Erasmus University in Rotterdam, the Netherlands, in 2008. This international institute attracts students from all over the world, most of them from developing countries. Some IHS programs are organized jointly with the Lincoln Institute.
Previously, Morales was a part-time professor at the Universidad Nacional Autónoma de México (UNAM). Over the past 12 years he has contributed regularly to seminars and courses organized by the Lincoln Institute throughout Latin America. His teaching focuses mainly on topics such as instruments for capturing land value increments, land and property taxation, and preventive land-based policies as alternatives to informal settlement.
Morales has held various government posts: he served as director of land policies and instruments in Mexico's Secretaría de Desarrollo Urbano, where he designed and implemented an ambitious program of territorial land reserves, and as director of cadastral policy for the Mexico City government, where he managed a sweeping reform of property taxation. He has also held positions in public and private banks in Mexico, working on real estate appraisal, mortgages, property management, and lending for large urban developments and for municipal governments.
Morales holds an undergraduate degree in architecture from UNAM, a diploma in local government finance from the University of Birmingham, United Kingdom, and a master's degree in urban studies from the University of Edinburgh, United Kingdom.
Land Lines: How did you become involved with the Lincoln Institute?
Carlos Morales: My first contact was in the early 1980s, when I attended an international conference sponsored by the Institute in Cambridge that was related to my government work on urban land policy. I was able to put the ideas I learned there directly into practice two years later, when I was working on a reform to increase the supply of serviced land in medium-sized cities and secured cross-subsidies for serviced lots for low-income families in Mexico. In the early 1990s, while working for the Mexico City government on an ambitious property tax reform, I attended another Institute conference on property taxation.
Beginning in 2000, I took part in several educational activities organized by Martim Smolka through the Program on Latin America and the Caribbean. Around 2004 the Institute created a joint initiative with IHS and hired me as one of the Institute's guest lecturers to teach in those programs. Later I was invited to join IHS full time to manage this joint initiative.
Land Lines: How do you compare the effectiveness of institutions like IHS and the Lincoln Institute?
Carlos Morales: I think they are complementary. The Lincoln Institute is a leader in research and education on land policy, with an international focus on Latin America and China. IHS is recognized for its education and capacity-building work on urban management and development for a worldwide audience, particularly developing and transition countries. IHS courses are open to students from all regions, although most come from countries in Africa, Asia, and Central and Eastern Europe. Through the joint initiative with IHS, the Lincoln Institute can reach students from many more countries efficiently.
Land Lines: Conveying fundamental knowledge about land policy and urban management to practitioners is not easy. In your view, what is the most effective approach?
Carlos Morales: A combination of two factors is important: the profile of the instructor and an appropriate pedagogy. Instructors need both practical and academic experience so they can answer the questions that matter to practitioners, especially when the answers mean pulling them out of their comfort zone and challenging them in some way.
The ultimate goal of the social sciences is precisely to change reality, not just to understand it. Consulting brings academics closer to practice, but it does not confront them with the moral commitment of implementing a policy or the ethical responsibility of making the policy work in the real world. Direct hands-on experience is essential. The Institute's programs in Latin America use instructors with this profile, who have proven effective in addressing questions such as the impact of taxation and regulation on land markets and the choice of land value capture instruments, both hot topics in the region.
As for pedagogy, practitioners tend to be skeptical of theory, which they consider impractical, and they want to test it before they are convinced. Using examples of policies implemented in other cities is very helpful. Some students from developing countries will not accept cases from more developed countries, arguing that their governance structures are too different. Other students prefer cases from a variety of settings because, despite the contextual differences, they aspire to better development opportunities for their own countries. An instructor needs an arsenal of different cases to draw on as questions come up.
Simulation games are also a very effective technique. Role-playing games in which participants compete with one another are the most useful for understanding land markets and helping to solve problems. Role plays are very revealing even when participants fail to solve the problems, because failure prompts them to ask what happened. I have watched participants who experienced failure in a game begin to cooperate and to design ingenious regulations on their own. Another strategy is to assign participants a role that runs counter to their beliefs or experience. For example, government officials who play the part of pirate developers discover the large amounts of money the poor have to spend just to gain access to land.
Playing devil's advocate works well when debating controversial concepts, as if the participants were in a land court. This is not a new technique, although we play it with some variations. One example is determining the criteria for compensation in expropriations. In this game, one team argues for current use values and the other for future use values. Supporting literature and practical information are provided so each team can build its arguments. Practitioners from different countries can draw on examples of regulatory takings, whether expropriations in China, land restitutions in Eastern Europe, or the sale of building rights in Brazil.
Because participants must defend a position they do not agree with, they have to study and work harder. In many cases they end up changing their minds or at least identifying new arguments to use later in debates with their real-life opponents. At the end of the land court game, the group acting as the jury votes twice by secret ballot: first on the performance of the teams whose members acted as advocates, and second on the conceptual arguments. When a team receives more votes than the position it defended, it becomes clear that the issue needs deeper research. What I like most is that the game does not impose a position on participants; it raises the level of the debate.
Land Lines: What are the main kinds of resistance to concepts and ideas related to land policy?
Carlos Morales: Perhaps the concept that most often meets resistance is the way taxes and regulations are capitalized into land prices. The resistance may come from an ideological standpoint (both the left and the right have their arguments), from self-interest (landowners do not readily accept sacrificing their gains), or from ignorance of how capitalization works. As an educator, it is my job to address this last challenge.
Even when the theory is explained to practitioners, they remain skeptical if their experience contradicts it. The misunderstanding may arise from thinking of a tax on a consumer good that is not as scarce as land, but it can also stem from their own experience with land markets. This happens when two policies with opposite effects are introduced together, for example increasing densities and raising taxes. Their combined effect makes it hard to grasp the impact of each one. A simulation game can help isolate each impact. Practitioners need to experiment with each measure to understand both policies better. I have noticed that they sometimes nod skeptically when you lecture on the theory, and then smile with a "eureka" look when they finally grasp it after playing a game.
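As a rough illustration of the capitalization point Morales makes here, the sketch below prices a parcel as the present value of its net rent and shows how a recurring land value tax is capitalized into a lower price; the rent, discount rate, and tax rate are hypothetical.

```python
# A minimal sketch of land price capitalization. All numbers are hypothetical;
# the point is only that a recurring land value tax lowers the price a buyer
# is willing to pay today.

def land_price(annual_rent, discount_rate, tax_rate=0.0):
    """Price of a parcel as the present value of its net rent.

    With an ad valorem tax t levied each year on the price P,
    P = (R - t*P) / i, which solves to P = R / (i + t).
    """
    return annual_rent / (discount_rate + tax_rate)

rent = 10_000      # expected annual land rent (hypothetical)
i = 0.08           # discount rate (hypothetical)

p_no_tax = land_price(rent, i)                   # 125,000
p_taxed = land_price(rent, i, tax_rate=0.02)     # 100,000

print(f"price without tax: {p_no_tax:,.0f}")
print(f"price with a 2% land value tax: {p_taxed:,.0f}")
# The 2% annual tax is capitalized into a 20% lower price, which is why
# landowners resist it even though each year's payment looks small.
```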
Land Lines: How do you overcome resistance to topics such as land value capture?
Carlos Morales: Any charge tied to increased density is a form of capturing land value increments, as well as a source of infrastructure financing, as the city of São Paulo is doing by charging for additional building rights. The debate over how this policy affects market prices is contentious. Landowners object because it lowers their price expectations; developers, on the other hand, are in favor because it lowers land prices and the payments made to the city come back in the form of public works. A similar situation arose in Bogotá when a tax on land value increments was created.
Both cases are useful references when explaining land value capture in developing countries, although more city cases need to be documented and disseminated, and some practitioners want examples from developed countries. That is not easy, because land value capture is a fashionable term in Latin American circles but not in most developed countries. This does not mean the concept is not used in the United States or elsewhere; rather, it is taken for granted as part of how the land market works. Instructors therefore have a role in highlighting this issue and creating opportunities for practitioners from developed and developing countries alike to share experiences.
Land Lines: What can you tell us about the difficulties of conveying taxation concepts to planners?
Carlos Morales: Planners learn about property taxes when the taxes are high enough to affect the decisions of landowners, developers, and land users, as they are in the United States. In developing countries these taxes are usually so low that they have no impact on market decisions, so planners take no interest in them. When we run games that illustrate how land markets work with architects (who are often also planners) and they realize the city is not going where they expect, their most frequent reaction is to suggest more taxes and more efficient land markets. They almost never propose a traditional land use plan.
Land Lines: In your view, what are the key concepts or ideas that could make a difference in the international debate on urban land markets?
Carlos Morales: Emphasizing that land value capture is an important source of financing for infrastructure and for preventing informal settlement can draw more stakeholders into a serious debate. Ideas centered on security of tenure, property registration, and titling as a way to expand access to credit have dominated policy, but the results have not been as positive as expected. Informal settlements keep growing and service provision continues to lag far behind.
Policies that deal with land taxation and obligations, not only with property rights, have a better chance of improving how urban land markets work. UN-Habitat and the World Bank embraced the earlier notions of security of tenure as a solution, but they are now beginning to show interest in land-based instruments for urban development. Land value capture policies will take effect tomorrow while their political cost is paid today, whereas handing out property titles is cheap and attractive to short-term politicians. That is the challenge we face in the international debate if we are to secure more effective, long-term land market reform.
A major argument in support of land-value taxation is that it creates no incentives for altering behavior in order to avoid the tax. By contrast, a conventional property tax, levied on buildings, can deter landowners from erecting otherwise desirable structures on their land. For example, homeowners may decide against finishing a basement or adding a second bath because it would increase tax liability. Thus, a conventional property tax can lead to excessively low capital-land ratios and “excess burden”—a cost to taxpayers over and above the actual monetary payments they make to the tax authorities. This article reports on a recent study of excess burden resulting from an early British antecedent of the modern property tax—the 17th-century window tax.
The Case of the Window Tax
In 1696, King William III of England, in dire need of additional revenues, introduced a dwelling unit tax determined by the number of windows in an abode. The tax was designed as a property tax, as described by this discussion in the House of Commons in 1850: “The window tax, when first laid on, was not intended as a window tax, but as a property tax, as a house was considered a safe criterion of the value of a man’s property, and the windows were only assumed as the index of the value of houses” (HCD 9 April 1850).
In its initial form, the tax consisted of a flat rate of 2 shillings upon each house and an additional charge of 4 shillings on houses with between 10 and 20 windows, or 8 shillings on houses with more than 20 windows. The rate structure was amended over the life of the tax; in some cases, rates were raised dramatically. In response, owners of dwellings attempted to reduce their tax bills by boarding up windows or by constructing houses with very few of them. In some dwellings, entire floors were windowless, leading to very serious and adverse health effects. In one instance, lack of ventilation led to the death of 52 people in the surrounding town, as reported by a local physician who called on a house inhabited by poor families:
“In order to reduce the window tax, every window that even poverty could dispense with was built up, and all sources of ventilation were thus removed. The smell in the house was overpowering and offensive to an unbearable extent. There is no evidence that the fever was imported into this house, but it was propagated from it to other parts of town, and 52 of the inhabitants were killed.” (Guthrie 1867)
The people protested and filed numerous petitions to Parliament. But, despite its pernicious effects, the tax lasted more than 150 years before it was finally repealed in 1851.
The window tax represented a substantial sum for most families. In London, it ranged from about 30 percent of rents on “smaller houses on Baker Street” to as much as 40 to 50 percent on other streets, according to a House of Commons debate in 1850 (HCD 9 April 1850). The tax was particularly burdensome on poor families living in tenements, where assessors taxed the residents collectively. Thus, if a building contained 2 apartments, each with 6 windows, the building was taxed at a rate based on 12 windows. By contrast, on very large houses of the wealthy, the tax typically did not exceed 5 percent of the rental value.
The tax schedule underwent several significant changes before it was finally repealed. In 1784, Prime Minister William Pitt raised tax rates to compensate for lower taxes on tea. Then in 1797, Pitt’s Triple Assessment Act tripled the rates to help pay for the Napoleonic Wars. The day following this new act, citizens blocked up thousands of windows and wrote in chalk on the covered spaces, “Lighten our darkness we beseech thee, O Pitt!” (HCD 24 Feb. 1848).
England and Scotland were both subject to the window tax, but Ireland was exempted because of its impoverished state. One member of Parliament quipped, “In advocating the extension of the window tax to Ireland, the Honorable Gentleman seemed to forget that an English window and an Irish window were very different things. In England, the window was intended to let the light in; but in Ireland the use of a window was to let the smoke out” (HCD 5 May 1819).
The window tax, incidentally, was viewed as an improvement over its antecedent, the hearth tax. In 1662, Charles II (following the Restoration) imposed a tax of 2 shillings on every fire hearth and stove in England and Wales. The tax generated great resentment largely because of the intrusive character of the assessment process. The “chimney-men,” as the assessors and tax collectors were called, had to enter the house in order to count the number of hearths and stoves. The window tax, by contrast, did not require access to the interior of a dwelling; the “window peepers” could count the apertures from the outside and avoid invading the privacy of the home.
The window tax, however, created some administrative problems of its own—most notably the definition of a window for purposes of taxation. The law was vague, and it was often unclear what constituted a window for tax purposes. In 1848, for example, Professor Scholefield of Cambridge paid tax on a hole in the wall of his coal cellar (HCD 24 Feb. 1848). In the same year, Mr. Gregory Gragoe of Westminster paid tax for a trapdoor to his cellar (HCD 24 Feb. 1848). As late as 1850, taxpayers urged the Chancellor of the Exchequer to clarify the definition of a window.
Notches and Their Effects on Behavior
Throughout its history, the window tax consisted of a set of “notches.” A notch in a tax schedule exists if a small change in behavior—such as the addition of a window—leads to a large change in tax liability.
Notches are rare (Slemrod 2010) and not to be confused with kinks, which are far more common even today. A kink in a tax schedule exists if a small change in behavior leads to a large change in the marginal tax rate but just a small change in tax liability. The income tax in the United States, for example, has several kinks. Married couples with taxable income from $17,850 to $72,500 are in the 15 percent marginal tax bracket; couples with taxable income from $72,500 to $146,400 are in the 25 percent marginal tax bracket. If a couple with income of $72,500 were to earn an extra dollar, its marginal tax rate would jump to 25 percent, but its tax liability would increase by just $.25.
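The contrast can be made concrete with a stylized two-bracket schedule: crossing the kink changes the marginal rate but barely changes the bill. The bracket threshold follows the text; collapsing the lower brackets into a single 15 percent rate is a simplification.

```python
# A small sketch of the kink described above, to contrast with the window
# tax notch discussed next. The schedule is stylized (lower brackets merged
# into one 15 percent rate); only the $72,500 threshold comes from the text.

def income_tax(income):
    """Stylized two-bracket income tax with a kink at $72,500."""
    if income <= 72_500:
        return 0.15 * income
    return 0.15 * 72_500 + 0.25 * (income - 72_500)

# Crossing the kink: the marginal rate jumps from 15% to 25%,
# but the liability rises by only about $0.25.
print(income_tax(72_501) - income_tax(72_500))
```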
Microfilm records of local tax data in the U.K. from 1747 to 1830 allow for a more systematic examination of the impact of the window tax and notches. This article draws on a data set from 1747 to 1757, with information on 493 dwellings from Ludlow, a market town in Shropshire, near the border of Wales. Over this period, the window tax schedule included 3 notches: houses with fewer than 10 windows paid no window tax; houses with 10 to 14 windows paid 6 pence per window on every window; and the per-window rate stepped up again at 15 and at 20 windows.
Homeowners who purchased a 10th window thus paid the 6 pence tax on the 10th window as well as on each of their 9 other windows, which previously had been untaxed. The total tax triggered by the 10th window was therefore 60 pence, which was equal to 5 shillings. If the window tax distorted decisions and thus led to excess burden, then one would expect to find many homes with 9, 14, or 19 windows but very few with 10, 15, or 20. A test of this argument is discussed below.
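The sketch below encodes that notched logic. Only the 6 pence rate for the 10-to-14-window bracket is given in the text; the rates used for the upper brackets are placeholders, since it is the bracket structure, not the exact upper rates, that produces the jumps.

```python
# A minimal sketch of the notched schedule reconstructed above. The rates for
# the 15-19 and 20+ brackets are hypothetical placeholders; only the 6 pence
# rate for the 10-14 bracket comes from the text.

def window_tax_pence(windows, rate_15_plus=9, rate_20_plus=12):
    """Tax in pence: a per-window rate applied to ALL windows once the
    house crosses a bracket threshold, which is what creates a notch."""
    if windows < 10:
        return 0
    if windows < 15:
        return 6 * windows
    if windows < 20:
        return rate_15_plus * windows    # assumed rate
    return rate_20_plus * windows        # assumed rate

for w in (9, 10, 14, 15, 19, 20):
    print(f"{w:2d} windows -> {window_tax_pence(w):3d} pence")
# 9 windows -> 0, but 10 windows -> 60 pence (5 shillings): the jump at the
# notch is why so many houses in the sample stop at 9, 14, or 19 windows.
```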
Through the first half of the 18th century, the administration of the tax had been troublesome, as homeowners frequently camouflaged or boarded up windows until the tax collector was gone, or took advantage of loopholes or ambiguities in the tax code. As a result, tax collections were much lower than expected. In 1747, however, Parliament revised the tax by raising rates and introducing measures to improve its administration. Most notably, it prohibited the practice of blocking up and subsequently reopening windows in order to evade assessment; violators had to pay a penalty of 20 shillings (1 pound) for every window they reopened without notifying the tax surveyor (Glantz 2008).
The 1747 act reduced tax evasion significantly, so the data for the following 10 years should provide reasonable estimates of the actual number of windows. If the window tax distorted behavior, one would expect to find spikes in the number of dwellings at the notches, with 9, 14, or 19 windows. And this is precisely what the data demonstrate. Figure 1 is a histogram showing the number of windows for homes in the sample. The pattern is clear: there are sharp increases in the number of homes with 9, 14, or 19 windows.
Standard statistical tests reject the hypothesis that there are equal numbers of houses with 8, 9, or 10 windows; with 13, 14, or 15 windows; or with 18, 19, or 20 windows. It is manifestly clear that people responded to the window tax by locating at one of the notches so as to minimize their tax liability.
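The test described here is, in essence, a goodness-of-fit test of equal counts. The sketch below shows the form such a test might take; the counts are illustrative placeholders, not the Ludlow data.

```python
# A hedged sketch of a chi-square goodness-of-fit test of the hypothesis that
# houses are spread evenly across 8, 9, and 10 windows.
from scipy.stats import chisquare

counts_8_9_10 = [30, 90, 25]              # hypothetical counts of houses with 8, 9, 10 windows
stat, p_value = chisquare(counts_8_9_10)  # default null hypothesis: equal expected counts
print(f"chi-square = {stat:.1f}, p = {p_value:.4f}")  # a tiny p-value rejects equal counts
```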
Data on a sample of 170 houses for the period 1761 to 1765 shed light on the response to Parliamentary revisions to the tax in 1761. In addition to rate increases, the 1761 revisions expanded coverage of the tax to include houses with 8 or 9 windows. Under the earlier rate structures, houses with fewer than 10 windows paid no window tax. For this second sample, figure 2 shows a large spike at 7 windows: 28.2 percent of the houses have 7 windows, but only 5.2 percent have 6 windows, and just 2.9 percent have 8 windows. Once again, it is easy to reject the hypothesis that there were equal numbers of houses with 6, 7, or 8 windows.
In summary, the evidence from our two samples makes it quite clear that there was a widespread tendency to alter behavior in order to reduce tax payments. People chose the number of windows not to satisfy their own preferences, but to avoid paying higher levels of taxes. The window tax, in short, generated a real “excess burden.”
How Large Was the Excess Burden from the Window Tax?
As discussed, the window tax was substantial and induced widespread tax-avoiding behavior. Based on some standard techniques of economic analysis, our simulation model generates an estimate of what people would have been willing to pay for their preferred number of windows. The model captures each consumer’s demand for windows with and without the tax, the taxes paid, and the loss of welfare from adjusting the number of windows in response to the tax.
In the sample from 1747 to 1757, the estimated welfare losses were very large for households at one of the notches. For them, the welfare loss (i.e., excess burden) is 62 percent of the taxes they paid. That is to say, for every dollar collected under our simulated version of the window tax, the tax imposed an additional burden or cost of 62 cents on these households. The excess burden, not surprisingly, is particularly large for households that chose 9 windows. One criterion economists use to evaluate a tax is excess burden relative to taxes paid. By this standard, a good tax is one that collects significant revenue but leads to very small changes in decisions. Consumers who purchased 9 windows are thus the worst possible case: those consumers paid no tax, so for them the entire burden of the tax is excess burden.
For our entire sample of 1,000 simulated households, the excess burden as a fraction of taxes paid is about 14 percent. Thus for each tax dollar raised by the window tax, our simulation suggests an additional cost of 14 cents to taxpayers as a result of their distorted choices.
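The logic of the simulation can be conveyed with a stripped-down model: each household has a downward-sloping demand for windows, chooses its number of windows with and without the notched tax, and the welfare loss beyond taxes paid is the excess burden. The sketch below is illustrative only; the demand parameters and the resulting ratio are assumptions, not the authors' calibration.

```python
# A minimal excess-burden simulation under a simplified notched window tax.
import numpy as np

rng = np.random.default_rng(0)

def tax(w):
    return 0 if w < 10 else 6 * w          # simplified 1747-style schedule (pence)

def gross_value(w, a, b):
    # total willingness to pay for w windows; the k-th window is worth a - b*k pence
    return sum(a - b * k for k in range(1, w + 1))

excess_burden, taxes_paid = 0.0, 0.0
for a in rng.uniform(40, 120, size=1000):   # heterogeneous demand intercepts (assumed)
    b = 5.0                                  # common demand slope (assumed)
    w_free = max(range(30), key=lambda w: gross_value(w, a, b))           # preferred choice, no tax
    w_taxed = max(range(30), key=lambda w: gross_value(w, a, b) - tax(w)) # choice under the tax
    taxes_paid += tax(w_taxed)
    excess_burden += gross_value(w_free, a, b) - gross_value(w_taxed, a, b)  # forgone window value

print("excess burden as a share of taxes paid:", round(excess_burden / taxes_paid, 2))
```

Households just below the first notch behave exactly as described in the text: they give up windows they value, pay nothing, and so contribute only excess burden to the numerator.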
Some Concluding Remarks
The window tax represents a very clear, transparent case of excess burden: through tax-avoiding adjustments in behavior, it placed heavy costs on taxpayers over and above their tax liabilities. But, as mentioned early on, modern property taxes also create an excess burden, although the consequences are less dramatic than in the case of the window tax.
In designing a tax system, it is important to consider this issue. The ideal, in principle, is a neutral tax that raises the desired revenues but doesn’t distort taxpayer behavior so as to create additional burdens. Such a tax is a pure land-value tax levied on the site value of the land—that is, its value with no improvements. Thus, the assessed value of the land (and hence the tax liability of the owner) is completely independent of any decisions made by the owner of the land parcel. Unlike the window tax, which provides a compelling example of the additional costs that arise when property tax liabilities depend on the behavior of the property owner, a land-value tax creates no incentives for tax-avoiding behavior.
About the Authors
Wallace E. Oates is Distinguished University Professor of Economics, Emeritus, University of Maryland, and University Fellow at Resources for the Future.
Robert M. Schwab is a professor of economics at the University of Maryland.
Resources
Binney, J. E. D. 1958. British Public Finance and Administration, 1774–92. Oxford: Clarendon Press.
Blinder, Alan S., and Harvey S. Rosen. 1985. “Notches.” American Economic Review 75 (September): 736–747.
Dickens, Charles. 1850. Household Words. Vol. 1. London: Bradbury and Evans.
Douglas, Roy. 1999. Taxation in Britain since 1660. London: MacMillan.
Dowell, Stephen. 1884. A History of Taxation and Taxes in England from the Earliest Times to the Present Day. Vols. 2 and 3. London: Frank Cass & Co.
Fielding, Henry. 1975. The History of Tom Jones, A Foundling. Wesleyan University Press.
George, M. Dorothy. 1926. London Life in the XVIIIth century. New York: Alfred A. Knopf.
Glantz, Andrew E. 2008. “A Tax on Light and Air: Impact of the Window Duty on Tax Administration and Architecture, 1696–1851.” Penn History Review 15 (2): 1–23.
Guthrie, Thomas. 1867. “How to Get Rid of an Enemy.” The Sunday Magazine.
HCD (House of Commons Debates). 5 May 1819. Vol. 40 cc 126–148. “Motion for the Repeal of the Window Tax in Ireland.”
HCD. 24 February 1848. Vol. 96 cc 1259–1297. “Lowest Classes Under Assessment.”
HCD. 9 April 1850. Vol. 110 cc 68–99. “Window Tax.”
Kennedy, William. 1913. English Taxation, 1640–1799. London: G. Bell and Sons, Ltd.
Marshall, Alfred. 1948. Principles of Economics, 8th edition. New York: Macmillan.
Neary, J. Peter, and Kevin S. W. Roberts. 1980. “The Theory of Household Behaviour under Rationing.” European Economic Review 13 (January): 25–42.
Sallee, James M., and Joel Slemrod. 2010. “Car Notches: Strategic Automaker Responses to Fuel Economy Policy.” NBER Working Paper 16604. http://www.nber.org/papers/w16604.pdf.
Sinclair, Sir John. 1804. The History of the Public Revenue of the British Empire. London: Strahan and Preston.
Slemrod, Joel. 2010. “Buenas Notches: Lines and Notches in Tax System Design.” Unpublished working paper. http://webuser.bus.umich.edu/jslemrod/pdf/Buenas%20Notches%20090210.pdf.
Smith, Adam. 1937. The Wealth of Nations. New York: Random House.
Walpole, Spencer. 1912. A History of England from the Conclusion of the Great War in 1815. Vol. 5. London: Longmans, Green, and Company.
Weitzman, Martin L. 1974. “Prices vs. Quantities.” Review of Economic Studies 41: 477–491.
The single greatest challenge to any type of land value taxation system is accurate valuation of land on a large scale. In urban areas where nearly all real estate sales data represent transfers of land with improvements, it is difficult to divide prices between land and building components. Although many jurisdictions require a separate listing of land and building values on their tax rolls, these allocations will not affect the final tax bill if the tax rate is the same on both.
Any special tax on land value alone would increase the need to assign more accurate land values to parcels that have been improved over many years. As a result, skepticism as to the feasibility of this process has proven a major stumbling block to serious consideration of two-rate property taxes and other forms of special land taxation. Many observers have concluded that the practical problems of land assessment prevent the realization of the many theoretical benefits it offers.
New advances in computerized approaches to property assessment have important implications for this debate. While land valuation presents special problems in the analysis of sales data for improved parcels, it also can benefit from location analysis and land value mapping techniques. Buildings can and will vary unpredictably in both type and value from lot to lot, but land values for adjoining or nearby parcels should have a more constant relationship to one another. More than 20 years ago, Oliver Oldman of Harvard Law School considered the implications of this situation for an appeals process under a land value tax, recognizing that a successful challenge to one parcel’s valuation would have implications for many other assessments as well. He wrote, “The key to developing an accurate land-value assessment roll is the process of land-value mapping.” Now the technology is available to achieve this goal.
In a recent seminar at the Lincoln Institute, representatives of the Auditor’s Office in Lucas County, Ohio, which includes the city of Toledo, joined a group of economists, appraisers, lawyers and local officials to examine current methods of land valuation. Lucas County has one of the most sophisticated appraisal systems in the country, with almost 20 years of experience in using computerized methods of spatial data analysis for property taxation. The seminar provided a valuable opportunity to discuss the county’s innovative approaches to the integration of geographic information systems and computer-assisted land valuation to estimate the effect of location on real estate market value.
Traditional Methods of Land Valuation
There are several standard methods of deriving a value for unimproved land, all extremely problematic as the basis for jurisdiction-wide assessment.
Comparable Sales: The most straightforward method is an analysis of sales of comparable unimproved land, adjusting the prices to account for any differences in size, location, and features. Similarly, the capitalization of rental income for comparable vacant land can serve as a basis for estimating its sale price. However, these methods are difficult to apply in densely populated urban areas where sales or rentals of unimproved land are rare. The pool of sales data can be expanded if sales of improved land are followed soon after by demolition of the buildings. In that case, the unimproved land value can be estimated as the purchase price minus the costs of the demolition. Although such sales provide an important check for estimated values produced by other approaches, they do not exist in sufficient numbers over a varied enough geographic range to serve as the sole basis for assessment.
Income Analysis: The land residual method begins with an estimate of the income yielded by the developed property. The building value is then calculated, and from that the income attributable to the building is derived. Capitalizing the remaining income then provides a value for the land. However, even a cursory description of this method suggests the difficulties of its application. In particular, the existence of depreciation, or any deviation from highest and best use that would distort the income available to the unimproved land, can leave the independent value of the improvements extremely uncertain.
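A compact numerical illustration of the residual calculation may help. The figures below are hypothetical, and the capitalization rates are assumptions chosen only to show the arithmetic, not guidance for practice.

```python
# A hedged sketch of the land residual method with hypothetical figures.

noi = 120_000             # annual net operating income of the improved property ($)
building_value = 900_000  # estimated value of the building, e.g., by depreciated cost ($)
building_cap_rate = 0.09  # assumed required return on the building investment
land_cap_rate = 0.06      # assumed capitalization rate for land income

building_income = building_value * building_cap_rate  # $81,000 attributable to the building
land_income = noi - building_income                    # $39,000 residual attributable to the land
land_value = land_income / land_cap_rate               # $650,000 estimated land value
print(round(land_value))
```

As the paragraph above notes, any error in the building value or in the assumed rates flows directly into the residual, which is why the method is fragile in practice.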
Cost Analysis: Similar problems confront a division of value according to the depreciated reproduction cost of the improvements. This method assumes that structures can be worth no more than their cost of construction, and assigns all remaining value in the improved parcel to the land itself. Physical, economic or functional depreciation greatly complicates the attempt to calculate building value, however, so this method requires fairly new construction whose price can be confidently estimated as a measure of value. The financial effect of various forms of obsolescence can only be measured accurately through examination of sales data, which will almost never be available for the building alone.
Cost of Development: A full-scale market appraisal of potential development alternatives provides another basis for estimating the sale price of unimproved land. This is the approach taken by developers considering new uses for land, land trusts seeking to acquire and preserve undeveloped open space, and taxpayers claiming deductions for charitable contributions of development rights. However, it is most suitable for valuing undeveloped land to be used for residential subdivisions. Even in these situations, it requires extensive study of the potential market for such properties, local restrictions on development, and the physical attributes of the land that would affect its building capacity, such as soil and drainage characteristics. This type of exhaustive individual appraisal is appropriate for purchasers or developers of individual parcels, but is not feasible for annual assessments for all parcels in a taxing jurisdiction.
Other valuation methods, such as derivation of typical ratios of site value to total improved property value, are even less useful in the case of densely developed urban property, where buildings of all sizes, ages and utility may be found in close proximity on fairly similar parcels of land.
New Approaches: CAMA and GIS
The greatest change in assessment practice over the past three decades has involved the use of computers and mathematical formulas to establish a relationship between property characteristics and sale prices, thereby permitting an estimate of the market value of other properties not subject to a recent sale. This approach is known as computer-assisted mass appraisal (CAMA). Site characteristics such as size and location are important elements of these mathematical models, raising the possibility of estimating the effect of location on parcel value.
At the same time, the development of computerized geographic information systems (GIS) has permitted assessors to develop location-based property records or cadastres, and to coordinate sales data with location. More sophisticated and less expensive GIS technology now offers the potential for full integration with CAMA for spatial analysis. Initial attempts to quantify location effects faced difficulties not only in defining and maintaining “economic neighborhoods” or zones, i.e., contiguous areas of relatively homogeneous land values, but also in understanding the dynamics of the interactive, elusive locational factor. Some efforts developed different mathematical models for each geographic region or “cluster” of properties with similar characteristics. However, these approaches could not capture the many complex, interrelated and significant micro-variations within any given neighborhood, and could not reduce the determination of location value to an objective process.
Lucas County pioneered a new approach to location value: the use of GIS tools to develop a response surface that represents the effect of location on land value. The response surface is a fitted three-dimensional surface that represents a percentage adjustment to land and/or land and improvements based on a parcel’s geocoded location. Included in the analysis are geographic coordinates and distances from important features, such as other recent sales, institutions, amenities or other “value influence centers.” This analysis results in a three-dimensional representation, with the height of the surface (z) at any specific x-y coordinate indicating the approximated location value of that parcel. This variable is then evaluated with others, such as land and building size, quality, condition and depreciation, to produce a total estimated value for the parcel.
In the Lucas County example, the response surface differs from a mathematical equation in that it is developed through a spatial analysis process available in GIS to estimate the effects of location on value and refine those estimates after comparing them with sales and appraisal data. This approach still relies on an element of appraisal and economic judgment in determining neighborhood boundaries for location effects, but it can be tested and refined by observing the effect of different neighborhood “breaklines” on the resulting three-dimensional value surface.
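The mechanics can be sketched in a few lines of code. The example below is a simplified stand-in for the county’s methodology, not its actual algorithm: it smooths the ratio of observed sale prices to a hypothetical non-spatial baseline estimate over geographic coordinates, and reads the fitted surface off at any location as a percentage adjustment. All data and parameters (coordinates, bandwidth, the simulated gradient) are illustrative assumptions.

```python
# A minimal response-surface sketch: kernel-smoothed location adjustments.
import numpy as np

rng = np.random.default_rng(1)
n = 500
xy = rng.uniform(0, 10, size=(n, 2))             # geocoded sale locations (km, simulated)
baseline = rng.uniform(80_000, 300_000, size=n)  # non-spatial CAMA estimates ($, simulated)
true_location_effect = 1 + 0.03 * xy[:, 0]       # hypothetical east-west value gradient
sale_price = baseline * true_location_effect * rng.normal(1, 0.05, size=n)
ratio = sale_price / baseline                    # location residual observed at each sale

def location_factor(x, y, bandwidth=1.0):
    """Kernel-weighted average of nearby sale ratios: the height of the fitted
    surface at (x, y), interpreted as a percentage adjustment to baseline value."""
    d2 = (xy[:, 0] - x) ** 2 + (xy[:, 1] - y) ** 2
    w = np.exp(-d2 / (2 * bandwidth ** 2))
    return np.sum(w * ratio) / np.sum(w)

print(round(location_factor(2.0, 5.0), 3))   # close to the simulated ~1.06 gradient at x = 2
print(round(location_factor(8.0, 5.0), 3))   # close to ~1.24 at x = 8
```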
To be used successfully in mass appraisal, these sophisticated approaches must yield results that are reasonable, understandable and available to typical taxpayers. Lucas County has pioneered this aspect of the assessment process, as well. All real estate records, values and maps are available on a CD with GIS viewing software, priced at its production cost of $10, and online free at all public libraries in the county. Taxpayers can view property records or create customized maps showing the location of multiple parcels and the relationships among their taxable values.
Future Directions
Participants in the Lincoln Institute seminar found great promise in the Lucas County approach to location value, and identified many points for further development and investigation. All agreed that recent decades have seen a revolution in assessment practice, with great potential for increasing the feasibility of large-scale land valuation. Among the most important theoretical questions were the “functional form” of this spatial analysis, including the type of effect on value observed with changes in location and distance variables; the identification of omitted variables (those for which data is not available or which have been overlooked in the past); and the relationship between marginal value estimates and the total parcel value needed for assessment. Similarly, the effect of substandard buildings and less than “highest and best use” on values requires further exploration.
Development of these new approaches must be matched by educational efforts to explain their operation to taxpayers, local officials, and the lawyers and judges who will consider their consistency with legal standards for assessment practice. Through its innovative efforts in both of these areas, Lucas County has made an important contribution to the theory and practice of land valuation.
Jerome C. German is the chief assessor for Lucas County, Ohio. Dennis Robinson is vice president of programs and operations at the Lincoln Institute. Joan Youngman is senior fellow and director of the Institute’s Program on Taxation of Land and Buildings.
References
International Association of Assessing Officers. 1990. Property Appraisal and Assessment Administration.
Oldman, Oliver, and Mary Miles Teachout. 1980. “Valuation and Appeals Under a Separate Tax on Land.” Assessor’s Journal 15 (March): 43–57.
Ward, Richard D., James R. Weaver, and Jerome C. German. 1999. “Improving CAMA Models Using Geographic Information Systems/Response Surface Analysis Location Factors.” Assessment Journal 6 (January/February): 30–38.
Lucas County website: www.co.lucas.oh.us
When a handful of judges and tax experts met around a table at the Lincoln Institute’s original Moley House headquarters in January 1980, they could hardly have foreseen that their one-day seminar would initiate the major educational program for the nation’s state tax judges. Over the past 30 years, the Lincoln Institute has sponsored the annual National Conference of State Tax Judges as a means of improving land-related tax policy as applied and interpreted by adjudication in tax courts and tax appeal tribunals.
Participants in the conference include members of administrative and judicial tax tribunals who hear appeals of tax assessments, denials of refund requests, or other property tax disputes on a jurisdiction-wide basis in all 50 states, the District of Columbia, and the cities of Chicago (Cook County) and New York.
Tax Tribunals and Effective Tax Policy
Taxation is a highly complex and emotionally charged reality in countries throughout the world. Patrick Doherty, the past president of the United Kingdom’s Institute of Revenues, Rating, and Valuation, has observed that to be acceptable, it is not sufficient or sometimes even necessary for a tax to meet an abstract standard of fairness. The essential element is that the levy “feel fair” to the taxpayer. An open, accessible, timely, and unbiased forum for appeals is crucial to this sense of fairness.
Tax tribunals play a vital role in affording aggrieved taxpayers a fair opportunity for consideration of their claims. Legislation and regulation enunciate policy at general and sometimes abstract levels. The actual application of the law to specific factual situations and disputes effectuates that policy. Tax tribunal interpretations can carry the force of law and precedent. Judicial education is an investment in sound tax policy that, by reaching a targeted and highly influential audience, can reap public benefits long into the future.
Although consideration of appeals is central to any tax system, the special nature of the property tax increases the importance of access to a fair and equitable tribunal. Unlike income taxes withheld from one’s salary or sales taxes collected as part of numerous transactions and never totaled for the taxpayer, property taxes generally require significant and highly visible periodic payments. Furthermore, the intricacies of the income tax leave most taxpayers without any intuitive sense of potential errors in its calculation. The property tax, on the other hand, is usually based on fair market value, a dollar figure that a homeowner may be able to estimate with some precision.
The property tax is primarily a local tax that engenders a greater sense of personal involvement than levies that support more distant levels of government. Not incidentally, the property tax is also the primary focus of much taxpayer discontent. It is easier to mobilize opposition to a visible local taxing body than to state or federal revenue departments. State legislators are also far more receptive to pressure to constrain local revenues than to efforts to reduce their own budgets.
At the same time, property tax disputes may involve large business enterprises owning complex structures whose valuation may raise highly theoretical or technical questions. Courts may need to determine the value of manufacturing plants that can constitute the major portion of a locality’s tax base, or of property that is part of a going concern whose sale price represents intangibles such as business good will.
The Special Challenges of Judicial Education
The importance of judicial education is matched by the challenge of presenting ongoing, impartial, and up to date instruction, especially with regard to specialized subject matter. Judges grappling with complex factual and legal issues often face a lonely task, and specialized tribunals such as tax courts can be particularly isolated. Some state tax courts have only one member.
Before the establishment of the National Conference of State Tax Judges, no national association or other formal means existed for judges in one state to confer with colleagues facing similar issues in another jurisdiction. Training in highly technical tax issues requires faculty with specialized expertise and a sophisticated understanding of sometimes arcane provisions.
At the same time, judges must be far more cautious than lawyers in private practice when they seek advice and instruction. They must avoid even the appearance of special access or private influence, particularly when dealing with specialists who may at some point serve as expert witnesses or litigants in their courtrooms. Outside the judicial realm, many educational conferences are supported by commercial sponsors with special ties to the subject matter, but in the tax area these organizations and corporations would be the most likely to have a stake in future litigation. Public funding for judicial education is rarely a legislative priority, even when budgets are relatively generous, and is among the first items to be curtailed in economic downturns.
All of these factors present special hurdles to the development of effective, ongoing educational programs for members of tax tribunals. The success of the National Conference of State Tax Judges speaks to both the effectiveness of the Lincoln Institute’s support and the enthusiastic work of judges across the country who have found it a professionally and personally rewarding means of creating ties with colleagues in other states.
The judges who volunteer to produce each year’s program are the cornerstone of the organization. The planning committee members begin monthly telephone meetings soon after the close of the previous conference so they can evaluate participant suggestions and consider topics of particular current importance. Individual judges then contact scholars and professionals who can address these issues. Faculty members include academic experts in law and economics as well as legislators, policy analysts, appraisers, journalists, and specialists in fields such as housing and commercial property markets.
The central role of the judges themselves is highlighted by a session on case law updates. One of the most lively and interactive parts of the program, this forum offers the participants an opportunity to describe and comment on recent decisions of special interest in their own states. Judges submit these cases in the months preceding the conference to the session moderators, who have responsibility for choosing the cases to be examined and guiding the discussion. Moderating a conference of judges carries special challenges of its own.
Wide Scope within a Specialized Area
The range of topics covered by the conference reflects underlying property valuation problems confronted in changing economic, social, and technological conditions. In the early years, discussions included utility properties, office complexes, and rental apartments, whereas now the focus is likely to be subsidized housing, golf courses, or nonprofit landowners. More complex methods of valuation, made possible by the introduction of computer-based techniques, have also entered the agenda. What approach to value is best suited to a specific type of property? What properties are exempt from property taxation? How does one value properties that are partially taxable and partially exempt, such as a structure that includes a hospital (exempt) and doctors’ offices (taxable)?
Tax exemptions for charitable institutions are often expressed in legislative language reflecting an earlier and far starker division between commercial and nonprofit enterprises, such as hospitals. Retirement communities that require significant initial and continuing payments may not fit the pattern of a “home for the aged,” just as a traditionally exempt YMCA may seem a commercial competitor to a neighboring health club. The continuing challenge of distinguishing exempt and taxable property takes on new forms over time.
Many similar issues of statutory interpretation and the application of legal analysis to new factual situations have been the subject of ongoing examination at the annual conference.
In addition to valuation questions of this sort, the conference agenda includes a session on developments concerning state income and sales taxes, and presentations on such topics as judicial writing, ethical issues, and the effect of business cycles on property markets. Judges serving on appellate courts and state supreme courts have also contributed their perspectives on tax decisions that reach them on appeal.
Times of economic stress also require special attention to case management, as property tax appeals increase when market values fall. Courts require flexibility and excellent procedures to offer taxpayers accessible, timely determinations under these circumstances, particularly since economic downturns usually bring reductions in state budgets, and the need to do more with less.
In times of rapidly rising prices, taxpayers are often pleased to see an assessment figure that has not kept pace with market values. In downturns, taxpayers are more likely to be disturbed if bills reflect an earlier assessment date when prices were higher. The Minnesota Tax Court, for example, saw appeals rise by more than 60 percent, from 2,954 in 2008 to 4,760 in 2009 (through November). The state’s Chief Judge George Perez, the current chair of the National Conference of State Tax Judges, explained, “There’s a direct inverse relationship between Tax Court filings and the economy. When the economy is down, numbers are up. When the economy is up and healthy, the numbers begin to drop.”
Evidence of Impact
Strong ties have been formed among the members of tax courts and tribunals from over 30 states and the cities of New York and Chicago who have participated in the judges’ conference. These connections, as much as the technical material covered at the conference, have provided a means of improving the judicial process. Many participants have observed that the conference has produced a positive impact on the quality of tax decisions.
Glenn Newman, president of the New York City Tax Commission and Tax Appeals Tribunal, says, “It is remarkable how many issues cut across state lines. The issues we all face can be discussed and analyzed, helping all participants focus and come to a clearer understanding.”
Michelle Robert, counsel to the Maine State Board of Property Tax Review, states, “This is a wonderful conference that offers attendees a forum for exchange of ideas and viewpoints in this area of the law that is truly not otherwise available.”
Chief Judge George Perez of the Minnesota Tax Court, the current conference chair, sums up these perspectives: “It’s the best seminar given for tax judges and tax officials.”
The conference’s contribution to improved tax policy is a tribute to the foresight of those who gathered at Moley House 30 years ago, and the current participants who are eager to carry that benefit to future tax judges as well.
“Through informal as well as formal discussions at the annual meetings, and frequent phone calls during the rest of the year, judges have been able to tap the knowledge, experience, and intelligence of their colleagues all over the nation. The National Conference has provided an introduction and a phone number that have helped many of its members make a better decision concerning issues that at an earlier point seemed intractable. These personal connections may be one of the Lincoln Institute’s greatest contributions to the improvement of tax policy through the improvement of state and local tax adjudication.”
—Joseph C. Small
About the Authors
Joseph C. Small is the retired presiding judge of the Tax Court of New Jersey and a former chair of the National Conference of State Tax Judges.
Joan Youngman is an attorney and senior fellow and director of the Department of Valuation and Taxation at the Lincoln Institute of Land Policy.
Most jurisdictions require residential assessments to be proportional to market value, but in practice assessment ratios—assessed value divided by sale price—are often lower for high-priced than low-priced properties. This tendency for assessment ratios to fall as sales prices rise is termed regressivity, because it means that property taxes are a higher percentage of property value for lower-priced properties. Regressive assessments have been identified in many jurisdictions and times (such as Cornia and Slade 2005; McMillen and Weber 2008; and Plummer 2010).
Assessment regressivity is an important issue because it has the potential to undermine support for a property tax system. Consider a simple system in which taxes are 1 percent of a home’s assessed value, with no exemptions or deductions. For example, a $100,000 home should have a $1,000 tax bill, and a $1 million home a $10,000 tax bill. However, it is not uncommon to find that a $1 million home is actually assessed at $800,000 or $900,000, resulting in effective tax rates of 0.8 or 0.9 percent rather than the statutory 1 percent.
Having lower-than-prescribed assessment rates for some high-priced properties may result in greater variability in assessments within price groups. One owner of a high-priced home may accept a $1 million assessment as an accurate measure of market value, while another owner may appeal and win a lower assessment. Different tax bills for identical properties can cause taxpayer resistance and resentment.
The Assessment Process in Illinois
I have analyzed data from two counties in the Chicago metropolitan area that provide quite different perspectives on assessment regressivity. In suburban DuPage County, assessment ratios decline uniformly with sales prices and there is no marked difference in the degree of variability in assessments across the range of sales prices. In the City of Chicago, which is part of Cook County, the degree of variability in assessment ratios is greater than the degree of regressivity. Notably, assessment ratios in Chicago are highly variable at low and very high sales prices, while not varying greatly with mid-range sales prices.
Illinois has a simple flat-rate property tax, but the homestead exemption produces a degree of progressivity. This exemption is generally a flat amount that does not vary by price, although Cook County has an “alternative general homestead exemption” that can make the exemption higher in areas with rapid price appreciation. The basic homestead exemption is designed to produce much lower effective tax rates for low-priced properties—where the exemption is often high relative to market value.
Assessment practices in DuPage County are similar to those in all but one of the 102 counties in Illinois, where properties are assessed on a four-year cycle at 33 percent of market value. In DuPage County, properties were most recently assessed in 2007 and new assessments will be established in 2011. Cook County alone has a classified system with varying statutory assessment rates. Prior to 2009, the statutory rates were 16 percent for residential properties, 38 percent for commercial, and 36 percent for industrial, although actual assessment rates were much lower. In 2009, the statutory rates were “recalibrated” to 10 percent for residential and 25 percent for commercial and industrial properties. Cook County assesses its properties on a rotating, three-year cycle. The City of Chicago was last reassessed in 2009, and all city properties will be reassessed again in 2012. Properties in the north suburban part of Cook County were reassessed in 2010, and south suburban properties will be reassessed in 2011.
Traditional Measures of Regressivity
The importance of assessment regressivity has led the International Association of Assessing Officers (IAAO 2007) to recommend that an analysis of regressivity be included as part of any study of assessment accuracy. One common procedure recommended by the IAAO to evaluate assessment regressivity is a descriptive statistic, the price-related differential (PRD), which is the ratio of the simple mean assessment ratio to a comparable statistic that places more weight on higher-priced properties. Typically this ratio is greater than one, which implies that higher-priced properties have lower average assessment ratios than lower-priced homes.
Table 1 presents traditional IAAO measures of residential assessment performance for the most recent reassessment year for which I have data—2006 in Chicago and 1999 in DuPage County. The data on sales prices and assessed values come from the Illinois Department of Revenue, which is responsible for monitoring assessment performance for all counties in the state. I focus on Chicago rather than all of Cook County to keep the sample size more manageable, to focus on a single assessment year, and to avoid combining the county’s three assessment districts.
Chicago’s average assessment rate (mean) of 9.4 percent differs significantly from the statutory value of 16 percent. In DuPage County, the average assessment rate of 29.8 percent is much closer to the statutory 33 percent rate, and it would likely be even closer if the timing of the sales prices and assessment origination dates were closer. The value-weighted mean is calculated by weighting each observation by its sale price. The finding that the value-weighted mean is less than the arithmetic mean implies that higher-priced properties tend to have lower than average assessment ratios in both counties.
The price-related differential (PRD), which is the ratio of the arithmetic mean to the value-weighted mean, formalizes this measure. IAAO standards call for the PRD to be no higher than 1.03; by this standard, DuPage County’s degree of regressivity is acceptable while Chicago’s is not. The coefficient of dispersion (COD) is the traditional measure of assessment variability. By IAAO standards for residential properties, the COD should not exceed 15. Again, Chicago’s COD indicates excessive variability while DuPage County’s degree of variability is within IAAO’s acceptable range.
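For readers who want to reproduce these diagnostics, both statistics are straightforward to compute. The sketch below uses Python with simulated, illustrative data rather than the Chicago or DuPage County samples; the article’s own tools are distributed in the R package aratio noted at the end.

```python
# A hedged sketch of the two IAAO statistics on simulated assessment data.
import numpy as np

rng = np.random.default_rng(2)
price = rng.lognormal(mean=12, sigma=0.5, size=2000)   # simulated sale prices
ratio = 0.33 - 0.02 * (np.log(price) - 12) + rng.normal(0, 0.03, size=2000)  # mildly regressive ratios

mean_ratio = ratio.mean()
weighted_mean = np.average(ratio, weights=price)   # value-weighted mean ratio
prd = mean_ratio / weighted_mean                   # PRD > 1.03 suggests regressivity
median_ratio = np.median(ratio)
cod = 100 * np.mean(np.abs(ratio - median_ratio)) / median_ratio  # dispersion about the median

print(f"PRD = {prd:.3f}, COD = {cod:.1f}")
```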
Statistical Analysis of Regressivity
A second IAAO-recommended procedure to measure regressivity is a statistical regression of a sample of assessment ratios on sales prices, which typically produces a negative coefficient for the price variable, i.e., a downward sloping line. This type of analysis provides estimates of the conditional expectation of the assessment ratio for any given sale price. Although several approaches exist in the literature, the basic idea is to estimate a function that produces a simple relationship between sales prices and assessment ratios. If the function implies that assessment ratios decline with sales prices, the assessment pattern is said to be regressive.
Figure 1 shows the estimated functions when assessment ratios are regressed on sales prices using data from Chicago and DuPage County. The straight lines are simple linear regressions. The curved lines are a nonlinear estimation procedure—a locally weighted regression technique that estimates a series of models at various target values, placing more weight on values closer to the target points. For example, to estimate a regression with a target point of $100,000, one might use only observations with sales prices between $75,000 and $125,000, with more weight placed on sales prices closer to $100,000.
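A minimal sketch of this kind of locally weighted fit appears below, using the lowess smoother from the statsmodels library on simulated (price, ratio) pairs. The data-generating process and the bandwidth (frac) are illustrative assumptions, not the specification behind figure 1.

```python
# A hedged sketch of a locally weighted regression of assessment ratios on price.
import numpy as np
from statsmodels.nonparametric.smoothers_lowess import lowess

rng = np.random.default_rng(3)
price = np.sort(rng.uniform(50_000, 1_000_000, size=1500))
ratio = 0.12 - 0.03 * (price / 1_000_000) + rng.normal(0, 0.02, size=1500)

smoothed = lowess(ratio, price, frac=0.3)  # array of (price, fitted ratio) pairs, sorted by price
print(smoothed[:3])                        # fitted assessment ratios at the lowest prices
```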
The linear and locally weighted regression estimates are much more discrepant for Chicago’s data set than for DuPage County’s. While both approaches indicate that assessment ratios fall with sales prices, the nonlinear procedure indicates that expected assessment ratios are extremely high in Chicago at very low sales prices—but still below the statutory rate of 16 percent.
The regression lines imply precise relationships, but they do not address differences in the degree of variability at different sales prices. It may be that both unusually high and unusually low prices are simply hard to assess accurately. If so, assessment ratios could have high variances at both low and high sales prices while being tightly centered on statutory rates near the mean sale price. Neither the traditional PRD statistic nor standard regression procedures are well-suited for analyzing a situation where the accuracy of the assessment process varies with sales prices.
Quantile Regressions Using Simulated Data
Another statistical procedure, quantile regression, provides much more information on the relationship between assessment ratios and sales prices by showing how the full distribution of ratios varies by price. The easiest way to understand quantile regression is to imagine two data sets, A and B, where both have 10,000 observations. Each observation represents a sale price and assessment ratio pair, but sales prices are constrained to integers between 1 and 10 (figure 2).
In constructing data set A, a sale price is assigned, and then an assessment ratio is drawn from a normal distribution with a mean (and median) of 0.33 (the statutory rate in DuPage County). Data set A then matches the assumptions of a classical regression model, where the variance of the assessment ratios is constant across all values of sales prices. In constructing data set B, however, the variance of the assigned assessment ratio is higher for lower sale price levels, but the mean is constant and equals 0.33 at each price.
In both data sets the estimated linear regression is simply a flat line at the mean assessment ratio of 0.33, indicating no relationship between sale price and assessment ratio. If these regressions were estimated using real data, they would be interpreted as showing that assessments are proportional to sales prices, i.e., neither regressive nor progressive. Despite this finding, figure 2 clearly shows that in data set B assessment ratios converge on the statutory 33 percent rate at high sales prices, whereas homes with low sales prices run the risk of having extremely high assessment ratios.
Quantile regression estimates reveal the differences between data sets A and B in the degree of assessment ratio variability, and this approach can be estimated at any target value of the assessment ratio distribution. For example, since the 10 percent and 90 percent quantile lines are converging as sales prices increase, the quantile regression reveals what standard regression procedures do not—low sales prices have highly variable assessments and high sales prices have more precise assessments.
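The contrast between data sets A and B can be reproduced with a short simulation. The sketch below is a Python approximation using statsmodels (rather than the R tools noted at the end of the article); the variance schedule for data set B is an illustrative assumption chosen only to mimic the pattern described above.

```python
# Simulated data sets A and B, compared with OLS and quantile regression.
import numpy as np
import statsmodels.api as sm

rng = np.random.default_rng(4)
price = rng.integers(1, 11, size=10_000).astype(float)   # integer "prices" from 1 to 10
ratio_a = 0.33 + rng.normal(0, 0.05, size=10_000)        # data set A: constant variance
ratio_b = 0.33 + rng.normal(0, 0.02 + 0.10 / price)      # data set B: higher variance at low prices

X = sm.add_constant(price)
for name, y in [("A", ratio_a), ("B", ratio_b)]:
    ols_slope = sm.OLS(y, X).fit().params[1]
    q10_slope = sm.QuantReg(y, X).fit(q=0.10).params[1]
    q90_slope = sm.QuantReg(y, X).fit(q=0.90).params[1]
    print(name, "OLS:", round(ols_slope, 4),
          "10%:", round(q10_slope, 4), "90%:", round(q90_slope, 4))

# In both data sets the OLS slope is near zero; only in B do the 10th and 90th
# percentile lines converge (positive slope at the 10th, negative at the 90th).
```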
Quantile Regressions for the City of Chicago and DuPage County
In practice, linear regression, locally weighted regression, and a linear version of quantile regression all proved too restrictive to represent accurately the relationship between assessment ratios and sales prices in Chicago and DuPage County, especially for extremely low and extremely high sales prices. Instead, a nonlinear version of quantile regression provides the most accurate representation of the underlying relationship.
Figure 3 shows the results of nonlinear versions of the quantile regressions, which can be estimated at a series of target points, with more weight given to observations that are near the targets. From bottom to top, the graphs show the estimated 10, 25, 50, 75, and 90 percent quantile regression lines.
Chicago’s results suggest that assessment ratios are relatively high at all quantiles for quite low prices, but the high variability is evident in the large spread between the 10 and 90 percent quantile lines. However, as the sale price increases from about $250,000 to nearly $800,000, the regression lines are close to horizontal. The variability is also low in this range. The quantile lines begin to have a downward slope again for prices above $800,000, with a moderate increase in the variance. Thus, the Chicago results suggest that the standard analysis of regressivity is misleading in that most of the regressivity is concentrated at low sales prices where the variance is also quite high.
In contrast, DuPage County has relatively high assessment ratios and lower variances in the $100,000–$200,000 range of prices where most sales took place in 1999. Assessment ratios decline with sale price for all prices beyond about $100,000, while the variance is increasing. The pattern of results for DuPage County is closer to what is implicitly assumed in a standard regression analysis of assessment regressivity.
Assessment Ratio Distributions at Alternative Sales Prices
An alternative to quantile regression is to examine the actual distribution of assessment ratios at a variety of different target values for sales prices to see how assessment ratios vary at given sales prices. Since most of the interesting patterns occur at low sales prices, figure 4 shows estimated conditional density functions for sales prices ranging from $50,000 to $200,000. The density function for Chicago has a huge variance at a sale price of $50,000. As the price increases to $100,000, $150,000, and finally $200,000, the density function moves to the left, meaning that lower assessment ratios become more common—an indication of regressivity. The distribution is also much more tightly clustered around the mean value of 9–10 percent, which indicates that the variance is reduced substantially.
In the contrasting case of DuPage County, the conditional density functions simply shift to the left as the target sale price increases with no pronounced change in variance. This parallel leftward shift of the conditional density function shows what would be predicted by a classic regression analysis of a regressive assessment system.
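The idea can be sketched as follows: restrict attention to sales near a target price and estimate a kernel density of the assessment ratios in that window. The simulated data, window width, and evaluation grid below are illustrative assumptions, not the estimator behind figure 4.

```python
# A hedged sketch of conditional densities of assessment ratios at target prices.
import numpy as np
from scipy.stats import gaussian_kde

rng = np.random.default_rng(5)
price = rng.uniform(25_000, 250_000, size=5_000)
ratio = 0.12 - 0.02 * (price / 250_000) + rng.normal(0, 0.05 * 50_000 / price)

for target in (50_000, 100_000, 150_000, 200_000):
    near = np.abs(price - target) < 15_000        # crude window around the target price
    density = gaussian_kde(ratio[near])           # kernel density of ratios in that window
    grid = np.linspace(0.0, 0.3, 7)
    print(target, np.round(density(grid), 2))     # density evaluated at a few ratio values
```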
Implications for Property Taxes
Assessment regressivity has important implications for individual tax bills, as exemplified in a simplified analysis of residential taxes in Cook County. Though not a literal representation of the county’s tax system, the analysis is a close approximation. The starting point for table 2 is the estimated market value of $100,000, which is assumed to be accurate. Although the statutory assessment rate in Cook County was 16 percent prior to 2009, I use an assessment rate of 10 percent because it is closer to the actual rate and matches the recent recalibration. Thus, the proposed assessed valuation for the property is $10,000.
However, Illinois also requires that assessments across the state must average 33 percent of market value. If assessments average less than 33 percent—as is mathematically a near certainty under Cook County’s classification system—the Department of Revenue calculates an equalization factor by which all assessments are multiplied. Using a representative value of 2.7 for the multiplier in table 2, the $10,000 assessment turns into an adjusted equalized assessment value of $27,000. Finally, the standard homestead exemption of $5,500 (again, a representative value) is subtracted to produce the base for the homeowner’s property tax bill. Thus, the sample tax rate of 10 percent and the adjusted equalized assessed value of $21,500 produce a tax bill of $2,150.
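The arithmetic of table 2 can be written as a short function, using the representative figures cited above (a 10 percent assessment level, a 2.7 equalization factor, a $5,500 homestead exemption, and a 10 percent tax rate). This is a simplified sketch of the calculation, not the county’s actual billing system.

```python
# A worked sketch of the table 2 calculation with the representative figures from the text.

def cook_county_tax(market_value, assessment_rate=0.10, equalizer=2.7,
                    homestead_exemption=5_500, tax_rate=0.10):
    assessed = assessment_rate * market_value           # proposed assessed valuation
    equalized = assessed * equalizer                     # adjusted equalized assessed value
    taxable = max(equalized - homestead_exemption, 0)    # subtract the homestead exemption
    return tax_rate * taxable

print(cook_county_tax(100_000))                          # 2,150.0 -- the table 2 example
print(cook_county_tax(500_000))                          # 12,950.0 -- accurately assessed $500,000 house
print(cook_county_tax(100_000, assessment_rate=0.14))    # 3,230.0 -- an over-assessed $100,000 house
```

The same function reproduces the table 3 comparisons discussed below, simply by varying the assessment rate for each house value.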
Table 3 compares house values and property tax rates under the assumption that assessments are regressive and are more variable for $100,000 houses than for $500,000 houses. Due to the homestead exemption, the property tax is somewhat progressive even when assessments are proportional to market value. Thus, a $100,000 house that is accurately assessed at 10 percent of market value ($10,000) ends up with a tax bill of $2,150 or an effective tax rate of 2.15 percent, while a $500,000 house that is assessed correctly at $50,000 has a tax bill of $12,950, or 2.59 percent of market value.
But, suppose that assessment rates for $100,000 homes actually range from 9 to 14 percent, while the range for $500,000 homes is only 8 to 12 percent. In this case, the progressivity of the homestead exemption can be reversed completely. Owners of low-priced homes who are “unfortunate” enough to receive high assessments end up with an effective tax rate of 3.23 percent, which is much higher than the 2.59 percent rate paid by owners of $500,000 homes assessed at the average 10 percent level, and even higher than the 3.13 percent rate paid by owners of high-priced homes assessed at 12 percent.
Moreover, actual tax payments vary significantly for otherwise identical homes: from $1,880 to $3,230 for $100,000 houses and from $10,250 to $15,650 for $500,000 homes. In other words, a homeowner may receive a tax bill more than 70 percent higher than that of the neighboring house even if both homes have a market value of $100,000.
Conclusion
Because assessment accuracy is the key to an equitable property tax, statistical measures of regressivity are essential tools for evaluating property valuation systems. Standard measures of regressivity can present an incomplete or even misleading picture of the range of assessment ratios in a jurisdiction. Newer analytic tools such as quantile regression can improve our understanding of the distribution of tax burdens and in this way help improve assessment equity.
Note: The statistical tools used in this article are included in a contributed extension package for the statistical program R. The package (aratio) is designed to be accessible to people who have limited knowledge of the R program but are familiar with other statistical software packages. Both R and aratio can be downloaded at no charge from www.r-project.org.
About the Author
Daniel P. McMillen is a professor at the Institute of Government and Public Affairs and in the Department of Economics at the University of Illinois. He is also a visiting fellow of the Lincoln Institute’s Department of Valuation and Taxation in 2010–2011.
References
Cleveland, William S., and Susan J. Devlin. 1988. Locally weighted regression: An approach to regression analysis by local fitting. Journal of the American Statistical Association 83(403, September): 596–610.
Cornia, Gary C., and Barrett A. Slade. 2005. Property taxation of multifamily housing: An empirical analysis of vertical and horizontal equity. Journal of Real Estate Research 27(1, January/March): 17–46.
International Association of Assessing Officers (IAAO). 2007. Standard on ratio studies. Kansas City, MO: IAAO.
McMillen, Daniel P., and Rachel Weber. 2008. Thin markets and property tax inequities: A multinomial logit approach. National Tax Journal 61(4, December): 653–671.
Plummer, Elizabeth. 2010. The effect of land value ratio on property tax protests and the effects of protests on assessment uniformity. Working Paper. Cambridge, MA: Lincoln Institute of Land Policy.