The Consortium for Scenario Planning invites you to register for our fifth annual conference, a virtual gathering that will run from February 3 to 4, 2022.
Building on last year’s successful event, this year’s conference will focus on how scenario planning can help communities better prepare for and reduce the impacts of climate change.
The extreme weather events of summer 2021 and the IPCC’s Sixth Assessment highlighted some of climate change’s most disastrous impacts and underscored the urgency of accelerating climate action—especially in the face of far-reaching, uncertain, and varying localized effects on land, equity, housing, health, transportation, and natural resources.
Scenario planning offers a robust way for cities and regions to prepare and plan for this uncertain future.
The 2022 Consortium for Scenario Planning conference will feature presentations from practitioners, consultants, and academics showcasing cutting-edge advances in the use of scenarios for climate action. Conference sessions will be eligible for AICP Certification Maintenance credits.
Adaptation, Climate Mitigation, Disaster Recovery, Economic Development, Environmental Planning, Floodplains, GIS, Infrastructure, Land Use, Land Use Planning, Planning, Resilience, Scenario Planning
A Cartographic Meditation
Mapping the Colorado River Basin in the 21st Century
Where is the Colorado River Basin? A novice attempting a cursory Google search will be surprised—and perhaps frustrated, confused, or a little of both—to find that there is no simple answer to that question. Winding through seven U.S. states and two states in Mexico—and supporting over 40 million people and 4.5 million acres of agriculture along the way—the Colorado River is one of our most geographically, historically, politically, and culturally complex waterways. As a result, creating an accurate map of the basin—the vast area of land drained by the river and its tributaries—is not a simple undertaking.
Commonly used maps of the region vary widely, even on basic details like the boundaries of the basin, and most haven’t kept up with changing realities—like the fact that the overtapped waterway no longer reaches its outlet at the sea. At the Babbitt Center, we began to hear a common refrain as we worked on water and planning integration efforts with stakeholders throughout the West: people frequently pointed out the flaws in available maps and suggested that addressing them could contribute to more effective water management decisions, but no one seemed to have the capacity to fix them. So, with the help of the Lincoln Institute’s newly established Center for Geospatial Solutions, we embarked on a mapping project of our own.
Our newly published peer-reviewed Colorado River Basin map seeks to correct several common errors in popular maps while providing an updated resource for water managers, tribal leaders, and others confronting critical issues related to growth, resource management, climate change, and sustainability. It is a physical and political map of the entire Colorado River Basin, including the location of the 30 federally recognized tribal nations; dams, reservoirs, transbasin diversions, and canals; federal protected areas; and natural waterways with indications of year-round or intermittent streamflow. We are making the map freely available with the hope that it will become a widely used resource, both within the basin and beyond.
Challenges, Choices, and Rationale
Even though they contain few words, maps still speak. All maps are somewhat subjective, and they influence how people perceive and think about places and phenomena. During the peer review process for our new map, one reviewer asked whether our purpose was to show the “natural” basin or the modern basin, the one engineered and legally defined. That seemingly simple query raised fundamental questions about what a “natural” basin actually is or would be. It struck us as akin to a perennial question facing ecological restoration advocates: to what past condition should one try to restore a landscape?
In the case of the Colorado, this question becomes: when was the basin “natural”? Before the construction of Hoover Dam in the 1930s? Before Laguna Dam, the first dam built by the U.S. government, went up in 1905? The 18th century? 500 years ago? A million years ago? In an era when the human–natural binary has evolved into a more enlightened understanding of socioecological systems, these questions are difficult to answer.
We struggled with this quandary for some time. On the one hand, representing a prehuman “natural” basin is practically impossible. On the other hand, we felt an impulse to represent more of the pre-dam aspects of the basin than we typically see in conventional maps, which often privilege the boundary based on governmental contrivances of the 19th and 20th centuries.
Ultimately, after multiple internal and external review sessions, we agreed on a representation that does not attempt to resolve the “natural” versus “human” tension. We included infrastructure, clearly showing the highly engineered nature of the modern basin. We also included the Salton Basin and the Laguna Salada Basin, two topographic depressions formed by the Colorado that are separate from the river’s modern engineered course and often excluded from maps of the basin. We chose to show them not because we expect the Colorado River to jump its channel any time soon, nor because we presume to accurately represent how the delta looked prior to the 20th century. But from our research, we learned that the 1980s El Niño was of such magnitude that river water from the flooded lower delta reached back up into the dry bed of the Laguna Salada, making commercial fishing possible there. Environmental management of the heavily polluted Salton Sea, meanwhile, is a contested issue that has figured in recent discussions about future management of the Colorado. These areas are not hydrologically or politically irrelevant.
Our map doesn’t attempt to answer every question about the basin. In many ways, our contribution to Colorado River cartography highlights the unresolved tensions that define this river system and will continue to drive the discourse around water management and conservation in the Colorado Basin.
There is no simple definition of the Colorado River Basin. That might be the most important underlying message of this new map.
Zachary Sugg is a senior program manager at the Babbitt Center for Land and Water Policy.
Over the course of my career, I’ve had the opportunity to teach in many different places and contexts, from a vocational high school on the South Shore of Massachusetts to undergraduate and graduate classrooms in New York, North Carolina, England, Italy, and Russia. Though the students and subjects have differed, one thread has emerged: teaching is the best way to learn.
There’s no better way to discover the gaps in your own knowledge than by trying to convey that knowledge to someone else; no better way to understand how people absorb and act upon information than by actively engaging in that process with them. This isn’t a novel concept: the Latin phrase docendo discimus, often attributed to Seneca, means “by teaching, we learn”; the Germans promulgated a pedagogical approach called Lernen durch Lehren, or “learning by teaching.”
What you learn by teaching, first and foremost, is that teaching is more than playing the “sage on the stage,” waltzing into a classroom to deliver information from on high. Yes, it requires command of your subject, but it also requires being mindful and present: keeping an open mind, being willing to experiment, and, most importantly, listening so you can reframe the discussion when your words aren’t landing well.
Those qualities abounded in our founder, John C. Lincoln. From the earliest days of the Lincoln Foundation, he made education and experimentation a priority. Lincoln was motivated by a fervent belief that the value of land belongs to the community and should be used for the community’s benefit, a concept he first encountered at a lecture by the political economist and author Henry George. He disseminated this idea through his own prolific writing—pamphlets, articles, even a monthly “Lincoln Letter”—and by funding educational institutions.
In 1949, just three years after establishing the Lincoln Foundation, Lincoln penned a letter on behalf of the Henry George School—whose work he funded and whose board he chaired for 17 years—to promote a 10-week discussion course based on George’s work. “The course offers no ready-made panaceas or medicine-man formulas,” Lincoln cautioned. “It attempts, through open discussion and stimulating analysis, to make clear the underlying causes of the problems that face the modern world and to discover the means for solving them.”
That commitment to discussing problems and discovering solutions remains central to our mission. Though we face global challenges John Lincoln could not have foreseen, from climate change to COVID, some of the problems of his era are all too familiar: economic inequality, soaring housing costs, social injustice, and overuse or abuse of natural resources, to name a few.
After John Lincoln’s death in 1959, David Lincoln took the helm of the family foundation. It didn’t take long for David to expand his father’s commitment to education, providing grants to the Claremont Men’s College in California, the University of Virginia, New York University, the University of Chicago, and the Urban Land Institute. A decade later, the Lincoln Foundation established the Land Reform Training Institute in Taipei, now called the International Center for Land Policy Studies and Training and still a partner of the Lincoln Institute. David and his wife, Joan, were also generous supporters of Arizona State University and other institutions.
Even as he supported education in other venues, David dreamed of establishing a freestanding organization that could conduct its own research on land policy—a place that could develop and deliver courses in partnership with like-minded institutions without being in thrall to them. The establishment of the Lincoln Institute of Land Policy in 1974 represented a bold step, a foray into the active pedagogy that powers our work today and that would, in turn, accelerate our own learning.
In the nearly five decades since David Lincoln took that leap, we have taught—and learned from—students around the world, from undergraduates grappling with the basics to seasoned urban practitioners eager to expand their skills. We’ve delivered courses about land value capture and land markets in Latin America; about valuation and the property tax in Eastern Europe and Africa; about municipal finance and conservation in the United States and China; and much more. During the past decade, our courses and trainings have reached nearly 20,000 participants.
Along the way, we’ve learned a few important lessons. We learned, for instance, that when it comes to land policy education, critical gaps exist. As we prepared to launch a municipal fiscal health campaign in 2015, we conducted a straw poll with the American Planning Association to determine the number of graduate planning schools that required students to take public finance courses. The answer? None. To address this puzzling oversight, we developed a curriculum on public finance for planners, which we have since delivered in Beijing, Chicago, Dallas, Taipei, and Boston, in formats ranging from a three-day professional certificate program to a full-semester course for graduate students.
We’ve also learned that professionals working on land policy have a huge appetite for practical training, and we’ve learned how much people value credentialed courses. As the pandemic set in last year, our staff tried out some new virtual approaches that heightened participation and engagement. These ranged from prerecording presentations that could be viewed prior to live sessions to spreading what would have been a tightly packed, in-person schedule across multiple days. In some cases, we reached more people; a virtual seminar on taxation in Eastern Europe, for example, reached 500 people instead of the 40 who would have attended in person. In other cases, we reached a more geographically diverse pool while intentionally keeping enrollment low to foster engagement and active learning. Even as we begin making plans to return to in-person learning, we have become more adept at leveraging the possibilities afforded by virtual instruction and look forward to enhancing those offerings.
This year, building on what we’ve learned and honoring the Lincoln family tradition of taking leaps, we’re launching our first degree-granting program in partnership with Claremont Lincoln University (CLU), a nonprofit, online graduate university dedicated to socially conscious education. Together we’ve created online, affordable Master in Public Administration and Master in Sustainability Leadership programs, and we are working on a third option—the first Master in Land Policy in the United States—which we hope will follow soon.
These degree programs, which can be completed in 13 to 20 months, represent a way of rethinking advanced education from the ground up. They are designed specifically for working professionals who need practical skills they can apply on the job. They are both comprehensive and streamlined. Lincoln Institute staff will design and deliver several courses, using real-world case studies and cross-sector analyses to tackle topics ranging from public finance to civic engagement. This fall, I’ll teach a course on Urban Sustainability, helping students acquire the knowledge and skills they need to diagnose urban challenges, design interventions to make cities sustainable, and mobilize resources to implement those solutions—and I have no doubt that I’ll learn a great deal along the way.
The students who enroll at CLU won’t be there simply to get an advanced degree; they’ll be there to explore issues, discover solutions, and become part of a national movement of lifelong learners. With the climate crisis bearing down in alarming new ways, infrastructure crumbling, and affordable housing an increasingly endangered species, public officials are facing seemingly insurmountable challenges with fewer resources at their disposal. This program will build a growing network of informed, hands-on problem solvers who can use land policy to address our thorniest environmental, economic, and social challenges.
At the Lincoln Institute, we are intent on “finding answers in land.” We don’t claim to have all the answers. We are committed to finding them through our research and through collaborations with partners around the world. Through initiatives like our new CLU partnership, we will continue to teach, to learn, and to experiment—and we will seek to shed, as John Lincoln wrote in 1949, “some new, searching light on the vital questions that concern us all.”
Learn more about the Claremont Lincoln University–Lincoln Institute of Land Policy partnership and current fellowship opportunities by visiting www.claremontlincoln.edu/lincolninstitute75.
George W. McCarthy is president and CEO of the Lincoln Institute of Land Policy.
Image: Claremont Lincoln University headquarters in Claremont, California. Credit: CLU.
Property Taxation and Land Value Capture in Africa
May 5, 2021 - May 7, 2021
Offered in English
This conference, organized in partnership with the African Tax Institute, provides a forum for scholars and practitioners to discuss a range of issues on property taxation in Africa. In addition to an update and critical analysis of property taxation issues in Africa, there will be a special focus on existing initiatives to improve mapping and revenue collection efforts and discussion of the potential for future work in this area. The conference will also consider alternative revenue streams, such as different forms of land value capture.
In the early 1970s, the property tax was one of America’s favorite villains. Homeowners had seen their tax bills soar to new heights. Stories of corrupt assessors filled the news. And policy makers across the spectrum concluded that local governments were maladministering the property tax at the expense of the residents they were supposed to serve.
In his 1972 State of the Union address, President Richard Nixon called the property tax “oppressive and discriminatory.” In the presidential election that year, all the major candidates addressed the property tax during their campaigns. After the election, Senator Edmund Muskie of Maine, who had been defeated in the Democratic primary, commissioned a detailed investigation of state and local property taxes. “The perpetuation of archaic, unfair—and too often secretive—systems of property taxation undermines the credibility of government at all levels,” Muskie said at a Senate hearing in 1973, shortly after the study was complete. “It is a national outrage that in an age of computer technology, most governments fail to administer property taxes fairly.”
Over the course of the next decade, the technology Muskie had alluded to evolved dramatically. Major advances in computing power, along with the emergence of a generation of well-trained, tech-savvy assessors who could harness it, revolutionized one of the most bedeviling aspects of the property tax: determining the market value of every property. At the center of this revolution was a small organization that had been established in 1974 in Cambridge, Massachusetts, to study and teach land policy.
As much an art as a science, the assessment of real estate values—also known as valuation, or appraisal—has been a challenge of the property tax for centuries. In 17th-century England, government officials conducted assessments by counting the hearths and stoves in each home. Later, a tax on every window was intended to function in much the same way, but it spurred people to board up windows or build houses with fewer of them. Parliament repealed the tax in 1851.
By the early 20th century, assessors typically used one of three basic methods of determining a property’s value, all of which are still in use today. The first compares each property to recently sold properties nearby. The second looks at the income the owner could receive by leasing the property. And the third estimates the cost, in labor and materials, of rebuilding a given structure, plus the value of the underlying land. The third method, known as the “cost approach,” was widely adopted in the 1920s and 1930s. To calculate the value of the land, assessors relied on the price of recently sold vacant parcels in the same area. These were common in rural areas or new suburbs, but rare in established cities.
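For readers who want to see the arithmetic, the following sketch expresses the three approaches as simple Python functions. The figures are invented for illustration and omit the many adjustments a working assessor would apply.

# A minimal sketch of the three classic valuation methods
# (hypothetical figures; real assessments involve many more adjustments).

def sales_comparison(comparable_sale_prices):
    # Method 1: value from recent sales of similar nearby properties.
    return sum(comparable_sale_prices) / len(comparable_sale_prices)

def income_approach(annual_net_income, capitalization_rate):
    # Method 2: value as capitalized rental income.
    return annual_net_income / capitalization_rate

def cost_approach(rebuild_cost, depreciation, land_value):
    # Method 3: replacement cost of the structure, less depreciation,
    # plus the value of the underlying land.
    return rebuild_cost - depreciation + land_value

print(sales_comparison([210_000, 195_000, 225_000]))  # 210000.0
print(income_approach(18_000, 0.08))                  # 225000.0
print(cost_approach(180_000, 30_000, 60_000))         # 210000.0

Each function stands in for what was, in practice, a painstaking manual judgment. The cost approach in particular depended on a defensible estimate of land value, which is where the scarcity of vacant-land sales described below became a problem.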
“Land value sales are like hen’s teeth—you can hardly find them,” said Jerry German, who became an assessor in Cleveland, Ohio, in 1974, when many calculations were still done manually. “You’d lay the map of the jurisdiction on the floor or some giant table. Appraisers would look at the map and say, ‘It appears in this area, land is going for about a dollar per square foot.’ . . . I can remember our senior appraisers walking around with little slide rules in their pocket to do calculations.”
What all three valuation methods had in common is that assessors made individual calculations for every property and recorded them by hand on property record cards, which were often stored in long rows of filing cabinets. The process was vulnerable to errors, inconsistencies, and corruption, with little transparency as to who decided each property’s value, how the calculation was made, or who else might have influenced the decision.
By the time German arrived in Cleveland, a handful of cities had been quietly laying the groundwork for computerized assessment for more than a decade. During the 1960s, advances in computer technology collided with new data requirements, as many states mandated the accurate disclosure of real estate sale prices for the first time. Assessors used the data to identify the characteristics of a property that influenced its price, such as square footage, the number of bathrooms, and location. Large jurisdictions that could afford early computers—and consultants with the special expertise to program them—could now calculate property values automatically. The new practice, Computer Assisted Mass Appraisal (CAMA), represented a leap forward, but it also had serious drawbacks. “The worst thing for the assessor, aside from the expense, was the inflexibility of it,” German said. “Everything was hard-coded in there, and once you . . . set your path and programmed everything in, it was hell and high water to get anything changed.”
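At its heart, the leap German describes was statistical: use disclosed sale prices to fit a model relating property characteristics to price, then apply that model to every property on the roll. The sketch below is a deliberately simplified, hypothetical version of that idea using ordinary least squares; production CAMA systems use far richer data and models.

import numpy as np

# Toy CAMA-style model: fit sale prices to property characteristics,
# then predict values for unsold properties. All data are invented.

# Columns: square feet, bathrooms, desirable-location flag.
X = np.array([
    [1400, 1, 0],
    [1800, 2, 0],
    [2100, 2, 1],
    [2600, 3, 1],
    [1600, 1, 1],
], dtype=float)
sale_prices = np.array(
    [155_000, 190_000, 245_000, 300_000, 200_000], dtype=float)

# Add an intercept column and fit by ordinary least squares.
A = np.column_stack([np.ones(len(X)), X])
coef, *_ = np.linalg.lstsq(A, sale_prices, rcond=None)

# Estimate an unsold property: 2,000 sq ft, 2 baths, good location.
unsold = np.array([1.0, 2000, 2, 1])
print(round(float(unsold @ coef)))

The inflexibility German complained of came from hard-coding models like this into early mainframe programs: changing a variable or restructuring the model meant reprogramming the whole system.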
When the Lincoln Institute of Land Policy was founded as a school in 1974, its first executive director, Arlo Woolery, saw an opportunity. One of the organization’s priorities was promoting a well-functioning property tax. By helping assessors computerize their work, the Lincoln Institute could provide the kind of support that had the potential to change local practices.
The Lincoln Institute held its first Colloquium on Computer Assisted Mass Appraisal in 1975. Only a handful of the roughly 13,500 assessing jurisdictions in the United States used computers to conduct mass appraisals then—“probably no more than 400 and possibly fewer than 200 jurisdictions,” the appraisal expert Richard Almy estimated in a paper prepared for the colloquium. The Lincoln Institute’s director of education, Charles Cook, who had worked previously for a private mass appraisal firm, began to convene and train assessors in an initiative to improve computerized appraisal and expand its use.
Recognizing that the cost and inflexibility of assessing software put it out of reach for most cities and towns, the Lincoln Institute developed software in the early 1980s called SOLIR (Small On-Line Research), which assessors could use and customize themselves with an off-the-shelf Radio Shack TRS-80 computer. This represented a breakthrough. For the first time, CAMA was accessible to local assessing offices without large budgets or computer programming skills.
The Lincoln Institute provided SOLIR free to assessors who took a weeklong training course, releasing regular updates to the software for several years. The project made the Lincoln Institute feel less like a research organization and more like “a computer startup company,” said Dennis Robinson, who recently retired as the Lincoln Institute’s executive vice president and chief financial officer. Robinson was hired in 1982 to oversee software development and training. He remembered “a coffee-stained, dirty, wrinkled carpet. That was our computer room. There was a bank of eight or so Radio Shack computers with programmers in there working on SOLIR.”
The first assessors to use the software helped to improve it by testing its limits and recommending new features. At their request, the Lincoln Institute created a module that could help determine the value of land separate from any buildings—a critical function for maintaining up-to-date assessments.
By the late 1980s, private software and consulting companies were incorporating the SOLIR technology into their own products, and the Lincoln Institute stopped developing its own software. But the Lincoln Institute continued to conduct research on innovative applications of CAMA and to convene and train assessors as the technology advanced. In the 1990s, assessors began using geographic information systems (GIS) software to develop location-based property records. By integrating these records with their CAMA systems, they could, among other things, measure the effects of neighborhood features, such as schools or parks, on the value of land. “They took these tools and did very creative, sophisticated things,” Robinson said.
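As a concrete, hypothetical illustration of that integration: a GIS layer can turn each parcel’s coordinates into a location feature, such as distance to the nearest park, which then becomes one more predictor in the CAMA model. The coordinates below are invented, and real systems derive such features inside GIS software rather than by hand.

import math

# Hypothetical GIS-derived feature: distance from a parcel to the
# nearest park, computed with the haversine great-circle formula.

parks = [(41.50, -81.70), (41.48, -81.65)]  # invented park coordinates

def distance_km(lat1, lon1, lat2, lon2):
    r = 6371.0  # Earth's radius in kilometers
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

def nearest_park_km(parcel_lat, parcel_lon):
    return min(distance_km(parcel_lat, parcel_lon, lat, lon) for lat, lon in parks)

# The result becomes one more column in the appraisal model.
print(round(nearest_park_km(41.49, -81.68), 2))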
Today, CAMA has become central to property tax systems in the United States, Canada, and Western Europe. Many governments in Eastern Europe, Latin America, Asia, and Africa have also adopted some version of the tool, in some cases using satellite imagery or aerial photography to leapfrog over the paper records that undergirded the first CAMA systems.
In China, which is preparing to institute its first property tax, local officials in the fast-growing technology hub of Shenzhen recently developed cutting-edge applications of CAMA. They pioneered a system known as GAMA, which combines GIS with CAMA to build detailed three-dimensional models that account for factors such as views and the paths of light and sound. These added considerations can create differences of up to 20 percent in the value of apartments or condominiums within the same building.
Altogether, the advances in CAMA over the past few decades created a sea change in the administration of the property tax. “Computerized assessment might seem obvious today,” said Lincoln Institute Senior Fellow Joan Youngman. “But it provided the infrastructure needed to assess every property at its true market value—the underpinning of any fair and equitable property tax system.”
Will Jason is director of communications at the Lincoln Institute of Land Policy.
Photograph: Computerized assessment, which the Lincoln Institute helped usher in during the 1970s and 1980s, has led to a more equitable property tax system. Credit: Courtesy of Data Cloud Solutions, LLC.