The Way Concrete Goes
This article was originally published on The Metropole, the blog of the Urban History Association.
We interact with concrete on a daily basis: we walk on its sturdy surface, sit on its even finish, and take shelter beneath it in inclement weather. Despite this intimacy, however, few of us know what concrete actually is, let alone where it comes from or where it goes. This short essay tracks the surprisingly circular life cycle of this commonplace material, using the Lehigh Valley in Pennsylvania as a case study. As the birthplace of the American cement industry, the region reveals the broader relationship between urban development and capitalism that has shaped American cities throughout the twentieth century. Starting in limestone quarries on the urban outskirts, moving to major cities, and finally returning to the abandoned cavities, concrete's life cycle shows that the material does not simply progress through stages of life and death; it instead shifts its shape along the way to maintain its dominance, with distinctive environmental consequences.
Although concrete was not discovered in the United States — the Romans and the English claimed that title — the nation did modernize the material with the introduction of new technologies like Thomas Edison’s 1904 horizontal rotating kiln. The medium soon became critical in any kind of building, from basic infrastructure like roads and bridges, to the foundations of urban structures. Thanks to Americans’ embrace of concrete for all sorts of building projects throughout the late nineteenth and twentieth centuries, the medium became the most consumed material on earth after water. Yet this aggressive modernization came at a high cost: regions that were involved in the manufacture of concrete’s components, particularly cement, experienced significant environmental harm. But before the geography of cement’s mobility can be pinned down, it is critical to first clarify the process of its manufacture and use.
Although we know concrete as a hard medium — the most material of all materials — it is more accurate to think of it as an infinite number of recipes for various concoctions. The "bread and butter" are water and cement, the latter ingredient acquired by burning limestone at the hellish temperature of 2,700 degrees Fahrenheit. Then aggregates like sand, gravel, or even glass are mixed in, while the water and cement undergo a chemical reaction that binds the mass together. Once the mixture is complete, there is a half-hour window to bring the wet matter to its final destination before it begins to harden. Poured carefully into wooden formwork, the mass continues to release heat and solidify over time. While smaller pours might take several weeks or months to stiffen, large building projects could take many years to set — the Hoover Dam, for example, is still curing.
Despite its urban associations, the life cycle of concrete begins in the natural landscape. Over millions of years, sediments and organic matter shifted and settled at the bottom of the ocean, solidifying into layers of limestone known as beds. The porous mineral became especially critical for sustaining local ecosystems and filtering water through its many cavities and cracks, which reached down to the groundwater table.
Lehigh Valley men discovered the extensive mineral deposits only in the second half of the nineteenth century: iron industrialists first used limestone to purify metal, while cement men applied it to construction. These capitalists established sizable quarries and employed immigrant men and children to loosen and extract the rock. The quarrymen first used primitive mining tools like chisels and forks, but industrialists soon introduced dynamite to improve efficiency. The new technology enabled workers to blow up entire rock formations in one fell swoop.
As capitalists figured out how to more efficiently remove limestone, they also proceeded to extract maximum effort from laborers and local residents, often at the cost of their health and wellbeing. Quarrymen regularly lost limbs or died in workplace accidents while local residents dealt with flying rocks the size of basketballs striking their homes, schools, and businesses. In some extreme cases, the intensive blasts cracked building foundations or collapsed structures altogether. The vibrations of the 1942 accident in Sandts Eddy, for example, sent shock waves as far as Philadelphia. Once the limestone was extracted, workers moved the giant boulders to cement plants, where the rocks were crushed, burned, and then crushed again to manufacture cement.
The manufacture of cement within the plant was predictably a messy business. Most significantly, the operation necessitated significant inputs of anthracite coal to burn the limestone at 2,700 degrees Fahrenheit. (Cement plants have since improved their manufacturing process, though they continue to rely on burning toxic materials, like used tires and household waste, to generate appropriate heat levels). Yet it was not only carbon dioxide that Lehigh Valley plants generated — lesser known is the industry’s release of cement dust particles that enshrouded local towns like clockwork at the beginning of the work day. Nobody could escape this invasive medium: the dust found its way onto the surface of buildings and into their interiors; it traveled to agricultural lands, covering plants and fruit trees. Finally, the matter found its way into the interiors of human and animal bodies, where in some cases it accumulated to toxic levels and caused death.
Once manufactured and packed tightly in buckets or sacks, cement was brought to major metropolitan centers to build American modernity. In particular, Lehigh Valley cement was used to construct Rockefeller Center, the Empire State Building, and the Holland Tunnel in New York City. Perhaps most famously, the region's product was put to good use in the shaping of the Panama Canal. The largest building project to date when it opened in 1914, the canal symbolized American industrial and engineering prowess. However, as concrete gained widespread recognition for its material qualities, its life cycle became more obscure and its production processes essentially invisible.
Despite cement industrialists' assumption that concrete was a material of permanence, the medium failed to live up to the hype: the mixtures were often inconsistent and thus caused different parts of buildings to deteriorate at different rates. Even when the material was mixed and poured correctly, buildings were torn down because they no longer met corporate desires for space and architecture. As Daniel Abramson writes, building companies aggressively popularized economist Joseph Schumpeter's theory of creative destruction, or the idea that buildings depreciated as technologies and standards constantly improved, and thus needed to be replaced.
When buildings were demolished to make way for taller and newer structures, a major dilemma arose: where could cities dispose of the used concrete? Unlike wood, steel, or other construction materials that could be recycled for a different purpose rather effortlessly, concrete could not — once it was poured and cured in its place, the material lost its flexibility and value. (Contemporary concrete research laboratories have been experimenting regularly, though with mixed results, with how to recycle used concrete and bring it back into the construction cycle). For some capitalists, the answer to this problem was obvious: return concrete to where it came from.
Lehigh Valley entrepreneurs were quick to jump on this proposition. They invited neighboring states New York and New Jersey to bring their construction debris and dispose of it in the empty cavities. Concrete’s homecoming to the quarry appeared to complete the material’s life cycle: workers extracted limestone, crushed and burned it to manufacture cement, mixed the product with other ingredients to create concrete, employed it in the construction of buildings, and then broke it down to dispose of it in the empty quarries. The empty hole — evidence of industrial environmental theft — was finally filled as if nothing had happened. Quarries that were not filled accumulated rainwater and runoff and deceptively appeared to be lakes or ponds.
Yet this camouflage was little more than smoke and mirrors, since quarries filled to the brim with concrete debris promised a vengeful return. Indeed, some of this material included not only "clean" rubble like concrete, but also polychlorinated biphenyls — the notorious cancer-causing chemicals that were used in electrical equipment. And although concrete was thought to be innocuous, it was far from it. Because of concrete's nature as a recipe rather than a single material, it included surprising and potentially unknown additives that could leak into the quarry and travel down to the groundwater table. One example was American builders' use of asbestos fibers in concrete mixtures to prevent the material from cracking. Manufactured in different colors to suit varying tastes, asbestos-reinforced concrete took the form of roofing tiles, siding, and panels that were commonly used in the building of homes, schools, and other everyday built environments until the U.S. Environmental Protection Agency put severe restrictions on the concoction in 1973. Therefore, concrete's return to the Lehigh Valley in the form of debris was a kind of material déjà vu: cement dust along with asbestos once again found their way into the lungs of workers, residents, and animals.
The life cycle of concrete — much like capitalism itself — is circular and can never be put to rest. For urban historians, this case study raises meaningful questions about the material legacies of aggressive urban development efforts: while policies and design schemes evolve, demolished projects do not disappear but find a new life in the urban periphery. By tracking concrete's life cycle and challenging its material coherence, we can grasp not only its accomplishments, but also its harmful legacies for built and natural environments.
Do you know where the concrete in your neighborhood comes from and where it goes? Are you breathing it in?