Civilization & Sludge: Notes on the History of the Management of Human Excreta

Abby A. Rockefeller

Originally published in Current World Leaders, Volume 39, No. 6.

People have been "civilized"--have been settled as opposed to nomadic or hunting-and-gathering--for a mere ten thousand years. And most of us Homo sapiens sapiens remained "uncivilized," in this narrow sense of living without the advantages or constraints of a settled abode, for probably at least the first half of that ten-thousand-year period.

Before people became "citizens" living in "cities," these smartest alecks of the animal world deposited their excreta--their urine and feces--on the ground, here and there, widely dispersed, in the manner of all other land creatures. Of course, some groups, such as the cats, bury their feces and urine in shallow holes. But the effect of surface deposit or shallow burial is the same: ready access by the decomposer creatures in the soil to the nutrients and stored energy in the excreta; ready cycling through life of the elements necessary to it, attended by an incremental enrichment and diversification of the forms of life.

This meant keeping the nutrients characteristic of excreta in the cycle of soil-to-bacteria-to-plants-to-animals-to-soil. The soil and its communities of life long ago grabbed hold, so to speak, of this major source of nutrients. Keeping these nutrients--especially the major, or "macro," ones such as nitrogen and phosphorus--locked up in the cycles of the land, besides making the land-based life cycles nutrient-rich, kept them out of the waters of the Earth. The lakes, rivers, streams, ponds, oceans, and aquifers were consequently relatively nutrient-poor--what we call "pure." Aquatic life forms evolved in precise relation to such pure waters, so that the characteristic of macro-nutrient scarcity has become, gradually but absolutely, crucial to the health of the species and the ecosystems of the aquatic environment.

When we speak of "healthy" ecosystems, we mean stable ecosystems: that is, ecosystems both tending toward diversity and not subject to cataclysmic drops in diversity. Such conditions, also called balanced, create ever more intricate relationships that locate the inorganic elements necessary to life in cycles that make those elements increasingly available to life. The more extensive these relationships, the more consistently available the nutrient-elements will be to the life forms within them. Expanding diversity of life forms is, relatively speaking, a low-entropy enterprise. The more diverse the forms of life, the more matter and energy are kept available for use, or "work," and the less they are lost to use or work through either irretrievable dissipation or unresolvable mixing.

So, when we talk of "pure" water, we do not mean pure in the chemical sense. We mean, rather, a dynamic balance between the nonliving, macro-nutrient-scarce matter and the living organisms in water; a balance whereby the relationships of life forms to one another, developed perhaps over the course of a couple of billion years, are, though always changing, nevertheless (excepting cataclysmic events) always stable, expanding in diversity, and healthy.

It is not that life will disappear in waters suddenly enriched by an infusion of macro-nutrients. (Nitrogen and phosphorus, both called macro-nutrients because most plants need them in large quantities in order to grow, are also sometimes called "limiting factors," since, when they are scarce, the growth of plants not adapted to nutrient-poor waters--such as algae--is limited.) But the effect of sudden infusions of any of the macro-nutrients will be to reduce the diversity of life in any body of pure water. We call waters polluted that look like pea soup--so full are they of living algae--because we understand that even a very great abundance of a single form of life in, say, a lake doesn't mean that all's well with the life system in the waters of that lake.

And, indeed, all is not well--much is, in fact, dreadfully wrong--with most of the waters on Earth. What happened to make this so? In brief, there was a sudden infusion (sudden compared to the slow pace of evolution) of nutrients into the Earth's waters--in the form of water-borne human excreta. What follows touches on how water came to be used to transport human excreta, how bodies of water came to be used as the recipient dumps for the water-borne excreta, and what environmental effects have been associated with the chain of behavioral and technological developments resulting from these practices.

* * *

Much of the history of human behavior is before our eyes in living societies today, the history of our excretory practices not excepted. It is likely that all practices ever associated with the disposition of excreta continue in some societies still. The patterns of settled community behavior split early into two courses: one that unambiguously assumed human excreta to have a fertilizer value for agriculture, and one that either did not regard it as having such a value or was at best ambivalent about it.

It was, to be sure, agriculture that "caused" civilization: in its simplest and in its most elaborate forms, civilization altogether depends on agriculture. This dependence, however, has not inspired all agricultural societies with reverence for the economy of the cycles on which agriculture in turn depends. Especially uneven has been awareness of the economy of giving back to the soil in the form of excreta what has been taken out in the form of food. The cultures that did consistently employ their own manure in agriculture were primarily Asian. Much has been written about the longevity of these civilizations and the significance of the persistent use of human manure for that longevity (King 1927).

Those settled cultures that do not--and did not--connect human manure with sustainable agricultural productivity followed, and still follow, a fairly standard pattern of "development" of their "sanitation" habits. Urinating and defecating on the ground's surface in the manner of pre-civilized days, but in the immediate vicinity of their dwellings, is the first phase. This soon becomes unviable--that is, too unpleasant--due to the increasing density of the settlers, which leads to the creation of the community pit. When privacy of excretory functions comes to be deemed important, then comes the pit privy, the privacy structure on top of the hole in the ground.

This "outhouse," on account of the smell, is placed at a distance from the dwelling. The odor caused by concentrating excreta in one spot in the manner of the pit latrine--an olfactory offense that causes many to choose the bushes--is legendary for its unpleasantness. But stink aside, and contrary to what some people think, the pit latrine--with or without the privacy structure--is not, and never was, environmentally viable. The pit toilet causes two related troubles--waste and pollution: waste through loss of the unretrieved nutrients in the excreta and pollution of the ground waters by those same wasted nutrients. The pit privy is not, from an environmental point of view, anywhere near as damaging as the flush toilet, but the kind of damage it caused--and still causes--is of a piece with the kind caused by the string of technologies, flush toilet included, that evolved in response to the pit privy's inadequacies.

European societies were for centuries ambivalent in their attitude toward their own excreta. Was it a fertilizer source for agriculture or a nuisance to be "got rid of"? Before the advent of piped-in water, human excreta was deposited in cesspools (lined pits with some drainage of liquids) or vault privies (tight tanks from which there was no drainage) in the backyards of European towns. The "night soil"--human manure collected at night--was removed by "scavengers" and either taken to farms or dumped into streams and rivers or into "dumps" on the land. In Europe, in other words, there was no consistent perception of the agricultural value of these materials: not as in Asian cultures, where the husbanding of human excreta was (until very recently) unexceptional and routinized.

Five hundred years before Christ, Rome already had in place a system both for bringing in pure water through its famous aqueducts and for removing via sewers the fouled water that included water-borne excreta from public toilets and from water closets in the homes of the rich (Pliny the Elder 1991; Mumford 1961). But until the middle of the 19th century, most of Europe prohibited the use of sewers for the disposal of human excreta. Sewers consisting of open gutters or sometimes covered trenches in the center or at the sides of streets had long been in use in European cities, but only for the drainage of rain run-off and for city filth. However, transgressing householders used the sewers to dump their kitchen slop water and--to save on the cost of paying scavengers--the contents of chamber pots and overflowing cesspools. And when going all the way to the farm was an inconvenience or an extra expense for professional cesspool scavengers, they too took surreptitious advantage of the sewers to dump the product of their nightly labors. The putrefying matter in these stagnant ditches moved along only when it rained enough (hence the name "storm" sewers), and digging them out with shovels was the job of the "sewermen" (Reid 1991).

The "water closet" (so-called to distinguish it from the "earth-closet," an early species of compost toilet much favored by 19th century environmentalists) afforded the enormous convenience of simultaneously putting the toilet in the house while getting the excreta out of the house. The so-named "flush" toilet had been known to the privileged at the height of the Roman era and since the 18th century in northern parts of Europe. But this pivotal technology, symbol of civilization still, came to widespread use only after piped-in water had been made available to the major cities in Europe and the United States. The first waterworks in the United States was installed in Philadelphia in 1802. By 1860 there were 136 systems in the U.S., and by 1880 the number was up to 598 (Tarr and Dupuy 1988). The convenience of a constant water supply stimulated the adoption of residential water fixtures--baths and kitchen sinks as well as flush toilets--dramatically increasing the per capita use of water on average from three to five gallons per person per day to 30 and even 100 gallons per person per day.

Of course, once water was piped into homes in great quantities, it had to be piped out again, and the first "logical" place to pipe it, including the flush water from water closets, was the backyard cesspool. These cesspools, which hitherto had received only the contents of chamber pots--urine and feces--now regularly overflowed with fecally polluted water; the immediate result was a new level of horrendous odors and the spread of water-borne diseases.

Thus the system of cesspools and vault privies, which had been to some extent effective in avoiding the pollution of waterways through periodic cleanout by scavengers and the at least partial return of human manure to farms, was overwhelmed by the pressure created by the new availability of running water. The next "natural" step in the solve-one-problem-at-a-time approach was to connect the cesspools to the sewers, thereby moving the sewage from overflowing cesspools into the open sewers of the city streets. The result: epidemics of cholera. In 1832, 20,000 people died of cholera in Paris alone (Reid 1991). Wherever and whenever this combination of piped-in water, flush toilets, and open sewers has appeared in the world, epidemics of cholera have followed.

By the middle of the 19th century, the diseases spawned by the convenience of running water and the flush toilet gave rise to a demand for the construction of sewers that would carry the sewage not only out of and away from the home, but away from the city as well. This demand entailed the evolution of the ditch-type storm sewer into the closed-pipe, water-carriage system of sewerage. In this system the wastewater itself was the medium of transportation, so a large and regular supply of water was a built-in requirement to keep the wastes moving in the pipes (Tarr and Dupuy 1988). (Today, efforts to conserve water by promoting the use of low-flush toilets--1.6 gallons per flush versus the five to seven gallons of older fixtures--have led to the plugging of sewers engineered for a minimum hydraulic flow of five gallons per flush. To deal with this problem, owners of these "water-conserving" toilets have been instructed to flush two or three times per use.)
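
A rough back-of-envelope sketch makes the arithmetic of that workaround explicit. The figures below are illustrative assumptions: the 1.6-gallon and five-to-seven-gallon per-flush figures come from the passage above, while the choice of a six-gallon "old" toilet and three flushes per use is an assumption of mine; the calculation itself is not part of the original article.

```python
# Illustrative sketch (assumed figures, not data from the article): how much of
# the intended water savings survives when a 1.6-gallon low-flush toilet must be
# flushed several times to keep sewage moving in pipes designed for roughly
# five gallons per flush.

OLD_FLUSH_GAL = 6.0      # assumed mid-range of the older five-to-seven-gallon toilets
LOW_FLUSH_GAL = 1.6      # low-flush figure cited in the text
FLUSHES_PER_USE = 3      # upper end of the "two or three times" workaround

effective_gal = LOW_FLUSH_GAL * FLUSHES_PER_USE     # 4.8 gallons per use
remaining_savings = OLD_FLUSH_GAL - effective_gal   # 1.2 gallons per use

print(f"Water per use with triple flushing: {effective_gal:.1f} gal")
print(f"Savings left versus a {OLD_FLUSH_GAL:.0f}-gallon toilet: {remaining_savings:.1f} gal")
```

Under these assumptions the "conserving" toilet delivers nearly the five gallons the pipes were engineered for, and most of the intended savings disappears.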

The water-carriage system of sewerage introduced a new set of problems and, about these problems, a new set of debates among sanitary engineers in Europe and the United States. The engineers were divided again between those who believed in the value of human excreta to agriculture and those who did not. The believers argued in favor of "sewage farming," the practice of irrigating neighboring farms with municipal sewage. The second group, invoking the principle that "running water purifies itself" (the more current slogan among sanitary engineers: "the solution to pollution is dilution"), argued for piping sewage into lakes, rivers, and oceans. In the United States, the engineers who argued for direct disposal into water had won this debate by the turn of the 20th century. By 1909, untold miles of rivers had been turned functionally into open sewers, and 25,000 miles of sewer pipes had been laid to take the sewage to those rivers (Tarr and Dupuy 1988).

In the cities with water-carriage sewers, cholera epidemics abated. However, in cities downstream from those dumping raw sewage into the river, death rates from typhoid soared. This led to the next debate: whether to treat the sewage before dumping it into the recipient bodies of water, or to filter the drinking water downstream. Health authorities argued that sewage should be treated before disposal into any body of water, but the sanitary engineers preferred filtration by the next town down the river. The engineers prevailed, and indeed, in those cities with filtered water, deaths from typhoid then dropped dramatically (Tarr and Dupuy 1988).

The practice of "purifying" water polluted with sewage from upstream in order to make drinking water safe downstream, rather than treating sewage where it is produced, persisted until the middle of the 20th century. By then, the rate of industrial development had been enormous, and every industry wanted cheap disposal of its wastes. And since the public was paying, this was cheap as could be. Industries' demand for more sewering to serve their own disposal needs stimulated the industrialized nations of the world to allocate vast sums of money for massive sewer construction programs.

To the nutrient burden on recipient waters from human excrement, then, was added a new and ever-increasing flow of industrial waste, much of it toxic. Wherever on the globe there were sewers, the recipient rivers, lakes, and streams were discovered to have become unacceptably filthy, and in response came pressure to treat the sewage before it entered those waters. And so began the "treatment" phase of the get-rid-of-it approach to dealing with wastewater that now consisted of human excrement mingled with every kind of industrial waste transported by water.

The first step in the effort to clean up the sewage before sending the effluent into the river is termed "primary treatment." From the point of view of improving water quality, it is a crude method, consisting of little more than settling and screening the sewage to remove the largest and most aesthetically offensive objects: all nutrients and chemicals not tied up in dead cats and intact feces remain in the water.
