ABSTRACT

For much of the eighteenth and early nineteenth centuries, urban Americans acquired water by their own devices, from water merchants, or from public wells (some purchased by the local government). Beginning in the 1830s, many cities and towns developed centralized water systems owned and managed by the municipalities themselves. From that time until the late twentieth century, water was generally treated as a public good, and providing it was regarded as a public responsibility, on the assumption that market forces could not be depended upon to furnish services necessary to society (Jacobsen, 2000: 3, 13, 22). At the same time, freshwater remained a commodity to be bought and sold, whether controlled by private or public entities. Gail Radford suggested the implications of developing public water systems:

Mundane as it might seem, providing water represented a sharp break for cities, which had previously confined themselves to supplying relatively indivisible public goods, such as police and fire protection, that did not lend themselves to the commodity form – that is, to being socially defined as objects bought and sold in markets. Water, by contrast, was generally charged for according to use. In a sense then, water opened Pandora’s box. The widespread reliance on municipal provision of this vital substance enhanced the plausibility of following the same course for other goods (Radford, 2003: 872).