ABSTRACT
Nature resonates in visions of America. From early perceptions, at the onset of European colonization, of a continent that some thought to be paradise, through transcendental musings, to romantic depictions of the frontier in nineteenth-century painting and literature, a special relationship with nature was presumed to be rooted in the New World. The hope that dwelling in such a place would entail
living in the embrace of nature persisted even as America was transformed
from a nation of farms into an urban society and then into a suburban one.