ABSTRACT

This chapter explores the relationship between women’s experience in nature and their body image. In Westernized societies there is evidence of increasingly unhealthy, negative self-perceptions of body image among females. Negative body image affects many aspects of women’s wellness, with broader implications for family, community, and environmental well-being. Studies on the relationship between experience in nature and women’s body image illuminate ways to heal women’s relationships with themselves through experiences in nature. Through time in nature, women are able to rebuff the patriarchal beliefs that contribute to the diminishment and objectification of women and nature.