Science and technology permeate the culture and politics of modernity. On any day, the headline news provides crude but telling indicators of their influence. A Martian ethnographer visiting planet Earth in the first few years of the third millennium would have encountered a bewildering array of stories whose only discernible connection would have been the pervasive – though perversely inconsistent – role of science and technology in human affairs. The millennium opened with false fears of the so-called Y2K bug, which might have made computer systems throughout the world crash at midnight, when 1999 rotated into 2000. In 2001, the seemingly well-regulated technological system of American civil aviation was ferociously turned upon itself by young Islamic militants, who not only destroyed New York’s tallest buildings, the twin towers of the World Trade Center, but used planes to expose unsuspected vulnerabilities at the heart of US domestic security. In retaliation, the United States launched two militarily successful wars in Afghanistan and Iraq, demonstrating that the advent of “smart weapons” had radically altered the dynamics of battle since the Vietnam era; by the official end of the Iraq invasion, some US observers even wondered (in a luxury permitted only to winners) whether modern warfare any longer needed human bodies on the front lines. Early 2003 also saw the loss of the US space shuttle Columbia with seven crew members, underlining again the fragility of manned space exploration. Behind the dramatic disasters and the violence of terrorism and war, ordinary human attempts to master nature proceeded at slower rhythms, as societies debated how to manage global climate change, AIDS, and other epidemic diseases; how to solve problems of clean water and renewable energy; how to improve crop yields without endangering farmers’ livelihoods; how to treat the ancient infirmities of aging, infertility, mental illness, and disease; and how to stave off death itself.