ABSTRACT

Viewed from the West, and especially from the United States, the Vietnam War has been regarded as a peculiarly American problem. The popular view of the war, as represented in films, documentaries, novels, and first-hand accounts, has arisen from questions and anxieties about the war’s effect on American society. Historians, with some notable exceptions, have tended to reinforce this perspective, addressing such questions as: How did the United States become involved in Vietnam? Why did America fail? What are the lessons of Vietnam?