ABSTRACT

The war between the United States and Japan, which began with the Japanese attack on Pearl Harbor in December 1941 and ended with Japan's surrender following the American atomic bombings of Hiroshima and Nagasaki in August 1945, was the most traumatic experience in the histories of these two nations in the twentieth century. Although more than fifty years have passed since the end of that war, its bitter memories still haunt relations between Japan and the countries with which it fought or whose lands it occupied. In the West, the war with Japan is regarded, like the war with Nazi Germany, as a moral crusade in which the forces of good defeated the forces of evil. In Japan, the Pacific War is seen by many as a conventional power struggle over territory and hegemony, in which neither side was particularly right or wrong.