ABSTRACT

Since the early 1970s, several scholars and observers of international relations have argued that the United States has lost its hegemonic position in the world or is experiencing a decline in dominance. The late Susan Strange used to chide US academics in particular for perpetuating this “myth of America’s lost hegemony.” She was especially critical of those who not only “unquestionably accepted” the proposition of American hegemonic decline but also took it upon themselves to spread the myth in such a way that it gained credence outside the United States.1