ABSTRACT

The perception that Islam is opposed to the Western world has been gaining ground for some time, and it has intensified since the events of 9/11. This view is held mainly in the West, but a corresponding feeling may exist within the Muslim ummah as well: that following the easing of tensions between the Soviet and Western blocs, and in light of the Western intervention in the Gulf in 1990, the more recent confrontation in Afghanistan, and the present Iraqi crisis, the principal antagonism in the world now lies between the Muslim world and the West, and that the countries of the West regard the Muslim world as hostile to them. Are these perceptions credible and valid?