ABSTRACT

It is important to reflect on how we have changed and how our own views of Africans are still in the process of changing. The Dark Continent portrayals of Africa developed at a time when Westerners envisioned themselves as potential masters of both society and nature, and indeed there was much to encourage that view. The peoples of Africa were subdued and organized into colonies that produced raw materials for the West's growing industries, while the colonies of Asia likewise increased their output. Much has changed since then, however. In the United States, African Americans made headway against racism. And in Africa, Western-educated Africans led the push for independence in the 1940s and 1950s. By the mid-1960s, only a few African territories, including the five white settler colonies of southern Africa, remained under white rule. The independence of Africa finally forced the West to consider Africans as real people, even if they were poor or powerless.