ABSTRACT

Historically, roles in the workplace were starkly divided and very narrowly defined. In the field of medicine, doctors, regardless of specialty, were almost always male. In the 1960s, a roar of social independence burst forth in America, and perceptions of gender roles began to change. The field of medicine exploded with new specialties, with some disappearing and others combined into new areas of expertise. While insurance companies fought to preserve traditional plans with limited choices for customers, new vendors introduced creative programs offering a wider variety of services tailored to individual needs. Women were entering laboratories, operating rooms, and the executive suite, while men began moving into technical and support roles formerly dominated by women. Today, more women than ever hold executive and leadership positions in healthcare.