ABSTRACT

At the end of World War II, the field of labor economics was in a state of intellectual ferment as scholars weighed the consequences of industry’s new employment system. Nearly everyone agreed that a transformation had occurred, although some questioned whether the change was for the better. Sumner Slichter, for example, said in 1947 that union pressure had “converted millions of jobs which previously had been held on a day-to-day basis into lasting connections,” but a year later he warned that the surge of unionism meant that the United States was “gradually shifting from a capitalistic community to a laboristic one.”1 Others feared that, in the quest for security, too much had been relinquished: Labor market flexibility, economic efficiency, and even individual freedom were said to have declined. Clark Kerr viewed the spread of internal labor markets as part of a general trend “from the largely open to the partially closed society,” with “fraternity triumph[ing] over liberty as ‘no trespassing’ signs are posted in more and more job markets.”2

Fueling these concerns was a belief that the unions had succeeded almost too well in advancing labor’s interests. In the political arena, where they had become a powerful force, unions were seeking to expand programs (national unemployment insurance, higher minimum wages, full employment laws) that augmented the gains already made through collective bargaining. In the workplace, union members were protected by a host of employment security plans that nonunion companies matched to the extent they felt necessary. These plans, it was argued, had rigidified the labor market and immobilized the labor force: Workers could no longer afford to leave their jobs. Terming this state of affairs a new industrial feudalism, critics claimed that golden handcuffs now chained workers to their jobs.3