ABSTRACT

There is little doubt that the U.S. health-care system has its share of problems (Arrow et al. 2009; Sainfort et al. 2008). Although the United States uses some of the most advanced medical technologies, has the largest medical workforce, and spends the largest proportion of its gross domestic product on health care, the World Health Organization (2000, 2009) rated the U.S. health-care system below most of the Western world in quality and performance. Moreover, widespread errors that result in avoidable injuries to patients and surging health-care costs have been extensively documented (Institute of Medicine 2001; Jha et al. 2009). In an effort to rectify several of these problems, there has been a push to develop better health-care information systems. Properly designed and implemented technologies can improve the quality of health care (Classen et al. 2007; Raymond and Dold 2002), decrease health-care costs (Jha et al. 2009; Meyer, Kobb, and Ryan 2002; Vaccaro et al. 2001), prevent medical errors (Agrawal 2009; Institute of Medicine 2001), and support the ever-growing demands placed on the health-care industry by governmental regulation and health-care consumers.