ABSTRACT

This chapter reports on a portion of a larger study of Canadian public relations professionals, focusing on the extent of systematic program evaluation within the Canadian public relations profession. We hypothesized that as top management support for systematic research in public relations increases, the use of scientific evaluation methods will increase accordingly. The survey instrument was designed to measure evaluation methods across four program content areas as well as practitioner roles. Reliability coefficients were determined for the practitioner role measures and for the program evaluation scales; the coefficients for practitioner roles closely parallel those found by G. M. Broom and D. M. Dozier, who used the longer version of the scale. The relationships between dominant roles and evaluation methods were explored, and a number of hypotheses were tested. Practitioners who assume a technical role are expected primarily to write and edit communication materials and to deal with the media.