ABSTRACT

In Chapter 4, the authors present the eight basic rules of Participatory Action Monitoring and Evaluation (PAME). Rule 1 distinguishes between evaluative, diagnostic and scientific forms of inquiry, boundaries that are often blurred in practice. Rule 2 calls for ‘structured conversations’ that have a purpose other than simply extracting information to serve the interests of the few. The next three rules warn evaluators against relying on fixed methods, qualitative or quantitative, or simply mixing the two, without ever adapting them. PAME is instead a craft involving the development of novel methodologies that fit each context and help groups better assess progress against goals. Rules 6 and 7 question the meaning and merit of ‘performance indicators’ of project or programme outcomes. The use of numbers and facts as though they ‘speak for themselves’, and the attribution of simple causation and credit for results observed in complex settings, are also examined. The last rule concerns the value of ‘iterative’ assessments – of deciding and planning to decide and plan later, if and when circumstances require. The chapter ends with a tool called PIE (Planning, Inquiry, Evaluation) inspired by these rules, illustrated with a critical assessment of the learning culture of Canadian volunteer-sending organizations.