ABSTRACT

Those who work in allied health professions and education aim to make people’s lives better. Often, however, it is hard to know how effective this work has been: would change have occurred if there had been no intervention? Is it possible we are doing more harm than good? To answer these questions and develop a body of knowledge about what works, we need to evaluate interventions. Objective intervention research is vital to improve outcomes, but this is a complex area, where it is all too easy to misinterpret evidence. This book uses practical examples to increase awareness of the numerous sources of bias that can lead to mistaken conclusions when evaluating interventions. The focus is on quantitative research methods, exploring why both those receiving and those implementing an intervention behave in the ways they do. Evaluating What Works: An Intuitive Guide to Intervention Research for Practitioners illustrates how different research designs can overcome these issues, and points the reader to sources with more in-depth information. This book is intended for those with little or no background in statistics, to give them the confidence to approach statistics in published literature with a more critical eye, to recognise when more specialist advice is needed, and to communicate more effectively with statisticians.

Key Features:

  • Strong focus on quantitative research methods
  • Complements more technical introductions to statistics
  • Explains how quantitative studies are designed, and the biases and pitfalls they can involve

CONTENTS

Chapter 1: Introduction
Chapter 3: How to select an outcome measure
Chapter 8: The researcher as a source of bias
Chapter 11: The importance of variation
Chapter 12: Analysis of a two-group RCT
Chapter 15: Drawbacks of the two-arm RCT
Chapter 17: Adaptive designs
Chapter 18: Cluster randomized controlled trials
Chapter 19: Cross-over designs
Chapter 20: Single case designs
Chapter 21: Can you trust the published literature?
Chapter 22: Pre-registration and Registered Reports
Chapter 23: Reviewing the literature before you start
Chapter 24: Putting it all together