Abstract
Good evaluation practice in public health research has become equated with the inclusion of a mixed-methods 'process evaluation' alongside an 'outcome evaluation' to gather data on how and why interventions are effective or ineffective. While the incorporation of process evaluations in randomized controlled trials is to be welcomed, there is a danger that they are being oversold. The problematic position of process evaluations is illustrated by data from an evaluation of an unsuccessful schools health promotion intervention. Process evaluation data (designed to 'explain' the outcome evaluation results) must typically be collected before the outcome evaluation results are available: unanticipated outcomes cannot always be addressed satisfactorily from prior process data. Further, qualitative process evaluations draw general inferences inductively from particular circumstances, and the generalizability of those inferences is therefore uncertain: qualitative data can deepen our understanding of quantitative data, but the commensurability of the two classes of data remains problematic.
| Original language | English |
| --- | --- |
| Pages (from-to) | 699-713 |
| Number of pages | 15 |
| Journal | Qualitative Research |
| Volume | 10 |
| Issue number | 6 |
| DOIs | |
| Publication status | Published - 1 Dec 2010 |
Keywords
- complex interventions
- drugs
- mixed methods
- process evaluation
- public health
- schools
- smoking
- triangulation
ASJC Scopus subject areas
- Social Sciences (miscellaneous)
- History and Philosophy of Science