Sport and Exercise Medicine (SEM) has had a good run. For a while it was the low-cost magic bullet. With efficacy demonstrated in study after study, the conclusion was clear: ‘Exercise is Medicine’, a potential public health panacea.
Sadly, the early promise waned. While we continue to be bombarded by original research and reviews extolling the efficacy of exercise, there is an apparent dearth of evidence of its effectiveness. This gap is highlighted in 2014 reports from the UK Government1 and Public Health England.2
It is often argued that the major challenge to the effectiveness of exercise is adherence. Adherence to exercise, variously reported at between 40% and 50%,3 is no lower than that reported for drugs.4 However, while there is general confidence that licensed drugs are effective when taken, the reports cited above1,2 suggest that this confidence does not currently extend to exercise.
Confidence in drugs results from their demonstrating efficacy and effectiveness in clinical trials. Efficacy, demonstrated in phases I–III of a trial, refers to “the extent to which a drug has the ability to bring about its intended effect under ideal circumstances”.5 Effectiveness, demonstrated in phase IV studies, refers to “the extent to which a drug achieves its intended effect in the usual clinical setting”.5 Effectiveness is what matters to commissioners and patients.
The requirement for effectiveness (ie, phase IV) studies is well recognised.6 A substantial volume of social science research has examined real-world exercise interventions and therefore constitutes phase IV research. However, all too often the resultant data relate largely or exclusively to exercise behaviour, providing evidence of behavioural or implementation effectiveness but little evidence of clinical or treatment effectiveness.7 In all exercise interventions, exercise behaviour is the throughput; health status is the output. It is the outputs that matter most to stakeholders.
Furthermore, a recent review8 identified that many studies examining the treatment effectiveness of exercise in the real world adopt laboratory-style methods and controls that would be impractical and uneconomic in real-world interventions. Data from such studies merely add to the efficacy data set.
We argue that despite metaphorically drowning in evidence of efficacy and implementation effectiveness, SEM is yet to provide sufficient evidence of treatment effectiveness. Furthermore, while it is a mistake to confuse efficacy with effectiveness,9 in lobbying for exercise as a public health tool, we often do just that.
On the basis of the above we believe that SEM risks being side-lined in public health. If we are to provide critical life support to SEM—and arguably to beleaguered health services—that lifeline is the production of high-quality phase IV/effectiveness research.
A phase IV methodology applicable to a wide range of exercise interventions is the large simple trial (LST).10 LSTs are embedded in the delivery of treatment, make use of existing data and service infrastructure, demand little extra effort of practitioners and patients, and can be conducted at relatively modest cost (a factor that is critical at a time when commissioners legitimately question the allocation to research of funds better spent on care).10 While a randomised controlled trial maximises validity but has limited generalisability, and an observational study has limited validity but maximises generalisability, a well-conducted LST maximises both validity and generalisability.10 In effect, an LST combines a process evaluation, important to stakeholders, with a research study, important to science (perhaps ‘controlled evaluation’ would be an appropriate alternative descriptor).
No matter how efficacious an intervention during phases I–III of a clinical trial, if patients do not take it, or it does not demonstrate its effectiveness among those who do, it should not be commissioned. If the SEM community fails to provide evidence for the effectiveness of exercise, we could condemn subsequent generations of the population to increasingly complex and expensive biomedical interventions, with the associated likelihoods of poorer public health and greater health inequalities. We might also condemn our discipline to the status of a side show to the main event.
The ‘SEM community’ extends to journal editors, commissioners, practitioners and representative bodies, all of whom have a part to play. Journal editors must recognise the value of real-world research, and accept that it cannot meet the rigorous methodological standards of laboratory work. Public health commissioners should not only insist on evidence-based practice, but should insist that ongoing data capture is a feature of all commissioned interventions. Accordingly, practitioners and providers must become adept at embedding data capture and analysis into all relevant activity. Representative bodies must lobby government, health agencies and research councils to provide greater funding for effectiveness research.
However, while the contributions above are important, it is SEM researchers who must play the leading role. A commitment to conducting rigorous effectiveness studies may be critical if SEM is to avoid an inexorable decline into an early grave!
Footnotes
Contributors CB wrote substantial sections and takes responsibility for all content. SM wrote substantial sections. GW, LK, AL, AJ, SD and SW contributed directly and substantially to the intellectual content of the paper.
Competing interests None.
Provenance and peer review Not commissioned; externally peer reviewed.