
Trusting systematic reviews and meta-analyses: all that glitters is not gold!
  1. Adam Weir,
  2. Safia Rabia,
  3. Clare Ardern
  1. Aspetar Orthopaedic and Sports Medicine Hospital, Doha, Qatar
  1. Correspondence to Dr Adam Weir, Aspetar Orthopaedic and Sports Medicine Hospital, PO Box 29222, Doha, Qatar; adam.weir{at}aspetar.com


Systematic reviews provide Level 1 evidence and are firmly part of modern medical practice. Ideally, systematic reviews give readers comprehensive evidence summaries and can highlight research deficiencies. Busy clinicians welcome bite-sized summaries to inform their practice. As part of BJSM’s Education theme, we address the question ‘Should I trust this systematic review?’.

Systematic reviews are only as good as the papers they contain

Meta-analysis sits at the top of the evidence hierarchy, but the quality of any systematic review or meta-analysis is only as good as the studies identified and included. Summarising papers with a high risk of bias does not eliminate that bias; pooling their data (meta-analysis) actually compounds it.1 This basic fact is often overlooked.

Systematically reviewing systematic reviews

Systematic reviews in sports medicine and sports physiotherapy are at once a cause for celebration and concern. Celebration—because scientific evidence supports many treatments in our young profession. Concern—because there are some important weaknesses in our field. The conclusions of more than half of the 200 clinical sports medicine systematic reviews published between 2009 and 2013 in five major orthopaedic journals (note, not BJSM, but see below) were based only on level 4 or 5 evidence: expert opinion, case series and poor-quality cohort or case-control studies.2 Users of systematic reviews in sport and exercise medicine need to recognise these shortcomings.

AMSTAR scoring of 10 BJSM reviews

Turning the magnifying glass on BJSM, we conducted a mini-review of systematic reviews published in BJSM in 2014. Having first listed and numbered all articles, we selected 10 reviews using a random number generator. Two authors (AW and CA) then independently assessed these 10 systematic reviews (see online supplement for references) using the ‘Assessment of Multiple Systematic Reviews’ (AMSTAR) checklist.3 AMSTAR is an 11-item tool for evaluating the methodological quality of systematic reviews. Discrepancies were resolved by consensus to reach a final AMSTAR score for each review. Two of the reviews were narrative reviews and were excluded. The remaining eight reviews, including two on which we were co-authors, had a mean AMSTAR score of 4.6/11 (range 2–8).
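The sampling and scoring workflow described above can be sketched in a few lines of Python. This is illustrative only: the candidate article list and the individual AMSTAR scores below are invented stand-ins (chosen to mirror the reported mean of 4.6/11 and range 2–8), not the actual study data.

```python
import random
import statistics

# Hypothetical pool of 2014 BJSM systematic-review articles (stand-in titles;
# the real list and its length are not reported here).
articles = [f"Review {i}" for i in range(1, 41)]

# Step 1: number all articles and draw 10 at random, as in the mini-review.
random.seed(2014)  # fixed seed so the sketch is reproducible
sample = random.sample(articles, k=10)

# Step 2: two reviewers independently score each sampled review against the
# 11-item AMSTAR checklist, resolving discrepancies by consensus. After
# excluding the two narrative reviews, eight consensus scores remain.
# These eight values are invented to match the published summary statistics.
amstar_scores = [2, 3, 3, 4, 5, 5, 7, 8]

mean_score = statistics.mean(amstar_scores)
print(f"Mean AMSTAR: {mean_score:.1f}/11 "
      f"(range {min(amstar_scores)}-{max(amstar_scores)})")
# prints: Mean AMSTAR: 4.6/11 (range 2-8)
```

Drawing the sample with a seeded random number generator is one way to make the selection step auditable, in the same spirit as the transparency measures the editorial recommends.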

This raises some important issues. Only two of the eight reviews (25%) had registered or written a protocol before starting.4, 5 Only one review used duplicate study selection and data extraction. Two reviews (25%) did not assess the risk of bias in the included studies—one of the fundamental steps in systematically reviewing evidence. Publication bias was examined in only one review. None of the reviews assessed conflicts of interest in the included studies. These processes are all ways in which systematic review authors can reduce bias and improve quality. Bottom line? Readers should be familiar with the key methodological elements of systematic reviews to make informed judgements about quality.

How does BJSM compare with other journals?

The average AMSTAR score in our sample was 4.6/11. In contrast, the average AMSTAR score of the 200 systematic reviews assessed by DiSilvestro et al2 was 8.0/11. In that sample, duplicate data extraction (38%), providing a list of the studies included and excluded (28%) and assessment of publication bias (20%) were areas that could be improved. A review of 76 systematic reviews in the broader orthopaedic literature revealed an average AMSTAR score of 5.9/11.6 Momeni et al7 rated 42 hand surgery systematic reviews, which had a median score of 7/11. It is clear that there is room for quality improvement across the whole of the musculoskeletal medical literature.

The great unknown

Does fulfilling the Preferred Reporting Items for Systematic Reviews and Meta-Analyses (PRISMA) or AMSTAR criteria guarantee great content? Perhaps not, although this question has not been extensively explored. At the end of the day, a systematic review performed with perfect methods cannot compensate for a lack of high-quality studies.

Room for improvement—how do we write better reviews?

We propose a top 5 list to improve the quality of future reviews in our journal. Useful web links are provided below.

  1. Systematic review authors should familiarise themselves with the PRISMA and AMSTAR guidelines before commencing new reviews.

  2. All new reviews should be registered prospectively (on the PROSPERO site) to discourage undisclosed protocol changes and increase transparency.

  3. Two independent reviewers should always perform study selection and data extraction to reduce the risk of bias.

  4. The risk of bias in included studies should be assessed in all systematic reviews.

  5. Reviewers should not perform meta-analyses of studies with a high risk of bias, as pooling will compound the bias.

All that glitters is not gold

A fancy title and complex data pooling methods can never compensate for poor-quality studies included in a systematic review. Readers must critically appraise systematic reviews and meta-analyses for themselves. Reflecting this, and to help the sport and exercise medicine clinician confidently judge the quality of a systematic review, BJSM has published a guide for appraising systematic reviews.8

Links to useful websites:

References


Footnotes

  • Twitter Follow Clare Ardern at @clare_ardern

  • Contributors All the authors were responsible for the study design. AW and CA reviewed the 10 articles using AMSTAR. The results were collated by SR. All the authors approved the final draft.

  • Competing interests None declared.

  • Provenance and peer review Not commissioned; externally peer reviewed.
