
The training—injury prevention paradox: should athletes be training smarter and harder?
Tim J Gabbett1,2
  1. School of Exercise Science, Australian Catholic University, Brisbane, Queensland, Australia
  2. School of Human Movement Studies, University of Queensland, Brisbane, Queensland, Australia
Correspondence to Dr Tim J Gabbett, School of Exercise Science, Australian Catholic University, 1100 Nudgee Road, Brisbane, QLD 4014, Australia; tim_gabbett@yahoo.com.au

Abstract

Background There is dogma that higher training load causes higher injury rates. However, there is also evidence that training has a protective effect against injury. For example, team sport athletes who performed more than 18 weeks of training before sustaining their initial injuries were at reduced risk of sustaining a subsequent injury, while high chronic workloads have been shown to decrease the risk of injury. Second, across a wide range of sports, well-developed physical qualities are associated with a reduced risk of injury. Clearly, for athletes to develop the physical capacities required to provide a protective effect against injury, they must be prepared to train hard. Finally, there is also evidence that under-training may increase injury risk. Collectively, these results emphasise that reductions in workloads may not always be the best approach to protect against injury.

Main thesis This paper describes the ‘Training-Injury Prevention Paradox’ model: a phenomenon whereby athletes accustomed to high training loads have fewer injuries than athletes training at lower workloads. The model is based on evidence that non-contact injuries are not caused by training per se, but more likely by an inappropriate training programme. Excessive and rapid increases in training loads are likely responsible for a large proportion of non-contact, soft-tissue injuries. If training load is an important determinant of injury, it must be accurately measured up to twice daily and over periods of weeks and months (a season). This paper outlines ways of monitoring training load (‘internal’ and ‘external’ loads) and suggests capturing both recent (‘acute’) training loads and more medium-term (‘chronic’) training loads to best capture the player's training burden. I describe the critical variable, the acute:chronic workload ratio, as a best practice predictor of training-related injuries. This provides the foundation for interventions to reduce players' risk, and thus, time-loss injuries.

Summary The appropriately graded prescription of high training loads should improve players’ fitness, which in turn may protect against injury, ultimately leading to (1) greater physical outputs and resilience in competition, and (2) a greater proportion of the squad available for selection each week.

This is an Open Access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/


Training–performance relationship

In a British Journal of Sports Medicine blog, Dr John Orchard1 proposed hypothetical relationships between training (both under-training and over-training), injury, fitness and performance. He speculated that both inadequate and excessive training loads would result in increased injuries, reduced fitness and poor team performance (see figure 1). The relationship between training load, injury, fitness and performance is critical to sports medicine/physiotherapy and sport science practitioners. In this paper I use the term ‘practitioners’ to refer to the wide gamut of health professionals and also sport scientists who work with athletes/teams (ie, strength and conditioning coaches, certified personal trainers, etc). Our field, sports performance and sports injury prevention, is a multidisciplinary one and this paper is relevant to the field broadly.

Figure 1

Hypothetical relationship between training loads, fitness, injuries and performance. Redrawn from Orchard.1

Injuries impair team performance, but any injuries that could potentially be considered ‘training load-related’ are commonly viewed as ‘preventable’, and therefore the domain of the sport science and medicine team. Sport science (including strength and conditioning) and sports medicine (including doctors and physiotherapists) practitioners share a common goal of keeping players injury free. Sport science and strength and conditioning staff aim to develop resilience through exposing players to physically intense training to prepare players for the physical demands of competition, including the most demanding passages of play.

On the other hand, doctors and physiotherapists are often viewed as the staff responsible for ‘managing players away from injury’. A stereotype is the physiotherapist or doctor advocating to reduce training loads so that fewer players will succumb to ‘load-related’ (eg, overuse) injuries. However, how many of the decisions governing players and their individual training loads are based on empirical evidence or the practitioners’ ‘expert’ intuition (ie, ‘gut feel’)?

Banister et al2 proposed that the performance of an athlete in response to training can be estimated from the difference between a negative function (‘fatigue’) and a positive function (‘fitness’). The ideal training stimulus ‘sweet spot’ is the one that maximises net performance potential by having an appropriate training load while limiting the negative consequences of training (ie, injury, illness, fatigue and overtraining).3

Several studies have investigated the influence of training volume, intensity and frequency on athletic performance, with performance generally improving as training load increases.4–10 In individual sports (eg, swimming and running) greater training volume4,8 and higher training intensity5,6,8 improved performance. In a study of 56 runners, cyclists and speed skaters undertaking 12 weeks of training, a 10-fold increase in training load was associated with an approximately 10% improvement in performance.10 In competitive swimmers, significant associations were found between greater training volume (r=0.50–0.80), higher training intensity (r=0.60–0.70) and improved performance.9 However, adverse events of exercise training are also dose related, with the highest incidence of illness and injury occurring when training loads were highest.10–15

Training loads can be measured in different ways

Sport scientists typically obtain measurements of a prescribed external training load (ie, physical ‘work’), accompanied by an internal training load (ie, physiological or perceptual ‘response’). External training loads may include total distance run, the weight lifted or the number and intensity of sprints, jumps or collisions (to name a few).16 Internal training loads include ratings of perceived exertion and heart rate. The individual characteristics of the athlete (eg, chronological age, training age, injury history and physical capacity) combined with the applied external and internal training loads determine the training outcome.16

For example, identical external training loads could elicit considerably different internal training loads in two athletes with vastly different individual characteristics; the training stimulus may be appropriate for one athlete, but inappropriate (either too high or too low) for another. An overweight, middle-aged male will have very different physiological and perceptual responses to an 800 m effort than a trained runner. Although the external training load is identical, the internal training load will be much higher in the older, unfit individual! As the dose–response to training varies between individuals, training should be prescribed on an individual basis.

External training load—‘tracking’ every metre!

Global positioning systems (GPS) have been a ‘game-changer’ in the monitoring of external loads.17 These devices, which are typically no larger than a mobile phone, are worn by athletes during training and match-play activities. GPS provides information on speed and distances covered, while inertial sensors (ie, accelerometers, gyroscopes) embedded in the devices also provide information on non-locomotor sport-specific activities (eg, jumps in volleyball, collisions in rugby and strokes in swimming).18 Importantly, most of this data can be obtained in ‘real-time’ to ensure athletes are meeting planned performance targets.

Internal training load—the athlete's perception of effort

The session-rating of perceived exertion (RPE) has been used to quantify the internal training loads of athletes. At the completion of each training session, athletes provide a 1–10 ‘rating’ on the intensity of the session. The intensity of the session is multiplied by the session duration to provide training load. The units are ‘RPE units×minutes’ and in football codes generally range between 300 and 500 units for lower-intensity sessions and 700–1000 units for higher-intensity sessions. For ease, we have referred to them as ‘arbitrary units’ in previous work. A more accurate term might be ‘exertional minutes’. The value of session-RPE will depend on the goal of those measuring it and that topic is beyond the scope of this paper.
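
As a minimal illustration of this arithmetic (a hypothetical sketch; the function name and session values below are mine, not taken from any of the studies cited), a week of session-RPE loads might be totalled as follows:

# Session-RPE internal load: intensity rating (1-10) multiplied by session duration (minutes).
# The sessions listed here are illustrative only.
def session_load(rpe: float, duration_min: float) -> float:
    """Return the session-RPE load in 'exertional minutes' (RPE x minutes)."""
    return rpe * duration_min

week_sessions = [(4, 90), (7, 75), (5, 60), (8, 80)]  # (RPE, duration in minutes)
weekly_load = sum(session_load(rpe, mins) for rpe, mins in week_sessions)
print(f"Weekly internal load: {weekly_load:.0f} exertional minutes")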

Monitoring individual athlete well-being

Monitoring athlete well-being is common practice in high performance sport.19–21 A wide range of subjective questionnaires are used, with many of them employing a simple 5-, 7- or 10-point Likert scale.19–23 Longer, more time-consuming surveys are also employed.24,25

These questionnaires are used to determine the readiness of team sport athletes to train. Typically, players report their mood, stress level, energy, sleep and diet, along with their feelings of soreness in the upper body, quadriceps, hamstrings, groin and calf. The sum of the item scores indicates the athlete's overall well-being. Practitioners can then adapt the training prescription for players on an individual basis (eg, continue regular training, investigate training loads or modify the training programme).
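
A minimal sketch of how such a questionnaire might be aggregated is shown below; the item names, scale direction and flagging rule are illustrative assumptions rather than values taken from the questionnaires cited.

# Hypothetical wellness questionnaire: five items scored 1 (poor) to 5 (good).
# Item names, the scale and the flagging rule are assumptions for illustration.
items = {"mood": 4, "stress": 3, "energy": 4, "sleep": 2, "soreness": 3}

wellbeing_score = sum(items.values())   # higher score = better readiness to train
baseline = 18                           # eg, the player's rolling average score

if wellbeing_score < baseline - 3:      # arbitrary illustrative threshold
    action = "investigate loads / modify training programme"
else:
    action = "continue planned training"
print(f"Well-being {wellbeing_score}/25 -> {action}")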

Relationship between training loads and injury

Training load monitoring is increasingly popular in high performance sport to ensure athletes achieve an adequate training stimulus and to minimise the negative consequences of training (injury risk, overtraining). In the following section, I discuss the relationship between training loads (both internal and external loads) and injury in team sport athletes.

External workloads and injury

In elite rugby league, players who performed greater amounts (>9 m) of very high-speed (>7 m/s) running per session were 2.7 times more likely to sustain a non-contact, soft-tissue injury than players who performed less very high-speed running per session (table 1).26 This ‘threshold’ of 9 m of very high-speed running is lower than what would typically be performed in other team sport training sessions (eg, soccer and Australian football),27 and likely reflects the greater contact and repeated-effort demands, and lower running demands, of rugby league.27 By comparison, in studies of Australian football players, higher 3-weekly total distance (73 721–86 662 m, OR=5.5) and 3-weekly sprint distance (>1453 m, OR=3.7) were associated with a greater risk of injury.13

Table 1

Relationship between external workloads and risk of injury in elite rugby league players

Although external loads are commonly measured using GPS devices, some team sports expose athletes to physically demanding external loads that require very little high-speed running (eg, baseball pitching, cricket fast bowling). In the studies that have been performed in baseball,28–30 greater pitch counts were associated with greater injury rates. Youth pitchers who pitched more than 100 innings in a season had 3.5 times the injury risk of players who pitched fewer than 100 innings.30

Similar findings have been observed in cricket players; fast bowlers who bowled more than 50 overs in a match were at increased risk of injury for up to 28 days (OR=1.62).31 Furthermore, bowlers who bowled more deliveries in a week (>188 deliveries, relative risk=1.4) and had less recovery between sessions (<2 days, relative risk=2.4) were at greater injury risk than those who bowled between 123 and 188 deliveries per week and had 3–3.99 days recovery between sessions. Complicating this issue is that bowlers who bowled fewer deliveries each week (<123 deliveries, relative risk=1.4) and had greater recovery (>5 days, relative risk=1.8) were also at increased risk of injury.32

Internal workloads and injury

These findings on external loads are consistent with results from studies on internal loads; higher training loads were associated with greater injury rates.11,15,33–36 In early work,11 a strong relationship (r=0.86) was reported between training loads (derived from the session-RPE) and training injury rates across a playing season in semiprofessional rugby league players (figure 2). Furthermore, over a 3-year period, reduced training loads markedly reduced injury rates in the same cohort of players (figure 3).37 It is likely that excessive training loads performed early in the study led to overtraining, resulting in a spike in injury rates. However, it should be noted that this study was published over 10 years ago, and no subsequent study has replicated these results.

Figure 2

Relationship between training load and injury rate in team sport athletes. Training loads were measured using the session-rating of perceived exertion method. Redrawn from Gabbett.11

Figure 3

Influence of reductions in preseason training loads on injury rates and changes in aerobic fitness in team sport athletes. Training loads were measured using the session-rating of perceived exertion method. Redrawn from Gabbett.37

In professional rugby union players, higher 1-week (>1245 arbitrary units) and 4-week cumulative loads (>8651 arbitrary units) were associated with a higher risk of injury.14 In professional rugby league players, training load was associated with overall injury (r=0.82), non-contact field injury (r=0.82), and contact field injury (r=0.80) rates.35 Significant relationships were also observed between the field training load and overall field injury (r=0.68), non-contact field injury (r=0.65), and contact field injury (r=0.63) rates. Strength and power training loads were significantly related to the incidence of strength and power injuries (r=0.63). There was no significant relationship between field training loads and the incidence of strength and power injuries. However, strength and power training loads were significantly associated with the incidence of contact (r=0.75) and non-contact (r=0.87) field training injuries. Collectively, these findings suggest that (1) the harder rugby league players train, the more injuries they will sustain and (2) high strength and power training loads may contribute indirectly to field injuries. Monitoring of training loads and careful scheduling of field and gymnasium sessions to avoid residual fatigue is warranted to minimise the effect of training-related injuries on professional rugby league players.

Differences in training adaptations between younger and older athletes

The age of the athlete influences adaptations to training.38,39 Gabbett38 investigated training loads, injury rates and physical performance changes associated with a 14-week field conditioning programme in junior (approximately 17 years) and senior (approximately 25 years) rugby league players. Training improved muscular power and maximal aerobic power in both junior and senior players; however, the improvements in these qualities were greatest in the junior players. Training loads and injury rates were higher in the senior players. Thus, junior and senior rugby league players may adapt differently to a given training stimulus, suggesting that training programmes should be modified to accommodate differences in training age.

Rogalski et al39 also showed that at a given training load, older and more experienced (7+ years’ experience in the Australian Football League competition) players were at greater risk of injury than less experienced, younger (1–3 years’ experience) players. It is likely that the higher training injury risk in the more experienced players is confounded by previous injury, which is a major risk factor for a new injury.40 Older players had likely experienced a greater number of injuries across the course of their careers than the less experienced first to third year players. Clearly, further research investigating the dose–response relationship between training and injury in athletes of different ages and genders is warranted.41

Modelling the training load–injury relationship and using it to predict injury

This section focuses on the use of training monitoring to model the relationship between load and injury risk.

Over a 2-year period, Gabbett42 used the session-RPE to model the relationship between training loads and the likelihood of injury in elite rugby league players. Training load and injury data were modelled using a logistic regression model with a binomial distribution (injury vs no injury) and logit link function, with data divided into preseason, early competition and late competition phases.
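
For readers who wish to reproduce this kind of analysis on their own data, a hedged sketch using standard statistical tooling is given below; the file name, column names and package choice are assumptions, and the original analysis may have been performed with different software.

# Sketch of a binomial (logit link) model of injury occurrence against weekly training load.
# The CSV file and column names are hypothetical placeholders. In the original study,
# separate models were fitted for the preseason, early competition and late competition
# phases (eg, by filtering on the phase column before fitting).
import pandas as pd
import statsmodels.api as sm

df = pd.read_csv("weekly_loads.csv")   # columns: training_load, injured (0/1), phase

X = sm.add_constant(df[["training_load"]])
model = sm.GLM(df["injured"], X, family=sm.families.Binomial()).fit()
print(model.summary())

# Predicted likelihood of injury across a range of weekly loads
new_data = sm.add_constant(pd.DataFrame({"training_load": range(1000, 6001, 500)}))
print(model.predict(new_data))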

Players were 50–80% likely to sustain a preseason injury within the weekly training load range of 3000 to 5000 arbitrary units (RPE×minutes, as above). These training load ‘thresholds’ for injury were considerably lower (1700–3000 arbitrary units/week) in the competitive phase of the season. Importantly, on the steep portion of the sigmoidal training load–injury curve, very small changes in training load resulted in very large changes in injury risk (figure 4).

Figure 4

Relationships between training load, training phase, and likelihood of injury in elite team sport athletes. Training loads were measured using the session-rating of perceived exertion method. Players were 50–80% likely to sustain a preseason injury within the training load range of 3000–5000 arbitrary units. These training load ‘thresholds’ were considerably reduced (1700–3000 arbitrary units) in the competitive phase of the season (indicated by the arrow and shift of the curve to the left). On the steep portion of the preseason training load-injury curve (indicated by the grey-shaded area), very small changes in training load result in very large changes in injury risk. Pre-Season Model: Likelihood of Injury=0.909327/(1+exp(−(Training Load−2814.85)/609.951)). Early Competition Model: Likelihood of Injury=0.713272×(1−exp(−0.00038318×Training Load)). Late Competition Model: Likelihood of Injury=0.943609/(1+exp(−(Training Load−1647.36)/485.813)). Redrawn from Gabbett.42
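
The three equations quoted in the caption can be evaluated directly. A small sketch follows (the function names are mine); it simply re-expresses the published curves so that the likelihood of injury for a given weekly load can be read off in each phase.

import math

# Injury-likelihood models from figure 4; weekly load is in session-RPE (arbitrary) units.
def preseason(load):
    return 0.909327 / (1 + math.exp(-(load - 2814.85) / 609.951))

def early_competition(load):
    return 0.713272 * (1 - math.exp(-0.00038318 * load))

def late_competition(load):
    return 0.943609 / (1 + math.exp(-(load - 1647.36) / 485.813))

for load in (1000, 2000, 3000, 4000, 5000):
    print(load, round(preseason(load), 2), round(early_competition(load), 2),
          round(late_competition(load), 2))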

Training load and injury data were prospectively recorded over a further two competitive seasons in those elite rugby league players. An injury prediction model based on planned and actual training loads was developed and implemented to determine if non-contact, soft-tissue injuries could be predicted. One-hundred and fifty-nine non-contact, soft-tissue injuries were sustained over those two seasons. The percentage of true-positive predictions was 62% (N=121) and the false-positive and false-negative predictions were 13% (N=20) and 11% (N=18), respectively. Players who exceeded the weekly training load threshold were 70 times more likely to test positive for non-contact, soft-tissue injury, while players who did not exceed the training load threshold were injured 1/10 as often (table 2). Furthermore, following the introduction of this model, the incidence of non-contact, soft-tissue injuries was halved.42

Table 2

Accuracy of model for predicting non-contact, soft-tissue injuries

We also analysed the prevalence of injury and the predictive ratios obtained from the model. The prevalence of injury in this sample of professional rugby league players was 8.6%. If the predictive equation was positive for a given player, the likelihood of injury increased from 8.6% to 86%, and if the results of the test were negative, the likelihood of injury decreased from 8.6% to 0.1%. Furthermore, 87% (121 from 139 injuries) of the 8.6% of players who sustained an injury were correctly identified by the injury prediction model.

Although several commercially available software programs claim to predict training load-related injuries, to date, this is the only study to predict injury based on training load data, apply that model in a high performance sporting environment, and then report the results in a peer-reviewed journal. We acknowledge that any regression model that predicts injury is best suited to the population from which it is derived. Caution should be applied when extrapolating these results to other sports and populations. Despite this potential limitation, these findings provide information on the training dose–response relationship in elite rugby league players, and a scientific method of monitoring and regulating training load in these athletes. Importantly, in a team environment, this approach allows players to be managed on an individual basis.

The critical element of week-to-week change (usually increases!) in training load

Accepting that high absolute training loads are associated with greater injury risk,42 strength and conditioning practitioners must also consider how week-to-week changes in training load independently influence injury risk (aside from total training load). In a study of Australian football players, Piggott et al34 showed that 40% of injuries were associated with a rapid change (>10%) in weekly training load in the preceding week. Rogalski et al39 also showed that larger 1-weekly (>1750 arbitrary units, OR=2.44–3.38), 2-weekly (>4000 arbitrary units, OR=4.74) and previous-to-current-week changes in internal load (>1250 arbitrary units, OR=2.58) were related to a greater risk of injury. Large week-to-week changes in training load (1069 arbitrary units) also increased the risk of injury in professional rugby union players.14 We have also modelled the relationship between changes in weekly training load (reported as a percentage of the previous week’s training load) and the likelihood of injury (unpublished observations). When training load was fairly constant (ranging from 5% less to 10% more than the previous week) players had <10% risk of injury (figure 5). However, when training load was increased by ≥15% above the previous week's load, injury risk escalated to between 21% and 49%. To minimise the risk of injury, practitioners should limit weekly training load increases to <10%.
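
A minimal sketch of this check is given below; the weekly loads are invented for illustration, and the 10% guideline is the one quoted above.

# Week-to-week change in training load, expressed relative to the previous week's load.
# The weekly loads below are hypothetical; the 10% guideline is the one quoted in the text.
def weekly_change(previous: float, current: float) -> float:
    """Percentage change from last week's load to this week's load."""
    return (current - previous) / previous * 100

weekly_loads = [2400, 2500, 2700, 3200]   # session-RPE loads (arbitrary units per week)
for prev, curr in zip(weekly_loads, weekly_loads[1:]):
    change = weekly_change(prev, curr)
    flag = "caution: increase exceeds 10%" if change > 10 else "within guideline"
    print(f"{prev} -> {curr}: {change:+.1f}% ({flag})")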

Figure 5

Likelihood of injury with different changes in training load. Unpublished data collected from professional rugby league players over three preseason preparation periods. Training loads were measured using the session-rating of perceived exertion method. Training loads were progressively increased in the general preparatory phase of the preseason (ie, November through January) and then reduced during the specific preparatory phase of the preseason (ie, February). The training programme progressed from higher volume-lower intensity activities in the general preparatory phase to lower volume-higher intensity activities in the specific preparatory phase. Each player participated in up to five organised field training sessions and four gymnasium-based strength and power sessions per week. Over the three preseasons, 148 injuries were sustained. Data are reported as likelihoods ±95% CIs.

Considering both acute and chronic training load: a better way to model the training–injury relationship?

Is there a benefit in modelling the training–injury relationship using a combination of both acute and chronic training loads? Acute training loads can be as short as one session, but in team sports, 1 week of training appears to be a logical and convenient unit. Chronic training loads represent the rolling average of the most recent 3–6 weeks of training. In this respect, chronic training loads are analogous to a state of ‘fitness’ and acute training loads are analogous to a state of ‘fatigue’.2

Comparing the acute training load to the chronic training load as a ratio provides an index of athlete preparedness. If the acute training load is low (ie, the athlete is experiencing minimal ‘fatigue’) and the rolling average chronic training load is high (ie, the athlete has developed ‘fitness’), then the athlete will be in a well-prepared state. The ratio of acute:chronic workload will be around 1 or less. Conversely, if the acute load is high (ie, training loads have been rapidly increased resulting in ‘fatigue’) and the rolling average chronic training load is low (ie, the athlete has performed inadequate training to develop ‘fitness’), then the athlete will be in a fatigued state. In this case the ratio of the acute:chronic workload will exceed 1. The use of the acute:chronic workload ratio emphasises both the positive and negative consequences of training. More importantly, this ratio considers the training load that the athlete has performed relative to the training load that he or she has been prepared for.43
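
A minimal sketch of the calculation is given below; it assumes weekly session-RPE loads and a 4-week rolling chronic window that includes the current week, which is one common implementation of the ratio rather than the only one.

# Acute:chronic workload ratio: the most recent week's load divided by the rolling
# average of the most recent weeks (a 4-week window is used here; the text cites 3-6 weeks).
def acute_chronic_ratio(weekly_loads, chronic_weeks: int = 4) -> float:
    acute = weekly_loads[-1]                                      # 'fatigue'
    chronic = sum(weekly_loads[-chronic_weeks:]) / chronic_weeks  # 'fitness'
    return acute / chronic

history = [1800, 2000, 2100, 2900]   # hypothetical weekly loads; last value = current week
print(round(acute_chronic_ratio(history), 2))   # 2900 / 2200 = ~1.32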

The first study to investigate the relationship between the acute:chronic workload ratio and injury risk was performed on elite cricket fast bowlers.43 Training loads were estimated from both session-RPE and balls bowled. When acute workload was similar to, or lower than the chronic workload (ie, acute:chronic workload ratio ≤0.99) the likelihood of injury for fast bowlers in the next 7 days was approximately 4%. However, when the acute:chronic workload ratio was ≥1.5 (ie, the workload in the current week was 1.5 times greater than what the bowler was prepared for), the risk of injury was 2–4 times greater in the subsequent 7 days.43

Using total weekly distance as a predictor variable, almost identical results have been found in elite rugby league44 and soccer45 players; ‘spikes’ in acute load relative to chronic load (ie, when the acute:chronic workload ratio exceeded 1.5) were associated with an increased risk of injury.

Taken from three different sports (cricket, Australian football and rugby league), a guide to interpreting and applying acute:chronic workload ratio data is shown in figure 6.46 In terms of injury risk, acute:chronic workload ratios within the range of 0.8–1.3 could be considered the training ‘sweet spot’, while acute:chronic workload ratios ≥1.5 represent the ‘danger zone’. To minimise injury risk, practitioners should aim to maintain the acute:chronic workload ratio within a range of approximately 0.8–1.3. It is possible that different sports will have different training load–injury relationships; until more data is available, applying these recommendations to individual sport athletes should be performed with caution.

Figure 6

Guide to interpreting and applying acute:chronic workload ratio data. The green-shaded area (‘sweet spot’) represents acute:chronic workload ratios where injury risk is low. The red-shaded area (‘danger zone’) represents acute:chronic workload ratios where injury risk is high. To minimise injury risk, practitioners should aim to maintain the acute:chronic workload ratio within a range of approximately 0.8–1.3. Redrawn from Blanch and Gabbett.46
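
As a small illustration of how figure 6 might be applied in practice, the sketch below maps a ratio onto the zones described above; the band boundaries are those quoted, while the handling of values that fall outside both bands is my assumption.

# Classify an acute:chronic workload ratio against the zones described in figure 6:
# ~0.8-1.3 'sweet spot', >=1.5 'danger zone'. Labelling everything else 'monitor'
# is an assumption for illustration.
def classify_acwr(ratio: float) -> str:
    if 0.8 <= ratio <= 1.3:
        return "sweet spot: relatively low injury risk"
    if ratio >= 1.5:
        return "danger zone: spike in load, elevated injury risk"
    return "monitor: outside the sweet spot but below the danger zone"

for ratio in (0.7, 1.0, 1.4, 1.7):
    print(ratio, "->", classify_acwr(ratio))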

The balance between injury prevention and high performance: training too much or not training enough

Successful sporting teams report lower injury rates and greater player availability than unsuccessful teams.47–49 Although the evidence linking greater training loads with high injury rates is compelling, focusing on the negative aspects of training detracts from the many positive adaptations that arise from the training process. In addition, there are several reasons why the results linking high training loads to injury should be taken in context with the wide range of performance issues relevant to sport. Wrapping players in cotton wool will not bring on-field success. How can practitioners help coaches train players at the ideal level (maximising performance while also maintaining a low risk of non-contact soft-tissue injuries)?

Does this mean athletes should stop training?!

Although studies have shown a positive relationship between training load and injury, there is also evidence demonstrating that training has a protective effect against injury. The results of these studies should be considered when evaluating the influence of high training workloads on injury risk:

  1. Team sport athletes who performed more than 18 weeks of training before sustaining their initial injuries were at reduced risk of sustaining a subsequent injury.50 These findings are consistent with others43,44 who have shown that high chronic workloads may decrease the risk of injury. Furthermore, greater training prior to entering an elite junior soccer programme was associated with a decreased risk of developing groin pain.51

  2. Across a wide range of sports, well-developed physical qualities are associated with a reduced risk of injury.50,52–54 Clearly, for athletes to develop the physical capacities required to provide a protective effect against injury, they must be prepared to train hard.

  3. Importantly, there is evidence that both over-training and under-training may increase injury risk.14,28,32 For example, cricket fast bowlers who bowled fewer deliveries per week with greater recovery between sessions were at an increased risk of injury, while bowlers who bowled more deliveries per week with less recovery between sessions were also at an increased risk of injury. Similar findings have been reported in baseball and rugby union.14,28 The ‘U’-shaped relationship between workload and injury evident in these data demonstrates that both inadequate and excessive workloads are associated with injury.

Collectively, these results emphasise that reductions in workloads may not always be the best approach to protect against injury. How do practitioners find the ‘sweet spot’ of training load?

Training smarter and harder—the mechanisms that may underpin these findings

Although high training loads have been associated with higher injury rates, results are equivocal, with recent evidence also demonstrating a protective effect of high chronic training loads.43,44 In this section I elaborate on the data shown in tables 1 and 2. Table 1 above shows that players who performed greater amounts of very high-speed running were 2.7 times more likely to sustain a non-contact soft-tissue injury than players with lower running loads.26 Given the high risk of injury with greater running loads, it is tempting to suggest that athletes should avoid very high-speed running in training to minimise the risk of injury. However, restricting running loads in an attempt to reduce injury risk may leave players underprepared for the critical passages of play that demand maximal effort, inadvertently placing them at greater risk of injury.

Greater amounts of very high-speed running may be associated with increased injury risk; however, there is evidence (from the same data set) of lower injury risk when players performed greater amounts of low-intensity activity and short acceleration efforts.26 High-intensity team sports such as soccer, basketball and the rugby codes require players to perform short (2–3 s) acceleration efforts,55 followed by longer durations of lower intensity activity.56 In competition, longer high-speed efforts are uncommon.57

Given that high training loads can be achieved in different ways (ie, volume, intensity and frequency of training, as well as the balance of training activities performed) it is inappropriate to consider all ‘high training loads’ as carrying identical injury risk. To be explicit, ‘high training loads’ per se may not be the largest contributing factor to increased injury risk, but rather the type of ‘high training load’ that is prescribed may be an important predictor of injury. Greater amounts of short, high-intensity acceleration effort training and game-specific aerobic activity may provide team sport athletes with the appropriate physical qualities to not only perform at a high level, but also protect against injury.

Table 2 illustrates the accuracy of an injury prediction model. It demonstrates that the training load model was both sensitive and specific for predicting non-contact, soft tissue injuries. However, the injury prediction model was far better at identifying when injuries were unlikely to occur (ie, true negatives) than it was at predicting injuries. These findings are intuitive; if performance staff focus on injury prevention, and prevent injuries through ‘managing athletes away from training’, then the low numbers of training-load related injuries may be expected, as athletes are unlikely to ever train with adequate volumes or intensities to sustain an injury.

Equally, note that in figure 4, on the steep portion of the training load–injury curve, small changes in training load (either increases or decreases) result in large changes in injury risk (in the respective direction). Under-emphasised in this study was that, due to the sigmoidal nature of the curve, the training load–injury relationship is almost completely ‘flat’ at large training loads. On this portion of the curve, large changes in training load result in very small changes in injury risk. Thus, if athletes can safely train through the ‘high risk’ portions of the curve (using the acute:chronic workload ratio model), then they may develop greater resilience and training tolerance.

Although injury prediction models may have sufficient predictive accuracy to warrant systematic use in an elite team sport programme, a fine balance exists between training, detraining and overtraining. Training programmes must be physiologically and psychologically appropriate58 to allow players to cope with the demands of competition. With this in mind, it may be argued that it is worthwhile using preseason training and training camps to prescribe high (although not excessive) training loads in order to identify which players are most susceptible to injury under physically stressful conditions (and are therefore least likely to tolerate the intensity and fatigue of competition), and which players are not (and are therefore more likely to tolerate it).

A new view of training—a ‘vaccine’ against injuries!

This paper proposes the training-injury prevention paradox. Physically hard (and appropriate) training may protect against injuries. There is no disputing that high training loads are generally associated with better developed fitness and thus, good performance. One cost of high training load is often considered to be soft tissue injury risk. To address this risk, training loads could be reduced to decrease the incidence of injury; however, low training loads (in the form of reduced training volumes) have also been associated with increased injury risk, and exposing players to low training loads may place them at risk of further injury. Once players enter the rehabilitation process, it is a challenge for practitioners to expose them to loads that are sufficient to enhance the physical qualities which provide a protective effect against injury, while preventing the ‘spike’ in loads when players return to full training. As a result, it is not uncommon for teams to have a constant ‘rehab-er’ in their squad: a player who breaks down repeatedly (potentially with different injuries) because his or her training load is not high enough to adapt to match demands. The data presented suggest that prescribing high training loads can lead to improved levels of fitness, which in turn offers a protective effect against injury, ultimately leading to (1) greater physical outputs and resilience in competition, and (2) a greater proportion of the squad available for selection each week (figure 7).

Figure 7

Relationship between physical qualities, training load, and injury risk in team sport athletes.

Conclusions

In conclusion, while there is a relationship between high training loads and injury, this paper demonstrates that the problem is not with training per se, but more likely with the inappropriate training that is being prescribed. Excessive and rapid increases in training loads are likely responsible for a large proportion of non-contact, soft-tissue injuries. However, physically hard (and appropriate) training develops physical qualities, which in turn protect against injuries. This paper highlights the importance of monitoring training load, including the load that athletes are prepared for (by calculating the acute:chronic workload ratio), as a best practice approach to the long-term reduction of training-related injuries.

What are the findings?

  • Dogma exists around the effects of high (and low) training loads on injury.

  • This review highlights the positive and negative effects of high training loads on injury risk, fitness and thus, performance.

  • There is a relationship between high training loads and injuries but well-developed physical qualities protect against injury.

  • The ratio of acute to chronic training load is a better predictor of injury than acute or chronic loads in isolation.

How might it impact on clinical practice in the future?

  • In many high performance settings, training loads are reported on a week-to-week basis. Recording acute and chronic training loads, and modelling the acute:chronic workload ratio allows practitioners to determine if athletes are in a state of ‘fitness’ (ie, net training recovery, lower than average risk of injury) or ‘fatigue’ (ie, net training stress, higher than average risk of injury).

  • The Training-Injury Prevention Paradox Model allows practitioners to monitor and prescribe training to team sport athletes on an individual basis.

  • Providing evidence around the effects of acute and chronic training load on injury risk, physical fitness and performance will allow practitioners to systematically prescribe high training loads while minimising the risk of athletes sustaining a ‘load-related’ injury.

References


Footnotes

  • Correction notice The paper has been amended since it was published Online First. The title of the paper has been changed slightly.

  • Competing interests None declared.

  • Provenance and peer review Not commissioned; internally peer reviewed.