Abstract
In the past decade, the study of the relationships among nutrition, exercise, health and athletic performance has increased substantially. The 2014 introduction of Relative Energy Deficiency in Sport (REDs) prompted sports scientists and clinicians to investigate these relationships in more populations and with more outcomes than had previously been pursued in mostly white, adolescent or young adult, female athletes. Much of the existing physiology and many of the underlying concepts, however, are either based on or extrapolated from limited studies, and the comparison of studies is hindered by the lack of standardised protocols. In this review, we have evaluated and outlined current best practice methodologies to study REDs in an attempt to guide future research.
This includes an agreement on the definition of key terms, a summary of study designs with appropriate applications, descriptions of best practices for blood collection and assessment and a description of methods used to assess specific REDs sequelae, stratified as either Preferred, Used and Recommended or Potential. Researchers can use the compiled information herein when planning studies to more consistently select the proper tools to investigate their domain of interest. Thus, the goal of this review is to standardise REDs research methods to strengthen future studies and improve REDs prevention, diagnosis and care.
- relative energy deficiency in sport
- energy intake
- feeding and eating disorders
- methods
- female athlete triad syndrome
This is an open access article distributed in accordance with the Creative Commons Attribution Non Commercial (CC BY-NC 4.0) license, which permits others to distribute, remix, adapt, build upon this work non-commercially, and license their derivative works on different terms, provided the original work is properly cited, appropriate credit is given, any changes made indicated, and the use is non-commercial. See: http://creativecommons.org/licenses/by-nc/4.0/.
WHAT IS ALREADY KNOWN ON THIS TOPIC
Relative Energy Deficiency in Sport (REDs) is a syndrome of various health and performance consequences stemming from exposure to prolonged and/or severe low energy availability (LEA).
While the REDs outcomes of reproductive dysfunction and poor bone health have been studied extensively in mostly homogenous, female athlete samples, further research is required to better understand these and other REDs consequences in athletes of different age, sex, gender, race, ethnicity and ability.
There are various methods used in REDs literature, from surveys to overnight hormonal sampling, but consensus on the best methods for assessing REDs outcomes in athletes has been lacking.
WHAT THIS STUDY ADDS
‘Preferred’, ‘Used and Recommended’ and ‘Potential’ tests are described for use in researching the various health and performance consequences of REDs.
Literature supporting the sports performance decrements associated with problematic LEA is limited; hypothesis-driven, sport-specific work is required.
There are gaps in REDs literature that can be narrowed with consistent and improved reporting of methods.
Introduction
Disruptions to optimal health and body system function in athletes have been described for nearly 40 years,1 with early recognition of the causal role of suboptimal nutrition.2–4 Since then, the research field and collective understanding of the importance of adequate energy availability (EA) in athletes have grown, with specific attention paid to Relative Energy Deficiency in Sport (REDs), a multifactorial syndrome caused by exposure to problematic low EA (LEA).5 Many existing studies, however, were short-term, controlled interventions or cross-sectional assessments with small samples of homogenous populations of females. This has resulted in various extrapolations to inform hypotheses and understandings that may not apply to all athletes: studies of anorexia nervosa, habitually sedentary participants, able-bodied athletes, females, endurance athletes and samples of strictly white participants do not represent, and should not be directly applied to, a wide range of athletes. Furthermore, controlled laboratory studies are vastly outnumbered by cross-sectional, observational studies. Therefore, to unify the field and inform high-quality research in the future, it is necessary to undertake a comprehensive review of scientific methodologies used to assess the health and performance impairments associated with LEA and a diagnosis of REDs.
A priori agreement on definitions is required for the external validity of studies. As such, several key terms are defined in Box 1.
Definitions germane to the study of REDs and interpretation of REDs studies.
Energy availability (EA)
The dietary energy left over and available for optimum function of body systems after accounting for the energy expended in exercise. EA is expressed as kcal/kg fat-free mass (FFM)/day and is defined in the scientific literature by the following formula:146 147 EA = (energy intake (EI) − exercise energy expenditure (EEE)) / FFM.
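For clarity, the definition can be written in display form, followed by a worked example in which the intake, expenditure and fat-free mass values are hypothetical:

```latex
% Display form of the EA definition, with a hypothetical worked example
\[
\mathrm{EA} = \frac{\text{EI (kcal)} - \text{EEE (kcal)}}{\text{FFM (kg)}}
\]
\[
\text{eg, } \mathrm{EA} = \frac{2800\ \text{kcal} - 600\ \text{kcal}}{55\ \text{kg}} = 40\ \text{kcal/kg FFM/day}
\]
```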
Low energy availability (LEA)
LEA is any mismatch between dietary energy intake and energy expended in exercise that leaves the body’s total energy needs unmet, that is, there is inadequate energy to support the functions required by the body to maintain optimal health and performance.148 LEA occurs as a continuum between scenarios in which effects are benign (adaptable LEA) and others in which there are substantial and potentially long-term impairments of health and performance (problematic LEA).
Adaptable LEA
Exposure to a reduction in EA that is associated with benign effects, including mild and quickly reversible changes in biomarkers of various body systems that signal an adaptive partitioning of energy and the plasticity of human physiology. In some cases, the scenario that underpins the reduction in EA (eg, monitored, and mindful manipulation of body composition or scheduled period of intensified training or competition) might be associated with acute health or performance benefits (eg, increased relative VO2 max). Adaptable LEA is typically a short-term experience with minimal (or no) impact on long-term health, well-being or performance. Moderating factors may also alter the expression of outcomes.
Problematic LEA
Problematic LEA is exposure to LEA that is associated with greater and potentially persistent disruption of various body systems; it often presents with signs and/or symptoms and represents a maladaptive response. The characteristics of problematic LEA exposure (eg, duration, magnitude, frequency) may vary according to the body system and the individual. They may be further affected by interaction with moderating factors that can amplify the disruption to health, well-being and performance.
Moderating factors
Characteristics of individual athletes, their environment or behaviour/activities that may amplify or attenuate the effect of LEA exposure on various body systems. Relevant moderating factors (eg, gender, age, genetics) vary according to the body system. They may offer protection or additional risk in the progression from LEA exposure to the expression of disturbances to health, well-being or performance.
Eating disorders
Mental illnesses clinically diagnosed by meeting defined criteria and characterised by abnormal eating behaviours (eg, self-induced restriction of food intake, preoccupation with body shape or weight, bingeing and purging (self-induced emesis, laxative use, excessive exercise, diuretic use)).
Disordered eating behaviours
Abnormal eating behaviours including restrictive eating, compulsive eating or irregular or inflexible eating patterns, excessive exercise beyond assigned training to compensate for dietary intake and use of purgatives. The behaviours do not meet the clinical criteria for an eating disorder.
Relative Energy Deficiency in Sport (REDs)
A syndrome of impaired physiological and/or psychological functioning experienced by female and male athletes that is caused by exposure to problematic (prolonged and/or severe) LEA. The detrimental outcomes include, but are not limited to, decreases in energy metabolism, reproductive function, musculoskeletal health, immunity, glycogen synthesis and cardiovascular and haematological health, which can all individually and synergistically lead to impaired well-being, increased injury risk and decreased sports performance.5 145
Adapted from Mountjoy et al.5 REDs, Relative Energy Deficiency in Sport; VO2 max, maximal oxygen consumption
LEA has various aetiologies. Such causes include, but are not limited to:1 frank eating disorders (EDs), disordered eating (DE) behaviours, restrictive dietary practices (even if medically indicated),6 weight cycling,3 malabsorption disorders, food allergies,3 inadvertent undereating,7 food insecurity,8 excessive exercise and excessive non-exercise activity thermogenesis (NEAT). For review, see Burke et al.9
Standardisation of methodology for studying REDs and understanding the quality of previously used methods are important for designing high-quality studies in the future and the critical appraisal of published evidence. This review aims to (1) evaluate the quality of previous research methods used to study REDs; (2) define the current preferred method for evaluating each component of REDs and (3) identify gaps where best practices have not been developed or proven in athlete samples. While the target audience for this review is researchers, it may also assist clinicians in translating the science into practice.
Of note, this review focuses on REDs research through a biological lens. However, we acknowledge that social and sociocultural factors can also influence all aspects of REDs, including unique risk factors, as well as appropriate prevention, diagnostic and treatment strategies. Thus, while in-depth discussion of such factors is outside the scope of this paper, community engagement, with specific population considerations, should be included in all future REDs research. Accounting for social and sociocultural effects in future work will strengthen our understanding of mediating factors underpinning REDs and enhance the care of broader populations of athletes.
Methods
We conducted a narrative review with the aim of providing a general overview and standardised framework outlining various methods for REDs research. The authors conducted literature searches and drew on their collective research expertise to guide the development of the recommended research methodologies for each health and performance outcome of REDs. Further consultation was obtained from experts in the fields of specific REDs health and performance domains.
Equity, diversity and inclusion statement
The author panel is a diverse group of experienced experts representing the following athlete health professional domains: sports medicine, internal medicine, paediatrics, endocrinology, sports nutrition, sports physiology and sports science. Authors were invited based on their clinical and/or research experience with REDs. In total, seven women and four men from five countries contributed to this work.
REDs research study designs
Defining the athlete sample
Researchers have struggled to consistently and accurately describe the fitness, training and performance level of participants in studies featuring exercise-related or health-related outcomes; REDs has been no exception. Recently, McKay et al proposed a new schema for classifying athlete level based on a six-tier system (Tier 0=Sedentary → Tier 5=World Class).10 Adopting this evidence-based approach helps facilitate comparison across study samples for both future athlete-focused studies and for retrospective analysis of existing research. More specific training descriptions, such as quantity, intensity and type, will further enhance the ability to compare datasets and extrapolate findings to clinical care, as appropriate.
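Where study data are curated programmatically, sample descriptors can be recorded alongside the tier classification. The following Python sketch is illustrative only; the field names are assumptions rather than a published standard, and only the tier anchors (0 = Sedentary, 5 = World Class) come from the schema itself:

```python
from dataclasses import dataclass

@dataclass
class ParticipantClassification:
    """Illustrative record of sample descriptors for a REDs study participant.

    Only the tier anchors (0 = Sedentary, 5 = World Class) come from the McKay et al. schema;
    the remaining field names are assumptions for demonstration.
    """
    tier: int                       # 0 = Sedentary ... 5 = World Class (McKay et al.)
    training_hours_per_week: float  # quantity of training
    typical_intensity: str          # eg, 'predominantly low intensity', 'mixed', 'high intensity'
    training_type: str              # eg, 'endurance running', 'team sport', 'strength/power'

    def __post_init__(self) -> None:
        if not 0 <= self.tier <= 5:
            raise ValueError("tier must be between 0 (Sedentary) and 5 (World Class)")
```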
Design approach
Research designs can be broadly classified into observational, interventional or qualitative (figure 1).11 Observational studies can be used to investigate the relationships between variables, identify potential risk factors or predictors of a disease or condition and generate hypotheses for future research. There are several types of observational study designs, including cross-sectional, case-control, cohort and case studies, each with advantages and disadvantages.12 On the other hand, an interventional study design assigns participants to different groups or treatments, comparing their results to assess the impact of the treatment on a desired outcome.12 The most common interventional study designs include randomised controlled trials, quasi-experimental designs and before-and-after studies. Although observational studies generally have lower evidence strength and often rely on indirect (surrogate) markers of LEA, they can allow for longer periods of observation. In contrast, interventional studies offer stronger evidence and allow researchers to strictly control important confounding factors. However, these studies are typically limited in duration due to the required resources, logistics, compliance of participants and ethical concerns of inducing prolonged LEA. Thus, they usually fail to assess long-term adaptations. Qualitative studies provide context for quantitative data or results or can serve as an initial investigation for quantitative hypothesis generation.13 Examples of qualitative studies include ethnography, phenomenology and narrative research.13 Qualitative studies often rely on interviews and focus groups for data collection. Because athlete experience (eg, coaching environment, training schedules, access to resources (including cost and availability of nutrition near training/competition arenas)) plays an important role in REDs incidence/progression and treatment, qualitative studies are an important and often overlooked avenue of research inquiry. For example, qualitative athlete interviews can provide more details and context pertaining to sport culture influences on LEA and REDs development. Delphi methods have recently been used for developing consensus statements.
Most existing studies of REDs are based on observational study designs that provide weaker levels of evidence (table 1). Therefore, interventional studies with stronger designs are needed to establish fundamental REDs concepts. Better quantification of characteristics of LEA exposure (eg, duration and severity of LEA), as well as other factors that moderate the expression of health and performance impairments associated with LEA (eg, sex, gynaecological age, health history, current health status), is required in research designs to understand the time course of REDs development and its mechanistic aetiology.14 15
Data collection methods
REDs studies have used both direct, objective assessments of physiological or psychological signs/symptoms of LEA (eg, blood hormone concentrations, bone mineral density (BMD), the prevalence of menstrual dysfunction, presence of aberrant eating pathologies) and self-report surveys (eg, the Low Energy Availability in Females Questionnaire (LEAF-Q16)) that serve as potential surrogate markers of LEA. The research questions of the study and the burden placed on the participants often dictate the degree to which each element is employed. Researchers should use the tools with the best validity and reliability for both indirect and direct methods to ensure accuracy of the data. Additionally, the limitations of the assessment methods should be acknowledged when reporting study findings and providing evidence-based recommendations.11 This practice will help ensure the credibility and utility of the study results for the scientific and clinical communities.
Energy availability and its components
While EA is defined by a simple mathematical equation, its measurement in real-life circumstances is challenging, and there are many caveats to assessing or diagnosing the presence of LEA exposure in an athlete. Accurately assessing its key components, dietary energy intake (EI) and exercise energy expenditure (EEE), in free-living athletes is difficult, time-consuming and prone to errors.17 18 Errors of omission and under-reporting of EI are most likely.19 Calculations will only reflect the recording period (typically 3–7 days) rather than the historical period that has contributed to the athlete's current health and performance status.17 Given these limitations, we recommend against using assessments of EA to retrospectively derive key information on an athlete's risk of problematic LEA exposure or to make a REDs diagnosis. This recommendation does not preclude a dietary assessment by a sports nutrition professional that includes an EA assessment within a more holistic nutritional screen, or as part of a larger screen of LEA risk. These EA assessments may be improved by the use of a standardised protocol that will at least attempt to allow comparison between individuals and studies, or allow better longitudinal monitoring. Importantly, EA calculations can be used prospectively to construct diet and exercise programmes based on EA targets, in research settings and for athlete counselling. For more detailed information, the reader is referred to Burke et al.20
The most commonly used protocol for field assessments of dietary EI is a self-report food log/diary19 using one or a combination of household measures, food scales and photographs.21 Sources of inaccuracy include forgetting to record all food/drink occasions and items, intentional or unintentional changes in behaviour due to the conscious act of recording and failure to quantify food intake accurately.19 It is also important to note that detailed logging of nutritional intake and physical activity can be triggering for individuals who have suffered from DE/EDs and can potentially contribute to obsessive-compulsive behaviours that create LEA. For controlled research settings, dietary intakes can be more carefully controlled and recorded, but these activities are a significant burden on the athlete and research staff and require substantial financial resources.22
The ease and accuracy of measuring EEE differ between types of activity. Some activities (eg, running and cycling) allow for a more accurate measurement of EEE via global positioning systems and power meters.23 In contrast, other activities (eg, team sports, skill or technical sports, power sports) make it nearly impossible to estimate EEE accurately in the field. Although many athletes tend to rely on the use of wearable technology, these devices tend to significantly underestimate or overestimate activity EEE (approximately +200 to −600 kcal depending on methodologies) and therefore are not recommended for use in calculating EA.24 25 In the laboratory, direct calorimetry and heart rate (HR) measures allow for the extrapolation of EEE into field-based measures with much more accuracy.23 However, these measures are less available outside research settings. Other sources of error include the failure to remove resting metabolic rate (RMR) during exercise to calculate EEE and discrepancies in selecting the activities included as an athlete's 'exercise' activities (eg, including leisure activities or exercise undertaken for transport in exercise calculations).
While inaccuracies in fat-free mass (FFM) assessment contribute a smaller source of error in EA calculation, a valid and reliable assessment of body composition is still desirable. Each assessment method has sources and magnitudes of error, although body composition measurements via dual-energy X-ray absorptiometry (DXA) have become commonly used for FFM in EA calculations.5 26–28 Best practice protocols that standardise athlete preparation (eg, fasting and rested) and positioning on the DXA scanning bed, as well as the technician’s activities in capturing, analysing and interpreting the scan, are important in maximising the validity and reliability of measurements, especially when these are undertaken in longitudinal assessments.29 30 Participants undergoing multiple DXA scans should ideally have these performed on the same or a cross-calibrated scanner.31
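As an illustration of how these components combine in a prospective EA calculation, including the removal of resting energy expenditure from the exercise period noted above, a minimal Python sketch follows; the function name and all numeric values are hypothetical:

```python
def energy_availability(ei_kcal: float,
                        gross_eee_kcal: float,
                        rmr_kcal_per_day: float,
                        exercise_minutes: float,
                        ffm_kg: float) -> float:
    """Prospective EA estimate in kcal/kg FFM/day (illustrative sketch only).

    The resting energy that would have been expended during the exercise period is removed
    from the gross session cost, addressing one of the error sources noted in the text.
    """
    rmr_during_exercise = rmr_kcal_per_day * (exercise_minutes / 1440)  # 1440 min in a day
    eee_net = gross_eee_kcal - rmr_during_exercise
    return (ei_kcal - eee_net) / ffm_kg

# Hypothetical day: 2800 kcal intake, 700 kcal gross cost of a 60 min session,
# RMR of 1500 kcal/day and 55 kg fat-free mass
print(f"EA = {energy_availability(2800, 700, 1500, 60, 55):.1f} kcal/kg FFM/day")
```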
Testing of REDs components
Definitions
Each health and performance component of REDs has numerous assessment approaches with methods or tests with varying validity, reliability, cost, accessibility and feasibility (see tables 2 and 3). The assessment measures have been stratified as Preferred, Used and Recommended or Potential according to the following definitions:
Preferred tests have repeatedly produced accurate results with reasonable availability and high reliability without prohibitive cost. Preferred tests may not be widely available to researchers or clinicians and may not be reasonable for every study (eg, the Preferred test for evaluating superior mesenteric artery syndrome, which can be seen in severe energy restriction with loss of visceral fat,32 requires imaging unusual in sports science studies). Preferred tests may not always be suitable for field-based studies. In most cases, the Preferred test should be the benchmark by which developing surrogate markers should be validated. Not all REDs sequelae have Preferred tests at the time of publication.
Used and Recommended tests are those that have been previously employed in REDs studies and sports science applications or are well-described in their respective fields as acceptable substitutes for a Preferred test. Typically, they are less expensive and easier to implement than a Preferred test but may have lower sensitivity, specificity, validity or reliability. When using a Used and Recommended test, specific care must be taken to consider sample characteristics of previous validations to ensure appropriate applicability (ie, test suitability for use in this specific athlete population as compared with the population in which the method was validated).
Potential tests have been recently developed and currently lack sufficient evidence to be recommended, yet are emerging approaches that may prove to be Preferred or Used and Recommended tests in the future. Researchers are encouraged to investigate these Potential tests further.
The current existence of a Preferred or Used and Recommended test should not preclude researchers from further developing new or refining current Potential tests. When developing new tests, we encourage researchers to consider the importance of maximising internal (reduction of confounders) and external validity. Overlooked components of internal validity that are specific to the measurement of performance in sports science include consistency of instruction to participants, verbal encouragement (especially applicable to maximal or 'all-out' tests), music, mental fatigue preceding the test, presence of observers and knowledge of when the test will end.33 Subcomponents of external validity include generalisability (sample representative of the population), applicability (results translatable to other populations) and indirectness (the degree to which the study setting modulates results).34
Blood work: best practice principles
Because blood parameters are a major component of the assessment of LEA and REDs outcomes in an athlete,35 it is essential to ensure that best standards of practice are used to collect viable samples. This requires actions by the athlete (preparation and presentation for collection) and the researcher (collection, handling and storage of samples).36 37 This section summarises the general best practices for collecting blood samples; marker-specific considerations have been outlined in their respective sections.
Depending on the blood parameter assessed, outcome measurements may occur in whole blood, plasma or serum specimens from the blood sample. The researcher must make an a priori decision on this matter, with the use of appropriate collection tubes and standardised procedures for the processing and storage of human fluid samples according to the specimen that is to be assessed (whole blood, plasma, serum). Various bioanalytical procedures exist for the assessment of a blood panel, with laboratory accessibility and cost dictating the procedure used in such analysis. REDs researchers should choose the most accurate analytical protocols for each assay and report assay technique sensitivity and precision issues in their findings. For plasma or serum specimens undergoing later analytical assessment, storage in an ultra-freezer (−80 °C) helps safeguard against sample degradation when used for future analysis. Recognising that specimen thaw and refreeze cycles degrade certain hormones and metabolite constituents, researchers should preserve multiple aliquots to allow independent analysis of freshly thawed samples.36 38 In the case of serial blood measurements (eg, days, weeks), it is crucial to replicate the standardised procedures for specimen collection on each occasion.
Resting, fasted morning blood samples are the standard in clinical laboratory settings to control for diurnal variations in some parameters, as well as the effects of prior exercise or food intake (figure 2). Appropriate preparation by the athlete includes the following: (1) report fasted for a morning blood draw at a time that can be repeated for future tests; (2) avoid prior exercise activity for a minimum of 12 hours; (3) ensure exercise activity the day before is of easy duration and intensity (no competition or simulated competition)39 40; (4) refrain from certain dietary supplements and (5) maintain euhydration before arriving at the lab.36 41 The researcher/clinician should note the time of day and other sampling conditions to allow for future comparative sampling. While preferred testing conditions improve laboratory interpretation, meeting all of these parameters places a considerable burden on the athlete. Notation should include any variability in presentation, in addition to the athlete's state (eg, relaxed, agitated) and whether blood sampling procedures were complicated (eg, more than one vein puncture was necessary to collect the specimen). These notations can aid in discerning why certain findings might exist (eg, high cortisol levels arising from the catabolic processes of malnutrition, physical stress, psychological stress or a combination).
Before evaluating the results, pre-existing haematological or underlying medical conditions and use of any haematological agents (eg, iron, folate, vitamin B12 and other vitamin supplementation; medications such as antiplatelet agents, anticoagulants, bone marrow suppressants, erythropoietic-stimulating agents) should be considered. Age-appropriate and sex-appropriate reference ranges should be used when interpreting data. Using quartiles of laboratory-specific reference ranges has become common for some markers in REDs research.42–45
Recently, there has been increased interest in assessing various standard blood parameters in alternative biological specimens (eg, saliva, urine). The non-invasive nature of the collection of such specimens is convenient and compelling. However, clinical reference ranges are lacking for many parameters from such specimens and some technological challenges exist in their assessment.36 46 Therefore, while future potential exists, caution is presently advised in their use and interpretation.
Methods to assess REDs-related health outcomes
1. Impaired reproductive function: some of the seminal work that identified the role of LEA in perturbing biomarkers of the reproductive system involved repeated blood sampling over 24 hours to assess diurnal hormone patterns (eg, every 10 min to assess luteinising hormone (LH) pulsatility).47 48 While such protocols may have a role in research, many logistical challenges (eg, expense, time commitment of the athlete, requirement for special equipment or facilities, volume of blood collection and potential violation of the World Anti-Doping Agency (WADA) rules around volumes of saline infusion involved in cannulation processes49) render this approach unsuitable for common field use, particularly with elite athletes who are subject to the WADA Code.
If considering an assessment of menstrual function, please refer to Elliott-Sale et al50 for methodological recommendations pertaining to capturing menstrual cycle characteristics (eg, period tracking, ovulation kits and measurement of oestradiol and progesterone blood concentrations during specific cycle phases) (figure 3). The LEAF-Q is a validated tool with fairly high sensitivity (78%–100%)16 51 and specificity (90%)16 in identifying LEA, functional hypothalamic amenorrhoea or low BMD in some, but not all,52 female athlete cohorts. Although questionnaires in conjunction with hormonal sampling can be quite helpful in characterising menstrual status and function, a stand-alone, thoroughly validated complete menstrual history questionnaire does not exist; such an instrument would require years of prospective hormonal sampling to develop. Appropriate assessment of reproductive and other hormone functions can only be performed in the absence of the use of hormonal contraceptives (eg, combined oral contraceptives (COCs), or other hormonal pills, transdermal patch, injectable or intrauterine device (IUD)). Exogenous hormonal intake can downregulate endogenous hormone levels, and oral oestrogen increases hepatic production of sex hormone-binding globulin (SHBG), thyroxine-binding globulin (TBG) and cortisol-binding globulin (CBG).53 Recognising that many female athletes use hormonal contraception, it is important to discuss contraceptive options with potential study participants. Athletes may need to carefully consider trying non-hormonal options versus remaining on current hormonal regimens for their own medical and personal reasons. Researchers need to be aware of the high rates of hormonal use when designing REDs studies that include female athletes.54 55
Reproductive assessment in males also involves hormonal measurements and questionnaire evaluation. Testosterone is the key hormonal determinant of reproductive health in males, and the measurement of total or free testosterone should be undertaken using viable bioanalytical techniques. Questionnaire evaluation of male reproductive parameters should focus on the athlete’s self-reported information regarding libido and the frequency of morning erections. Validated instruments in this regard include portions of the Low Energy Availability in Males Questionnaire (LEAM-Q)56 and Androgen Deficiency in Ageing Males Questionnaire (ADAM-Q).57 Standardisation for questionnaires is important; researchers should ensure that athletes are not rushed/hasty in completing questionnaires and can answer questions honestly, thoughtfully and completely.
When broadening REDs research to better serve understudied populations, it is important to consider common medical treatments that can confound results. For example, transgender athletes frequently use exogenous gender-affirming hormones, which will significantly impact endogenous hormonal interpretation.
2. Impaired bone health: DXA is the preferred method for determining an athlete’s overall skeletal health. A certified clinician should interpret the DXA to ensure that the correct sites are measured and reference databases are used. Z-scores compare a patient with an age-matched, ethnicity-matched and sex-matched reference database, and are appropriate for children, adolescents, premenopausal women and men aged <50 years, while T-scores compare a patient with young-adult white females and should be used outside the previous scenarios.31 For paediatric and adolescent athletes, areal BMD (aBMD) should be measured at the total body less head (TBLH) and lumbar spine (LS); a wrist radiograph should be obtained to adjust Z-score to bone age if skeletal immaturity or prematurity is suspected.58 In some situations, other anatomic sites may be scanned and reported.59 For all athletes aged ≥19 years, aBMD should be measured at the total hip (TH), femoral neck (FN) and LS.31 The distal forearm (33% radius) should be used if the hip or spine cannot be measured or interpreted, the athlete has hyperparathyroidism or exceeds the weight limit for the DXA table.31 The lowest Z-score or T-score (as appropriate) should be used when grading an athlete’s overall skeletal health, with interval changes in these scores noted when comparing serial DXAs. Of note, DXA cannot assess specific trabecular and cortical compartments. Further bone microarchitecture and strength evaluation can be accomplished with advanced imaging techniques such as high-resolution peripheral quantitative computed tomography (HRpQCT).60 61 These techniques should be reserved for researchers with specific imaging expertise.
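The age- and sex-based choice between Z-scores and T-scores described above, and the use of the lowest score across sites for grading, can be captured in a simple decision aid. This is a hedged sketch for study documentation only, with illustrative function names, not a clinical tool:

```python
def preferred_dxa_score(age_years: float, sex: str, postmenopausal: bool = False) -> str:
    """Choose the DXA comparison score following the criteria summarised above.

    A simplified decision aid for study documentation only; site selection, reference
    databases and interpretation still require a certified clinician.
    """
    if sex.lower() == "female":
        use_z = not postmenopausal       # children, adolescents and premenopausal women: Z-score
    else:
        use_z = age_years < 50           # boys and men aged <50 years: Z-score
    return "Z-score" if use_z else "T-score"

def grading_score(site_scores: dict) -> float:
    """Lowest score across measured sites (eg, TH, FN, LS) is used to grade skeletal health."""
    return min(site_scores.values())

# Hypothetical example: 24-year-old premenopausal female athlete
print(preferred_dxa_score(24, "female"))                    # Z-score
print(grading_score({"TH": -0.6, "FN": -1.1, "LS": -1.4}))  # -1.4
```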
Measurements of bone metabolism markers, including β-carboxyl-terminal cross-linked telopeptide of type I collagen (β-CTX) and amino-terminal propeptide of type 1 procollagen (P1NP), can be helpful for bone assessment in short-term LEA studies, as well as adjuncts to long-term bone monitoring.62
High-risk bone stress injuries (BSIs) include those of the femoral neck, pelvis and sacrum.63–65 Those with a history of high-risk BSI or multiple BSIs may be more likely to have low BMD or other measurements of poor bone health.65–68 However, using BSI alone to predict low BMD or quality is flawed: abrupt changes in training volume, years to accumulate BSIs and other factors—each independent of BMD—are pertinent to BSI incidence.69
3. Impaired gastrointestinal function: energy deficiency can affect any segment of the gastrointestinal (GI) tract.70 The study of GI disturbances requires the exclusion of primary disorders (eg, coeliac disease, Crohn's disease, ulcerative colitis). Although many disturbances are experienced as discrete symptoms, which are well-assessed by questionnaires, most questionnaires were not developed with samples of athletes who sustain transient GI changes due to exercise (eg, splanchnic hypoperfusion, hypomotility, absorption/permeability changes, mechanical/postural effects and sports nutrition intake).71 Specific care should be taken when assessing GI symptom burden at rest versus during exercise (usually assessed via questionnaire immediately postexercise). Best practices for assessing exercise-associated GI symptoms have been well-reviewed elsewhere72; factors requiring control include reporting of inclusion/exclusion criteria, quantification of exercise load, heat stress, dietary intake (including before the experimental period), pre-exercise and peri-exercise intake, hydration status and fluid temperatures.
4. Impaired energy metabolism/regulation: the measurement of blood markers of impaired energy metabolism commonly involves a biotinylated assay that may experience interference from biotin supplementation73; thyroid tests are particularly affected.74 75 In such circumstances, a 48-hour washout of biotin supplementation is recommended. The assessments of gonadotropins, leptin and cortisol are best undertaken via frequent (every 10–20 min) overnight samplings that approximate pulsatile release and diurnal patterns.76 COCs affect hormone-binding globulin levels, so participants taking these medications should have free rather than total thyroxine and cortisol concentrations measured and appropriately interpreted.
RMR is typically measured in the laboratory using protocols of indirect calorimetry that estimate metabolic rate from measurements of oxygen consumption and carbon dioxide production in respiratory gases.77 Metabolic carts or ventilated hoods that operate via indirect calorimetry are commonly available in sports institutes, research laboratories and some sports medicine clinics. Conversely, direct calorimetry assessment of RMR involving room calorimetry can occur but is much more difficult.78 Regardless of equipment, the most important factor for the validity of RMR measurement is the appropriate and standardised preparation of the athlete 24–48 hours prior to testing.77 Failure to control pretest feeding, use of supplements, training and activity status will likely increase measurement variability and error. Furthermore, within the measurement itself, the duration of the assessment period, the assessment environment (eg, temperature, darkness, background noise) and the data analysis approach can affect the outcomes and should be carefully considered.77 Outcomes should be interpreted by comparing with predicted RMR based on age-appropriate, sex-appropriate and sports-appropriate equations79 80 or an absolute cut-off value of 29–30 kcal/kg FFM/day.81–83
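A brief sketch of how the two interpretive comparisons described above (measured versus predicted RMR, and the absolute cut-off per kg FFM) might be computed is shown below; the example values are hypothetical and no specific prediction equation is assumed:

```python
def rmr_outcomes(measured_rmr_kcal: float,
                 predicted_rmr_kcal: float,
                 ffm_kg: float) -> dict:
    """Summarise the two interpretive comparisons described above (illustrative sketch).

    The predicted value should come from an age-, sex- and sport-appropriate equation;
    no specific prediction equation is assumed here.
    """
    ratio = measured_rmr_kcal / predicted_rmr_kcal   # measured vs predicted RMR
    relative = measured_rmr_kcal / ffm_kg            # kcal/kg FFM/day
    return {
        "rmr_ratio": round(ratio, 2),
        "rmr_kcal_per_kg_ffm": round(relative, 1),
        "below_absolute_cutoff": relative < 29,      # 29-30 kcal/kg FFM/day cut-off cited above
    }

# Hypothetical example: measured 1350 kcal/day, predicted 1500 kcal/day, 50 kg FFM
print(rmr_outcomes(1350, 1500, 50))
```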
5. Impaired haematological status: haematological changes are best assessed via blood work. Iron deficiency is the leading cause of suboptimal red blood cell quality and quantity. The measurement and interpretation of iron status via blood measurements should take into account the acute effect of exercise per se on blood parameters (eg, shifts in plasma volume, acute phase changes in ferritin concentrations).84 85 Typically, ferritin <35 µg/L is used to indicate iron deficiency.85 When querying self-reported history of anaemia, the type and cause of anaemia should be included. Low ferritin, iron deficiency, bleeding disorders, heavy menstrual bleeding and chronic inflammatory conditions should be assessed, as these can cause anaemia through increased blood loss or decreased red blood cell production.84
6. Urinary incontinence: urinary incontinence is often a symptom-driven diagnosis, thus well-suited to assessment via questionnaires in the field setting. Existing surveys differ in their diagnostic accuracy, patient burden and applicability to the athletic population, in whom symptoms during exercise and rest must be elucidated to differentiate between stress and alternative types of incontinence. Studies in female athletes have used the International Consultation on Incontinence Questionnaire-Urinary Incontinence Short Form (ICIQ-UI-SF), noting higher rates of stress and urge incontinence in female athletes versus non-athletes, and higher rates in female athletes with ED or LEA than those without.86–88 Studies in non-athletic populations demonstrate that the likelihood of an accurate diagnosis via surveys differs between types of incontinence, with better validation for urge urinary incontinence. A bladder stress test (cystometry) remains essential to clinically diagnose stress urinary incontinence.89 Alternative pathology (eg, pelvic organ prolapse, urinary tract infection) is also important to rule out when studying urinary incontinence.90
7. Impaired glucose and lipid metabolism: emerging data demonstrating impairments in glucose and lipid metabolism have been found during problematic LEA, although most of these data have been discovered in ED studies.91 92 Adipose tissue plays a role in regulating insulin sensitivity and glucose and lipid metabolism. Best practice procedures, including overnight fasting to assess glucose and lipid metabolism from blood samples, are outlined earlier (see Blood Work: best practice principles). Although continuous glucose monitoring (CGM) was developed decades ago for the management of diabetes, more recently, it has emerged as an assessment tool in sport,93 with the hypothesis that alterations in continuous or overnight glucose values in situations of LEA might be assessed via CGM for diagnostic purposes. Until research is undertaken to confirm the value of CGM use for the assessment of impaired glucose metabolism associated with LEA, caveats around the expense and interpretation of data from these devices should be considered.93
8. Mental health issues: a clinical assessment using the Diagnostic and Statistical Manual, Fifth Edition, Text Revision (DSM-5-TR)94 remains the gold standard for diagnosing mental health concerns, including depression, anxiety and EDs. The associated criteria for each specific diagnosis are best assessed via a clinical interview, although this can be time-consuming and unrealistic for athletes (especially with large sample sizes). Validated screening tools exist for several mental health domains and are universally accepted as appropriate surrogates in non-athletic populations. Notably, mental health concerns in athletes may present with atypical symptoms and overlap with performance deficits, such as decreased motivation and overtraining. Thus, results must be interpreted with care.95 96 The International Olympic Committee (IOC) Sport Mental Health Assessment Tool 1 (SMHAT-1) and Sport Mental Health Recognition Tool 1 (SMHRT-1) were recently developed to screen for mental health disorders in athletes.97 Additionally, further mental health investigations may help clarify the complex cause-and-effect relationships among ED, other mental health issues and LEA.
9. Impaired neurocognitive function: comprehensive neuropsychological evaluations assess several cognitive domains, including executive function, attention, verbal skills and memory. Components include a formal interview and numerous tests, often requiring a full day. The results are nuanced and must be interpreted against individuals of a similar age and level of education. Given the complexity, this evaluation is poorly suited for serial examinations in large study samples. Mini-screens and singular tests from this larger evaluation have been used to evaluate cognitive dysfunction in those with malnutrition, primarily in ED studies.98–100 Researchers must be aware that these screens can be affected by sleep, motivation and other external stressors.101 102
10. Sleep disturbances: athletes’ sleep patterns and habits differ from non-athletic populations and are affected by travel and competition schedules, as well as training variables, such as core temperature and muscular fatigue/pain.103 Despite these differences, polysomnography remains the gold standard for assessing physiological components of sleep in the laboratory setting. Research-grade actigraphy devices have been validated against polysomnography and are suitable alternatives for field-based studies, although they have several limitations, including overestimation of sleep quality and efficiency.104 Athlete-specific sleep questionnaires should be prioritised over generalised questionnaires to improve accuracy in athletic populations.105 Commercially available wearables to track sleep have grown in popularity among recreational and elite athletes. Although wearables reasonably capture sleep duration and timing, current technology fails to assess sleep stages accurately.106
11. Impaired cardiovascular function: there are many well-established, standardised clinical methods for the thorough assessment of cardiovascular function, and many of these modalities have been tested in the sports science field. Thus, the sports cardiology field has established some sport-specific ‘norms’ for athletes.107–113 When using a chest-mounted HR strap with electrodes, we recommend that all participants use the same brand and generation of devices to reduce error. The development of sports wearables is a rapidly evolving industry, but the study of these devices can be difficult due to frequent hardware, firmware and software alterations. These devices use photoplethysmography to measure HR instead of electrode measurement of cardiac electrical activity, as is used in an electrocardiogram (ECG). However, inconsistent skin contact, skin tone, skin moisture, motion artefacts with exercise, tissue perfusion and other sources of error114–117 preclude them from being used for lab-based or field-based studies at this time. Further improvements may make them suitable.
12. Reduced skeletal muscle function: LEA-related and REDs-related skeletal muscle outcomes include changes to substrate oxidation, intramuscular fuel stores and protein metabolism status. Substrate oxidation can be assessed both indirectly (expired whole-body gas analyses) and directly (tracers)118–120 at rest or during exercise of varying intensities, durations and modes. Oxidation rates, however, will be affected by preceding feeding status (fasted vs fed, meal composition) and must be controlled. Further information can be obtained via skeletal muscle biopsy,121 a technique involving the removal of a small sample of the skeletal muscle to assess characteristics such as muscle protein synthesis rates (with concurrent tracer methodologies), fibre typing and concentration of intramuscular glycogen and lipids. Validated procedures should be followed when choosing the targeted muscle and the appropriate technique and equipment. Correct protocols should be in place for the processing (freezing, drying) and storage of specimens and for subsequent biochemical analysis (eg, assays). For protein metabolism, tracer techniques including indicator amino acid oxidation (IAAO)122 and D2O (deuterated water) stable isotopic tracer methodology123 are available, where the measurements usually rely on the collection of urine or blood samples over a specific period along with controlled and predefined intake of dietary protein (see the impaired reproductive function section for blood sampling standardisation).
13. Impaired growth and development: growth is an individualised and dynamic process in youth, with variations in the onset of puberty, attainment of peak height velocity and growth trajectories. As such, evaluating impaired growth and development must account for this individualisation with serial monitoring using standardised growth charts and Z-scores rather than static assessment against normative values (eg, median body mass index (BMI)). The rate of growth deviation must also be considered, as this can reflect the acuity of illness. Serum assays of insulin-like growth factor 1 (IGF-1) must be standardised to prevent intra-assay variability and compared against age-appropriate standards as levels decline following adolescence.124 125 Self-reported delays in puberty should be interpreted in the context of family history, personalised growth trajectories and accepted norms (eg, primary menarche onset by age 15 years).
14. Reduced immunity: immune function evaluation typically requires a specific question or set of symptoms to guide testing. Consequently, a standardised method for assessing immune function in athletes has yet to be defined. The association between LEA and immune dysfunction is now recognised as being more nuanced than originally proposed, with immune tolerance occurring during some scenarios of LEA.126
Methods to assess REDs-related performance outcomes
Decreased muscle strength, endurance and power performance and training response: there are many ways to assess the strength and performance of an individual. Several factors must be controlled to achieve high ecological validity, reliability and sensitivity specific to the sport and the individual athlete.127 These include activities prior to the performance assessment, such as familiarisation with the test to reduce the learning effect, and 24–48 hours pretrial standardisation of exercise and fuelling, supplementation use, immediate pretrial meal, warm-up and time of day.127 128 Ideally, a validated set of lab-based and field-based performance-related metrics specific to the athlete and sport should be implemented to allow longitudinal tracking over time (table 3).129 130 These same lab-based and field-based performance metrics can also be used to assess longitudinal training responses to ascertain whether there are improvements, plateaus or decreases in training responses. On their own, decreased or plateaued training responses are not necessarily a maladaptation to training; hard training blocks can induce acute fatigue, causing short-term plateaus,35 or an athlete may be approaching their genetic performance ceiling. However, plateaued or decreasing performance while maintaining individually normal or increased training loads can indicate aspects of Overtraining Syndrome (OTS) or REDs,35 or just general under-recovery. Adding metrics of internal load training responses (eg, lactate, HR, rating of perceived exertion) to external load metrics (performance-related outcomes) can assist the practitioner in differentiating acute training fatigue from maladaptation to the training response (table 3).
Decreased recovery: both REDs and OTS are chronic forms of decreased recovery. Recovery assessment is complex and multifactorial and can involve multiple body systems and both physiological and psychological assessments.35 The methodological diagnoses of REDs131 and OTS132 are covered in depth elsewhere, although it is noted that the most recent guidelines for OTS diagnosis were published before REDs was first described. In table 3, we highlight more acute measures of muscular recovery or muscle damage, as well as typical field-based recovery assessments and questionnaires with varying validation levels. More recently, a substantial industry has developed around wearable devices that claim to assess recovery and readiness via various black-box algorithms (based mainly on longitudinal assessments of individual HR, HR variability, temperature, oxygen saturation or exercise outcomes). Most of these devices remain to be validated and rely on HR data measured by photoplethysmography, the limitations of which are discussed in the Impaired cardiovascular function section above.
Decreased cognitive performance/skill and motivation: attention, memory, reaction time and response to various stimuli are all important for sport success.133 When studying the relationship between LEA and cognition, neuropsychological tests can be used to test different domains, similarly to their utility in concussion assessment.134 Early problematic LEA may lead to subtle changes in cognitive performance/skill, but sport-specific testing is needed in this under-researched area and domain assessment should be clearly defined in future work. Similarly, sports training, with endurance, power and skill acquisition, requires motivation. The quality of motivation (intrinsic/self-determined, extrinsic/controlled, amotivation or mastery vs performance) may be affected by problematic LEA and can shift from intrinsic to more extrinsic motivation and even amotivation. Quantity of motivation (level of energy and strength of motivation) may remain high with maladaptive quality of motivation (eg, high drive to lose weight to improve performance). Such changes should be studied using motivation survey assessment tools135 that have been previously employed in athletes until more specific methods are developed.
Decreased athlete availability: remaining free of significant injury and illness is imperative to athlete success in sport. Data from athletics athletes show that the likelihood of achieving a performance goal decreases sevenfold in those athletes who complete 80% or less of planned yearly training weeks.136 Female athletes with signs of LEA were nine times more likely to develop an illness at the Olympic Games.137 Given the robust evidence that long-term LEA can increase the risk for BSIs68 138–140—injuries that can sideline athletes for weeks to months—the effect of LEA on decreased athlete availability is significant. Accordingly, we recommend assessing training and competition days lost to injury and illness.
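Where availability is tracked quantitatively, a simple operationalisation such as the following hypothetical sketch can be used; the 80% figure echoes the training-week completion threshold cited above, and all field names are assumptions:

```python
def availability_summary(planned_training_weeks: int,
                         completed_training_weeks: int,
                         days_lost_to_injury: int,
                         days_lost_to_illness: int) -> dict:
    """Assumed operationalisation of athlete availability for study reporting."""
    pct_weeks_completed = 100 * completed_training_weeks / planned_training_weeks
    return {
        "pct_planned_weeks_completed": round(pct_weeks_completed, 1),
        # >80% completion of planned training weeks, per the threshold cited in the text
        "completed_more_than_80pct": pct_weeks_completed > 80,
        "total_days_lost": days_lost_to_injury + days_lost_to_illness,
    }

# Hypothetical season: 48 planned weeks, 40 completed, 21 days lost to a bone stress injury
print(availability_summary(48, 40, 21, 0))
```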
The importance of establishing athlete or sport-specific norms for evaluating REDs outcomes
Exercise, particularly at the workloads undertaken by high-performance athletes (sometimes approaching 30 hours/week),35 can exert unique effects on some parameters often measured in routine health screens, including those related to REDs. Examples include the increase in serum ferritin concentration as an acute phase response to a strenuous exercise session (acute perturbation),141 as well as increases in BMD in response to high-impact exercise (chronic perturbation). Furthermore, there may be a gap between a clinical deficiency/impairment and the onset of detectable reductions in athlete health, availability and performance. Examples from these same systems include the gap between frank iron deficiency anaemia and the iron status needed to sustain optimal recovery between training sessions or promote optimal adaptation to increased training stress (eg, altitude training),142 143 or the potential gap between osteoporosis and the bone microarchitecture and strength needed to tolerate the repetitive strain of high-volume training. Recognition of some acute perturbations associated with exercise has been built into best practice guidelines for athlete assessments to enhance the internal validity of measurements (eg, DXA-measured body composition should be undertaken in a fasted, rested state; iron status should be measured on a rest day).29 31 Additionally, optimal status ranges of some metrics have been defined or suggested for athletic populations to consider what is optimal rather than what marks the absence of a clinical disease state. Examples include athlete-specific goals for ferritin concentrations that signal iron sufficiency rather than the absence of iron deficiency/anaemia and the call for sports-specific reference ranges for BMD to take into account the higher BMD expected in weight-bearing athletes in association with their activity and training history.144
Athlete reference ranges are likely to evolve for a variety of health and performance parameters but require careful thought and expertise; there are dangers in both failing to adjust some reference ranges (underestimating what is important for an athlete’s optimal health and performance or missing an opportunity for early intervention in some health concerns) and overadjusting reference ranges (falsely inflating the prevalence of problems in athletes). A judicious approach to this theme and continual evaluation of the specificity and sensitivity in the interpretation of metrics of health and performance are imperative.
Conclusions and future directions
The introduction of REDs nearly a decade ago145 galvanised renewed academic and clinical interest in energy deficits in athletes.5 Sports scientists conducted studies to investigate the populations of athletes and physiological systems that had not been previously studied under the female athlete triad paradigm. This nearly blank canvas was filled with various methodologies and levels of control. In a race to prove a theory, attention was first paid to data creation rather than the rigour of collection.
As in any scientific field, the maturation of understanding is accompanied by improvements in testing and assessment. In this review, we summarise the wide range of methods available to REDs researchers. We also provide recommendations on the appropriate utilisation of these methods. With limited resources, researchers must make informed decisions on the number and types of tests they use. When interpreting data, they should acknowledge the limitations imposed on their results by their methodology.
This review is a snapshot in time and will require updating as existing methods become strengthened or discarded, new methods are developed, and new physiological changes are discovered. We encourage researchers to consider the generalised best practice guidelines we have reviewed when designing future REDs studies.
Ethics statements
Patient consent for publication
Ethics approval
Not applicable.
References
Footnotes
Twitter @drkateackerman, @MargotRogers_, @LouiseMBurke, @TStellingwerff, @margo.mountjoy, @BSHoltzman
Contributors Scientific contributions: KEA, MAR, IAH, LMB, TS, ACH, MM and BH were involved in the conception of the paper. All authors were involved in drafting, revising and approval of the final manuscript prior to submission.
Funding The authors have not declared a specific grant for this research from any funding agency in the public, commercial or not-for-profit sectors.
Competing interests KEA is a Deputy Editor of the BJSM and an Associate Editor of the BJSM IPHP. MM is a Deputy Editor of the BJSM and a member of the BJSM IPHP Editorial Board. EV is an Associate Editor of the BJSM, an Associate Editor of the BJSM IPHP and Editor in Chief of BMJ Open Sports and Exercise Medicine.
Provenance and peer review Not commissioned; externally peer reviewed.