Identifying college athletes at risk for pathogenic eating
  M T DePalma1, W M Koszewski2, W Romani3, J G Case4, N J Zuiderhof3, P M McCoy3

  1 Ithaca College, Ithaca, NY, USA
  2 University of Nebraska, Lincoln, NE, USA
  3 University of Maryland, Baltimore, MD, USA
  4 Cornell University, Ithaca, NY, USA

  Correspondence to: Dr DePalma, Department of Psychology, Ithaca College, Ithaca, NY 14850, USA; depalma@ithaca.edu

Abstract

Objectives: To evaluate the effectiveness of a discriminant function that predicts risk of pathogenic eating in comparison with a standard self report measure (EAT) and a clinical interview. In addition, to determine the effectiveness of this discriminant function using a variety of collegiate athletes.

Methods: A total of 319 participants were asked to complete a series of self report measures that assessed dietary practices. In addition, anthropometric measures were obtained, and a random sample of 15% participated in a structured clinical interview.

Results: Correlational analyses indicated that the discriminant function categorisation of risk was significantly related to both the clinical interview and EAT (p≤0.05). The discriminant function was accurate in predicting risk category in this diverse group of athletes, particularly with respect to those at low risk (83.1%) and those at high risk (72.7%).

Conclusion: This information may be helpful in the development of a simple, accessible tool to identify athletes at risk of engaging in pathogenic eating behaviours.


The terms “normal eating”, “pathogenic eating”, “disordered eating”, and “eating disorders” have been used to describe a continuum of individual eating behaviours.1 A pathogenic eater may routinely engage in chronic dieting, fasting, laxative use, and/or self induced vomiting during certain times of the year—for example, an in-season athlete trying to achieve or maintain a certain weight.2 A person is considered to be a disordered eater if they engage in bingeing, purging, food restriction, prolonged fasting, and/or the use of diet pills or diuretics, have a strong preoccupation with food, and develop a distorted body image.3, 4 Only after engaging in these behaviours and meeting the strict DSM-IV criteria is a person classified as having an eating disorder.5 As a result of the strict DSM-IV diagnostic criteria, the number of people exhibiting dangerous pathogenic eating behaviours is actually much higher than the prevalence of diagnosed disorders.2, 6

College athletes are particularly susceptible to the dangers of developing eating disorders.7–11 Skolnick10 reported “seriously abnormal eating patterns” in 15.4% of female college swimmers and 62% of college gymnasts. Thiel et al11 reported that 11% of male rowers and wrestlers in their study displayed eating disorders, and 52% of this population engaged in bingeing as a method of weight control. Similarly, DePalma et al12 reported that 9.9% of lightweight football players were at risk for eating disorders, and fully 42% practised dysfunctional eating behaviour to meet the weight restrictions of their sport. Participants in activities that stress low body weight and a slim shape for professional competence, such as dance, gymnastics, and wrestling, appear to be at greater risk of developing eating disorders.11, 12 Moreover, certain traits such as a goal oriented, perfectionist personality may act as internal pressures, and a traumatic event or coercion from a coach or other important athletic personnel may act as an external catalyst in a high risk person.3, 6 The increased susceptibility of athletes to pathogenic eating is a serious concern because of the increased physiological demands placed on athletes compared with a sedentary population. Furthermore, these pathogenic behaviours may progress to eating disorders, which could increase the risk of future health complications, including muscle weakness, cardiac palpitations, amenorrhoea, decreased oestrogen levels, disruption of bone formation, lower testosterone, renal complications, and coronary heart disease.13–16

Screening for those at risk is important for early intervention. An efficient and effective screening tool should include the ability to: associate behavioural or dietary factors with specific “diagnostic signs or health outcomes” in a valid and explicit manner17; assess the representative criteria from a variety of participants exhibiting diverse characteristics; maintain an adequate length so as to minimise the degree of effort required from the participant. The last of these is particularly important, as it is seldom viable to screen large numbers of athletes with lengthy surveys or costly and time intensive clinical interviews.17

For instance, the Diagnostic Survey of Eating Disorders (DSED) is a lengthy survey designed to identify susceptibility to anorexia nervosa or bulimia.18 The Eating Attitudes Test (EAT)19 is used to identify those at risk of developing an eating disorder or to evaluate the effect of treatment.20 A third method of screening for eating pathology is the Survey of Eating Disorders Among Athletes (SEDA).21 This 33 item questionnaire identifies self reported eating pathology as well as factors specific to the athletic environment that may contribute to the eating disorder. The SEDA was developed and revised by professionals exposed to athletic, student, and eating disordered populations, which makes the tool more appropriate for an athletic population. Although these tools appear to be valid and reliable, they can be time consuming and require considerable effort from the participant. Moreover, the DSED and the EAT are not specific to athletes. To adequately assess the potential risk of pathological eating in an athletic population, a shorter, more efficient screening tool that can be administered to a large population without extensive investment in time and resources is needed.

In 1986, Williams et al22 reported the successful use of discriminant analysis to derive self report items useful for accurately classifying a person as a “normal eater”, “dieter”, or “suspected bulimic”. In 1993, DePalma et al12 used a similar technique to classify participants. Their discriminant function comprised the responses from lightweight football participants on only eight items from the DSED, including self reported weight, weighing frequency, and interference with various aspects of the participant's life (fig 1, items 1–3). The researchers subjected this small amount of information to a direct discriminant analysis, and compared these results with the participants' reported frequency of actually engaging in various pathogenic behaviours during the previous month—for example, self induced vomiting, fasting, binge eating, laxative use, etc. The eight item discriminant function could correctly identify those at high risk for pathogenic eating in about 84% of the cases. Thus a categorical assessment of risk (low, moderate, or high) could be created that enables a coach, athletic trainer, or dietitian to identify most of the athletes who are considered to be high risk. These high risk athletes could then be counselled to reduce the negative consequences associated with pathological eating and help prevent the development of a clinical eating disorder.
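For readers who wish to reproduce this style of analysis, the sketch below (in Python, using scikit-learn) shows a direct discriminant analysis of the sort described above. It is not the authors' code: three invented predictors stand in for the eight DSED items, and the values and risk labels are fabricated purely so the example runs.

```python
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

# Columns (hypothetical stand-ins for the eight survey items): self reported
# weight (lb), weighings per week, and an interference rating (0-5).
X = np.array([
    [150, 1, 0], [162, 2, 1], [148, 1, 1],      # low risk profiles
    [140, 5, 2], [133, 6, 3], [145, 4, 2],      # moderate risk profiles
    [126, 14, 5], [130, 12, 4], [122, 13, 5],   # high risk profiles
], dtype=float)
y = np.array(["low"] * 3 + ["moderate"] * 3 + ["high"] * 3)  # known risk groups

lda = LinearDiscriminantAnalysis()
lda.fit(X, y)             # derive the discriminant function from classified cases
print(lda.predict(X))     # predicted risk category for each athlete
print(lda.score(X, y))    # proportion of grouped cases correctly classified
```

In the study itself, the criterion groups were defined by the reported frequency of pathogenic behaviours during the previous month, and the fitted function was then used to place athletes into low, moderate, or high risk categories.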

Figure 1

Survey used for discriminant analysis including items from the Diagnostic Survey of Eating Disorders (DSED; items 1–3) and the Survey of Eating Disorders among Athletes (SEDA; item 4).

One drawback of the work by DePalma et al was that the eight item discriminant function was developed exclusively on male lightweight football players. Further empirical research is necessary to determine the effectiveness of this model across sport and sex. In addition, it is necessary to validate this inferential tool against a standard self report inventory and a structured clinical interview. Thus we hypothesised that the eight item discriminant function would positively correlate with the EAT and a structured clinical interview. Furthermore, we hypothesised that the discriminant function could differentiate risk within a variety of different athletes.

METHODS

Approval by athletic director and coach

Athletic directors and coaches received a copy of the approved Institutional Review Board for Human Subjects Research proposal and were asked to sign a written release granting permission to recruit student athletes to participate in the study. Of the 32 coaches who were contacted, 25 permitted recruitment of their athletes.

Participants

A total of 746 student athletes were recruited, of whom 319 (42.8% of those surveyed) agreed to participate. These male and female participants came from division I and/or division III cross country, crew, track, wrestling, field hockey, lightweight football, gymnastics, swimming, lacrosse, and basketball, as well as a control group. The resulting sample consisted of 128 men and 191 women, with a mean (SD) age of 19.85 (1.67) years. The only incentive for participation was the opportunity for participants to review their personal results.

Instruments

Participants were asked to complete a series of questionnaires. An abbreviated 45 item version of the DSED18 was used to assess attitudes and behaviours related to food consumption, as well as provide information about current and desired weight, weight history, and dieting, binge eating, self induced vomiting, and laxative use. The 40 item EAT19 was also administered, as was that portion of the SEDA21 that assessed the impact of the athletic environment as a factor contributing to eating pathology (fig 1, item 4).

Data collection

From a prepared statement, a recruiter explained the purpose of the 60–75 minute study. If the subject agreed to participate, he/she was asked to sign an informed consent form. To maintain confidentiality, the participants were identified by a nine digit identification number.

Participants were asked to answer each question candidly, and were assured that their responses were confidential. They were told that coaches would not have access to individual results, only data in an aggregate form. During or upon completion of the questionnaires, an appropriately trained researcher took anthropometric measurements. These measurements included height and weight, from which body mass index was computed.23 Skinfold was also measured with a Lange caliper (Cambridge Scientific Industries, Inc, Cambridge, Massachusetts, USA) using the Jackson-Pollock method.24 Skinfold sites included thigh, chest, and abdomen for men and thigh, triceps, and suprailium for women. From these data, body density and percentage body fat were calculated (table 1).
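As a point of reference, the sketch below shows how these anthropometric values are typically derived. It uses the commonly published Jackson-Pollock 3-site body density equations and the Siri conversion to percentage body fat; the choice of the Siri conversion is an assumption, as the paper does not state which body density to fat equation was used.

```python
def body_mass_index(weight_kg: float, height_m: float) -> float:
    """BMI = weight (kg) divided by height (m) squared."""
    return weight_kg / height_m ** 2

def body_density_jackson_pollock(sum_of_3_skinfolds_mm: float, age_yr: float,
                                 male: bool) -> float:
    """3-site body density: chest, abdomen, thigh for men;
    triceps, suprailium, thigh for women."""
    s = sum_of_3_skinfolds_mm
    if male:
        return 1.10938 - 0.0008267 * s + 0.0000016 * s ** 2 - 0.0002574 * age_yr
    return 1.0994921 - 0.0009929 * s + 0.0000023 * s ** 2 - 0.0001392 * age_yr

def percent_body_fat_siri(body_density: float) -> float:
    """Siri equation: %BF = 495 / body density - 450 (an assumed conversion)."""
    return 495.0 / body_density - 450.0

# Example: a 20 year old woman, 1.70 m, 62 kg, skinfold sum of 45 mm.
bd = body_density_jackson_pollock(45.0, 20.0, male=False)
print(round(body_mass_index(62.0, 1.70), 1),   # ~21.5 kg/m2
      round(percent_body_fat_siri(bd), 1))     # ~18.4%
```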

Table 1

Physical measurements of participants

After about one week, a random sample of participants was contacted by telephone and asked to participate in a 30 minute clinical interview with a registered dietitian. Because of the resources required to administer a clinical interview, only a subsample of 15% (n = 49) was chosen for interview. The interviewer was blind to the participants' self report data, as well as to their anthropometric measurements. The content of the clinical interview, which was the same for all subjects, repeated some aspects of the questionnaires, asked additional information about the participants' definition of “binge”, “purge”, and “fast”, and asked more in-depth questions about their dietary practices. Participants were also asked to recall their food intake over the preceding 24 hours. The clinical interviewer categorised each of these participants as being at high, moderate, or low risk for disordered eating based on the degree to which the participant exhibited unrealistic weight expectations, and/or whether they reported purging, inadequate energy intake, or having an eating disorder. Those who reported no history of anorexia or bulimia, satisfaction with their weight, and reported meeting their energy needs were considered to be low risk. Those who reported not meeting their energy needs, who exhibited unrealistic weight expectations, and who had no history of anorexia or bulimia were considered moderate risk. Anyone who reported a history of anorexia or bulimia, who did not meet energy needs, and exhibited unrealistic weight expectations was regarded as high risk.
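The categorisation rules stated above reduce to a short decision function. The sketch below encodes them literally; the field names are hypothetical, and the interviewer's clinical judgement was of course richer than a handful of yes/no findings.

```python
from dataclasses import dataclass

@dataclass
class InterviewFindings:
    history_of_anorexia_or_bulimia: bool
    meets_energy_needs: bool
    unrealistic_weight_expectations: bool
    satisfied_with_weight: bool

def interview_risk(f: InterviewFindings) -> str:
    # High risk: history of anorexia or bulimia, inadequate energy intake,
    # and unrealistic weight expectations.
    if (f.history_of_anorexia_or_bulimia and not f.meets_energy_needs
            and f.unrealistic_weight_expectations):
        return "high"
    # Low risk: no history, satisfied with current weight, and meeting energy needs.
    if (not f.history_of_anorexia_or_bulimia and f.satisfied_with_weight
            and f.meets_energy_needs):
        return "low"
    # Moderate risk: the remaining profiles, typified by unmet energy needs and
    # unrealistic weight expectations without a reported history of anorexia or bulimia.
    return "moderate"

print(interview_risk(InterviewFindings(False, True, False, True)))  # low
print(interview_risk(InterviewFindings(True, False, True, False)))  # high
```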

Specific objectives and overview of statistical analyses

The data presented here are selected from a more comprehensive study of the dietary practices of college athletes funded by the National Collegiate Athletic Association (NCAA). The statistical analyses are limited to those that deal with two objectives. The first objective of this study was to correlate the classification results from the eight item discriminant function with three important measures:

  1. participant classification by a structured clinical interview;

  2. a standard self report measure (EAT);

  3. a behavioural measure of risk.

This behavioural measure of risk, or the “risk index” (RI), was based on the frequency with which participants reported actually engaging in the various pathogenic dietary behaviours (vomiting, fasting, etc) during the previous month, as recorded in the DSED.
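As an illustration only, an index of this kind might be computed as below. The exact scoring used in the study is not reported, so the unweighted sum of monthly frequencies (and the list of behaviours) is an assumption.

```python
# Behaviours assumed to contribute to the risk index (RI); names are illustrative.
PATHOGENIC_BEHAVIOURS = ("self_induced_vomiting", "fasting", "binge_eating",
                         "laxative_use", "diet_pill_use", "diuretic_use")

def risk_index(monthly_frequencies: dict) -> int:
    """Sum the self reported number of times each pathogenic behaviour was
    practised during the previous month."""
    return sum(monthly_frequencies.get(b, 0) for b in PATHOGENIC_BEHAVIOURS)

print(risk_index({"fasting": 4, "binge_eating": 6, "laxative_use": 1}))  # 11
```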

The second objective was to examine the ability of this discriminant function to detect risk in a new and more varied group of athletes. This analysis classified participants as being at high, moderate, or low risk for pathogenic eating behaviours, and the classification was then compared with each participant's RI.

RESULTS

Table 2 indicates that the eight item discriminant function classification (Pred group) is significantly related to the RI, the clinical interview rating (CLINRATE), and EAT. In summary, all four measures of pathogenic eating are positively and significantly related (p≤0.05).

Table 2

Correlation matrix of risk indices

To satisfy our second objective, the data from all participants were used to assess the usefulness of the eight item discriminant function on the present sample. One discriminant function was calculated (Wilks' λ = 0.75, χ2(6, N = 290) = 82.1, p<0.05). (The total N for each analysis differs somewhat from the output classification tables because of missing data on a discriminating variable.) Chance base rate classification would be 33.3%. As can be seen from table 3, the overall percentage of grouped cases correctly classified was 47.6%. The discriminant analysis was useful in predicting those at low risk (83.1%), and poor at predicting those at moderate risk (29.6%). Most importantly, however, this analysis correctly classified those at high risk 72.7% of the time. Assuming a primary interest in screening for people at high risk, this model produces a false positive rate of 20%, and a false negative rate of 27%.
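For readers unfamiliar with classification tables, the sketch below shows (with invented counts, not the published data) how the per-class hit rates and the false positive and false negative rates for the high risk category are read off a 3 × 3 table of actual versus predicted group.

```python
import numpy as np

classes = ["low", "moderate", "high"]
# Rows = actual risk group, columns = predicted group (invented counts).
table = np.array([
    [83, 12,  5],   # actual low risk
    [30, 24, 27],   # actual moderate risk
    [ 4,  5, 24],   # actual high risk
])

hit_rates = table.diagonal() / table.sum(axis=1)        # per-class accuracy
overall = table.diagonal().sum() / table.sum()          # overall accuracy

hi = classes.index("high")
false_negative_rate = 1 - hit_rates[hi]                 # high risk cases missed
# One common definition of the false positive rate for a screening tool:
# the share of "high risk" predictions that are not actually high risk.
false_positive_rate = table[:hi, hi].sum() / table[:, hi].sum()

print(dict(zip(classes, hit_rates.round(3))), round(overall, 3))
print(round(false_positive_rate, 3), round(false_negative_rate, 3))
```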

Table 3

Discriminant analysis using the total sample

Further, multivariate analysis of variance indicated no significant differences between athletes and controls, men and women, or participants in either NCAA division (Fs(8, 300)<1.87, p>0.05) with regard to the frequency of the practice of pathogenic eating behaviours. These analyses suggest that the discriminating variables can be applied to athletes without regard to sex or NCAA division.
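A multivariate comparison of this kind can be run as sketched below using statsmodels; the data frame, column names, and factor levels are invented for illustration, and the mv_test output reports Wilks' lambda (among other statistics) for each factor.

```python
import numpy as np
import pandas as pd
from statsmodels.multivariate.manova import MANOVA

rng = np.random.default_rng(0)
n = 60
df = pd.DataFrame({
    # Invented monthly frequencies of pathogenic behaviours.
    "vomiting": rng.poisson(0.3, n),
    "fasting": rng.poisson(1.0, n),
    "binge_eating": rng.poisson(2.0, n),
    # Invented grouping factors.
    "sex": rng.choice(["M", "F"], n),
    "division": rng.choice(["I", "III"], n),
})

manova = MANOVA.from_formula(
    "vomiting + fasting + binge_eating ~ sex + division", data=df)
print(manova.mv_test())  # Wilks' lambda, Pillai's trace, etc for each factor
```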

Finally, the analyses indicated that one additional variable increased the predictive value of the discriminant function for athletes: the impact of the athletic environment (SEDA). One discriminant function was calculated (Wilks' λ = 0.71, χ2(8, N = 283) = 93.72, p<0.05). As can be seen from table 4, the overall percentage of grouped cases correctly classified was 50.9%. The discriminant analysis was useful in predicting those at low risk (79.1%), and only slightly better than the original discriminant analysis at predicting those at moderate risk (37.0%). Most importantly, however, this analysis correctly classified people at high risk 71.9% of the time. Table 5 presents the group means across risk level for each of the discriminating variables. Both the degree of interference with various aspects of the participant's life and the impact of the athletic environment (SEDA) increase with level of risk.

Table 4

Discriminant analysis incorporating the impact of the athletic environment (SEDA), using the athletic sample only

Table 5

Group means across risk level for discriminating variables

DISCUSSION

The classifications obtained through the discriminant function are positively and significantly related to several measures of risk of pathogenic eating. Notably, in support of our first hypothesis, the highest correlation (r = 0.54) was with the EAT, a widely used instrument for assessing eating disorders.19 This meaningful correlation was found between these measures, despite the fact that no items from the EAT appear in the discriminant analysis itself (unlike the DSED or SEDA).

The discriminant function used in the earlier study on lightweight football players (self reported weight, weighing frequency, and interference) was repeated on the present sample, which consisted of a variety of different athletes. In support of our second hypothesis, the results are broadly similar (table 3). The present discriminant analysis was very useful in predicting those at low risk, and poor at predicting those at moderate risk. Most importantly, however, more than 70% of individuals at high risk for dysfunctional eating were correctly classified, although there were still some false positives and negatives. This is particularly notable given that individuals at high risk make up only about 11% of this sample. The chance level determination of those at moderate risk should not diminish the importance of this discriminant analysis, because intervention would primarily be directed at those at high risk. This conceptual replication of the DePalma et al study shows that, with only a small amount of information, it is possible to correctly identify high risk individuals about 70% of the time.

The current investigation has also identified a potential variable derived from the SEDA (labelled “athletic environment”) for use in the discriminant analysis that helps to strengthen the classification of those at moderate risk. As table 4 indicates, the addition of this variable to the model continues to allow greater than 70% identification across two levels of risk. The importance of this variable is that it identifies aspects of the athletic environment—for example, public weighing, athletic department personnel comments, or instructions—that may contribute to pathogenic eating behaviours. Table 5 presents the group means across risk level for each of the discriminating variables, which indicate that both the degree of interference with various aspects of the participant's life and the impact of the athletic environment (SEDA) increase markedly with level of risk. This particular section of the SEDA examines whether weight loss was required for performance excellence, to reach aesthetic ideals of beauty, or to meet a lower weight category. In addition, it assesses whether important athletic personnel comment on an athlete's weight or require the athlete to lose weight, and whether public weigh ins or public announcements of weight occur. Should either of the latter two potentially deleterious practices occur, there are relatively easy and straightforward solutions. The athletes' perception of the athletic environment may be one area that warrants further attention and modification.

The responses to the items in fig 1 can provide important information with respectable accuracy (>70%). Although accuracy is, of course, the most important issue, it is not the only one. The present instrument takes only about two minutes to administer and about two minutes to score. As can be seen, the items are short and relatively non-confrontational in nature, and the questions are not likely to generate a high degree of defensiveness. Questions from the DSED, such as “how often do you self induce vomiting?”, or from the EAT, such as “to what degree do you have the impulse to vomit after meals?”, are quite transparent and may generate a level of defensiveness that the present broader measure may not. These assertions about transparency and defensiveness are speculative and remain an empirical question. Given the minimal resources required to administer this instrument, less well resourced institutions could administer this battery when they would otherwise not intervene. Thus, where resources are limited, they could be devoted to those who are likely to be at highest risk for pathogenic eating.

There are, however, several limitations of this study. Firstly, the results can only be applied to people willing to respond to the confidential survey. Participants were informed that coaches only had access to aggregate data, but even this limited exposure may have affected willingness to participate in the study. It is possible that the most affected people chose not to participate, and that the results may not be reliable when all athletes are considered. However, these data indicated that about 11% of respondents are at high risk for pathogenic eating, and this figure is not considerably different from estimates from studies that had more substantial response rates.11, 12, 25 Yates et al25 found that seven of 66 (10.6%) runners with traits similar to patients with anorexia nervosa had EAT scores above 30, indicating a possible eating disorder.25

Secondly, we are cautiously optimistic about the utility of this measure given that these findings have been replicated across sport and sex; however, we do believe that further research on a variety of different athletes is necessary. For example, this study examined only college athletes. We do not yet know if such an analysis would be effective on high school, prep school, community college, division II, or other elite or professional athletes across different regions of the country. Finally, many of these data are based on self report and, as such, are subject to the limitations inherent in this format. Participants may not have responded honestly, and this limitation is difficult to overcome. It is difficult to effectively and continually monitor participants to record actual participation in self induced vomiting, fasting, etc. Thus we are limited to an assessment using the participants' report. Other more costly methods such as metabolic analyses may be considered in future studies of this nature.

In summary, eating disorders are an important problem worthy of empirical attention. However, even engaging in pathogenic eating behaviours that do not meet DSM-IV criteria for eating disorders may have a serious psychological and physiological impact. Standard self report measures are generally designed to detect eating disorders, and can be intrusive, time consuming, and somewhat cumbersome; thus the development of a simple, unobtrusive tool to identify pathogenic eating behaviours in student athletes may be of considerable benefit.

Take home message

Athletes who are at moderate risk for pathogenic eating may not exhibit the symptoms of a full blown eating disorder, but even pathogenic eating can cause health concerns. Resources are needed to develop simpler, more accessible tools to identify these athletes so that they can be referred for nutritional counselling.

Acknowledgments

This research was funded by grants from the National Collegiate Athletic Association (NCAA) and The Presidents Council of Cornell Women. The authors would like to thank Dick Darlington for his statistical guidance, Beth Rasch for her comments on an earlier draft of the manuscript, and Amy Fowle, Marc Greenberg, Victor Shu, and Matt Bennett for their help with data collection.

REFERENCES
