
Research Article - European Journal of Sports & Exercise Science (2018) Volume 6, Issue 3

Calcium, vitamin D and iron status of elite rugby union players during a competitive season.

Corresponding Author:
Deborah Smith
Institute for Sport, Physical Activity and Leisure
Leeds Beckett University, Leeds
United Kingdom
Tel: +44 1138124091
E-mail: D.R.Smith@leedsbeckett.ac.uk

Abstract

Sub-optimal calcium, vitamin D and iron intakes are typical in athletes. However, quantification by dietary intake may be erroneous, with biomarkers providing a more accurate assessment. This study aimed to determine the calcium, vitamin D and iron status of 8 junior (i.e., under-18 [U18]; age 15.5 ± 0.5 years; height 180.4 ± 6.7 cm; body mass 81.6 ± 14.3 kg) and 12 senior (i.e., over-18 [O18]; age 19.7 ± 1.8 years; height 184.9 ± 6.9 cm; body mass 97.4 ± 14.4 kg) male rugby union players, and assess their adequacy against reference values. Fasted serum calcium, 25(OH)D and ferritin concentrations were analysed using enzyme-linked immunosorbent assays during the in-season period (March-April). U18 players very likely had greater calcium concentrations than O18 players (2.40 ± 0.08 vs. 2.25 ± 0.19 mmol.l-1). Differences between U18 and O18 were unclear for 25(OH)D (20.21 ± 11.57 vs. 29.02 ± 33.69 nmol.l-1) and ferritin (59.33 ± 34.61 vs. 85.25 ± 73.53 µg.l-1). Compared to reference values, all U18 players had adequate serum calcium concentrations, whereas 33% and 67% of O18 players were deficient and adequate, respectively. All U18 and 83% of O18 players had severely deficient, deficient or inadequate vitamin D concentrations. Adequate (8%) and optimal (8%) vitamin D concentrations were observed in O18 players. All U18 and 75% of O18 players had adequate ferritin concentrations. Potential toxicity (17%) and deficient (8%) ferritin concentrations were observed in O18 players. Vitamin D intake should be increased and multiple measures obtained throughout the season. More research is required on the variation of micronutrient status.

Keywords

Micronutrients, Team sports, Athlete, 25(OH)D, Ferritin, Deficiency.

Introduction

Sub-optimal calcium, vitamin D and iron intakes could impair health and performance and are frequently observed in athletes. Although inadequate intakes of other micronutrients are also observed, the aforementioned are the most commonly supplemented by athletes to correct a deficiency and should therefore be monitored. Such deficiencies are often identified from dietary assessment, comparing reported intakes to age-specific reference nutrient intakes (e.g., 15 to 18 years and 19 to 64 years). This can be erroneous given that athletes typically under-report energy intake by 19% [1-4], suggesting micronutrients may also be under-reported. To accurately capture the variability in micronutrient intake, prolonged periods of diet recording are required. Specifically, recording periods of 74 and 68 days are required to obtain a true average within 10% for estimates of calcium and iron intake in males, respectively [5]. Providing supplementation based on under-estimated dietary intakes could therefore lead to overconsumption of micronutrients and may cause detrimental side effects in some cases. Thus, it is important to accurately measure the micronutrient status of athletes using biomarkers that allow comparisons of blood concentrations to reference data.

Given the importance of calcium, vitamin D and iron for growth and development [6], practitioners should monitor these micronutrients.

Calcium is mainly required for bone mineral accrual [7], and is essential for optimising peak bone mass in order to reduce the risk of developing osteoporosis later in life. It is also important for the regulation of muscle contractions and the nervous system [1]. In adolescent male athletes (12 to 14 years old; swimmers, footballers and cyclists), adequate total serum calcium concentrations have been observed, ranging from 2.49 ± 0.10 to 2.51 ± 0.11 mmol.l-1 and similar to non-sporting controls [8]. However, sub-optimal dietary intakes of calcium have been observed in collegiate Japanese rugby players, and under-16 rugby league players have been reported to under-consume foods and drinks from the milk and dairy food group [9]. As such, the calcium status of adolescent rugby players warrants exploration. Given that approximately 40% of bone mineral content is accrued from -2 to +2 years around peak height velocity [10], ensuring adequate calcium status during the adolescent years is key.

Vitamin D facilitates calcium absorption, muscle repair and remodelling, and plays an important role in immune function [11]. Despite senior rugby players in the United Kingdom (UK) demonstrating greater vitamin D concentrations than other athletes and healthy controls during the winter months (November-January), 20% were considered inadequate (30 to 50 nmol.l-1) and 94% did not reach suggested optimal concentrations for sports performance (>100 nmol.l-1) [12]. In the general UK population, 14 to 18 year olds demonstrated inadequate vitamin D concentrations during the winter period, which decreased further by spring (October: 46.8 ± 11.4; March: 30.7 ± 8.6 nmol.l-1) [13]. Data are lacking in adolescent athletes; however, it is likely that vitamin D levels observed in this cohort are sub-optimal for health and rugby performance, particularly during the rugby union (RU) competitive season (September-May).

Iron facilitates oxygen transport, adenosine triphosphate production and deoxyribonucleic acid synthesis [6]. Thus, inadequate iron status can increase muscle fatigue and decrease work capacity. A primary cause of inadequate iron status is insufficient dietary intake. Despite this, collegiate Japanese rugby players (positional backs) reported inadequate consumption of iron while maintaining adequate serum ferritin concentrations (forwards, 73.4 ± 28.8; backs, 47.7 ± 17.6; controls, 72.0 ± 37.3 μg.l-1).

Given that dietary intake was assessed retrospectively using food frequency questionnaires, intakes may have been under-reported. Furthermore, in younger male athletes from a range of sports, dietary iron intake did not differ between those with low (<35 μg.l-1) and adequate ferritin concentrations. However, a greater proportion of under-18 year olds in the same study demonstrated low iron status (35%) compared to over-18 year olds (13%). Relationships between ferritin concentration and age were unclear in older (23 ± 3 years) rugby 7s players [14-17]. The use of biomarkers to measure iron status is therefore warranted in younger players, and has yet to be undertaken in UK rugby players.

Inadequate calcium, vitamin D and iron concentrations will likely lead to reduced sporting performance and may have negative short and long-term health implications [6]. Therefore, this study aimed to determine the calcium, vitamin D and iron status in elite English RU players from different age groups, and assess their adequacy against reference values.

Methods

Study design

A cross-sectional design was used to investigate the micronutrient status of 20 elite male English RU players during their competitive season (March-April). Fasted total serum calcium, 25-hydroxy vitamin D (25(OH)D) and ferritin concentrations were analysed to assess the adequacy of calcium, vitamin D and iron status, in comparison to reference values.

Participants

Eight junior (i.e., under-18 [U18]; age 15.5 ± 0.5 years; height 180.4 ± 6.7 cm; body mass 81.6 ± 14.3 kg) and twelve senior (i.e., over-18 [O18]; age 19.7 ± 1.8 years; height 184.9 ± 6.9 cm; body mass 97.4 ± 14.4 kg) male rugby union players based in the UK at latitude ~54°N were recruited from a professional Championship RU club and their respective Regional Academy. Participants were classified as successful elite athletes [18].

A declaration was completed by each player prior to sample collection to ensure micronutrient status was not a result of supplementation. Ethics approval for the study was granted by Leeds Beckett University. Written informed consent was sought from senior players, and written informed player assent and parental consent was obtained for junior players (i.e., U18).

Blood samples

Players presented to the institution's laboratory during March and April following a 12 hour overnight fast. Venous blood samples (18 ml) were drawn from the antecubital fossa into serum-separating tubes (SST™; BD Vacutainer®, Plymouth), inverted and then left for a minimum of 30 minutes at room temperature. Samples were separated by centrifugation at 3000 rpm for 10 minutes at 4°C.

Serum was pipetted into 2 × 2.0 ml microtubes for each measure, and then stored at -40°C until further analysis.

Calcium analysis

Stored serum samples were dispatched to Leeds Teaching Hospital for analysis. Serum calcium concentrations were determined using a Calcium Concentrated Assay (Siemens Healthcare Diagnostics Inc., Tarrytown, NY), performed automatically on the ADVIA Chemistry XPT System (Siemens Healthcare Diagnostics Inc., Tarrytown, NY).

Results were calculated by the system based on a wavelength of 658/694 nm, providing total calcium concentrations in mmol.l-1.

25(OH)D analysis

25(OH)D was analysed using an enzyme-linked immunosorbent assay (ELISA) (IBL International, Germany) and assayed in duplicate. Unbound target analytes were washed according to the manufacturer's instructions using a plate washer (Wellwash®, Thermo Fisher Scientific Inc.). An orbital shaker (Troemner LLC, Thorofare, NJ) was used to incubate plates according to manufacturer guidelines. Bound target analytes were analysed on a microwell plate reader (MultiScan® GO, Thermo Fisher Scientific Inc.) at a wavelength of 450 nm within 1 hour.

The mean of duplicate determinations and B/B0 (%) were calculated, where B is the intensity of a sample well and B0 is the maximum intensity (optical density (OD) of each calibrator, control or sample divided by the OD of the “0” calibrator, multiplied by 100). Values of the unknowns were read from a 4-parameter curve where mean OD was plotted on the Y-axis and calibrator concentration on the X-axis. Intra-assay coefficient of variation was 10.0%.
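As an illustration of this curve-read procedure, a minimal sketch is given below (not the authors' actual analysis code; the calibrator concentrations and ODs are hypothetical placeholders). A 4-parameter logistic curve is fitted to the calibrators and then inverted to read an unknown concentration from a sample's mean duplicate OD; the same approach applies to the ferritin assay described below.

```python
# Minimal sketch: fit a 4-parameter logistic (4PL) calibration curve and read
# an unknown 25(OH)D concentration from a sample's mean duplicate OD.
# Calibrator concentrations and ODs are hypothetical placeholders.
import numpy as np
from scipy.optimize import curve_fit

def four_pl(x, a, b, c, d):
    # 4PL model: OD as a function of calibrator concentration (nmol/l)
    return d + (a - d) / (1.0 + (x / c) ** b)

def read_concentration(mean_od, a, b, c, d):
    # Invert the fitted curve to convert a sample's mean OD to a concentration
    return c * ((a - d) / (mean_od - d) - 1.0) ** (1.0 / b)

# Hypothetical calibrators (a near-zero value stands in for the "0" calibrator)
calib_conc = np.array([0.1, 10.0, 25.0, 50.0, 100.0, 200.0])   # nmol/l
calib_od = np.array([2.10, 1.75, 1.40, 0.95, 0.55, 0.30])      # mean duplicate OD

params, _ = curve_fit(four_pl, calib_conc, calib_od,
                      p0=[2.2, 1.0, 50.0, 0.2], maxfev=10000)

sample_od = 1.20                              # mean of a duplicate determination
b_over_b0 = sample_od / calib_od[0] * 100     # B/B0 (%) relative to the "0" calibrator
print(read_concentration(sample_od, *params), b_over_b0)
```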

Ferritin analysis

Serum ferritin was analysed using ELISA (IBL International, Germany) and assayed in duplicate. Incubation and wash plates were used according to manufacturer guidelines. Bound target analytes were analysed on a microwell plate reader at a wavelength of 450 nm within 20 minutes of adding the stopping solution.

The mean OD of the “0” calibrator was subtracted from the mean OD of each calibrator, control and serum sample. Values of the unknowns were read from a 4-parameter curve where mean OD was plotted on the Y-axis and calibrator concentration on the X-axis. The intra-assay coefficient of variation was 7.4%.

Micronutrient reference values

The adequate 25(OH)D range recommended for athletes was defined as 50 to 100 nmol.l-1 (Close et al.). Data below this range were classified as severely deficient (<12.5 nmol.l-1), deficient (12.5 to 30 nmol.l-1) or inadequate (30 to 50 nmol.l-1), while data above this range were classified as optimal (>100 nmol.l-1) or toxic (>180 nmol.l-1) [11]. The adequate ranges recommended for total serum calcium (2.2 to 2.6 mmol.l-1) [19] and ferritin (15 to 200 μg.l-1) [20] in athletes are the same as for the general population.

Concentrations below these thresholds indicate deficiency, while potential overload or toxicity was classified as >2.75 mmol.l-1 [20] and >200 μg.l-1 [21], respectively.
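For illustration only (not part of the original analysis, and with the handling of values falling exactly on a boundary assumed), the 25(OH)D reference bands above could be applied programmatically as follows:

```python
# Map a serum 25(OH)D concentration (nmol/l) onto the categories defined above [11].
# Handling of values that fall exactly on a boundary is an assumption.
def classify_25ohd(nmol_per_l: float) -> str:
    if nmol_per_l > 180:
        return "toxic"
    if nmol_per_l > 100:
        return "optimal"
    if nmol_per_l >= 50:
        return "adequate"
    if nmol_per_l >= 30:
        return "inadequate"
    if nmol_per_l >= 12.5:
        return "deficient"
    return "severely deficient"

print(classify_25ohd(29.02))   # 'deficient', e.g. the O18 group mean reported in the Results
```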

Analyses of data

Data are presented as means ± standard deviation (SD), and compared to recommendations (i.e., micronutrient status vs. reference values) by age group (i.e., U18 and O18). For statistical analyses, all data were log transformed to reduce bias. Effect sizes were calculated and interpreted as trivial (<0.2), small (0.2 to 0.6), moderate (0.6 to 1.2), large (1.2 to 2.0), very large (2.0 to 4.0) or extremely large (>4.0). Magnitude-based inferences were calculated for practical significance. The threshold used for the observed change was 0.2 (i.e., a small effect; mean difference divided by the between-subject SD) [22,23].

Magnitudes for the observed change, based on 90% confidence interval (CI), were almost certainly not (<0.5%); very unlikely (0.5%-5%); unlikely (5%-25%); possibly (25%-75%); likely (75%-95%); very likely (95%-99.5%); or almost certainly (>99.5%). Effects with CI crossing upper and lower boundaries of the smallest worthwhile difference (± 0.2), were described as unclear [24].
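As a simplified illustration of this approach (an assumed sketch, not the authors' exact calculation), the standardised difference can be computed on log-transformed data and the chances that the true effect exceeds the ±0.2 smallest worthwhile difference estimated from a t-distribution:

```python
# Simplified magnitude-based inference sketch: standardised difference on
# log-transformed data, plus the chances that the true effect exceeds the
# smallest worthwhile difference of +/-0.2.
import numpy as np
from scipy import stats

def magnitude_based_inference(group_a, group_b, swc=0.2):
    a = np.log(np.asarray(group_a, dtype=float))
    b = np.log(np.asarray(group_b, dtype=float))
    n_a, n_b = len(a), len(b)
    pooled_sd = np.sqrt(((n_a - 1) * a.var(ddof=1) + (n_b - 1) * b.var(ddof=1))
                        / (n_a + n_b - 2))
    effect = (a.mean() - b.mean()) / pooled_sd      # standardised difference
    se = np.sqrt(1.0 / n_a + 1.0 / n_b)             # approximate SE in SD units
    df = n_a + n_b - 2
    p_greater = 1 - stats.t.cdf(swc, df, loc=effect, scale=se)   # chance effect > +0.2
    p_lower = stats.t.cdf(-swc, df, loc=effect, scale=se)        # chance effect < -0.2
    return effect, p_greater, p_lower

# Usage with the study's groups would look like (hypothetical variable names):
# effect, p_up, p_down = magnitude_based_inference(u18_calcium, o18_calcium)
```

The returned probabilities can then be mapped onto the qualitative descriptors listed above (e.g., 75% to 95% = likely).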

Results

It was very likely that U18 RU players had greater calcium concentrations than O18 players (2.40 ± 0.08 vs. 2.25 ± 0.19 mmol.l-1; Figure 1). The difference between U18 and O18 was unclear for 25(OH)D (20.21 ± 11.57 vs. 29.02 ± 33.69 nmol.l-1; Figure 2) and ferritin (59.33 ± 34.61 vs. 85.25 ± 73.53 μg.l-1; Figure 3).

Figure 1: Biomarker measures of calcium status in elite male English rugby players.

Figure 2: Biomarker measures of vitamin D status in elite male English rugby players.

Figure 3: Biomarker measures of iron status in elite male English rugby players.

All U18 players had adequate serum calcium concentrations, whereas 33% and 67% of O18 players had deficient and adequate serum calcium concentrations, respectively (Figure 1). All U18 and 83% of O18 players had severely deficient, deficient or inadequate vitamin D concentrations (Figure 2).

Adequate (8%) and optimal (8%) concentrations of vitamin D were observed for O18 players. Adequate ferritin concentrations were observed in all U18 and 75% of O18 players (Figure 3). Potential toxicity (17%) and deficiencies (8%) were observed for ferritin in O18 players.

Discussion

This study quantified and determined the adequacy of calcium, vitamin D and iron biochemical concentrations in elite male junior (U18) and senior (O18) RU players, during an in-season period. Calcium status was greater for U18 than O18 players, where 33% of O18 players were categorised as deficient. No clear differences were observed by age group for vitamin D or ferritin concentrations due to individual variation.

However, severely deficient, deficient or inadequate vitamin D status was observed in all U18 and 83% of O18 players. Adequate ferritin status was observed in the majority of players. Calcium and ferritin status should be monitored individually, given the variation in concentrations observed, especially in O18 players. Because the majority of players presented with vitamin D deficiencies or inadequacies and no differences were observed by age, practitioners should monitor vitamin D status frequently to support optimal health and performance.

Differences in calcium status were observed by age group, with greater variability in the O18 players. Despite concentrations of calcium for both age groups being less than previously observed in adolescent male football players (2.49 ± 0.1 mmol.l-1) [8] and older male athletes (2.45 ± 0.1 mmol.l-1) [25], mean calcium concentrations (U18, 2.40 ± 0.08 and O18, 2.25 ± 0.19 mmol.l-1) were within adequate ranges. Given that adolescent rugby players have previously demonstrated adequate dietary intakes in comparison to sports nutrition recommendations [9], and the majority of rugby players in this study had adequate calcium concentrations, a focus on calcium intake as previously suggested by sports nutrition recommendations [1,6] may not be as imperative for RU players, if the findings from this cohort are generalizable.

Calcium deficiency was observed in 33% of O18 players. These players also presented with deficient or inadequate 25(OH)D concentrations. A low serum calcium concentration alone results in lower absorption of calcium into bone, which, coupled with vitamin D deficiency (where calcium absorption from the diet is reduced), could result in reduced bone mass [7]. Additionally, the high protein intakes (>2 g.kg-1.day-1) often observed in academy rugby players [9] may impact bone mineral content [26].

Further research should be undertaken to investigate the effect of calcium and vitamin D insufficiency on bone health in elite rugby players, considering dietary intakes, activity levels and musculoskeletal loading across the different training groups.

It is recognised that, due to the highly regulated nature of serum calcium concentrations, there is a lack of agreement on biochemical measures to determine calcium deficiency [27]. While serum calcium is an important determinant of calcium balance and body content, it is also a measure of calcium concentration in extracellular fluid [28], and thus does not necessarily reflect total body calcium content. Practitioners should be aware of other physical signs of calcium deficiency from chronically low intake, such as low bone density, hypertension, impaired muscle contraction, muscle cramps, tetany and convulsions [29], when identifying a calcium deficiency.

Differences by age group for vitamin D status were unclear in this study, which could have been due to the individual variance in the O18 players (U18, 8.4 to 44.6; O18, 4.4 to 117.7 nmol.l-1). However, the majority of players were classified with a vitamin D deficiency or severe deficiency. Concentrations observed in this study for U18 were less than those observed in 14-18 year old males from the general UK population during the same month (20.21 ± 11.57 vs. 30.7 ± 8.6 nmol.l-1, respectively). The latter study demonstrated that 25(OH)D concentrations decreased over the winter period [13], therefore players were potentially at their lowest level of vitamin D concentration by the start of spring (i.e. March).

This is the end of the RU season, and also when teams compete in ‘playoff’ matches, which determine the winners of their respective league or competition, as well as relegation and promotion between leagues. O18 players in this study had lower 25(OH)D concentrations than senior rugby league (RL) players (29.02 ± 33.69 vs. ~65 nmol.l-1, respectively), who were measured during the winter months (November-January) [12]. This could be explained by 25(OH)D concentrations decreasing further from the winter to spring period. Differences may also be due to the RL off-season taking place throughout October, when players typically travel to countries providing greater exposure to ultraviolet-B radiation, increasing vitamin D synthesis.

Without adequate vitamin D status, health could be compromised (Owens et al.) at a crucial time of the RU competitive season. While there is some evidence to suggest that performance measures such as sprint time and jump height [11,12] and adaptations to intense exercise [30] are improved with vitamin D supplementation, others suggest that, beyond increasing 25(OH)D concentrations to adequate levels for general health, supplementation does not provide any ergogenic effects in rugby union players [31].

Dietary intake data, including supplement consumption, have demonstrated vitamin D intake to be as low as 3.5 μg.day-1 for 25 year old Spanish basketball players [32] and 5.1 to 7.1 μg.day-1 for 20 year old collegiate athletes [27]. Increasing dietary intake from the limited vitamin D rich foods (e.g. salmon, eggs, mushrooms) to meet the reference nutrient intake of 10 μg.day-1 would ensure 25(OH)D concentrations >25 nmol.l-1 during periods of minimal ultraviolet-B sun exposure [21]. With vitamin D3 supplementation of 500 μg.week-1 for 6 to 12 weeks, it has been demonstrated that athletes are able to increase 25(OH)D concentrations to >50 nmol.l-1 (adequate), even when 57% of athletes demonstrated less than adequate concentrations at baseline [12].

However, the guidance level for supplementation of vitamin D in the general population is more conservative, set at 25 μg.day-1 (i.e., 175 μg.week-1) for UK adults [21]. In this study, one player had a 25(OH)D concentration >100 nmol.l-1, and supplementation may push them towards toxic levels. Previous research in New Zealand rugby players has demonstrated that fortnightly vitamin D3 supplementation of 50,000 IU (1250 μg) increased the 25(OH)D concentrations of players already at optimal levels (125 nmol.l-1) to 149 nmol.l-1 [31]. Thus, an individual approach is required, where dietary intakes should be optimised initially, supplementation considered if necessary (i.e., following biochemical assessment) and multiple measures of serum 25(OH)D obtained across the season.
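For clarity, the dose figures above can be cross-checked using the standard equivalence of 40 IU per microgram of vitamin D3 (a conversion not stated in the original text):

```python
# Cross-check of the vitamin D3 dose conversions mentioned above (1 ug = 40 IU).
IU_PER_MICROGRAM = 40

fortnightly_dose_iu = 50_000
print(fortnightly_dose_iu / IU_PER_MICROGRAM)   # 1250.0 ug, as stated above

daily_guidance_ug = 25
print(daily_guidance_ug * 7)                    # 175 ug per week
```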

Differences in ferritin concentrations for U18 and O18 players were unclear (59.33 ± 34.61 vs. 85.25 ± 73.53 μg.l-1) due to the variance in O18 players, which supports previous research where significant differences by age group or relationships with age were not observed. U18 players in this study had lower concentrations than collegiate Japanese rugby players (forwards, 73.4 ± 28.8 and backs, 47.7 ± 17.6 μg.l-1) and were similar to concentrations observed in young athletes from other ball/team sports (60.3 ± 37.2 μg.l-1) [15-17].

Ferritin concentrations of O18 players in this study were at the lower end of the range reported throughout the competitive season for senior RU players (77.7 ± 21.1 to 114.7 ± 50.2 μg.l-1) [33]. Despite this, mean concentrations in this study were still adequate; only one O18 player presented with a deficiency, and 17% of O18 players presented with concentrations greater than the adequate threshold.

High ferritin concentrations (>200 μg.l-1) are associated with cellular damage resulting from infection and inflammation, with the latter being common in rugby players due to exercise and impact-induced muscle damage [34]. Muscle damage is observed for 72 to 120 hours post-match in rugby players [35,36] and could therefore have contributed to the increased ferritin levels in this study, as it was not possible to restrict players from training for more than 48 hours prior to data collection.

In both Australian rugby 7s players and Italian rugby union players, ferritin levels were observed to fluctuate over a season, with a ~20% decrease from pre- to mid-season and an increase towards the end of the competitive season in both codes [17,33].

Therefore, measures in this study could have been assessed at a time in the competitive season where ferritin levels are expected to be at their lowest. Given that the majority of players in this study were observed to have adequate serum ferritin concentrations during the in-season period, iron intake may not be as pertinent in RU players as previously suggested in the literature providing sports nutrition recommendations [1,6].

Conclusion

This study is the first to undertake biomarker assessment to determine the adequacy of key micronutrients in elite English rugby players. Although differences between junior and senior rugby players were not clear for vitamin D and iron concentrations, the individual variance of all micronutrients was greater in the senior players.

Regardless of age, increased vitamin D intake should be considered during the RU competitive season, and multiple measures of serum 25(OH)D obtained across the season. Calcium and iron intake may not be as pertinent in rugby players as previously recommended in the sports nutrition literature; however, additional physical signs should be considered when identifying calcium deficiency.

Acknowledgement

The study was designed by DRS, BJ and LCD; data were collected and analysed by DS; data interpretation and manuscript preparation were undertaken by DRS, BJ, LCD, LS and RFGJK. All authors approved the final version of the paper. The authors would like to thank all of the players and coaching staff involved in this project. This research was part funded by Leeds Rugby as part of the Carnegie Adolescent Rugby Research (CARR) project.

References