Author:
Wyatt, Paige M.; Drager, Kelly; Groves, Erik M.; Stellingwerff, Trent; Billington, Emma O.; Boyd, Steven K.; Burt, Lauren A.
Source:
Calcified Tissue International; Oct 2023, Vol. 113, Issue 4, p403-415, 13p
Abstract:
Relative Energy Deficiency in Sport (REDs) is a syndrome describing the relationship between prolonged and/or severe low energy availability and negative health and performance outcomes. The high energy expenditures incurred during training and competition put endurance athletes at risk of REDs. The objective of this study was to investigate differences in bone quality between winter endurance athletes classified as low-risk versus at-risk for REDs. Forty-four participants were recruited (M = 18; F = 26). Bone quality was assessed at the distal radius and tibia using high-resolution peripheral quantitative computed tomography (HR-pQCT) and at the hip and spine using dual-energy X-ray absorptiometry (DXA). Finite element analysis was used to estimate bone strength. Participants were grouped using modified criteria from the REDs Clinical Assessment Tool Version 1. Fourteen participants (M = 3; F = 11) were classified as at-risk of REDs (≥ 3 risk factors). Measured with HR-pQCT, cortical bone area (radius) and bone strength (radius and tibia) were 6.8%, 13.1%, and 10.3% lower, respectively (p = 0.025, p = 0.033, p = 0.027), in at-risk compared with low-risk participants. Using DXA, femoral neck areal bone density was 9.4% lower in at-risk compared with low-risk participants (p = 0.005). At-risk male participants had 21.9% lower femoral neck areal bone density (via DXA) than low-risk males (p = 0.020), with no significant differences in females. Overall, 33.3% of athletes were at-risk for REDs and had lower bone quality than those at low risk. [ABSTRACT FROM AUTHOR]
Database:
Complementary Index |