TOPIC: Cardiology
STUDY TYPE: Other
Incidence of Sudden Cardiac Arrest and Death in NCAA Athletes: A 9-Year Surveillance Study
Katherine Wainwright, MD, MSc
All Other Authors: Brad Petek, MD, Maxey Cherel, Randi Delong, MPH, Kristen Kucera, PhD, Bridget Whelan, MPH, Jonathan Drezner, MD, and Kimberly Harmon, MD
Affiliation: University of Washington, Seattle, WA.
Purpose: Cardiovascular conditions are the leading cause of sudden death in sport. The purpose of this study was to determine the incidence and etiology of sudden cardiac arrest and death (SCA/D) in NCAA athletes using a multimethod approach to case identification.
Methods and Study Design: Cases of SCA/D in NCAA athletes were compiled over 9 years (July 2014–June 2023) from the National Center for Catastrophic Sport Injury Research, UW-NCAA Database, media reports, and a survey to NCAA institutions. Autopsy/coroner reports were reviewed to determine etiology of death. Incidence rates were calculated using athlete participation and demographics from the NCAA Demographic Database.
Results: One hundred cases of SCA/D were identified (47 SCA, 53 SCD) in 4,535,619 athlete-years (AY). The incidence of SCD was 1:85,577 AY, and the incidence of SCA/D was 1:45,356 AY (95% CI 1:37,291–1:55,745). Incidence of SCA/D was higher in males (1:29,659 AY; 95% CI 1:24,015–1:37,080) than in females (1:141,782 AY; 95% CI 1:84,503–1:259,338) (IRR 4.78, 95% CI 2.72–8.41).
Conclusions: Incidence of SCA/D inclusive of cases with survival was high, especially in males, black athletes, and the sports of men’s basketball, football, and men’s track/cross country. Incidence numbers relying solely on SCD underestimate the rate of SCA/D by almost half. Hypertrophic cardiomyopathy and idiopathic left ventricular hypertrophy were the most common causes of SCA/D in this cohort of college athletes.
Significance: A better understanding of the epidemiology of SCA/D should inform prevention strategies. Cardiovascular screening inclusive of ECG should be considered in NCAA athletes, especially in higher-risk populations.
Acknowledgements: This research is supported by the University of Washington Center for Sports Cardiology and, in part, by the National Center for Catastrophic Sport Injury Research (NCCSIR) at the University of North Carolina at Chapel Hill. NCCSIR is supported by the National Collegiate Athletic Association (NCAA), the National Federation of State High School Associations (NFHS), the American Football Coaches Association (AFCA), the National Athletic Trainers’ Association (NATA), the National Operating Committee on Standards for Athletic Equipment (NOCSAE), and the American Medical Society for Sports Medicine (AMSSM). Conclusions drawn from or recommendations based on the data provided by the NCCSIR are those of the author(s) and do not necessarily represent the official views of the NCCSIR or any of the supporters.
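For readers checking the arithmetic, the "1 per X athlete-years" figures above follow directly from the reported case counts and total exposure. A minimal sketch (the helper name is illustrative, not from the study):

```python
# Sketch: deriving "1:X athlete-years" incidence figures from the abstract's
# counts (100 SCA/D cases, 53 SCD cases, 4,535,619 athlete-years of exposure).
# The function name is my own, not the authors'.
def one_per_x(cases, athlete_years):
    """Denominator X for an incidence expressed as '1:X athlete-years'."""
    return int(athlete_years / cases)

TOTAL_AY = 4_535_619
print(f"SCA/D: 1:{one_per_x(100, TOTAL_AY):,} AY")  # 1:45,356 AY
print(f"SCD:   1:{one_per_x(53, TOTAL_AY):,} AY")   # 1:85,577 AY
```

The confidence intervals around these point estimates would come from an exact Poisson method, which is not reproduced here.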
TOPIC: Cardiology
STUDY TYPE: Cohort
Prevalence and Outcomes of T Wave Inversions on Screening EKGs in High School Athletes: An Analysis of 14 Years of Screenings
Zachary Morehouse, DO
All Other Authors: Noah Goldstein, DO, Dallas McCorkle, DO, and David Price, MD, FAMSSM
Affiliation: Atrium Health Carolinas Medical Center, Department of Family Medicine, Charlotte, NC.
Purpose: T-wave inversions (TWI) on screening EKGs have been identified as markers of cardiac pathology that put athletes at risk for sudden cardiac arrest/death. We sought to identify the prevalence of TWI on screening EKG, restrictions placed on athletes, and subsequent clinical cardiac outcomes.
Methods and Study Design: Retrospective cohort analysis of 14 years (2010–2024) of mass EKG preparticipation screenings in high school athletes with associated chart reviews. All identified T-wave inversions (TWI) were followed up through electronic health record review, patient and family interviews, and exhaustive media searches to determine sport participation status and clinical outcomes following TWI identification.
Results: From 2010 to 2024, 16,336 high school athletes were screened. Of these, 281 had abnormal EKG findings using modern athlete-specific EKG interpretation criteria (4 were removed based on EKG criteria updated after screening occurred). T-wave inversions (TWI) were present in 53 athletes (0.33%).
Conclusions: TWI showed a low prevalence in this 14-year analysis of EKG screening, and most athletes with TWI were cleared for sport after workup. However, 15% of athletes in our cohort were restricted from sport due to high-risk cardiac pathology. Given that clearance status changed for 1 athlete at follow-up and another had a cardiac arrest after noncompliance with restrictions, emphasizing adherence to these recommendations is vital.
Significance: TWI on EKG may signal existing cardiac pathology or pathology that may develop over time. It is critical that physicians ensure athletes understand the importance of following restrictions, which may also change over time.
TOPIC: Concussion
STUDY TYPE: Cohort
The Effect of Region and Game Development on Headgear-Associated Protection From Concussions in High School Girls’ Lacrosse
Daniel Herman, MD, PhD, FAMSSM
All Other Authors: Patricia Kelshaw, PhD, ATC, Meredith Kneavel, PhD, Andrew Lincoln, ScD, and Shane Caswell, PhD, ATC
Affiliation: University of California at Davis, Davis, CA.
Purpose: Cohort studies indicate Florida’s headgear mandate reduces concussion risk in girls’ lacrosse; however, regional differences in game-play development may be a confounder. This study compared concussion rates between Florida (FL) and states with no headgear mandate (NHM) by region and game-play development.
Methods and Study Design: Concussions and athlete exposures (AEs) were obtained using a national injury surveillance system (NATION) from 2019 to 2024. 95% confidence intervals (CIs) were calculated for incidence rates (IR), defined as concussions per 1,000 AEs. Incidence rate ratios (IRR) comparing FL with different NHM groups were calculated using Poisson regressions, with 95% CIs excluding 1.00 considered significant.
Results: Four hundred eighty-one concussions (FL: 49, NHM: 432) over 994,606 AEs (FL: 184,618, NHM: 809,988) were recorded. Compared to FL (IR: 0.265, 95% CI: 0.200–0.352), NHM (IR: 0.533, 95% CI: 0.485–0.586) had a significantly higher overall rate of concussion injuries (IRR: 2.02, 95% CI: 1.50–2.71). Using a previously established rubric, NHM states were divided by game-play development using sport participation data into Established (EST) and Emerging (EMG) states, and by geographical location into Northeast, Mid-Atlantic, South (excluding Florida), Midwest, and West regions. Both EST (IRR: 1.87, 95% CI: 1.37–2.58) and EMG states (IRR: 2.12, 95% CI: 1.56–2.58) had significantly higher concussion rates compared to FL. All geographical regions also had significantly higher concussion IR compared to FL, with the West (IRR: 2.28, 95% CI: 1.58–3.30) and South (IRR: 2.24, 95% CI: 1.58–3.20) regions demonstrating the greatest differences.
Conclusions: Mandated headgear use is associated with a large reduction in concussion rate in high school girls’ lacrosse, with Florida having a concussion rate half that of non-mandating states. This effect is robust across geographical regions and levels of game development, with large differences observed in comparison to southern states with an emerging level of game-play development, which are most similar to Florida.
Significance: Given that Florida is the only state with a headgear mandate and that voluntary use of headgear is exceptionally low, the enactment of mandates presents a significant and meaningful concussion prevention opportunity in high school girls’ lacrosse.
Acknowledgements: This study was funded by the Foundation for the American Medical Society for Sports Medicine, USA Lacrosse, and the National Operating Committee for Standards on Athletic Equipment.
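The incidence rates above can be reproduced from the reported counts; a quick sketch follows (the function name is mine, and the abstract's IRR of 2.02 comes from Poisson regression, so the crude ratio below is close but not identical):

```python
# Sketch: concussion incidence per 1,000 athlete-exposures (AEs) and the crude
# FL vs. NHM rate ratio from the abstract's counts (FL: 49 concussions in
# 184,618 AEs; NHM: 432 in 809,988 AEs). Helper name is illustrative only.
def ir_per_1000(concussions, exposures):
    return 1000 * concussions / exposures

fl = ir_per_1000(49, 184_618)     # ~0.265 per 1,000 AEs
nhm = ir_per_1000(432, 809_988)   # ~0.533 per 1,000 AEs
print(round(fl, 3), round(nhm, 3), round(nhm / fl, 2))  # 0.265 0.533 2.01
```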
TOPIC: Concussion
STUDY TYPE: Cohort
Helmet Adoption in Olympic Surfing: Rates and Key Modifiers
Mitchell Anderson, MD
All Other Authors: Patricia Kelshaw, PhD, ATC, Nicole Nagayama, BS, and Daniel Herman, MD, PhD, FAMSSM
Affiliation: Stanford University, Stanford, CA.
Purpose: Helmet use rates among elite surfers are low; however, higher use rates have been observed at locations with greater injury risk, such as large waves over coral reef bottoms. This study aims to quantify helmet use in Olympic surfing and identify factors that may influence helmet use.
Methods and Study Design: Data on surfer demographics, wave height, and helmet use were collected from publicly available surfer profiles and video footage of all competition rounds during the 2020 (the first Games for surfing) and 2024 Olympic competitions. χ2 tests and independent-samples t-tests were used for comparisons between helmeted and non-helmeted athletes (alpha = 0.05).
Results: Data were collected among 75 different Olympic athletes (M = 38, F = 37) during the 2020 Tokyo Games at Tsurigasaki, Chiba (sandy bottom) and during the 2024 Paris Games at Teahupo’o, Tahiti (coral reef bottom). No helmet use was observed by any of the 40 athletes (M = 20, F = 20) during the 2020 Games. Of the 48 athletes (M = 24, F = 24) from the 2024 Games, 22 (M = 5, F = 17) wore a helmet for at least one round, with females having a significantly greater percentage of helmet use (70.8%) compared to men (20.8%; χ2 = 12.084, P = 0.001). The average wave height at the 2024 Games was significantly greater than at the 2020 Games for both men (7.2 ± 2.2 ft vs. 5.6 ± 1.7 ft, P = 0.001) and women (6.9 ± 1.0 ft vs. 5.5 ± 1.7 ft, P = 0.001); however, during the 2024 Games, the average wave height was smaller in rounds where athletes wore helmets compared to rounds where athletes did not wear helmets (6.7 ± 1.0 ft vs. 7.3 ± 1.0 ft, P = 0.014). Among the 2024 Games, athletes who wore a helmet at least once were significantly younger (23.5 ± 4.1 years) than those who did not (27.2 ± 4.4 years; P = 0.004).
Conclusions: Helmet use among Olympic surfers may be influenced by athlete age and sex. Younger female surfers, in particular, appear more open to using helmets. While helmet use was more common at Teahupo’o, Tahiti, where features such as the coral reef bottom may pose a greater risk of injury than Tsurigasaki, Chiba, the influence of wave height on professional surfers’ willingness to adopt helmets remains unclear.
Significance: Helmet adoption among elite surfers has historically been low. However, the trend is potentially shifting. Further research is needed to assess barriers to use, the potential for risk compensation, and the efficacy of helmets in preventing injuries.
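The sex difference reported above can be verified from the counts given (17 of 24 females vs. 5 of 24 males wore a helmet at least once): a Pearson chi-square on that 2×2 table reproduces the reported statistic. A sketch, with the helper name my own:

```python
# Sketch: Pearson chi-squared (no continuity correction) for the 2x2 table of
# helmet use by sex at the 2024 Games, from the abstract's counts:
# females 17 helmet / 7 no helmet, males 5 helmet / 19 no helmet.
def chi2_2x2(a, b, c, d):
    n = a + b + c + d
    return n * (a * d - b * c) ** 2 / ((a + b) * (c + d) * (a + c) * (b + d))

stat = chi2_2x2(17, 7, 5, 19)
print(round(stat, 3))  # 12.084, matching the abstract's reported chi-squared
```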
TOPIC: Concussion
STUDY TYPE: Case-Control
Assessing Autonomic Clinical Biomarkers in Pediatric Patients With Concussion Symptoms and Dysautonomia
Tyler Marx, MS
All Other Authors: Joseph Marshall, BS, Victoria Snapp, DO, Alicia Chen, BS, Brett Dusenberry, MD, Jaden Bailey, PA-C, Leslie Streeter, DNP, and Mo Mortazavi, MD
Affiliation: Midwestern University, Glendale, AZ.
Purpose: We investigated the autonomic clinical markers heart rate (HR), heart rate variability (HRV) score, orthostatic measurements, and the 30:15 ratio in concussed pediatric patients who were exercise tolerant (ET) or exercise intolerant (EI) during standardized exertional testing.
Methods and Study Design: A case-control study of 131 pediatric patients, ages 7–19, with 408 clinical visits for concussion symptoms between 10/6/21 and 8/9/24. HR and HRV score were recorded via a chest-strap heart rate monitor, and orthostatic measurements via a blood pressure (BP) cuff from supine to standing for 1 minute. ET and EI status was determined by the onset of signs or symptoms of cardiovagal dysautonomia.
Results: There was a significant difference in systolic BP change during orthostatic measurements between ET and EI patients after medium-intensity (MI) exertional testing (−0.87 mm Hg vs. −13.5 mm Hg; P = 0.0143), attributed to cardiovagal dysautonomia. There was a significant difference in diastolic BP change between ET and EI patients after low-intensity (LI) (6.46 vs. −3.2 mm Hg; P = 0.0004), MI (6.24 vs. 0.03 mm Hg; P = 0.0054), and high-intensity (HI) exertional testing (8.59 vs. −3.55 mm Hg; P = 0.0015). While nonsignificant, the 30:15 ratio differed between ET and EI patients across LI (P = 0.11), MI (P = 0.18), and HI (P = 0.2657) exertional testing. There was no significant difference in baseline HR and HRV scores between ET and EI patients at any of the 3 exercise intensity levels.
Conclusions: There is a significant difference in post-exertional orthostatic BP measurements in EI pediatric patients compared to ET counterparts. This suggests orthostatic BP measurements can be clinically useful when evaluating concussion recovery in pediatric patients. While not statistically significant, the 30:15 ratio differences between ET and EI pediatric patients may reach significance in a larger study population.
Significance: Considering the link between post-concussion syndrome and cardiovagal dysautonomia, autonomic clinical markers investigated in this study can provide insight into the management and recovery of EI pediatric patients.
Acknowledgements: Thank you to everybody who helped out with this research project.
TOPIC: Concussion
STUDY TYPE: Other
The Effects of Sport Activity on Sideline Concussion Assessments in Adolescent Athletes
Joseph Tousignant, DO
All Other Authors: Haley Chizuk, PhD, MS, ATC, Kendall Marshall, DAT, ATC, CSCS, Tiffany Thavisin, MS, Mohammad Haider, MD, John Leddy, MD, and Rajiv Jain, MD
Affiliation: UBMD Orthopaedics and Sports Medicine, State University of New York at Buffalo, Amherst, NY; Jacobs School of Medicine and Biomedical Sciences, State University of New York at Buffalo, Buffalo, NY.
Purpose: The Vestibular Ocular Motor Screen (VOMS) and the Sport Concussion Assessment Tool 6 (SCAT6) are point-of-care concussion assessments administered immediately or soon after sports participation or exercise. Ideally, physical exertion should not affect test results. This study examined whether sport activity affects performance on these assessments in adolescent athletes.
Methods and Study Design: High school athletes completed the SCAT6 and modified VOMS (mVOMS). Assessments were conducted before sports activity, immediately after activity, and after a 20-minute rest. Heart rate (HR) and rating of perceived exertion (RPE) were assessed to confirm exercise intensity. Independent-samples t-tests assessed differences between pre- and post-activity and between post-activity and rest.
Results: Twenty-three participants (61% female, 15.5 ± 1.0 years old) were enrolled. Significant increases were observed in HR (−62.96 ± 25.87) and RPE (P = 0.006) immediately after activity, and RPE returned to baseline after rest (−0.48 ± 1.76, P = 0.21). Timed Tandem Gait (TTG) (P = 0.005) and Dual-Task Tandem Gait (DTTG) (P = 0.045) improved significantly from pre- to post-activity as well as from post-activity to rest (TTG P = 0.002; DTTG P = 0.006). There were no other differences in SCAT6 or mVOMS scores among the 3 time points. Females demonstrated increased median NPC breaks, faster TTG scores, and faster DTTG scores compared to males after activity. Participants without a concussion history had faster TTG and DTTG scores immediately after activity and after rest compared to baseline. There was no association between sex and history of concussion.
Conclusions: SCAT6 and mVOMS performance was not adversely affected by sports activity in high school athletes. TTG and DTTG performance improved after exercise, consistent with a learning effect. Further research should be conducted in other populations to increase generalizability. Clinicians can be confident that physical exertion is unlikely to impact performance on these sideline concussion assessment tests in non-concussed adolescent athletes.
Significance: Sports activity likely does not adversely affect performance on the SCAT6 and mVOMS in high school athletes; therefore, these tools can reliably be used immediately following exercise or sports participation to assess concussion at the point of care.
Acknowledgements: I would like to thank Drs. Chizuk, Marshall, Haider, Leddy, and Jain for all of their help, in addition to the AMSSM for this opportunity to further concussion research.
TOPIC: Education
STUDY TYPE: Cohort
Retrospective Analysis of DVT Prophylaxis Agents after Joint Replacement
Mohit Mathavan, MD
All Other Authors: Tyler Small, DO, Satish Chandrasekhar, MD, Nicholas Dorsey, MD, Carl Brophy, MD, Derek Farr, DO, and James McFadden, MD
Affiliation: UCF/HCA Ocala.
Purpose: This study evaluates the effectiveness of anticoagulants (Aspirin, LMWH, Rivaroxaban, and Apixaban) versus no anticoagulation in preventing DVT and PE after hip and knee replacement in patients with primary arthritis. Secondary aims include assessing postoperative bleeding risk and readmission rates among these groups.
Methods and Study Design: We conducted a retrospective analysis on 10,647 patients aged 18–89 who underwent hip or knee replacement for primary arthritis within HCA’s North Florida Division (2021–2023). χ2 and Fisher’s exact tests analyzed associations between anticoagulant type and readmission, DVT/PE, and bleeding outcomes at 35, 60, and 90 days post-discharge.
Results: Among all patients, anticoagulant type significantly influenced readmission rates at 35, 60, and 90 days post-discharge.
Conclusions: Aspirin showed a lower risk of readmission, particularly in knee replacement patients, without increasing bleeding or DVT/PE risk. This suggests Aspirin may be a preferable prophylactic agent following joint replacement. Although rare, DVT/PE and bleeding events highlight the need for tailored anticoagulation strategies, with data limitations in post-discharge follow-up.
Significance: This study supports Aspirin as an effective, potentially safer DVT prophylaxis option for knee replacement, highlighting the need for individualized anticoagulation in joint replacement patients and reinforcing low postoperative complication rates across anticoagulants.
TOPIC: Education
STUDY TYPE: Survey
Crossovers and Conversations: Sports Medicine Day
Anish Rana, BA
All Other Authors: Drew Ashbery, BS, Mumin Sabha, BA, Kevin Yoon, BS, Shane Fuentes, MD, Theresa Green, PhD, MBA, and Katherine Rizzone, MD, MPH
Affiliation: University of Rochester School of Medicine and Dentistry.
Purpose: Educational disparities in underserved populations limit awareness of healthcare careers such as sports medicine. This study evaluated a sports medicine camp combining basketball, CPR, and ankle taping to boost career interest and confidence in sports medicine skills among underrepresented youth.
Methods and Study Design: Funded by an AMSSM grant, the camp featured basketball training, CPR and ankle taping stations, and a panel with 3 sports medicine physicians. Pre- and post-camp surveys, scored on a 5-point Likert scale, were administered to 23 participants aged 13–18 to assess 7 domains, including familiarity with sports medicine, career interest, and skill confidence.
Results: Comparing pre- and post-camp surveys revealed that participants improved across several domains. Knowledge of sports medicine significantly increased between pre- (M = 3.43, SD = 0.90) and post-camp (M = 4.14, SD = 0.71) surveys, t(43) = 2.9, P = 0.006, and understanding of sports medicine careers also increased from pre- (M = 3.17, SD = 0.89) to post-camp (M = 4.14, SD = 0.83), t(43) = 3.7, P = 0.001. Confidence in sports medicine skills likewise improved significantly, t(43) = 3.3, P = 0.001. Only one domain, interest in sports medicine, did not significantly improve between pre- (M = 2.87, SD = 1.18) and post-camp (M = 3.14, SD = 1.21) surveys, t(43) = 0.75, P = 0.46, but demonstrated an upward trend. Additional domains measuring camp satisfaction and future attendance scored highly, with averages exceeding 4.
Conclusions: Our sports medicine camp successfully enhanced health literacy, skill confidence, and awareness of sports medicine careers among underrepresented youth. By combining basketball training with health education, participants developed a positive attitude towards sports medicine careers, potentially contributing to greater diversity in the field. Future programs could improve outcomes by emphasizing hands-on skill application and career exploration.
Significance: This program demonstrates the value of sports-based youth development in underrepresented communities, utilizing positive youth development frameworks to promote health education and career interest in sports medicine.
Acknowledgements: We would like to acknowledge the American Medical Society for Sports Medicine, the City of Rochester Department of Recreation, and Wegmans for their contributions to this camp.
TOPIC: Epidemiology
STUDY TYPE: Cohort
Impact of BBCOR Standard on the Epidemiology of Baseball Injuries
Namita Bhardwaj, MD, MS, MPH
All Other Authors: Jorge Fuentes, MD, Anupsinh Chauhan, DO, Christopher Minifee, MD, and Wei-Chen Lee, PhD, MSPH, MJur
Affiliation: University of Texas Medical Branch, Galveston, TX.
Purpose: The BBCOR standard limits the trampoline effect of the bat barrel and thereby the exit speed of the batted ball, but what happened to injury rates in high school baseball? Our project aims to determine the impact of the implementation of this standard on the epidemiology of injuries.
Methods and Study Design: The study population consists of high school athletes who sustained injuries while participating in boys’ baseball during academic years 2005–2010 and 2014–2019. This study has 3 outcomes: injury rate, proportion of time loss, and distribution of injury diagnoses.
Results: Using the pre-standard period as the reference, national estimates of injuries decreased after standard implementation for both practice (pre: 127,104; post: 100,577) and competition (pre: 153,695; post: 136,487). The injury rate decreased in the preseason (IRR = 0.803) and during competition (IRR = 0.976) but increased in the postseason regardless of whether it was practice or competition (IRR = 1.556 and 1.395, respectively). Injury rates decreased (IRR less than 1) for most activities during practice after the standard. However, during competition, only rates for fielding (IRR = 0.926), general play (IRR = 0.806), running (IRR = 0.958), and sliding (IRR = 0.927) decreased after the standard. The percentage of athletes who lost more than 1 week of participation increased post-standard (practice: IRR = 1.176 and 1.257; competition: IRR = 1.000 and 1.230). The BBCOR standard was most helpful in reducing shoulder (IRR = 0.688) and lower leg (IRR = 0.543) injuries during practice and lower leg (IRR = 0.364) and other (IRR = 0.408) injuries during competition.
Conclusions: The Batted-Ball Coefficient of Restitution (BBCOR) bat standard was mandated for baseball bats in 2012. Comparing injuries before and after standard implementation, there was a decrease in overall injuries and a change in their distribution. Interestingly, time loss due to injury increased after the standard change. Overall, the BBCOR standard positively impacted baseball by decreasing the number of injuries.
Significance: Evaluating the impact of rule and standard changes helps governing bodies and sports medicine professionals assess their continued need. Evaluations such as these also help the sports medicine community work towards safer sports.
Acknowledgements: Funding for HS RIO was provided in part by the Centers for Disease Control and Prevention grants R49/CE000674-01 and R49/CE001172-01. The authors also acknowledge the research funding contributions of the National Federation of State High School Associations, the National Operating Committee on Standards for Athletic Equipment, DonJoy Orthotics, and EyeBlack. HS RIO is conducted by the Datalys Center for Sports Injury Research and Prevention, Inc. The content of this report is solely the responsibility of the authors and does not necessarily represent the official views of the funding organizations or the Datalys Center. We thank the many athletic trainers who have volunteered their time and efforts to submit data to HS RIO. Their efforts are greatly appreciated and have had a tremendously positive effect on the safety of high school student-athletes.
TOPIC: Epidemiology
STUDY TYPE: Other
Distribution of Injuries Before and After Major Rule Changes in High School Girls’ Lacrosse
Kaitlyn Chin, DO
All Other Authors: David Aaby, MS and Prakash Jayabalan, MD, PhD
Affiliation: Northwestern Feinberg School of Medicine/Shirley Ryan AbilityLab.
Purpose: Lacrosse has been one of the fastest-growing sports over the past decade. The current literature on injuries in high school lacrosse is limited. The present study evaluated the impact of rule changes regarding direct player-to-player contact and checking on player safety in girls’ lacrosse.
Methods and Study Design: Retrospective review of injury data from the Reporting Information Online sports injury surveillance database for high school girls’ lacrosse. We estimated injury rates across 3 periods: before, between, and after major rule changes. Injury rates were estimated for each year of data collection and aggregated within each period, then compared across periods.
Results: The mean age of players was 15.99 years (SD = 3.2). A total of 1,556 injuries were recorded over 11 seasons: 848 sustained during competition and 708 during practice. There were 101 injuries related to checking and 174 related to contact with another player. Concussion was the most common injury type from contact with another player (43%) and from contact with the crosse/stick (67%), followed by joint/ligament injuries (contact with another player: 40%; contact with stick: 17%). The head/neck/face was the most injured body region for contact with another player (48%) and contact with the stick (75%). Comparing injury rates across periods, concussion rates increased from period 1 to period 3 (RR 1.35, CI 1.04–1.75, P = 0.098; FDR-adjusted P = 0.07). Compared to period 2, there was a 14% increase in the rate of injury in period 3 (RR = 1.14, CI 1.02–1.28, P = 0.08).
Conclusions: By utilizing a national injury surveillance system, our study shows implementation of stricter penalties for contact play and checking was not associated with a decrease in injury rates. In fact, injury rates in high school girls’ lacrosse have increased over time. This warrants further investigation and analyses as to why injuries are increasing over time.
Significance: Injury rates in high school girls’ lacrosse have increased over time despite nationwide implementation of stricter penalties. This is the first study of its kind to assess the distribution of injuries in relation to rule changes over time.
TOPIC: Epidemiology
STUDY TYPE: Other
The Relationship Between Games Missed for Rest or Load Management and Injury in the NBA: A 9-Year Study
John DiFiori, MD, FAMSSM
All Other Authors: Mackenzie Herzog, PhD, Alexandra Chretien, PhD, Rahul Gondalia, PhD, Kristin Shiue, PhD, and Christina Mack, PhD
Affiliation: National Basketball Association, New York, NY.
Purpose: The use of load management (LM) has been proposed to reduce injury risk. The purpose of this study was to assess relationships between injury risk and selectively reducing NBA game load, including factors such as player age, prior injury, schedule density, travel, and cumulative NBA participation.
Methods and Study Design: A 9-year retrospective study of the 2014–15 through 2022–23 seasons focusing on NBA All Star and Top 150 players was conducted using the league-wide EMR. EMR entry for games missed, using uniform criteria for injury, illness, rest, or LM, is required and audited by the League. Cox proportional hazards models and Poisson regression were used to analyze injury risk for players who missed games for rest/LM.
Results: A total of 1,233 player-seasons contributing 1,538,917 player-minutes of game play were included. Both games missed for rest/LM and injuries increased over the study period. Among All Star and Top 150 players who missed games for rest/LM, there was no significant difference in injury risk compared to players who did not miss games for rest/LM (1–2 games missed: HR = 1.24, 95% CI: 0.92–1.65; 3–5 games missed: HR = 0.91, 95% CI: 0.39–1.80; 6+ games missed: HR = 2.16, 95% CI: 0.53–5.78). Results were adjusted for age, prior injury history, and average minutes played per regular season game in the previous season, current season, and career. Similarly, there was no significant difference in injury risk considering schedule density (including back-to-back games and games over 7- or 14-day windows) or travel (total miles or time zones). Study limitations include the inability to assess non-game training load (reporting of such data is not required), other potential measures of internal and external load, and player-level factors (e.g., intensity of play).
Conclusions: This large, comprehensive analysis of the available NBA data did not demonstrate that managing load by reducing game participation decreases injury risk, even when adjusted for cumulative injury history, per-minute game participation, and age. Although NBA players play an average of 3–4 games each week, future studies that include non-game load, player-level factors, and other measures of internal and external load are needed.
Significance: Sport-specific data are essential to injury prevention efforts. This first study of LM-related practices and injury risk using NBA data emphasizes the complexity of injury prevention and the need for additional research to better address these issues.
Acknowledgements: NBA Athletic Trainers Association, NBA Player Health Department, NBA Research Committee, NBA Sports Science Committee, and National Basketball Players Association.
TOPIC: Mental Health
Impact of Injury on Mental Health in Youth Soccer Players
Andrew Watson, MD
All Other Authors: Ian Staresinic, Jennifer Sanfilippo-Nackers, MS, Sakar Gupta, BS, Scott Anderson, PhD, and Kristin Haraldsdottir, PhD
Affiliation: University of Wisconsin School of Medicine and Public Health, Madison, WI.
Purpose: The purpose of this study was to evaluate the associations between injury, quality of life (QOL) and mental health among elite youth soccer athletes.
Methods and Study Design: Six hundred sixty-eight soccer athletes (ages 13–19, 67% female) completed a survey regarding injuries in the last 6 months, anxiety (GAD-7), and depression (PHQ-9). Outcomes were compared based on injury status (uninjured, recovered and returned to play, currently injured) using estimated marginal means. Similar models were used to compare outcomes by injury count (0, 1, or 2 or more) and injury duration.
Results: Two hundred ninety-five respondents (44.2%) reported an injury in the prior 6 months; 224 (33.5%) had recovered and 71 (10.6%) remained injured. Uninjured athletes demonstrated significantly lower symptoms of anxiety (mean 5.0, 95% CI 4.5–5.5) than currently injured (6.6 (5.5–7.7), P = 0.008) and recovered athletes (6.0 (5.3–6.6), P = 0.01). Athletes with 2 or more injuries reported worse symptom scores than those with 1 injury (4.6 (4.0–5.1), P = 0.018) or no injuries (5.0 (4.5–5.5), P = 0.013). Among recovered athletes, those injured for more than 3 months reported higher anxiety (9.7 (5.7–14)) than those injured for less than 1 week (5.2 (4.0–6.3), P = 0.035) or for 3–6 weeks (5.2 (3.8–6.7), P = 0.037).
Conclusions: Youth soccer athletes injured in the prior 6 months report significantly worse mental health than uninjured athletes, even after return to play. Greater numbers of injuries were associated with increased anxiety and depression. Among recovered athletes, more severe injuries were associated with higher levels of anxiety.
Significance: Stakeholders in youth sports should consider that negative mental health impacts of injury may persist beyond return to play. Athletes with multiple injuries and more severe injuries may be at increased risk, even after physical recovery.
Acknowledgements: We would like to acknowledge the staff, member clubs, players and families of the Elite Clubs National League for their participation and assistance with this study.
TOPIC: Musculoskeletal
STUDY TYPE: RCT
Effectiveness of Particulate Versus Non-Particulate Steroid Injections for Glenohumeral Joint Pain
Chantal Nguyen, MD
All Other Authors: Savannah Truehart, MD, John Chan, MD, Max Johnson, MD, and Eugene Roh, MD
Affiliation: Stanford University, Palo Alto, CA.
Purpose: Corticosteroid injections are commonly used for musculoskeletal joint pain. There are no current studies investigating the effectiveness of particulate versus non-particulate corticosteroid injections in improving glenohumeral joint pain or function.
Methods and Study Design: A single-center, single-blind, prospective, randomized trial of 74 subjects with glenohumeral joint pain was performed. 33/74 (45%) received an ultrasound-guided glenohumeral joint injection with particulate steroid (40 mg triamcinolone or 6 mg betamethasone) and 41/74 (55%) received non-particulate steroid (10 mg dexamethasone). Improvements in pain and function over 6 months were assessed.
Results: Both particulate and non-particulate steroids resulted in statistically significant improvements in glenohumeral joint pain via the Visual Analog Scale (VAS) and in function via the American Shoulder and Elbow Surgeons (ASES) and Quick Disabilities of the Arm, Shoulder, and Hand (QDASH) questionnaires after 2 weeks, 3 months, and 6 months, without clear differences between groups. The particulate group showed a statistically significant improvement in VAS (−2.7, standard deviation [SD] 1.9) compared to the non-particulate group (−1.5, SD 2) at 2 weeks only (P = 0.02). There was a statistically significant improvement in QDASH at 3 months in the non-particulate group (−17.1, SD 16.6) versus the particulate group (−8.2, SD 12.5), which was not observed at any other time point (P = 0.045). There was no statistical difference between the 2 groups in the number of repeat injections each patient received at 2-week, 3-month, and 6-month follow-up appointments.
Conclusions: Corticosteroid injections are effective for relief of glenohumeral joint pain and can improve upper extremity function. Both particulate and non-particulate steroids have similar effectiveness in improving pain relief and function, though there may be slight improvement in pain scores at 2 weeks after particulate steroid injections.
Significance: Particulate steroids may lead to more rapid relief in the glenohumeral joint at 2 weeks but not long-term. This may be useful when selecting steroid formulations in patients who require quick recovery, such as athletes seeking return to play.
TOPIC: Musculoskeletal
STUDY TYPE: Cohort
Effects of GLP-1 Agonists on Changes in Skeletal Muscle Strength in a 52 Week Exercise Program
Matthew Kampert, DO, MS
Affiliation: Cleveland Clinic Sports & Exercise Medicine, Cleveland, OH.
Purpose: Studies incorporating GLP-1 medication have demonstrated significant weight loss, with participants experiencing a 17% reduction in initial body weight over a 68-week treatment period. However, one-third of the weight lost with GLP-1 medication was attributed to the loss of lean mass.
Methods and Study Design: We searched our prospective exercise data registry (BRIDGE-PROJECT Data Registry) and identified 31 exercisers with a BMI above 30 who completed isokinetic one-rep max strength assessments at baseline and following 52 weeks of digitally guided and monitored resistance training. We split the cohort into 2 groups: those treated with GLP-1 (G) and those not treated with GLP-1 (NG) during the 52 weeks.
Results: Following 52 weeks of digitally guided resistance training, exercisers prescribed GLP-1 demonstrated changes from baseline in weight (−9.1%; P = 0.3438) and in 1-rep max for chest press (+32.3%; P = 0.777), low row (+46.7%; P = 0.0013), lat pulldown (+15.8%; P = 0.2943), shoulder press (+23.3%; P = 0.1226), leg press (+54.1%; P = 0.0116), leg curl (+41.9%; P = 0.0029), leg extension (+41.9%; P = 0.0079), low back (+108%; P = 0.0001), and abdominal (+18.1%; P = 0.1816). Exercisers not prescribed GLP-1 experienced changes from baseline in weight (−2.3%; P = 0.7236) and in 1-rep max for chest press (+28.1%; P = 0.0592), low row (+44.2%; P = 0.0007), lat pulldown (+16.5%; P = 0.1146), shoulder press (+25.8%; P = 0.660), leg press (+47.8%; P = 0.0045), leg curl (+30.3%; P = 0.0063), leg extension (+33.0%; P = 0.0134), low back (+81%; P < 0.0001), and abdominal (+21.6%; P = 0.0670). Between groups, there was no statistically significant difference in % change in strength or % change in muscle mass (−3.9% vs. −0.66%; P = 0.171), but there was a significant difference in % weight change (−9.4% vs. −2.1%) between groups (G vs. NG, respectively).
Conclusions: Resistance training is an essential component of an exercise prescription, especially during periods of caloric restriction. These data indicate that a well-structured resistance training program can not only protect against loss of strength but actually support improvements in strength during caloric restriction; a comprehensive approach combining strength, power, and endurance training appears notably effective.
Significance: This research helps fill existing gaps in understanding the impact of digitally guided exercise interventions on health outcomes across various populations and health conditions by adding effective exercise interventions as part of healthcare.
TOPIC: NCAA
STUDY TYPE: Other
The Accuracy of the Athlete Psychological Strain Questionnaire Compared to Common Screening Tools for Mental Health Disorders
Lauren Paladino, DO
All Other Authors: Katherine Wainwright, MD, Kimberly Harmon, MD, Camilla Astley, PhD candidate, Daniel Taylor, PhD, Alisa Huskey, PhD, Kelly Kim, MA, and Bridget Whelan, MPH
Affiliation: University of Washington, Seattle, WA.
Purpose: The Athlete Psychological Strain Questionnaire (APSQ) is a 10-item mental health screen in the International Olympic Committee Sport Mental Health Assessment Tool (SMHAT). A score >17 indicates need for further evaluation. This study evaluated APSQ’s accuracy against 6 common mental health screens.
Methods and Study Design: Division I collegiate athletes completed the APSQ and select mental health sub-screens including GAD-7 for anxiety, PHQ-9 for depression, ASSQ for sleep disturbance, AUDIT-C for alcohol consumption, CAGE-AID for alcohol and drug consumption, and EDE-Q for eating disorder from 6/2020 to 6/2023. This study compares the accuracy of the APSQ to these sub-screen surveys.
Results: The APSQ was taken 5217 times by 2758 unique students, and concurrently the GAD-7 was taken 5191 times, the PHQ-9 5175 times, the ASSQ 4900 times, the AUDIT-C 4621 times, the CAGE-AID 4365 times, and the EDE-Q 4692 times. Using a cutoff of 17, 1546 (30%) screened positive on the APSQ. Of the sub-screens, 413 (8.0%) screened positive for anxiety (GAD-7), 345 (6.7%) for depression (PHQ-9), 1104 (21.3%) for sleep issues (ASSQ), 224 (4.3%) for alcohol misuse (AUDIT-C), 198 (3.8%) for alcohol/drug issues (CAGE-AID), and 203 (3.9%) for disordered eating (EDE-Q). The AUC, sensitivity (sens), and specificity (spec) of the APSQ compared to the GAD-7 were AUC 0.92 (0.91–0.93), sens 93%, spec 81%; the PHQ-9, AUC 0.95 (0.94–0.96), sens 97%, spec 80%; the ASSQ, AUC 0.72 (0.70–0.74), sens 69%, spec 81%; the AUDIT-C, AUC 0.63 (0.60–0.67), sens 65%, spec 78%; the CAGE-AID, AUC 0.70 (0.66–0.75), sens 70%, spec 80%; and the EDE-Q, AUC 0.85 (0.82–0.87), sens 85%, spec 79%. The false negative rates of the APSQ compared to the sub-screens were GAD-7 7%, PHQ-9 3%, ASSQ 31%, AUDIT-C 35%, CAGE-AID 30%, and EDE-Q 15%.
Conclusions: In the largest study to date comparing APSQ to sub-screens for college athletes, there was high prevalence of sleep issues, anxiety, and depression, with lower rates of alcohol use and disordered eating reported. Compared to respective sub-screens, the APSQ is an excellent screen for anxiety and depression (GAD-7, PHQ-9), good for disordered eating (EDE-Q), fair for sleep (ASSQ) and substance use (CAGE-AID), and poor for alcohol use (AUDIT-C).
Significance: Data from this study can be used to improve the APSQ and SMHAT. The APSQ lacks specific questions on sleep and eating and has only one question on alcohol use. Enhancements in these areas could be made by adding or modifying current questions.
Acknowledgements: We would like to thank the Pac-12 for their support of research and the Health Analytics Program of which this data is a part.
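For readers checking the screening metrics in the abstract above, sensitivity, specificity, and false-negative rate all fall out of a single 2x2 table of screen-versus-reference results. The sketch below uses illustrative cell counts (not the study's published raw table) chosen to be consistent with the 5191 concurrent GAD-7 administrations and the reported APSQ-vs-GAD-7 sensitivity (93%), specificity (81%), and false-negative rate (7%):

```python
def screen_accuracy(tp, fp, fn, tn):
    """Accuracy of a screen judged against a reference test.

    tp: screen +, reference +    fn: screen -, reference +
    fp: screen +, reference -    tn: screen -, reference -
    """
    sensitivity = tp / (tp + fn)          # share of reference-positives caught
    specificity = tn / (tn + fp)          # share of reference-negatives cleared
    false_negative_rate = fn / (fn + tp)  # 1 - sensitivity
    return sensitivity, specificity, false_negative_rate

# Hypothetical cell counts for illustration only (413 GAD-7 positives,
# 5191 total administrations, as reported; the split into cells is assumed).
sens, spec, fnr = screen_accuracy(tp=384, fp=911, fn=29, tn=3867)
print(round(sens, 2), round(spec, 2), round(fnr, 2))  # 0.93 0.81 0.07
```

The same function applies unchanged to each APSQ-versus-sub-screen comparison; only the cell counts differ.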
TOPIC: NCAA
STUDY TYPE: Cohort
Evaluating the Impact of Cold Water Immersion on Recovery and Performance in Division I Athletes
Jeremy Swisher, MD
All Other Authors: Joshua Goldman, MD, MBA, Brian Donohoe, MD, Jeremy Vail, PT, SCS, OCS, Jeonguen Kim, MBI, Calvin Duffaut, MD, Nicolas Hatamiya, DO, and Nelson Boland, MD
Affiliation: UCLA Division of Sports Medicine, Los Angeles, California.
Purpose: This study aims to evaluate the impact of cold water immersion (CWI) on recovery and performance in NCAA Division I athletes using wearable technology (WHOOP) and subjective surveys. The measures collected are intended to provide insight into the physiological effects of CWI.
Methods and Study Design: Thirty-seven participants wore WHOOP devices during a 4-week intensive training cycle. During the 2-week intervention, they completed 10-minute cold water immersions (53°F) within one hour of training, 5 times weekly. Objective recovery metrics (e.g., HRV, RHR, sleep) were tracked via WHOOP. Subjective recovery was assessed daily with surveys, and performance was measured with weekly jump testing.
Results: In the full sample of NCAA Division I athletes, mediational multi-level models showed no significant relationships between cold water immersion (CWI) and objective recovery metrics, including heart rate variability (HRV), resting heart rate (RHR), or sleep duration. However, CWI was associated with a small but statistically significant reduction in muscle soreness (P = 0.013). Subgroup analysis of identified responders, athletes who demonstrated increases in HRV following cold water immersion, revealed that CWI led to significant improvements in perceived recovery (P = 0.008), alongside increases in HRV (P = 0.017) and reductions in RHR (P = 0.016). Additionally, direct effects of CWI on HRV were observed independent of subjective recovery scores in this subgroup (P = 0.004). Lastly, the decrease in muscle soreness associated with CWI was over twice the magnitude among responders than it was in the complete sample (P = 0.0004). No significant impacts on performance metrics such as jump height or concentric mean power were noted across groups.
Conclusions: While CWI did not demonstrate significant improvements in objective recovery or performance metrics across the full sample of NCAA Division I athletes, subgroup analysis revealed potential benefits for a subset of responders. In this group, CWI enhanced subjective recovery and HRV and reduced RHR, suggesting a targeted role for CWI in individualized recovery strategies.
Significance: This study highlights that CWI may benefit specific athletes by reducing muscle soreness and enhancing subjective recovery and certain physiological metrics, such as HRV and RHR. It also shows that CWI does not inhibit performance in these athletes.
Acknowledgements: We would like to thank WHOOP for providing statistical support and WHOOP bands, as well as UCLA for their institutional resources. We also thank the Division I athletes for their essential participation in this study.
TOPIC: NCAA
STUDY TYPE: Cohort
Mental Health Screening Tools Compared to Diagnostic Psychiatric Interview in NCAA Athletes
Vicki Nelson, MD, PhD
All Other Authors: James Anderson, MBA, ATC, Raphaela Fontana, MD, Bailey Nevels, PhD, and Christina Gutta, MD
Affiliation: Prisma Health Steadman Hawkins Clinic of the Carolinas, University of South Carolina School of Medicine—Greenville, Greenville, SC.
Purpose: This study evaluates mental health screening tools for depression, anxiety, and eating disorders to determine their performance compared to a standardized diagnostic psychiatric interview in collegiate athletes.
Methods and Study Design: Incoming NCAA athletes at participating Div 1 and 2 programs completed mental health screening including APSQ, BRS (brief resilience scale), GAD-7, PHQ-9, BEDA-Q and SCOFF. Diagnostic Interviews were completed using the Quick Structured Clinical Interview for DSM-5 Disorders components. Screening and interview results were compared to evaluate diagnostic utility of the instruments.
Results: Seven hundred ten athletes underwent both mental health screening and interview. 42 (5.9%) athletes were diagnosed with a psychiatric disorder during interview (1.69% major depression, MDD; 3.52% generalized anxiety, GAD; 0.14% anorexia nervosa, AN; and 0.56% bulimia nervosa, BN). GAD screening instruments had low sensitivity and good specificity: GAD-7 (5.2% positive, 0.48 sensitivity, 0.96 specificity, positive likelihood ratio 13.2), APSQ (10.6%, 0.56, 0.91, 6.3) and BRS (5.2%, 0.24, 0.95, 5.3). Screening tools for MDD performed similarly with low sensitivity: PHQ-9 (1.6%, 0.33, 0.99, 33.24), APSQ (10.6%, 0.58, 0.90, 6.0) and BRS (5.2%, 0.33, 0.95, 7.1). Screening tools for eating disorders (AN or BN) showed varied performance. BEDA had the highest sensitivity in the study but lacked specificity (49.7%, 0.8, 0.5, 1.62); SCOFF was the most specific with a high PLR (5.2%, 0.6, 0.95, 12.4); APSQ was more moderate in sensitivity and specificity (10.6%, 0.4, 0.9, 3.9).
Conclusions: Low sensitivity across tools, with false negative rates ≥ 42%, is concerning for GAD and MDD screening. The sensitivity of the BEDA and specificity of the SCOFF are appealing for AN and BN, but examination of ideal screening strategies is needed. High false positive rates (e.g., BEDA 50%, APSQ 10%) should be considered in resource context. The high PLRs of the GAD-7 for GAD and PHQ-9 for MDD reinforce their use as diagnostic surrogates in future studies.
Significance: High false negative rates compared to diagnostic interview are a concern across evaluated screening instruments and disorders. Data suggests use of multiple targeted screens rather than a triage instrument such as APSQ or BRS.
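The positive likelihood ratios reported in the abstract above follow directly from sensitivity and specificity: PLR = sensitivity / (1 − specificity). A minimal sketch, noting that rounding of the published sensitivity and specificity explains small gaps from the published PLRs:

```python
def positive_likelihood_ratio(sensitivity, specificity):
    # How much a positive screen raises the odds of the disorder:
    # P(screen+ | disorder) / P(screen+ | no disorder)
    return sensitivity / (1.0 - specificity)

# Reported PHQ-9 values for MDD: sensitivity 0.33, specificity 0.99
print(round(positive_likelihood_ratio(0.33, 0.99), 1))  # 33.0, near the reported 33.24
# Reported SCOFF values for eating disorders: sensitivity 0.6, specificity 0.95
print(round(positive_likelihood_ratio(0.60, 0.95), 1))  # 12.0, near the reported 12.4
```

The BEDA's reported PLR of 1.62 follows the same way (0.8 / 0.5 = 1.6), which is why high sensitivity alone does not make it a useful rule-in test.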
TOPIC: Neurologic
Protein Kinase C Gamma (PKCg) as a Potential Sport-Related Concussion (SRC) Biomarker
LaRae Seemann, MD
All Other Authors: Kyle Marden, MD, Olivia M. Emanuel, BA, Emily F. Matusz, MA, Alec Fernandes, BS, Jarad Wilson, PhD, Ann Cornell-Bell, PhD, and James R. Clugston, MD
Affiliation: University of Florida, Gainesville, FL.
Purpose: PKCg is a neuronal peptide which increases in response to brain injury. Before its utility as a marker for SRC is determined, PKCg plasma levels in athletes should be established. This pilot study assesses PKCg concentration before and during preseason training in non-concussed contact athletes.
Methods and Study Design: We analyzed PKCg concentrations from non-concussed female soccer and male football athletes at baseline (pre-preseason) and 2 weeks later post-activity (mid-preseason) to ascertain change in levels. Samples were obtained via venipuncture. After centrifugation, plasma was refrigerated until quantification analyses were performed using enzyme linked fluorescence immunoassay and flow cytometry.
Results: Fifty-five athletes had both pre- and mid-season samples collected. Paired t-tests were conducted to compare pre-preseason and mid-preseason concentrations of PKCg. Overall, there was a significant decrease in the concentration of PKCg (mean pre-preseason = 240.45 pg/mL, mean mid-preseason = 204.73 pg/mL, t = 2.74, P = 0.01, Cohen’s d = 0.38) with small effect size between pre-preseason and mid-preseason. For male football, there was a significant decrease in PKCg concentration (t = 2.37, P = 0.03, Cohen’s d = 0.48) with small effect size from pre-preseason to mid-preseason. For female soccer, there was no significant change in PKCg concentration (t = 1.50, P = 0.15, Cohen’s d = 0.28) with small effect size from pre-preseason to mid-preseason.
Conclusions: The results of this pilot study indicate that PKCg plasma concentrations do not increase with preseason training. Instead, there was a slight decrease with small effect size over the 2-week preseason training period.
Significance: As PKCg concentration does not increase with contact sport activity, further research is needed to evaluate PKCg concentrations at various timepoints in athletes diagnosed with SRC for use as a potential SRC diagnostic and recovery biomarker.
Acknowledgements: Perseus Sciences, LLC; RayBiotech; Constance Andrews ATC and Tony Hill ATC of the University of Florida Athletic Association; Breton Asken, PhD; Alejandro Sanoja, MD; University of Florida Student Health Care Center.
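For a paired design like the one above, Cohen's d can be recovered from the paired t statistic and the number of pairs via d = t / sqrt(n), since t = mean(diff) / (sd(diff) / sqrt(n)) and d = mean(diff) / sd(diff). A quick check against the reported overall values (t = 2.74, n = 55) lands near the published d = 0.38, with the small gap attributable to rounding of t:

```python
import math

def cohens_d_from_paired_t(t_stat, n_pairs):
    # Paired t-test: t = mean(diff) / (sd(diff) / sqrt(n))
    # Cohen's d:     d = mean(diff) / sd(diff)  =>  d = t / sqrt(n)
    return t_stat / math.sqrt(n_pairs)

d = cohens_d_from_paired_t(2.74, 55)
print(round(d, 2))  # 0.37, consistent with the reported 0.38 given rounding of t
```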
TOPIC: Other-Supplements
STUDY TYPE: Other
A Comparison of Traditional vs. Personalized Hydration in NCAA Division I Men’s Basketball Athletes
Jeremy Calderwood, DO
All Other Authors: Joseph Armen, DO, Joseph Barsa, MD, Patrick Rider, MSc, Kristina Coe, RD, and Nate Clark, ATC
Affiliation: East Carolina University Sports Medicine Fellowship, Greenville, NC.
Purpose: To determine whether a personalized fluid and transmucosal electrolyte replacement strategy improves physical capacity, sport specific skill and/or ratings of perceived exertion in NCAA Division I male basketball athletes when compared to a traditional approach.
Methods and Study Design: Fifteen male NCAA basketball athletes consented to participate in a randomized crossover design involving 2 separate training sessions during the preseason. Sweat and electrolyte losses were measured, and replacement occurred ad lib with Gatorade versus a personalized combination of water and oral transmucosal electrolytes. Physical capacity, sport-specific skill, and RPE were measured for analysis.
Results: There were no significant group effects between traditional and personalized groups (all values respectively arranged) in physical capacity metrics (e.g., 1% ± 0.12% vs. 0% ± 0.13% change in total loading between the first and last game during the session) or sport-specific skill performance (13.1 ± 2.4 vs. 12.9 ± 2.0). There were also no differences in perceived exertion between groups (7.0 ± 1.8 vs. 6.9 ± 2.0) despite significant group effects between traditional vs. personalized for change in weight (−1.1% ± 0.6 vs. −1.8% ± 0.8), total volume consumed (44.5 ± 8.5 oz vs. 28.3 ± 13.5 oz), and total electrolytes consumed (778 ± 148 mg vs. 1420 ± 700 mg). Environmental conditions were the same for each day of testing (71° and 62% humidity). There was a significant day effect for the sport-specific skill test, with improved shooting performance on Day 2 (12.4 ± 2 vs. 13.6 ± 2.6). There was also a significant reduction in vertical jump performance on Day 2 (28.4 ± 3.8 inches vs. 27.1 ± 3.4 inches).
Conclusions: The personalized group consumed more electrolytes but lost more weight because they consumed less fluid; however, this did not significantly impact the athletes’ ability to handle the demands of practice. On day 2 sport specific skill improved, probably due to an increased familiarity with the testing procedure; whereas jump height decreased, most likely from a reduction in effort, given all other physical capacity metrics did not change.
Significance: A personalized rehydration strategy using water followed by an oral transmucosal delivery of electrolytes appears to have no advantage compared to a traditional approach when evaluated over 2 typical indoor male basketball practice sessions.
Acknowledgements: This research was partially supported by East Carolina University Department of Family Medicine. We also thank the student-athletes and coaches from the Department of Athletics for their participation in this study along with Daniela Ramirez, Benjamin Shahady and Alexis Schroeder for their assistance with data collection.
TOPIC: Other-Transgender
STUDY TYPE: Survey
Physical Activity in the Transgender Patient Population
Emily Miro, MD, MPH
All Other Authors: Masaru Teramoto, PhD, MPH and Brett Toresdahl, MD
Affiliation: St. Mark’s Family Medicine Residency, Salt Lake City, UT.
Purpose: We aimed to study the physical activity levels of transgender individuals, the effect of gender-affirming hormone therapy (GAHT) on the comfort of exercising in various settings, and barriers to participation in physical activity.
Methods and Study Design: Adults initiating or continuing GAHT at one medical center (02/2024–07/2024) completed an online survey. Activity levels were assessed via the IPAQ, and Likert scales measured comfort exercising in various settings pre- and post-GAHT and factors influencing participation. Data were analyzed using descriptive statistics and Wilcoxon signed-rank tests.
Results: There were 113 survey participants: 32% transgender men, 43% transgender women, 20% nonbinary, and 4% other. 80% were on GAHT. Participants reported a median of 2,765 (IQR = 1,138–6,204) MET-minutes of total physical activity weekly. After initiation of GAHT, participants felt significantly more comfortable exercising at home (39.1% more comfortable vs. 5.4% less comfortable) and significantly less comfortable participating in a sports league (P = 0.003), with 73% feeling very uncomfortable in a sports league. There was no significant difference in level of comfort while exercising outdoors or in a public gym pre- versus post-initiation of GAHT (P > 0.05). Key barriers to physical activity included fear of acceptance (39%), fear of harassment (37%), and discomfort with their body (35%).
Conclusions: The majority of transgender respondents felt uncomfortable exercising in a public gym or sports league pre- and post-initiation of GAHT, with discomfort in sports leagues significantly increasing after initiation of GAHT. Many respondents cited external factors as major influences on their exercise habits. Despite these limitations, most participants reported physical activity participation that exceeds the recommendation for American adults.
Significance: Further understanding of the barriers to physical activity among the transgender patient population may aid in promotion of exercise among this vulnerable group.
Acknowledgements: Thank you to the University of Utah Transgender Patient and Family Advisory Board for their consideration of this study.
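The weekly MET-minute total reported above comes from IPAQ scoring, which weights each activity domain by a fixed MET value (walking 3.3, moderate 4.0, vigorous 8.0 under the standard IPAQ scoring protocol) and multiplies by minutes per day and days per week. A sketch of that scoring convention, not the study's analysis code:

```python
# Standard IPAQ MET weights for each activity domain.
MET_VALUES = {"walking": 3.3, "moderate": 4.0, "vigorous": 8.0}

def ipaq_met_minutes(activity):
    """Weekly MET-minutes: MET weight * minutes/day * days/week, summed over domains.

    activity maps a domain name to (minutes_per_day, days_per_week).
    """
    return sum(
        MET_VALUES[name] * minutes_per_day * days_per_week
        for name, (minutes_per_day, days_per_week) in activity.items()
    )

# Hypothetical respondent: 30 min walking x 5 d, 30 min moderate x 3 d, 20 min vigorous x 2 d
total = ipaq_met_minutes({"walking": (30, 5), "moderate": (30, 3), "vigorous": (20, 2)})
print(round(total))  # 1175
```

For context, 600 MET-minutes per week corresponds to the commonly cited adult activity guideline, so the cohort's median of 2,765 comfortably exceeds it, as the conclusions note.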
TOPIC: Other-Travel Fatigue
STUDY TYPE: Cohort
The Effect of Travel Load on Minor League Baseball (MiLB) Injuries, Part 1: Total Injury Frequency
Spencer Cooper, DO
All Other Authors: T. Jason Meredith, MD, Samuel Wilkins, PhD, and Adam Rosen, PhD
Affiliation: University of Nebraska Medical Center, Omaha, NE.
Purpose: Travel fatigue is a cumulative disorder due to repetitive travel over a season that can result in a buildup of injuries in athletes. A major change in the MiLB schedule for the 2021 season reduced cumulative travel load; we hypothesized that it also reduced the number of injuries.
Methods and Study Design: A retrospective cohort study analyzed data from minor league baseball players using a public database from 4 seasons pre-schedule change (2016–2019) and 4 seasons post-schedule change (2021–2024) at each level of baseball (AAA, AA, A+, A). A paired t-test was used for analysis with alpha level of 0.05.
Results: Exposures were calculated per primary position played for each competition period and pooled per league. Pre-schedule change, there were 7733 injuries across 858,829 exposures, while post-schedule change, there were 6430 total injuries across 844,172 total exposures. Overall, the results showed a statistically significant reduction in total injuries resulting in injured list (IL) placement post-schedule modification (P = 0.008), with a moderate effect size (Cohen’s d = 0.76). The total number of days spent on the IL pre-schedule change was 291,544, compared to 269,760 post-schedule change. While total days on the IL did not show a significant reduction (P = 0.103), the average days per injury slightly increased, though not significantly (P = 0.088).
Conclusions: The schedule modification implemented in minor league baseball appears to have led to a statistically significant reduction in the overall number of injuries. The findings indicate that while the schedule change with decreased travel load alleviates injury frequency among MiLB players, there is a need for further strategies to address injury severity and duration.
Significance: This is the first study to link scheduling-associated travel load with injury frequency. Our data may be extrapolated to other sports, including NCAA athletics, to assist medical staff in advocating for their student athletes’ health and well-being.
TOPIC: Pediatrics
STUDY TYPE: RCT
The Effect of Timing of Physical Therapy on Bony Healing in Adolescent Lumbar Spondylolysis
Anastasia Fischer, MD
All Other Authors: Emily Sweeney, MD, Lisa Martin, MD, Jingzhen Yang, PhD, Madison Brna, BS, and Mitchell Selhorst, PhD
Affiliation: Nationwide Children’s Hospital and The Ohio State University College of Medicine, Columbus, OH.
Purpose: Rest is thought necessary to promote bony healing for adolescent lumbar spondylolysis. However, prolonged rest is not without significant negative consequences. This study compared the healing of spondylolysis on MRI in adolescents treated with prolonged rest vs. immediate physical therapy (PT).
Methods and Study Design: This was a prospective multi-center randomized controlled trial. Adolescent athletes with active lumbar spondylolysis on MRI were removed from sport then randomized to immediate PT or usual care (rest until pain-free with daily activities before PT). A blinded radiologist assessed baseline and 3-month MRIs to determine the healing of the spondylolysis defect and edema.
Results: Sixty adolescent athletes (14.2 ± 1.5 years; 39% female) with a mean symptom duration of 9 weeks (± 9) were randomized to immediate PT (n = 30) or usual care (n = 30). The immediate PT group began physical therapy an average of 6 days after diagnosis, while the usual care group rested an average of 34 days before initiating PT. Fifty-six adolescents (93%) completed follow-up imaging. On the 3-month MRI, 75% of patients demonstrated significant healing of the spondylolysis injury, 11% demonstrated no notable change, and 14% appeared worse. There were no significant between-group differences in the healing of spondylolysis at 3 months (Healing MRI: Usual Care n = 20, Immediate PT n = 22, P = 0.88; No Change on MRI: Usual Care n = 2, Immediate PT n = 4, P = 0.67; Worsened MRI: Usual Care n = 5, Immediate PT n = 3; P = 0.46).
Conclusions: Waiting until pain-free before starting PT does not promote bony healing on MRI compared to immediate PT in adolescents with lumbar spondylolysis restricted from sport. This study demonstrates PT can start immediately upon diagnosis, avoiding the negative consequences associated with prolonged rest.
Significance: Although larger prospective studies are recommended, physicians may consider starting PT immediately without negatively impacting the healing potential of the spondylolysis on MRI.
Acknowledgements: We would like to thank the AMSSM CRN for providing funding and support for this study.
TOPIC: REGENMED
STUDY TYPE: Other
Comprehensive Analysis of Platelet-Rich Plasma Formulations: Implications for Reporting Improved Guidelines
Nicholas Hooper, MD, MS
All Other Authors: Ramnik Gill, Kenneth Mautner, MD, and Prathap Jayaram, MD
Affiliation: Emory University, Atlanta, GA.
Purpose: This study aimed to address PRP formulation heterogeneity by comprehensively characterizing the cytokine profiles of 2 commonly used PRP formulations, leukocyte-rich PRP (LR-PRP) and leukocyte-poor PRP (LP-PRP), obtained from the same participants.
Methods and Study Design: Blood samples were collected from 12 healthy donors and processed to obtain LR-PRP and LP-PRP. Laboratory analyses of cytokine concentrations in LR-PRP and LP-PRP from the same donors were conducted. Statistical analysis was performed to compare cytokine levels between the LR-PRP and LP-PRP groups.
Results: LR-PRP demonstrated significantly higher concentrations of the anti-inflammatory cytokine interleukin-1 receptor antagonist (IL-1RA) (725.41 pg/mL versus 378.33 pg/mL), along with higher concentrations of other anti-inflammatory and tissue-repair factors.
Conclusions: Our findings highlight distinct cytokine profiles between LR-PRP and LP-PRP formulations, with LR-PRP exhibiting a more favorable profile characterized by higher concentrations of anti-inflammatory and tissue-repair factors. These results underscore the importance of considering PRP formulation variability and its implications for therapeutic efficacy in musculoskeletal disorders.
Significance: Our data underscores the need to report specific concentrations of cell lines within PRP, and also suggests that reporting of certain cytokine and other signaling molecule concentrations is warranted.
TOPIC: Rehabilitation
STUDY TYPE: RCT
Impact of Joint Off-Loading Walking Exercise in Knee Osteoarthritis: A Randomized Clinical Trial
Prakash Jayabalan, MD, PhD
All Other Authors: Audrey Lazar, BS, Yelyzeveta Merenzon, BS, Sanchita Sen, BA, Vikram Darbhe, PhD, Elizabeth Gray, MS, Abhishek Balu, BS, and Leah Welty, PhD
Affiliation: Shirley Ryan Ability Lab and Northwestern University, Chicago, IL.
Purpose: Engaging individuals with knee osteoarthritis (KOA) in walking exercise remains challenging due to pain exacerbation. Our goal was to determine longitudinal changes in joint pain, function, and systemic markers of inflammation, using a lower-body positive pressure (LBPP) treadmill or aquatic walking.
Methods and Study Design: Randomized clinical trial. Forty-nine adults aged >50 years with KOA were enrolled into 3 groups for 8 weeks: (1) LBPP treadmill (2/wk), n = 16 (5); (2) aquatic walking (2/wk), n = 17 (2); (3) control (exercise recommendations), n = 16 (1). Primary outcome: Knee Injury and Osteoarthritis Outcome Score (KOOS). Secondary outcomes: 6-minute walk test (6-MWT), gait kinematics, Quality-of-Life survey (EQ-5D), and serum biomarkers.
Results: The LBPP group had significant improvements in KOOS pain at 4 weeks (P = 0.055), 8 weeks (P = 0.008), and 3–6 months (P = 0.019), not observed in the other groups. The LBPP group also had significant improvements in the 6-MWT at 4 weeks (P = 0.015) and 8 weeks (P = 0.001). At 8 weeks, aquatic walking had a significant improvement in KOOS sports only (P = 0.012). Compared to baseline, LBPP was associated with significant changes in abduction/adduction angle at 4 and 8 weeks. The control group showed a significant increase in an inflammatory marker (P = 0.03) and its associated protein IL-1ra (P = 0.009), not observed in either interventional group. Additionally, the aquatic group had an increase in MMP-7 concentration (P = 0.05), an enzyme associated with tissue remodeling.
Conclusions: LBPP treadmill walking significantly improved symptoms and objective function in individuals with KOA. Potential mechanisms for improvement may be through associated changes in knee joint kinematics. Additionally, longitudinal off-loaded walking therapies were not associated with the increase in systemic inflammation observed in the group receiving only exercise recommendations (control group).
Significance: This is the first study to show that off-loaded walking therapies improve symptoms and function and mitigate systemic inflammation in individuals with KOA. These therapies should be considered as part of future walking programs.
TOPIC: Running
STUDY TYPE: Cohort
Going the Extra Mile: Do Running Mechanics Change in High-School Cross-Country Runners During a Distance Run?
Shane Miller, MD
All Other Authors: Alex Loewen, MS, Ashley Erdman, BS, MBA, Jessica Penshorn, PT, DPT, ATS, SCS, Jane Chung, MD, Jacob Jones, MD, Henry Ellis, MD, and Sophia Ulman, PhD
Affiliation: Scottish Rite for Children, Frisco, TX, University of Texas Southwestern Medical Center, Dallas, TX.
Purpose: Long-distance running imposes significant demands on the musculoskeletal system. The purpose of this study was to assess differences in running mechanics in high-school cross-country runners over a 5-mile run. It was hypothesized that significant changes in running mechanics would be observed.
Methods and Study Design: High-school cross-country runners were outfitted with motion capture markers and Novel Loadsol force sensor insoles, then completed a 5-mile outdoor run on a flat, concrete surface. Running mechanics were captured using a 14-camera Vicon motion capture system every half-mile. A comparison between mile 1 (M1) and mile 5 (M5) was analyzed using Wilcoxon’s signed-rank tests.
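The mile 1 vs. mile 5 comparison described above is a paired, non-parametric test. A minimal sketch of how such a comparison can be run with Wilcoxon's signed-rank test is below; the stride-length values are synthetic stand-ins for illustration, not study data.

```python
# Sketch of a paired M1-vs-M5 comparison with Wilcoxon's signed-rank test.
# All values are simulated for illustration; they are not the study's data.
import numpy as np
from scipy.stats import wilcoxon

rng = np.random.default_rng(0)
m1_stride = rng.normal(2.4, 0.1, size=20)            # hypothetical mile-1 stride lengths (m)
m5_stride = m1_stride - rng.normal(0.08, 0.03, 20)   # hypothetical mile-5 values, slightly shorter

stat, p = wilcoxon(m1_stride, m5_stride)  # paired, non-parametric test
print(f"W = {stat:.1f}, P = {p:.4f}")
```

Because the test uses within-runner differences, each athlete serves as their own control, which suits repeated measures taken during a single run.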
Results: Twenty healthy runners (45% female, age: 16.2 ± 1.2 years) were included in this analysis. Runners exhibited decreased running speed (1.0 min/mile difference, P = 0.001) and stride lengths (0.8 m difference, P = 0.001) at M5 compared to M1. At toe off, sagittal trunk angle increased (3.4° difference, P P = 0.001) and knee flexion/extension range of motion (10.1° difference, P P = 0.002), maximum hip adduction moment (0.13 Nm/kg difference, P = 0.035), and ankle dorsiflexion moment (0.18 Nm/kg difference, P = 0.017) decreased during stance phase at M5 compared to M1. Loadsol insole data demonstrated decreased maximum forefoot force (0.1 N/kg difference, P = 0.008) and rate of force development in the forefoot (0.9 N/sec difference, P = 0.011) and midfoot (0.3 N/sec difference, P = 0.025) at M5 compared to M1.
Conclusions: This study highlights changes in running mechanics in high-school cross-country runners during a 5-mile run on typical training surfaces. As distance increased, runners demonstrated altered running patterns characterized by decreased stride length and running speed. These changes were accompanied by alterations in joint kinematics and moments, as well as force development across foot regions.
Significance: Understanding these adaptations is crucial for developing interventions aimed at reducing the risk of overuse injuries in adolescent long-distance runners.
TOPIC: Running
STUDY TYPE: Survey
History of Bone Stress Injuries and Associated Factors Among Runners Training for a Marathon: An RHC Study
Nathan Katz, DO
All Other Authors: Adam Tenforde, MD, Allison Schroeder, MD, Emily Kraus, MD, Stephanie Kliethermes, PhD, Mark Fontana, PhD, and Brett Toresdahl, MD
Affiliation: MetroHealth System, Department of Physical Medicine and Rehabilitation, Case Western Reserve University, Cleveland, Ohio.
Purpose: Given the popularity of distance running and the risk of bone stress injuries (BSIs), we aimed to measure the prevalence of a history of a BSI among runners training for a marathon and the association with other factors, including demographics, nutrition, and running experience.
Methods and Study Design: This was an analysis of baseline data from the Runner Health Consortium of adult runners training for a marathon. Data included demographics (age, sex, BMI), nutrition (dairy intake, alcohol use, eating disorder history), and running experience (weekly mileage, years participating in endurance races). Logistic regression was used to assess the association of these factors with a history of a BSI.
Results: A total of 2035 runners were included in this analysis: mean age 38.5 (SD 11.4), 59.0% female, mean BMI 23.8 (SD 3.5), 9.3% restrict dairy, 45.5% use alcohol at least weekly, 18.5% reported a prior/current eating disorder, mean weekly mileage 22.7 (SD 15.5), and median years participating in endurance races 8 (IQR 3–14). 455 (22%) reported a history of at least one BSI. In multivariable regression, factors associated with a history of BSI included years participating in endurance races (OR 1.03, CI 1.02–1.05, P P P P = 0.01) and BMI (OR 0.96, CI 0.93–0.99, P = 0.01), were associated with a history of BSI in the univariable analysis but not the multivariable model. Age, alcohol use, and miles per week run at baseline were not associated with BSI in the univariable or multivariable models (all P > 0.05).
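For readers unfamiliar with how odds ratios like those above are obtained, the sketch below derives an OR and its 95% CI from a 2×2 table using the standard Woolf (logit) method. The counts are hypothetical, chosen only to illustrate the calculation; they are not the consortium's data.

```python
# Odds ratio with a 95% CI from a 2x2 table (Woolf/logit method).
# Counts are hypothetical, not the study's data.
import math

# rows: exposure (e.g., eating-disorder history yes/no); cols: BSI history yes/no
a, b = 120, 256    # exposed:   BSI yes / BSI no
c, d = 335, 1324   # unexposed: BSI yes / BSI no

or_ = (a * d) / (b * c)                          # cross-product odds ratio
se_log_or = math.sqrt(1/a + 1/b + 1/c + 1/d)     # SE of log(OR)
lo = math.exp(math.log(or_) - 1.96 * se_log_or)
hi = math.exp(math.log(or_) + 1.96 * se_log_or)
print(f"OR = {or_:.2f} (95% CI {lo:.2f}-{hi:.2f})")
```

A multivariable logistic regression, as used in the study, generalizes this by estimating each factor's log-odds coefficient while adjusting for the others; exponentiating a coefficient yields the adjusted OR.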
Conclusions: Nearly 1 in 4 runners training for a marathon reported a history of BSI. Runners with more years participating in endurance races, those who restrict dairy intake, and those who reported a prior or current eating disorder were more likely to report a history of BSI. However, unlike prior studies, a history of BSI was not more common in women or in those with lower BMI after adjusting for other factors.
Significance: Although prospective research is needed to better establish risk factors among adult runners training for a marathon, the results of this analysis support the importance of adequate nutrition, including dairy intake, for reducing the risk of BSI.
Acknowledgements: The authors wish to thank our race partners, including Bank of America Chicago Marathon, New York Road Runners, and the Seattle Marathon Association for contributing to enrollment for this study. The Runner Health Consortium is funded by an AMSSM CRN Grant.
TOPIC: Running
STUDY TYPE: Survey
Identifying Prevalence and Associations With Urinary Incontinence in Female Endurance Runners: An RHC Study
Emily Kraus, MD
All Other Authors: Stephanie Kliethermes, PhD, Mark Fontana, PhD, Luke Johnson, BS, Adam Tenforde, MD, and Brett Toresdahl, MD
Affiliation: Stanford University, Stanford, CA.
Purpose: To identify the prevalence of urinary incontinence (UI) and to evaluate demographic, clinical and training characteristics associated with UI in a general population of female endurance runners training for a marathon as part of the Runner Health Consortium (RHC).
Methods and Study Design: This is a cross-sectional analysis of baseline surveys completed by female runners enrolled in the RHC study. Univariable and multivariable logistic regression assessed associations with UI. Univariable predictors with P
Results: 1378 females enrolled in the study (age: 37.8 ± 11.1 years; BMI: 23.4 ± 3.6 kg/m2; running volume: 20.36 ± 14.7 miles/week). The overall prevalence of self-reported UI in this population was 12.7% (n = 150). Older age (OR: 1.25, 95% CI: 1.62–1.35; P P = 0.04) and greater average training volume (OR: 1.08; 95% CI: 1.03–1.14; P = 0.004) were associated with increased odds of UI on univariable analysis; no associations were found with BMI (P = 0.16), breast-feeding (P = 0.33), training frequency (days/week, P = 0.12), regular cross-training including heavy resistance (P = 0.79), high repetition, low resistance (P = 0.69), or yoga/pilates (P = 0.54). Older age (OR: 1.45; 95% CI: 1.26–1.67, P P = 0.007) and greater BMI (OR: 1.05, 95% CI: 1.00, 1.10, P = 0.049) were significantly associated with UI in multivariable analysis, while training frequency (P = 0.48) and distance (P = 0.06) were not.
Conclusions: UI was prevalent in this population of female runners training for an endurance race. Our findings suggest UI may be more common in older female runners, postpartum runners and those with higher BMI. Training characteristics (e.g., training frequency, distance or cross-training) were not clearly associated with UI in this population. Pregnancies over 12 months ago and pelvic floor physical therapy interventions were not assessed.
Significance: Given the high prevalence of UI in the sport of running, clinicians should recognize characteristics associated with UI and provide resources for management, including pelvic floor muscle training, to help treat affected female endurance runners.
Acknowledgements: The authors wish to thank our race partners, including Bank of America Chicago Marathon, New York Road Runners, and the Seattle Marathon Association for contributing to enrollment for this study. The Runner Health Consortium is funded by an AMSSM CRN Grant.
TOPIC: Training
STUDY TYPE: Other
Establishing the Validity of a Lower Body 3-D Musculoskeletal Kinematic Assessment Using 2 Smartphones
Jaineet Chhabra, MD
All Other Authors: Huaqing Liang, MD, PhD and Steven Leigh, PhD
Affiliation: Kirk Kerkorian School of Medicine at UNLV, Las Vegas, NV.
Purpose: To validate the use of open-source motion analysis software and smartphones to measure 3-D musculoskeletal biomechanics without the time and equipment barriers of laboratory systems. This technology may allow for more detailed and objective injury risk assessments than the usual visual observation and subjective opinion.
Methods and Study Design: Healthy adults perform common sports skills in a biomechanics lab, including running, jumping, landing, hopping, and unanticipated cutting. Their lower body movements are measured with a marker-based motion analysis system, force plates, inertial measurement units, and the OpenCap open-source pose estimation platform with 2 smartphones, all operating simultaneously. Data collection is ongoing.
Results: Joint angle time histories of the first set of participants (6 M, 1 F, 24 ± 0.4 years, 77 ± 4 kg, 1.79 ± 0.03 m) were compared among the motion analysis systems with cross-correlation. The cross-correlation between the lab-based motion analysis system and smartphone-based system was strong for running (Xcorr = 0.997, lag = −0.007 s), jumping (Xcorr = 0.999, lag = 0.005 s), and landing (Xcorr = 0.932, lag = −0.1 s). The processing time for the smartphone-based motion analysis system was considerably shorter than for the lab-based system. Complete OpenCap success rate data were captured for all 10 drills for the last 3 participants, who were given subjective binary scores of high vs. low quality movement for the tasks: “walk,” “run,” “jog backwards,” “single leg hops,” “countermovement jump,” “rebound landing,” “run-stop-jump,” “backward side shuffle,” and “forward side shuffle.” These participants exhibited high quality movement across most drills. These subjective scores are planned to serve as a classifier with the aim of developing objective assessment.
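The Xcorr/lag pairs reported above come from cross-correlating joint-angle traces from the two systems. A minimal sketch of that computation is below; the signals, sampling rate, and 2-sample delay are synthetic assumptions for illustration.

```python
# Sketch of a normalized cross-correlation with lag between two motion-capture
# joint-angle traces. Signals and sampling rate are synthetic assumptions.
import numpy as np

fs = 100.0                                # assumed sampling rate (Hz)
t = np.arange(0, 2, 1 / fs)               # 2 s of data
lab = np.sin(2 * np.pi * 1.5 * t)         # "lab system" joint-angle trace
phone = np.roll(lab, 2)                   # "smartphone" trace, delayed 2 samples

# normalized cross-correlation over all lags
xc = np.correlate(lab - lab.mean(), phone - phone.mean(), mode="full")
xc /= (np.std(lab) * np.std(phone) * len(lab))
lags = np.arange(-len(lab) + 1, len(lab))
best = lags[np.argmax(xc)]                # lag (in samples) of the best match
print(f"Xcorr = {xc.max():.3f}, lag = {best / fs:+.3f} s")
```

A peak Xcorr near 1 indicates near-identical waveform shape, and the lag at the peak measures the temporal offset between the two systems, as in the landing comparison's −0.1 s lag.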
Conclusions: Smartphone-based motion analysis appears valid for assessing human movement in real time. Our last 3 participants’ data were the most robust, as we refined our calibration and synchronization process. The pose estimation success rate of over 90% compares favorably to the lab-based system, in which 12% of marker trajectories were occluded. The 0.1-second lag in the landing cross-correlation comparison suggests the algorithm struggles to reconstruct impacts.
Significance: Though early, this work suggests smartphone-based motion analysis is feasible for on-court or clinic use, providing 3-D kinematics similar to the gold-standard lab-based motion analysis system and suggesting utility for objective movement assessments.
Acknowledgements: We would like to thank our participants for volunteering their valuable time with us and ARION for providing their insole technology that was used for this collaborative, preparative study that will help inform the next phase of our work. The OpenCap platform is accessible at www.opencap.ai.
TOPIC: Ultrasound
STUDY TYPE: Cohort
Ultrasound as a Predictor of Time-loss Injury for the Patellar Tendon, Achilles Tendon and Plantar Fascia
Dan Cushman, MD
All Other Authors: Derek Stokes, MD, Leyen Vu, DO, Blake Corcoran, MD, Michael Fredericson, MD, Sarah Eby, MD, PhD, and Masaru Teramoto, PhD, MPH, PStat
Affiliation: University of Utah Salt Lake City, UT.
Purpose: Tendinopathy/fasciopathy are common conditions that can result in time-loss injury. This study aimed to determine if pre-season sonographic abnormalities of patellar tendons, Achilles tendons, and plantar fasciae are associated with future time-loss injuries in collegiate student-athletes.
Methods and Study Design: NCAA Division I athletes from 3 institutions participated in this 3-year prospective, observational study. Each athlete underwent an ultrasound examination of the patellar tendons, Achilles tendons, and plantar fasciae; they were blindly assessed for tendon/fascia thickening, hypoechogenicity, and neovascularization. Athletes were monitored for time-loss injury over the subsequent year.
Results: A total of 695 athletes were analyzed over 3 years. Sonographic abnormalities were identified in 36.6%, 7.5%, and 2.8% of the patellar tendons, Achilles tendons, and plantar fasciae, respectively. Time-loss injuries were reported in 3.3%, 1.6%, and 0.7% of these structures, with injury risk increased approximately 9-fold, 19-fold, and 21-fold in those with abnormalities (P < 0.001). The presence of an ultrasound abnormality was more predictive of, and more sensitive for, future time-loss injury than a self-reported history of prior injury or pain in the area at the time of the scan. The negative predictive value for future injury based on the absence of a sonographic abnormality was over 99%, while the positive predictive value was low (9.0–16.3%).
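The diagnostic measures above (high NPV, low PPV) follow directly from a 2×2 screening table. The sketch below shows the arithmetic; the counts are hypothetical, chosen only to mirror the reported pattern, not the study's data.

```python
# Diagnostic measures from a 2x2 screening table. Counts are hypothetical,
# chosen to mirror a low-PPV / very-high-NPV pattern; not the study's data.
def diagnostics(tp, fp, fn, tn):
    """Return (sensitivity, PPV, NPV) as percentages."""
    sens = 100 * tp / (tp + fn)   # injured tendons flagged abnormal
    ppv = 100 * tp / (tp + fp)    # abnormal scans that go on to injury
    npv = 100 * tn / (tn + fn)    # normal scans that stay injury-free
    return sens, ppv, npv

# hypothetical: 46 injured / 509 uninjured among tendons flagged abnormal,
# 12 injured / 1438 uninjured among tendons with normal scans
sens, ppv, npv = diagnostics(tp=46, fp=509, fn=12, tn=1438)
print(f"sensitivity {sens:.1f}%, PPV {ppv:.1f}%, NPV {npv:.1f}%")
```

Because time-loss injuries are rare, even a modestly sensitive screen yields a very high NPV, which is why a normal scan is so reassuring while a positive scan still predicts injury poorly.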
Conclusions: Pre-season sonographic abnormalities of the patellar tendon, Achilles tendon, or plantar fascia in collegiate student-athletes were found to have a higher association with the risk of developing time-loss injury than self-reported prior injury or pain at the time of the scan. A negative scan virtually rules out an injury while a positive scan still demonstrates a relatively low likelihood of upcoming injury.
Significance: A pre-season ultrasound evaluation of the Achilles tendon, patellar tendon, and plantar fascia may identify at-risk athletes. A normal ultrasound is highly correlated to staying injury-free for that particular structure during the following year.
Acknowledgements: The publication contains materials created, compiled, or produced by the Pac-12 Health Analytics Program (HAP). We thank the many athletic department staff, including athletic trainers, who have submitted data to the HAP.
TOPIC: Ultrasound
STUDY TYPE: Cohort
Which Sonographic Abnormalities Predict Future Tendon Injury?
Dalton Brady, MD
All Other Authors: Masaru Teramoto, PhD, Luke Johnson, BS, and Daniel Cushman, MD
Affiliation: University of Utah, Salt Lake City, Utah.
Purpose: This study aims to identify specific sonographic findings that are most useful for predicting the development of future injury in athletes who have undergone pre-season sonographic evaluations of the patellar and Achilles tendons.
Methods and Study Design: This is a sub-analysis of a larger study that screened NCAA D1 athletes via ultrasound (US) scans of the bilateral patellar and Achilles tendons. Abnormalities of interest were the presence of hypoechogenicity, thickening, and/or neovascularity. These findings were compared individually and in combination via diagnostic measures to determine their predictive value for future injury.
Results: Five hundred thirty-eight athletes, accounting for 695 athlete-years and a total of 2780 tendons, were examined. The cohort was 61% female, with a mean age of 20 years and a mean of 9.4 years playing their sport. Hypoechogenicity was the most prevalent ultrasound abnormality in the patellar (n = 483, 35.6%) and Achilles (n = 80, 5.8%) tendons. For the patellar tendon, about 9% of athletes with any one ultrasound abnormality developed an injury (PPV = 9.1%; 95% CI = 6.7, 12.0), while about 99% of those without an ultrasound abnormality did not develop an injury (NPV = 99.2%; 95% CI = 98.3, 99.7). The combination of all abnormalities (hypoechogenicity, thickening, and neovascularity) resulted in 17.6% of athletes developing an injury (PPV = 17.6%; 95% CI = 12.1, 24.3). For the Achilles tendon, having any one ultrasound abnormality was associated with a PPV and NPV of 16.5% (95% CI = 9.9, 25.1) and 99.4% (95% CI = 98.8, 99.7), respectively. The combination of all abnormalities resulted in 36.4% of athletes developing an injury (PPV = 36.4%; 95% CI = 10.9, 69.2).
Conclusions: When all 3 types of ultrasound abnormality—hypoechogenicity, thickening, and neovascularity—were present in the same patellar tendon, the predicted probability of future injury tended to be higher than when any one type of ultrasound abnormality was present. Such a trend was also observed in the Achilles tendon.
Significance: This large study suggests that the combination of multiple abnormal sonographic findings in the patellar or Achilles tendon is more predictive of future injury than the same abnormal findings in isolation.
TOPIC: Ultrasound
STUDY TYPE: Case-Control
Varus-to-Valgus Stress Ultrasonography for Evaluating the Ulnar Collateral Ligament
Ricardo Colberg, MD
All Other Authors: Tomas Vega, MD, Jeremy Towns, MD, Alex Yates, MA, Brody Westbrooks, BS, Sydney Carlson, BS, and Matthew Ithurburn, DPT, PhD
Affiliation: Andrew’s Sports Medicine and Orthopedic Center, American Sports Medicine Institute, Birmingham, AL.
Purpose: To evaluate the stress ultrasound (sUS) technique in the supine position and determine whether a varus-to-valgus joint gap measurement can identify ulnar collateral ligament (UCL) laxity, attributable to the weight of the forearm, that the standard rest-to-valgus joint gap measurement does not account for.
Methods and Study Design: Asymptomatic, male, NCAA Division 1 baseball athletes with no history of elbow injury were evaluated with sUS imaging in the supine position for bilateral UCL tissue quality and measurements of ulnohumeral joint space at rest, with varus stress, and with valgus stress on the throwing arm and the non-throwing arm. The rest-to-valgus joint gap and the varus-to-valgus joint gap were compared.
Results: A total of 50 baseball athletes participated in the study. Compared to the non-throwing elbow, the throwing elbow demonstrated a significantly wider joint space at rest (3.4 vs. 2.9 mm; P P P = 0.24). Accordingly, there was a wider varus-to-valgus joint space gap (1.3 vs. 1.0 mm; P P = 0.48). Within throwing elbows only, the varus-to-valgus joint space gap was significantly larger than the rest-to-valgus joint space gap (P
Conclusions: There was acquired UCL laxity in the throwing elbow that affected the rest and valgus measurements but did not affect the varus joint space. The varus-to-valgus measurement identified UCL laxity that the rest-to-valgus measurement did not. A modified technique to evaluate for UCL laxity is proposed, in which the joint space under elbow varus stress is used as the baseline measurement when performing sUS in the supine position.
Significance: UCL laxity of the throwing elbow needs to be accounted for when performing sUS. The varus-to-valgus measurement was able to identify UCL laxity in the supine position that was not accounted for with the standard rest-to-valgus joint gap measurement.
TOPIC: Ultrasound
STUDY TYPE: Cohort
Reliability and Stability of Shear Wave Velocity of the Patellofemoral Joint Restraining Structures
George Raum, DO
All Other Authors: Robin Tipps, MD, JD, Allison Bean, MD, PhD, and Andrew Sprague, DPT, PhD
Affiliation: University of Pittsburgh Medical Center, Pittsburgh, PA.
Purpose: Shear wave elastography (SWE) assesses the viscoelastic properties of patellofemoral restraining structures, offering insight into biomechanical changes in patellofemoral pain syndrome (PFPS). We evaluated the inter-rater reliability and between-day stability of SWE measures in these structures.
Methods and Study Design: The quadriceps muscle, extensor tendons, patellofemoral ligaments and the iliotibial band were evaluated in 10 subjects (6 PFPS:4 healthy) using SWE and repeated 24–48 hours later. Baseline measurements for 2 examiners and follow-up measurements for one examiner were compared using intraclass correlation coefficients (ICCs) to establish interrater reliability and between-day stability.
Results: ICCs for inter-rater reliability of shear wave velocity (SWV) were excellent for all measures: vastus lateralis (VL: ICC = 0.976, 95% CI = 0.939–0.990), rectus femoris (RF: 0.995, 0.988–0.998), vastus intermedius (VI: 0.998, 0.971–0.995), vastus medialis (VM: 0.992, 0.981–0.997), quadriceps tendon (QT: 0.967, 0.916–0.987), patellar tendon (PT: 0.996, 0.990–0.998), medial patellofemoral ligament (MPFL: 0.990, 0.974–0.996), lateral patellofemoral ligament (LPFL: 0.994, 0.984–0.997), and iliotibial band (ITB: 0.991, 0.977–0.996). ICCs for stability of SWV were good at the RF (0.792, 0.474–0.918), VI (0.856, 0.636–0.943), VM (0.818, 0.541–0.928), PT (0.832, 0.576–0.934), and MPFL (0.801, 0.498–0.921), but moderate for the QT (0.613, 0.023–0.847) and ITB (0.665, 0.154–0.987), and poor for the VL (ICC = 0.000, 95% CI = −1.526 to 0.604) and LPFL (0.404, −0.505 to 0.764).
Conclusions: SWV measurements of the anterior knee have excellent inter-rater reliability. SWV of the RF, VI, VM, PT and MPFL were stable within 24–48 hours, suggesting these measures may be useful for evaluating changes over longer intervals. The VL, LPFL and ITB were not stable in repeated measurements. The variability of these measures may be due to differences in subject positioning or probe alignment, and requires further investigation.
Significance: SWE may be useful in evaluating the viscoelastic properties of the RF, VI, VM, PT and MPFL. Future studies will investigate whether PFPS is associated with changes in these soft tissue structures to identify targets for interventions.
TOPIC: Ultrasound
STUDY TYPE: Cohort
Correlation of Sonographic Posterolateral Rotatory Stress Test and Instability of the Elbow
Tyra Swanson, MD
All Other Authors: Brennan Boettcher, DO, Karina Gonzalez Carta, MD, MS, and Christopher Camp, MD
Affiliation: Mayo Clinic, Rochester, MN.
Purpose: Posterolateral rotatory instability (PLRI) of the elbow is challenging to diagnose using physical examination alone. This study aims to describe preliminary sonographic criteria for diagnosing PLRI using stress ultrasound to assess ulnohumeral joint instability in clinically unstable patients.
Methods and Study Design: This retrospective cohort study included 41 elbows from 29 patients (mean age 43 ± 14 years), divided into 3 groups: symptomatic with clinical instability, symptomatic with no instability, and asymptomatic controls. A sonographic posterolateral rotatory stress test (S-PRST) assessed ulnohumeral joint laxity, with measurements of joint spacing at rest and under stress (stress delta).
Results: Stress delta values were compared among the 3 groups to evaluate the diagnostic potential of the S-PRST for PLRI. The average stress delta in the symptomatic group with clinical instability was 2.2 mm (SD = 1.07 mm; range: 0.6–3.7). The stress delta in symptomatic but clinically stable elbows was 0.9 mm (SD = 0.70 mm; range: 0.2–2.7), and in asymptomatic controls it was 0.7 mm (SD = 0.52 mm; range: 0.1–1.5). Physical examination revealed positive findings for PLRI in 16 patients (39%); of these 16, 14 (88%) exhibited a positive drawer test and 4 (29%) showed a positive lateral pivot-shift test. Additionally, 11 participants (27%) tested positive on the laptop test, and 4 (10%) on the elbow shear test. There was a statistically significant association between a positive PLRI physical exam and instability confirmed by S-PRST (Fisher’s Exact Test P =
Conclusions: S-PRST findings for PLRI in clinically unstable patients may be smaller than previously shown in a cadaveric model. Only 3/25 (12%) of clinically stable elbows had a stress delta >1.5 mm whereas 12/16 (75%) of clinically unstable elbows had values >1.5 mm. While further research is required to establish definitive cutoffs, these findings suggest a lower threshold for PLRI diagnosis with S-PRST than previously identified may be necessary.
Significance: Prior cadaver studies suggested a S-PRST stress delta >4 mm indicates PLRI. However, our study found no clinically unstable elbows stress delta >4 mm, suggesting that the cutoff for diagnosing PLRI may be closer to 1.5 mm in clinical cases.
TOPIC: Ultrasound
STUDY TYPE: Cohort
Ultrasound-guided Carpal Tunnel Release—Case Series
Stacey Isidro, MD
All Other Authors: Emily Olson, MD and Marko Bodor, MD
Affiliation: University of New Mexico, Albuquerque, New Mexico.
Purpose: Carpal tunnel syndrome is the most common peripheral nerve entrapment mononeuropathy. Surgery includes open, endoscopic and ultrasound-guided release of the transverse carpal ligament. We present a case series treated with ultrasound-guided carpal tunnel release (US-CTR) with one-year follow-up.
Methods and Study Design: The case series included 80 patients and 110 wrists treated with ultrasound-guided carpal tunnel release using a 3-mm hook knife device with local anesthesia, followed for one year in a private practice. Preoperative EMGs and median nerve ultrasounds were completed. The Quick Disabilities of the Arm, Shoulder, and Hand (QDASH) score and the Boston Carpal Tunnel Questionnaire (BCTQ) were tracked.
Results: Patient age ranged from 35 to 90 years (mean 67). Preoperative scores were 6.82 to 100 (mean 50.50) for the QDASH, 1.36 to 5 (3.29) for the BCTQ symptom severity score (SSS), and 1 to 5 (2.84) for the BCTQ functional severity score (FSS). Carpal tunnel syndrome was severe in 26.2%, moderate-severe in 4.8%, moderate in 44.4%, mild-moderate in 5.6%, mild in 2.4%, and very mild in 0.7%. At 2 weeks, 92 patients had a minimal clinically important difference (MCID) (change >20 points) in the QDASH, 86 met the MCID (change >1.14) in the SSS, and 88 met the MCID (change in mean >0.74) in the FSS. The average changes in QDASH, SSS, and FSS from preoperative values were statistically significant at all time points, with average changes of 47.52 (QDASH), 2.15 (SSS), and 1.72 (FSS) at 1-year follow-up. Many patients were asymptomatic by 1 year, with 88 patients having a QDASH score of 0, 73 an SSS score of 1, and 94 an FSS score of 1. There was only 1 direct surgical complication (cut palmaris longus tendon), with no residual functional limitations. Of those working, 92% were able to return to work within 2 weeks, and 93.5% of patients were satisfied or very satisfied.
Conclusions: This case series of patients who were treated with US-CTR and followed for one year demonstrated significant improvement in the QDASH, BCTQ SSS, and BCTQ FSS within 2 weeks that continued for at least 1 year. There were very few complications and none with lasting effects. Additionally, a majority of patients were able to return to work within 2 weeks post-operatively, and patient satisfaction exceeded 90%.
Significance: For patients with carpal tunnel syndrome requiring surgical intervention, US-CTR is a possible treatment option to help improve patients’ function and quality of life with significant improvement within 2 weeks and minimal risk of complications.
TOPIC: Ultrasound
Anatomic Variability of Knee Genicular Nerves Under Ultrasound: Implications for Knee Radiofrequency
Hirotaka Nakagawa, MD
All Other Authors: Quinn Howard, MD, Whitney Liehr, DO, Katherine Rizzone, MD, MPH, and Daniel Herman, MD, PhD, FAMSSM
Affiliation: UC Davis Medical Center, Sacramento, CA.
Purpose: Fluoroscopy-guided knee radiofrequency ablation (RFA) has suboptimal success rates, potentially due to genicular nerve anatomic variability. This study used ultrasound to assess the anatomic positioning of the knee genicular nerves relative to traditional fluoroscopy-guided RFA targets.
Methods and Study Design: The location of the superior-lateral (SLGN), inferior-medial (IMGN), and superior-medial (SMGN) genicular nerves were assessed with ultrasound. Distances of the nerve to the intersection of tangential lines along the shaft and condyle were calculated using ImageJ software. ANOVA with Tukey’s LSD and χ2 tests were used for comparisons by nerve and across patient characteristics (alpha = 0.05).
Results: Fifty knees (M = 17, F = 33) from 34 patients (M = 11, age 70.1 ± 8.7 years, BMI 31.9 ± 8.1 kg/m2; F = 23, age 67.8 ± 11.0 years, BMI 31.8 ± 9.7 kg/m2) were assessed, with no significant differences observed in age or BMI by sex (P > 0.05). The SLGN (0.73 ± 0.53 cm) was farther from the reference point than the SMGN (0.48 ± 0.40 cm; P = 0.003) and IMGN (0.26 ± 0.17 cm; P P = 0.004). Using these distances, the proportions of nerves within a 1 cm3 ablation sphere of a theoretical cooled RFA probe located at the reference point were calculated. 72% and 28% of knees had at least one or 2 nerves outside the theoretical ablation sphere, respectively. While the proportion of the SLGN (54%) outside the theoretical ablation sphere was not significantly different from the SMGN (38%; χ2 = 2.577, P = 0.108), both the SLGN (χ2 = 24.731, P 2 = 12.705, P P > 0.05).
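A back-of-the-envelope check of the ablation-sphere reasoning above: a sphere of volume 1 cm³ has a radius of about 0.62 cm, so a nerve whose distance from the reference point exceeds that radius would fall outside the lesion. The sketch below reuses the mean distances from the Results; treating the lesion as a perfect 1 cm³ sphere centered on the reference point is an assumption for illustration.

```python
# Radius of a 1 cm^3 ablation sphere vs. the mean nerve-to-reference
# distances reported in the Results. Treating the lesion as a perfect
# sphere centered on the reference point is a simplifying assumption.
import math

radius = (3 / (4 * math.pi)) ** (1 / 3)   # cm; radius of a 1 cm^3 sphere (~0.62 cm)
mean_distance = {"SLGN": 0.73, "SMGN": 0.48, "IMGN": 0.26}  # cm, from Results

for nerve, d in mean_distance.items():
    status = "outside" if d > radius else "inside"
    print(f"{nerve}: {d:.2f} cm -> {status} (sphere radius {radius:.2f} cm)")
```

The mean SLGN distance already exceeds the sphere radius, which is consistent with the majority of SLGNs (54%) lying outside the theoretical ablation volume.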
Conclusions: This study highlights the variability in knee genicular nerve locations, with the SLGN and SMGN showing particularly high variability similar to prior dissection studies. While sex, age, BMI, and osteoarthritis severity did not influence these distances, a significant proportion of knees had at least one nerve outside of a theoretical ablation volume based on bony landmarks traditionally used with fluoroscopic guidance for RFA procedures.
Significance: Our findings emphasize the potential need for more accurate nerve localization to better capture these structures in an ablative volume. Ultrasound allows for nerve visualization which may address this anatomic variability and thus improve outcomes.
Acknowledgements: This study was supported in part by a grant from the American Medical Society for Sports Medicine’s Collaborative Research Network.