The latest medical research on Sports & Exercise Medicine

The research magnet gathers the latest research from around the web, based on your specialty area. Below you will find a sample of some of the most recent articles from reputable medical journals about sports & exercise medicine gathered by our medical AI research bot.

The selection below is filtered by medical specialty. Registered users get access to the Plexa Intelligent Filtering System that personalises your dashboard to display only content that is relevant to you.

Salt Loading Blunts Central and Peripheral Postexercise Hypotension.

Medicine and Science in Sports

High salt intake is a widespread cardiovascular risk factor with systemic effects. These effects include an expansion of plasma volume, which may interfere with postexercise hypotension (PEH). However, the effects of high salt intake on central and peripheral indices of PEH remain unknown. We tested the hypothesis that high salt intake would attenuate central and peripheral PEH.

Nineteen healthy adults (7F/12M; age=25±4 yrs; BMI=23.3±2.2 kg·m⁻²; VO2peak=41.6±8.7 mL·min⁻¹·kg⁻¹; SBP=112±9 mmHg; DBP=65±9 mmHg) participated in this double-blind, randomized, placebo-controlled crossover study. Participants were asked to maintain a 2300 mg/d sodium diet for 10 days on two occasions separated by ≥2 weeks. Total salt intake was manipulated via ingestion of capsules containing either table salt (3900 mg/d) or placebo (dextrose) during each diet. On the 10th day, participants completed 50 minutes of cycling at 60% VO2peak. A subset of participants (n=8) completed 60 minutes of seated rest (sham trial). Beat-to-beat blood pressure (BP) was measured in-lab for 60 minutes post-exercise via finger photoplethysmography. Brachial and central BP were measured for 24 hours post-exercise via an ambulatory BP monitor.

Ten days of high salt intake increased urinary sodium excretion (dextrose=134±70 vs. salt=284±74 mmol·24 h⁻¹, p<0.001), expanded plasma volume (7.2±10.8%), and abolished PEH during in-lab BP monitoring (main effect of diet: p<0.001). Ambulatory systolic BPs were higher for 12 hours following exercise during the salt and sham trials compared to the dextrose trial (average change: dextrose=3.6±2.1, salt=9.9±1.4, sham=9.8±2.5 mmHg; p=0.01). Ambulatory central systolic BP was also higher during the salt trial compared to the dextrose trial.

High salt intake attenuates peripheral and central PEH, potentially reducing the beneficial cardiovascular effects of acute aerobic exercise.

Cognitive Impairment during High-Intensity Exercise: Influence of Cerebral Blood Flow.

Medicine and Science in Sports

Cognitive performance appears to be impaired during high-intensity exercise, and this occurs concurrently with a reduction in cerebral blood flow (CBF). However, it is unclear whether cognitive impairment during high-intensity exercise is associated with reduced CBF. We tested the hypothesis that a reduction in CBF is responsible for impaired cognitive performance during high-intensity exercise.

Using a randomized crossover design, seventeen healthy males performed spatial delayed-response (DR) and Go/No-Go tasks under three conditions [Exercise (EX), Exercise + CO2 (EX+CO2), and a non-exercising Control (CON)]. In the EX and EX+CO2 conditions, they performed the cognitive tasks at rest and during 8 min of moderate- and high-intensity exercise. Exercise intensity corresponded to ~50% (moderate) and ~80% (high) of peak oxygen uptake. In the EX+CO2 condition, participants inspired hypercapnic gas (2% CO2) during high-intensity exercise. In the CON condition, they performed the cognitive tasks without exercise.

Middle cerebral artery mean velocity (MCAv) increased during high-intensity exercise in the EX+CO2 relative to the EX [69.4 (10.6) cm·s⁻¹ vs. 57.2 (7.7) cm·s⁻¹, P < 0.001]. Accuracy of the cognitive tasks was impaired during high-intensity exercise in the EX [84.1 (13.3) %, P < 0.05] and the EX+CO2 [85.7 (11.6) %, P < 0.05] relative to rest [EX: 95.1 (5.3) %, EX+CO2: 95.1 (5.3) %]. However, no differences between the EX and the EX+CO2 were observed (P > 0.10). These results demonstrate that restored CBF did not prevent cognitive impairment during high-intensity exercise.

We conclude that a reduction in CBF is not responsible for impaired cognitive performance during high-intensity exercise.

Exercise Thermoregulation with a Simulated Burn Injury: Impact of Air Temperature.

Medicine and Science in Sports

The U.S. Army's Standards of Medical Fitness (AR 40-501) states: "Prior burn injury (to include donor sites) involving a total body surface area of 40 percent or more does not meet the standard". However, the standard does not account for the interactive effect of burn injury size and air temperature on exercise thermoregulation.

To evaluate whether the detrimental effect of a simulated burn injury on exercise thermoregulation is dependent upon air temperature.

On eight occasions, nine males cycled for 60 min at a fixed metabolic heat production (6 W·kg⁻¹) in air temperatures of 40°C or 25°C with simulated burn injuries of 0% (Control), 20%, 40%, or 60% of total body surface area (TBSA). Burn injuries were simulated by covering the skin with an absorbent, vapor-impermeable material to impede evaporation from the covered areas. Core temperature was measured in the gastrointestinal tract via telemetric pill.
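
For readers unfamiliar with fixed-heat-production protocols, the sketch below illustrates one way a 6 W·kg⁻¹ target could be checked from indirect calorimetry. The energy equivalent of oxygen (~20.9 kJ per litre of O2) and all example numbers are assumptions for illustration, not values reported in the study.

```python
# Illustrative only: estimate metabolic heat production per kg of body mass
# from oxygen uptake and external work rate on a cycle ergometer.
def heat_production_w_per_kg(vo2_l_min: float, external_work_w: float,
                             body_mass_kg: float, kj_per_l_o2: float = 20.9) -> float:
    """Metabolic heat production = metabolic rate - external work, scaled to body mass."""
    metabolic_rate_w = vo2_l_min * kj_per_l_o2 * 1000.0 / 60.0  # kJ/min -> W
    return (metabolic_rate_w - external_work_w) / body_mass_kg

# Hypothetical participant: 80 kg, cycling at 110 W with a VO2 of 1.70 L/min.
print(f"{heat_production_w_per_kg(1.70, 110, 80):.1f} W/kg")  # ~6 W/kg
```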

In 40°C conditions, greater elevations in core temperature were observed with 40% and 60% TBSA simulated burn injuries vs. Control (P < 0.01). However, at 25°C, core temperature responses were not different vs. Control with 20%, 40%, and 60% TBSA simulated injuries (P = 0.97). The elevation in core temperature at the end of exercise was greater in the 40°C environment with 20%, 40% and 60% TBSA simulated burn injuries (P ≤ 0.04).

Simulated burn injuries ≥20% TBSA exacerbate core temperature responses in hot, but not temperate, air temperatures. These findings suggest that the U.S. Army's standard for inclusion of burned soldiers is appropriate for hot conditions, but could lead to the needless discharge of soldiers who could safely perform their duties in cooler training/operational settings.

Lifelong Physical Activity Determines Vascular Function in Late Postmenopausal Women.

Medicine and Science in Sports

The study evaluated the role of lifelong physical activity in leg vascular function in late post-menopausal women (61±1 years).

The study design was cross-sectional, with three groups based on self-reported physical activity level (intensity and volume) over the past decade: Inactive (n=14), Moderately active (n=12), and Very active (n=15). Endothelium-dependent and smooth muscle-dependent leg vascular function were assessed by ultrasound Doppler measurements of the femoral artery during infusion of acetylcholine, the nitric oxide (NO) donor sodium nitroprusside, and the prostacyclin analog epoprostenol. Thigh muscle biopsies and venous plasma samples were obtained for assessment of vasodilator systems.

The very active group had ~76% greater responsiveness to acetylcholine than the inactive group, accompanied by ~200% higher prostacyclin synthesis during acetylcholine infusion. Smooth muscle cell responsiveness to sodium nitroprusside and epoprostenol was not different between groups. The protein amounts of endothelial nitric oxide synthase and endogenous antioxidant enzymes in muscle tissue were higher in the very active group than in the inactive group. The moderately active group had endothelial and smooth muscle cell responsiveness similar to that of the inactive group. A secondary comparison with a smaller group (n=5) of habitually active young (24±2 yrs) women indicated that smooth muscle cell and endothelial responsiveness are affected by age per se.

This study shows that leg vascular function and the potential to form prostacyclin and NO in late post-menopausal women are influenced by the extent of lifelong physical activity.

Interval Exercise Lowers Circulating CD105 Extracellular Vesicles in Prediabetes.

Medicine and Science in Sports

Extracellular vesicles (EVs) are purported to mediate type 2 diabetes (T2D) and cardiovascular disease (CVD) risk and development. Physical activity and a balanced diet reduce disease risk, but no study has tested the hypothesis that short-term interval (INT) training would reduce EVs compared with continuous (CONT) exercise in adults with prediabetes.

Eighteen obese adults (age: 63.8±1.5 yrs; BMI: 31.0±1.3 kg/m²) were screened for prediabetes using American Diabetes Association criteria (75-g OGTT). Subjects were randomized to INT (n=10, alternating 3-min intervals at 90% and 50% HRpeak, respectively) or CONT (n=8, 70% HRpeak) training for 12 supervised sessions over 13 d at 60 min/d. Cardiorespiratory fitness (VO2peak), weight (kg), and ad libitum dietary intake were assessed, and arterial stiffness (augmentation index via applanation tonometry; AIx) was calculated using total AUC during a 75-g OGTT performed 24 h following the last exercise bout. Total EVs, platelet EVs (CD31/CD41), endothelial EVs (CD105; CD31/CD41), platelet endothelial cell adhesion molecule (PECAM) (CD31), and leukocyte EVs (CD45; CD45/CD41) were analyzed via imaging flow cytometry pre- and post-intervention.
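
The arterial stiffness outcome is summarized as the total area under the AIx curve across the OGTT. A minimal sketch of a trapezoidal total-AUC calculation is shown below; the sampling times and AIx values are hypothetical, not data from the study.

```python
import numpy as np

def total_auc(times_min, values):
    """Trapezoidal total AUC of a measurement over the OGTT time course."""
    return np.trapz(values, times_min)

# Hypothetical AIx (%) readings at 0, 30, 60, 90, and 120 min after the glucose load.
times = np.array([0, 30, 60, 90, 120])
aix = np.array([22.0, 25.5, 27.0, 24.5, 23.0])

print(f"Total AIx AUC over the OGTT: {total_auc(times, aix):.1f} %*min")
```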

INT exercise increased VO2peak (P=0.04) compared with CONT training. While training had no effect on platelet or leukocyte EVs, INT decreased Annexin V- (AV-) endothelial EVs (CD105) compared with CONT (P=0.04). However, after accounting for dietary sugar intake, the intensity effect was lost (P=0.18). Increased ad libitum dietary sugar intake following training was linked to elevated AV+CD105 (r=0.49, P=0.06) and AV-CD45 (r=0.59, P=0.01). Nonetheless, increased VO2peak correlated with decreased AV+CD105 (r=-0.60, P=0.01).

Interval exercise training decreases endothelial-derived EVs in adults with prediabetes. Although increased sugar consumption may alter EVs following a short-term exercise intervention, fitness modifies EV counts.

Oral L-Tyrosine Supplementation Improves Core Temperature Maintenance in Older Adults.

Medicine and Science in Sports

During cold exposure, an increase in sympathetic nerve activity evokes vasoconstriction (VC) of cutaneous vessels to minimize heat loss. In older adults, this reflex VC response is impaired, thereby increasing their susceptibility to excess heat loss and hypothermia. Since L-tyrosine, the amino acid substrate necessary for catecholamine production, has been shown to augment reflex VC in aged skin, we hypothesized that oral ingestion of L-tyrosine would attenuate the decline in core temperature (Tc) during whole-body cooling in older adults.

In a randomized, double-blind design, nine young (25 ± 3 years) and nine older (72 ± 8 years) participants ingested either 150 mg/kg of L-tyrosine or placebo prior to commencing 90 minutes of whole-body cooling to decrease skin temperature to ~29.5°C. Esophageal temperature and forearm laser Doppler flux (LDF) were measured continuously throughout the protocol to provide indices of Tc and skin blood flow, respectively. The change in esophageal temperature (ΔTES) was calculated as the temperature at the end of cooling minus the baseline temperature. Cutaneous vascular conductance (CVC) was calculated as CVC = LDF/mean arterial pressure and expressed as a percent change from baseline (%ΔCVCbaseline).
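
The two derived variables, ΔTES and %ΔCVC from baseline, follow directly from the definitions above. The sketch below simply restates them in code; the variable names and example numbers are illustrative assumptions, not study data.

```python
def delta_tes(baseline_c: float, end_cooling_c: float) -> float:
    """Change in esophageal temperature: end-of-cooling value minus baseline (°C)."""
    return end_cooling_c - baseline_c

def cvc(ldf_au: float, map_mmhg: float) -> float:
    """Cutaneous vascular conductance = laser Doppler flux / mean arterial pressure."""
    return ldf_au / map_mmhg

def pct_change_from_baseline(cvc_now: float, cvc_baseline: float) -> float:
    """Express CVC as a percent change from its baseline value (%ΔCVCbaseline)."""
    return (cvc_now - cvc_baseline) / cvc_baseline * 100.0

# Hypothetical example: baseline LDF 120 AU at MAP 90 mmHg; end of cooling 45 AU at 95 mmHg.
baseline = cvc(120, 90)
end_cooling = cvc(45, 95)
print(f"ΔTES = {delta_tes(37.0, 36.7):+.2f} °C")
print(f"%ΔCVC from baseline = {pct_change_from_baseline(end_cooling, baseline):+.1f} %")
```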

Oral tyrosine ingestion augmented the cutaneous VC response to cooling in older adults (%ΔCVCbaseline: Placebo = 14.4 ± 2.0, Tyrosine = 32.7 ± 1.7; p<0.05). Additionally, tyrosine improved Tc maintenance throughout cooling in older adults (ΔTES: Placebo = -0.29 ± 0.07 °C, Tyrosine = -0.07 ± 0.07 °C; p<0.05). Both the cutaneous VC and Tc responses during cooling were similar between young and older adults supplemented with tyrosine (p>0.05).

These results indicate that L-tyrosine supplementation improves Tc maintenance in response to acute cold exposure in an older population.

Compression Garments Reduce Muscle Movement and Activation during Submaximal Running.

Medicine and Science in Sports

The purpose of this study was to investigate the effectiveness of sports compression tights in reducing muscle movement and activation during running.

A total of 27 recreationally active males were recruited across two separate studies. For study one, 13 participants (mean ± SD; 84.1 ± 9.4 kg, 22 ± 3 y) completed two 4-min treadmill running bouts (2 min each at 12 km·h⁻¹ and 15 km·h⁻¹) under two conditions: a no-compression control (CON1) and compression (COMP). For study two, 14 participants (77.8 ± 8.4 kg, 27 ± 5 y) completed four 9-min treadmill running bouts (3 min each at 8 km·h⁻¹, 10 km·h⁻¹, and 12 km·h⁻¹) under four conditions: a no-compression control (CON2) and three different commercially available compression tights (2XU; Nike; Under Armour, UA). Using Vicon 3D motion capture technology, lower-limb muscle displacement was investigated in both study one (thigh and calf) and study two (vastus lateralis + medialis, VAS; lateral + medial gastrocnemius, GAS). In addition, study two investigated the effects of compression on soft-tissue vibrations (root mean square of resultant acceleration, RMS Ar), muscle activation (iEMG), and running economy (oxygen consumption, VO2) during treadmill running.
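
The soft-tissue vibration outcome (RMS Ar) is a root mean square of the resultant acceleration signal. A minimal sketch is given below, assuming a triaxial accelerometer trace; the signal values are synthetic and only illustrate the calculation.

```python
import numpy as np

def rms_resultant_acceleration(ax, ay, az):
    """RMS of the resultant (vector-magnitude) acceleration signal."""
    resultant = np.sqrt(np.asarray(ax)**2 + np.asarray(ay)**2 + np.asarray(az)**2)
    return np.sqrt(np.mean(resultant**2))

# Synthetic 1-s triaxial trace sampled at 100 Hz (units of g).
t = np.linspace(0, 1, 100)
ax, ay, az = np.sin(6 * np.pi * t), 0.3 * np.cos(6 * np.pi * t), 0.1 * np.ones_like(t)
print(f"RMS Ar = {rms_resultant_acceleration(ax, ay, az):.3f} g")
```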

Wearing compression during treadmill running reduced thigh and calf muscle displacement as compared with no compression (both studies), which was evident across all running speeds. Compression also reduced RMS Ar and iEMG during treadmill running, but had no effect on running economy (study two).

Lower-limb compression garments are effective in reducing muscle displacement, soft-tissue vibrations, and muscle activation associated with the impact forces experienced during running.

Humeral Retroversion and Injury Risk After Proximal Humeral Epiphysiolysis (Little Leaguer's Shoulder).

Am J Sports Med

The increased humeral retroversion on the dominant side of throwing athletes is thought to result from repetitive throwing motion. Little Leaguer's shoulder, a rotational stress fracture of the proximal humeral epiphyseal plate, may influence humeral retroversion and the risk of shoulder or elbow injury.

To investigate the effect of Little Leaguer's shoulder on humeral retroversion and the rates of shoulder and elbow injuries.

Cohort study; Level of evidence, 3.

Ten high school baseball players (average age, 16.6 years; range, 16-18 years) who had experienced Little Leaguer's shoulder during elementary or junior high school (average age at injury, 12.6 years; range, 11-15 years) were enrolled in the study. As a control group, 22 high school baseball players (average age, 16.9 years; range, 16-18 years) who had never had any shoulder or elbow injury during elementary or junior high school were included. Humeral retroversion on ultrasonographic measurement, shoulder range of motion, and rates of shoulder and elbow injuries were evaluated.

Humeral retroversion was significantly greater on the dominant side than on the nondominant side in both players with Little Leaguer's shoulder (dominant, 104° ± 8°; nondominant, 84° ± 12°; P < .001) and controls (dominant, 91° ± 13°; nondominant, 81° ± 10°; P < .001). In the dominant shoulder, humeral retroversion was greater in the Little Leaguer's shoulder group than in the control group (P = .008). When the effects of humeral retroversion were excluded, maximal external rotation was significantly less in the dominant shoulder than in the nondominant shoulder in the Little Leaguer's shoulder group (by 11° ± 12°, P = .02), whereas no significant difference was found between dominant (110° ± 11°) and nondominant (111° ± 13°) shoulders in the control group (P = .64). The rates of shoulder and elbow pain were significantly higher in the Little Leaguer's shoulder group (shoulder pain, 80%; elbow pain, 70%) than in the control group (shoulder pain, 9%, P < .001; elbow pain, 32%, P = .04).

Humeral retroversion was increased in baseball players without any history of shoulder or elbow injury during elementary and junior high school and was further increased in players who had had Little Leaguer's shoulder. Increased humeral retroversion after Little Leaguer's shoulder may be a risk factor for future shoulder or elbow injury.

The Effect of Preexisting and Shoulder-Specific Depression and Anxiety on Patient-Reported Outcomes After Arthroscopic Rotator Cuff Repair.

Am J Sports Med

Few studies have considered the potential effect of depression or anxiety on outcomes after rotator cuff repair.

To evaluate the effect of a preexisting diagnosis of depression or anxiety, as well as the feeling of depression and anxiety directly related to the shoulder, on the American Shoulder and Elbow Surgeons (ASES) score.

Cohort study; Level of evidence, 3.

This study is a retrospective review of prospectively collected data on patients who underwent arthroscopic rotator cuff repair and were evaluated by the ASES score preoperatively and at a minimum 12 months postoperatively as part of the senior author's shoulder registry. Preexisting diagnoses of depression and/or anxiety were recorded, and questions from the Western Ontario Rotator Cuff Index directed at feelings of depression or anxiety related to the shoulder were also evaluated. The Wilcoxon rank sum test was used to compare ASES scores between patients with and without anxiety and/or depression. Spearman correlation was used to correlate questions on depression and anxiety with ASES scores.
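
As a minimal illustration of the two analyses named above, the sketch below runs a Wilcoxon rank sum comparison and a Spearman correlation on hypothetical ASES scores; none of the numbers are study data.

```python
from scipy import stats

# Hypothetical postoperative ASES scores for the two groups.
ases_with_dep_anx = [55, 62, 70, 48, 66, 59, 73]
ases_without = [68, 75, 81, 72, 64, 79, 88, 70]

# Wilcoxon rank sum comparison between groups.
stat, p = stats.ranksums(ases_with_dep_anx, ases_without)
print(f"Wilcoxon rank sum: statistic={stat:.2f}, p={p:.3f}")

# Spearman correlation of a shoulder-specific depression/anxiety item with ASES score.
depression_item = [1, 2, 2, 3, 4, 4, 5, 5]   # hypothetical ordinal responses
ases_scores = [90, 85, 80, 72, 66, 60, 55, 50]
rho, p = stats.spearmanr(depression_item, ases_scores)
print(f"Spearman: rho={rho:.2f}, p={p:.4f}")
```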

A total of 187 patients (63 females, 124 males; mean age, 58.6 years; SD, 8.7 years) undergoing arthroscopic rotator cuff repair were evaluated with a mean follow-up of 47.5 months (SD, 17.4 months; range, 12-77 months). Fifty-three patients (mean age, 60 years; SD, 8.6 years) had preexisting diagnoses of depression and/or anxiety, and 134 patients (mean age, 58.1 years; SD, 8.7 years) did not. Patients with depression and/or anxiety had significantly lower preoperative and postoperative ASES scores (60.7 vs 67.8, P = .014; and 74.6 vs 87.1, P = .008, respectively). The change in ASES scores from preoperative to postoperative, however, was not significantly different (18.0 vs 14.9). A higher score of depression or anxiety related to the shoulder had a negative correlation with the preoperative (r = -0.76, P < .0001; and r = -0.732, P < .0001, respectively) and postoperative (r = -0.31, P = .0001; and r = -0.31, P = .0003, respectively) ASES scores, but a positive correlation (r = 0.50, P < .0001; and r = 0.43, P < .0001, respectively) with the change in ASES scores.

Patients with a history of depression and/or anxiety have lower outcome scores preoperatively and postoperatively; however, they should expect the same amount of relief from arthroscopic rotator cuff repair as those without a history of depression or anxiety. Stronger feelings of depression or anxiety directly related to the shoulder correlated with lower preoperative and postoperative outcome scores, but a greater amount of improvement from surgery. The results from this study suggest that a preexisting diagnosis of depression or anxiety, as well as feelings of depression or anxiety directly related to the shoulder, should be considered during the management of patients with rotator cuff tears.

Are Implant Choice and Surgical Approach Associated With Biceps Tenodesis Construct Strength? A Systematic Review and Meta-regression.

Am J Sports Med

Despite the increasing use of biceps tenodesis, there is a lack of consensus regarding optimal implant choice (suture anchor vs interference screw) and implant placement (suprapectoral vs subpectoral).

The purpose was to determine the associations of procedural parameters with the biomechanical performance of biceps tenodesis constructs. The authors hypothesized that ultimate failure load (UFL) would not differ between sub- and suprapectoral repairs or between interference screw and suture anchor constructs and that the number of implants and number of sutures would be positively associated with construct strength.

Meta-analysis.

The authors conducted a systematic literature search for studies that measured the biomechanical performance of biceps tenodesis repairs in human cadaveric specimens. Two independent reviewers extracted data from studies that met the inclusion criteria. Meta-regression was then performed on the pooled data set. Outcome variables were UFL and mode of failure. Procedural parameters (fixation type, fixation site, implant diameter, and numbers of implants and sutures used) were included as covariates. Twenty-five biomechanical studies, representing 494 cadaveric specimens, met the inclusion criteria.
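
A meta-regression of this kind can be sketched as a weighted regression of study-level UFL on the procedural covariates. The example below uses made-up study rows, with specimen counts as crude stand-in weights; it illustrates the approach only and is not the authors' model.

```python
import pandas as pd
import statsmodels.formula.api as smf

# Hypothetical study-level data: mean UFL (N) plus procedural covariates.
df = pd.DataFrame({
    "ufl_n":       [310, 255, 410, 290, 365, 330],
    "screw":       [1, 0, 1, 0, 1, 0],       # 1 = interference screw, 0 = suture anchor
    "subpectoral": [1, 1, 0, 0, 1, 0],       # fixation site
    "n_sutures":   [2, 2, 4, 1, 3, 2],
    "n_specimens": [10, 8, 12, 9, 11, 10],
})

# Weight each study by its number of specimens (a simple stand-in for inverse-variance weights).
model = smf.wls("ufl_n ~ screw + subpectoral + n_sutures",
                data=df, weights=df["n_specimens"]).fit()
print(model.params)
```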

The use of interference screws (vs suture anchors) was associated with a mean 86 N greater UFL (95% CI, 34-138 N; P = .002). Each additional suture used to attach the tendon to the implant was associated with a mean 53 N greater UFL (95% CI, 24-81 N; P = .001). Multivariate analysis found no significant association between fixation site and UFL. Finally, the use of suture anchors and a smaller number of sutures were both independently associated with lower odds of native tissue failure as opposed to implant pullout.

These findings suggest that fixation with interference screws, rather than suture anchors, and the use of more sutures are associated with greater biceps tenodesis strength, as well as higher odds of native tissue failure versus implant pullout. Although constructs with suture anchors show inferior UFL compared with those with interference screws, incorporation of additional sutures may increase the strength of suture anchor constructs. Supra- and subpectoral repairs provide equivalent biomechanical strength when controlling for potential confounders.

Risk Factors for Loss to Follow-up in 3202 Patients at 2 Years After Anterior Cruciate Ligament Reconstruction: Implications for Identifying Health Disparities in the MOON Prospective Cohort Study.

Am J Sports Med

Understanding the risk factors for loss to follow-up in prospective clinical studies may allow for a targeted approach to minimizing follow-up bias and improving the generalizability of conclusions in anterior cruciate ligament reconstruction (ACLR) and other sports-related interventions.

To identify independent risk factors associated with failure to complete (ie, loss to follow-up) patient-reported outcome measures (PROMs) at 2 years after ACLR within a well-funded prospective longitudinal cohort.

Cohort study (prognosis); Level of evidence, 2.

All patients undergoing primary or revision ACLR enrolled in the prospectively collected database of the multicenter consortium between 2002 and 2008 were included. Multivariate regression analyses were conducted to determine which baseline risk factors were significantly associated with loss to follow-up at a minimum of 2 years after surgery. Predictors assessed for loss to follow-up were as follows: consortium site, sex, race, marital status, smoking status, phone number provided (home or cell), email address provided (primary or secondary), years of school completed, average hours worked per week, working status (full-time, part-time, homemaker, retired, student, or disabled), number of people living at home, and preoperative PROMs (Knee injury and Osteoarthritis Outcome Score, Marx Activity Rating Scale, and International Knee Documentation Committee).
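
As an illustration of this type of multivariable analysis, the sketch below fits a logistic model of loss to follow-up on a few of the listed predictors. The rows are invented, not the MOON data, and the fitted odds ratios carry no clinical meaning.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Columns: male, nonwhite, smoker (hypothetical baseline predictors).
X = np.array([
    [1, 0, 0], [0, 1, 1], [1, 1, 1], [1, 0, 0], [0, 0, 0],
    [0, 0, 0], [1, 1, 1], [0, 0, 0], [1, 1, 1], [1, 1, 0],
])
y = np.array([0, 0, 1, 0, 1, 0, 1, 0, 0, 1])  # 1 = lost to 2-year follow-up

model = LogisticRegression().fit(X, y)
odds_ratios = np.exp(model.coef_).ravel()  # exponentiated coefficients ~ odds ratios
for name, or_ in zip(["male", "nonwhite", "smoker"], odds_ratios):
    print(f"{name}: OR ≈ {or_:.2f}")
```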

A total of 3202 patients who underwent ACLR were enrolled. The 2-year PROM follow-up rate for this cohort was 88% (2821 of 3202). Multivariate analyses showed that patient sex (male: odds ratio [OR], 1.80) and race (black: OR, 3.64; other nonwhite: OR, 1.81) were independent predictors of 2-year loss to follow-up for PROMs. Education level was not a significant predictor.

While education level did not predict loss to follow-up, patients who are male and nonwhite are at increased risk of loss to follow-up of PROMs at 2 years. Capturing patient outcomes with minimal loss depends on equitable, not equal, opportunity to maximize generalizability and mitigate potential population-level health disparities.

NCT00478894 (ClinicalTrials.gov identifier).

Nonoperative Treatment of Elbow Ulnar Collateral Ligament Injuries With and Without Platelet-Rich Plasma in Professional Baseball Players: A Comparative and Matched Cohort Analysis.

Am J Sports Med

Recent studies evaluating nonoperative treatment of elbow ulnar collateral ligament (UCL) injuries augmented with platelet-rich plasma (PRP) have shown promising results. To date, no comparative studies have been performed on professional baseball players who have undergone nonoperative treatment with or without PRP injections for UCL injuries.

It was hypothesized that players who received PRP injections would have better outcomes than those who did not receive PRP.

Cohort study; Level of evidence, 3.

The Major League Baseball (MLB) Health and Injury Tracking System identified 544 professional baseball players who were treated nonoperatively for elbow UCL injuries between 2011 and 2015. Of these, 133 received PRP injections (PRP group) before starting their nonoperative treatment program, and 411 did not (no-PRP group). Player outcomes and a Kaplan-Meier survival analysis were compared between groups. In addition, to reduce selection bias, a 1:1 matched comparison of the PRP group versus the no-PRP group was performed. Players were matched by age, position, throwing side, and league status: major (MLB) and minor (Minor League Baseball [MiLB]). A single radiologist with extensive experience in magnetic resonance imaging (MRI) interpretation of elbow injuries in elite athletes analyzed 243 MRI scans for which images were accessible for tear location and grade interpretation.
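
For readers unfamiliar with the survival analysis mentioned above, the sketch below fits a Kaplan-Meier "ligament survival" curve of the kind compared between groups. The durations (days from injury) and events (1 = progressed to surgery) are invented, and the curve illustrates the method only.

```python
from lifelines import KaplanMeierFitter

# Hypothetical follow-up durations (days) and surgery events for one group.
durations = [120, 200, 365, 90, 400, 310, 150, 500]
events    = [1,   0,   1,   1,  0,   0,   1,   0]

kmf = KaplanMeierFitter()
kmf.fit(durations, event_observed=events, label="no-PRP (hypothetical)")
print(kmf.survival_function_)  # estimated probability of remaining surgery-free over time
```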

Nonoperative treatment of UCL injuries resulted in an overall 54% rate of return to play (RTP). Players who received PRP had a significantly longer delay in return to throwing (P < .001) and RTP (P = .012). The matched cohort analysis showed that MLB and MiLB pitchers in the no-PRP group had a significantly faster return to throwing (P < .05) and the MiLB pitchers in the no-PRP group had a significantly faster RTP (P = .045). The survival analysis did not reveal significant differences between groups over time. The use of PRP, MRI grade, and tear location were not statistically significant predictors for RTP or progression to surgery.

In this retrospective matched comparison of MLB and MiLB pitchers and position players treated nonoperatively for a UCL tear, PRP did not improve RTP outcomes or ligament survivorship, although there was variability with respect to PRP preparations, injection protocols, time from injury to injection, and rehabilitation programs. MRI grade and tear location also did not significantly affect RTP outcomes or progression to surgery.