In a proof-of-concept study, we aimed to quantify myocardial viability in patients with acute myocardial infarction using manganese-enhanced MRI (MEMRI), a measure of intracellular calcium handling.
Healthy volunteers (n=20) and patients with ST-elevation myocardial infarction (n=20) underwent late gadolinium enhancement (LGE) using gadobutrol and MEMRI using manganese dipyridoxyl diphosphate. Patients were scanned ≤7 days after reperfusion and rescanned after 3 months. Differential manganese uptake was described using a two-compartment model.
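The abstract does not spell out the two-compartment formulation, but a Patlak-type model, in which a plasma compartment feeds an effectively irreversible tissue compartment with influx constant Ki, is one common choice for describing manganese kinetics. The sketch below is illustrative only; the function names, the mono-exponential plasma input curve and the tissue-density conversion are assumptions, not details taken from the study.

```python
import numpy as np

def cumtrapz0(y, t):
    """Cumulative trapezoidal integral with a leading zero."""
    return np.concatenate(([0.0], np.cumsum(np.diff(t) * 0.5 * (y[1:] + y[:-1]))))

def patlak_tissue_curve(t, c_plasma, ki_ml_per_100g_min, v_plasma=0.1, density_g_per_ml=1.05):
    """Forward Patlak-type two-compartment model (illustrative assumption):
    C_tissue(t) = Ki * integral_0^t C_plasma dτ + v_p * C_plasma(t).
    Ki is quoted in mL/100 g/min, as in the abstract, and converted to a per-mL-of-tissue rate."""
    ki_per_min = ki_ml_per_100g_min / 100.0 * density_g_per_ml  # (mL plasma)/(mL tissue)/min
    return ki_per_min * cumtrapz0(c_plasma, t) + v_plasma * c_plasma

# Example: simulate 40 min of uptake for a remote-like vs an infarct-like influx constant
t = np.linspace(0, 40, 81)                  # minutes
c_p = 0.5 * np.exp(-t / 25.0)               # assumed plasma input function (mmol/L)
remote = patlak_tissue_curve(t, c_p, ki_ml_per_100g_min=23.0)
infarct = patlak_tissue_curve(t, c_p, ki_ml_per_100g_min=11.5)
```

The two example influx constants are taken from the remote and infarct values reported in the results below; everything else in the snippet is a modelling assumption.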
After manganese administration, healthy control and remote non-infarcted myocardium showed a sustained 25% reduction in T1 values (mean reductions, 288±34 and 281±12 ms, respectively). Infarcted myocardium demonstrated less T1 shortening than healthy control or remote myocardium (1157±74 vs 859±36 and 835±28 ms; both p<0.0001), with intermediate T1 values (1007±31 ms) in peri-infarct regions. Compared with LGE, MEMRI was more sensitive in detecting dysfunctional myocardium (dysfunctional fraction 40.5±11.9% vs 34.9±13.9%; p=0.02) and tracked more closely with abnormal wall motion (r=0.72 vs 0.55; p<0.0001). Kinetic modelling showed a stepwise reduction in myocardial manganese influx across remote, peri-infarct and infarct regions, enabling absolute discrimination of infarcted myocardium. After 3 months, manganese uptake had increased in peri-infarct regions (16.5±3.5 vs 22.8±3.5 mL/100 g/min, p<0.0001) but not in remote (23.3±2.8 vs 23.0±3.2 mL/100 g/min, p=0.8) or infarcted (11.5±3.7 vs 14.0±1.2 mL/100 g/min, p>0.1) myocardium.
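For context on how influx values like those above can be derived from serial T1 measurements: T1 maps are typically converted to contrast-agent concentration via the relaxivity relation ΔR1 = r1·C, and Ki is then obtained from a Patlak-style linear fit of the tissue curve against the blood-pool curve. This is a generic sketch under those assumptions, not the study's actual analysis pipeline; the relaxivity value and function names are placeholders.

```python
import numpy as np

R1_RELAXIVITY = 3.0  # L/mmol/s -- placeholder value; agent- and field-strength-dependent

def concentration_from_t1(t1_ms, t1_baseline_ms, relaxivity=R1_RELAXIVITY):
    """Convert measured T1 (ms) to contrast-agent concentration using ΔR1 = r1·C."""
    delta_r1 = 1000.0 / np.asarray(t1_ms) - 1000.0 / np.asarray(t1_baseline_ms)  # 1/s
    return delta_r1 / relaxivity  # mmol/L

def patlak_ki(t, c_tissue, c_plasma):
    """Estimate the influx constant Ki (slope) and plasma fraction v_p (intercept)
    from the Patlak linearisation C_t/C_p = Ki·(∫C_p dτ)/C_p + v_p."""
    auc = np.concatenate(([0.0], np.cumsum(np.diff(t) * 0.5 * (c_plasma[1:] + c_plasma[:-1]))))
    x = auc[1:] / c_plasma[1:]
    y = c_tissue[1:] / c_plasma[1:]
    ki, v_p = np.polyfit(x, y, 1)  # slope = Ki, intercept = v_p
    return ki, v_p
```

The fitted slope would then be rescaled to mL/100 g/min for comparison with the figures quoted above.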
Through visualisation of intracellular calcium handling, MEMRI accurately differentiates infarcted, stunned and viable myocardium, and correlates with myocardial dysfunction better than LGE. MEMRI holds major promise in directly assessing myocardial viability, function and calcium handling across a range of cardiac diseases.
NCT03607669; EudraCT number 2016-003782-25.
Although it primarily affects the respiratory system, COVID-19 causes multiple organ damage. One of its grave consequences is a prothrombotic state that manifests as thrombotic, microthrombotic and thromboembolic events. Understanding the effect of antiplatelet and anticoagulation therapy in the context of COVID-19 treatment is therefore important. The aim of this rapid review was to highlight the role of thrombosis in COVID-19 and to provide new insights on the use of antithrombotic therapy in its management. A rapid systematic review was performed using the Preferred Reporting Items for Systematic Reviews. Papers published in English on antithrombotic agent use and COVID-19 complications were eligible. Results showed that the use of anticoagulants increased survival and reduced thromboembolic events in patients. However, despite the use of anticoagulants, patients still suffered thrombotic events, likely due to heparin resistance. Data on antiplatelet use in combination with anticoagulants in the setting of COVID-19 are scarce. The side effects of current anticoagulation therapy emphasise the need to update treatment guidelines. In this rapid review, we address a possible modulatory role of an antiplatelet and anticoagulant combination against COVID-19 pathogenesis. This combination may be an effective form of adjuvant therapy against COVID-19. However, further studies are needed to elucidate the potential risks and benefits associated with this combination.
To characterize functional network changes related to conversion to cognitive impairment in a large sample of patients with multiple sclerosis (MS) over a period of 5 years.
227 MS patients and 59 healthy controls (HCs) of the Amsterdam MS cohort underwent neuropsychological testing and resting-state fMRI at two time points (time-interval 4.9±0.9 years). At both baseline and follow-up, patients were categorized as cognitively preserved (CP, N=123), mildly impaired (MCI, Z<-1.5 on ≥2 cognitive tests, N=32) or impaired (CI, Z<-2 on ≥2 tests, N=72) and longitudinal conversion between groups was determined. Network function was quantified using eigenvector centrality, a measure of regional network importance, which was computed for individual resting-state networks at both time-points.
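Eigenvector centrality assigns each region a weight proportional to the summed weights of its neighbours, i.e. the leading eigenvector of the connectivity matrix. The study's exact pipeline (parcellation, thresholding, and any voxel-wise fast-ECM implementation) is not given here; the sketch below assumes a simple region-level Pearson-correlation matrix with negative weights discarded, and all names are illustrative.

```python
import numpy as np

def eigenvector_centrality(timeseries, n_iter=1000, tol=1e-9):
    """Eigenvector centrality of a functional connectivity graph.
    timeseries : array of shape (n_timepoints, n_regions) of regional BOLD signals.
    Returns one non-negative, unit-norm centrality value per region."""
    # Functional connectivity as Pearson correlation between regional time series
    fc = np.corrcoef(timeseries.T)
    np.fill_diagonal(fc, 0.0)
    fc = np.clip(fc, 0.0, None)  # keep positive weights only (a common convention)

    # Power iteration: centrality is the leading eigenvector of the adjacency matrix
    c = np.ones(fc.shape[1]) / np.sqrt(fc.shape[1])
    for _ in range(n_iter):
        c_new = fc @ c
        c_new /= np.linalg.norm(c_new)
        if np.linalg.norm(c_new - c) < tol:
            break
        c = c_new
    return c_new
```

Regional values could then be averaged within each resting-state network, such as the default mode network, and compared between groups and time points.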
Over time, 18.9% of patients converted to a worse phenotype: 22/123 CP patients (17.9%) converted from CP to MCI, 10/123 (8.1%) from CP to CI, and 12/32 MCI patients (37.5%) converted to CI. At baseline, default mode network (DMN) centrality was higher in CI compared to … clinically progress.
To investigate the relationship between late-life duration of poverty exposure and cognitive function and decline among older adults in China.
Data were from 3,209 participants aged ≥64 years in the Chinese Longitudinal Healthy Longevity Survey (CLHLS). Duration of poverty, defined according to urban and rural regional standards from the China Statistical Yearbook, was assessed based on annual household income from 2005 to 2011 (never in poverty; in poverty for 1/3 of the period; in poverty for ≥2/3 of the period). Cognitive function was measured with the Chinese Mini-Mental State Examination (CMMSE) from 2011 to 2018. We used attrition-weighted, multivariable mixed-effects Tobit regression to examine the association of duration of poverty with cognitive function and rate of decline.
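As a rough illustration of how such a longitudinal model can be set up, the sketch below uses a plain linear mixed model in statsmodels with hypothetical column names, where the poverty-duration × time interaction captures differences in the rate of decline. It deliberately omits two features of the study's analysis, the Tobit treatment of censored CMMSE scores and the attrition weighting, which are not reproduced here.

```python
import pandas as pd
import statsmodels.formula.api as smf

def fit_poverty_cognition_model(df: pd.DataFrame):
    """Linear mixed model relating poverty-duration category to CMMSE trajectories.
    Expected (hypothetical) long-format columns:
      id           participant identifier
      cmmse        CMMSE score at each wave, 2011-2018
      years        years since the 2011 baseline wave
      poverty_dur  'never', 'one_third' or 'two_thirds_plus' (exposure, 2005-2011)
      age, sex, edu illustrative covariates
    NOTE: simplification -- the study used an attrition-weighted mixed-effects
    Tobit model to respect censoring of CMMSE scores; that is not reproduced here."""
    model = smf.mixedlm(
        "cmmse ~ C(poverty_dur, Treatment('never')) * years + age + C(sex) + edu",
        data=df,
        groups=df["id"],
        re_formula="~years",  # random intercept and slope for time per participant
    )
    return model.fit()
```

In this setup, the main effects of the poverty-duration categories correspond to baseline score differences and their interactions with time correspond to differences in the rate of decline.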
A total of 1,162 individuals (36.21%) were never in poverty over the period from 2005 to 2011, 1,172 (36.52%) were in poverty for 1/3 of the period, and 875 (27.27%) were in poverty for ≥2/3 of the period. A longer poverty duration was associated with lower subsequent CMMSE scores, with a dose-response relationship (1/3 vs never in poverty: β = -0.98, 95% CI -1.61 to -0.35; ≥2/3 vs never in poverty: β = -1.55, 95% CI -2.29 to -0.81). However, a longer duration of poverty was associated with a slower rate of CMMSE score decline from 2011 to 2018.
These findings provide valuable evidence on the role of cumulative late-life poverty in relation to cognitive health among older adults in a rapidly urbanizing and aging middle-income country. Our findings may support a compensation hypothesis for cognitive reserve in this setting.