Choroideremia (CHM) is an X-linked chorioretinal dystrophy caused by variants in the CHM gene. The aim of this study was to report the clinical and genetic features of a cohort of affected males with CHM and to establish the relationship between best-corrected visual acuity (BCVA) and age.

Twenty-seven patients from 24 unrelated families underwent detailed ophthalmic examinations and comprehensive molecular genetic analysis. We combined the 27 patients in our cohort with 68 Chinese patients from six previously reported studies to determine a transition age for rapid BCVA decline in the combined 95 patients.

Twenty-three causal CHM variants (nine of them novel) were identified in the 27 patients, who had a mean age of 30.5 ± 17.4 years and a mean BCVA (logMAR) of 0.61 ± 0.79. Patients at different disease stages showed different extents of retinal pigment epithelium (RPE) and choroid abnormalities. Central retinal optical coherence tomography (OCT) scanning revealed defects in the ellipsoid zone and RPE in all patients and outer retinal tubulations in 75%. The 95 patients had a mean age of 33.27 ± 16.27 years and a mean BCVA (logMAR) of 0.72 ± 0.82. BCVA did not decline rapidly before age 25 but decreased at a mean rate of 0.037 logMAR/year after that age.
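The abstract does not state how the transition age was estimated; one common way to obtain a breakpoint and a post-breakpoint slope of this kind is a two-segment (piecewise linear) regression of BCVA on age, choosing the breakpoint that minimises residual error. The sketch below illustrates that approach on synthetic data; the function and the synthetic values are illustrative and are not taken from the study.

```python
import numpy as np

def fit_two_segment(age, bcva, candidate_breaks):
    """Fit BCVA ~ age with one breakpoint; return the break minimising SSE.

    For each candidate breakpoint the model is an ordinary least-squares fit
    on the basis [1, age, max(age - break, 0)], i.e. a continuous piecewise
    line whose slope is allowed to change at the breakpoint.
    """
    best = None
    for b in candidate_breaks:
        X = np.column_stack([np.ones_like(age), age, np.maximum(age - b, 0.0)])
        coef, _, *_ = np.linalg.lstsq(X, bcva, rcond=None)
        sse = float(((X @ coef - bcva) ** 2).sum())
        if best is None or sse < best[0]:
            best = (sse, b, coef)
    _, break_age, (intercept, slope_before, slope_change) = best
    return break_age, slope_before, slope_before + slope_change

# Synthetic example roughly shaped like the reported trend
# (flat before ~25 years, ~0.037 logMAR/year afterwards); not study data.
rng = np.random.default_rng(0)
age = rng.uniform(5, 70, 95)
bcva = 0.1 + 0.037 * np.maximum(age - 25, 0) + rng.normal(0, 0.15, age.size)

break_age, slope_before, slope_after = fit_two_segment(age, bcva, np.arange(10, 60))
print(f"estimated transition age ≈ {break_age:.0f} y, "
      f"slope after ≈ {slope_after:.3f} logMAR/year")
```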

Our results indicate that Chinese patients with CHM variants have a younger transition age for rapid BCVA decline than previously reported for other ethnic groups. Central retinal OCT scanning can identify distinct abnormalities in the retinal structures, and these could serve as additional parameters for monitoring disease progression in patients with CHM.

To compare diabetic retinopathy (DR) severity grading between autonomous Artificial Intelligence (AI)-based outputs from an FDA-approved screening system and human retina specialists' gradings of ultra-widefield (UWF) colour images.

Asymptomatic diabetics without a previous diagnosis of DR were included in this prospective observational pilot study. Patients were imaged with the autonomous AI system (IDx-DR, Digital Diagnostics). For each eye, two 45° colour fundus images were analysed by a secure server-based AI algorithm. UWF colour fundus imaging was performed using Optomap (Daytona, Optos). The International Clinical DR severity score was assessed both on a 7-field area projection (7F-mask) according to the Early Treatment Diabetic Retinopathy Study (ETDRS) and on the total gradable area (UWF full-field) up to the far periphery of the UWF images.

Of 54 patients included (n = 107 eyes), 32 were type 2 diabetics (11 females). Mean BCVA was 0.99 ± 0.25. The autonomous AI graded 16 patients as negative for DR, 28 as having moderate DR, and 10 as having vision-threatening disease (severe DR, proliferative DR, or diabetic macular oedema). Based on the 7F-mask grading, with the worse-graded eye defining the DR stage, 23 patients were negative for DR, while 11 showed mild, 19 moderate, and 1 severe DR. When the UWF full-field area was analysed, 20 patients were negative for DR, while the numbers of patients with mild, moderate, and severe DR were 12, 21, and 1, respectively.
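The abstract reports only marginal counts, not the per-patient cross-tabulation, so agreement statistics cannot be reproduced from it. As an illustration, the sketch below shows how ICDR severity grades are commonly dichotomised into "referable DR" (moderate DR or worse, or diabetic macular oedema) and how AI output could be compared with a human reference in terms of sensitivity and specificity; the grade labels and example gradings are hypothetical.

```python
# Hypothetical illustration: dichotomise ICDR severity into "referable DR"
# (moderate DR or worse, or macular oedema) and compare AI output with a
# human reference grading. The example gradings below are made up.
REFERABLE = {"moderate", "severe", "proliferative", "dme"}

def is_referable(grade: str) -> bool:
    return grade.lower() in REFERABLE

def sensitivity_specificity(ai_grades, human_grades):
    tp = fp = tn = fn = 0
    for ai, human in zip(ai_grades, human_grades):
        ai_pos, ref_pos = is_referable(ai), is_referable(human)
        if ai_pos and ref_pos:
            tp += 1
        elif ai_pos and not ref_pos:
            fp += 1
        elif not ai_pos and ref_pos:
            fn += 1
        else:
            tn += 1
    sens = tp / (tp + fn) if tp + fn else float("nan")
    spec = tn / (tn + fp) if tn + fp else float("nan")
    return sens, spec

ai = ["negative", "moderate", "moderate", "severe", "negative", "dme"]
human = ["negative", "mild", "moderate", "severe", "negative", "moderate"]
print(sensitivity_specificity(ai, human))
```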

The autonomous AI-based DR examination demonstrates sufficient accuracy in identifying asymptomatic non-proliferative diabetic patients with referable DR, even when compared with UWF imaging evaluated by human experts, offering a suitable method for DR screening.

Anti-vascular endothelial growth factor (VEGF) treatments are the first-line treatment for retinal vein occlusion (RVO). Although the effectiveness and safety of these treatments are well documented, knowledge regarding the effect of lapses in anti-VEGF treatment among RVO patients is lacking. The purpose of this study was to analyse the anatomic and visual outcomes of a lapse in anti-VEGF treatment in patients with RVO.

This retrospective case-control study evaluated 136 patients diagnosed with RVO and treated with anti-VEGF between January 2012 and June 2020 at Cole Eye Institute, Cleveland Clinic. Patients were divided into two cohorts: RVO patients with no lapse in anti-VEGF treatment (control group) and RVO patients with a lapse of ≥3 months (lapse group). Central subfield thickness (CST) and best-corrected visual acuity (BCVA) were collected pre-lapse, at the first post-lapse appointment, and at 3-, 6-, and 12-month follow-up appointments.

Lapse patients (n = 68) and control patients (n = 68) had similar pre-lapse CST (p = 0.466) and BCVA (p = 0.303). Lapse patients experienced a significant increase in CST after discontinuing anti-VEGF therapy (lapse 400.6 ± 192.1 µm vs. control 333.0 ± 111.1 µm, p = 0.024). This difference persisted 12 months post-lapse, after re-initiation of anti-VEGF agents (lapse 381.6 ± 161.1 µm vs. control 307.5 ± 95.4 µm, p = 0.030). Lapse patients also experienced a decrease in BCVA after the lapse (lapse 54.3 ± 25.1 ETDRS letters vs. control 64.4 ± 17.8 ETDRS letters, p < 0.001) that recovered after 6 months of anti-VEGF treatment.
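The abstract gives between-group p-values but does not name the statistical test; a plausible reading is an independent two-sample comparison of CST at each time point. The sketch below shows such a comparison using Welch's t-test on synthetic values drawn to resemble the reported means and standard deviations; it is illustrative only and does not reproduce the study data.

```python
import numpy as np
from scipy import stats

# Illustrative only: synthetic post-lapse CST values (µm) for two groups,
# drawn to resemble the reported means/SDs, not the actual study data.
rng = np.random.default_rng(1)
cst_lapse = rng.normal(400.6, 192.1, 68)
cst_control = rng.normal(333.0, 111.1, 68)

# Welch's t-test (no equal-variance assumption) at a single time point.
t_stat, p_value = stats.ttest_ind(cst_lapse, cst_control, equal_var=False)
print(f"t = {t_stat:.2f}, p = {p_value:.3f}")
```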

RVO patients with any lapse of anti-VEGF treatment are at risk for poorer anatomic and visual outcomes. Though BCVA normalizes upon treatment resumption, patients experience a statistically significant increase in CST that does not recover.

The role of natural selection in shaping spatial patterns of genetic diversity in the Neotropics is still poorly understood. Here, we performed a genome scan with 24,751 probes targeting 11,026 loci in two Neotropical Bignoniaceae tree species, Handroanthus serratifolius from the seasonally dry tropical forest (SDTF) and Tabebuia aurea from savannas, and compared the results with the population genomics of H. impetiginosus from the SDTF. OutFLANK detected 29 loci in 20 genes with selection signal in H. serratifolius and no loci in T. aurea. Using BayPass, we found evidence of selection at 335 loci in 312 genes in H. serratifolius, 101 loci in 92 genes in T. aurea, and 448 loci in 416 genes in H. impetiginosus. All approaches identified several genes affecting plant response to environmental stress and primary metabolic processes. The three species shared no SNPs with selection signal, but we found SNPs affecting the same gene in pairs of species. Handroanthus serratifolius showed differences in allele frequencies at SNPs with selection signal among ecosystems, mainly between Caatinga/Cerrado and the Atlantic Forest, whereas H. impetiginosus had one allele fixed across all populations and T. aurea had a similar allele frequency distribution among ecosystems and polymorphism across populations. Taken together, our results indicate that natural selection related to environmental stress shaped the spatial pattern of genetic diversity in the three species. However, the three species have different geographical distributions and niches, which may affect tolerances and adaptation, and natural selection may leave different signatures owing to the differences in adaptive landscapes across niches.

Among all nutrients, nitrogen (N) and phosphorus (P) are the most limiting factors reducing wheat production and productivity worldwide. These macronutrients are applied directly to the soil in the form of fertilizers. However, only 30-40% of the applied fertilizer is utilized by crop plants, while the rest is lost through volatilization, leaching, and surface runoff. Therefore, to overcome N and P deficiency, it is necessary to improve their use efficiency. Marker-assisted selection (MAS) combined with traditional plant breeding approaches is considered the best way to improve the N and P use efficiency (N/PUE) of wheat varieties. In this study, we developed and evaluated a total of 98 simple sequence repeat (SSR) markers, including 66 microRNA-based and 32 gene-specific SSRs, on a panel of 10 (N- and P-efficient/deficient) wheat genotypes. Of these, 35 SSRs were polymorphic and were used to study genetic diversity and population differentiation. Two SSRs, miR171a and miR167a, were identified as candidate markers able to discriminate contrasting genotypes for N and P use efficiency, respectively, and could therefore be used as functional markers for characterization of wheat germplasm for N/PUE. Target genes of these miRNAs were highly associated with biological processes (24 GO terms) compared with molecular function and cellular component, and showed differential expression under various P-starvation conditions and abiotic stresses.

The dominance effect is considered to be a key factor affecting complex traits. However, previous studies have shown that the improvement gained by including the dominance effect in the model is usually less than 1%. This study proposes a novel genomic prediction method, called CADM, which combines additive and dominance genetic effects through locus-specific weights on heterozygous genotypes. To the best of our knowledge, this is the first study to weight dominance effects for genomic prediction. The method was applied to chicken (511 birds) and pig (3534 animals) datasets, and 5-fold cross-validation was used to evaluate genomic predictive ability. The CADM model was compared with a typical model considering additive and dominance genetic effects (ADM) and a model considering only additive genetic effects (AM). For the chicken data, the CADM model improved genomic predictive ability for all three traits (body weight at week 12, eviscerating percentage, and breast muscle percentage), with an average improvement in prediction accuracy of 27.1% compared with the AM model, while the ADM model was not better than the AM model. For the pig data, the CADM model increased genomic predictive ability for all three traits (trait names are masked; here designated T1, T2, and T3), with an average increase of 26.3%, whereas the ADM model did not improve on, or even slightly underperformed, the AM model. The results indicate that dominance genetic variation is an important source of phenotypic variation and that the novel prediction model significantly improves the accuracy of genomic prediction.
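The CADM model is described here only at a high level (additive effects combined with locus-specific weights on heterozygous genotypes). As a minimal sketch of that structure, the example below fits a ridge regression with an additive design matrix plus a dominance design matrix whose heterozygote indicators are scaled by per-locus weights; it is an illustration on random data, not the authors' implementation, and the uniform placeholder weights stand in for whatever locus-specific weighting CADM actually uses.

```python
import numpy as np

def predict_weighted_dominance(genotypes, phenotypes, weights, lam=10.0):
    """Ridge regression with additive + weighted dominance marker effects.

    genotypes : (n, m) array coded 0/1/2 (minor-allele counts)
    weights   : (m,) per-locus weights applied to the heterozygote indicator,
                a stand-in for the locus-specific weighting used by CADM.
    """
    X_add = genotypes - genotypes.mean(axis=0)           # centred additive codes
    X_dom = (genotypes == 1).astype(float) * weights      # weighted heterozygosity
    X = np.hstack([X_add, X_dom])
    # Ridge solution: (X'X + lam*I)^-1 X'y on centred phenotypes
    beta = np.linalg.solve(X.T @ X + lam * np.eye(X.shape[1]),
                           X.T @ (phenotypes - phenotypes.mean()))
    return phenotypes.mean() + X @ beta

# Toy data: 200 individuals, 500 SNPs, uniform weights as a placeholder.
rng = np.random.default_rng(42)
geno = rng.integers(0, 3, size=(200, 500)).astype(float)
pheno = geno[:, :10].sum(axis=1) + rng.normal(0, 1, 200)
w = np.ones(500)  # in CADM these would be locus-specific, not uniform
print(predict_weighted_dominance(geno, pheno, w)[:5])
```

Predictive ability would then be assessed by correlating predicted and observed phenotypes within a 5-fold cross-validation, as described in the abstract.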
Traumatic brain injury (TBI) survivors suffer from long-term disability and neuropsychiatric sequelae due to irreparable destruction of brain tissue. However, there are still few efficient therapies to promote neurorestoration in damaged brain tissue. This study aimed to investigate whether the pro-oncogenic gene ski can promote neurorestoration after TBI. We established a ski-overexpressing experimental TBI mouse model using adenovirus-mediated overexpression delivered by injection immediately after injury. Hematoxylin-eosin staining, MRI-based 3D lesion volume reconstruction, neurobehavioral tests, and analyses of neuronal regeneration and astrogliosis were used to assess neurorestorative efficiency. The effects of ski overexpression on the proliferation of cultured immature neurons and astrocytes were evaluated using imaging flow cytometry. The Ski protein level increased in the perilesional region at 3 days post injury, and ski overexpression further elevated Ski protein levels up to 14 days post injury. Lesion volume was attenuated by approximately 36-55% after ski overexpression, with better neurobehavioral recovery, more newborn immature and mature neurons, and less astrogliosis in the perilesional region.
