The most prevalent bleed type was traumatic musculoskeletal bleeding. Bleeding events occurred more often in the first 10 weeks after starting emicizumab, but no time period was free of bleeding events. Most patients were prescribed every-week or every-2-week dosing, but some used alternative dosing frequencies.
Real-world emicizumab use in our center was characterized by variation in prescribing practices and bleeding outcomes, and by an absence of severe adverse reactions.
Despite the numerous and groundbreaking therapeutic advances made in the field of hemophilia over the past decades, and particularly in recent years, hemophilia remains a disease with a major impact on the daily lives of our patients through the multiple complications and burdensome treatments it imposes. The disease burden is not only physical but also psychological, and is difficult to evaluate solely by questionnaires and scores. In this article, we propose to examine the absence of psychological burden and of permanent thoughts about the disease and its complications in people with hemophilia as a new ambition that should guide hemophilia care and research in the future.
Accurate assessment of a molecular classifier that guides patient care is of paramount importance in precision oncology. Recent years have seen an increasing use of external validation for such assessment. However, little is known about how it is affected by ubiquitous unwanted variations in test data caused by disparate experimental handling, or by the use of data normalization to alleviate such variations.
In this paper, we studied these issues using two microarray data sets for the same set of tumor samples and additional data simulated by resampling under various levels of signal-to-noise ratio and different designs for array-to-sample allocation.
We showed that (1) unwanted variations can lead to biased classifier assessment and (2) data normalization mitigates the bias to varying extents depending on the specific method used. In particular, frozen normalization methods for test data outperform their conventional forms in terms of both reducing the bias in accuracy estimation and increasing robustness to handling effects. We make available our benchmarking tool as an R package on GitHub for performing such evaluation on additional methods for normalization and classification.
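The contrast between frozen and conventional normalization of test data can be illustrated with a simplified sketch. The study's actual normalization methods are not named here, so the z-score analogue below, along with the simulated handling effect, is an assumption for illustration only: a conventional method re-estimates its parameters on the test batch (absorbing the batch's own handling shift into the reference), while a frozen method reuses parameters estimated on the training data.

```python
import numpy as np

def conventional_normalize(X):
    # Conventional form: standardize each feature using the test
    # batch's own mean and SD, so batch-wide handling shifts vanish
    # into the reference and can bias downstream assessment.
    return (X - X.mean(axis=0)) / X.std(axis=0)

def frozen_normalize(X_test, train_mean, train_std):
    # Frozen form: reuse parameters estimated on the training data,
    # so the test batch is mapped onto the training reference frame.
    return (X_test - train_mean) / train_std

rng = np.random.default_rng(0)
X_train = rng.normal(0.0, 1.0, size=(50, 10))   # 50 samples, 10 features
# Hypothetical handling effect: a constant +0.5 shift in the test batch.
X_test = rng.normal(0.0, 1.0, size=(20, 10)) + 0.5

mu, sd = X_train.mean(axis=0), X_train.std(axis=0)
Z_frozen = frozen_normalize(X_test, mu, sd)
Z_conv = conventional_normalize(X_test)
# Z_conv centers the test batch on itself (mean ~ 0), hiding the shift;
# Z_frozen preserves the shift relative to the training reference.
```

A classifier whose decision boundary was fit on the training scale sees consistent inputs under the frozen scheme; under the conventional scheme, the apparent accuracy depends on the handling effect that was silently removed or distorted.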
Our findings thus highlight the importance of proper test-data normalization for valid assessment by external validation and call for caution on the choice of normalization method for molecular classifier development.
To compare clinical outcomes in a cohort of patients with advanced non-small-cell lung cancer (NSCLC) with targetable genomic alterations detected using plasma-based circulating tumor DNA (ctDNA) or tumor-based next-generation sequencing (NGS) assays treated with US Food and Drug Administration-approved therapies at a large academic research cancer center.
A retrospective review from our MD Anderson GEMINI database identified 2,224 blood samples sent for ctDNA NGS testing from 1,971 consecutive patients with a diagnosis of advanced NSCLC. Clinical, treatment, and outcome information were collected, reviewed, and analyzed.
Overall, 27% of the ctDNA tests identified at least one targetable mutation, and 73% of targetable mutations were EGFR-sensitizing mutations. Among patients treated with first-line epidermal growth factor receptor (EGFR)-tyrosine kinase inhibitor (TKI) therapies guided by plasma-based comprehensive profiling, there was no significant difference in progression-free survival (379 days vs. 352 days; P = .41) between patients with very low variant allele frequency (VAF) and those with high VAF, supporting the use of plasma-based profiling to guide initial TKI use in patients with metastatic EGFR-mutant NSCLC.
High residual C-peptide in longer-duration type 1 diabetes (T1D) is associated with fewer hypoglycemic events and reduced glycemic variability. Little is known about the impact of C-peptide close to diagnosis.
Using continuous glucose monitoring (CGM) data from a study of newly diagnosed adults with T1D, we aimed to explore if variation in C-peptide close to diagnosis influenced glycemic variability and risk of hypoglycemia.
We studied newly diagnosed adults with T1D who wore a Dexcom G4 CGM for 7 days as part of the Exercise in Type 1 Diabetes (EXTOD) study. We examined the relationship between peak stimulated C-peptide and glycemic metrics of variability and hypoglycemia for 36 CGM traces from 23 participants.
For every 100 pmol/L increase in peak C-peptide, the percentage of time spent in the range 3.9 to 10 mmol/L increased by 2.4% (95% CI, 0.5-4.3; P = .01), with reductions in time spent in level 1 hyperglycemia (>10 mmol/L) and level 2 hyperglycemia (>13.9 mmol/L) of 2.6% (95% CI, -4.9 to -0.4; P = .02) and 1.3% (95% CI, -2.7 to -0.006; P = .04), respectively. Glucose levels were on average lower by 0.19 mmol/L (95% CI, -0.4 to 0.02; P = .06) and the SD was reduced by 0.14 (95% CI, -0.3 to -0.02; P = .02). Hypoglycemia was uncommon in this group, and no association was observed between peak C-peptide and time spent in hypoglycemia (P = .97) or hypoglycemic risk (P = .72). There was no association between peak C-peptide and insulin dose-adjusted glycated hemoglobin A1c (P = .45).
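The CGM-derived metrics reported here (time in range, level 1 and level 2 hyperglycemia, hypoglycemia) follow directly from the glucose thresholds stated in the text. A minimal sketch of their computation from a glucose trace, with a made-up example trace, might look as follows; the function name and output keys are illustrative, not from the study.

```python
def cgm_metrics(glucose_mmol):
    """Percent of CGM readings (mmol/L) in each glycemic category."""
    n = len(glucose_mmol)
    # Time in range: 3.9 to 10 mmol/L inclusive, as in the text.
    tir = sum(3.9 <= g <= 10.0 for g in glucose_mmol) / n * 100
    # Level 1 hyperglycemia: >10 mmol/L; level 2 (>13.9 mmol/L) is a subset.
    lvl1 = sum(g > 10.0 for g in glucose_mmol) / n * 100
    lvl2 = sum(g > 13.9 for g in glucose_mmol) / n * 100
    hypo = sum(g < 3.9 for g in glucose_mmol) / n * 100
    return {"time_in_range": tir, "hyper_l1": lvl1,
            "hyper_l2": lvl2, "hypo": hypo}

# Hypothetical 8-reading trace for illustration.
trace = [3.5, 5.2, 7.8, 10.5, 14.2, 9.1, 6.0, 4.0]
m = cgm_metrics(trace)
# 5 of 8 readings fall in 3.9-10 mmol/L, so time in range is 62.5%.
```

In practice these percentages are computed over a full 7-day trace at the sensor's sampling interval (5 minutes for the Dexcom G4), but the arithmetic is the same.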
C-peptide is associated with time spent in the normal glucose range and with less hyperglycemia, but not risk of hypoglycemia in newly diagnosed people with T1D.
A germline mutation is identified in almost 40% of pheochromocytoma/paraganglioma (PPGL) syndromes. Genetic testing and counseling are essential for the management of index cases as well as presymptomatic identification and preemptive management of affected family members. Mutations in the genes encoding the mitochondrial enzyme succinate dehydrogenase (SDH) are well described in patients with hereditary PPGL. Among patients of African ancestry, the prevalence, phenotype, germline mutation spectrum, and penetrance of SDH mutations are poorly characterized. We describe a multifocal paraganglioma in a young African male with an underlying missense succinate dehydrogenase subunit B (SDHB) mutation and a history of 3 first-degree relatives who died at young ages from suspected cardiovascular causes. The same SDHB mutation, Class V variant c.724C>A p.(Arg242Ser), was detected in one of his asymptomatic siblings. As there are limited data describing hereditary PPGL syndromes in Africa, this report of an SDHB-associated PPGL is a notable contribution to the literature in this growing field. Due to the noteworthy clinical implications of PPGL mutations, this work highlights the existing need for broader genetic screening among African patients with PPGL despite the limited healthcare resources available in this region.
Latin American reports on pheochromocytomas and paragangliomas (PPGLs) are scarce. Recent studies demonstrate changes in clinical presentation and management of these patients. Herein, we assessed the main characteristics of PPGL patients in our academic center over the past 4 decades.
Demographic, clinical, biochemical, and perioperative data from 105 PPGL patients were retrospectively and prospectively collected over the 1980-2019 period. Data were organized into 4 periods by decade.
Age at diagnosis, gender, tumor size and percentage of bilaterality, percentage of paragangliomas, and metastases remained stable across the 4 decades. The proportion of genetic testing and incidentalomas increased in recent decades (all P < 0.001). Therefore, we compared PPGLs diagnosed as incidentalomas (36%) with those clinically suspected (64%). Incidentalomas had fewer adrenergic symptoms (38% vs. 62%; P < 0.001) and lower rates of hypertension (64% vs. 80%; P = 0.01) and hypertensive crisis (28% vs. 44%). In recent decades there were more incidental diagnoses, more genetic testing, and improvements in perioperative management.

Coronavirus disease 2019 (COVID-19), caused by severe acute respiratory syndrome coronavirus 2, was first identified in Wuhan, China, in December 2019. As the number of COVID-19 infections and deaths worldwide continues to rise rapidly, the prevention and control of COVID-19 remains urgent. This article analyzes COVID-19 from a geographical perspective, providing useful insights for rapid visualization of spatial-temporal epidemic information and identification of the factors important to the spread of COVID-19. A new type of visualization method, called the point grid map, is integrated with calendar-based visualization to show the spatial-temporal variations in COVID-19. The combination of mixed geographically weighted regression (mixed GWR) and extreme gradient boosting (XGBoost) is used to identify the potential factors and their corresponding importance. The visualization results clearly reflect the spatial-temporal patterns of COVID-19. The quantified results reveal that the impact of population outflow from Wuhan is the most important factor and indicate statistically significant spatial heterogeneity. Our results provide insights into how multisource big geodata can be employed within a framework integrating visualization and analytical methods to characterize COVID-19 trends. In addition, this work can help identify the influential factors for controlling and preventing epidemics, which is important for policy design and effective decision-making for controlling COVID-19.
The results reveal that the most effective ways to control COVID-19 include controlling the source of infection, cutting off the transmission route, and protecting vulnerable groups.

This study summarizes the results from fitting a Bayesian hierarchical spatiotemporal model to coronavirus disease 2019 (COVID-19) cases and deaths at the county level in the United States for the year 2020. Two models were created, one for cases and one for deaths, using a scaled Besag, York, Mollié model with Type I spatial-temporal interaction. Each model accounts for 16 social vulnerability and 7 environmental variables as fixed effects. The spatial patterns of COVID-19 cases and deaths differ significantly in many ways. The spatiotemporal trend of the pandemic in the United States illustrates a shift out of many of the major metropolitan areas into the Southeast and Southwest during the summer months and into the upper Midwest beginning in autumn. Analysis of the major social vulnerability predictors of COVID-19 infection and death found that counties with higher percentages of residents without a high school diploma, of non-White residents, and of residents aged 65 and over were significantly affected. Among the environmental variables, above-ground-level temperature had the strongest effect on relative risk for both cases and deaths. Hot and cold spots, areas of statistically significant high and low COVID-19 cases and deaths, respectively, derived from the convolutional spatial effect show that areas with a high probability of above-average relative risk have significantly higher Social Vulnerability Index composite scores. The same analysis using the spatiotemporal interaction term reveals a more complex relationship between social vulnerability, environmental measurements, COVID-19 cases, and COVID-19 deaths.