Nealhogan9284

Genome-wide association study (GWAS) has become an essential technology for exploring the genetic mechanisms of complex traits. To reduce the computational burden, it is common practice to remove unrelated single nucleotide polymorphisms (SNPs) before GWAS, e.g., by using the iterative sure independence screening expectation-maximization Bayesian Lasso (ISIS EM-BLASSO) method. In this work, a modified version of ISIS EM-BLASSO is proposed, which reduces the number of SNPs by a screening methodology based on Pearson correlation and mutual information (sketched below), then estimates the effects via EM-Bayesian Lasso (EM-BLASSO), and finally detects the true quantitative trait nucleotides (QTNs) through a likelihood ratio test. We call our method a two-stage mutual-information-based Bayesian Lasso (MBLASSO). Under three simulation scenarios, MBLASSO improves statistical power and retains higher effect-estimation accuracy when compared with three other algorithms. Moreover, MBLASSO performs best on model fitting, the accuracy of detected associations is the highest, and 21 genes can only be detected by MBLASSO in Arabidopsis thaliana datasets.

Adsorption chillers are characterized by low electricity consumption, lack of moving parts, and high reliability. The disadvantage of these chillers is their large weight due to the low sorption capacity of the adsorbent. Attention has therefore turned to finding a sorbent with high water-sorption capacity and enhanced thermal conductivity to increase chiller efficiency. The article discusses the impact of selected adhesives used in the production of an adsorption bed in order to improve heat exchange on its surface. Experiments were performed with silica gel and three commercial types of glue on metal plates representing a heat exchanger. The structure of the samples was observed under a microscope to determine the coverage of the adsorbent by glue. To determine the kinetics of free adsorption, the amount of moisture adsorbed, and the desorption dynamics, the prepared samples of the coated bed on metal plates were moistened and dried in a moisture analyzer. Samples made of silica gel mixed with the adhesive 2-hydroxyethyl cellulose show high adsorption capacity, low dynamic adsorption, and medium dynamic desorption. Samples containing the adhesive poly(vinyl alcohol) adsorb less moisture, but free adsorption and desorption were more dynamic. Samples containing the adhesive hydroxyethyl cellulose show lower moisture capacity, relatively dynamic adsorption, and lower dynamic desorption.

Multi-level thresholding is one of the effective segmentation methods that have been applied in many applications. Traditional methods face challenges in determining suitable threshold values; therefore, metaheuristic (MH) methods have been adopted to solve these challenges. In general, MH methods have been proposed by simulating natural behaviors of swarm ecosystems, such as birds, animals, and others. The current study proposes an alternative multi-level thresholding method based on a new MH method, a modified spherical search optimizer (SSO). This was performed by using the operators of the sine cosine algorithm (SCA) to enhance the exploitation ability of the SSO. Moreover, fuzzy entropy is applied as the main fitness function to evaluate the quality of each solution inside the population of the proposed SSOSCA, since fuzzy entropy has established its performance in the literature (a minimal fitness sketch also follows below). Several images from the well-known Berkeley dataset were used to test and evaluate the proposed method. The evaluation outcomes confirmed that SSOSCA performs better than several existing methods according to different image-segmentation measures.
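As a loose illustration of the MBLASSO screening stage described in the first abstract above, the sketch below ranks SNPs by both Pearson correlation and mutual information with the phenotype and keeps the union of the top candidates. The union rule, the `top_k` cutoff, and the use of scikit-learn's `mutual_info_regression` are assumptions of this sketch, not the published screening criteria.

```python
# Illustrative first-stage SNP screening in the spirit of MBLASSO:
# score every SNP by Pearson correlation and by mutual information
# with the phenotype, then keep the union of the top candidates.
import numpy as np
from sklearn.feature_selection import mutual_info_regression

def screen_snps(genotypes, phenotype, top_k=100):
    """genotypes: (n_samples, n_snps) 0/1/2 matrix; phenotype: (n_samples,)."""
    # Pearson correlation of each SNP column with the phenotype.
    g = (genotypes - genotypes.mean(axis=0)) / (genotypes.std(axis=0) + 1e-12)
    p = (phenotype - phenotype.mean()) / (phenotype.std() + 1e-12)
    pearson = np.abs(g.T @ p) / len(p)

    # Mutual information, treating genotype codes as discrete features.
    mi = mutual_info_regression(genotypes, phenotype, discrete_features=True)

    keep = set(np.argsort(pearson)[-top_k:]) | set(np.argsort(mi)[-top_k:])
    return np.array(sorted(keep))

# Toy usage with simulated data: SNPs 10 and 500 carry the signal.
rng = np.random.default_rng(0)
X = rng.integers(0, 3, size=(200, 1000)).astype(float)
y = 0.8 * X[:, 10] - 0.5 * X[:, 500] + rng.normal(size=200)
print(screen_snps(X, y)[:10])
```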
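For the SSOSCA abstract above, the following is a minimal sketch of a fuzzy-entropy fitness evaluated on a candidate threshold vector, assuming triangular membership functions and a Shannon-type entropy over fuzzy class probabilities; the paper's exact membership and entropy definitions may differ.

```python
# Minimal fuzzy-entropy fitness for multi-level thresholding.
# A metaheuristic such as the modified SSO would search for the
# threshold vector maximizing this value over a fixed histogram.
import numpy as np

def fuzzy_entropy_fitness(hist, thresholds):
    """hist: normalized 256-bin grayscale histogram; thresholds: sorted cut points."""
    levels = np.arange(256, dtype=float)
    cuts = [0.0, *sorted(thresholds), 255.0]
    class_probs = []
    for lo, hi in zip(cuts[:-1], cuts[1:]):
        centre = (lo + hi) / 2.0
        width = max(hi - lo, 1e-9)
        # Triangular fuzzy membership of each gray level in this class.
        mu = np.clip(1.0 - np.abs(levels - centre) / width, 0.0, 1.0)
        class_probs.append(float(np.sum(hist * mu)))
    p = np.array(class_probs)
    p = p[p > 0] / p.sum()
    return float(-np.sum(p * np.log(p)))  # higher is better

# Usage: fuzzy_entropy_fitness(hist, [85, 170]) for a 3-class split.
```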
In a previous work, a parsimonious topic model (PTM) was proposed for text corpora. In that work, unlike latent Dirichlet allocation (LDA), the modeling determined a subset of salient words for each topic, with topic-specific probabilities, and the rest of the words in the dictionary were explained by a universal shared model. Further, in LDA all topics are in principle present in every document. In contrast, PTM gives a sparse topic representation, determining the (small) subset of relevant topics for each document. A customized Bayesian information criterion (BIC) was derived, balancing model complexity and goodness of fit, with the BIC minimized to jointly determine the entire model (the topic-specific words, document-specific topics, all model parameter values, and the total number of topics) in a wholly unsupervised fashion; the textbook BIC is sketched below. In the present work, several important modeling and algorithmic (parameter learning) extensions of PTM are proposed. First, we modify the BIC objective function using a lossless coding scheme with low modeling cost for describing words that are non-salient for all topics; such words are essentially identified as wholly noisy/uninformative. This approach increases the PTM's model sparsity, which also allows model selection of more topics and with lower BIC cost than the original PTM. Second, in the original PTM model learning strategy, word switches were updated sequentially, which is myopic and susceptible to finding poor locally optimal solutions. Here, instead, we jointly optimize all the switches that correspond to the same word (across topics). This approach jointly optimizes many more parameters at each step than the original PTM, which in principle should be less susceptible to finding poor local minima. Results on several document data sets show that our proposed method outperformed the original PTM model with respect to multiple performance measures, and gave a sparser topic-model representation than the original PTM.

In this work, the dependence of reversal potentials and zero-current fluxes on diffusion coefficients is examined for ionic flows through membrane channels. The study is conducted for the setup of a simple structure defined by the profile of permanent charges with two mobile ion species, one positively charged (cation) and one negatively charged (anion). Numerical observations are obtained from analytical results established using geometric singular perturbation analysis of classical Poisson-Nernst-Planck models. For 1:1 ionic mixtures with arbitrary diffusion constants, Mofidi and Liu (arXiv:1909.01192) conducted a rigorous mathematical analysis and derived an equation for reversal potentials. We summarize and extend these results with numerical observations for biologically relevant situations. Numerical investigations of the profiles of the electrochemical potentials, ion concentrations, and electrical potential across ion channels are also presented for the zero-current case (a toy reversal-potential computation is sketched below). Moreover, the dependence of currents and fluxes on voltages and permanent charges is investigated. In the opinion of the authors, many results in the paper are not intuitive, and it is difficult, if not impossible, to reveal all cases without investigations of this type.
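The PTM work above customizes the Bayesian information criterion; for reference, the textbook BIC it builds on trades goodness of fit against parameter count, as in this sketch (the PTM-specific coding costs for word and topic switches are not reproduced here).

```python
# Textbook Bayesian information criterion: reward fit (log-likelihood),
# penalize complexity (parameter count times log of the sample size).
import math

def bic(log_likelihood, n_params, n_observations):
    """Lower is better: -2 log L + k log n."""
    return -2.0 * log_likelihood + n_params * math.log(n_observations)

# Model selection: evaluate candidate topic counts, keep the minimizer.
# candidates = {T: (loglik_T, k_T) for T in range(2, 50)}
# best_T = min(candidates, key=lambda T: bic(*candidates[T], n_obs))
```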
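The reversal potential studied in the Poisson-Nernst-Planck abstract above is the voltage at which the net current vanishes. As a toy illustration only, using the classical Goldman-Hodgkin-Katz current equation rather than the PNP-derived equation of arXiv:1909.01192, one can locate it for one cation and one anion by a root search:

```python
# Toy zero-current (reversal) potential for a cation/anion pair using
# the classical GHK current equation; illustrative only.
import numpy as np
from scipy.optimize import brentq

F, R, T = 96485.0, 8.314, 298.0  # C/mol, J/(mol K), K

def ghk_current(V, P, z, c_in, c_out):
    """GHK current of one ion species at membrane voltage V (volts)."""
    if abs(V) < 1e-12:
        return P * z * F * (c_in - c_out)  # limit as V -> 0
    u = z * F * V / (R * T)
    return P * z * F * u * (c_in - c_out * np.exp(-u)) / (1.0 - np.exp(-u))

def reversal_potential(P_cat, P_an, cat_in, cat_out, an_in, an_out):
    """Voltage where the total current of cation (+1) and anion (-1) is zero."""
    net = lambda V: (ghk_current(V, P_cat, +1, cat_in, cat_out)
                     + ghk_current(V, P_an, -1, an_in, an_out))
    return brentq(net, -0.3, 0.3)

# Unequal diffusion (here: permeability) constants shift the reversal
# potential, the dependence the abstract investigates numerically.
print(reversal_potential(P_cat=1.0, P_an=0.2,
                         cat_in=0.1, cat_out=0.4, an_in=0.1, an_out=0.4))
```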
With the emergence of network security issues, various security devices that generate large numbers of logs and alerts are widely used. This paper proposes an alert aggregation scheme based on conditional rough entropy and knowledge granularity to solve the problem of repetitive and redundant alert information in network security devices. Firstly, we use conditional rough entropy and knowledge granularity to determine the attribute weights. This method can determine the important attributes and their weights for different types of attacks. Based on the resulting attribute weights, the similarity value of two alerts is calculated as a weighted combination of per-attribute similarities. Subsequently, a sliding-time-window method is used to aggregate the alerts whose similarity value is larger than a threshold, which is set to reduce redundant alerts (a sketch of this aggregation stage follows below). Finally, the proposed scheme is applied to the CIC-IDS 2018 dataset and the DARPA 98 dataset. The experimental results show that this method can effectively reduce redundant alerts and improve the efficiency of data processing, thus providing accurate and concise data for the next stage of alert fusion and analysis.

Evaluating the harmonic contribution of each nonlinear customer is important for harmonic mitigation in a power system with diverse and complex harmonic sources. The existing evaluation methods have two shortcomings: (1) the calculation accuracy is easily affected by background-harmonic fluctuation; and (2) they rely on Global Positioning System (GPS) measurements, which is not economical when widely applied. In this paper, based on the properties of asynchronous measurements, we propose a model for evaluating harmonic contributions without GPS technology. In addition, based on the Gaussianity of the measured harmonic data, a mixed-entropy screening mechanism is proposed to assess the fluctuation degree of the background harmonics for each data segment. Only the segments with relatively stable background harmonics are chosen for calculation, which reduces the impact of the background harmonics to a certain degree. Additionally, complex independent component analysis, a promising method for this field, is improved in this paper. During the calculation process, the sparseness of the mixing matrix in this method is used to reduce the optimization dimension and enhance the evaluation accuracy. The validity and effectiveness of the proposed methods are verified through simulations and field case studies.

We reveal the analytic relations between a matrix permanent and major complexities of nature manifested in critical phenomena, fractal structures and chaos, quantum information processes in many-body physics, number-theoretic complexity in mathematics, and ♯P-complete problems in the theory of computational complexity (an exact formula for the permanent is sketched below). They follow from a reduction of the Ising model of critical phenomena to the permanent, and from four integral representations of the permanent based on (i) fractal Weierstrass-like functions, (ii) polynomials of complex variables, (iii) the Laplace integral, and (iv) the MacMahon master theorem.

Entropy and information inequalities are vitally important in many areas of mathematics and engineering [...].
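For the alert-aggregation abstract above, here is a minimal sketch of the weighted-similarity and sliding-time-window stages. The attribute names, the equality-based per-attribute similarity, and the threshold and window values are assumptions of this sketch; the entropy-based weights are taken as given input rather than computed.

```python
# Sketch: weighted alert similarity plus sliding-window aggregation.
from dataclasses import dataclass, field

@dataclass
class Alert:
    time: float                      # seconds since epoch
    attrs: dict = field(default_factory=dict)

def similarity(a, b, weights):
    """Weighted sum of per-attribute matches between two alerts."""
    return sum(w for k, w in weights.items() if a.attrs.get(k) == b.attrs.get(k))

def aggregate(alerts, weights, threshold=0.8, window=60.0):
    alerts = sorted(alerts, key=lambda a: a.time)
    clusters = []                    # each cluster keeps its alerts in order
    for alert in alerts:
        merged = False
        for cluster in clusters:
            rep = cluster[-1]        # most recent alert in the cluster
            if alert.time - rep.time <= window and \
               similarity(alert, rep, weights) >= threshold:
                cluster.append(alert)
                merged = True
                break
        if not merged:
            clusters.append([alert])
    return clusters

weights = {"src_ip": 0.4, "dst_ip": 0.3, "signature": 0.3}
alerts = [Alert(0, {"src_ip": "10.0.0.1", "dst_ip": "10.0.0.9", "signature": "scan"}),
          Alert(5, {"src_ip": "10.0.0.1", "dst_ip": "10.0.0.9", "signature": "scan"})]
print(len(aggregate(alerts, weights)))   # -> 1 aggregated cluster
```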
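On the matrix-permanent abstract above: although computing the permanent is ♯P-hard in general, Ryser's inclusion-exclusion formula evaluates it exactly in O(2^n n) arithmetic operations, as in this small reference implementation.

```python
# Ryser's formula:
# perm(A) = (-1)^n * sum over nonempty column subsets S of
#           (-1)^{|S|} * prod_i (sum_{j in S} a_ij).
import itertools
import numpy as np

def permanent(A):
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    total = 0.0
    for r in range(1, n + 1):
        for cols in itertools.combinations(range(n), r):
            total += (-1) ** r * np.prod(A[:, list(cols)].sum(axis=1))
    return (-1) ** n * total

print(permanent([[1, 2], [3, 4]]))   # 1*4 + 2*3 = 10
```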
Convolutional neural networks (CNNs) are the mainstream solution in the field of image retrieval. Deep metric learning has been introduced into this field, focusing on the construction of pair-based loss functions. However, most pair-based loss functions in metric learning merely take a common vector similarity (such as the Euclidean distance) of the final image descriptors into consideration, while neglecting other distributional characteristics of these descriptors. In this work, we propose relative distribution entropy (RDE) to describe the internal distribution attributes of image descriptors. We combine relative distribution entropy with the Euclidean distance to obtain the relative distribution entropy weighted distance (RDE-distance); a loose illustration is sketched below. Moreover, the RDE-distance is fused with the contrastive loss and the triplet loss to build relative distribution entropy loss functions. The experimental results demonstrate that our method attains state-of-the-art performance on most image retrieval benchmarks.

Entropy quantification algorithms are becoming a prominent tool for the physiological monitoring of individuals through the effective measurement of irregularity in biological signals. However, to ensure their effective adaptation in monitoring applications, the performance of these algorithms needs to be robust when analysing time-series containing missing and outlier samples, which are a common occurrence in physiological monitoring setups such as wearable devices and intensive care units. This paper focuses on augmenting Dispersion Entropy (DisEn) by introducing novel variations of the algorithm for improved performance in such applications (the base algorithm is sketched below). The original algorithm and its variations are tested under different experimental setups that are replicated across heart-rate interval, electroencephalogram, and respiratory impedance time-series. Our results indicate that the algorithmic variations of DisEn achieve considerable improvements in performance, while our analysis signifies that, in agreement with previous research, outlier samples can have a major impact on the performance of entropy quantification algorithms.
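The RDE abstract above does not spell out the definition of relative distribution entropy, so the following is only a guess at the flavour of an entropy-weighted distance: a Shannon entropy is computed over the histogram of each descriptor's entries, and the entropy gap modulates the Euclidean distance. The binning, the entropy form, and the (1 + gap) weighting are all assumptions of this illustration, not the published RDE-distance.

```python
# Hypothetical entropy-weighted descriptor distance, loosely inspired
# by the RDE-distance idea; not the published formulation.
import numpy as np

def descriptor_entropy(x, bins=16):
    """Shannon entropy of the histogram of one descriptor's entries."""
    p, _ = np.histogram(x, bins=bins)
    p = p / p.sum()
    p = p[p > 0]
    return float(-np.sum(p * np.log(p)))

def rde_weighted_distance(x, y, bins=16):
    gap = abs(descriptor_entropy(x, bins) - descriptor_entropy(y, bins))
    return (1.0 + gap) * np.linalg.norm(x - y)

a, b = np.random.rand(512), np.random.rand(512)
print(rde_weighted_distance(a, b))
```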
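For the DisEn abstract above, this is a sketch of the original Dispersion Entropy algorithm (Rostaghi and Azami, 2016) that the proposed variations build on: map the signal through the normal CDF, quantize into c classes, count embedded dispersion patterns, and take the Shannon entropy of their frequencies.

```python
# Dispersion Entropy of a 1-D signal.
import numpy as np
from collections import Counter
from math import erf, sqrt, log

def dispersion_entropy(x, m=3, c=6, delay=1, normalize=True):
    x = np.asarray(x, dtype=float)
    mu, sigma = x.mean(), x.std()
    # 1) Normal-CDF mapping of the signal into (0, 1).
    y = np.array([0.5 * (1 + erf((v - mu) / (sigma * sqrt(2)))) for v in x])
    # 2) Quantize into classes 1..c.
    z = np.clip(np.round(c * y + 0.5).astype(int), 1, c)
    # 3) Dispersion patterns: embedding vectors of length m.
    n_patterns = len(z) - (m - 1) * delay
    patterns = Counter(tuple(z[i + k * delay] for k in range(m))
                       for i in range(n_patterns))
    # 4) Shannon entropy of the pattern probabilities.
    probs = np.array(list(patterns.values())) / n_patterns
    h = float(-np.sum(probs * np.log(probs)))
    return h / log(c ** m) if normalize else h

rng = np.random.default_rng(1)
print(dispersion_entropy(rng.normal(size=2000)))   # near 1 for white noise
```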

Article authors: Nealhogan9284 (Melton Emery)