Halludsen4126

From Iurium Wiki

Increasing the temperature in the muffle furnace reduces the concentration of combustion products of gas hydrates.

The generalized cumulative residual entropy is a recently defined dispersion measure. In this paper, we obtain further results for this measure, in relation to the cumulative residual entropy and the variance of random lifetimes. We show that it has an intimate connection with the non-homogeneous Poisson process. We also derive new expressions, bounds, and stochastic comparisons involving these measures. Moreover, the dynamic versions of these notions are studied through residual lifetimes and suitable aging notions. In this framework we obtain findings of interest in reliability theory, such as a characterization of the exponential distribution, various results on k-out-of-n systems, and a connection to the excess wealth order. We also obtain similar results for the generalized cumulative entropy, which is a dual measure to the generalized cumulative residual entropy.

A single-input-multiple-output (SIMO) channel is obtained by using an array of antennas at the receiver, where the same information is transmitted through different sub-channels and all received sequences are distinctly distorted versions of the same message. The inter-symbol interference (ISI) level of each sub-channel is unknown to the receiver. Thus, even when one or more sub-channels cause heavy ISI, the information from all sub-channels is still considered at the receiver. If the approximate ISI of each sub-channel were known, the receiver could use only those sub-channels with the lowest ISI levels to obtain improved system performance. In this paper, we present a systematic way to obtain the approximate ISI of each sub-channel, modelled as a finite-impulse-response (FIR) channel with real-valued coefficients, for transmission of a 16QAM (16 quadrature amplitude modulation) source signal.
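Returning to the dispersion measure discussed above: the generalized cumulative residual entropy of order n is commonly defined in the literature as E_n(X) = (1/n!) ∫ S(x) (-ln S(x))^n dx, where S is the survival function, and for an exponential lifetime it equals 1/λ for every order n, which is the flavor of exponential characterization the abstract mentions. A minimal numerical sketch (the integration routine, cutoff, and step count are illustrative choices, not the paper's method):

```python
import math

def gcre_numeric(survival, n, upper, steps=100_000):
    """Generalized cumulative residual entropy of order n,
    E_n(X) = (1/n!) * integral_0^upper S(x) * (-ln S(x))^n dx,
    by trapezoidal integration of the survival function S."""
    h = upper / steps
    total = 0.0
    for i in range(steps + 1):
        x = i * h
        s = survival(x)
        f = s * (-math.log(s)) ** n if 0.0 < s < 1.0 else 0.0
        total += f if 0 < i < steps else f / 2.0  # trapezoid endpoints
    return h * total / math.factorial(n)

# For Exp(lam), E_n(X) = 1/lam for every order n >= 1.
lam = 2.0
surv_exp = lambda x: math.exp(-lam * x)
for n in (1, 2, 3):
    print(n, round(gcre_numeric(surv_exp, n, upper=40.0), 4))
```

The fact that all orders collapse to the same value 1/λ only for the exponential distribution is what makes such a characterization possible.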
The approximated ISI is based on the maximum entropy density approximation technique, the Edgeworth expansion up to order six, the Laplace integral method, and the generalized Gaussian distribution (GGD). Although the approximated ISI was derived for the noiseless case, it was successfully tested for signal-to-noise ratios (SNR) down to 20 dB.

This work is an extension of our earlier article, in which a well-known integral representation of the logarithmic function was explored and accompanied by demonstrations of its usefulness in obtaining compact, easily calculable, exact formulas for quantities that involve expectations of the logarithm of a positive random variable. Here, in the same spirit, we derive an exact integral representation (in one or two dimensions) of the moment of a nonnegative random variable, or of the sum of such independent random variables, where the moment order is a general positive non-integer real number (also known as a fractional moment). The proposed formula is applied to a variety of examples with an information-theoretic motivation, and it is shown how it facilitates their numerical evaluation. In particular, when applied to the calculation of a moment of the sum of a large number, n, of nonnegative random variables, integration over one or two dimensions, as suggested by our integral representation, is significantly easier than the alternative of integrating over n dimensions, as needed in the direct calculation of the desired moment.

At the battalion level, a NATO ROLE1 medical treatment facility focuses on the provision of primary health care, being the first point at which a physician and more advanced medical equipment intervene in casualty treatment. ROLE1 is of paramount importance in casualty reduction, representing a complex system in current operations.
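The fractional-moment idea described above can be illustrated with the standard identity x^ρ = (ρ/Γ(1-ρ)) ∫ (1 - e^(-ux)) u^(-ρ-1) du (integral over u from 0 to ∞, for 0 < ρ < 1), which reduces E[X^ρ] to a one-dimensional integral of the Laplace transform E[e^(-uX)]; for a sum of independent variables that transform is a product of the individual transforms, which is the point about avoiding n-dimensional integration. A hedged sketch (this is the textbook identity, not necessarily the exact representation of the paper; the substitution u = e^t and the integration limits are illustrative choices):

```python
import math

def frac_moment(laplace, rho, t_lo=-50.0, t_hi=50.0, steps=20_000):
    """E[X^rho] for 0 < rho < 1 via
    E[X^rho] = rho / Gamma(1 - rho) * integral_0^inf (1 - E[e^{-uX}]) u^{-rho-1} du,
    computed with the substitution u = e^t and trapezoidal integration.
    `laplace(u)` must return E[e^{-uX}], the Laplace transform of X."""
    h = (t_hi - t_lo) / steps
    total = 0.0
    for i in range(steps + 1):
        t = t_lo + i * h
        u = math.exp(t)
        f = (1.0 - laplace(u)) * u ** (-rho)  # extra factor u from du = u dt
        total += f if 0 < i < steps else f / 2.0
    return rho / math.gamma(1.0 - rho) * h * total

# Check against Exp(1), whose Laplace transform is 1/(1+u) and whose
# fractional moment is E[X^rho] = Gamma(1 + rho).
rho = 0.5
print(frac_moment(lambda u: 1.0 / (1.0 + u), rho), math.gamma(1.0 + rho))
```

For a sum X1 + ... + Xn of independent nonnegative variables, one would pass `laplace = lambda u: L1(u) * L2(u) * ... * Ln(u)` and still integrate over a single dimension.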
This study deals with an experiment on the optimization of ROLE1 according to three key parameters: the number of physicians, the number of ambulances, and the distance between ROLE1 and the current battlefield. The first step in this study is to design and implement a model of current battlefield casualties. The model uses friction data generated from an already executed computer-assisted exercise (CAX), employing a constructive simulation to produce offense and defense scenarios for the flow of casualties. The next step is to design and implement a model representing transportation to ROLE1, along with its structure and behavior. The deterministic model of ROLE1, employing a system-dynamics simulation paradigm, uses the previously generated casualty flows as inputs, representing human decision-making processes through the recorded CAX events. A factorial experimental design for the ROLE1 model revealed the recommended variants of the ROLE1 structure for both offensive and defensive operations. The overall recommendation is for the internal structure of ROLE1 to have three ambulances and three physicians for any kind of current operation and any distance between ROLE1 and the current battlefield within a limit of 20 min. This study provides novelty in the methodology of casualty estimation involving human decision-making factors, as well as in the optimization of medical treatment processes through experimentation with the process model.

The problem of determining the best achievable performance of arbitrary lossless compression algorithms is examined, when correlated side information is available at both the encoder and decoder. For arbitrary source-side information pairs, the conditional information density is shown to provide a sharp asymptotic lower bound for the description lengths achieved by an arbitrary sequence of compressors.
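The factorial experiment over the three ROLE1 parameters described above can be sketched as a full-factorial grid of runs. The factor levels and the scoring function below are illustrative placeholders, not the study's system-dynamics model or its CAX-derived data:

```python
from itertools import product

# Full-factorial design over the three factors named in the study.
# Levels chosen for illustration only.
physicians = [1, 2, 3]
ambulances = [1, 2, 3]
distance_min = [10, 15, 20]  # travel time ROLE1 <-> battlefield, minutes

def casualty_backlog(p, a, d):
    """Placeholder response variable (smaller is better); stands in for
    the output of the system-dynamics simulation of ROLE1."""
    return 100.0 / (p * a) + 2.0 * d

# One simulated run per factor combination: 3 * 3 * 3 = 27 runs.
design = [(p, a, d, casualty_backlog(p, a, d))
          for p, a, d in product(physicians, ambulances, distance_min)]
best = min(design, key=lambda row: row[3])
print(len(design), "runs; best variant (p, a, d, score):", best)
```

With this toy response, the best variant is the three-physician, three-ambulance configuration at the shortest distance, consistent in shape with the study's recommendation, though the actual conclusion of course rests on the simulation, not this placeholder.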
This implies that for ergodic source-side information pairs, the conditional entropy rate is the best achievable asymptotic lower bound on the rate, not just in expectation but with probability one. Under appropriate mixing conditions, a central limit theorem and a law of the iterated logarithm are proved, describing the inevitable fluctuations of the second-order asymptotically best possible rate. An idealised version of Lempel-Ziv coding with side information is shown to be universally first- and second-order asymptotically optimal under the same conditions. These results are based in part on a new almost-sure invariance principle for the conditional information density, which may be of independent interest.

Solar energy is utilized in a combined ejector refrigeration system with an organic Rankine cycle (ORC) to produce a cooling effect and generate electrical power. This study aims to increase the utilized share of the collected solar thermal energy by inserting an ORC into the system. When the ejector refrigeration cycle reaches its maximum coefficient of performance (COP), the ORC starts working and generating electrical power. This electricity is used to run the circulating pumps and the control system, which makes the system autonomous. For the ejector refrigeration system, the refrigerant R134a is selected as the working fluid for its performance characteristics and environmentally friendly nature. A COP of 0.53 was obtained for the ejector refrigeration cycle. The combined cycle of solar ejector refrigeration and the ORC is modeled in EBSILON Professional. Parameters such as generator temperature and pressure, condenser temperature and pressure, and entrainment ratio are studied, and their effect on the cycle COP is investigated.
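The central object in the compression results above, the conditional information density -log p(x|y), is easiest to see in the memoryless case, where its sample average converges with probability one to the conditional entropy H(X|Y), the best achievable rate. A small sketch (the joint distribution is an illustrative assumption):

```python
import math
import random

# Illustrative joint pmf of a source symbol X in {'a','b'} and side
# information Y in {0,1}, available at both encoder and decoder.
joint = {('a', 0): 0.4, ('a', 1): 0.1, ('b', 0): 0.1, ('b', 1): 0.4}
p_y = {0: 0.5, 1: 0.5}
cond = {xy: p / p_y[xy[1]] for xy, p in joint.items()}  # p(x|y)

# Conditional entropy H(X|Y) in bits: the best achievable rate.
h_xy = -sum(p * math.log2(cond[xy]) for xy, p in joint.items())

# Sample average of the conditional information density -log2 p(x|y)
# over an i.i.d. sequence of (X, Y) pairs converges to H(X|Y).
random.seed(0)
pairs = random.choices(list(joint), weights=joint.values(), k=100_000)
avg_density = sum(-math.log2(cond[xy]) for xy in pairs) / len(pairs)
print(round(h_xy, 4), round(avg_density, 4))
```

The central limit theorem and law of the iterated logarithm mentioned above describe exactly how fast, and with what fluctuations, the sample average approaches H(X|Y).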
Exergy, economic, and exergoeconomic analyses of the hybrid system are carried out to identify the thermodynamic and cost inefficiencies present in various components of the system.

Assessment of the brain dynamics elicited by motor imagery (MI) tasks contributes to clinical and learning applications. In this regard, Event-Related Desynchronization/Synchronization (ERD/S) is computed from electroencephalographic (EEG) signals, which show considerable variations in complexity. We present an entropy-based method, termed VQEnt, for estimating ERD/S using quantized stochastic patterns as a symbolic space, aiming to improve their discriminability and physiological interpretability. The proposed method builds the probabilistic priors by assessing the Gaussian similarity between the input measured data and their reduced vector-quantized representation. Validation results on a bi-class motor imagery task database (left and right hand) show that VQEnt holds symbols that encode several neighboring samples, providing similar or even better accuracy than other baseline sample-based algorithms of entropy estimation. Moreover, the obtained ERD/S time series are close to the trajectories extracted from the variational percentage of EEG signal power and fulfill the physiological MI paradigm. In BCI-literate individuals, the VQEnt estimator presents the most accurate outcomes with a smaller number of electrodes placed on the sensorimotor cortex, so that a reduced channel set directly involved in the MI paradigm is enough to discriminate between tasks, providing accuracy similar to that of the whole electrode set.

Continuous drive friction welding is a solid-state welding process that has been experimentally proven to be fast and reliable. It is a complex process; changes in the viscosity of the material alter the friction between the surfaces of the pieces.
All these dynamics cause changes in the vibration signals, and the interpretation of these signals can reveal important information. The vibration signals generated during the friction and forging stages are measured on the stationary part of the structure to determine the influence of the manipulated variables on time-domain statistical characteristics (root mean square, peak value, crest factor, and kurtosis). In the frequency domain, empirical mode decomposition is used to characterize the frequencies. It was observed that the effects of the manipulated variables on the calculated statistical characteristics can be identified. The results also indicate that the effect of the manipulated variables is stronger on low-frequency signals.

Recent advances in single-molecule science have revealed an astonishing number of details on the microscopic states of molecules, which in turn has created a need for simple, automated processing of numerous time-series data. In particular, large datasets of time series of single protein molecules have been obtained using laser optical tweezers. In this system, each molecular state has a separate time series with a relatively uneven composition from the viewpoint of local descriptive statistics. In the past, the assessment of uncertain data quality and of the heterogeneity of molecular states relied on human experience. Because this processing knowledge is not directly transferable to a black-box framework for efficient classification, the rapid evaluation of a large number of simultaneously measured time-series samples may constitute a serious obstacle. To solve this particular problem, we have implemented a supervised learning method that combines local entropic models with the global Lehmer average. We find that this methodological combination is suitable for fast and simple categorization, enabling rapid pre-processing of the data with minimal optimization and user intervention.
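The four time-domain statistics named in the friction-welding study above (root mean square, peak value, crest factor, and kurtosis) have standard definitions and can be computed as in this sketch; the synthetic sine signal is for illustration only:

```python
import math

def time_domain_features(signal):
    """Root mean square, peak value, crest factor, and kurtosis:
    the four time-domain statistics used to characterize vibration signals."""
    n = len(signal)
    mean = sum(signal) / n
    rms = math.sqrt(sum(x * x for x in signal) / n)
    peak = max(abs(x) for x in signal)
    crest = peak / rms                       # impulsiveness indicator
    m2 = sum((x - mean) ** 2 for x in signal) / n
    m4 = sum((x - mean) ** 4 for x in signal) / n
    kurtosis = m4 / (m2 * m2)                # equals 3 for a Gaussian signal
    return {"rms": rms, "peak": peak, "crest": crest, "kurtosis": kurtosis}

# For a pure sine of amplitude A sampled over full periods:
# RMS = A/sqrt(2), crest factor = sqrt(2), kurtosis = 1.5.
sine = [math.sin(2 * math.pi * i / 1000) for i in range(1000)]
print(time_domain_features(sine))
```

Crest factor and kurtosis are the two features most sensitive to impulsive events, which is why they are commonly tracked alongside RMS in condition monitoring of welding and rotating machinery.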

Article authors: Halludsen4126 (Melton Pihl)