Mcfarlandfuttrup7448


The misfit of this new approximate analytical solution against the exact numerical solution was demonstrated to be smaller than or equal to the misprediction of the original IAPWS G12-15 formulation with respect to experimental values.

In this paper, we first show that the variance used in Markowitz's mean-variance model for portfolio selection, along with its numerous modifications, often does not properly represent the risk of a portfolio. Therefore, we propose an alternative treatment of portfolio risk as a measure of the possibility of earning unacceptably low portfolio profits, together with a simple mathematical formalization of this measure. Similarly, we treat the criterion of maximizing the portfolio's return as a measure of the possibility of obtaining a maximal profit. As a result, we formulate the portfolio selection problem as a bicriteria optimization task. We then study the properties of the developed approach using critical examples of portfolios with interval-valued and fuzzy-valued returns; the α-cut representation of fuzzy returns is used. To validate the proposed method, we compare its results with those obtained using fuzzy versions of seven widely reputed methods for portfolio selection. Since our approach deals with a bicriteria task, the three most popular methods for aggregating local criteria are compared using a known example of a fuzzy portfolio consisting of five assets. It is shown that the results of our approach to interval and fuzzy portfolio selection reflect the essence of this task better than those obtained by widely reputed traditional methods for portfolio selection in the fuzzy setting.

We present a mathematical model of the spread of a disease (say, a virus) that takes into account the hierarchic structure of social clusters in a population. It describes the dependence of the epidemic's dynamics on the strength of barriers between clusters.
These barriers are established by authorities as preventive measures; in part they are based on existing socio-economic conditions. We apply the theory of random walks on energy landscapes represented by ultrametric spaces (having tree-like geometry), a part of statistical physics with applications to spin glasses and protein dynamics. To move from one social cluster (valley) to another, a virus (or its carrier) must cross the social barrier between them. The magnitude of a barrier depends on the number of social hierarchy levels composing it. Infection spreads rather easily inside a social cluster (say, a working collective), but jumps to other clusters are constrained by social barriers. The model implies a power law, 1 − t^(−a), for approaching herd immunity, where the parameter a is proportional to the inverse of the one-step barrier Δ. We consider barriers increasing linearly with the hierarchy, i.e., the m-step barrier Δ_m = mΔ. We also introduce the spreading entropy E, a quantity characterizing the process of infection distribution from one level of the social hierarchy to the nearest lower levels; the parameter a is proportional to E.

In this paper, we present a method by which it is possible to describe a dissipative system (modeled by a linear differential equation) in the Lagrangian formalism, without the trouble of finding a proper way to model the environment. The idea of the method is to construct a function that generates the measurable physical quantity, much as in electrodynamics the scalar and vector potentials generate the electric and magnetic fields. The method is examined in the classical case; the question of quantization remains open.

We propose probability and density forecast combination methods that are defined using the entropy-regularized Wasserstein distance.
First, we provide a theoretical characterization of the combined density forecast based on the regularized Wasserstein distance under the assumption of Gaussian input densities. More specifically, we show that the regularized Wasserstein barycenter of multivariate Gaussian input densities is itself multivariate Gaussian, and we provide a simple way to compute its mean and variance-covariance matrix. Second, we show how this type of regularization can improve the predictive power of the resulting combined density. Third, we provide a method for choosing the tuning parameter that governs the strength of the regularization. Lastly, we apply the proposed method to density forecasting of the U.S. inflation rate and illustrate how the entropy regularization can improve the quality of the predictive density relative to its unregularized counterpart.

Statistical physics determines the abundance of different arrangements of matter depending on cost-benefit balances. Its formalism and phenomenology percolate throughout biological processes and set limits to effective computation. Under specific conditions, self-replicating and computationally complex patterns become favored, yielding life, cognition, and Darwinian evolution. Neurons and neural circuits sit at a crossroads between statistical physics, computation, and (through their role in cognition) natural selection. Can we establish a statistical physics of neural circuits? Such a theory would tell us what kinds of brains to expect under given energetic, evolutionary, and computational conditions. With this big picture in mind, we focus on the fate of duplicated neural circuits. We look at examples from central nervous systems, with stress on computational thresholds that might prompt this redundancy. We also study a naive cost-benefit balance for duplicated circuits implementing complex phenotypes. From this, we derive phase diagrams and (phase-like) transitions between single and duplicated circuits, which constrain evolutionary paths to complex cognition.
Back to the big picture, similar phase diagrams and transitions might constrain the I/O and internal connectivity patterns of neural circuits at large. The formalism of statistical physics seems to be a natural framework for this worthy line of research.

In attributing individual credit for co-authored academic publications, one issue is how to apportion (unequal) credit based on the order of authorship. Apportioning credit for completed joint undertakings has always been a challenge. Academic promotion committees face such tasks regularly when trying to infer a candidate's contribution to an article coauthored with others. We propose a method for achieving this goal in disciplines (such as the author's) where the default order is alphabetical. The credits are those maximizing Shannon entropy subject to order constraints.
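The maximum-entropy apportionment described in this last abstract can be sketched numerically. The following is a minimal illustration, not the paper's actual procedure: it maximizes the Shannon entropy of the credit shares subject to normalization and monotone order constraints (earlier authors receive at least as much credit as later ones). The optional floor on the first author's share is a purely hypothetical extra constraint, standing in for whatever side information might pin the solution away from the uniform split.

```python
import numpy as np
from scipy.optimize import minimize


def max_entropy_credits(n, extra_constraints=()):
    """Credit shares c_1 >= c_2 >= ... >= c_n >= 0, sum = 1,
    chosen to maximize Shannon entropy H(c) = -sum c_i log c_i."""
    def neg_entropy(c):
        c = np.clip(c, 1e-12, None)  # avoid log(0)
        return np.sum(c * np.log(c))

    cons = [{"type": "eq", "fun": lambda c: c.sum() - 1.0}]
    # Order constraints: c_i - c_{i+1} >= 0 (earlier author gets at least as much).
    for i in range(n - 1):
        cons.append({"type": "ineq", "fun": lambda c, i=i: c[i] - c[i + 1]})
    cons.extend(extra_constraints)

    x0 = np.full(n, 1.0 / n)  # start from the uniform split
    res = minimize(neg_entropy, x0, bounds=[(0.0, 1.0)] * n, constraints=cons)
    return res.x


if __name__ == "__main__":
    # With only order constraints, the uniform split is already feasible
    # and entropy-maximal, so it is returned unchanged.
    print(max_entropy_credits(4))

    # Hypothetical side constraint: the first author gets at least half.
    # Entropy then pushes the remaining shares to be as equal as possible.
    floor = {"type": "ineq", "fun": lambda c: c[0] - 0.5}
    print(max_entropy_credits(3, [floor]))
```

With the hypothetical floor, the maximizer keeps the first share at its bound and splits the rest evenly, which is the generic behavior of entropy maximization: as uniform as the constraints allow.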

Article authors: Mcfarlandfuttrup7448 (Woodruff Singer)