Rankinterkelsen5313

From Iurium Wiki

The correct classification of requirements has become an essential task within software engineering. This study compares text feature extraction techniques and machine learning algorithms on the problem of software requirements classification, answering two major questions: "Which works best (Bag of Words (BoW) vs. Term Frequency-Inverse Document Frequency (TF-IDF) vs. Chi Squared (CHI2)) for classifying software requirements into Functional Requirements (FR) and Non-Functional Requirements (NFR), and into the sub-classes of Non-Functional Requirements?" and "Which machine learning algorithm provides the best performance for the requirements classification task?". The data used to perform the research was PROMISE_exp, a recently created dataset that expands the well-known PROMISE repository of labeled software requirements. All documents in the database were cleaned with a set of normalization steps, and the two feature extraction techniques and the feature selection technique used were BoW, TF-IDF, and CHI2, respectively. The algorithms used for classification were Logistic Regression (LR), Support Vector Machine (SVM), Multinomial Naive Bayes (MNB), and k-Nearest Neighbors (kNN). The novelty of our work lies in the data used to perform the experiment, the detailed description of the steps needed to reproduce the classification, and the comparison between BoW, TF-IDF, and CHI2 on this repository, which has not been covered by other studies. This work will serve as a reference for the software engineering community and will help other researchers understand the requirements classification process. We found that TF-IDF followed by LR gave the best results for differentiating requirements, with an F-measure of 0.91 in binary classification (tying with SVM in that case), 0.74 in NFR classification, and 0.78 in general classification.
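As an illustration of the TF-IDF weighting that performed best here, the following is a minimal from-scratch sketch. It is not the study's actual pipeline (which is not specified at this level of detail); library implementations such as scikit-learn's use smoothed idf variants, while this sketch uses the textbook formula idf = ln(N/df). The two example requirement sentences are invented for illustration.

```python
import math
from collections import Counter

def tf_idf(corpus):
    """Compute TF-IDF weights for a tokenized corpus.

    tf = term count normalized by document length;
    idf = ln(N / df), the unsmoothed textbook variant.
    Returns one {term: weight} dict per document.
    """
    n_docs = len(corpus)
    df = Counter()                     # document frequency of each term
    for doc in corpus:
        df.update(set(doc))
    weights = []
    for doc in corpus:
        tf = Counter(doc)
        weights.append({
            term: (count / len(doc)) * math.log(n_docs / df[term])
            for term, count in tf.items()
        })
    return weights

# Toy "requirements" corpus (invented examples, not from PROMISE_exp).
docs = [
    "the system shall encrypt all stored data".split(),
    "the system shall respond within two seconds".split(),
]
w = tf_idf(docs)
# Terms shared by every document ("the", "system", "shall") get idf = ln(1) = 0,
# so only the discriminative terms ("encrypt", "respond", ...) carry weight.
```

The resulting weight vectors are what a classifier such as LR or SVM would consume in place of raw token counts.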
As future work, we intend to compare more algorithms and explore new ways to improve the precision of our models.

The financial performance of football clubs has become an essential element in ensuring the solvency and viability of a club over time. Both theory and practical and regulatory evidence show the need to study financial factors, as well as sporting and corporate factors, to analyze possible income flows and to manage the club's accounts well. Through these factors, the present study analyzes the financial performance of European football clubs using neural networks as the methodology, applying both the popular multilayer perceptron and the novel quantum neural network. The results show that the financial performance of a club is determined by liquidity, leverage, and sporting performance, and that the quantum network is the most accurate variant. These conclusions can be useful for football clubs and interest groups, as well as for regulatory bodies that aim to make the best recommendations and set the best conditions for the football industry.

Generative adversarial networks (GANs), a promising type of deep generative network, have recently drawn considerable attention and made impressive progress. However, GAN models suffer from the well-known problem of mode collapse. This study focuses on this challenge and introduces a new model design, called the encoded multi-agent generative adversarial network (E-MGAN), which tackles the mode collapse problem by introducing variational latent representations learned from a variational auto-encoder (VAE) into a multi-agent GAN. The variational latent representations are extracted from training data to replace the random noise input of general multi-agent GANs. E-MGAN employs multiple generators, which are penalized by a classifier.
This integration guarantees that the proposed model not only enhances the quality of generated samples but also improves their diversity, avoiding the mode collapse problem. Moreover, extensive experiments are conducted on both a synthetic dataset and two large-scale real-world datasets. The generated samples are visualized for qualitative evaluation, and the inception score (IS) and Fréchet inception distance (FID) are adopted for quantitative assessment. The results confirm that the proposed model achieves outstanding performance compared to other state-of-the-art GAN variants.

The electric double layer (EDL) is an important phenomenon that arises in systems where a charged surface comes into contact with an electrolyte solution. In this work we describe a generalization of the classic Poisson-Boltzmann (PB) theory for point-like ions that takes into account the orientational ordering of water molecules. The modified Langevin Poisson-Boltzmann (LPB) model of the EDL is derived by minimizing the corresponding Helmholtz free energy functional, which also includes the orientational entropy contribution of water dipoles. The formation of the EDL is important in many artificial and biological systems bounded by a cylindrical geometry. We therefore numerically solve the modified LPB equation in cylindrical coordinates, determining the spatial dependence of the electric potential, the relative permittivity, and the average orientation of water dipoles within charged tubes of different radii. The results show that for tubes of large radius, the macroscopic (net) volume charge density of coions and counterions is zero at the geometrical axis. This is attributed to effective electrolyte charge screening in the vicinity of the inner charged surface of the tube.
For tubes of small radii, the screening region extends into the whole inner space of the tube, leading to a non-zero net volume charge density and non-zero orientational ordering of water dipoles near the axis.

We discuss the effect of sequential error injection on information leakage under a network code. We formulate a network code for both the single-transmission and the multiple-transmission setting. Under this formulation, we show that the eavesdropper cannot increase its eavesdropping power by sequential error injection when the operations in the network are linear. We demonstrate the usefulness of this reduction theorem by applying it to a concrete network example.

Generating Boolean Functions (BFs) with high nonlinearity is a complex task that is usually addressed through algebraic constructions. Metaheuristics have also been applied extensively to this task; however, they have not been able to attain results as good as those of the algebraic techniques. This paper proposes a novel diversity-aware metaheuristic that is able to excel. The proposal includes the design of a novel cost function that combines several pieces of information from the Walsh-Hadamard Transform (WHT), and a replacement strategy that promotes a gradual change from exploration to exploitation as well as the formation of clusters of solutions, with the aim of allowing intensification steps at each iteration. The combination of high entropy in the population and lower entropy inside clusters allows a proper balance between exploration and exploitation. This is the first memetic algorithm able to generate 10-variable BFs of quality similar to that of algebraic methods. Experimental results and comparisons provide evidence of the high performance of the proposed optimization mechanism for the generation of high-quality BFs.

The paper introduces a new concept of social entropy and a new concept of social order, both based on the normative framework of society.
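The nonlinearity that the Boolean-function study above maximizes is defined through the Walsh-Hadamard Transform as nl(f) = 2^(n-1) - max_a |W_f(a)| / 2. The following is a minimal sketch of computing it from a truth table (the paper's cost function combines more WHT information than this single quantity):

```python
def walsh_hadamard(truth_table):
    """Walsh-Hadamard Transform of a Boolean function given as a
    0/1 truth table of length 2**n, via the fast butterfly."""
    # Work in the signed domain (-1)**f(x).
    spectrum = [1 - 2 * b for b in truth_table]
    h, n = 1, len(spectrum)
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a, b = spectrum[j], spectrum[j + h]
                spectrum[j], spectrum[j + h] = a + b, a - b
        h *= 2
    return spectrum

def nonlinearity(truth_table):
    # nl(f) = 2**(n-1) - max |W_f(a)| / 2
    n_vars = len(truth_table).bit_length() - 1
    return 2 ** (n_vars - 1) - max(abs(w) for w in walsh_hadamard(truth_table)) // 2

# 2-variable examples: AND (bent for n = 2) vs. the linear function x1.
and_table = [0, 0, 0, 1]     # nonlinearity 1, the maximum for n = 2
lin_table = [0, 0, 1, 1]     # linear functions always have nonlinearity 0
```

A metaheuristic for this task evaluates candidates by calling such a routine on each truth table; the fast butterfly keeps the cost at O(n · 2^n) per evaluation, which is what makes 10-variable searches feasible.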
From these two concepts, typologies (logical and historical) of societies are inferred and examined in their basic features. To these ends, some well-known concepts such as entropy, order, system, network, synergy, norm, autopoieticity, fetality, and complexity are revisited and placed into an integrated framework. The core of the paper addresses the structure and mechanism of social entropy, understood as an institutionally working counterpart of social order. Finally, the paper concludes that social entropy is an artefact, like society itself, and acts through people's behavior.

This study assesses the uncertainty of the mean annual runoff (MAR) for quaternary catchments (QCs), considered as metastable nonextensive systems (from Tsallis entropy), in the Middle Vaal catchment. The study is applied to the surface water resources (WR) of the South Africa 1990 (WR90), 2005 (WR2005), and 2012 (WR2012) data sets. The q-information index (from the Tsallis entropy) is used as a deviation indicator for the spatial evolution of uncertainty across the different QCs, with the Shannon entropy as a baseline. It enables the determination of a (virtual) convergence point, zones of positive and negative uncertainty deviation, a zone of null deviation, and a chaotic zone for each data set. Such a determination is not possible on the basis of the Shannon entropy alone as a measure of the MAR uncertainty of QCs, i.e., when they are viewed as extensive systems. Finally, the spatial distributions of the zones of q-uncertainty deviation (gain or loss in information) of the MAR are derived and lead to iso-q-uncertainty-deviation maps.

Although natural and bioinspired computing has developed significantly, the relationship between computational universality and efficiency beyond the Turing machine has not been studied in detail. Here, we investigate how asynchronous updating can contribute to universal and efficient computation in cellular automata (CA).
First, we define computational universality and efficiency in CA and show that there is a trade-off between universality and efficiency in CA with synchronous updating. Second, we introduce asynchronous updating in CA and show that it can break the trade-off found in the synchronous case. Our finding spells out the significance of asynchronous updating, i.e., the timing of computation, for robust and efficient computation.

Closed-form expressions for the expected logarithm and for arbitrary negative integer moments of a noncentral χ²-distributed random variable are presented for both even and odd degrees of freedom. Moreover, some basic properties of these expectations are derived, and tight upper and lower bounds on them are proposed.

Social systems are characterized by an enormous network of connections and factors that can influence the structure and dynamics of these systems. Among them, the whole economic sphere of human activity seems to be the most interrelated and complex. All financial markets, including the youngest one, the cryptocurrency market, belong to this sphere. The complexity of the cryptocurrency market can be studied from different perspectives. First, the dynamics of cryptocurrency exchange rates to other cryptocurrencies and to fiat currencies can be studied and quantified by means of the multifractal formalism. Second, the coupling and decoupling of cryptocurrencies and conventional assets can be investigated with advanced cross-correlation analyses based on fractal analysis. Third, the internal structure of the cryptocurrency market can itself be a subject of analysis that exploits, for example, a network representation of the market. In this work, we approach the subject from all three perspectives, based on data from a recent time interval between January 2019 and June 2020.
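The contrast between synchronous and asynchronous updating discussed in the cellular-automata study above can be illustrated with an elementary CA. This is a minimal sketch using rule 110 and sequential left-to-right updating as one simple asynchronous scheme; the paper's own update schemes may differ.

```python
def rule110(left, center, right):
    # Elementary CA rule 110 lookup table.
    return {(1, 1, 1): 0, (1, 1, 0): 1, (1, 0, 1): 1, (1, 0, 0): 0,
            (0, 1, 1): 1, (0, 1, 0): 1, (0, 0, 1): 1, (0, 0, 0): 0}[(left, center, right)]

def sync_step(cells):
    """Synchronous updating: every cell reads the OLD configuration."""
    n = len(cells)
    return [rule110(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])
            for i in range(n)]

def async_step(cells):
    """Sequential (left-to-right) asynchronous updating: each cell
    immediately sees its left neighbour's already-updated state."""
    cells = cells[:]
    n = len(cells)
    for i in range(n):
        cells[i] = rule110(cells[(i - 1) % n], cells[i], cells[(i + 1) % n])
    return cells

start = [0, 1, 1, 0, 0, 0, 0]
# The two schemes diverge after a single step from the same configuration:
# sync_step(start)  -> [1, 1, 1, 0, 0, 0, 0]
# async_step(start) -> [1, 0, 1, 0, 0, 0, 1]
```

Even one step shows that the update timing changes the trajectory, which is the degree of freedom the study exploits to escape the universality-efficiency trade-off of the synchronous case.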

Article authors: Rankinterkelsen5313 (Berthelsen Woodruff)