Regression analysis makes up a large part of supervised machine learning and consists of predicting a continuous target variable from a set of predictor variables. The difference between binary classification and regression lies in the target range: in binary classification the target can take only two values (usually encoded as 0 and 1), while in regression the target can take many values. Although regression analysis has been employed in a huge number of machine learning studies, no consensus has been reached on a single, unified, standard metric to assess the results of the regression itself. Many studies employ the mean square error (MSE) and its rooted variant (RMSE), or the mean absolute error (MAE) and its percentage variant (MAPE). Although useful, these rates share a common drawback: since their values can range between zero and +infinity, a single value does not say much about the performance of the regression with respect to the distribution of the ground truth elements. In this study, we focus on two rates that generate a high score only if the majority of the elements of a ground truth group have been correctly predicted: the coefficient of determination (also known as R-squared or R²) and the symmetric mean absolute percentage error (SMAPE). After showing their mathematical properties, we report a comparison between R² and SMAPE in several use cases and in two real medical scenarios. Our results demonstrate that the coefficient of determination is more informative and truthful than SMAPE, and does not have the interpretability limitations of MSE, RMSE, MAE, and MAPE. We therefore suggest the use of R-squared as the standard metric to evaluate regression analyses in any scientific domain.
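For concreteness, here is a minimal sketch of how the two rates can be computed, assuming NumPy arrays of ground-truth and predicted values; the function names, toy data, and the particular SMAPE variant (no factor of 2 in the denominator, so it ranges from 0 to 1) are illustrative choices, not the paper's reference implementation.

    import numpy as np

    def r_squared(y_true, y_pred):
        # Coefficient of determination: 1 - SS_res / SS_tot.
        ss_res = np.sum((y_true - y_pred) ** 2)
        ss_tot = np.sum((y_true - np.mean(y_true)) ** 2)
        return 1.0 - ss_res / ss_tot

    def smape(y_true, y_pred):
        # Symmetric mean absolute percentage error; multiply by 100 for a percentage.
        return np.mean(np.abs(y_pred - y_true) / (np.abs(y_true) + np.abs(y_pred)))

    y_true = np.array([3.0, 5.0, 7.5, 10.0])
    y_pred = np.array([2.8, 5.4, 7.0, 11.0])
    print(r_squared(y_true, y_pred))  # roughly 0.95 for this toy data
    print(smape(y_true, y_pred))      # roughly 0.04 for this toy data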
Understanding the concept of simple interest is essential in financial mathematics because it establishes the basis for comprehending more complex concepts. Nevertheless, students often have problems learning about simple interest. This paper introduces a prototype called "simple interest computation with mobile augmented reality" (SICMAR) and evaluates its effects on students in a financial mathematics course. The research design comprises four stages: (i) planning; (ii) hypotheses development; (iii) software development; and (iv) design of data collection instruments. The planning stage explains the problems students confront when learning about simple interest. In the second stage, we present the twelve hypotheses tested in the study. The software development stage discusses the logic implemented for SICMAR functionality. In the last stage, we design two surveys and two practice tests to assess students. The pre-test survey uses the attention, relevance, confidence, and satisfaction (ARCS) model to assess students' motivation in a traditional learning setting. The post-test survey assesses motivation, technology usage with the technology acceptance model (TAM), and prototype quality when students use SICMAR. In addition, students solve practice exercises to assess their achievement. One hundred and three undergraduates participated in both sessions of the study. The findings revealed the direct positive impact of SICMAR on students' achievement and motivation. Moreover, students expressed their interest in using the prototype because of its quality. In summary, students consider SICMAR a valuable complementary tool for learning simple interest topics.

A global path planning algorithm for unmanned surface vehicles (USVs) with short time requirements in large-scale and complex multi-island marine environments is proposed. Fast-marching-method-based path planning for USVs is performed on grid maps, which leads to reduced computational efficiency on larger maps. This can be mitigated by improving the algorithm workflow. In the proposed algorithm, path planning is performed twice, on grid maps with different spatial resolutions (SR). The first path planning is performed on a low-SR grid map to determine effective regions, and the second is executed on a high-SR grid map to rapidly acquire the final high-precision global path. In each path planning process, a modified inshore-distance-constraint fast marching square (IDC-FM2) method is applied. Based on this method, the path portions around an obstacle can be constrained within a region determined by two inshore-distance parameters. The path planning results show that the proposed algorithm can generate smooth and safe global paths in which the portions that bypass obstacles can be flexibly modified. Compared with path planning based on the IDC-FM2 method applied to a single grid map, this algorithm significantly improves calculation efficiency while maintaining the precision of the planned path.
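The two-pass, coarse-to-fine idea can be illustrated with a small sketch. Note that it is only a stand-in: plain Dijkstra search on a binary occupancy grid replaces the fast marching square machinery, and the factor, margin, and corridor construction are illustrative assumptions rather than the IDC-FM2 method itself.

    import heapq
    import numpy as np

    def dijkstra_path(grid, start, goal):
        # Shortest 4-connected path on a boolean grid (True = navigable).
        # No failure handling: assumes the goal is reachable from the start.
        rows, cols = grid.shape
        dist, prev, pq = {start: 0.0}, {}, [(0.0, start)]
        while pq:
            d, cell = heapq.heappop(pq)
            if cell == goal:
                break
            if d > dist.get(cell, float("inf")):
                continue
            r, c = cell
            for dr, dc in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                nxt = (r + dr, c + dc)
                if 0 <= nxt[0] < rows and 0 <= nxt[1] < cols and grid[nxt]:
                    nd = d + 1.0
                    if nd < dist.get(nxt, float("inf")):
                        dist[nxt], prev[nxt] = nd, cell
                        heapq.heappush(pq, (nd, nxt))
        path, cell = [goal], goal
        while cell != start:
            cell = prev[cell]
            path.append(cell)
        return path[::-1]

    def coarse_to_fine_path(fine_grid, start, goal, factor=4, margin=2):
        # Pass 1: plan on a low-resolution grid to find an effective region.
        # Pass 2: replan on the full-resolution grid restricted to that region.
        rows, cols = fine_grid.shape
        r_crop, c_crop = rows - rows % factor, cols - cols % factor
        # A coarse cell is navigable only if every fine cell inside it is.
        coarse = fine_grid[:r_crop, :c_crop].reshape(
            r_crop // factor, factor, c_crop // factor, factor).all(axis=(1, 3))
        coarse_path = dijkstra_path(coarse,
                                    (start[0] // factor, start[1] // factor),
                                    (goal[0] // factor, goal[1] // factor))
        # Effective region: fine cells within `margin` coarse cells of the coarse path.
        corridor = np.zeros_like(fine_grid, dtype=bool)
        for cr, cc in coarse_path:
            r0, c0 = max(cr - margin, 0) * factor, max(cc - margin, 0) * factor
            corridor[r0:(cr + 1 + margin) * factor, c0:(cc + 1 + margin) * factor] = True
        return dijkstra_path(fine_grid & corridor, start, goal)

    # Toy map: a 40x40 free grid with one rectangular obstacle.
    grid = np.ones((40, 40), dtype=bool)
    grid[10:30, 18:22] = False
    path = coarse_to_fine_path(grid, (0, 0), (39, 39))
    print(len(path), path[0], path[-1])

The corridor keeps the second search small while forcing the fine path to stay close to the coarse one, which mirrors the efficiency argument made for the two-resolution planning above.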
Energy is at the basis of any social or economic development. Fossil energy is the most used energy source in the world due to the low construction cost of its power plants; in 2017, fossil fuels generated 64.5% of the world's electricity. Since, on the one hand, these plants produce large amounts of carbon dioxide, which drives climate change, and, on the other hand, the stock of the world's fossil resources is continuously decreasing, safer and more readily available energy sources should be considered. Hence, for human well-being and for a green environment, these fossil plants should be replaced by cleaner ones. Renewable energy resources have begun to be used as alternatives. These resources have many advantages, such as sustainability and environmental protection. Nevertheless, they require higher investment costs, and the reliability of many installed systems is poor. In most cases these systems are not sufficient to ensure a continuous supply of energy for all in needy regions, because most of their resources are climate dependent. The main contributions of this research are (i) to propose a natural formalisation of the renewable energy distribution problem, based on COP (Constraint Optimisation Problem), that takes into consideration all the constraints related to this problem; and (ii) to propose a novel dynamic multi-agent system (A-RESS, for Agent-based Renewable Energy Sharing System) to solve this problem. The proposed system was implemented, and the obtained results show its efficiency and performance in terms of produced, consumed, and lost energy.

Dynamic and flexible systems offer huge advantages for businesses in addressing dynamic, uncertain factors and implementing dynamic business processes (DBP). However, DBP remains a challenge from the perspectives of modeling, simulation, and implementation because of a nontrivial understanding of "What is a dynamic business process?" A variety of approaches for DBP modeling and implementation have been proposed over the past years, yet few comprehensive studies have been reported that analyze DBP from different particular perspectives (e.g., business process (BP) variability, aspect-oriented BP, service compositions, etc.) and address the research questions that lay the foundation for the development of a meaning of a DBP. The motivation behind this review is to examine the meaning of DBP from a global perspective and, consequently, answer the previously presented research question. Therefore, in this paper, we present a systematic literature review (SLR) comprising 67 papers from five respective digital libraries, which serves as a useful resource for future DBP studies and practice. Moreover, we expect that our results could inspire researchers and practitioners towards further work aimed at advancing the field of DBP modeling and implementation.

Medical imaging refers to visualization techniques that provide valuable information about the internal structures of the human body for clinical applications, diagnosis, treatment, and scientific research. Segmentation is one of the primary methods for analyzing and processing medical images; it helps doctors diagnose accurately by providing detailed information on the required part of the body. However, segmenting medical images faces several challenges, such as requiring trained medical experts and being time-consuming and error-prone. Thus, an automatic medical image segmentation system appears necessary. Deep learning algorithms have recently shown outstanding performance for segmentation tasks, especially semantic segmentation networks that provide pixel-level image understanding. Since the introduction of the first fully convolutional network (FCN) for semantic image segmentation, several segmentation networks have been proposed on its basis. One of the state-of-the-art convolutional networks in the medical im[…]ks and 3,205 test images. Our proposed segmentation network achieves a 0.8608 mean Dice similarity coefficient (DSC) on the test set, which places it among the top one percent of systems in the Kaggle competition.
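Because performance above is reported as a mean Dice similarity coefficient, a short sketch of how DSC is commonly computed for a pair of binary masks may help; the smoothing term eps and the toy masks are illustrative.

    import numpy as np

    def dice_coefficient(pred_mask, true_mask, eps=1e-7):
        # DSC = 2|A ∩ B| / (|A| + |B|); eps avoids division by zero on empty masks.
        pred = pred_mask.astype(bool)
        true = true_mask.astype(bool)
        intersection = np.logical_and(pred, true).sum()
        return (2.0 * intersection + eps) / (pred.sum() + true.sum() + eps)

    # Toy 4x4 masks: overlap of 3 pixels, 4 pixels each -> DSC = 6/8 = 0.75.
    pred = np.array([[1, 1, 0, 0], [1, 0, 0, 0], [0, 0, 0, 1], [0, 0, 0, 0]])
    true = np.array([[1, 1, 0, 0], [1, 0, 0, 0], [0, 0, 1, 0], [0, 0, 0, 0]])
    print(dice_coefficient(pred, true))

A mean DSC such as the 0.8608 reported above is typically this per-image score averaged over the test set.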
Scientific Workflows (SWfs) have revolutionized how scientists in various domains conduct their experiments. The management of SWfs is performed by complex tools that provide support for workflow composition, monitoring, execution, and the capture and storage of the data generated during execution. In some cases, they also provide components to ease the visualization and analysis of the generated data. During the workflow composition phase, programs must be selected to perform the activities defined in the workflow specification. These programs often require additional parameters that serve to adjust their behavior according to the experiment's goals. Consequently, workflows commonly have many parameters to be manually configured, in many cases more than one hundred. Choosing wrong parameter values can lead to crashed workflow executions or undesired results. As the execution of data- and compute-intensive workflows is commonly performed in a high-performance computing environment (e.g., a cluster, a supercomputer, or a public cloud), an unsuccessful execution amounts to a waste of time and resources. In this article, we present FReeP (Feature Recommender from Preferences), a parameter value recommendation method designed to suggest values for workflow parameters while taking past user preferences into account. FReeP is based on Machine Learning techniques, particularly Preference Learning. FReeP comprises three algorithms: two of them recommend the value for one parameter at a time, and the third makes recommendations for n parameters at once. The experimental results obtained with provenance data from two broadly used workflows showed FReeP's usefulness in recommending values for a single parameter. Furthermore, the results indicate the potential of FReeP to recommend values for n parameters in scientific workflows.
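As a rough illustration of preference-aware parameter recommendation (not FReeP's actual preference-learning algorithms), the sketch below filters past provenance records by the user's stated preferences and suggests the most frequent value of the target parameter; all parameter names and values are hypothetical.

    from collections import Counter

    def recommend_parameter(provenance, preferences, target_param):
        # Keep past runs that agree with the user's preferences;
        # fall back to all runs when nothing matches.
        matching = [run for run in provenance
                    if all(run.get(k) == v for k, v in preferences.items())]
        candidates = matching or provenance
        values = [run[target_param] for run in candidates if target_param in run]
        return Counter(values).most_common(1)[0][0] if values else None

    # Hypothetical provenance of past workflow executions (parameter -> value).
    provenance = [
        {"aligner": "bwa", "threads": 8, "quality_cutoff": 20},
        {"aligner": "bwa", "threads": 16, "quality_cutoff": 20},
        {"aligner": "bowtie2", "threads": 8, "quality_cutoff": 30},
    ]
    print(recommend_parameter(provenance, {"aligner": "bwa"}, "quality_cutoff"))  # -> 20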
After many years of research on software repositories, the knowledge needed to build mature, reusable tools that perform data retrieval, storage, and basic analytics is readily available. However, there is still room for improvement in the area of reusable tools that implement this knowledge.
Our objective is to produce a reusable toolset supporting the most common tasks when retrieving, curating, and visualizing data from software repositories, allowing for the easy reproduction of data sets ready for more complex analytics, and sparing the researcher or the analyst most of the tasks that can be automated.
Our method is to use our experience in building tools in this domain to identify a collection of scenarios where a reusable toolset would be convenient, together with the main components of such a toolset; we then build those components and refine them incrementally using feedback from their use in commercial, community-based, and academic environments.
The result is GrimoireLab, an efficient toolset composed of five main components and supporting about 30 different kinds of data sources related to software development.
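As an example of the kind of reuse the toolset targets, the sketch below retrieves commits from a Git repository with Perceval, GrimoireLab's data-retrieval component, following its published Python API; the repository URL and local clone path are placeholders to adapt to your own setup.

    from perceval.backends.core.git import Git

    # Placeholder repository URL and local clone path.
    repo_url = 'https://github.com/chaoss/grimoirelab-perceval'
    repo_dir = '/tmp/perceval.git'

    # Each fetched item is a dictionary holding the raw commit data plus metadata.
    repo = Git(uri=repo_url, gitpath=repo_dir)
    for commit in repo.fetch():
        print(commit['data']['commit'], commit['data']['Author'])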