
Search results for the phrase "machine learning algorithm" by criterion: Subject


Title:
Improving Crop Yield Predictions in Morocco Using Machine Learning Algorithms
Authors:
Ed-Daoudi, Rachid
Alaoui, Altaf
Ettaki, Badia
Zerouaoui, Jamal
Links:
https://bibliotekanauki.pl/articles/24202898.pdf
Publication date:
2023
Publisher:
Polskie Towarzystwo Inżynierii Ekologicznej
Subjects:
crop yield prediction
machine learning algorithm
statistical model
model evaluation
Description:
In Morocco, agriculture is an important sector that contributes to the country's economy and food security. Accurately predicting crop yields is crucial for farmers, policymakers, and other stakeholders to make informed decisions regarding resource allocation and food security. This paper investigates the potential of machine learning algorithms for improving the accuracy of crop yield predictions in Morocco. The study examines various factors that affect crop yields, including weather patterns, soil moisture levels, and rainfall, and how these factors can be incorporated into machine learning models. The performance of different algorithms, including decision trees, random forests, and neural networks, is evaluated and compared against traditional statistical models used for crop prediction. The study demonstrated that the machine learning algorithms outperformed the statistical models in predicting crop yields. Specifically, the machine learning algorithms achieved mean squared error values between 0.10 and 0.23 and coefficient of determination values ranging from 0.78 to 0.90, while the statistical models had mean squared error values ranging from 0.16 to 0.24 and coefficient of determination values ranging from 0.76 to 0.84. The feed-forward artificial neural network had the lowest mean squared error (0.10) and the highest R² value (0.90), indicating that it performed best among the three machine learning algorithms. These results suggest that machine learning algorithms can significantly improve the accuracy of crop yield predictions in Morocco, potentially leading to improved food security and optimized resource allocation for farmers.
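The error metrics quoted in this abstract are standard and easy to reproduce. A minimal stdlib sketch of MSE and R²; the yield values below are invented for illustration, not taken from the paper:

```python
def mean_squared_error(y_true, y_pred):
    """Average of squared residuals; lower is better."""
    return sum((t - p) ** 2 for t, p in zip(y_true, y_pred)) / len(y_true)

def r2_score(y_true, y_pred):
    """Coefficient of determination: 1 - SS_res / SS_tot."""
    mean = sum(y_true) / len(y_true)
    ss_res = sum((t - p) ** 2 for t, p in zip(y_true, y_pred))
    ss_tot = sum((t - mean) ** 2 for t in y_true)
    return 1 - ss_res / ss_tot

# Illustrative yields (t/ha): observed vs. model output
obs = [2.1, 1.8, 2.5, 3.0, 2.2]
pred = [2.0, 1.9, 2.4, 2.8, 2.3]
print(round(mean_squared_error(obs, pred), 3))  # → 0.016
print(round(r2_score(obs, pred), 3))            # → 0.903
```

An R² of 0.90 means the model explains 90% of the variance in observed yields, matching the scale of the figures reported above.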
Source:
Journal of Ecological Engineering; 2023, 24, 6; 392-400
2299-8993
Appears in:
Journal of Ecological Engineering
Content provider:
Biblioteka Nauki
Article
Title:
Weighted accuracy algorithmic approach in counteracting fake news and disinformation
Algorytmiczne podejście do dokładności ważonej w przeciwdziałaniu fałszywym informacjom i dezinformacji
Authors:
Bonsu, K.O.
Links:
https://bibliotekanauki.pl/articles/2048986.pdf
Publication date:
2021
Publisher:
Akademia Bialska Nauk Stosowanych im. Jana Pawła II w Białej Podlaskiej
Subjects:
artificial intelligence
natural language processing
machine learning algorithm
disinformation
digital revolution
fake news
Description:
Subject and purpose of work: Fake news and disinformation are polluting the information environment. This paper therefore proposes a methodology for fake news detection based on the combined weighted accuracies of seven machine learning algorithms. Materials and methods: The paper uses natural language processing to analyze the text content of a sample of news items and predicts whether each is FAKE or REAL. Results: The weighted accuracy algorithmic approach was shown to reduce overfitting. The individual performance of the different algorithms improved after the data was extracted from the news outlet websites and 'quality' data was filtered by the constraint mechanism developed in the experiment. Conclusions: This model differs from existing mechanisms in that it automates the algorithm selection process while taking into account the performance of all the algorithms used, including the weaker ones, thereby increasing the mean accuracy across the algorithms.
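The paper's exact combination rule is not reproduced here, but the general idea of weighting each classifier's vote by its measured accuracy can be sketched in a few lines; the seven classifiers, their votes, and their accuracies below are invented for illustration:

```python
def weighted_vote(votes, accuracies):
    """Combine per-classifier labels ('FAKE'/'REAL'), weighting each
    vote by that classifier's validation accuracy."""
    scores = {}
    for label, acc in zip(votes, accuracies):
        scores[label] = scores.get(label, 0.0) + acc
    return max(scores, key=scores.get)

# Three of seven hypothetical classifiers say REAL, four say FAKE,
# but the REAL voters are more accurate on validation data.
votes = ["REAL", "REAL", "REAL", "FAKE", "FAKE", "FAKE", "FAKE"]
accs = [0.95, 0.93, 0.91, 0.60, 0.58, 0.55, 0.52]
print(weighted_vote(votes, accs))  # REAL (2.79) outweighs FAKE (2.25)
```

The point of the weighting is visible in the example: a simple majority vote would return FAKE, while the accuracy-weighted vote favours the more reliable minority.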
Source:
Economic and Regional Studies; 2021, 14, 1; 99-107
2083-3725
2451-182X
Appears in:
Economic and Regional Studies
Content provider:
Biblioteka Nauki
Article
Title:
The Effect of Dual Hyperparameter Optimization on Software Vulnerability Prediction Models
Authors:
Bassi, Deepali
Singh, Hardeep
Links:
https://bibliotekanauki.pl/articles/2203949.pdf
Publication date:
2023
Publisher:
Politechnika Wrocławska. Oficyna Wydawnicza Politechniki Wrocławskiej
Subjects:
software vulnerability
hyperparameter optimization
machine learning algorithm
data balancing techniques
data complexity measures
Description:
Background: Prediction of software vulnerabilities is a major concern in the field of software security, and many researchers have worked to construct software vulnerability prediction (SVP) models. The emerging machine learning domain aids in building effective SVP models, and the employment of data balancing/resampling techniques and optimal hyperparameters can further improve their performance. Previous studies have shown the impact of hyperparameter optimization (HPO) on machine learning algorithms and data balancing techniques. Aim: The current study analyzes the impact of dual hyperparameter optimization on metrics-based SVP models. Method: The paper proposes a methodology using the Python framework Optuna to optimize the hyperparameters of both the machine learners and the data balancing techniques. For the experiments, we compared six combinations of five machine learners and five resampling techniques, considering both default parameters and optimized hyperparameters. Results: The Wilcoxon signed-rank test with the Bonferroni correction was applied, and it was observed that dual HPO performs better than HPO on the learners alone or HPO on the data balancers alone. Furthermore, the paper assesses the impact of data complexity measures and concludes that HPO does not improve performance on datasets that exhibit high overlap. Conclusion: The experimental analysis reveals that dual HPO is 64% effective in enhancing the performance of SVP models.
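The study uses Optuna to search learner and balancer hyperparameters jointly. As a library-free illustration of that "dual" idea — one search loop suggesting both kinds of hyperparameters at once — here is a random-search sketch over a toy objective; the hyperparameter names, ranges, and objective function are invented, not the paper's:

```python
import random

def evaluate(model_depth, resample_ratio):
    """Toy stand-in for a cross-validated SVP score; it peaks at
    depth 8 and ratio 0.5 purely for illustration."""
    return 1.0 - abs(model_depth - 8) / 20 - abs(resample_ratio - 0.5)

def dual_search(n_trials=200, seed=0):
    """Jointly sample a learner hyperparameter and a data-balancer
    hyperparameter in every trial, keeping the best pair found."""
    rng = random.Random(seed)
    best = (-1.0, None)
    for _ in range(n_trials):
        depth = rng.randint(1, 20)      # learner hyperparameter
        ratio = rng.uniform(0.1, 1.0)   # data-balancer hyperparameter
        score = evaluate(depth, ratio)
        if score > best[0]:
            best = (score, (depth, ratio))
    return best

score, (depth, ratio) = dual_search()
print(score, depth, ratio)
```

In the paper's setup, each trial would instead train a resampler-plus-learner pipeline and return its cross-validated score; Optuna's sampler then replaces the uniform random draws with an adaptive search.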
Source:
e-Informatica Software Engineering Journal; 2023, 17, 1; art. no. 230102
1897-7979
Appears in:
e-Informatica Software Engineering Journal
Content provider:
Biblioteka Nauki
Article
Title:
Machine learning methods applied to sea level predictions in the upper part of a tidal estuary
Authors:
Guillou, N.
Chapalain, G.
Links:
https://bibliotekanauki.pl/articles/2078822.pdf
Publication date:
2021
Publisher:
Polska Akademia Nauk. Instytut Oceanologii PAN
Subjects:
multiple regression model
artificial neural network
multilayer perceptron
regression function
machine learning algorithm
sea level
Description:
Sea level variations in the upper part of an estuary are traditionally approached with refined numerical simulations at high computational cost. As an efficient and rapid alternative, we assessed here the performance of two types of machine learning algorithms: (i) multiple regression methods based on linear and polynomial regression functions, and (ii) an artificial neural network, the multilayer perceptron. These algorithms were applied to three years of observations of sea level maxima during high tides in the city of Landerneau, in the upper part of the Elorn estuary (western Brittany, France). Four input variables were considered in relation to tidal and coastal surge effects on sea level: the French tidal coefficient, atmospheric pressure, wind velocity, and river discharge. Although some of these input variables derive from large-scale models with coarse spatial resolution, the different algorithms showed good performance in this local environment, capturing sea level temporal variations at semi-diurnal and spring-neap time scales. The predictions furthermore improved the assessment of inundation events, which had so far relied on observations or numerical simulations in the downstream part of the estuary. Finally, the results exhibited the weak influence of wind and river discharge on inundation events.
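As a flavour of the simplest of the regression methods mentioned, here is a one-predictor least-squares fit; the tidal-coefficient/sea-level pairs are invented for illustration, not the Landerneau observations:

```python
def fit_linear(xs, ys):
    """Ordinary least squares for y = a*x + b (single predictor)."""
    n = len(xs)
    mx = sum(xs) / n
    my = sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    var = sum((x - mx) ** 2 for x in xs)
    a = cov / var
    return a, my - a * mx

# Hypothetical pairs: French tidal coefficient vs. high-tide level (m)
coef = [45, 60, 75, 90, 105]
level = [3.1, 3.6, 4.0, 4.5, 5.0]
a, b = fit_linear(coef, level)
print(round(a, 4), round(b, 3))
```

The paper's multiple regression extends this to the four predictors jointly (and to polynomial terms), and the multilayer perceptron replaces the linear form with a learned nonlinear one.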
Source:
Oceanologia; 2021, 63, 4; 531-544
0078-3234
Appears in:
Oceanologia
Content provider:
Biblioteka Nauki
Article
Title:
CellProfiler and WEKA Tools: Image Analysis for Fish Erythrocytes Shape and Machine Learning Model Algorithm Accuracy Prediction of Dataset
Authors:
Talapatra, Soumendra Nath
Chaudhuri, Rupa
Ghosh, Subhasis
Links:
https://bibliotekanauki.pl/articles/1193348.pdf
Publication date:
2021
Publisher:
Przedsiębiorstwo Wydawnictw Naukowych Darwin / Scientific Publishing House DARWIN
Subjects:
Automatic image analysis
CellProfiler tool
Fish erythrocytes quantification
Machine learning algorithm
Model classifier accuracy
Shapes measurement
WEKA tool
Description:
The first part of the study detected the number of cells and measured the shape of cells, cytoplasm, and nuclei in an image of Giemsa-stained fish peripheral erythrocytes using CellProfiler (CP, version 2.1.0), an image analysis tool. The second part evaluated machine learning (ML) algorithm models, viz. BayesNet (BN), NaiveBayes (NB), logistic regression (LR), Lazy.KStar (K*), decision tree (DT) J48, Random Forest (RF), and Random Tree (RT), in the WEKA tool (version 3.8.5) to predict the classification accuracy on the dataset generated from the image. CP reports the counts and individual cellular area shapes (arbitrary units) of cells, cytoplasm, and nuclei as primary, secondary, and tertiary object data in an image. In terms of correctly and incorrectly classified instances on the training and testing sets, the highest accuracies were observed for RF and RT, followed by K*, LR, BN, and DT J48, with the lowest for NB. For the K value, the highest values were likewise observed for RF and RT, followed by K*, LR, BN, and DT J48, with the lowest for NB, while the lowest mean absolute error (MAE) and root mean squared error (RMSE) were obtained for RT, followed by RF, K*, LR, BN, and DT J48, with NB showing comparatively the highest values on both training and testing sets. In conclusion, both tools performed well, from image to dataset and from dataset to information through ML modelling, and future studies with the WEKA tool can readily analyse large biological datasets to predict classifier accuracy.
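WEKA's headline figures — accuracy over correctly/incorrectly classified instances, and the K value (likely the kappa statistic WEKA reports, which corrects agreement for chance) — can be sketched in a few lines; the labels below are invented, not the erythrocyte dataset:

```python
def accuracy(y_true, y_pred):
    """Fraction of correctly classified instances."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)

def cohens_kappa(y_true, y_pred):
    """Agreement corrected for chance: (p_o - p_e) / (1 - p_e)."""
    n = len(y_true)
    labels = set(y_true) | set(y_pred)
    p_o = accuracy(y_true, y_pred)
    p_e = sum((y_true.count(l) / n) * (y_pred.count(l) / n) for l in labels)
    return (p_o - p_e) / (1 - p_e)

y_true = ["cell", "cell", "nucleus", "nucleus", "cell", "nucleus"]
y_pred = ["cell", "cell", "nucleus", "cell", "cell", "nucleus"]
print(accuracy(y_true, y_pred), round(cohens_kappa(y_true, y_pred), 3))
```

Kappa is lower than raw accuracy whenever some of the agreement could have arisen by chance, which is why WEKA reports both.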
Source:
World Scientific News; 2021, 154; 101-116
2392-2192
Appears in:
World Scientific News
Content provider:
Biblioteka Nauki
Article
Title:
A comprehensive study on the application of firefly algorithm in prediction of energy dissipation on block ramps
Authors:
Mahdavi-Meymand, Amin
Sulisz, Wojciech
Zounemat-Kermani, Mohammad
Links:
https://bibliotekanauki.pl/articles/2087026.pdf
Publication date:
2022
Publisher:
Polska Akademia Nauk. Polskie Naukowo-Techniczne Towarzystwo Eksploatacyjne PAN
Subjects:
firefly algorithm
machine learning
energy dissipation
block ramp
Description:
In this study, novel integrative machine learning models embedded with the firefly algorithm (FA) were developed and employed to predict energy dissipation on block ramps. The models used include the multi-layer perceptron neural network (MLPNN), adaptive neuro-fuzzy inference system (ANFIS), group method of data handling (GMDH), support vector regression (SVR), a linear equation (LE), and a nonlinear regression equation (NE). The investigation focused on evaluating the performance of the standard and integrative models over different runs. The machine learning models and the nonlinear equation perform better than the linear equation, and the results show that FA increases the performance of all applied models. Moreover, ANFIS-FA is the most stable integrative model among the embedded methods, while GMDH and SVR are the most stable techniques among all applied models. The accuracy of the LE-FA technique is relatively low (RMSE = 0.091); SVR-FA provides the most accurate results (RMSE = 0.034).
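A minimal version of the firefly algorithm itself, here minimizing a toy sphere function rather than tuning any of the paper's models; the population size, attraction coefficients, and search bounds are arbitrary choices for this sketch:

```python
import math
import random

def firefly_minimize(f, dim=2, n=15, iters=100, seed=1,
                     alpha=0.1, beta0=1.0, gamma=0.01):
    """Minimal firefly algorithm: dimmer fireflies move toward
    brighter (lower-cost) ones; attractiveness decays with distance."""
    rng = random.Random(seed)
    pop = [[rng.uniform(-5, 5) for _ in range(dim)] for _ in range(n)]
    cost = [f(x) for x in pop]
    for _ in range(iters):
        for i in range(n):
            for j in range(n):
                if cost[j] < cost[i]:  # j is brighter: move i toward j
                    r2 = sum((a - b) ** 2 for a, b in zip(pop[i], pop[j]))
                    beta = beta0 * math.exp(-gamma * r2)
                    pop[i] = [a + beta * (b - a) + alpha * (rng.random() - 0.5)
                              for a, b in zip(pop[i], pop[j])]
                    cost[i] = f(pop[i])
    k = min(range(n), key=cost.__getitem__)
    return pop[k], cost[k]

sphere = lambda x: sum(v * v for v in x)
best_x, best_cost = firefly_minimize(sphere)
print(best_x, best_cost)
```

In the paper's setting, `f` would instead be the prediction error of an MLPNN, ANFIS, GMDH, SVR, LE, or NE model as a function of its trainable parameters, which is what "embedding" FA in those models amounts to.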
Source:
Eksploatacja i Niezawodność; 2022, 24, 2; 200-210
1507-2711
Appears in:
Eksploatacja i Niezawodność
Content provider:
Biblioteka Nauki
Article
Title:
Man-algorithm Cooperation Intelligent Design of Clothing Products in Multi Links
Authors:
Han, Chen
Lei, Shen
Shaogeng, Zhang
Mingming, Wang
Ying, Tang
Links:
https://bibliotekanauki.pl/articles/2056307.pdf
Publication date:
2022
Publisher:
Sieć Badawcza Łukasiewicz - Instytut Biopolimerów i Włókien Chemicznych
Subjects:
intelligent algorithm
cooperation
clothing design
machine learning
efficiency
Description:
Changes in technology have led to synchronous changes in clothing design methods, as well as in the media and artistic aesthetics of the same period. Intelligent algorithms are playing an ever larger part in development and production in the clothing industry. In this study, a variety of intelligent algorithms, including a parameterised number state algorithm, Generative Adversarial Networks, and style transfer, were introduced into multiple links of clothing product design and development, such as clothing shape, print pattern, texture craft, and product vision. An innovative clothing design method based on cooperation between these intelligent algorithms and various human functional roles was then constructed. The method improves the efficiency of the multiple links of clothing design, for example generating 10000 printing patterns every 72.12 seconds and completing the migration of 92.7 frames of the garment process style every second. To a certain extent, this study realizes economies of scale in clothing design and reduces its marginal cost through the computing power brought about by Moore's law of digital technology, providing a reference for the exploration of clothing design in the era of Industry 4.0.
Source:
Fibres & Textiles in Eastern Europe; 2022, 1 (151); 59-66
1230-3666
2300-7354
Appears in:
Fibres & Textiles in Eastern Europe
Content provider:
Biblioteka Nauki
Article
Title:
Automatic identification of malfunctions of large turbomachinery during transient states with genetic algorithm optimization
Authors:
Barszcz, Tomasz
Zabaryłło, Mateusz
Links:
https://bibliotekanauki.pl/articles/2052104.pdf
Publication date:
2022
Publisher:
Polska Akademia Nauk. Czytelnia Czasopism PAN
Subjects:
machine learning
fault detection
transient
turbine generator
genetic algorithm
Description:
Turbines and generators operating in the power generation industry are a major source of electrical energy worldwide. These are critical machines, and their malfunctions should be detected in advance in order to avoid catastrophic failures and unplanned shutdowns. A maintenance strategy which enables malfunctions to be detected at early stages plays a crucial role in facilities using such machinery. The best source of data for assessing the technical condition is the transient data measured during start-ups and coast-downs. Most of the proposed methods using signal decomposition are applied to small machines with rolling element bearings in steady-state operation, with the shaft considered a rigid body. The machines examined in the authors' research operate above their first critical rotational speed interval; their shafts are thus considered flexible and are supported by hydrodynamic sliding bearings. Such an arrangement introduces significant complexity to the analysis of the machine behavior, and consequently analyzing such data requires a highly skilled human expert. The main novelty proposed in the paper is the decomposition of transient vibration data into components responsible for particular failure modes. The method is automated and can be used for the identification of turbogenerator malfunctions. Each parameter of a particular decomposed function has a physical interpretation and can help the maintenance staff operate the machine properly; the parameters can also be used by managing personnel to plan overhauls more precisely. The method has been validated on real-life data originating from a 200 MW class turbine. The real-life field data, along with data generated by the commercial software used in GE's engineering department for this class of machines, served as the reference data set for the unbalanced response during the transients in question.
Source:
Metrology and Measurement Systems; 2022, 29, 1; 175-190
0860-8229
Appears in:
Metrology and Measurement Systems
Content provider:
Biblioteka Nauki
Article
Title:
A real-valued genetic algorithm to optimize the parameters of support vector machine for classification of multiple faults in NPP
Authors:
Amer, F. Z.
El-Garhy, A. M.
Awadalla, M. H.
Rashad, S. M.
Abdien, A. K.
Links:
https://bibliotekanauki.pl/articles/147652.pdf
Publication date:
2011
Publisher:
Instytut Chemii i Techniki Jądrowej
Subjects:
support vector machine (SVM)
fault classification
multi fault classification
genetic algorithm (GA)
machine learning
Description:
Two parameters must be carefully predetermined when establishing an efficient support vector machine (SVM) model: the regularization parameter c, which determines the trade-off between minimizing the training error and minimizing model complexity, and the kernel parameter sigma (σ), which defines the non-linear mapping from the input space to a high-dimensional feature space in which a non-linear decision hypersurface is constructed. The purpose of this study is therefore to develop a genetic-algorithm-based SVM (GASVM) model that automatically determines the optimal parameters c and sigma of the SVM with the highest predictive accuracy and generalization ability. The GASVM scheme is applied to observed monitoring data of a pressurized water reactor nuclear power plant (PWRNPP) to classify its associated faults. Compared to the standard SVM model, simulations indicate the superiority of GASVM when applied to a dataset with unbalanced classes. The GASVM scheme achieves higher classification accuracy and faster learning.
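A compact real-valued GA in this spirit — truncation selection, blend crossover, Gaussian mutation — optimizing a toy stand-in for the SVM cross-validation fitness; the fitness surface and the ranges for c and σ are invented for this sketch:

```python
import random

def toy_fitness(c, sigma):
    """Stand-in for SVM cross-validation accuracy; it peaks near
    c = 10, sigma = 0.5 purely for illustration."""
    return 1.0 / (1.0 + (c - 10) ** 2 / 100 + (sigma - 0.5) ** 2)

def real_ga(fitness, bounds, pop_size=30, gens=60, seed=3,
            mut_rate=0.3, mut_scale=0.1):
    """Real-valued GA: elitism, truncation selection, blend
    crossover, Gaussian mutation, clipping to bounds."""
    rng = random.Random(seed)
    def clip(x, lo, hi):
        return max(lo, min(hi, x))
    pop = [[rng.uniform(lo, hi) for lo, hi in bounds] for _ in range(pop_size)]
    for _ in range(gens):
        scored = sorted(pop, key=lambda ind: -fitness(*ind))
        pop = scored[:2]  # elitism: carry the two best forward
        while len(pop) < pop_size:
            p1, p2 = rng.sample(scored[:10], 2)  # breed from the top 10
            w = rng.random()
            child = [w * a + (1 - w) * b for a, b in zip(p1, p2)]
            for k, (lo, hi) in enumerate(bounds):
                if rng.random() < mut_rate:
                    child[k] = clip(child[k] + rng.gauss(0, mut_scale * (hi - lo)),
                                    lo, hi)
            pop.append(child)
    return max(pop, key=lambda ind: fitness(*ind))

c, sigma = real_ga(toy_fitness, bounds=[(0.1, 100), (0.01, 5)])
print(round(c, 2), round(sigma, 2))
```

In the GASVM scheme, `toy_fitness` would be replaced by the cross-validated classification accuracy of an SVM trained with the candidate (c, σ) pair, which is the expensive step the GA is amortizing.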
Source:
Nukleonika; 2011, 56, 4; 323-332
0029-5922
1508-5791
Appears in:
Nukleonika
Content provider:
Biblioteka Nauki
Article
Title:
Designing Smart Antennas Using Machine Learning Algorithms
Authors:
Samantaray, Barsa
Das, Kunal Kumar
Roy, Jibendu Sekhar
Links:
https://bibliotekanauki.pl/articles/27312957.pdf
Publication date:
2023
Publisher:
Instytut Łączności - Państwowy Instytut Badawczy
Subjects:
artificial neural network
decision tree
ensemble algorithm
machine learning
smart antenna
support vector machine
Description:
Smart antenna technologies improve spectral efficiency, security, energy efficiency, and overall service quality in cellular networks by utilizing signal processing algorithms that steer radiation beams toward users while producing nulls for interferers. In this paper, the performance of ML solutions such as the support vector machine (SVM) algorithm, the artificial neural network (ANN), the ensemble algorithm (EA), and the decision tree (DT) algorithm used for forming the beam of smart antennas is compared. A smart antenna array made up of 10 half-wave dipoles is considered. The ANN method is better than the remaining approaches at achieving the desired beam and null directions, whereas EA offers better performance in terms of reducing the side lobe level (SLL); the greatest SLL reduction is achieved using EA for all user directions. The performance of the ANN algorithm in forming the beam of a smart antenna is also compared with that of a variable-step-size adaptive algorithm.
Source:
Journal of Telecommunications and Information Technology; 2023, 4; 46-52
1509-4553
1899-8852
Appears in:
Journal of Telecommunications and Information Technology
Content provider:
Biblioteka Nauki
Article
