
You are searching for the phrase "hyperparameter optimization" by criterion: Subject


Showing 1-4 of 4
Title:
Hyperparameter optimization of artificial neural networks to improve the positional accuracy of industrial robots
Authors:
Uhlmann, Eckart
Polte, Mitchel
Blumberg, Julian
Li, Zhoulong
Kraft, Adrian
Links:
https://bibliotekanauki.pl/articles/1429023.pdf
Publication date:
2021
Publisher:
Wrocławska Rada Federacji Stowarzyszeń Naukowo-Technicznych
Subjects:
artificial neural network
robot calibration
hyperparameter optimization
Description:
Due to the rising demand for individualized product specifications and short product innovation cycles, industrial robots are gaining increasing attention for machining operations such as milling and forming. Limitations in their absolute positional accuracy are addressed by enhanced modelling and calibration techniques. However, the resulting absolute positional accuracy remains in a range still not feasible for general-purpose milling and forming tolerances. Improving the model accuracy demands complex, often inaccessible system knowledge at the expense of real-time capability. This article presents a new approach that uses artificial neural networks to enhance the positional accuracy of industrial robots. Hyperparameter optimization is applied to overcome the downside of choosing an appropriate artificial neural network structure and training strategy by trial and error. The effectiveness of the method is validated with a heavy-duty industrial robot. It is demonstrated that artificial neural networks with suitable hyperparameters outperform a kinematic model with calibrated geometric parameters.
(See the illustrative sketch after this record.)
Source:
Journal of Machine Engineering; 2021, 21, 2; 47-59
1895-7595
2391-8071
Appears in:
Journal of Machine Engineering
Content provider:
Biblioteka Nauki
Article
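The approach above trains artificial neural networks to predict residual pose errors and selects the network structure and training strategy via hyperparameter search rather than trial and error. The paper does not publish its toolchain, so the following is only a minimal, hypothetical sketch using scikit-learn's MLPRegressor and RandomizedSearchCV, with placeholder data standing in for measured positional deviations across the robot workspace.

```python
# Hypothetical sketch: tuning a small MLP that maps commanded robot poses to
# positional error corrections. Dataset, search space, and library choice are
# assumptions for illustration, not the paper's actual setup.
import numpy as np
from sklearn.model_selection import RandomizedSearchCV
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X = rng.uniform(-1.0, 1.0, size=(500, 6))    # placeholder: commanded joint angles
y = 0.001 * rng.standard_normal((500, 3))    # placeholder: measured XYZ deviations [m]

search_space = {
    "hidden_layer_sizes": [(32,), (64,), (64, 32), (128, 64)],
    "activation": ["relu", "tanh"],
    "alpha": [1e-5, 1e-4, 1e-3],              # L2 regularization strength
    "learning_rate_init": [1e-3, 1e-2],
}

search = RandomizedSearchCV(
    MLPRegressor(max_iter=2000, random_state=0),
    search_space,
    n_iter=20,
    cv=5,
    scoring="neg_mean_squared_error",
    random_state=0,
)
search.fit(X, y)
print("best hyperparameters:", search.best_params_)
```

In practice the inputs would be poses measured over the robot workspace and the targets the deviations between commanded and measured tool-centre-point positions; the tuned network then serves as a correction model on top of the calibrated kinematics.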
Title:
The Effect of Dual Hyperparameter Optimization on Software Vulnerability Prediction Models
Authors:
Bassi, Deepali
Singh, Hardeep
Links:
https://bibliotekanauki.pl/articles/2203949.pdf
Publication date:
2023
Publisher:
Politechnika Wrocławska. Oficyna Wydawnicza Politechniki Wrocławskiej
Subjects:
software vulnerability
hyperparameter optimization
machine learning algorithm
data balancing techniques
data complexity measures
Description:
Background: Prediction of software vulnerabilities is a major concern in the field of software security. Many researchers have worked to construct various software vulnerability prediction (SVP) models. The emerging machine learning domain aids in building effective SVP models. The employment of data balancing/resampling techniques and optimal hyperparameters can upgrade their performance. Previous research studies have shown the impact of hyperparameter optimization (HPO) on machine learning algorithms and data balancing techniques. Aim: The current study aims to analyze the impact of dual hyperparameter optimization on metrics-based SVP models. Method: This paper proposes a methodology using the Python framework Optuna to optimize the hyperparameters of both the machine learners and the data balancing techniques. For experimentation, we compared six combinations of five machine learners and five resampling techniques, considering both default parameters and optimized hyperparameters. Results: The Wilcoxon signed-rank test with the Bonferroni correction was applied, and it was observed that dual HPO performs better than HPO on learners alone or HPO on data balancers alone. Furthermore, the paper assesses the impact of data complexity measures and concludes that HPO does not improve performance on datasets that exhibit high overlap. Conclusion: The experimental analysis reveals that dual HPO is 64% effective in enhancing the productivity of SVP models.
(See the illustrative sketch after this record.)
Source:
e-Informatica Software Engineering Journal; 2023, 17, 1; art. no. 230102
1897-7979
Appears in:
e-Informatica Software Engineering Journal
Content provider:
Biblioteka Nauki
Article
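The "dual" optimization described above tunes the hyperparameters of the learner and of the data balancing technique within the same search, using the Optuna framework named in the abstract. The sketch below only illustrates that idea: the specific learner (random forest), balancer (SMOTE from imbalanced-learn), dataset, and search ranges are assumptions for demonstration, not the paper's experimental setup.

```python
# Illustrative sketch of dual hyperparameter optimization with Optuna:
# one objective samples hyperparameters for both the resampler and the classifier.
import optuna
from imblearn.over_sampling import SMOTE
from sklearn.datasets import make_classification
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score
from sklearn.model_selection import train_test_split

# Placeholder imbalanced dataset (the paper uses metrics-based vulnerability data).
X, y = make_classification(n_samples=2000, weights=[0.9, 0.1], random_state=0)
X_tr, X_te, y_tr, y_te = train_test_split(X, y, stratify=y, random_state=0)

def objective(trial):
    # Hyperparameters of the data balancing step.
    smote = SMOTE(
        k_neighbors=trial.suggest_int("k_neighbors", 3, 10),
        random_state=0,
    )
    X_bal, y_bal = smote.fit_resample(X_tr, y_tr)

    # Hyperparameters of the learner.
    clf = RandomForestClassifier(
        n_estimators=trial.suggest_int("n_estimators", 100, 500),
        max_depth=trial.suggest_int("max_depth", 3, 20),
        random_state=0,
    )
    clf.fit(X_bal, y_bal)
    return f1_score(y_te, clf.predict(X_te))

study = optuna.create_study(direction="maximize")
study.optimize(objective, n_trials=50)
print(study.best_params)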
Title:
An autoencoder-enhanced stacking neural network model for increasing the performance of intrusion detection
Authors:
Brunner, Csaba
Kő, Andrea
Fodor, Szabina
Links:
https://bibliotekanauki.pl/articles/2147134.pdf
Publication date:
2022
Publisher:
Społeczna Akademia Nauk w Łodzi. Polskie Towarzystwo Sieci Neuronowych
Subjects:
intrusion detection
neural network
ensemble classifiers
hyperparameter optimization
sparse autoencoder
NSL-KDD
machine learning
Description:
Security threats, among other intrusions affecting the availability, confidentiality and integrity of IT resources and services, are spreading fast and can cause serious harm to organizations. Intrusion detection plays a key role in capturing intrusions. In particular, the application of machine learning methods in this area can improve intrusion detection efficiency. Various methods, such as pattern recognition from event logs, can be applied to intrusion detection. The main goal of our research is to present a possible intrusion detection approach using recent machine learning techniques. In this paper, we propose and evaluate the use of stacked ensembles consisting of neural network (SNN) and autoencoder (AE) models, augmented with a tree-structured Parzen estimator hyperparameter optimization approach, for intrusion detection. The main contribution of our work is the combined application of advanced hyperparameter optimization and stacked ensembles. We conducted several experiments to check the effectiveness of our approach. We used the NSL-KDD dataset, a common benchmark dataset in intrusion detection, to train our models. The comparative results demonstrate that our proposed models can compete with, and in some cases outperform, existing models.
(See the illustrative sketch after this record.)
Source:
Journal of Artificial Intelligence and Soft Computing Research; 2022, 12, 2; 149-163
2083-2567
2449-6499
Appears in:
Journal of Artificial Intelligence and Soft Computing Research
Content provider:
Biblioteka Nauki
Article
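The model above stacks neural-network base learners, enhanced with autoencoder features, and tunes them with a tree-structured Parzen estimator (TPE). A minimal sketch of TPE-driven tuning of a stacked neural-network ensemble is given below, using Optuna's TPESampler with scikit-learn; the autoencoder stage and the NSL-KDD preprocessing are omitted, and the dataset and architectures are placeholders rather than the authors' configuration.

```python
# Minimal sketch: TPE-based hyperparameter search over a stacked ensemble of
# small neural networks. Dataset, architectures, and ranges are illustrative only.
import optuna
from sklearn.datasets import make_classification
from sklearn.ensemble import StackingClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.neural_network import MLPClassifier

# Placeholder binary classification data (stand-in for preprocessed NSL-KDD).
X, y = make_classification(n_samples=1000, n_features=20, random_state=0)

def objective(trial):
    width = trial.suggest_int("hidden_width", 16, 128, log=True)
    lr = trial.suggest_float("learning_rate_init", 1e-4, 1e-2, log=True)
    # Two base networks that differ only by random seed, stacked via logistic regression.
    base = [
        (f"mlp{i}", MLPClassifier(hidden_layer_sizes=(width,),
                                  learning_rate_init=lr,
                                  max_iter=500, random_state=i))
        for i in range(2)
    ]
    stack = StackingClassifier(estimators=base,
                               final_estimator=LogisticRegression(max_iter=1000))
    return cross_val_score(stack, X, y, cv=3, scoring="accuracy").mean()

study = optuna.create_study(direction="maximize",
                            sampler=optuna.samplers.TPESampler(seed=0))
study.optimize(objective, n_trials=25)
print(study.best_value, study.best_params)
```

In the paper's setting, the base learners would additionally consume features learned by a sparse autoencoder; that stage is left out here to keep the sketch self-contained.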
Title:
Stochastic schemata exploiter-based optimization of hyper-parameters for XGBoost
Authors:
Makino, Hiroya
Kita, Eisuke
Links:
https://bibliotekanauki.pl/articles/38707755.pdf
Publication date:
2024
Publisher:
Instytut Podstawowych Problemów Techniki PAN
Subjects:
evolutionary computation
Stochastic Schemata Exploiter
hyperparameter optimization
XGBoost
Description:
XGBoost is well known as an open-source software library that provides a regularizing gradient boosting framework. Although it is widely used in the machine learning field, its performance depends on the choice of hyper-parameters. This study focuses on optimizing the hyper-parameters of XGBoost using the Stochastic Schemata Exploiter (SSE). SSE, an evolutionary algorithm, has been successfully applied to combinatorial optimization problems; in this study, the original SSE algorithm is modified for hyper-parameter optimization of XGBoost. Compared with a simple genetic algorithm, SSE has two interesting features: quick convergence and a small number of control parameters. The proposed algorithm is compared with other hyper-parameter optimization algorithms, such as Gradient Boosted Regression Trees (GBRT), the Tree-structured Parzen Estimator (TPE), the Covariance Matrix Adaptation Evolution Strategy (CMA-ES), and Random Search, in order to confirm its validity. The numerical results show that SSE has good convergence properties, even with fewer control parameters than the other methods.
(See the illustrative sketch after this record.)
Source:
Computer Assisted Methods in Engineering and Science; 2024, 31, 1; 113-132
2299-3649
Appears in:
Computer Assisted Methods in Engineering and Science
Content provider:
Biblioteka Nauki
Article
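SSE itself is a custom evolutionary algorithm and is not reproduced here. As a point of reference only, the sketch below shows how one of the baselines the paper compares against, Random Search, can be run over XGBoost hyper-parameters with Optuna; the dataset and search ranges are illustrative assumptions, and the TPE or CMA-ES baselines could be obtained by swapping in optuna.samplers.TPESampler or optuna.samplers.CmaEsSampler.

```python
# Baseline-style sketch: random search over XGBoost hyper-parameters with Optuna.
# This does NOT implement SSE; it only illustrates the kind of search being benchmarked.
import optuna
import xgboost as xgb
from sklearn.datasets import make_regression
from sklearn.model_selection import cross_val_score

# Placeholder regression data; the paper's datasets are not reproduced here.
X, y = make_regression(n_samples=1000, n_features=10, noise=0.1, random_state=0)

def objective(trial):
    # Illustrative search space; ranges are assumptions.
    model = xgb.XGBRegressor(
        n_estimators=trial.suggest_int("n_estimators", 50, 400),
        max_depth=trial.suggest_int("max_depth", 2, 10),
        learning_rate=trial.suggest_float("learning_rate", 1e-3, 0.3, log=True),
        subsample=trial.suggest_float("subsample", 0.5, 1.0),
        reg_lambda=trial.suggest_float("reg_lambda", 1e-3, 10.0, log=True),
    )
    # Maximizing negative MSE is equivalent to minimizing the mean squared error.
    return cross_val_score(model, X, y, cv=3,
                           scoring="neg_mean_squared_error").mean()

study = optuna.create_study(direction="maximize",
                            sampler=optuna.samplers.RandomSampler(seed=0))
study.optimize(objective, n_trials=40)
print(study.best_params)
```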