
You are searching for the phrase "BLSTM" by criterion: Subject


Showing 1-2 of 2
Title:
Email Phishing Detection with BLSTM and Word Embeddings
Authors:
Wolert, Rafał
Rawski, Mariusz
Links:
https://bibliotekanauki.pl/articles/27311939.pdf
Publication date:
2023
Publisher:
Polska Akademia Nauk. Czasopisma i Monografie PAN
Subjects:
phishing
BLSTM
word embeddings
Description:
Phishing has been one of the most successful attacks in recent years. Criminals are motivated by increasing financial gain and constantly improve their email phishing methods. A key goal, therefore, is to develop effective detection methods that can cope with huge volumes of email data. In this paper, a solution using a BLSTM neural network and FastText word embeddings is proposed. The solution uses preprocessing techniques such as stop-word removal, tokenization, and padding. Two datasets, a balanced and an imbalanced one, were used in three experiments; on the imbalanced dataset, the effect of the maximum token size was investigated. Evaluation of the model yielded the best metrics of 99.12% accuracy, 98.43% precision, 99.49% recall, and 98.96% F1-score on the imbalanced dataset. It was compared to an existing solution that uses a DL model and word embeddings. Finally, the model and solution architecture were implemented as a browser plug-in.
Source:
International Journal of Electronics and Telecommunications; 2023, 69, 3; 485–491
2300-1933
Appears in:
International Journal of Electronics and Telecommunications
Content provider:
Biblioteka Nauki
Article
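The preprocessing steps named in the abstract above (tokenization, stop-word removal, padding) can be sketched in plain Python. This is a minimal illustration, not the paper's implementation: the stop-word set, the token limit, and the `"<pad>"` token below are assumptions for the sake of the example.

```python
# Sketch of the email preprocessing pipeline described in the abstract:
# tokenization -> stop-word removal -> padding to a fixed length.
# STOP_WORDS, MAX_TOKENS and the "<pad>" token are illustrative assumptions.
import re

STOP_WORDS = {"a", "an", "the", "is", "to", "of"}  # hypothetical subset
MAX_TOKENS = 8                                     # hypothetical max token size

def preprocess(text: str, max_tokens: int = MAX_TOKENS) -> list[str]:
    tokens = re.findall(r"[a-z0-9']+", text.lower())     # tokenization
    tokens = [t for t in tokens if t not in STOP_WORDS]  # stop-word removal
    tokens = tokens[:max_tokens]                         # truncate to the limit
    return tokens + ["<pad>"] * (max_tokens - len(tokens))  # pad to fixed length

print(preprocess("Verify your account to avoid suspension!"))
# → ['verify', 'your', 'account', 'avoid', 'suspension', '<pad>', '<pad>', '<pad>']
```

In a full system, each token would then be mapped to its FastText embedding vector before being fed to the BLSTM; the fixed length is what makes batching possible.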
Title:
Attention-based deep learning model for Arabic handwritten text recognition
Authors:
Aïcha Gader, Takwa Ben
Echi, Afef Kacem
Links:
https://bibliotekanauki.pl/articles/2201264.pdf
Publication date:
2022
Publisher:
Szkoła Główna Gospodarstwa Wiejskiego w Warszawie. Instytut Informatyki Technicznej
Subjects:
Arabic handwriting recognition
attention mechanism
BLSTM
CNN
CTC
RNN
Description:
This work proposes a segmentation-free approach to Arabic Handwritten Text Recognition (AHTR): an attention-based Convolutional Neural Network - Recurrent Neural Network - Connectionist Temporal Classification (CNN-RNN-CTC) deep learning architecture. The model receives an image as input and provides, through a CNN, a sequence of essential features, which are transferred to an attention-based Bidirectional Long Short-Term Memory network (BLSTM). The BLSTM outputs the feature sequence in order, and the attention mechanism allows the selection of relevant information from the feature sequences. The selected information is then fed to the CTC, enabling the loss calculation and the transcription prediction. The contribution lies in extending the CNN with dropout layers, batch normalization, and dropout regularization parameters to prevent over-fitting. The output of the RNN block is passed through an attention mechanism to utilize the most relevant parts of the input sequence in a flexible manner. This solution enhances previous methods by improving the CNN's speed and performance and by controlling model over-fitting. The proposed system achieves a best accuracy of 97.1% on the IFN-ENIT Arabic script database, which competes with the current state of the art. It was also tested on modern English handwriting from the IAM database, where a Character Error Rate of 2.9% was attained, confirming the model's script independence.
Source:
Machine Graphics & Vision; 2022, 31, 1/4; 49–73
1230-0535
2720-250X
Appears in:
Machine Graphics & Vision
Content provider:
Biblioteka Nauki
Article
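The CTC stage in the abstract above turns per-timestep label predictions into a transcription by collapsing consecutive repeated labels and dropping the blank symbol. The standard greedy decoding rule can be sketched as follows; the blank index of 0 and the example label sequence are illustrative assumptions, not values from the paper.

```python
# Greedy CTC decoding sketch: take the argmax label at each timestep,
# collapse consecutive repeats, then drop the blank symbol.
# BLANK = 0 and the example frame labels are illustrative assumptions.

BLANK = 0  # conventional CTC blank index (assumption)

def ctc_greedy_decode(frame_labels: list[int]) -> list[int]:
    decoded = []
    prev = None
    for label in frame_labels:
        if label != prev and label != BLANK:  # collapse repeats, skip blanks
            decoded.append(label)
        prev = label
    return decoded

# Per-frame argmax labels; a blank between two 1s keeps them distinct characters.
print(ctc_greedy_decode([1, 1, 0, 1, 2, 2, 0, 3]))  # → [1, 1, 2, 3]
```

Note how the blank separates genuine repeated characters from a single character spread over several frames; this is what lets CTC train without a per-frame segmentation, which is the point of the segmentation-free approach described above.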
