Case Study of Improving English-Arabic Translation Using the Transformer Model.
Last updated: 23 Dec 2024
DOI: 10.21608/ijicis.2023.210435.1270
Keywords: English-Arabic Translation, Neural Machine Translation, Attention Mechanism, Transformer with Multi-Head Attention, Low Data Resource Languages
Donia Gamal
Computer Science Department, Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt
donia.gamaleldin@cis.asu.edu.eg
Marco Alfonse
Computer Science Department, Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt; Laboratoire Interdisciplinaire de l'Université Française d'Égypte (UFEID LAB), Université Française d'Égypte, Cairo, Egypt
marco_alfonse@cis.asu.edu.eg
ORCID: 0000-0003-0722-3218
Salud María Jiménez-Zafra
Computer Science Department, CEATIC, Universidad de Jaén, Jaén, Spain
sjzafra@ujaen.es
ORCID: n/a
Moustafa Aref
Computer Science Department, Faculty of Computer and Information Sciences, Ain Shams University, Cairo, Egypt
mostafa.aref@cis.asu.edu.eg
ORCID: 0000-0002-1278-0070
Volume: 23
Issue: 2
2023-06-01
2023-05-11
2023-06-01
Pages: 105-115
Print ISSN: 1687-109X
Online ISSN: 2535-1710
https://ijicis.journals.ekb.eg/article_305271.html
https://ijicis.journals.ekb.eg/service?article_code=305271
Article code: 305271
Original Article
Journal: International Journal of Intelligent Computing and Information Sciences
https://ijicis.journals.ekb.eg/