Recurrent Neural Network Techniques: Emphasis on Use in Neural Machine Translation

Authors

  • Dima Suleiman, King Abdullah II School of Information Technology, The University of Jordan, Amman, Jordan, dima.suleiman@ju.edu.jo
  • Wael Etaiwi Princess Sumaya University for Technology
  • Arafat Awajan, King Hussein School of Computing Sciences, Princess Sumaya University for Technology, Amman, Jordan, awajan@psut.edu.jo; College of Information Technology, Mutah University, AlKarak, Jordan, awajan@mutah.edu.jo

DOI:

https://doi.org/10.31449/inf.v45i7.3743

Abstract

Natural Language Processing (NLP) is the processing and representation of human language in a way that accommodates its use in modern computer technology. Several techniques, including deep learning, graph-based, rule-based, and word-embedding approaches, can be used in a variety of NLP applications such as text summarization, question answering, and sentiment analysis. In this paper, machine translation techniques based on recurrent neural networks are analyzed and discussed. The techniques are divided into three categories: recurrent neural networks alone, recurrent neural networks combined with phrase-based models, and recurrent neural networks combined with graph-based models. Several experiments are performed on a range of datasets to translate between different languages, and in most of the techniques the BLEU score is used to evaluate the performance of the translation models.
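To make the encoder-decoder idea behind these RNN-based translation techniques concrete, the following is a minimal sketch, assuming PyTorch: an encoder GRU compresses the source sentence into a hidden vector, and a decoder GRU generates target tokens one at a time. The layer sizes, the greedy_translate helper, and the toy vocabularies are illustrative assumptions, not the exact models surveyed in the paper.

# Minimal RNN encoder-decoder sketch for neural machine translation (PyTorch assumed).
# Sizes, names, and the greedy decoding loop are illustrative, not the surveyed models.
import torch
import torch.nn as nn

class Encoder(nn.Module):
    def __init__(self, src_vocab, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(src_vocab, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)

    def forward(self, src):                    # src: (batch, src_len) of token ids
        _, hidden = self.rnn(self.embed(src))  # hidden: (1, batch, hid_dim)
        return hidden

class Decoder(nn.Module):
    def __init__(self, tgt_vocab, emb_dim=128, hid_dim=256):
        super().__init__()
        self.embed = nn.Embedding(tgt_vocab, emb_dim)
        self.rnn = nn.GRU(emb_dim, hid_dim, batch_first=True)
        self.out = nn.Linear(hid_dim, tgt_vocab)

    def forward(self, prev_token, hidden):     # prev_token: (batch, 1)
        output, hidden = self.rnn(self.embed(prev_token), hidden)
        return self.out(output), hidden        # logits over the target vocabulary

def greedy_translate(encoder, decoder, src, bos_id, eos_id, max_len=30):
    # Encode the source sentence, then decode greedily one token at a time.
    hidden = encoder(src)
    token = torch.full((src.size(0), 1), bos_id, dtype=torch.long)
    generated = []
    for _ in range(max_len):
        logits, hidden = decoder(token, hidden)
        token = logits.argmax(dim=-1)          # most probable next token, shape (batch, 1)
        if (token == eos_id).all():
            break
        generated.append(token)
    return torch.cat(generated, dim=1) if generated else token

# Toy usage with random weights and placeholder vocabulary ids.
enc, dec = Encoder(src_vocab=1000), Decoder(tgt_vocab=1200)
src = torch.randint(0, 1000, (1, 7))           # one source sentence of 7 tokens
print(greedy_translate(enc, dec, src, bos_id=1, eos_id=2).shape)

In practice, the surveyed systems train such models on parallel corpora and report BLEU scores on held-out test sets; this sketch only shows the forward pass with untrained weights.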

Author Biographies

Dima Suleiman, King Abdullah II School of Information Technology, The University of Jordan, Amman, Jordan, dima.suleiman@ju.edu.jo

Dima Suleiman received her Bachelor's and Master's degrees in Computer Science from the University of Jordan and her PhD degree from Princess Sumaya University for Technology. She has 15 years of experience teaching undergraduate and graduate students in the Business Information Technology Department at the University of Jordan. Her research interests are in the areas of algorithms, natural language processing, data science, and data mining.

Wael Etaiwi, Princess Sumaya University for Technology

Dr. Wael Etaiwi is an assistant professor in the Department of Business Information Technology at Princess Sumaya University for Technology, Jordan. He received his BSc degree in Computer Information Systems from the Hashemite University in 2007, his MSc degree in Computer Science from Al Balqaa Applied University in 2011, and his PhD in Computer Science from Princess Sumaya University for Technology in 2020. Dr. Etaiwi has 13 years of experience as a software developer and systems analyst. His research interests include, but are not limited to, artificial intelligence, data mining, and natural language processing.

Arafat Awajan, King Hussein School of Computing Sciences, Princess Sumaya University for Technology, Amman, Jordan, awajan@psut.edu.jo; College of Information Technology, Mutah University, AlKarak, Jordan, awajan@mutah.edu.jo

Prof. Arafat Awajan is a Full Professor at Princess Sumaya University for Technology (PSUT). He received his PhD degree in Computer Science from the University of Franche-Comté, France, in 1987. He has held various administrative and academic positions at the Royal Scientific Society and Princess Sumaya University for Technology:

  • Head of the Department of Computer Science (2000-2003)
  • Head of the Department of Computer Graphics and Animation (2005-2006)
  • Dean of the King Hussein School for Information Technology (2004-2007)
  • Director of the Information Technology Center, RSS (2008-2010)
  • Dean of Student Affairs (2011-2014)
  • Dean of the King Hussein School for Computing Sciences (2014-2017)

He is currently the vice president of PSUT. His research interests include natural language processing, Arabic text mining, and digital image processing.


Published

2021-12-23

How to Cite

Suleiman, D., Etaiwi, W., & Awajan, A. (2021). Recurrent Neural Network Techniques: Emphasis on Use in Neural Machine Translation. Informatica, 45(7). https://doi.org/10.31449/inf.v45i7.3743