ABSTRACT

Due to the explosive growth of deep learning in recent years, research in computer vision and natural language processing has accelerated. Neural machine translation (NMT) is a sub-area of natural language processing that uses deep neural networks for machine-assisted translation between natural languages. This literature review surveys recent advances in the field, beginning with the emergence of recurrent neural networks as sequence-to-sequence models, followed by the attention mechanism, convolutional neural network-based machine translation, and, most recently, the transformer models. The paper also discusses several challenges that machine translation researchers face, such as achieving higher performance for low-resource languages, handling rare words, and designing more robust yet less compute-intensive neural networks.