An attentional mechanism has recently been used to improve neural machine translation (NMT) by selectively focusing on parts of the source sentence during translation. Previous research distinguishes two classes of attention: global attention and local attention. Since that research did not evaluate these mechanisms on text summarization, this thesis compares the impact of local attention using an LSTM model for generating abstractive text summaries. The result of this thesis is two models: one implemented with global attention and one implemented with local attention.
Keywords: local attention, global attention, LSTM, abstractive text summarization