Abstractive Tigrigna Text Summarization using Deep Learning Approach

Authors

  • Yemane Gebrekidan Gebretekle, Raya University
  • Dr. Gebrekiros Gebreyesus
  • Tewelde Hailay Gebreegziabher
  • Yemane Hailay Nega

DOI:

https://doi.org/10.82489/rjsd.2025.1.01.29

Keywords:

Seq2Seq, LSTM, abstractive, FastText embedding, attention mechanism

Abstract

Text summarization has become essential due to the vast amount of text data shared online. It is the task of condensing a long text into a short, concise, and expressive summary. Two basic approaches exist: extractive and abstractive. For Tigrigna, some studies have used extractive methods, which select portions of the source text without addressing its underlying meaning. We propose abstractive text summarization for Tigrigna to generate semantics-based summaries. Abstractive summarization rephrases or reorganizes the source text to produce a semantically equivalent summary, possibly introducing new words or phrases. Tigrigna has not previously been studied in this way, and the task is difficult because no structured dataset, pre-trained word embeddings, or summarization model is available for the language. To address these challenges, we applied deep learning models. A dataset of 1,167 structured input paragraphs with reference summaries was prepared for training and evaluation. Embedding methods including FastText and Byte Pair Encoding were trained on about 320 MB of text. To keep noisy stopwords from dominating the embeddings, FastText was trained with stopword removal and down-sampling of frequent words. Two models, a Seq2Seq LSTM and a Transformer, were evaluated for Tigrigna abstractive summarization: the Seq2Seq model processes tokens sequentially, whereas the Transformer processes them in parallel; an attention mechanism was added to the Seq2Seq model, while the Transformer relies on self-attention. Among the tested model–embedding combinations, the attention-based Seq2Seq with down-sampled FastText embeddings performed best, achieving an accuracy of 0.72 and ROUGE scores of R-1 = 0.20, R-2 = 0.183, and R-N = 0.17. This work pioneers abstractive Tigrigna summarization and marks a foundational step for future research.
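
As a concrete illustration of the embedding step described above, the following is a minimal sketch of training FastText with stopword removal and down-sampling of frequent words using the gensim library. It is not the authors' code: the toy corpus, stopword list, and hyperparameters are illustrative assumptions standing in for the paper's roughly 320 MB corpus and actual configuration.

```python
# Minimal sketch (gensim 4.x): FastText embeddings with stopword removal
# and down-sampling of frequent words. Corpus, stopwords, and all
# hyperparameters below are illustrative assumptions, not the paper's setup.
from gensim.models import FastText

# Hypothetical Tigrigna stopword list and a tiny pre-tokenized corpus.
stopwords = {"እዩ", "ናይ", "እዮም"}
raw_corpus = [
    ["ትግርኛ", "ሴማዊ", "ቋንቋ", "እዩ"],
    ["ምሕጻር", "ናይ", "ጽሑፍ", "ኣገዳሲ", "ዕማም", "እዩ"],
]

# Stopword removal before training, as described in the abstract.
sentences = [[tok for tok in sent if tok not in stopwords]
             for sent in raw_corpus]

model = FastText(
    sentences=sentences,
    vector_size=100,  # embedding dimension (assumed)
    window=5,         # context window size
    min_count=1,      # keep every token in this toy corpus
    sample=1e-3,      # down-sample words above this frequency threshold
    sg=1,             # skip-gram objective
    epochs=10,
)

# Subword n-grams let FastText return vectors even for words
# never seen during training.
print(model.wv["ትግርኛ"].shape)  # (100,)
```

In gensim, the `sample` parameter implements the down-sampling the abstract describes: tokens whose relative frequency exceeds the threshold are randomly skipped during training, which keeps very frequent words from dominating the learned vectors.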

Author Biographies

  • Yemane Gebrekidan Gebretekle, Raya University

    Computer Science Lecturer at Raya University, Maichew, Ethiopia

  • Dr. Gebrekiros Gebreyesus

    Assistant Professor, Department of Electrical and Computer Engineering

  • Tewelde Hailay Gebreegziabher

    Lecturer, Computer Science

  • Yemane Hailay Nega

    Assistant Lecturer, Department of Computer Science

Published

2025-11-10

Issue

Vol. 1 No. 01 (2025)

Section

Articles

How to Cite

Abstractive Tigrigna Text Summarization using Deep Learning Approach. (2025). Raya Journal of Science and Development, 1(01). https://doi.org/10.82489/rjsd.2025.1.01.29
