Whale-optimized LSTM networks for enhanced automatic text summarization

Bharathi Mohan Gurusamy, Prasanna Kumar Rangarajan, Ali Altalbe

Research output: Contribution to journal › Article › peer-review

5 Scopus citations

Abstract

Automatic text summarization is a cornerstone of natural language processing, yet existing methods often struggle to maintain contextual integrity and capture nuanced sentence relationships. The Optimized Auto Encoded Long Short-Term Memory Network (OAELSTM), enhanced by the Whale Optimization Algorithm (WOA), offers a novel approach to this challenge. Existing summarization models frequently produce summaries that are either too generic or disjointed, failing to preserve essential content. By integrating deep LSTM layers with autoencoder mechanisms, the OAELSTM model focuses on extracting key phrases and concepts, ensuring that summaries are both informative and coherent. WOA fine-tunes the model's parameters, improving its precision and efficiency. Evaluation on the CNN/Daily Mail and Gigaword datasets demonstrates the model's superiority over existing approaches: it achieves a ROUGE score of 0.456, an accuracy of 84.47%, and a specificity of 0.3244, within a processing time of 4,341.95 s.
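The abstract does not spell out the WOA update rules used for parameter tuning. For reference, below is a minimal sketch of the standard Whale Optimization Algorithm (encircling prey, spiral bubble-net attack, and random search phases) minimizing a toy objective; the population size, iteration count, and spiral constant `b` are illustrative choices, not values from the paper:

```python
import numpy as np

def whale_optimize(objective, dim, bounds, n_whales=20, n_iter=200, seed=0):
    """Minimal Whale Optimization Algorithm sketch (minimization).

    In the paper's setting, `objective` would score an OAELSTM
    hyperparameter configuration; here it is any box-bounded function.
    """
    rng = np.random.default_rng(seed)
    lo, hi = bounds
    X = rng.uniform(lo, hi, size=(n_whales, dim))      # whale positions
    fitness = np.array([objective(x) for x in X])
    best = X[fitness.argmin()].copy()                  # best solution so far
    best_f = fitness.min()
    b = 1.0                                            # spiral shape constant

    for t in range(n_iter):
        a = 2.0 * (1 - t / n_iter)                     # decreases 2 -> 0
        for i in range(n_whales):
            A = 2 * a * rng.random() - a               # coefficient A
            C = 2 * rng.random()                       # coefficient C
            if rng.random() < 0.5:
                if abs(A) < 1:                         # exploit: encircle best
                    X[i] = best - A * np.abs(C * best - X[i])
                else:                                  # explore: random whale
                    x_rand = X[rng.integers(n_whales)]
                    X[i] = x_rand - A * np.abs(C * x_rand - X[i])
            else:                                      # spiral bubble-net move
                l = rng.uniform(-1, 1)
                D = np.abs(best - X[i])
                X[i] = D * np.exp(b * l) * np.cos(2 * np.pi * l) + best
            X[i] = np.clip(X[i], lo, hi)
            f = objective(X[i])
            if f < best_f:
                best_f, best = f, X[i].copy()
    return best, best_f

# Toy usage: minimize the sphere function over [-5, 5]^2
sol, val = whale_optimize(lambda x: float(np.sum(x**2)), dim=2, bounds=(-5.0, 5.0))
```

In the summarization pipeline described by the abstract, the objective would presumably be a validation metric (e.g. ROUGE or loss) evaluated after training the LSTM autoencoder with the candidate parameters.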

Original language: English
Article number: 1399168
Journal: Frontiers in Artificial Intelligence
Volume: 7
DOIs
State: Published - 2024
Externally published: Yes

Keywords

  • Auto Encoded
  • LSTM
  • optimization
  • summarization
  • Whale Optimization Algorithm
