Neural Attention Model for Abstractive Text Summarization Using Linguistic Feature Space

  • Aniqa Dilawari
  • Muhammad Usman Ghani Khan
  • Summra Saleem
  • Zahoor-Ur-Rehman*
  • Fatema Sabeen Shaikh

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

25 Scopus citations

Abstract

Summarization generates a brief and concise summary that portrays the main idea of the source text. There are two forms of summarization: abstractive and extractive. Extractive summarization chooses important sentences from the text to form a summary, whereas abstractive summarization paraphrases the text, producing explanations closer to human writing by adding novel words or phrases. For a human annotator, producing a summary of a document is time-consuming and expensive because it requires going through the long document and composing a short summary. An automatic feature-rich model for text summarization is proposed that can reduce the amount of labor and produce a quick summary by using both extractive and abstractive approaches. A feature-rich extractor highlights the important sentences in the text, and linguistic characteristics are used to enhance results. The extracted summary is then fed to an abstracter, which further enriches it using features such as named entity tags, part-of-speech tags, and term weights. Furthermore, a loss function is introduced to normalize the inconsistency between word-level and sentence-level attentions. The proposed two-staged network achieved a ROUGE score of 37.76% on the benchmark CNN/DailyMail dataset, outperforming earlier work. Human evaluation is also conducted to measure the comprehensiveness, conciseness, and informativeness of the generated summary.
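The reported 37.76% ROUGE score measures n-gram overlap between generated and reference summaries. As a minimal illustrative sketch (not the authors' evaluation code, which likely uses the standard ROUGE toolkit on multiple variants such as ROUGE-1, ROUGE-2, and ROUGE-L), ROUGE-1 F1 over unigrams can be computed as:

```python
from collections import Counter

def rouge_1_f1(reference: str, candidate: str) -> float:
    """ROUGE-1 F1: unigram overlap between a reference summary
    and a candidate (generated) summary, with clipped counts."""
    ref_tokens = reference.lower().split()
    cand_tokens = candidate.lower().split()
    if not ref_tokens or not cand_tokens:
        return 0.0
    # Clipped overlap: each unigram counts at most as often as it
    # appears in the other summary.
    overlap = Counter(ref_tokens) & Counter(cand_tokens)
    matches = sum(overlap.values())
    precision = matches / len(cand_tokens)
    recall = matches / len(ref_tokens)
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical example summaries for illustration only.
ref = "the model produces a concise summary of the article"
cand = "the model generates a concise summary"
print(round(rouge_1_f1(ref, cand), 3))  # → 0.667
```

Production evaluations typically apply stemming and report recall-oriented variants as well; this sketch shows only the core overlap computation.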

Original language: English
Pages (from-to): 23557-23564
Number of pages: 8
Journal: IEEE Access
Volume: 11
DOIs
State: Published - 2023

Keywords

  • Abstractive summarization
  • encoder-decoder
  • extractive summarization
  • feature rich model
  • linguistic features
  • summarization evaluation
