Compressing Without Losing Context: A Novel Framework for Text Summarization
  • Author(s): Arkesh Kumar Satapathy; Surya LN Pradhan; Sneha Patnaik; Swain Kanheya Bhima; Prof. Sanjit Kumar Acharya
  • Paper ID: 1716603
  • Page: 2279-2285
  • Published Date: 21-04-2026
  • Published In: Iconic Research And Engineering Journals
  • Publisher: IRE Journals
  • e-ISSN: 2456-8880
  • Volume/Issue: Volume 9 Issue 10 April-2026
Abstract

The expansion of Large Language Models (LLMs) into long-context reasoning has introduced critical challenges regarding computational efficiency and information integrity. While increasing context windows provides more space for data, it often leads to information dilution, the "lost-in-the-middle" phenomenon, and prohibitive key-value (KV) cache costs. This paper presents a comprehensive framework for context-aware text summarization utilizing information theory, discourse analysis, and agentic refinement. We specifically investigate the COMI (COarse-to-fine Context Compression) architecture, which leverages Marginal Information Gain (MIG) to balance relevance and diversity. Furthermore, we explore the shift from passive retention to active, iterative reasoning through paradigms like InftyThink and extreme compression algorithms such as TurboQuant. Experimental results across benchmarks such as NaturalQuestions, GovReport, and LongBench-v2 demonstrate that these techniques maintain high fidelity even at 32x to 40x compression ratios, bridging the gap between computational constraints and semantic completeness.
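The abstract's notion of selecting content by Marginal Information Gain, trading relevance against redundancy, can be illustrated with a minimal greedy sketch. This is an MMR-style proxy using bag-of-words cosine similarity, not the paper's actual MIG formulation; the function name `mig_select` and the weight `lam` are illustrative assumptions.

```python
# Hypothetical MIG-style greedy selection: keep the sentence whose
# relevance to the query, minus its similarity to already-kept
# sentences, is largest. An MMR-like proxy for the paper's MIG idea.
from collections import Counter
import math

def _vec(text):
    # Bag-of-words term-frequency vector.
    return Counter(text.lower().split())

def _cos(a, b):
    # Cosine similarity between two Counter vectors.
    num = sum(a[t] * b[t] for t in a)
    den = (math.sqrt(sum(v * v for v in a.values()))
           * math.sqrt(sum(v * v for v in b.values())))
    return num / den if den else 0.0

def mig_select(sentences, query, budget, lam=0.7):
    """Greedily pick up to `budget` sentences, balancing relevance
    to `query` (weight lam) against redundancy with the kept set."""
    q = _vec(query)
    vecs = [_vec(s) for s in sentences]
    kept_idx = []
    while len(kept_idx) < min(budget, len(sentences)):
        best_i, best_gain = None, -float("inf")
        for i, v in enumerate(vecs):
            if i in kept_idx:
                continue
            relevance = _cos(v, q)
            redundancy = max((_cos(v, vecs[j]) for j in kept_idx),
                             default=0.0)
            gain = lam * relevance - (1 - lam) * redundancy
            if gain > best_gain:
                best_i, best_gain = i, gain
        kept_idx.append(best_i)
    kept_idx.sort()  # preserve original discourse order
    return [sentences[i] for i in kept_idx]
```

For example, given sentences about KV caches plus an off-topic one, a query like "key value cache cost" with `budget=2` keeps the two cache-related sentences and drops the irrelevant one, since the irrelevant sentence contributes no marginal gain.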

Keywords

Automatic Text Summarization, Context Compression, Information Bottleneck, Large Language Models, Marginal Information Gain, Key-Value Cache Optimization, Agentic Refinement

Citations

IRE Journals:
Arkesh Kumar Satapathy, Surya LN Pradhan, Sneha Patnaik, Swain Kanheya Bhima, Prof. Sanjit Kumar Acharya "Compressing Without Losing Context: A Novel Framework for Text Summarization" Iconic Research And Engineering Journals Volume 9 Issue 10 2026 Page 2279-2285 https://doi.org/10.64388/IREV9I10-1716603

IEEE:
A. K. Satapathy, S. L. N. Pradhan, S. Patnaik, S. K. Bhima, and S. K. Acharya, "Compressing Without Losing Context: A Novel Framework for Text Summarization," Iconic Research And Engineering Journals, vol. 9, no. 10, pp. 2279-2285, 2026. https://doi.org/10.64388/IREV9I10-1716603