BiLAT: A BiLSTM Attention Transformer Model with Hyperparameter Optimization for Robust Fake News Detection
  • Author(s): T. Poornima
  • Paper ID: 1714856
  • Page: 385-402
  • Published Date: 10-03-2026
  • Published In: Iconic Research And Engineering Journals
  • Publisher: IRE Journals
  • e-ISSN: 2456-8880
  • Volume/Issue: Volume 9 Issue 9 March-2026
Abstract

The rapid spread of misinformation on social media, particularly Twitter, poses major challenges to information reliability and public trust. This paper presents a comprehensive comparative study of six deep learning models (RNN, CNN, BERT, GRU, LSTM, and the proposed BiLSTM Attention Transformer, BiLAT) evaluated on three benchmark datasets: Fake and Real News, FakeNewsNet, and ISOT Fake News. We systematically assess five hyperparameter optimization methods (Grid Search (GS), Random Search (RS), Bayesian Optimization (BO), Genetic Algorithm (GA), and BOHB) to determine how they affect model performance. Results reveal that transformer-based architectures substantially outperform traditional models, with BiLAT achieving state-of-the-art performance, including 99% accuracy on FakeNewsNet under BOHB optimization. BOHB consistently delivers the largest performance gains across all models and datasets, improving accuracy by 2–5% over traditional optimization methods. The findings indicate that combining advanced transformer architectures with efficient hyperparameter optimization considerably strengthens a model's ability to capture the linguistic intricacies inherent in disinformation. This paper highlights the importance of both architectural design and optimization strategy in building effective, scalable fake news detection systems for social media contexts.
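To make the BiLAT naming concrete, the following is a minimal PyTorch sketch of a BiLSTM + attention + transformer-encoder text classifier of the kind the abstract describes. The layer sizes, ordering, and attention-pooling scheme are assumptions for illustration only; the paper names the components (BiLSTM, attention, transformer) but this is not the authors' published architecture.

```python
# Hedged sketch of a BiLAT-style classifier: BiLSTM -> transformer encoder
# -> additive attention pooling -> linear classifier. All hyperparameters
# here are illustrative defaults, not values reported in the paper.
import torch
import torch.nn as nn


class BiLATSketch(nn.Module):
    def __init__(self, vocab_size=30000, embed_dim=128, lstm_hidden=64,
                 num_heads=4, num_classes=2):
        super().__init__()
        self.embedding = nn.Embedding(vocab_size, embed_dim, padding_idx=0)
        # Bidirectional LSTM over token embeddings.
        self.bilstm = nn.LSTM(embed_dim, lstm_hidden, batch_first=True,
                              bidirectional=True)
        # One transformer encoder layer applies self-attention to BiLSTM outputs.
        encoder_layer = nn.TransformerEncoderLayer(
            d_model=2 * lstm_hidden, nhead=num_heads, batch_first=True)
        self.transformer = nn.TransformerEncoder(encoder_layer, num_layers=1)
        # Additive attention pooling over the sequence dimension.
        self.attn_score = nn.Linear(2 * lstm_hidden, 1)
        self.classifier = nn.Linear(2 * lstm_hidden, num_classes)

    def forward(self, token_ids):                        # (batch, seq_len)
        x = self.embedding(token_ids)                    # (batch, seq_len, embed_dim)
        x, _ = self.bilstm(x)                            # (batch, seq_len, 2*hidden)
        x = self.transformer(x)                          # contextualized sequence
        weights = torch.softmax(self.attn_score(x), dim=1)   # (batch, seq_len, 1)
        pooled = (weights * x).sum(dim=1)                # attention-weighted pooling
        return self.classifier(pooled)                   # (batch, num_classes) logits


if __name__ == "__main__":
    model = BiLATSketch()
    dummy = torch.randint(1, 30000, (8, 64))             # 8 texts, 64 tokens each
    print(model(dummy).shape)                             # torch.Size([8, 2])
```

In a setup like this, the embedding size, LSTM hidden width, number of attention heads, and learning rate are exactly the kinds of knobs that the surveyed optimizers (GS, RS, BO, GA, BOHB) would tune.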

Keywords

Fake News Detection, Deep Learning Models, Hyperparameter Optimization, Transformer Architectures, BOHB

Citations

IRE Journals:
T. Poornima "BiLAT: A BiLSTM Attention Transformer Model with Hyperparameter Optimization for Robust Fake News Detection" Iconic Research And Engineering Journals Volume 9 Issue 9 2026 Page 385-402 https://doi.org/10.64388/IREV9I9-1714856

IEEE:
T. Poornima "BiLAT: A BiLSTM Attention Transformer Model with Hyperparameter Optimization for Robust Fake News Detection" Iconic Research And Engineering Journals, 9(9) https://doi.org/10.64388/IREV9I9-1714856