Impacts and Outcomes of Using Dropout Layers to Mitigate Overfitting in Convolutional Neural Networks
  • Author(s): Rajat Gupta; Rakesh Jindal; Amisha Naik; K L Ganatre
  • Paper ID: 1708497
  • Pages: 392-407
  • Published Date: 31-12-2022
  • Published In: Iconic Research And Engineering Journals
  • Publisher: IRE Journals
  • e-ISSN: 2456-8880
  • Volume/Issue: Volume 6 Issue 6 December-2022
Abstract

Convolutional Neural Networks (CNNs) have demonstrated high accuracy in various computer vision tasks such as image classification, object detection, and facial recognition. However, these models are prone to overfitting, especially when they are highly complex and trained on limited data. Overfitting hampers a model's ability to generalize to unseen data, making regularization essential in deep learning. One of the most effective and widely used regularization techniques is dropout, which randomly deactivates a subset of neurons during each training iteration. This prevents neurons from becoming overly reliant on specific training features, thereby promoting robustness and better generalization. In this study, we empirically examine the impact of dropout layers within CNN architectures, focusing on how different dropout rates influence training behavior, generalization, and overall model performance. We conduct experiments on well-known image classification datasets under a range of dropout configurations. Across all trials, our findings consistently show that incorporating dropout reduces overfitting, improves validation accuracy, and enhances performance on unseen data. These results underscore the importance of integrating dropout into CNN designs, particularly when working with smaller datasets. Our analysis also reveals the balance required when selecting a dropout rate: excessively high rates can cause underfitting, while excessively low rates provide insufficient regularization. Ultimately, our study affirms dropout as a key technique for improving the robustness and reliability of deep learning models in computer vision.
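
To make the mechanism concrete, the following is a minimal, illustrative PyTorch sketch of a small CNN with a dropout layer. It is not the architecture evaluated in the paper; the 0.5 dropout rate, layer sizes, and 32x32 RGB input are assumptions chosen for demonstration only.

```python
import torch
import torch.nn as nn

class SmallCNN(nn.Module):
    """Illustrative small CNN; p is the dropout rate (hyperparameter to tune)."""
    def __init__(self, num_classes: int = 10, p: float = 0.5):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 32, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),   # 32x32 -> 16x16
            nn.Conv2d(32, 64, kernel_size=3, padding=1),
            nn.ReLU(),
            nn.MaxPool2d(2),   # 16x16 -> 8x8
        )
        self.classifier = nn.Sequential(
            nn.Flatten(),
            nn.Linear(64 * 8 * 8, 128),
            nn.ReLU(),
            # Dropout zeroes a random subset of activations on each training
            # forward pass; p=0.5 here is illustrative, not the paper's setting.
            nn.Dropout(p=p),
            nn.Linear(128, num_classes),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return self.classifier(self.features(x))

model = SmallCNN(p=0.5)
model.train()  # dropout active: a different random subset is dropped each step
model.eval()   # dropout disabled: the full network is used at inference
```

Note that dropout only acts in training mode; at evaluation time the full network is used, which is why the train/eval mode switch matters when measuring validation accuracy.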

Keywords

Convolutional Neural Networks, dropout, overfitting, image classification, regularization, generalization.

Citations

IRE Journals:
Rajat Gupta, Rakesh Jindal, Amisha Naik, K L Ganatre, "Impacts and Outcomes of Using Dropout Layers to Mitigate Overfitting in Convolutional Neural Networks," Iconic Research And Engineering Journals, Volume 6, Issue 6, 2022, pp. 392-407.

IEEE:
R. Gupta, R. Jindal, A. Naik, and K. L. Ganatre, "Impacts and Outcomes of Using Dropout Layers to Mitigate Overfitting in Convolutional Neural Networks," Iconic Research And Engineering Journals, vol. 6, no. 6, pp. 392-407, Dec. 2022.