This project presents an approach to Human-Computer Interaction (HCI) in which cursor movement is controlled through a real-time camera, offering an alternative to current methods such as pressing physical buttons or repositioning a physical computer mouse. Instead, it uses a camera and computer-vision techniques to trigger mouse events and is capable of performing every task that a physical computer mouse can. The virtual mouse gesture-recognition program continuously acquires real-time images, which undergo a series of filtering and conversion steps. Once this processing is complete, the program applies image-processing techniques to obtain the coordinates of the target hand gesture from the converted frames. It then compares the gesture detected in the frames against a list of fingertip combinations, where each combination corresponds to a different mouse function.
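The final step of the pipeline, matching a detected gesture against a list of fingertip combinations, can be sketched as a simple lookup table. The finger ordering (thumb, index, middle, ring, pinky) and the specific combination-to-action pairs below are illustrative assumptions, not the paper's exact mapping:

```python
# Hypothetical gesture-to-action lookup: each tuple of per-finger up/down
# flags (thumb, index, middle, ring, pinky) maps to one mouse function.
GESTURE_ACTIONS = {
    (0, 1, 0, 0, 0): "move_cursor",   # index finger raised
    (0, 1, 1, 0, 0): "left_click",    # index + middle raised
    (1, 1, 0, 0, 0): "right_click",   # thumb + index raised
    (1, 1, 1, 1, 1): "scroll",        # open palm
}

def classify_gesture(fingers_up):
    """Map a sequence of per-finger up/down flags to a mouse action.

    Returns 'no_action' for combinations not in the lookup table,
    so unrecognized hand poses are safely ignored.
    """
    return GESTURE_ACTIONS.get(tuple(fingers_up), "no_action")
```

In a full system, the per-finger flags would come from fingertip coordinates extracted by the image-processing stage, and the returned action name would be dispatched to an OS-level mouse API.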
Hand Gesture Recognition, Deep Learning, Convolutional Neural Networks (CNN), Arduino, Real-time Processing, Gesture-to-Action, Latency, Volume Control, Robotic Arm Control, Python
IRE Journals:
Sahil Sharad Konde, Avinash Ashok Shinde, Om Navnath Pasalkar, Chaitanya Dattatray Salunke, Prof. Neeta Dimble
"Hand Gesture Controller Using Deep Learning", Iconic Research And Engineering Journals, Volume 8, Issue 9, 2025, Pages 1217-1220
IEEE:
S. S. Konde, A. A. Shinde, O. N. Pasalkar, C. D. Salunke, and N. Dimble, "Hand Gesture Controller Using Deep Learning," Iconic Research And Engineering Journals, vol. 8, no. 9, pp. 1217-1220, 2025.