TY  - JOUR
AU  - T Layaraja
AU  - T Akhila
AU  - V Harshini Reddy
AU  - Y Raghunandhu
AU  - V Pranav Reddy
PY  - 2026
DA  - 2026/02/09
TI  - A Deep Learning-Driven Real-Time Eye Gesture Recognition Framework for Intelligent Hands-Free Multimedia Control Systems
JO  - Global Journal of Engineering Innovations and Interdisciplinary Research
VL  - 6
IS  - 2
AB  - Hands-free human-computer interaction is essential for accessibility, especially for individuals with motor impairments, and enhances user experience in multimedia applications. Traditional input devices like keyboards and mice limit usability in scenarios requiring contactless control. This paper proposes a deep learning-driven real-time eye gesture recognition framework for intelligent hands-free multimedia control systems. The system utilizes webcam-captured video streams to detect and classify eye gestures (e.g., blink, double blink, gaze left/right/up/down) using a hybrid Convolutional Neural Network (CNN) and Long Short-Term Memory (LSTM) architecture, augmented with MediaPipe for facial landmark extraction and Eye Aspect Ratio (EAR) computation. Recognized gestures map to multimedia commands such as play/pause, volume adjustment, next/previous track, and fullscreen toggle. The framework ensures low-latency inference (<50 ms per frame) on standard hardware. Experimental evaluation on custom and public datasets (e.g., EYEDIAP, custom multimedia control gestures) demonstrates high accuracy (96.5%), precision (95.8%), recall (96.2%), F1-score (96.0%), and robustness to lighting variations and head pose changes. The proposed system promotes inclusive, intuitive control for media players, smart TVs, and virtual environments while maintaining privacy through on-device processing.
SN  - 3066-1226
UR  - https://dx.doi.org/10.33425/3066-1226.1212
DO  - 10.33425/3066-1226.1212
ER  - 