# Overview

The Eye-Controlled Mouse Project is a hands-free, gaze-based interface that lets users control a computer cursor with eye movements and simple gestures. The system combines computer vision and machine learning to detect facial landmarks, interpret eye movements, and translate them into cursor motion and clicks on the screen. Built with OpenCV, MediaPipe, and PyAutoGUI, it offers an accessible interaction method that is particularly beneficial for users with limited hand mobility.

## Key Components

- **OpenCV**: Captures video input from a webcam, processes frames, and handles basic image transformations.
- **MediaPipe**: Detects facial landmarks, focusing on eye and iris positions, to monitor eye movement in real time.
- **PyAutoGUI**: Maps eye positions to screen coordinates so the mouse cursor follows the user's gaze, and simulates clicks when blinks are detected.

## Functionality Overview

1. **Frame Processing**: Video frames are captured from the webcam, flipped for a mirror effect, and converted to RGB format for compatibility with MediaPipe's face detection model.
2. **Facial Landmark Detection**: MediaPipe's FaceMesh model identifies landmarks around the eyes and iris, enabling precise tracking of eye movement and gaze direction.
3. **Cursor Control**: Using PyAutoGUI, the system maps eye position to screen coordinates, moving the cursor in response to the user's gaze.
4. **Blink-to-Click Mechanism**: A deliberate blink, detected by monitoring the eyelid landmarks, triggers a mouse click, enabling users to interact with the interface without physical input.

## Applications

This project is particularly suited to accessibility-focused applications, enabling individuals with physical disabilities to interact with computers. It can also be adapted for hands-free control in gaming, virtual environments, and healthcare settings where touchless interaction is required.
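The pipeline above can be sketched roughly as follows. This is a minimal illustration, not the project's actual code: the specific landmark indices (an iris point and the upper/lower eyelid points of one eye), the blink threshold, and the helper names are assumptions chosen for clarity. The coordinate-mapping and blink helpers are kept as pure functions so they can be reasoned about without a webcam.

```python
# Default screen size used by the pure helpers below; the real loop
# queries pyautogui.size() instead. Hard-coded here so the helpers
# work without a display (an illustrative assumption).
SCREEN_W, SCREEN_H = 1920, 1080

def gaze_to_screen(x_norm, y_norm, screen_w=SCREEN_W, screen_h=SCREEN_H):
    """Map a normalized landmark position (0..1) to pixel coordinates."""
    x = min(max(x_norm, 0.0), 1.0) * screen_w
    y = min(max(y_norm, 0.0), 1.0) * screen_h
    return int(x), int(y)

def is_blink(upper, lower, threshold=0.004):
    """A blink closes the vertical gap between the eyelid landmarks.

    `upper` and `lower` are (x, y) pairs in normalized coordinates;
    the threshold is an illustrative value that would need tuning.
    """
    return abs(upper[1] - lower[1]) < threshold

def run():
    # Heavy imports are local so the helpers above stay importable
    # even where OpenCV/MediaPipe/PyAutoGUI are not installed.
    import cv2
    import mediapipe as mp
    import pyautogui

    screen_w, screen_h = pyautogui.size()
    # refine_landmarks=True adds the iris landmarks to the mesh
    face_mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True)
    cam = cv2.VideoCapture(0)
    while True:
        ok, frame = cam.read()
        if not ok:
            break
        frame = cv2.flip(frame, 1)                    # mirror effect
        rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)  # MediaPipe expects RGB
        results = face_mesh.process(rgb)
        if results.multi_face_landmarks:
            lm = results.multi_face_landmarks[0].landmark
            iris = lm[475]  # one iris landmark (index is illustrative)
            pyautogui.moveTo(*gaze_to_screen(iris.x, iris.y,
                                             screen_w, screen_h))
            # 159/145: upper/lower eyelid of one eye (illustrative indices)
            if is_blink((lm[159].x, lm[159].y), (lm[145].x, lm[145].y)):
                pyautogui.click()
        if cv2.waitKey(1) & 0xFF == ord("q"):
            break
    cam.release()

# To start the tracker (requires a webcam and the libraries above):
# run()
```

Keeping the mapping and blink logic separate from the capture loop also makes it easy to swap in smoothing or calibration later without touching the camera code.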

## Future Enhancements

Potential future improvements include dynamic calibration to improve accuracy across various lighting conditions, more complex gesture recognition, and customization options for broader user accessibility. The eye-controlled mouse concept could inspire new interaction methods in accessible computing and beyond, broadening usability for diverse user groups.