An innovative project that enables users to control the computer cursor using eye movements, enhancing accessibility and interaction for individuals with physical disabilities or those seeking hands-free computing solutions.
- Eye Tracking: Detects and tracks the user's eye movements in real time.
- Cursor Control: Moves the computer cursor based on gaze direction.
- Click Simulation: Supports blinking or gaze dwell for left/right mouse clicks.
- Custom Calibration: Allows users to calibrate the system for improved accuracy.
- Multi-Platform Support: Compatible with major operating systems.
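Blink-based clicking, as mentioned under Click Simulation, is commonly built on the eye aspect ratio (EAR): the ratio of the eye's vertical landmark distances to its horizontal width, which drops sharply when the eye closes. The sketch below is one possible approach, not this project's confirmed implementation; the landmark ordering, threshold, and frame count are illustrative assumptions a real system would tune.

```python
import math

def eye_aspect_ratio(eye):
    """Compute the eye aspect ratio (EAR) from six (x, y) eye landmarks.

    `eye` is ordered [left_corner, top1, top2, right_corner, bottom2, bottom1],
    following the common EAR convention. Open eyes give a larger ratio;
    closed eyes give a ratio near zero.
    """
    def dist(a, b):
        return math.hypot(a[0] - b[0], a[1] - b[1])

    vertical = dist(eye[1], eye[5]) + dist(eye[2], eye[4])
    horizontal = 2.0 * dist(eye[0], eye[3])
    return vertical / horizontal

class BlinkDetector:
    """Count an intentional blink when the EAR stays below a threshold
    for several consecutive frames (both defaults are assumed values)."""

    def __init__(self, threshold=0.21, min_frames=3):
        self.threshold = threshold
        self.min_frames = min_frames
        self.closed_frames = 0

    def update(self, ear):
        """Feed one EAR sample per frame; returns True when a blink completes."""
        if ear < self.threshold:
            self.closed_frames += 1
            return False
        blinked = self.closed_frames >= self.min_frames
        self.closed_frames = 0
        return blinked
```

Requiring several consecutive closed frames filters out the brief EAR dips that natural, involuntary blinks and landmark jitter produce, so only deliberate blinks trigger a click.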
- Eye Detection: Uses a webcam to capture real-time video of the user's face and eyes.
- Gaze Estimation: Detects the position and movement of the eyes using computer vision.
- Cursor Mapping: Maps the gaze coordinates to the screen space to control the cursor.
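The Cursor Mapping step can be sketched as a linear interpolation from a calibrated gaze range (in normalized camera coordinates) to screen pixels. Because the iris moves through only a small sub-region of the camera frame, that calibrated sub-range is stretched to cover the full screen; the bounds below are placeholder values that the Custom Calibration step would refine, not values from this project.

```python
def gaze_to_screen(gx, gy, screen_w, screen_h,
                   cal=(0.35, 0.65, 0.35, 0.65)):
    """Map a normalized gaze point (gx, gy in [0, 1]) to screen pixels.

    `cal` holds the calibrated gaze range (min_x, max_x, min_y, max_y);
    the defaults are illustrative and would come from a calibration pass.
    """
    min_x, max_x, min_y, max_y = cal
    # Linearly rescale the calibrated gaze range to [0, 1].
    nx = (gx - min_x) / (max_x - min_x)
    ny = (gy - min_y) / (max_y - min_y)
    # Clamp so the cursor never leaves the screen.
    nx = min(max(nx, 0.0), 1.0)
    ny = min(max(ny, 0.0), 1.0)
    return int(nx * (screen_w - 1)), int(ny * (screen_h - 1))
```

For example, a gaze point at the center of the calibrated range maps to the center of a 1920x1080 screen, and points outside the range clamp to the screen edges.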
- Python 3.7 or newer
- A functional webcam
Install the required libraries by running:
```bash
pip install opencv-python mediapipe pyautogui numpy
```

Clone the repository:

```bash
git clone https://github.com/yourusername/eye-controlled-cursor.git
```

Planned enhancements:

- Enhanced Accuracy: Improve gaze estimation using advanced models.
- Support for Multiple Monitors: Expand functionality to multi-screen setups.
- Custom Gestures: Enable user-defined gestures for clicks and drag-and-drop.
- AI Integration: Use deep learning models for more robust eye tracking.
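Putting the capture, gaze-estimation, and cursor-mapping steps together, a minimal end-to-end loop might look like the sketch below. This is an illustrative outline, not the project's actual code: the iris-landmark index, the direct frame-to-screen mapping (which skips the calibration step described above), and the smoothing factor are all assumptions.

```python
def smooth(prev, new, alpha=0.3):
    """Exponential moving average to damp gaze jitter (alpha is a tuning guess)."""
    if prev is None:
        return new
    return tuple(alpha * n + (1 - alpha) * p for p, n in zip(prev, new))

def main():
    # Heavy dependencies are imported here so the helper above stays
    # importable without a webcam or the vision libraries installed.
    import cv2
    import mediapipe as mp
    import pyautogui

    screen_w, screen_h = pyautogui.size()
    cap = cv2.VideoCapture(0)  # default webcam
    # refine_landmarks=True adds iris landmarks to the face mesh output.
    face_mesh = mp.solutions.face_mesh.FaceMesh(refine_landmarks=True)

    prev = None
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        # MediaPipe expects RGB input; OpenCV captures BGR.
        results = face_mesh.process(cv2.cvtColor(frame, cv2.COLOR_BGR2RGB))
        if results.multi_face_landmarks:
            # Landmark 473 is approximately one iris center when
            # refine_landmarks=True (an assumption worth verifying).
            iris = results.multi_face_landmarks[0].landmark[473]
            # A real system would map through calibrated bounds here
            # instead of scaling the raw frame position directly.
            prev = smooth(prev, (iris.x * screen_w, iris.y * screen_h))
            pyautogui.moveTo(*prev)
        if cv2.waitKey(1) & 0xFF == 27:  # Esc to quit
            break
    cap.release()

if __name__ == "__main__":
    main()
```

The smoothing step matters in practice: raw landmark positions jitter frame to frame, and feeding them straight to `moveTo` makes the cursor shake.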