EyeC.Design - System Documentation (81 Pages) Version: 1.4 Date: 4/24/2025 Author: Dominic Minnich (Team Leader) https://docs.google.com/document/d/1INnyCdoIOgQ42BxF1yHccJCTsOQfEroU_haqPNBlpD8/edit?tab=t.0
EyeC.Design - Project Purpose and Usefulness Version: 1.2 Date: 4/24/2025 Author: Dominic Minnich (Team Leader) https://docs.google.com/document/d/1AasbVFCKSv3hBNE7Xbs7jtFaS8p8Nv7_ja1ToKxKyaU/edit?tab=t.0
This project is a Senior Capstone focused on developing an innovative software solution that enables remote eye-tracking during user experience (UX) testing. The software utilizes the built-in camera on users’ personal computers to capture and analyze eye movement data remotely, offering comprehensive insights into user interactions with digital products.
- Sponsor: PFW CS Department
- Faculty Advisor: Prof. Jay Johns
- Dominic Minnich (Team Leader)
- Kyle Benich
- Logan Smith
- Sulaiman Hussain
- ☒ Application Development
- ☐ Research-focused
- ☐ Information Systems
The goal of this project is to develop a software tool that allows for remote UX testing by tracking eye movements using the built-in cameras of users' devices. By leveraging advanced algorithms, the software processes video feeds to provide quantitative data on user focus areas during digital interactions.
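To make the idea concrete, the short sketch below uses OpenCV's bundled Haar cascades to locate eye regions in webcam frames. This is only an illustrative stand-in: the project's actual capture path runs in the browser via WebGazer.js (see the technology stack below), and the cascade-based detection shown here is an assumption for demonstration, not the production algorithm.

```python
# Minimal sketch: detect eye regions in webcam frames with OpenCV.
# Illustration only; the cascade-based approach is an assumption for demo purposes.
import cv2

face_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
eye_cascade = cv2.CascadeClassifier(cv2.data.haarcascades + "haarcascade_eye.xml")

cap = cv2.VideoCapture(0)  # default webcam
while True:
    ok, frame = cap.read()
    if not ok:
        break
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    for (x, y, w, h) in face_cascade.detectMultiScale(gray, 1.3, 5):
        roi = gray[y:y + h, x:x + w]
        for (ex, ey, ew, eh) in eye_cascade.detectMultiScale(roi):
            # Each detected eye box could feed a downstream gaze-estimation step.
            cv2.rectangle(frame, (x + ex, y + ey), (x + ex + ew, y + ey + eh), (0, 255, 0), 2)
    cv2.imshow("eye regions", frame)
    if cv2.waitKey(1) & 0xFF == ord("q"):
        break
cap.release()
cv2.destroyAllWindows()
```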
For a higher-resolution view of the vision diagram 🔎, see:
Vision_Explanation_Nov25.pdf
- Camera-Based Eye Tracking: Tracks eye movement using standard webcams, eliminating the need for specialized hardware.
- Remote Testing Capability: UX testers can participate from any location, increasing flexibility and diversity in testing environments.
- Data Collection and Analysis: Collects detailed data on user focus points, gaze paths, and interaction times, facilitating in-depth analysis.
- User Privacy and Security: Implements secure encryption, consent protocols, and data anonymization to protect user privacy.
- Real-Time and Post-Session Reporting: Offers real-time feedback during sessions and detailed reports afterward for comprehensive insights.
- Enhance Remote UX Testing: Provide a tool that removes geographical constraints, making UX testing more accessible.
- Improve Data Accuracy: Capture real-time eye movement data to gain precise insights into user behavior and preferences.
- Increase Accessibility: Use widely available hardware to make high-quality UX testing available to a broader audience.
- UX Researchers and Designers
- Product Managers
- Companies conducting large-scale remote UX testing
- PFW students in Software Development or UX courses
This software aims to revolutionize UX testing by eliminating physical barriers and providing detailed, actionable data. The resulting insights will enable teams to make informed design decisions, leading to improved user experiences across digital platforms.
- ☒ 4 Members
- ☐ >4 Members
- Frontend Development
- Backend Development
- Data Analysis
- JavaScript: For building interactive, real-time interfaces.
- HTML/CSS: For site structure and styling.
- WebRTC: For handling real-time communication and video streams.
- Python: Ideal for computer vision and data analysis.
- Flask: Framework for building scalable web applications.
- OpenCV: For image processing and eye-tracking algorithms.
- TensorFlow or PyTorch: For advanced computer vision tasks.
- WebGazer.js: Browser-based eye-tracking library.
- SQLite: Lightweight embedded database for persisting application data (see the sketch after this list).
- Pandas/Numpy: For data manipulation and analysis.
- Matplotlib/Seaborn: For creating detailed visual reports.
- Jupyter Notebooks: For shareable analysis reports and exploratory data analysis.
- Docker: For containerizing the application.
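To illustrate how the backend pieces of this stack could fit together, here is a minimal, hypothetical sketch of a Flask endpoint that accepts gaze samples posted from the browser (for example by WebGazer.js) and stores them in SQLite. The `/api/gaze` route, table layout, and JSON field names are assumptions for illustration, not the project's actual API.

```python
# Hypothetical sketch: a Flask route that stores browser gaze samples in SQLite.
# The /api/gaze route, table schema, and JSON field names are assumptions.
import sqlite3
from flask import Flask, request, jsonify

app = Flask(__name__)
DB_PATH = "eyec.db"

def init_db():
    with sqlite3.connect(DB_PATH) as db:
        db.execute("""CREATE TABLE IF NOT EXISTS gaze_samples (
                          session_id TEXT,
                          t_ms       INTEGER,   -- time since session start
                          x          REAL,      -- gaze x in page pixels
                          y          REAL)      -- gaze y in page pixels
                   """)

@app.route("/api/gaze", methods=["POST"])
def record_gaze():
    sample = request.get_json()
    with sqlite3.connect(DB_PATH) as db:
        db.execute("INSERT INTO gaze_samples VALUES (?, ?, ?, ?)",
                   (sample["session_id"], sample["t_ms"], sample["x"], sample["y"]))
    return jsonify(status="ok")

if __name__ == "__main__":
    init_db()
    app.run(debug=True)
```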
Users can create, edit, and delete their profiles; guest users' data is not saved. Users can create new projects with tasks, descriptions, and URLs. The system tracks eye movements in real time using standard webcams and records sessions, combining screen recordings with eye-tracking overlays. Users select a role (e.g., Analyst, Project Manager, Admin) and perform the tasks permitted by that role.
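As a minimal sketch of the role model described above, the example below maps each role to a set of permitted actions. The role names come from the requirement; the specific permission sets are assumptions for illustration.

```python
# Hypothetical sketch of role-based access checks for the roles named above.
# Role names come from the requirements; the permission sets are assumptions.
PERMISSIONS = {
    "Admin":           {"create_project", "edit_project", "delete_project",
                        "run_session", "view_reports", "manage_users"},
    "Project Manager": {"create_project", "edit_project", "run_session", "view_reports"},
    "Analyst":         {"run_session", "view_reports"},
    "Guest":           {"run_session"},   # guest data is not persisted
}

def can(role: str, action: str) -> bool:
    """Return True if the given role is allowed to perform the action."""
    return action in PERMISSIONS.get(role, set())

assert can("Analyst", "view_reports")
assert not can("Guest", "create_project")
```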
- Performance: Must handle real-time data processing with minimal latency.
- Usability: The system should comply with WCAG 2.1 Level AA accessibility guidelines.
- Security: Use AES-256 encryption and multi-factor authentication (see the sketch below).
- Scalability: Support large datasets and concurrent users.
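One possible way to meet the AES-256 requirement is sketched below: encrypting a recorded session blob at rest with AES-256-GCM via the `cryptography` package. The library choice and the key handling shown here are assumptions rather than the project's final security design; in production the key would come from a key-management service, not be generated in-process.

```python
# Hypothetical sketch: encrypting a session recording at rest with AES-256-GCM.
# The `cryptography` package and key handling shown here are assumptions.
import os
from cryptography.hazmat.primitives.ciphers.aead import AESGCM

key = AESGCM.generate_key(bit_length=256)   # 32-byte key -> AES-256
aesgcm = AESGCM(key)

def encrypt_blob(plaintext: bytes, associated_data: bytes = b"eyec-session") -> bytes:
    nonce = os.urandom(12)                  # unique nonce per message
    return nonce + aesgcm.encrypt(nonce, plaintext, associated_data)

def decrypt_blob(blob: bytes, associated_data: bytes = b"eyec-session") -> bytes:
    nonce, ciphertext = blob[:12], blob[12:]
    return aesgcm.decrypt(nonce, ciphertext, associated_data)

sealed = encrypt_blob(b"screen+gaze recording bytes ...")
assert decrypt_blob(sealed) == b"screen+gaze recording bytes ..."
```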
- SSL/TLS: For securing data in transit.
- OAuth2: For secure user authentication and authorization.
- WebAssembly (Wasm): For high-performance computing in the browser.
- Scalability: Designed to handle large amounts of data and users.
- Performance: Supports real-time video processing and eye-tracking.
- Flexibility: Combines powerful backend data processing with a user-friendly frontend.
- Community Support: Strong community support for chosen technologies.
- Mobile Support: Add support for mobile phones to capture eye-tracking data.
- Heat Mapping: Generate heat maps from user behavior data, highlighting areas of focus on the screen (see the sketch below).
- Domain Expansion: Extend support beyond specific domains such as Figma to other established or custom websites.
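As a rough sketch of the heat-mapping idea, the example below bins gaze points into a 2-D histogram with NumPy and renders it with Matplotlib, both already part of the stack. The screen size, grid resolution, and randomly generated gaze points are purely illustrative.

```python
# Rough sketch of a gaze heat map using NumPy and Matplotlib.
# The random gaze points and screen dimensions below are illustrative only.
import numpy as np
import matplotlib.pyplot as plt

screen_w, screen_h = 1920, 1080
rng = np.random.default_rng(0)
# Stand-in for recorded gaze samples clustered around one area of interest.
gaze_x = rng.normal(loc=960, scale=200, size=5000).clip(0, screen_w)
gaze_y = rng.normal(loc=400, scale=150, size=5000).clip(0, screen_h)

heat, _, _ = np.histogram2d(gaze_x, gaze_y,
                            bins=[screen_w // 20, screen_h // 20],
                            range=[[0, screen_w], [0, screen_h]])

# Transpose so rows correspond to y (screen rows) and columns to x.
plt.imshow(heat.T, extent=[0, screen_w, screen_h, 0],
           cmap="hot", interpolation="bilinear")
plt.colorbar(label="gaze sample count")
plt.title("Gaze heat map (illustrative data)")
plt.savefig("heatmap.png", dpi=150)
```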
- Agile Development: The team will adopt an Agile methodology to enable continuous improvement based on testing and feedback.
- No NDA or IP Assignment Agreement requested.
This project will deliver a robust and scalable eye-tracking software solution that supports remote UX testing. By using widely available technology, it will provide valuable insights to enhance digital product design, making UX testing more accessible and effective.