Eye Vergence and Gaze Sensing
Project developed at: Harman International
This system leverages eye gaze and vergence detection to enable hands-free, intuitive interaction with transparent displays or head-up displays (HUDs) in vehicles. By detecting where the driver is looking or focusing, the interface dynamically adapts to provide relevant, context-aware information without requiring manual input. Drivers can use eye contact, gestures, or simply focus on an area of interest to interact seamlessly with the system.
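As a rough illustration of the interaction loop (not the actual Harman implementation), a context-aware display can be reduced to hit-testing the tracker's gaze point against named display regions and requiring a short dwell before reacting, so the UI responds only to sustained attention. The region names, coordinates, and dwell threshold below are hypothetical placeholders.

```python
from dataclasses import dataclass
import time

@dataclass
class Region:
    """A named rectangular display region in normalized (0..1) coordinates."""
    name: str
    x: float
    y: float
    w: float
    h: float

    def contains(self, gx: float, gy: float) -> bool:
        return self.x <= gx <= self.x + self.w and self.y <= gy <= self.y + self.h

class DwellSelector:
    """Report a region only after the gaze has rested on it for `dwell_s` seconds."""
    def __init__(self, regions, dwell_s=0.4):
        self.regions = regions
        self.dwell_s = dwell_s
        self._current = None
        self._since = None

    def update(self, gx, gy, now=None):
        now = time.monotonic() if now is None else now
        hit = next((r for r in self.regions if r.contains(gx, gy)), None)
        if hit is not self._current:
            # Gaze moved to a new region (or off all regions): restart the dwell timer.
            self._current, self._since = hit, now
            return None
        if hit is not None and now - self._since >= self.dwell_s:
            return hit.name  # sustained attention: surface this region's content
        return None

# Example usage with hypothetical HUD regions and a streamed gaze sample:
selector = DwellSelector([Region("navigation", 0.0, 0.0, 0.3, 0.3),
                          Region("speed", 0.7, 0.0, 0.3, 0.3)])
selected = selector.update(0.1, 0.1)
```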
I designed and built the system, including the projection-based transparent display, interaction methods, and vergence detection, using a Tobii eye-tracking device. The prototype demonstrates potential applications such as enhancing navigation with gaze-triggered guidance, displaying critical information in real time, and improving safety.
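For illustration, here is a minimal sketch of how vergence can be turned into a depth estimate: the left and right gaze rays reported by a binocular tracker rarely intersect exactly, so one common approach is to take the midpoint of the shortest segment between them as the fixation point. The function below is a hypothetical NumPy helper under that assumption; it is not the patented method and not part of the Tobii SDK.

```python
import numpy as np

def vergence_fixation_point(o_l, d_l, o_r, d_r, eps=1e-9):
    """Estimate the 3D fixation point from binocular gaze rays.

    o_l, o_r: 3D positions of the left/right eyes (ray origins).
    d_l, d_r: gaze direction vectors for each eye.
    Returns (fixation_point, depth), where depth is the distance from the
    midpoint between the eyes to the estimated fixation point.
    """
    o_l, d_l = np.asarray(o_l, float), np.asarray(d_l, float)
    o_r, d_r = np.asarray(o_r, float), np.asarray(d_r, float)
    d_l = d_l / np.linalg.norm(d_l)
    d_r = d_r / np.linalg.norm(d_r)

    # Closest points between the two gaze rays: minimize
    # |(o_l + t_l * d_l) - (o_r + t_r * d_r)| over t_l, t_r.
    w = o_l - o_r
    a, b, c = d_l @ d_l, d_l @ d_r, d_r @ d_r
    d, e = d_l @ w, d_r @ w
    denom = a * c - b * b
    if abs(denom) < eps:
        # Near-parallel rays: the driver is fixating at (near-)infinity.
        return None, float("inf")
    t_l = (b * e - c * d) / denom
    t_r = (a * e - b * d) / denom
    p_l = o_l + t_l * d_l
    p_r = o_r + t_r * d_r
    fixation = (p_l + p_r) / 2.0
    depth = float(np.linalg.norm(fixation - (o_l + o_r) / 2.0))
    return fixation, depth
```

A depth estimate like this is what lets the system distinguish a glance at the windshield itself from a gaze focused on objects far down the road, which is the cue the prototype used to decide when HUD content is actually being attended to.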
This early prototype paved the way for further advancements, such as:
directed alerts, such as gaze-directed honking at a specific vehicle or person
in-cabin projection systems that allow important information to appear where the driver is gazing
stress and cognitive load analysis
and AR/VR/XR applications.
The project culminated in a working proof of concept and an accompanying UX demonstration video.
Drawings from the granted patent describing the methods used in the engineering prototypes
Company: Harman International
Team: Future Experience Team
Location: Palo Alto, CA
Year: 2014-2016
The image highlights several systems and patents the HARMAN FX team developed around eye gaze and vergence detection. These include adaptive HUD interfaces, gaze-enabled navigation, detecting a driver's interest in roadside billboards, directed alerts, and methods to assess driver focus and cognitive load, showcasing the team's contributions to intuitive, context-aware automotive technologies.