Top 7 Use Cases for 3D Eye Tracking

Eyes are the window to the soul. Whether or not this is true, it is undeniable how much the eyes can tell us about one another. The cooperative eye hypothesis suggests that the human eye has evolved to increase the size and brightness of the white sclera so that we can communicate better by following each other’s gaze. By combining this human feature with new technologies, like 3D eye tracking, we can further advance the way we communicate with each other and with the machines around us.

Speech recognition, motion detection, and touch control are all elements that we have seen integrated into everyday devices. One key element is still missing: looking into the eyes of the user and understanding their attention, interest, and engagement, as people naturally do when interacting with each other.

The introduction of 3D eye tracking, i.e. using consumer depth sensing cameras to track the eye-gaze, now opens up exciting new applications for human-machine interaction. Here are the top 7 applications:

 

  1. Retail

Instead of asking a pre-selected focus group to wear glasses for a market research study on shelf attention, potentially influencing their behavior, 3D eye tracking makes it possible to track the gaze of passersby accurately and non-intrusively. By monitoring the attention of customers, retailers can analyze product performance. 3D eye tracking can provide shelf-attention analytics that identify the number of views and focal points, without the need for calibration. Market research companies can then use the resulting insights to suggest optimizations to retailers with respect to their sales approach and customer experience.
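
At its core, this kind of shelf-attention analytics comes down to aggregating gaze samples over regions of the shelf. The Python sketch below illustrates the idea; the `SHELF_ZONES` layout, the gaze samples, and the minimum-sample threshold are made-up assumptions for illustration, not part of any particular eye tracking SDK.

```python
# A minimal sketch of shelf-attention aggregation (illustrative assumptions only).
from collections import defaultdict

# Hypothetical shelf zones: name -> (x_min, y_min, x_max, y_max) in metres,
# expressed in the shelf plane's own coordinate system.
SHELF_ZONES = {
    "cereal_top":    (0.0, 1.2, 1.0, 1.6),
    "cereal_middle": (0.0, 0.8, 1.0, 1.2),
    "promo_endcap":  (1.0, 0.8, 1.5, 1.6),
}

def count_zone_views(gaze_points, min_samples=5):
    """Count how many gaze samples landed in each zone and flag
    zones that attracted enough samples to count as a 'view'."""
    hits = defaultdict(int)
    for x, y in gaze_points:
        for zone, (x0, y0, x1, y1) in SHELF_ZONES.items():
            if x0 <= x <= x1 and y0 <= y <= y1:
                hits[zone] += 1
    views = {zone: n for zone, n in hits.items() if n >= min_samples}
    return hits, views

# Example: a short, made-up stream of gaze intersections with the shelf plane.
samples = [(0.4, 1.3)] * 8 + [(1.2, 1.0)] * 3 + [(0.2, 0.9)] * 6
raw_hits, counted_views = count_zone_views(samples)
print(raw_hits)       # samples per zone
print(counted_views)  # zones with enough dwell to count as a view
```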

 

  2. Robotics

The use of 3D gaze tracking within robotics allows for a more natural and collaborative human-robot interaction. Robots currently cannot accurately read a person’s level of engagement. Given a “pair of eyes”, however, in the form of a depth sensing camera and eye tracking software that generates a 3D gaze vector, they can both detect attention towards objects and predict people’s intentions. Use cases are found in collaborative robots, educational robots, service robots, and assistive robots, especially when a wide range of movement and interaction with real objects is required.
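
To make the “detect attention towards objects” step concrete, here is a minimal Python sketch that picks the scene object closest to the 3D gaze ray, assuming the tracker provides a gaze origin and direction in the camera frame. The `attended_object` helper, the object positions, and the 5° angular threshold are illustrative assumptions, not a specific product’s API.

```python
# A minimal sketch of gaze-to-object attention detection (assumed inputs:
# gaze origin and direction in the depth camera's coordinate frame).
import numpy as np

def attended_object(gaze_origin, gaze_dir, objects, max_angle_deg=5.0):
    """Return the object whose centre lies closest to the gaze ray,
    provided it falls within a small angular cone around the gaze."""
    gaze_dir = gaze_dir / np.linalg.norm(gaze_dir)
    best_name, best_angle = None, np.inf
    for name, centre in objects.items():
        to_obj = np.asarray(centre, float) - gaze_origin
        to_obj = to_obj / np.linalg.norm(to_obj)
        angle = np.degrees(np.arccos(np.clip(np.dot(gaze_dir, to_obj), -1.0, 1.0)))
        if angle < best_angle:
            best_name, best_angle = name, angle
    return best_name if best_angle <= max_angle_deg else None

# Hypothetical scene: object centres in the camera's coordinate frame (metres).
scene = {"red_block": (0.2, -0.1, 0.8), "blue_cup": (-0.3, 0.0, 1.1)}
print(attended_object(np.array([0.0, 0.0, 0.0]),
                      np.array([0.24, -0.12, 0.96]), scene))  # -> "red_block"
```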

Watch a demo of this educational robot research project by Professor Joakim Gustaffson of the Royal Institute of Technology in Stockholm. The sensor used for the 3D eye tracking is an Intel RealSense camera placed in front of a Furhat robot.

 

  3. Automotive

The limited tracking box, as well as the need for a multi-sensor setup to track the gaze in 3D, limits the applicability of current eye tracking technologies in cars. Gaze recognition will solve two main issues with regard to human-car interaction.

Driver Monitoring

According to the European New Car Assessment Programme (NCAP), not only will driver monitoring be considered a “primary safety” standard by 2020, but they also suggest that “the assessment will evolve around how reliably and accurately the status of the driver is detected.” Therefore, being able to detect the driver’s attention, both to monitor the driver and to enable coordinated handovers in semi-autonomous cars, will be a must.
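
One building block of such a driver monitoring system is an eyes-off-road check. The Python sketch below flags glances away from the road that last too long; the region labels, timestamps, and the 2-second threshold are illustrative assumptions, not values taken from NCAP or any specific product.

```python
# A minimal sketch of distraction detection for driver monitoring:
# if the driver's gaze stays off the road region for longer than a
# threshold, flag it (all inputs here are made up for illustration).

def off_road_alerts(samples, max_off_road_s=2.0):
    """samples: list of (timestamp_seconds, region_label) tuples in time order.
    Yields the timestamps at which an off-road glance exceeded the threshold."""
    off_since = None
    for t, region in samples:
        if region == "road":
            off_since = None
        else:
            if off_since is None:
                off_since = t
            elif t - off_since >= max_off_road_s:
                yield t

# Example: the driver glances at the centre console for too long around t = 5-7.5 s.
stream = [(0.0, "road"), (1.0, "road"), (5.0, "console"),
          (6.0, "console"), (7.5, "console"), (8.0, "road")]
print(list(off_road_alerts(stream)))  # -> [7.5]
```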

Virtual Co-Pilot

Cars are robots. They already interact with the driver based on voice and gesture commands. However, attention sensing has been lagging behind. 3D eye tracking can enable the “seeing car assistant” to sense the driver’s and passengers’ gaze towards parts of the in-car infotainment system, such as the dashboard, central console, and mirrors. It can also allow natural interactions with augmented reality heads-up displays and even enable the virtual co-pilot to detect gaze towards objects outside the car.
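
Gaze-to-region mapping inside the cabin can be sketched as intersecting the 3D gaze ray with a handful of named volumes. The Python example below does this with the classic slab test against axis-aligned boxes; the `CABIN_REGIONS` geometry is entirely made up, and a real integration would use the vehicle’s actual interior model.

```python
# A minimal sketch of mapping a 3D gaze ray onto named in-car regions
# (dashboard, central console, mirrors). Geometry is illustrative only.
import numpy as np

# Hypothetical regions as axis-aligned boxes (min corner, max corner),
# in metres, in the cabin coordinate frame.
CABIN_REGIONS = {
    "dashboard":        ((-0.6, -0.2, 0.6), (0.6, 0.1, 0.9)),
    "central_console":  ((-0.1, -0.5, 0.3), (0.3, -0.2, 0.6)),
    "rear_view_mirror": ((-0.1, 0.3, 0.5), (0.1, 0.4, 0.7)),
}

def gazed_region(origin, direction):
    """Return the first region box the gaze ray hits, if any (slab method)."""
    origin, direction = np.asarray(origin, float), np.asarray(direction, float)
    best, best_t = None, np.inf
    for name, (lo, hi) in CABIN_REGIONS.items():
        with np.errstate(divide="ignore", invalid="ignore"):
            t1 = (np.asarray(lo) - origin) / direction
            t2 = (np.asarray(hi) - origin) / direction
        t_near = np.nanmax(np.minimum(t1, t2))
        t_far = np.nanmin(np.maximum(t1, t2))
        if t_near <= t_far and t_far >= 0 and t_near < best_t:
            best, best_t = name, t_near
    return best

# Example: driver's eyes at the origin, glancing down and forward.
print(gazed_region((0.0, 0.3, 0.0), (0.0, -0.3, 0.9)))  # -> "dashboard"
```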

 

  4. Consumer devices

Mobile phones, like the iPhone X or the Xiaomi Mi8, and laptops with user-facing depth sensing cameras can be upgraded to run 3D eye tracking without the need for an additional eye tracking device. The depth sensing camera can then detect attention towards parts of the screen, and possibly allow the user to navigate the device partially with their eyes. For gaming laptops, 3D gaze tracking can be used to heighten the gaming experience when interacting with avatars, or to dim info menus that are out of focus and declutter the screen.
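
A rough picture of how screen attention could drive such dimming: intersect the 3D gaze ray with the screen plane, convert the hit to pixel coordinates, and fade panels the user is not looking at. The Python sketch below assumes a hypothetical screen size, coordinate convention, and `PANELS` layout; none of these come from a specific device or SDK.

```python
# A minimal sketch of screen-attention mapping and focus-based dimming.
# Coordinate convention (assumed): eye position in metres relative to the
# screen centre, z pointing out of the screen towards the user; the gaze
# direction points back towards the screen (negative z). Screen is at z = 0.

SCREEN_W_M, SCREEN_H_M = 0.344, 0.193        # hypothetical 15.6" panel, metres
SCREEN_W_PX, SCREEN_H_PX = 1920, 1080

def gaze_to_pixel(origin, direction):
    """Intersect the gaze ray with the screen plane and return pixel coords."""
    ox, oy, oz = origin
    dx, dy, dz = direction
    if dz == 0:
        return None
    t = -oz / dz
    if t <= 0:
        return None
    x_m, y_m = ox + t * dx, oy + t * dy
    px = (x_m / SCREEN_W_M + 0.5) * SCREEN_W_PX
    py = (0.5 - y_m / SCREEN_H_M) * SCREEN_H_PX
    if 0 <= px < SCREEN_W_PX and 0 <= py < SCREEN_H_PX:
        return px, py
    return None

# Hypothetical UI panels in pixel coordinates: name -> (x0, y0, x1, y1).
PANELS = {"game_view": (0, 0, 1520, 1080), "info_menu": (1520, 0, 1920, 1080)}

def panel_opacities(gaze_px, dimmed=0.4):
    """Full opacity for the panel under the gaze, dimmed for the rest."""
    out = {}
    for name, (x0, y0, x1, y1) in PANELS.items():
        focused = gaze_px is not None and x0 <= gaze_px[0] < x1 and y0 <= gaze_px[1] < y1
        out[name] = 1.0 if focused else dimmed
    return out

# Example: eyes 50 cm from the screen, glancing slightly left of centre.
hit = gaze_to_pixel((0.0, 0.0, 0.5), (-0.05, 0.0, -1.0))
print(hit, panel_opacities(hit))
```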

 

  5. Assistive technology solutions

Customized software solutions using consumer depth sensing cameras can assist less-abled users in controlling a computer, with mouse functions mapped onto head movements, facial gestures, and gaze tracking. This enhanced assistive capability allows the user to communicate more easily. Users can also guide a robotic assistive arm towards their object of interest, reducing the need for tedious joystick control to a minimum. In the video below, Henny Admoni, Assistant Professor in the Robotics Institute at Carnegie Mellon University, presents her research on how gaze recognition can improve human-robot coordination.
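
As an example of how gaze can be mapped onto mouse functions, the Python sketch below smooths incoming gaze points into a cursor position and emits a click when the cursor dwells in one spot long enough. The `DwellClicker` class, its smoothing factor, dwell radius, and dwell time are hypothetical choices for illustration, not parameters of any particular assistive product.

```python
# A minimal sketch of gaze-driven pointer control with a dwell-to-click rule.

class DwellClicker:
    """Smooth incoming gaze points and emit a 'click' when the pointer
    stays within a small radius for long enough."""

    def __init__(self, smoothing=0.3, dwell_radius_px=40, dwell_time_s=1.0):
        self.smoothing = smoothing
        self.dwell_radius = dwell_radius_px
        self.dwell_time = dwell_time_s
        self.cursor = None      # smoothed cursor position
        self.anchor = None      # where the current dwell started
        self.anchor_t = None    # when the current dwell started

    def update(self, t, gaze_px):
        """Feed one (timestamp, (x, y)) gaze sample; return (cursor, clicked)."""
        x, y = gaze_px
        if self.cursor is None:
            self.cursor = (x, y)
        else:
            cx, cy = self.cursor
            a = self.smoothing
            self.cursor = (cx + a * (x - cx), cy + a * (y - cy))
        cx, cy = self.cursor
        moved_away = (self.anchor is None or
                      (cx - self.anchor[0]) ** 2 + (cy - self.anchor[1]) ** 2
                      > self.dwell_radius ** 2)
        if moved_away:
            self.anchor, self.anchor_t = (cx, cy), t
            return self.cursor, False
        clicked = t - self.anchor_t >= self.dwell_time
        if clicked:
            self.anchor, self.anchor_t = (cx, cy), t   # re-arm after a click
        return self.cursor, clicked

# Example: the user fixates near (500, 300) for just over a second.
dc = DwellClicker()
for i in range(14):
    cursor, clicked = dc.update(0.1 * i, (500, 300))
    if clicked:
        print(f"click at {cursor} (t={0.1 * i:.1f}s)")
```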

 

  6. Advertising

Advertisers can better analyze the attention and focus of consumers on outdoor advertisements by using 3D eye tracking that does not require any calibration process. Audience analytics can report the number of views of the advert, the emotional reception, the most attentive time of day, and other real-time statistics that are beneficial to the advertiser.
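
A “most attentive time of day” figure is essentially a roll-up of individual view events. The short Python sketch below shows one way to bucket views and dwell time by hour; the event format and the `attention_by_hour` helper are assumptions made for illustration.

```python
# A minimal sketch of rolling up advert view events into hourly statistics.
from collections import Counter
from datetime import datetime

def attention_by_hour(view_events):
    """view_events: iterable of (ISO timestamp, dwell_seconds) for each viewer
    whose gaze landed on the advert. Returns view counts and total dwell per hour."""
    views, dwell = Counter(), Counter()
    for ts, seconds in view_events:
        hour = datetime.fromisoformat(ts).hour
        views[hour] += 1
        dwell[hour] += seconds
    return views, dwell

# Example: a handful of made-up view events over one afternoon.
events = [("2019-06-03T12:15:00", 2.4), ("2019-06-03T12:40:00", 1.1),
          ("2019-06-03T17:05:00", 3.8), ("2019-06-03T17:45:00", 0.9)]
views, dwell = attention_by_hour(events)
print(views.most_common(1))  # hour with the most views
print(dwell)                 # total dwell time per hour
```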

 

  7. Behavioral research

Eye tracking software is often used to help diagnose certain behavioral conditions. Conditions such as ADHD and autism can be more easily detected by tracking how people visually consume information. However, especially with children, it can often be a challenge to keep them within the tracking range of current eye trackers, and sometimes impossible to have them follow a strict calibration procedure. 3D eye tracking can alleviate these pain points with a wider tracking range, support for extreme head positions, and the option of uncalibrated tracking.