• The user’s gaze and attention trigger interactive behavior from world objects in a car-like environment: a sound file is played when the user looks at an object.
  • When the scenario starts, the in-cabin objects are visualized in the 3D View of the GazeSense user interface. Look at the different virtual objects; you will receive audio feedback when your attention to an object is recognized.
  • To edit the values of the predefined configuration:
    • 1. From the top bar menu, select Demo > Export project and choose the location where you want the automotive demo project exported.
    • 2. Open the config.yml file, adjust the values of the predefined parameters to match your own setup, and save the file.
    • 3. To load your adjusted values into GazeSense, select Demo > Load configuration and choose the config.yml file you modified in the previous step.
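  • Step 2 amounts to editing plain YAML key–value pairs. A minimal sketch of what such an edit might look like is shown below; the key names (`camera_position`, `audio_volume`) are hypothetical placeholders, not the actual GazeSense parameters — always use the keys present in your own exported config.yml:

```yaml
# Hypothetical example only — use the parameter names
# found in your exported config.yml, not these placeholders.
camera_position:
  x: 0.0      # meters, relative to the tracking origin
  y: -0.15
  z: 0.6
audio_volume: 0.8   # playback volume for object sound files, 0.0–1.0
```

  • Keep the file's original indentation when editing: YAML is whitespace-sensitive, and a mis-indented line will prevent the configuration from loading.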