Eyeware: Sponsor and Exhibitor at the 2019 Human-Robot Interaction (HRI) Conference, Daegu, South Korea

The International Conference on Human-Robot Interaction aims to bring together the best researchers from all over the world and give them the opportunity to present their innovative projects. The conference focuses on exchanging ideas about the latest technological developments in HRI. In an era driven by rapid technological progress, this annual meeting is an invaluable opportunity to keep up with the latest findings in the field. More importantly, it inspires and supports multidisciplinary research.

# HRI’s 2019 Focus: Collaborative HRI

While last year’s conference in Chicago focused on “Robots for Social Good,” this year’s theme was “Collaborative HRI.”

HRI 2019 General Chair Jung Kim delivering his address at the conference banquet in Daegu, South Korea

Among the HRI Keynote Speakers, we highlight:

Kyu-Jin Cho – Professor at Seoul National University. He has been exploring novel soft robot designs, including a water-jumping robot, origami robots, and a soft wearable robot for the hand, called Exo-Glove Poly, that received coverage by over 300 news media outlets.

Janet Vertesi – Post-Doctoral Fellow at Princeton University’s Society of Fellows in the Liberal Arts; Lecturer in Sociology. The majority of her research is on robotic spacecraft teams at NASA and how organizational factors affect and reflect planetary scientists’ activities and scientific results.

Jangwon Lee – Ph.D. candidate in computer music at the Music and Audio Computing Lab. He is a music researcher focusing on mobile music interaction, specifically how new interaction methods and constraints of mobile devices can be used in expressive computer music. He is also a member of the Korean band Peppertones, active as a television entertainer and computer music researcher.

Gil Weinberg – Ph.D. in Media Arts and Sciences from MIT. His research focuses on developing artificial creativity and musical expression for robots and augmented humans. Human-robot interaction and music technology are his main focus for analysis and applications.


Human–plush toy interaction at the HRI 2019 Conference, Eyeware’s booth

At the International Conference on Human-Robot Interaction 2019, Eyeware showcased GazeSense, its 3D eye tracking software, to enable the interaction between humans and inanimate objects. By simply using gaze tracking, the visitors interacted with Extremely Rabbit and Grizzly Bear at the Eyeware booth.

Jung Kim (HRI 2019 General Chair) interacting with Grizzly Bear through Eyeware’s 3D eye tracking software

As a silver sponsor of HRI 2019 in South Korea, Eyeware actively participated in the event. The image below shows our colleague Bastjan Prenaj, the company’s CDBO, delivering a demo of the GazeSense 3D software. When Bastjan directs his attention towards Extremely Rabbit, the latter starts interacting with him and requests his undivided attention. Grizzly Bear, on the other hand, was feeling flirtatious at this conference: each time someone directed their attention towards him, he tested Joey Tribbiani’s famous pickup line on them: #HowYOUdooin? If you are wondering how this interaction between the stuffed toys and the audience is possible, read on to learn how GazeSense 3D makes this scenario work.

Eyeware’s GazeSense 3D demo setup with Extremely Rabbit and Grizzly Bear

# How Does GazeSense 3D Work?

GazeSense 3D is an eye tracking software for depth-sensing cameras. The program can track attention towards objects of interest in an open space. For the demo scenario above, presented at the HRI conference, the software captured the visitors’ gaze and attention and used them to trigger interactive behavior by the plush animal toys. Extremely Rabbit and Grizzly Bear “talked” when the user looked at them. Had the toys been replaced with other objects, the interaction would have worked the same way.
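To make the idea concrete, the trigger logic behind such a demo can be sketched as a small attention-threshold loop: the tracker reports a per-frame attention score for each registered object, and a toy reacts once when the user’s gaze settles on it. Everything below is a hypothetical illustration written for this post, not the actual GazeSense API:

```python
# Hypothetical sketch: fire a toy's reaction on the rising edge of attention.
# Assumes a gaze tracker that reports, per frame, an attention score in [0, 1]
# for each registered object of interest. All names here are made up.

class AttentionTrigger:
    """Invokes an object's callback when the user starts looking at it."""

    def __init__(self, threshold: float = 0.8):
        self.threshold = threshold
        self.callbacks = {}      # object name -> reaction callback
        self.was_attended = {}   # object name -> attended in previous frame?

    def register(self, name, callback):
        self.callbacks[name] = callback
        self.was_attended[name] = False

    def update(self, scores):
        """scores: dict mapping object name -> attention score this frame."""
        for name, callback in self.callbacks.items():
            attended = scores.get(name, 0.0) >= self.threshold
            # Trigger only on the "not looking" -> "looking" transition,
            # so a toy speaks once per glance rather than on every frame.
            if attended and not self.was_attended[name]:
                callback()
            self.was_attended[name] = attended


# Usage: wire two plush toys to the trigger and feed simulated frames.
spoken = []
trigger = AttentionTrigger(threshold=0.8)
trigger.register("rabbit", lambda: spoken.append("Rabbit: look at me!"))
trigger.register("bear", lambda: spoken.append("Bear: how YOU doin'?"))

frames = [{"rabbit": 0.1}, {"rabbit": 0.9}, {"rabbit": 0.95}, {"bear": 0.85}]
for frame in frames:
    trigger.update(frame)

print(spoken)  # ["Rabbit: look at me!", "Bear: how YOU doin'?"]
```

The rising-edge check is the key design choice: without it, a toy would repeat its line on every camera frame for as long as the visitor keeps looking.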

Left display: Bastjan using the GazeSense 3D gaze tracking feature

Right display: a 3D visualization of the recognized objects Bastjan is gazing at

Let us know what other 3D eye tracking perspectives or applied solutions you’d like to learn more about via an email to [email protected], or reach out on any of the social networks we are active on. Please find us on Twitter, Facebook, or LinkedIn.