Graduation Semester and Year
2020
Language
English
Document Type
Thesis
Degree Name
Master of Science in Computer Science
Department
Computer Science and Engineering
First Advisor
Deokgun Park
Abstract
Joint attention, where a caregiver and an infant follow each other's eye gaze, plays an important role in language learning for newborn infants. To study joint attention, it must be recorded and analyzed in a naturalistic environment, which can be done with a head-mounted eye tracker. However, natural interaction involves body movement, which affects the accuracy of the measurement. In this work, I evaluated the accuracy of the eye-tracking system in three different scenarios: (1) the subject sits still in front of the target; (2) the subject looks at the target while moving the head at a fixed location; and (3) the subject moves freely within the setup while looking at the target. To track the head pose, we use a built-in Pupil Player plugin that detects surface markers (AprilTags) in the world camera feed and generates the camera position and orientation in the marker coordinate system. The evaluation shows that gaze position accuracy and precision are better when the scene is lit by white LED lights than by yellow LEDs. The recordings were also replicated in a virtual environment, created in Unity3D, that is modeled after the real-world experimental setup. We expect this evaluation to contribute to the development of simulated environments for developmental robotics, which can provide simulated experiences such as interactions of a mother and a child during various stages of infant development (from the fetal stage to 12 months of age).
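The sketch below illustrates, in general terms, how gaze accuracy and precision of the kind reported above could be computed from recorded gaze samples and a known target direction. It is not code from the thesis: the data format, function names, and the specific definitions (accuracy as mean angular offset from the target, precision as the RMS of sample-to-sample angular differences, as commonly used in eye-tracking evaluations) are assumptions for illustration only.

```python
# Minimal sketch (not from the thesis): angular accuracy and precision of gaze
# samples relative to a known fixation target, using common eye-tracking
# definitions (accuracy = mean angular offset from the target direction;
# precision = RMS of successive sample-to-sample angular differences).
import numpy as np

def angle_between(v1, v2):
    """Angle in degrees between 3D direction vectors (broadcasts over rows)."""
    v1 = v1 / np.linalg.norm(v1, axis=-1, keepdims=True)
    v2 = v2 / np.linalg.norm(v2, axis=-1, keepdims=True)
    cos = np.clip(np.sum(v1 * v2, axis=-1), -1.0, 1.0)
    return np.degrees(np.arccos(cos))

def gaze_accuracy_precision(gaze_dirs, target_dir):
    """gaze_dirs: (N, 3) gaze direction vectors in the scene-camera frame.
    target_dir: (3,) direction from the camera to the fixation target.
    Returns (accuracy_deg, precision_deg)."""
    offsets = angle_between(gaze_dirs, target_dir[None, :])
    accuracy = offsets.mean()                      # mean angular offset
    successive = angle_between(gaze_dirs[:-1], gaze_dirs[1:])
    precision = np.sqrt(np.mean(successive ** 2))  # RMS of successive offsets
    return accuracy, precision

# Hypothetical usage with synthetic samples scattered around a target 1 m ahead.
rng = np.random.default_rng(0)
target = np.array([0.0, 0.0, 1.0])
samples = target + rng.normal(scale=0.01, size=(500, 3))
acc, prec = gaze_accuracy_precision(samples, target)
print(f"accuracy = {acc:.2f} deg, precision = {prec:.2f} deg")
```

In the moving-head and free-movement scenarios, the camera pose recovered from the AprilTag surface markers would additionally be used to express the gaze and target directions in a common marker (world) coordinate system before computing these measures.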
Keywords
Joint attention, Saccade, Gaze point, Fixation
Disciplines
Computer Sciences | Physical Sciences and Mathematics
License
This work is licensed under a Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License.
Recommended Citation
Narasimhan, Sanath, "Evaluating the Accuracy of Gaze Detection for Moving Character" (2020). Computer Science and Engineering Theses. 393.
https://mavmatrix.uta.edu/cse_theses/393
Comments
Degree granted by The University of Texas at Arlington