Combining eye-tracking data with an analysis of video content from free-viewing a video of a walk in an urban park environment
Amati, Marco, McCarthy, Chris, Parmehr, Ebadat Ghanbari and Sita, Jodi. (2019). Combining eye-tracking data with an analysis of video content from free-viewing a video of a walk in an urban park environment. Journal of Visualized Experiments. (147), p. e58459.
|Authors||Amati, Marco, McCarthy, Chris, Parmehr, Ebadat Ghanbari and Sita, Jodi|
As individuals increasingly live in cities, methods to study their everyday movements, and the data those methods can collect, become important and valuable. Eye-tracking data are known to relate to a range of feelings, health conditions, mental states and actions. But because vision is the result of constant eye movements, separating what is important from what is noise is complex and data intensive. Furthermore, a significant challenge is controlling for what people look at compared with what is presented to them.
The following presents a methodology for combining and analyzing eye-tracking data collected while viewing a video of a natural and complex scene with a machine-learning technique for analyzing the content of the video. In the protocol we focus on analyzing data from filmed videos, how a video can best be used to record participants' eye-tracking data, and, importantly, how the content of the video can be analyzed and combined with the eye-tracking data. We present a brief summary of the results and a discussion of the potential of the method for further studies in complex environments.
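The core combination step described above — relating where participants look to what the video shows at that moment — can be sketched as follows. This is a minimal illustration, not the authors' implementation: it assumes gaze samples as (frame, x, y) tuples and a per-frame grid of content labels (e.g., output of a semantic-segmentation model); all names and data are hypothetical.

```python
# Hypothetical sketch: tallying gaze samples by video content class.
# Assumes gaze_samples is a list of (frame, x, y) tuples and label_maps
# maps each frame index to a 2D grid of class names (illustrative only).
from collections import Counter

def dwell_by_content(gaze_samples, label_maps):
    """Count how many gaze samples fall on each content class."""
    counts = Counter()
    for frame, x, y in gaze_samples:
        labels = label_maps[frame]   # 2D grid: labels[row][col]
        counts[labels[y][x]] += 1    # look up the class under the gaze point
    return counts

# Toy example: one 2x2 "frame" and three gaze samples.
label_maps = {0: [["tree", "sky"], ["tree", "path"]]}
gaze = [(0, 0, 0), (0, 0, 1), (0, 1, 0)]
print(dwell_by_content(gaze, label_maps))  # Counter({'tree': 2, 'sky': 1})
```

In practice the label grid would come from a per-frame classifier and the counts would be normalized by sampling rate to give dwell time per content class.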
|Journal||Journal of Visualized Experiments|
|Journal citation||(147), p. e58459|
|Publisher||Journal of Visualized Experiments|
|Funder||Australian Research Council (ARC)|
All rights reserved
|Publication process dates|
|Deposited||13 May 2021|