Detecting individuals' spatial familiarity with urban environments using eye movement data
- Author
- Hua Liao, Wendi Zhao, Changbo Zhang, Weihua Dong and Haosheng Huang (UGent)
- Abstract
- Spatial familiarity with an environment is an important high-level user context for location-based services (LBS). Knowing users' level of familiarity with an environment helps enable context-aware LBS that automatically adapt their information services accordingly. Unlike state-of-the-art studies that used questionnaires, sketch maps, mobile phone positioning (GPS) data, and social media data to measure spatial familiarity, this study explored the potential of a new type of sensory data, eye movement data, to infer users' spatial familiarity with environments using a machine learning approach. We collected eye movement data from 38 participants while they performed map-based navigation tasks in familiar and unfamiliar urban environments. We trained and cross-validated a random forest classifier to infer whether users were familiar or unfamiliar with the environments (i.e., binary classification). By combining basic statistical features and fixation semantic features, we achieved a best accuracy of 81% with 10-fold cross-validation and 70% with leave-one-task-out (LOTO) cross-validation. We found that pupil diameter, fixation dispersion, saccade duration, and fixation count and duration on the map were the most important features for detecting users' spatial familiarity. Our results indicate that detecting users' spatial familiarity from eye tracking data is feasible during map-based navigation and that only a few seconds (e.g., 5 s) of eye movement data are sufficient for such detection. These results could be used to develop context-aware LBS that adapt their services to users' familiarity with the environment.
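As a rough illustration of the evaluation setup described in the abstract, the sketch below trains a random forest on synthetic stand-ins for the reported eye movement features and compares 10-fold cross-validation against a leave-one-task-out split (implemented here with scikit-learn's LeaveOneGroupOut). The feature names, data, and fold counts are hypothetical placeholders; this is not the authors' released code or dataset.

```python
# Minimal sketch, assuming per-window eye movement features such as pupil
# diameter, fixation dispersion, saccade duration, and fixation count/duration
# on the map (the paper uses windows as short as 5 s). All data are synthetic.
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.model_selection import (LeaveOneGroupOut, StratifiedKFold,
                                     cross_val_score)

rng = np.random.default_rng(0)

n_windows = 600
X = rng.normal(size=(n_windows, 5))         # 5 illustrative eye movement features
y = rng.integers(0, 2, size=n_windows)      # 0 = unfamiliar, 1 = familiar
tasks = rng.integers(0, 8, size=n_windows)  # navigation task each window came from

clf = RandomForestClassifier(n_estimators=300, random_state=0)

# 10-fold cross-validation: windows from the same task can land in both the
# training and test folds, which tends to yield optimistic accuracy.
kfold = StratifiedKFold(n_splits=10, shuffle=True, random_state=0)
kfold_acc = cross_val_score(clf, X, y, cv=kfold)
print(f"10-fold accuracy: {kfold_acc.mean():.2f}")

# Leave-one-task-out (LOTO): every window of one task is held out per fold,
# a stricter test of generalization to unseen navigation tasks.
loto_acc = cross_val_score(clf, X, y, groups=tasks, cv=LeaveOneGroupOut())
print(f"LOTO accuracy: {loto_acc.mean():.2f}")

# Feature importances: the paper reports pupil diameter and fixation
# dispersion among the most informative features.
clf.fit(X, y)
print(clf.feature_importances_)
```

The gap between the two scores mirrors the paper's 81% (10-fold) versus 70% (LOTO) results: grouping folds by task removes within-task leakage and gives a more honest estimate of real-world performance.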
- Keywords
- Urban Studies, General Environmental Science, Ecological Modeling, Geography, Planning and Development, Pedestrian navigation, Eye tracking, Machine learning, Random forest, Wayfinding, Spatial familiarity, Location-based services, Attention, Responses, Task, Load
Downloads
- (...).pdf | full text (Published version) | UGent only | 10.85 MB
Citation
Please use this URL to cite or link to this publication: http://hdl.handle.net/1854/LU-8735161
- MLA
- Liao, Hua, et al. “Detecting Individuals’ Spatial Familiarity with Urban Environments Using Eye Movement Data.” COMPUTERS ENVIRONMENT AND URBAN SYSTEMS, vol. 93, 2022, doi:10.1016/j.compenvurbsys.2022.101758.
- APA
- Liao, H., Zhao, W., Zhang, C., Dong, W., & Huang, H. (2022). Detecting individuals’ spatial familiarity with urban environments using eye movement data. COMPUTERS ENVIRONMENT AND URBAN SYSTEMS, 93. https://doi.org/10.1016/j.compenvurbsys.2022.101758
- Chicago author-date
- Liao, Hua, Wendi Zhao, Changbo Zhang, Weihua Dong, and Haosheng Huang. 2022. “Detecting Individuals’ Spatial Familiarity with Urban Environments Using Eye Movement Data.” COMPUTERS ENVIRONMENT AND URBAN SYSTEMS 93. https://doi.org/10.1016/j.compenvurbsys.2022.101758.
- Chicago author-date (all authors)
- Liao, Hua, Wendi Zhao, Changbo Zhang, Weihua Dong, and Haosheng Huang. 2022. “Detecting Individuals’ Spatial Familiarity with Urban Environments Using Eye Movement Data.” COMPUTERS ENVIRONMENT AND URBAN SYSTEMS 93. doi:10.1016/j.compenvurbsys.2022.101758.
- Vancouver
- 1. Liao H, Zhao W, Zhang C, Dong W, Huang H. Detecting individuals’ spatial familiarity with urban environments using eye movement data. COMPUTERS ENVIRONMENT AND URBAN SYSTEMS. 2022;93.
- IEEE
- [1] H. Liao, W. Zhao, C. Zhang, W. Dong, and H. Huang, “Detecting individuals’ spatial familiarity with urban environments using eye movement data,” COMPUTERS ENVIRONMENT AND URBAN SYSTEMS, vol. 93, 2022.
@article{8735161,
  abstract  = {{Spatial familiarity with an environment is an important high-level user context for location-based services (LBS). Knowing users' level of familiarity with an environment helps enable context-aware LBS that automatically adapt their information services accordingly. Unlike state-of-the-art studies that used questionnaires, sketch maps, mobile phone positioning (GPS) data, and social media data to measure spatial familiarity, this study explored the potential of a new type of sensory data, eye movement data, to infer users' spatial familiarity with environments using a machine learning approach. We collected eye movement data from 38 participants while they performed map-based navigation tasks in familiar and unfamiliar urban environments. We trained and cross-validated a random forest classifier to infer whether users were familiar or unfamiliar with the environments (i.e., binary classification). By combining basic statistical features and fixation semantic features, we achieved a best accuracy of 81% with 10-fold cross-validation and 70% with leave-one-task-out (LOTO) cross-validation. We found that pupil diameter, fixation dispersion, saccade duration, and fixation count and duration on the map were the most important features for detecting users' spatial familiarity. Our results indicate that detecting users' spatial familiarity from eye tracking data is feasible during map-based navigation and that only a few seconds (e.g., 5 s) of eye movement data are sufficient for such detection. These results could be used to develop context-aware LBS that adapt their services to users' familiarity with the environment.}},
  articleno = {{101758}},
  author    = {{Liao, Hua and Zhao, Wendi and Zhang, Changbo and Dong, Weihua and Huang, Haosheng}},
  issn      = {{0198-9715}},
  journal   = {{COMPUTERS ENVIRONMENT AND URBAN SYSTEMS}},
  keywords  = {{Urban Studies, General Environmental Science, Ecological Modeling, Geography, Planning and Development, Pedestrian navigation, Eye tracking, Machine learning, Random forest, Wayfinding, Spatial familiarity, Location-based services, Attention, Responses, Task, Load}},
  language  = {{eng}},
  pages     = {{12}},
  title     = {{Detecting individuals' spatial familiarity with urban environments using eye movement data}},
  url       = {{http://doi.org/10.1016/j.compenvurbsys.2022.101758}},
  volume    = {{93}},
  year      = {{2022}},
}