Table 4 The collected data modalities, equipment used, formats, and logging frequencies.

From: Exploring, walking, and interacting in virtual reality with simulated low vision: a living contextual dataset

| Type | Equipment | Format | Frequency | Description |
|---|---|---|---|---|
| System logs | GUsT-3D | .json | 10 Hz | Object positions, object states, object interactive properties, user position, user interactions, current task, and objects in the user's visual field |
| Physiological sensors | Shimmer GSR+ | .csv | EDA 15.9 Hz; HR 51 Hz; resampled to 100 Hz | Heart rate (HR) and electrodermal activity (EDA), collected and automatically resampled to 100 Hz by the Consensys software |
| Motion capture | XSens Awinda Starter, MVN Analyze | .csv, .bvh | 60 Hz | 17 sensors: head (1), torso (4: shoulders, hip, and sternum), arms and legs (8: upper and lower limbs), and feet and hands (4). The .csv files contain the processed 28 absolute joint coordinates calculated from the .bvh file (original relative joint coordinates calculated by the MVN Animate software), as well as the transformation matrix to spatially transform the motion data into the coordinate system of the 3D scene |
| Gaze and head tracking | HTC Vive Pro Eye | .json | 120 Hz | For the left, right, and cyclopean eye (combined gaze vector of both eyes): gaze vector (xyz), pupil size, eye-openness percentage, and data-validity mask; head position and rotation vectors |
| Surveys | LimeSurvey | .json | N/A | Surveys described in Table 2, as well as experimenter observation notes |

  1. The description explains the data contents and any processing that was done.
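As a minimal sketch of the kind of processing the table describes for the physiological channel, the snippet below resamples a lower-rate signal (here a synthetic EDA-like trace at roughly 15.9 Hz) onto a uniform 100 Hz grid using pandas. This is only an illustration of the resampling step; the actual column names, file layout, and the exact interpolation method used by the Consensys software are not specified by the table and are assumed here.

```python
import numpy as np
import pandas as pd

def resample_to_100hz(timestamps_s, values):
    """Resample a lower-rate signal onto a uniform 100 Hz grid.

    Uses linear interpolation; this is an assumption, the dataset's
    own resampling is performed by the Consensys software.
    """
    idx = pd.to_datetime(timestamps_s, unit="s")
    series = pd.Series(values, index=idx)
    # 100 Hz corresponds to one sample every 10 ms
    return series.resample("10ms").mean().interpolate(method="linear")

# Synthetic EDA-like trace sampled at ~15.9 Hz for 2 seconds
# (illustrative values, not from the dataset)
t = np.arange(0, 2, 1 / 15.9)
eda = 2.0 + 0.1 * np.sin(2 * np.pi * 0.5 * t)
resampled = resample_to_100hz(t, eda)
print(len(resampled))  # roughly 200 samples for 2 s at 100 Hz
```

The same pattern would apply to the HR channel (51 Hz); after resampling, EDA, HR, and the 10 Hz system logs share a common time base and can be aligned sample-by-sample.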