Abstract
- The purpose of this work was to assess the feasibility of using a head-mounted display with a motion capture system to simulate real-world occupational tasks. Participants performed a pointing task under three conditions: (1) real environment (REA), (2) virtual environment with auditory stimulus (VEA), and (3) virtual environment with visual stimulus (VEV). End-point error, movement time, and peak fingertip velocity were calculated for each discrete pointing event. Upper extremity joint angles were calculated at the end state of each point and did not differ significantly between the real and virtual conditions. Target error was significantly greater in the virtual conditions than in the real condition, and peak pointing velocity was slower and movement time longer in the virtual conditions. The similarity of joint angles between real and virtual conditions supports the future use of posture-based ergonomic assessments with virtual reality task simulations built on the Oculus Rift and Siemens Jack.