
As observers walk through a 3-D environment with their gaze fixed on a static object, their retinal image of that object changes as if the object itself were rotating. We have investigated how well observers can judge whether an object is rotating when that rotation is linked with the observer's own movement. Subjects wore a head-mounted display and fixated a spherical textured object at a distance of approximately 1.5 m in an immersive virtual reality environment. Subjects walked from side to side (approximately ±1 m). On each trial, the object rotated about a vertical axis with a randomly assigned rotational gain factor within a range of ±1: a gain of +1 caused it to always face the observer; a gain of -1 caused an equal and opposite rotation; a gain of zero meant the object was static in world coordinates. In a forced-choice paradigm, subjects judged the sign of the rotational gain. We found significant biases in subjects' judgements when the target object was presented in isolation. These biases varied little with viewing distance, suggesting that they were caused by an underestimation of the distance walked. In a rich visual environment, subjects' judgements were more precise and biases were reduced. This was also true, in general, when we manipulated proprioceptive information by correlating the lateral translation of the target object with the observer's motion.
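The rotational-gain manipulation can be made concrete with a minimal sketch. Assuming the object's rotation is yoked to the angle through which the observer's viewing direction swings as they translate laterally (the abstract does not give the authors' exact formula, so the helper below is hypothetical):

```python
import math

def object_rotation(observer_x, distance, gain):
    """Rotation (radians) of the object about its vertical axis.

    theta is the angle by which the line of sight from the object to
    the observer has rotated as the observer walks a lateral offset
    observer_x at viewing distance `distance`. The object is rotated
    by gain * theta, so:
      gain = +1 -> object turns with the observer (always faces them)
      gain =  0 -> object is static in world coordinates
      gain = -1 -> equal and opposite rotation
    Hypothetical illustration of the gain definition, not the
    authors' code.
    """
    theta = math.atan2(observer_x, distance)
    return gain * theta

# Observer 1 m to the side of straight-ahead, object at 1.5 m:
offset, dist = 1.0, 1.5
for g in (+1.0, 0.0, -1.0):
    print(g, object_rotation(offset, dist, g))
```

With gain +1 the object rotates by exactly the viewing-direction angle, so the observer always sees the same face; gain 0 leaves it world-stationary; gain -1 doubles the apparent rotation relative to the observer.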

Original publication

DOI: 10.1167/3.9.497
Type: Journal article
Journal: Journal of Vision
Publication Date: 01/12/2003
Volume: 3