Journal articles

Measuring perceived self-location in virtual reality

Abstract: Third-person perspective full-body illusions (3PP-FBI) enable the manipulation, through multisensory stimulation, of perceived self-location. Perceived self-location is classically measured by a locomotion task. Yet, as locomotion modulates various sensory signals, we developed in immersive virtual reality a measure of self-location without locomotion. Tactile stimulation was applied to the back of twenty-five participants and displayed synchronously or asynchronously on an avatar's back seen from behind. Participants completed the locomotion task and a novel mental imagery task, in which they self-located in relation to a virtual ball approaching them. Participants self-identified with the avatar more during synchronous than asynchronous visuo-tactile stimulation in both tasks. This effect was accentuated for the mental imagery task, which showed a larger self-relocation toward the avatar; higher reports of presence, bi-location and disembodiment in the synchronous condition were obtained only for this task. In conclusion, the results suggest that avoiding multisensory updating during walking, and using a perceptual rather than a motor task, can improve measures of illusory self-location.

Self-location, i.e. the experience that the self occupies a certain volume of space, is considered a core aspect of bodily self-consciousness, together with self-identification and first-person perspective 1-3. Self-location is typically experienced within the physical limits of the body 1,4, but can be experienced as disembodied in various neurological and psychiatric conditions 5. Self-location has proven difficult to study empirically. Early measures of self-location consisted of introspective reports 6-8 and pointing tasks on human silhouettes or toward the participant's body 9,10. They revealed that most participants located their self in the head or torso.
Self-reports of self-location have also been collected during illusions with sets of mirrors 11,12 and video systems 13. These studies indicate that self-location can be manipulated through unusual visuo-spatial perspectives, such that the perceived self-location deviates from the location of the physical body. Whole-body adaptations of the rubber hand illusion 14 accelerated the empirical study of the multisensory foundations of self-location, especially with the development of full-body illusions (FBI) from a third-person perspective (here referred to as 3PP-FBI) 15-19. 3PP-FBIs are characterized by self-identification with a full body in extrapersonal space rather than with a body part in peripersonal space. In a seminal version of the 3PP-FBI, participants wore a head-mounted display (HMD) in which they observed a video of their body or a mannequin's body filmed from behind and projected to the front 18. Tactile stimulation was applied on the participants' back and the video was shown either in synchrony or with a delay relative to the stimulation, so that participants saw and felt the touch on the back synchronously or asynchronously. Results showed that the integration of spatially dissociated, but synchronous, visual and tactile events increased self-identification with the virtual body. During such self-identification, the participant's skin temperature was found to decrease 20. To date, very few studies have used immersive virtual reality (VR) technology to implement 3PP-FBIs 21-23. Most studies of self-location and self-identification were based on pre-recorded or online video-projections of bodies 16,18,24-28. Yet, the study of self-location should benefit from VR. First, VR allows interacting with avatars in realistic, ecological, and controlled environments 29-32. Second, VR is characterized by presence, the feeling of being "there", which can modify perceived self-location even when no virtual character is shown in the VR environment 30,33.
How can self-location be measured in VR, other than with questionnaires? In most 3PP-FBI studies, self-location was measured by a locomotion task (LT), an action-based (motor) judgement in which participants were moved backward and asked to walk to where they perceived themselves to be located during the visuo-tactile stimulation 15,18,34. After synchronous visuo-tactile stimulation, participants relocated themselves 10 to 30 cm from their initial position toward the seen body 35,36. Thus, self-location was a compromise between the location of the physical body and the location of the seen body that participants self-identified with.
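The drift measured by the locomotion task can be quantified as the signed displacement of the reported position along the axis running from the participant's initial position toward the seen body. A minimal sketch of that computation (the function name, coordinate conventions, and example values are ours, not from the paper):

```python
import numpy as np

def self_location_drift(initial_pos, reported_pos, avatar_pos):
    """Signed drift (in metres) of the reported self-location toward the
    avatar, measured along the initial-position-to-avatar axis.
    Positive values mean relocation toward the seen body.
    Positions are 2D ground-plane coordinates (x, z)."""
    initial = np.asarray(initial_pos, dtype=float)
    reported = np.asarray(reported_pos, dtype=float)
    avatar = np.asarray(avatar_pos, dtype=float)
    axis = avatar - initial
    axis /= np.linalg.norm(axis)  # unit vector pointing toward the avatar
    return float(np.dot(reported - initial, axis))

# Hypothetical example: the avatar stands 2 m ahead and the participant
# reports a position 0.2 m forward of where they started.
drift = self_location_drift((0.0, 0.0), (0.0, 0.2), (0.0, 2.0))
```

Projecting onto the participant-avatar axis (rather than taking the raw distance walked) keeps the measure signed, so relocation away from the avatar shows up as a negative drift.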

https://hal-amu.archives-ouvertes.fr/hal-02553530
Contributor: Christophe Lopez
Submitted on: Friday, April 24, 2020 - 3:06:42 PM
Last modification on: Tuesday, April 28, 2020 - 1:36:33 AM


Licence: Distributed under a Creative Commons Attribution 4.0 International License

Citation

Estelle Nakul, Nicolas Orlando-Dessaints, Bigna Lenggenhager, Christophe Lopez. Measuring perceived self-location in virtual reality. Scientific Reports, Nature Publishing Group, 2020, ⟨10.1038/s41598-020-63643-y⟩. ⟨hal-02553530⟩
