
Released

Talk

The CyberWalk Platform: Human-Machine Interaction Enabling Unconstrained Walking through VR

MPG Authors

Robuffo Giordano,  P
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Souman,  JL
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Ernst,  MO
Research Group Multisensory Perception and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;


Bülthoff,  HH
Department Human Perception, Cognition and Action, Max Planck Institute for Biological Cybernetics, Max Planck Society;
Max Planck Institute for Biological Cybernetics, Max Planck Society;

External Resources
Full texts (restricted access)
No full texts are currently released for your IP range.
Full texts (freely accessible)

HFR-2008-Robuffo.pdf
(any full text), 2MB

Supplementary Material (freely accessible)
No freely accessible supplementary materials are available
Citation

Robuffo Giordano, P., Souman, J., Mattone, R., De Luca, A., Ernst, M., & Bülthoff, H. (2008). The CyberWalk Platform: Human-Machine Interaction Enabling Unconstrained Walking through VR. Talk presented at First Workshop for Young Researchers on Human-friendly Robotics. Napoli, Italy. 2008-10-24.


Citation link: https://hdl.handle.net/11858/00-001M-0000-0013-C689-3
Abstract
In recent years, Virtual Reality (VR) has become increasingly realistic and immersive. Both the visual and auditory rendering of virtual environments have been improved significantly, thanks to developments in both hardware and software. In contrast, the possibilities for physical navigation through virtual environments (VE) are still relatively rudimentary. Most commonly, users can ‘move’ through high-fidelity virtual environments using a mouse or a joystick. Of course, the most natural way to navigate through VR would be to walk. For small-scale virtual environments one can simply walk within a confined space. The VE can be presented by a cave-like projection system, or by means of a head-mounted display combined with head-tracking. For larger VEs, however, this quickly becomes impractical or even impossible.