In human perception, the ability to determine eye height is essential, because eye height is used to perceive the heights of objects, velocity, affordances, and distances, all of which allow for successful interaction with the environment. It is reasonably well understood how eye height contributes to many of these percepts; how eye height itself is determined, however, remains unknown. Across multiple studies conducted in virtual reality and the real world, this dissertation investigates how eye height might be determined in common virtual reality scenarios.
Using manipulations of the virtual eye height together with distance perception tasks, the results suggest that when no opportunity for calibration is available, humans rely more on body-based information to determine their eye height. This has major implications for many existing virtual reality setups. Because humans rely on their body-based eye height, this reliance can be exploited to systematically alter perceived space in immersive virtual environments, which might be sufficient to give every user an experience close to what the developer intended.