By Steven M. LaValle. To be published by Cambridge University Press.

   This free VR book covers the fundamentals of virtual reality systems, including geometric modeling, transformations, graphical rendering, optics, the human visual, auditory, and vestibular systems, tracking systems, interface design, human factors, developer recommendations, and technological issues.

Free VR lectures on YouTube. These are part of an accompanying MOOC (free on-line course), produced by NPTEL and IIT Madras, 2016.

Why did I write this? Click here. Feel free to send feedback and corrections (I will acknowledge you in the final version).

The whole book in HTML
Book cover art by Anna Yershova.

Download the whole book

[pdf file] -- Two pages per sheet. Recommended for printing on US Letter paper.
[pdf file] -- Two pages per sheet. Recommended for printing on A4 paper.
[pdf file] -- One page per sheet (larger print). May be easier for on-line viewing.

Download chapters

Chapter 1: Introduction
17 Nov 2016 Definition of VR, modern experiences, historical perspective.
Chapter 2: Bird's Eye View
17 Nov 2016 Hardware, sensors, displays, software, virtual world generator, game engines, human senses, perceptual psychology, psychophysics.
Chapter 3: The Geometry of Virtual Worlds
17 Nov 2016 Geometric modeling, transforming rigid bodies, yaw, pitch, roll, axis-angle representation, quaternions, 3D rotation inverses and conversions, homogeneous transforms, transforms to displays, look-at and eye transforms, canonical view and perspective transforms, viewport transforms.
Chapter 4: Light and Optics
18 Nov 2016 Light propagation, lenses and images, diopters, spherical aberrations, optical distortion; more lens aberrations; spectral properties; the eye as an optical system; cameras.
Chapter 5: The Physiology of Human Vision
18 Nov 2016 Parts of the human eye, photoreceptors and densities, scotopic and photopic vision, display resolution requirements, eye movements, neural vision structures, sufficient display resolution, other implications of physiology on VR.
Chapter 6: Visual Perception
20 Nov 2016 Depth perception, motion perception, vection, stroboscopic apparent motion, color perception, combining information from multiple cues and senses, implications of perception on VR.
Chapter 7: Visual Rendering
20 Nov 2016 Graphical rendering, ray tracing, shading, BRDFs, rasterization, barycentric coordinates, VR rendering problems, anti-aliasing, distortion shading, image warping (time warp), panoramic rendering.
Chapter 8: Motion in Real and Virtual Worlds
21 Nov 2016 Velocities, acceleration, vestibular system, virtual world physics, simulation, collision detection, avatar motion, vection.
Chapter 9: Tracking
25 Nov 2016 Tracking systems, estimating rotation, IMU integration, drift errors, tilt and yaw correction, estimating position, camera-feature detection model, perspective n-point problem, sensor fusion, lighthouse approach, attached bodies, eye tracking, inverse kinematics, map building, SLAM.
Chapter 10: Interaction
27 Nov 2016 Remapping, locomotion, manipulation, social interaction, specialized interaction mechanisms.
Chapter 11: Audio
29 Nov 2016 Sound propagation, ear physiology, auditory perception, auditory localization; Fourier analysis; acoustic modeling, HRTFs, rendering, auralization.
Chapter 12: Evaluating VR Systems and Experiences
30 Nov 2016 Perceptual training, recommendations for developers, best practices, VR sickness, experimental methods that involve human subjects.
Chapter 13: Frontiers
30 Nov 2016 Touch, haptics, taste, smell, robotic interfaces, telepresence, brain-machine interfaces.

Related resources

VR lectures and course materials from the course at the University of Illinois appear below. The course was taught by Steve LaValle in Spring 2015 and 2016, and by Anna Yershova in Fall 2015 and 2016.