While VR has the potential to revolutionise the way we live, work and play, it's a technology that can confuse the average person, thanks to an overabundance of technical terms that tend to be thrown around by developers. Have a read of our concise 'cheat sheet' guide to all the terminology you might come across when discussing virtual reality, augmented reality, and mixed reality.
Virtual reality, commonly abbreviated to VR, is a technology that simulates a fully immersive virtual or imaginary environment in which a user feels that they are physically present.
Augmented reality, commonly abbreviated to AR, is a technology that overlays virtual elements on top of a real-world environment.
Similar to augmented reality, augmented virtuality refers to a technology whereby real-life objects are merged into a largely virtual environment.
Described variously as mixed reality, MR or hybrid reality, this term refers to any technology that isn't a fully immersive VR system, but instead blends real and virtual elements, i.e. augmented reality or augmented virtuality (see above definitions). Confusingly, the term is also used to describe Microsoft's Windows Mixed Reality platform, which includes both VR and AR devices.
This describes any headset that requires a tethered connection to a PC in order to function. Well-known PC-based systems include Facebook's Oculus Rift and the HTC Vive.
Any headset where the processing and display for the VR experience are provided by a mobile phone. Notable examples include the Samsung Gear VR and Google Daydream headset.
A VR or AR headset where the entire system is self-contained within the device. Examples include the Microsoft HoloLens mixed reality headset and the upcoming Oculus Go from Facebook.
Both presence and immersion are used, often interchangeably, to describe the sensation of feeling physically present within a virtual experience, as opposed to the detachment felt when viewing content on a conventional screen.
Navigation and tracking
Devices used to track the exact position of the user while they are using a VR system, feeding that data back so the image shown on screen can be updated accordingly.
A virtual reality set-up that, thanks to an expansive configuration of positional sensors (see above), allows the user to physically roam around an entire room, rather than being restricted to a single standing or seated position.
Common to many VR experiences – particularly those that are mobile-based – this feature limits the user to a pre-defined number of explorable positions within a virtual reality build, rather than allowing open-world exploration (see below).
Sometimes referred to as sandbox, this type of experience allows users to freely roam around the build, rather than being confined to a set number of fixed viewpoints (see above).
A common method of virtual navigation, this allows the user to quickly move between points without having to traverse the distance between them.
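As a toy illustration of the difference between teleportation and smooth locomotion, the sketch below jumps the user straight to a target point in one case, and moves them a small continuous step in the other. The function names and 2D coordinates are invented for illustration and don't come from any particular engine:

```python
# Illustrative sketch: teleportation vs. smooth locomotion in 2D.
# Positions are (x, y) tuples; units and names are hypothetical.

def teleport(position, target):
    """Instantly relocate the user to the target point (no perceived motion)."""
    return target

def smooth_step(position, target, speed, dt):
    """Move one small step toward the target, as smooth locomotion would."""
    dx = target[0] - position[0]
    dy = target[1] - position[1]
    dist = (dx * dx + dy * dy) ** 0.5
    if dist <= speed * dt:
        return target  # close enough to arrive this step
    return (position[0] + dx / dist * speed * dt,
            position[1] + dy / dist * speed * dt)
```

Teleportation skips the continuous perceived self-motion of the second approach, which is one reason it is gentler on users prone to simulator sickness.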
A tracking method whereby the image displayed to the user shifts as they move or angle their head.
The use of positional sensors and markers that register where a device is, allowing it to be mapped to a virtual environment.
Three degrees of freedom
Often abbreviated to 3DoF, this term refers to the ability to rotate about three axes, namely pitch (looking up and down), yaw (turning left and right) and roll (tilting the head from side to side).
Six degrees of freedom
Often abbreviated to 6DoF, this term refers to the ability to move in six ways: the three rotations of 3DoF (pitch, yaw and roll) plus three translational movements, namely elevation (up and down), strafing (side to side) and surging (forwards and backwards).
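The difference between the two tracking levels can be sketched as a simple data structure; the class and field names below are illustrative and don't correspond to any particular SDK:

```python
from dataclasses import dataclass

# Illustrative sketch: a 3DoF pose tracks orientation only, while a 6DoF
# pose adds position. Names and units are hypothetical, not from any SDK.

@dataclass
class Pose3DoF:
    pitch: float = 0.0  # rotation: looking up and down
    yaw: float = 0.0    # rotation: turning left and right
    roll: float = 0.0   # rotation: tilting the head side to side

@dataclass
class Pose6DoF(Pose3DoF):
    x: float = 0.0  # translation: strafing (side to side)
    y: float = 0.0  # translation: elevation (up and down)
    z: float = 0.0  # translation: surging (forwards and backwards)
```

A 3DoF headset can tell where you are looking, but only a 6DoF system also knows where your head actually is in the room.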
This term refers to the use of externally placed positional sensors (see above) to track a user's movement in real time.
This tracking method uses cameras fixed to the device being tracked in order to determine how its position changes relative to its environment.
The measurement of eye positioning and movement to discern where exactly a user is looking. This is a crucial element of foveated rendering (see below).
Audio and visual
Also referred to as positional audio, 3D audio places sound objects in a three-dimensional space, creating a more realistic soundscape for the user.
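As a very rough illustration of the idea, the sketch below derives left and right channel gains from a sound source's angle relative to the listener. Real spatial audio engines use head-related transfer functions (HRTFs); this constant-power pan is a deliberate simplification, and the function name is invented:

```python
import math

# Toy illustration of positional audio: attenuate a sound's left/right
# channel gains based on its angle relative to the listener.

def pan_gains(angle_deg):
    """Left/right gains for a source at angle_deg (0 = ahead, +90 = right)."""
    # Map [-90, +90] degrees onto a constant-power pan curve.
    t = (max(-90.0, min(90.0, angle_deg)) + 90.0) / 180.0  # 0 = left, 1 = right
    left = math.cos(t * math.pi / 2)
    right = math.sin(t * math.pi / 2)
    return left, right
```

A source directly ahead plays equally in both ears; one at the listener's right plays almost entirely in the right channel.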
Lighting elements within a VR build that are rendered in real-time, allowing shadows and lighting effects to shift perceptibly as the user moves, ensuring a more realistic experience.
A rendering technique in which the user’s eye movements are tracked (see eye tracking, above), allowing peripheral vision to be rendered at a lower quality, thus reducing the amount of processing needed to render a VR experience in real-time.
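The core idea can be sketched as choosing a render quality tier from a pixel's distance to the tracked gaze point. The zone radii, tier names, and function name below are invented for illustration; real implementations work on shading rates rather than per-pixel labels:

```python
import math

# Hedged sketch of foveated rendering: pixels far from the tracked gaze
# point get a cheaper quality tier. Radii (in pixels) are made-up values.

def render_quality(pixel, gaze, inner_radius=200.0, outer_radius=500.0):
    """Return a quality tier for a pixel given the current gaze point."""
    distance = math.hypot(pixel[0] - gaze[0], pixel[1] - gaze[1])
    if distance <= inner_radius:
        return "full"     # foveal region: native resolution
    if distance <= outer_radius:
        return "half"     # mid-periphery: reduced shading rate
    return "quarter"      # far periphery: cheapest tier
```

Because the eye only resolves fine detail in the fovea, the periphery can be rendered cheaply without the user noticing.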
This refers to the use of physical feedback methods such as vibration, pressure or motion to synthesise the sense of touch.
Refresh rate specifically indicates how often the display buffer is updated and an image (often called a ‘frame’) regenerated on screen, an important factor when creating a realistic virtual environment. It is measured in Hertz (Hz) and is related to, yet distinct from, frames per second (see below). A low refresh rate can cause judder (see below).
Frames per second
Also known as Frame rate or fps, this measures how often images (also called ‘frames’) are shown consecutively. This is related to, yet distinct from, refresh rate (see above).
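One way to see how refresh rate and frame rate relate is as a per-frame time budget: to keep a display fed at its refresh rate, the application must produce each frame within 1/refresh-rate seconds. A minimal sketch (the function name is our own):

```python
# Per-frame time budget implied by a display's refresh rate: if frames
# take longer than this to render, frames are dropped and judder results.

def frame_budget_ms(refresh_rate_hz):
    """Time available to render one frame, in milliseconds."""
    return 1000.0 / refresh_rate_hz

# At 90 Hz (a common VR display refresh rate) the budget is ~11.1 ms;
# at a typical monitor's 60 Hz it is ~16.7 ms.
```

This is why VR rendering is so demanding: the budget is roughly a third shorter than for conventional 60Hz displays, and it must be met twice over, once per eye.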
Fundamental to VR and AR, latency refers to the time that passes between an action occurring and the displayed image updating to reflect it. While latency in traditional video games tends to be around the 50ms mark, it needs to be significantly lower for VR and AR, otherwise the user will notice inconsistencies between their actions and what they are seeing.
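This end-to-end delay (often called motion-to-photon latency) is commonly thought of as a sum of pipeline stages: sensing the movement, processing it, rendering the frame, and scanning it out to the display. The stage durations below are purely illustrative, not measured values:

```python
# Hedged sketch of how motion-to-photon latency accumulates across the
# pipeline. All millisecond figures here are illustrative, not measured.

def motion_to_photon_ms(sensor_ms, processing_ms, render_ms, display_ms):
    """Total delay from a head movement to the updated image appearing."""
    return sensor_ms + processing_ms + render_ms + display_ms

# e.g. 2 + 3 + 11 + 4 = 20 ms total across the four stages
total = motion_to_photon_ms(sensor_ms=2, processing_ms=3, render_ms=11, display_ms=4)
```

Shaving milliseconds off any single stage reduces the whole chain, which is why VR systems optimise sensors, rendering, and display scan-out together.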
Typically caused by a low refresh rate (see above) or dropped frames, judder is the manifestation of motion blur (also known as smearing) and the perception of more than one image simultaneously (known as strobing). This can cause simulator sickness (see below).
Sometimes referred to as VR sickness and similar in effect to motion sickness, this can be caused by factors including judder (see above) and the perception of self-motion while stationary. It can often be mitigated, or eliminated entirely, through user-friendly navigation methods such as teleportation (see above).