WorldViz 2024 Virtual Reality Lab User Guide
- August 20, 2024
- WorldViz
Table of Contents
WorldViz 2024 Virtual Reality Lab
Product Information
Specifications
- Product Name: Virtual Reality Lab
- Manufacturer: WorldViz
- Software: Vizard, Unity, Unreal Engine
- Features: VR hardware, eye tracking, VR training, VR software for generating environments
Product Usage Instructions
In-Depth Reading Before Purchase
Before purchasing a VR lab, it is recommended to read the 2024 Budgeting Guidelines for Scientific VR Labs and the WorldViz Partners in Science Report for detailed information on cost considerations and research applications.
Benefits of Virtual Reality in a Research Lab
Virtual reality allows researchers to design complex scenarios with high cost efficiency. Studies have shown that participants respond to VR simulations almost as if they were exposed to real-world scenarios, making VR a valuable research tool.
Critical Criteria for Selecting your VR System
When setting up a Virtual Reality lab, ensure that the hardware, software, and content creation tools are integrated seamlessly to deliver an effective VR experience tailored to your specific research needs.
Software for Generating VR Environments
WorldViz offers Vizard, a comprehensive virtual reality software tool for researchers. Additionally, consider using Unity and Unreal engines for creating VR environments with ease, even for non-programmers.
Frequently Asked Questions (FAQ)
- Q: How can I get more detailed information on setting up a virtual reality lab?
A: For more detailed information and help with custom configuration, please contact us at sales@worldviz.com.
How Do I Set Up a Virtual Reality Lab?
In this 2024 guide, you will learn which important questions to ask and how to
conceptualize a VR lab that sets you up for long-term success. We feature the
latest VR hardware, plus new information on eye tracking and VR training.
Please also read our “2024 Budgeting Guidelines for Scientific VR Labs” to
learn about cost considerations for VR hardware, software and application
development for scientists looking to start or upgrade their VR research lab.
In this updated guide you will find answers to these questions:
- What are the proven benefits of VR in research settings?
- What physical space layout do I need for a VR lab?
- What VR and computer hardware is needed?
- What are the most important selection criteria for me to consider?
- Which VR displays – from consumer VR headsets to 3D multi-wall projection displays – should I consider?
- Which VR input devices – from haptic gloves to biofeedback devices and eye tracking – should I consider?
- What software for creating VR environments should I consider?
- How can I design and implement applications for multiple users?
WorldViz VR Lab at University of Dayton
WorldViz VR Lab at Stanford University
For more detailed information on how to set up a virtual reality lab and help with custom configuration, please contact us at sales@worldviz.com.
1) In-Depth Reading Before You Purchase a VR Lab
We’ve been thinking long and hard about the decision-making process you, as a
customer, are faced with. Here are a few reads worth your while:
Our 2024 Budgeting Guidelines for Scientific VR Labs provides a detailed view
on cost considerations including specific pricing for VR hardware, software
and application development for scientists looking to start or upgrade their
VR research lab.
The “WorldViz Partners in Science Report” is an annual report that collects
hundreds of peer-reviewed research publications which utilize WorldViz
products. Covering a range of academic disciplines including Computer Science,
Psychology, Engineering, Physiology, and Neuroscience, we invite you to explore
the current and past reports for a comprehensive overview of the state of VR
research applications.
Benefits of Virtual Reality in a Research Lab
VR projection solution at University of Dayton College of Engineering
Dual Use 3D and 2D Immersive Classroom Teaching
Modern VR solutions such as WorldViz Projection VR and WorldViz VizMove PRISM
allow for highly engaging, interactive, large-scale 360° immersive classroom
teaching and collaborative 3D VR theater experiences. Multiple VR headset
users can be co-located or networked from remote locations, while entire
classrooms participate as bystanders in a shared experience. These multimodal
systems support both 3D interactive simulations and 2D group presentations.
Immersive Remote Collaboration
You can run multiple participants through a simulation who are located in
separate physical spaces, even in different parts of the world! These
participants can explore simulations together in real time in an interactive
virtual reality environment, whether in VR headsets, in an immersive
Projection VR system, on desktop computers, or any combination thereof.
Ecological Validity and Experimental Control
More and more research labs are turning to Virtual Reality for their studies.
Why? Virtual Reality provides high ecological validity and repeatability of
studies, and lets you design complex scenarios at high cost efficiency.
Studies have shown that when participants are exposed to a simulation using a
VR display, they respond to it almost as if exposed to a real-world scenario.
Some experimental scenarios would be very difficult to set up and repeat using
traditional methods – think of asking a cohort of participants to navigate a
busy city environment with unchanged conditions for each participant. Not only
is this easy to create in VR, but one can repeat it as many times as needed.
VR headset solution for wide-area field-of-view scientific use
So what are the things to keep in mind when setting up a Virtual Reality lab,
and how does one go about this? Read on to find out.
Critical Criteria for Selecting your VR System
- Budget – Cost and potential ROI are important considerations when determining the ideal VR system for you.
- Customizability and scalability – A good virtual reality system can be upgraded to incorporate the latest tech advances while scaling to fit your evolving business needs.
- Use case – What do you want to achieve with the VR system? Different use cases sometimes require different hardware.
- Footprint – The installation space and storage you must permanently dedicate to your VR setup.
- Quality of experience – Unless a VR system delivers high visual refresh rates and responds quickly to movement, participants can experience the environment as lagging and drifting, which can potentially cause motion sickness.
- Responsiveness – How well the VR system responds to your movements and interactions. Responsiveness is typically characterized by the “degrees of freedom” (DoF, up to six) that your VR system delivers. More concretely: does the VR hardware respond to the participant’s three directions of movement (up/down, forward/back, and left/right) in combination with the three axes of head rotation, often termed yaw, pitch, and roll? A minimal pose representation is sketched below.
The three parts of the VR triangle (hardware, software, and content creation
tools) need to be carefully integrated into a seamless solution to deliver an
effective VR experience tailored to your specific use case.
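To make the six degrees of freedom concrete, here is a minimal, engine-agnostic sketch of a tracked pose; the names and structure are illustrative only and not taken from any particular VR SDK.

```python
# Minimal, engine-agnostic sketch of a 6-DoF pose: three directions of
# movement plus three axes of head rotation (yaw, pitch, roll).
# Names are illustrative and not taken from any particular VR SDK.
from dataclasses import dataclass

@dataclass
class Pose6DoF:
    x: float      # left/right position, in meters
    y: float      # up/down position, in meters
    z: float      # forward/back position, in meters
    yaw: float    # rotation around the vertical axis, in degrees
    pitch: float  # rotation around the side-to-side axis, in degrees
    roll: float   # rotation around the front-to-back axis, in degrees

# A 3-DoF system (rotation only) would update just yaw/pitch/roll each frame;
# a full 6-DoF system updates all six values from the tracking hardware.
head = Pose6DoF(x=0.0, y=1.7, z=0.0, yaw=0.0, pitch=0.0, roll=0.0)
print(head)
```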
Software for Generating VR Environments
At the heart of every VR environment is a software program, running on a computer processor, that constructs and then displays the simulated 3D experience. The invented worlds of VR run the gamut from the simplest geometric block shapes to intricately detailed universes that call forth a true sense of awe and wonder. Some VR applications – for example, those delineating the complex worlds of VR games – may take months to develop. Most industry applications don’t need (or want) such elaborate complexity, nor do they have the time to wait for such development. If you and your team are doing the development, important considerations for evaluating such software are:
- Ease of use / easy to learn, even for non-programmers
- Robust open source community libraries of VR code that can accelerate development of simulated environments
- Enhanced “rapid application development” to speed the building of VR environments – often one of the key requirements for industrial applications
- Compatibility with different types of VR system setups and, in particular, with a variety of VR input and output technologies (for example, desktop walkthrough, CAVE, and VR headset). Rendering software should support those integrations hassle-free.
WorldViz makes Vizard, a comprehensive virtual reality software tool for researchers. In addition, SightLab VR Pro, an experiment generator plugin for Vizard, lets you generate full VR experiments using little or even no code, save and view complex data visualizations, and access a wide variety of templates and examples for common VR experiment tasks.
Additional options include the Unity and Unreal engines, among others.
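To give a feel for the scripting workflow, here is a minimal Vizard-style sketch based on the standard introductory example; the bundled 'piazza.osgb' model and exact calls may differ in your installation, so treat it as illustrative.

```python
# Minimal Vizard script sketch: open a graphics window and load a 3D model.
# Based on the standard introductory Vizard example; 'piazza.osgb' ships with
# Vizard's example resources and may differ in your installation.
import viz

viz.setMultiSample(4)   # enable basic anti-aliasing
viz.go()                # open the graphics window and start the render loop

model = viz.addChild('piazza.osgb')        # load an environment model
viz.MainView.setPosition([0, 0, -5])       # place the viewpoint 5 m back from the origin
```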
Physical Space Layout
The first thing to look at is the physical space layout of your room. A few questions to consider:
- Is your space better suited for 3D projection onto one or multiple walls, or for a VR headset based system that provides some room to roam around freely?
- Will the participant(s) be walking around, standing or sitting down?
- Can participants wear VR headsets or not? A 3D Projection system might be the best solution for group settings with no need for VR headsets.
- If walking, how much physical space will they need to move around?
- Will the participant and the researcher be in the same space or not?
- What factors may interfere with getting optimal motion tracking (e.g., interference for motion tracking systems sensitive to infrared light, line-of-sight occlusion, etc.)?
- Can the area be kept free from obstacles, allowing participants to freely roam around wearing a VR headset?
- Will your simulation require networking, allowing remote participants to join the simulation?
- Will you measure physiological data on participants using biofeedback devices, or even fMRI equipment?
Are there any factors in your designated physical space that may adversely
affect research results?
In order to have a functional, safe space in which to maneuver, you will need
to block off an open area that is free from any obstacles or clutter. If you
are using a camera-based motion tracking system that requires a free line of
sight, you need to make sure that everywhere the participant walks they remain
in good view of the cameras. Depending on the particular motion tracking system
you choose, different numbers of cameras may be necessary to guarantee optimal
coverage of your space. You may also have to account for things that can cause
light interference, such as reflective surfaces (windows, glass, etc.) and
sources of infrared light (if the system uses infrared motion tracking).
We recommend you consider having a spotter available for the safety of participants wearing a VR headset. Since the participant will not see the physical boundaries of your lab, it is a good idea to have someone available who can guide them, as well as keep cables out of the way. Most modern VR systems also provide a virtual boundary system that you can adjust to the size of your physical space.
In considering whether the participant and experimenter should be in the same space or not, weigh these factors: Will it be more important for you to keep an eye on the participant and maintain direct communication, or does your experiment require full separation of participants from those collecting data? Either way, there are many benefits to choosing a separate room and computer for the experimenter, including the ability to record from multiple viewpoints, or to blend physiological data measurements and monitor them in real time.
As far as wires are concerned, keep in mind devices that need a tethered
connection, for example physiological measurement or eye tracking devices. For
a non-tethered solution to connect to the rendering computer, you can consider
a backpack-based PC, such as this one from HP. For cable management you can
use drop ceilings, cable runners, or cable pulley systems.
If you are using a projection based VR system (more details on this later),
you will need a large, clean – preferably white or light gray – wall as your
projection surface. You will also want to be able to manage ambient light
interference.
If participants need to be prepared before entering your simulation, for
example to attach physiological measuring devices, you may want to plan for a
separate adjacent room. One last factor to consider is that you may need a
neutral environment, free of influences that could bias experimental results –
for example, one that is noticeably hot or cold, noisy, or that otherwise
affects a participant’s state of mind.
Equipment Needed
Next we will go over the hardware that you will need. We will cover the basics
first, and then go into a few more advanced setups. For help in acquiring off
the shelf, pre-configured Virtual Reality systems contact
sales@worldviz.com.
- Visual Display
  - 3D Wall, 2-sided or more (especially for education, training, etc.)
  - VR Headset
  - Mixed Reality
- Motion Tracking
- Rendering computers
- Backpack PCs
- Input device
- Projector or large screen to see the experiment live
- Accessories such as gloves, trackers for body parts
- Speakers and sound
- Physiological measuring sensors
- Eye tracking
VR Headsets
The most common display for a VR system is a VR headset. There are quite a few to choose from, and several factors to consider when looking at which one will be best for your setup. One big distinction is whether the headset is connected to a PC or runs standalone as an Android device. For the best graphical quality and experience, a PC-based VR headset is recommended, such as the Meta Quest Pro or HTC VIVE Focus 3. For an Android-based option, the Meta Quest 3 is a very good choice, as it works in three modes: wired to a PC with a USB-C connector cable, wireless over Air Link, or as an Android-based standalone device. Development for the Android-based platform is more challenging than for PC-based systems, as graphics are much more limited and more optimization is required to reach acceptable performance.
Some other factors to consider are resolution, FOV, tracking system, development platforms, price, comfort, and whether you wish to have features like eye tracking included. Resolution is consistently increasing in VR headsets: we have seen the progression from the VFX1 in the 90s, at 263×230 per-eye resolution, to newer high-resolution and wide field-of-view headsets like the Pimax Crystal, as well as pixel density increasing all the way to human-eye resolution with 71 PPD (pixels per degree) peak fidelity in the Varjo XR-4 headset.
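A rough way to compare headsets on pixel density is to estimate pixels per degree from per-eye resolution and field of view. The sketch below assumes pixels are spread uniformly across the FOV, which real lenses do not quite do, and the example numbers are illustrative rather than official specifications.

```python
# Rough pixels-per-degree (PPD) estimate from per-eye resolution and FOV.
# Assumes pixels are spread uniformly across the field of view; real headsets
# vary across the lens, so published "peak" PPD figures will differ.
def approx_ppd(horizontal_pixels_per_eye: int, horizontal_fov_degrees: float) -> float:
    return horizontal_pixels_per_eye / horizontal_fov_degrees

# Example with illustrative numbers (not official headset specs):
print(approx_ppd(2160, 100))  # ~21.6 PPD for a modern high-resolution panel
print(approx_ppd(263, 45))    # ~5.8 PPD, early-90s headset territory
```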
3D Projection
When one thinks of a Virtual Reality display, the first thing that may come to
mind is a VR headset (such as the Meta Quest 3). However, there are many
benefits to using a 3D projection setup. If you are going to be working with a
larger group of individuals, a 3D projection-based setup allows many users to
experience an immersive simulation wearing only 3D shutter glasses. For an
even more immersive projection experience, you can have the main user’s
viewpoint tracked using a system like our PPT tracking, which will then update
the viewpoint based on where that user is located. For the rest of the users
to see the correct perspective, they would have to be positioned fairly close
to the user being tracked.
A 3D projection system usually consists of a number of projectors that range from business class to cinema grade. Economical solutions use ultra-short-throw projectors (or lenses), since these allow you to front-project the image onto the screen or wall without having participants cast shadows; they also require a smaller footprint compared to rear-projection setups. Aside from the projectors, the system typically includes one or more rendering computers that support quad buffering (e.g., using an NVIDIA Quadro card), 3D glasses (and an emitter for certain shutter glasses), an input device (such as a wand or controller), and possibly some kind of tracking system if you want the viewpoint updated to the user’s position.
3D shutter glasses and emitter
Ultra short throw projector
VizMove PRISM Virtual Simulation Room
The WorldViz VizMove PRISM virtual simulation room is an all-inclusive
immersive training solution using projection that lets you capture real-world
scenes and bring them into your training space without technical expertise.
The PRISM system allows users to utilize 360 videos and images in a touch-screen
environment, and to convert any room into a content-rich learning
environment. PRISM combines 360 visualization, 3D sound, interactive touch,
scents, and more.
Drag-and-Drop Workflow
Creating content for PRISM is easy and intuitive. Capture images and videos
from anywhere with the included 360-degree camera, drag-and-drop the media
into PRISM to create a scene, and enhance with sounds, lights, smells, and
interactive triggers.
Multi-Sensory Interactive Training
To interact with the environment, users touch walls or activate triggers with
controllers. PRISM delivers surround sound, controllable lighting, and
diffused smells for enhanced realism. The experience elicits emotions,
memories, and responses that improve learning retention and reliable skills
performance.
Configuration Options
PRISM comes in a variety of configurations to meet specific training needs and
to fit most rooms. PRISM installs high on walls and within ceilings, so rooms
remain open and multi-purpose.
VR Headset versus Projection: Side-by-Side Comparison
| VR Headset | 3D Projection |
|---|---|
| Creates a highly immersive VR experience (best for “lifelike” simulations) | Lower immersive experience (not as all-encompassing or engaging as a VR headset) |
| Can offer full freedom of 360° movement & interaction | A full 360° rotation requires projection on all four walls |
| Less costly than projection | Projection technology is typically more expensive than HMDs |
| One-person use only | Ideal for multi-user experiences and group collaboration, and for users who are reluctant to wear headsets |
Motion Tracking
There are different tracking systems associated with headsets as well, such as
the Valve Index and Vive Lighthouse systems, or the inside-out tracking of the
Meta Quest 2, Vive Cosmos, and HP Reverb Omnicept. Certain optical tracking
systems (such as WorldViz’s PPT motion tracking system) can also be added to
upgrade a tracking setup if you wish to either increase positional accuracy or
add the ability to track a larger space (such as adding PPT to an Oculus headset).
Here’s a quick rundown of some of the common tracking systems used in a VR system:
- Optical tracking (passive) – Uses reflective markers. Requires many cameras and many markers.
- Optical tracking (active) – Active markers flash at specific frequencies. Highly accurate. Can track up to 32 markers with fewer cameras than passive systems.
- Magnetic – Measures the intensity of a magnetic field in various directions.
- Inertial tracking – Uses accelerometers and gyroscopes. Subject to drift (illustrated in the sketch after this list). Not accurate for positional tracking.
- Inside-out tracking (markerless or with markers) – Cameras or sensors on the headset are used to track position. Additional markerless solutions use AI algorithms with cameras placed around the room to detect and track body movements.
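To see why pure inertial tracking drifts, consider integrating a slightly biased accelerometer reading twice while the device actually stays still. The sketch below uses made-up bias and noise values purely for illustration.

```python
# Illustrative sketch (not tied to any specific tracker): why pure inertial
# tracking drifts. We integrate a noisy, slightly biased accelerometer reading
# twice while the "true" device stays still; bias errors grow quadratically.
import random

dt = 1.0 / 200.0   # 200 Hz IMU sample rate (assumed)
bias = 0.02        # constant accelerometer bias in m/s^2 (assumed)
noise = 0.05       # white-noise standard deviation in m/s^2 (assumed)

velocity = 0.0
position = 0.0
for step in range(200 * 60):                 # one minute of samples
    accel = bias + random.gauss(0.0, noise)  # true acceleration is zero
    velocity += accel * dt                   # first integration -> velocity
    position += velocity * dt                # second integration -> position

print(f"Position drift after 60 s: {position:.2f} m")  # tens of meters for these values
```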
Computing
For any VR system, you are going to need a good computer. Most gaming
computers meet the minimum requirements, but this can vary depending on
whether you want to render very complicated and large models (such as CAD
models or large point cloud models). For walking/headset-based systems you
should try to go with an NVIDIA RTX or GTX based card, and for projection you
would need a Quadro card. Here are some general minimum requirements to shoot
for:
- Operating system: Windows 10 (requires a 64-bit OS)
- CPU: Intel Core i5-4590 equivalent or better
- Memory: 6 GB
- GPU: GTX 2060
- Hard drive: 1.8 GB free
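If you want to quickly sanity-check an existing machine against requirements like these, something along the lines of the sketch below can help. It assumes a Windows machine with an NVIDIA driver installed (so the nvidia-smi tool is on the PATH) and the third-party psutil package for reading memory size; adjust for your setup.

```python
# Quick sketch for checking a machine against minimum requirements.
# Assumes Windows with an NVIDIA driver (so nvidia-smi is available) and the
# third-party psutil package (pip install psutil); adjust for your setup.
import platform
import subprocess

import psutil

print("OS:", platform.system(), platform.release())
print("CPU:", platform.processor())
print(f"RAM: {psutil.virtual_memory().total / 2**30:.1f} GB")

try:
    gpu = subprocess.check_output(
        ["nvidia-smi", "--query-gpu=name,memory.total", "--format=csv,noheader"],
        text=True,
    ).strip()
    print("GPU:", gpu)
except (FileNotFoundError, subprocess.CalledProcessError):
    print("GPU: nvidia-smi not found (non-NVIDIA GPU or driver not installed)")
```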
As mentioned earlier, you can also look at using a backpack-based PC solution,
which will allow users to freely navigate the environment without the
constraint of a wire. The downside is that you are limited by battery life,
and some complexities get introduced (such as having to find a way to stream
the display of the backpack PC to an outside computer if you don’t want to
plug it into a display each time you need to start a simulation).
Oftentimes, labs choose to combine walking VR with a projection system; one
benefit of this is that you have a large display on which you can mirror the
simulation for outside viewing. You can also mirror the simulation to the
rendering computer’s monitor.
Input Devices
In order to interact with your virtual simulation, users will need some kind
of input device. This could be anything from the typical hand controllers that
come with most VR headsets (such as the Vive wands), to a keyboard and mouse
if seated, or even hand gestures with a data glove.
Accessories
To increase immersion and enrich data analysis, there are many accessories that can be added (too many to go into detail here; to discuss this further, contact one of our helpful salespeople to walk you through the options). Vizard’s vizconnect tool makes it easy to connect to over 100 devices for VR simulations; more information on that is available here. Some of the more common accessories are:
- Data gloves
  - Such as the Manus VR data gloves
- Full body tracking systems
- Haptic devices
Sound and Speakers
Sound can be an important, sometimes overlooked, aspect of a virtual simulation hardware setup. While most VR headsets come with headphones, if you are using something like a Projection VR setup with a group of participants, it may be more beneficial to install a set of good surround speakers. Also, when creating your application, it is good to place sounds appropriately in the 3D space using 3D spatialized audio. For a tutorial on how to use 3D sounds in Vizard, see this page in the documentation.
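Engines such as Vizard handle spatialization for you, but the underlying idea is straightforward: attenuate the sound with distance and pan it toward the ear closer to the source. The engine-agnostic sketch below illustrates that idea with made-up constants; it is not how any particular engine implements 3D audio.

```python
# Engine-agnostic sketch of the idea behind 3D spatialized audio: volume falls
# off with distance and the sound is panned toward the nearer ear. Real engines
# (including Vizard's 3D sound support) use proper HRTF/attenuation models;
# the constants here are made up for illustration.
import math

def spatialize(listener_pos, listener_yaw_deg, source_pos, ref_distance=1.0):
    # Work in the horizontal (x/z) plane only, for simplicity.
    dx = source_pos[0] - listener_pos[0]
    dz = source_pos[2] - listener_pos[2]
    distance = max(math.hypot(dx, dz), 1e-6)

    # Inverse-distance attenuation, clamped to full volume inside ref_distance.
    gain = min(1.0, ref_distance / distance)

    # Angle of the source relative to the listener's facing direction.
    angle = math.degrees(math.atan2(dx, dz)) - listener_yaw_deg
    pan = math.sin(math.radians(angle))   # -1 = full left, +1 = full right

    left = gain * (1.0 - pan) / 2.0
    right = gain * (1.0 + pan) / 2.0
    return left, right

print(spatialize([0, 1.7, 0], 0.0, [2.0, 1.7, 2.0]))  # source ahead and to the right
```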
Physiological Measuring Devices
In order to validate a user’s implicit reactions to a simulation, it is highly advisable to use some sort of physiological measuring device. These can measure signals such as heart rate, skin conductance, EEG, and a host of others. One system that works seamlessly with the Vizard software is AcqKnowledge together with the BIOPAC physiological measuring devices.
Eye Tracking
Eye tracking is growing in popularity in VR headsets, and can provide a lot of
useful data on how a user is reacting to a simulation.
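Whatever eye tracker you use, the raw output is typically a stream of timestamped gaze samples, each tagged (via the engine's ray casting) with the object currently being looked at. The sketch below shows one way to turn such samples into per-object dwell times; the sample format is an assumption for illustration, not the output of any specific tracker or SDK.

```python
# Illustrative sketch: aggregate timestamped gaze samples into per-object
# dwell time. The (timestamp, object_name) sample format is an assumption for
# illustration, not the output format of any specific eye tracker or SDK.
from collections import defaultdict

def dwell_times(samples):
    """samples: list of (timestamp_seconds, object_name), sorted by time."""
    totals = defaultdict(float)
    for (t0, obj), (t1, _) in zip(samples, samples[1:]):
        totals[obj] += t1 - t0   # credit each interval to the object gazed at
    return dict(totals)

gaze = [(0.00, 'poster'), (0.25, 'poster'), (0.50, 'door'),
        (0.75, 'door'), (1.00, 'poster'), (1.25, 'poster')]
print(dwell_times(gaze))   # {'poster': 0.75, 'door': 0.5}
```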
SightLab VR Pro
For help with setting up an eye tracking experiment and collecting data on
things such as objects of fixation, average view time, heatmaps, and much more,
WorldViz offers SightLab VR Pro, an extension to Vizard that allows you to
create virtual reality eye tracking experiments with little or no code. You
can access visualization and playback tools in either a single-user or
multi-user environment, and SightLab VR Pro also allows connection to various
hardware devices, such as BIOPAC physiological measurement systems.
More information on SightLab VR Pro can be found here.
Multi User Capabilities
Multi-user VR laboratory setups are essential for collaborative research and interactive experiences, allowing multiple participants to engage simultaneously in a shared virtual space. Key to such a setup is robust network infrastructure for seamless interaction and real-time synchronization across different systems. Avatars representing each participant enhance presence and immersion; tools such as Ready Player Me or Avaturn allow for the creation of customized avatars that can be used in your experiment. Collaborative tools and integrated communication methods facilitate interaction (such as interacting with objects and speech communication).
The WorldViz Multi-User software offers a streamlined way to add multi-user functionality to your experiment, as well as extending its capabilities to include eye tracking, physiological data collection, visualizations, playback of sessions, and more. Real-time data collection and playback ensure engaging, effective, and secure multi-user VR environments, thereby broadening the scope for innovative research, training, and simulations.
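Under the hood, multi-user synchronization boils down to each client repeatedly sending its tracked poses to the others (or to a server) and applying the poses it receives to the matching avatars. The sketch below shows that general pattern over UDP; it is not the WorldViz Multi-User protocol, and the host, port, and message format are assumptions made for illustration.

```python
# Bare-bones sketch of the multi-user pattern: broadcast this client's head
# pose over UDP so other clients can move the matching avatar. This is NOT the
# WorldViz Multi-User protocol; the host, port, and JSON message format are
# assumptions made for illustration only.
import json
import socket
import time

PEER = ("192.168.1.50", 5005)   # assumed address of another lab machine
CLIENT_ID = "participant_1"

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)

def send_pose(position, yaw_pitch_roll):
    message = json.dumps({
        "id": CLIENT_ID,
        "t": time.time(),
        "pos": position,            # [x, y, z] in meters
        "rot": yaw_pitch_roll,      # [yaw, pitch, roll] in degrees
    })
    sock.sendto(message.encode("utf-8"), PEER)

# In a real application this would be called once per frame with the pose
# reported by the tracking system; here we send a single static example.
send_pose([0.0, 1.7, 0.0], [90.0, 0.0, 0.0])
```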
Quick Tips on Setting up a VR Experiment
- Design with data analysis goals in mind.
- If presenting UI or other elements to a user, keep in mind that it is most comfortable to view objects at 0.75 to 3.5 meters from the eyes.
- Capture behavioral data, events, physiological responses, in a meaningful way.
- Try to avoid moving a user’s viewpoint without them controlling it, as this can cause nausea.
- Ideally a user should be able to walk naturally and not have to use a controller.
- Make sure to optimize your models so that you can keep a consistent frame rate. Lower frame rates and jumps in frame rate consistency can be uncomfortable for the user. You want to shoot for a frame rate that matches that of the VR headset (usually around 90 fps); a simple frame-time check is sketched below.
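One way to act on the frame-rate tip above is to log per-frame times during pilot runs and flag frames that miss the headset's refresh budget. The engine-agnostic sketch below assumes a 90 Hz target and a record_frame() call once per rendered frame from whatever engine you use.

```python
# Engine-agnostic sketch: track frame times during a pilot run and flag frames
# that fall below the headset's refresh rate (90 Hz assumed, per the tip above).
# Call record_frame() once per rendered frame from your engine's update loop.
import time

TARGET_FPS = 90.0
FRAME_BUDGET = 1.0 / TARGET_FPS

class FrameMonitor:
    def __init__(self):
        self.last_time = None
        self.slow_frames = 0
        self.total_frames = 0

    def record_frame(self):
        now = time.perf_counter()
        if self.last_time is not None:
            dt = now - self.last_time
            self.total_frames += 1
            if dt > FRAME_BUDGET * 1.25:      # 25% over budget counts as a drop
                self.slow_frames += 1
                print(f"Slow frame: {dt * 1000:.1f} ms ({1.0 / dt:.0f} fps)")
        self.last_time = now

monitor = FrameMonitor()
for _ in range(10):            # stand-in for the real render loop
    monitor.record_frame()
    time.sleep(FRAME_BUDGET)   # simulate a frame that just meets budget
```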
For help on how to create an experiment using the VR application development
engine Vizard, see this tutorial, or see the SightLab documentation to learn
how to leverage it for VR experiment creation. Also see this page for a guide
on creating a Visual Search task that covers modifying independent variables
and measuring dependent variables based on certain conditions.
Additionally, WorldViz provides custom development services if you have a
design in mind but need help implementing it. Please contact
sales@worldviz.com for more details.
Conclusion
Setting up a VR lab can bring a huge upgrade to your research. Initially it
can be somewhat challenging and take some time, but with the proper setup you
will have a much improved environment in which to perform your studies. For
further help on how to set up a VR lab and to discuss options, contact
sales@worldviz.com.
Please also read our “2024 Budgeting Guidelines for Scientific VR Labs” to
learn about cost considerations for VR hardware, software and application
development for scientists looking to start or upgrade their VR research lab.
References
- Avaturn | Realistic 3D avatar creator
- Vizard 7 Documentation
- A comparative evaluation of the effectiveness of virtual reality, 3D visualization and 2D visual interactive simulation: an exploratory meta-analysis - Ikpe Justice Akpan, Murali Shanker, 2019
- MANUS | Finger & Full-body tracking for Mocap and VR
- Purchase | Pimax Crystal
- Integrate an avatar creator into your game in days - Ready Player Me
- SightLab VR Documentation
- Virtual and Mixed Reality headsets VR and XR Headsets – Varjo.com
- Data Acquisition, Loggers, Amplifiers, Transducers, Electrodes | BIOPAC
- Tech Tip: Designing an Experimental Study
- Vizible | Virtual Reality Collaboration Software
- Virtual Reality Software Sales | Contact Us