Space Training
Issue: Volume 40, Issue 4 (Jul/Aug 2017)


While games like “Earthlight” and “Mars 2030” strive to feed the masses’ hunger for a realistic space experience, actual space exploration and human spaceflight are serious, dangerous, and technical business. It may seem that, with our continuous visits to the International Space Station (ISS), space travel has become old hat. But that is far from the truth.

Only a small number of humans have the distinction of having been to space. And before they ever left the Earth’s atmosphere, they spent what seems like a lifetime training for that distinction. Yet, training an astronaut is both time-consuming and expensive; after all, performing a task in zero or reduced gravity is counterintuitive to everything they know, and there is no easy way to learn how to do so. On top of this, astronauts need to be prepared for the unexpected and able to react accordingly in intense, life-and-death situations. So, how does one learn to work in such unique conditions and make decisions under such immense stress?

For years, NASA has developed ways to simulate the conditions of space for training space-goers. In the 1960s, astronauts geared up and performed tasks atop a gravel surface meant to mimic that of the moon, while engineers and support staff stood nearby, well within their field of view. Indeed, things have advanced since then, through the use of both virtual and physical environments.

Today, budding astronauts can use various devices and facilities, including the Active Response Gravity Offload System (ARGOS), an apparatus designed to simulate reduced gravity, and the underwater Neutral Buoyancy Lab (NBL), where astronauts learn to live and work in harsh conditions and in confined spaces.

Astronauts also continue to train in the Johnson Space Center’s Virtual Reality Laboratory (VRL), founded two decades ago, where a cutting-edge VR setup was built on internally developed, proprietary NASA tools used purely for astronaut training.

NASA continues to employ cutting-edge technologies for a wide assortment of uses, recently establishing the Hybrid Reality and Advanced Operational Concepts Lab at the Johnson Space Center in Houston as another option in its training repertoire. The Hybrid Reality Lab (HRL) uses real-time gaming and visualization technology, and other off-the-shelf tools, for virtual space missions that integrate physical elements into the CG environment.

“Our primary goal is to meet NASA’s training, engineering design, scientific visualization, field analog, surface operation, and human performance study needs by developing a highly immersive, realistic system that combines elements of both physical and virtual reality,” says Matthew Noyes, software lead at the new HRL. “We exist in that space where you have a mix of the physical and virtual, as the lab focuses on the integration of physical elements into the virtual world.”

What’s perhaps surprising is that the space agency is relying on consumer virtual-reality technology – in particular, a commercial game engine – to make that happen. “NASA is leveraging the multibillion-dollar games industry to subsidize the cost of human spaceflight,” says Noyes. “At a fundamental level, training to explore space is sort of like creating a game. We immerse the user in a fabricated, three-dimensional environment and have them complete objectives under various constraints.”

Virtual ISS

The Hybrid Reality Lab is looking at projects that improve the quality of engineering analysis, human performance studies, and the like. More to the point, the group is examining the use of tracked hand tools and large static structures within the hybrid-reality environment.

Perhaps the most intriguing applications are those aboard a hybrid-reality replica of the ISS, currently part of a technology demo. “NASA is always interested in how cutting-edge technology can help our programs,” says Noyes. “With Unreal Engine, we’ve created a completely immersive, three-dimensional hybrid-reality environment that is incredibly lifelike. In basic terms, that means we can put our crew in space while they’re still on Earth.”

The system can be used to train astronauts to perform maintenance tasks, aid in the design of new equipment, support various studies related to human performance in space, and help to design new tools. 

Tools of the Trade

The mock-up of the internal American ISS modules was created using a variety of data sources. “It is dimensionally accurate,” says Noyes of the ISS mock-up, albeit with a lot of the unnecessary detail removed. A high-res model of the Earth can be seen outside the ISS; that model is courtesy of game developer Opaque, with which NASA has a collaborative relationship.

In fact, other CG models used in the application are supplied by another collaborator, Fusion VR, makers of the “Mars 2030” VR application (see “Gravitational Pull,” July/August 2016). “This hearkens back to our idea of leveraging the games industry. These developers are interested in creating an authentic gaming experience for their audience, to show what it is really like to be a NASA astronaut,” says Noyes. “That is valuable to us from a PR standpoint. We want the public to be engaged in what NASA does, and we think games are an excellent medium to make that experience more authentic. On the flip side, we can also take some of their assets and make high-performance, visually appealing models and incorporate them into a practical NASA application.”

Some of the lab’s CG models were imported from other NASA applications, while others were hand modeled within open-source Blender or were digitally scanned. Texturing was accomplished with Allegorithmic’s Substance. The system also uses Nvidia’s VRWorks, as well as the company’s PhysX, which supports conservation of momentum and basic friction calculations. “While PhysX is not as accurate as a high-fidelity simulation, the real-time physics engine can make astronauts feel like they are in space,” says Noyes.
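PhysX resolves this kind of dynamics inside its rigid-body solver, but the underlying principle is easy to see in isolation: in zero gravity, an astronaut who pushes a free-floating tool drifts backward so that the total momentum of the pair stays constant. The following is a minimal, self-contained C++ sketch of that calculation with made-up masses and impulse values; it is not NASA’s or PhysX’s code.

```cpp
// Minimal sketch: conservation of momentum for an astronaut pushing a tool
// in zero gravity. Illustrative only; a real engine such as PhysX resolves
// this inside its rigid-body solver.
#include <cstdio>

struct Body {
    double mass;      // kg
    double velocity;  // m/s along a single axis, for simplicity
};

int main() {
    Body astronaut{90.0, 0.0};  // hypothetical 90 kg astronaut, initially at rest
    Body tool{10.0, 0.0};       // hypothetical 10 kg tool, initially at rest

    // The astronaut pushes the tool away with an impulse of 5 N*s.
    // Equal and opposite impulses keep the total momentum at zero.
    double impulse = 5.0;  // N*s
    tool.velocity      += impulse / tool.mass;       // tool drifts forward
    astronaut.velocity -= impulse / astronaut.mass;  // astronaut drifts back

    double totalMomentum = astronaut.mass * astronaut.velocity
                         + tool.mass * tool.velocity;

    std::printf("tool: %.2f m/s, astronaut: %.2f m/s, total momentum: %.2f kg*m/s\n",
                tool.velocity, astronaut.velocity, totalMomentum);
    return 0;
}
```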

The crew also uses Flex, an Nvidia GPU-accelerated, particle-based simulation library that supports soft-body and fluid mechanics. “We have a lot of applications for soft-body physics in VR in real time, showing how astronauts drink water in space, for example, which is very nonintuitive compared with how they would do it on Earth,” explains Noyes. Flex is also great for implementing tether simulations, which are critical for a spacewalk.
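A tether of the kind Noyes describes is often approximated as a chain of particles connected by stiff springs, which is the general flavor of particle-based simulation that a GPU library like Flex accelerates. The sketch below is a simplified, CPU-only illustration of that idea; it does not use the Flex API, and the particle count, stiffness, mass, and time step are arbitrary assumptions.

```cpp
// Simplified particle-chain tether in zero gravity: particles joined by
// springs, integrated with semi-implicit Euler. Illustrative only; GPU
// libraries such as Nvidia Flex use far more robust constraint solvers.
#include <cmath>
#include <cstdio>
#include <vector>

struct Particle {
    double x, y;    // position (m)
    double vx, vy;  // velocity (m/s)
};

int main() {
    const int    count     = 10;     // particles along the tether (assumption)
    const double restLen   = 0.5;    // m between particles (assumption)
    const double stiffness = 200.0;  // N/m (assumption)
    const double mass      = 0.2;    // kg per particle (assumption)
    const double dt        = 0.002;  // s per step

    std::vector<Particle> p(count);
    for (int i = 0; i < count; ++i) p[i] = {i * restLen, 0.0, 0.0, 0.0};
    p[count - 1].vy = 1.0;  // give the free end a sideways push

    for (int step = 0; step < 5000; ++step) {   // simulate 10 seconds
        // Spring forces between neighboring particles (no gravity in orbit).
        std::vector<double> fx(count, 0.0), fy(count, 0.0);
        for (int i = 0; i + 1 < count; ++i) {
            double dx = p[i + 1].x - p[i].x, dy = p[i + 1].y - p[i].y;
            double len = std::sqrt(dx * dx + dy * dy);
            if (len < 1e-9) continue;
            double f = stiffness * (len - restLen);  // tension along the segment
            fx[i]     += f * dx / len;  fy[i]     += f * dy / len;
            fx[i + 1] -= f * dx / len;  fy[i + 1] -= f * dy / len;
        }
        // Integrate every particle except index 0, which stays anchored
        // (for example, clipped to a handrail).
        for (int i = 1; i < count; ++i) {
            p[i].vx += fx[i] / mass * dt;  p[i].vy += fy[i] / mass * dt;
            p[i].x  += p[i].vx * dt;       p[i].y  += p[i].vy * dt;
        }
    }
    std::printf("free end after 10 s: (%.2f, %.2f) m\n", p[count - 1].x, p[count - 1].y);
    return 0;
}
```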

TOP: DEMONSTRATING THE USE OF 3D-PRINTED TOOLS IN MIXED REALITY. BOTTOM: AN HRL VIRTUAL ENVIRONMENT.

As for the physical elements used within the HRL applications, the lab produces inexpensive 3D-printed replicas of the objects it scans. “There are small artifacts, but the visual fidelity is far more accurate than if it is hand modeled,” says Noyes. “And we can 3D-print it almost immediately.”

An HTC Vive is used to navigate within the virtual space. A pair of Vive base stations enables two people to occupy the same physical space and conduct tasks in tandem. According to Noyes, NASA was attracted to the Vive’s Lighthouse tracking system. “It has good performance versus price, and it is a decentralized system,” he adds. “We may switch to a different solution in the future if our requirements change, but for now, the Vive meets our requirements.”

Currently, the room-scale tracking has proven sufficient, to a point. NASA is using the hybrid-reality system with ARGOS, which has a much larger tracking area than what is supported by the Vive. “We are counting on the Lighthouse protocol, not the current implementation, which is designed to allow for essentially unlimited tracking volume using more base stations,” says Noyes. “We think that will probably be solved in the near future, so we are continuing to use Lighthouse as our solution. We are also exploring other solutions that are based on inside-out tracking, which would remove the need for base stations.”

Furthermore, the group is using the HTC puck instead of the main controllers for tracking the physical objects used in the hybrid-reality applications, since the pucks are more compact. In early versions, lab personnel attached the Vive tracker puck directly onto 3D-printed tools for tracking purposes. “We duct-taped it on, and it just worked,” says Noyes. “In VR, you would see this physically based rendered tool that looked like metal because we added a metal shader to it. We don’t have to make it look accurate in the real world; it just has to feel accurate both geometrically and tactilely.”

A process that adds a thin layer of metal to the plastic object can achieve that latter objective.
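For a sense of how a tracked puck’s pose typically reaches an application, the sketch below polls device poses through Valve’s OpenVR API, the SDK behind SteamVR and the Vive, and prints the position of any generic tracker it finds. It assumes the OpenVR SDK is installed and SteamVR is running, and it is not the HRL’s Unreal-based code; within Unreal, the engine’s SteamVR plug-in handles this polling.

```cpp
// Minimal OpenVR sketch: find Vive tracker pucks and read their positions.
// Assumes the OpenVR SDK is available and SteamVR is running; in an Unreal
// project the engine's SteamVR plug-in performs this polling for you.
#include <cstdint>
#include <cstdio>
#include <openvr.h>

int main() {
    vr::EVRInitError err = vr::VRInitError_None;
    vr::IVRSystem* sys = vr::VR_Init(&err, vr::VRApplication_Other);
    if (err != vr::VRInitError_None) {
        std::printf("OpenVR init failed: %d\n", static_cast<int>(err));
        return 1;
    }

    // Fetch the current pose of every tracked device in standing coordinates.
    vr::TrackedDevicePose_t poses[vr::k_unMaxTrackedDeviceCount];
    sys->GetDeviceToAbsoluteTrackingPose(vr::TrackingUniverseStanding, 0.0f,
                                         poses, vr::k_unMaxTrackedDeviceCount);

    for (uint32_t i = 0; i < vr::k_unMaxTrackedDeviceCount; ++i) {
        if (sys->GetTrackedDeviceClass(i) != vr::TrackedDeviceClass_GenericTracker)
            continue;  // only interested in tracker pucks, e.g., one strapped to a tool
        if (!poses[i].bPoseIsValid)
            continue;

        // The last column of the 3x4 device-to-world matrix holds the position.
        const vr::HmdMatrix34_t& m = poses[i].mDeviceToAbsoluteTracking;
        std::printf("tracker %u at (%.3f, %.3f, %.3f) m\n",
                    i, m.m[0][3], m.m[1][3], m.m[2][3]);
    }

    vr::VR_Shutdown();
    return 0;
}
```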

In the future, the team plans to integrate the electronics of the Vive tracking system directly inside the 3D-printed models using the Steamworks hardware development kit.

“Producing these 3D replicas is very inexpensive and quick. Once it is inside the VR environment, it costs about $300 to make. This allows us to carry out tests without the use of ‘space hardware’ or ‘engineering replicas’ that are expensive, in short supply, and incur the risk of being damaged during testing. If we damage one of our hybrid-reality tools, we could have another one fabricated in a day,” says Noyes.

Hybrid-Reality Advantages

As Noyes explains, within a pure VR environment, users learn how to use a tool but develop no muscle memory from using it. “We want them to use the tools without looking at their hands. We want it to become second nature,” he adds. “Using gloves or exoskeletons to simulate that sense of touch is making significant progress, but is just not there yet with current technology. So one of the best ways to do this is through 3D printing the objects.”

Moreover, the 3D printing produces a lightweight object, ideal for mimicking the reduced weight of the object in space. A 10-pound object on Earth would weigh 3.75 pounds on Mars. While a physical object cannot be made weightless for training, a lightweight replica can still fatigue the user’s arm over time much as the real object would in reduced gravity. “And that is important with hybrid reality,” says Noyes.

In addition to the virtual replica of the ISS, there is also an external physical mock-up with handrails that align with the CG handrails within the virtual environment, for learning to maneuver around the outside and inside of the space station.

Also, there are some situations that are difficult to simulate on Earth. For instance, fire behaves differently in space, but within a virtual setting, the lab can reproduce the effects more accurately. By adding a physical element, astronauts can learn to feel their way while navigating to an exit. “Hybrid reality is the only way to accomplish certain types of training like that,” Noyes says.

Another important advantage to using a hybrid-reality environment is the increased sense of presence. “In the future when we teach astronauts how to perform tasks in space, we don’t want to just give them an idea of the general layout of the environment; we want to make them feel like they are actually in that environment, the goal being to produce a strong fight-or-flight response in the event of a life-threatening situation. By elevating their stress response, we can better approximate how they will react in that environment. And the environments we created would allow us to meet many training goals. The more realistic your training feels, the faster you can respond in critical, real-world situations, which ultimately can save your life.”

NASA'S HYPER-REALITY ENVIRONMENT TRANSPORTS USERS TO A VR REPLICA OF THE ISS.

Practicing what seem like mundane tasks, such as planning routes for unloading cargo, can be very time-intensive and expensive. “Astronauts’ time is very valuable, so anything that can reduce learning effects prior to launch is going to save a lot of time while they are in orbit,” Noyes notes.

So while many gamers out there are using Unreal Engine and other tools for out-of-this-world experiences, for those at NASA, the implementation goes far beyond fun and games, giving new meaning to the term “mission critical.” 

Karen Moltenbrey is the chief editor of CGW.