NASA Discovers Real Uses of VR and AR in Astronomy and Engineering

Years after VR hardware arrived, it has seen heavy use in only a few areas, such as games, according to TechCrunch, a media outlet. Yet a NASA team has been assembling useful scientific and engineering applications with encouraging and unique results. Studying the countless stars of the Milky Way is usually done with older tools, which makes it difficult to bring a powerful, multipurpose pattern-recognition engine to bear and get the most out of the data.


Tom Grubb, an engineer at NASA’s Goddard Space Flight Center, has long considered VR and AR to be valuable tools for exploring and processing such data, and his team has just presented the first paper that uses these technologies directly. He and his colleagues used a VR environment to examine the vicinity of a vibrant star and arrived at a novel classification of stars that other astronomers had disagreed with. The ability to visually observe the path and position of a star in three-dimensional space provides key insights.

Astronomer Marc Kuchner said in a NASA press release: “Planetariums take all the databases they can use and lead people to an understanding of the universe. Well, I’m not going to build a planetarium in my office, but I can put on a headset and I’m just there.”

Grubb and his team created a number of software projects that not only helped bring astronomical databases into virtual reality, but also helped bring engineering work to VR. Just as heavy industry is learning to include VR and AR in its safety, maintenance, and training programs, NASA is working on it in engineering and cross-site collaboration. Part of this is simply to build basic tools for viewing and manipulating data.


“The hardware is here; the support is here. The software and the conventions on how to interact with the virtual world are lagging behind,” Grubb explains. “You don’t have simple conventions such as pinching and zooming, and you don’t have the uniform way every mouse works with a right-click or left-click. But once someone looks at a representation of stars in 3D, or inside a probe, in a virtual environment, new opportunities are discovered.”

“We’re going to be in the same environment, and when we point to or manipulate something in the environment, they’re going to be able to see that,” Grubb said. “You still have to build the model, but you can do a lot of iterations before moving to the physical model. Talking about cable routing isn’t easy for the average person, but it’s exciting for engineers to be able to do that in a virtual environment and know how much cable they need and what the wiring looks like.”

This work is still ongoing, and a paper describing the team’s first astronomical results will be published shortly. Much of the team’s work is publicly available; for example, the PointClouds VR tool they use to view star and lidar data can be downloaded from GitHub.