Virtual Reality becomes a Reality for VMI

Dr. Hongbo Zhang and cadets demonstrate the virtual reality technology.

As consumer-grade virtual reality technology advances at a rapid pace, the Department of Computer and Information Science has begun incorporating some of that technology into its curriculum.

Two cadet groups are using the CIS department’s newly acquired virtual reality and motion capture software for capstone projects. The department bought a Microsoft HoloLens, a portable head-mounted display used for augmented reality, and an Oculus Rift, another head-mounted device used to view a computer-generated environment.

Sarah MacDougall ’20 and Michael McNamara ’20 are building a virtual reality simulation to control a small drone with a HoloLens for their independent study.

Using the Unity and Unreal game engines along with GPS coordinates, they created a simulated path for a drone to fly in the real world. The drone is preprogrammed with GPS waypoints that define its flight path.

“Our initial thought was to develop a semi-independent GPS system with a side project to measure the accuracy of GPS, comparing and contrasting the simulated flight to the actual flight,” said McNamara.
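One way to measure that accuracy is to compare the simulated track against the GPS track the drone actually flies. The sketch below is an illustrative assumption of how such a comparison might look, using the haversine great-circle distance between paired waypoints; the coordinates, pairing, and function names are hypothetical and not drawn from the cadets' project.

```python
# Hypothetical sketch: comparing a simulated flight path against the GPS
# track actually flown. Waypoints and per-point pairing are illustrative
# assumptions, not the VMI capstone code.
import math

def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two GPS coordinates."""
    r = 6371000.0  # mean Earth radius in meters
    p1, p2 = math.radians(lat1), math.radians(lat2)
    dp = math.radians(lat2 - lat1)
    dl = math.radians(lon2 - lon1)
    a = math.sin(dp / 2) ** 2 + math.cos(p1) * math.cos(p2) * math.sin(dl / 2) ** 2
    return 2 * r * math.asin(math.sqrt(a))

# Example waypoints: (latitude, longitude) pairs sampled at matching times.
simulated = [(37.7902, -79.4392), (37.7910, -79.4385), (37.7918, -79.4378)]
actual    = [(37.7903, -79.4391), (37.7912, -79.4386), (37.7917, -79.4380)]

errors = [haversine_m(s[0], s[1], a[0], a[1]) for s, a in zip(simulated, actual)]
print(f"mean deviation: {sum(errors) / len(errors):.1f} m, "
      f"max deviation: {max(errors):.1f} m")
```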

The project started in the research phase this past summer, and both cadets hope to roll it into a capstone project for their 1st Class year. They also noted that experience working with simulations will be helpful when they commission into the military.

“Right now, we are working on testing in a simulation versus in the real world, and that can help with simulated military flights before [actually] flying airplanes. Testing all kinds of things in simulation before actual testing can save time,” MacDougall said.

Simulations are also a part of a project led by Nikolos Van Leer ’19 and Jonathan Turney ’19 using motion capture technology to compare a person shooting a gun to what a video game would simulate. Eventually, they hope to add a feature that gives feedback on posture or aim while a person is shooting.

“We analyze someone while they are shooting, to monitor posture, using something like Google Glass. We would want to talk to you in your ear, or text feeds out to see as you are shooting, or a 3-D model of what you should be doing,” Van Leer said.
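The kind of posture feedback Van Leer describes often reduces to geometry on motion-capture joint positions. The following is a minimal sketch, assuming shoulder, elbow, and wrist positions from a mocap system and an arbitrary target range for the elbow angle; the joint names, coordinates, and thresholds are hypothetical, not the project's actual code.

```python
# Hypothetical posture check: compute the elbow angle from three 3-D joint
# positions and flag it if it drifts outside an assumed target range.
import math

def joint_angle(a, b, c):
    """Angle at joint b (degrees) formed by 3-D points a-b-c."""
    v1 = [a[i] - b[i] for i in range(3)]
    v2 = [c[i] - b[i] for i in range(3)]
    dot = sum(x * y for x, y in zip(v1, v2))
    n1 = math.sqrt(sum(x * x for x in v1))
    n2 = math.sqrt(sum(x * x for x in v2))
    return math.degrees(math.acos(dot / (n1 * n2)))

# Example frame: shoulder, elbow, wrist positions in meters from a mocap system.
shoulder, elbow, wrist = (0.0, 1.5, 0.0), (0.3, 1.4, 0.1), (0.55, 1.45, 0.25)
angle = joint_angle(shoulder, elbow, wrist)
if 150 <= angle <= 175:              # assumed target range for a stable hold
    print(f"elbow angle {angle:.0f} deg OK")
else:
    print(f"elbow angle {angle:.0f} deg outside target range; adjust stance")
```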

He added that the project started as an idea to compare real, combat-trained shooters to what a video game simulates and evolved into their current capstone project.

“Eventually, we want to just use motion capture in order to train someone in the fundamentals of training with a weapon,” Turney said.

Van Leer said he hopes the CIS department begins incorporating VR and motion capture technology more into the classroom, given the technology's future uses.

CIS department head Col. Mohamed Y. Eltoweissy said he would like to expand the use of these technologies in education and to collaborate with other colleges and high schools.

If another college, such as Virginia Tech, has equipment that VMI does not, VMI could use VR to train on the equipment, he said.

“Instead of moving physically over there, we can train on equipment from here using VR,” Eltoweissy added, noting that schools that lack extensive equipment could also train on VMI’s gear.

“We intend to expand this to high schools; we have equipment here, they can use some training on configuration,” he said.

Another idea is a virtual conference room in which attendees can join a meeting or conference via virtual reality.

“The meeting is virtually coexistent as an experimental environment, not just using Skype, like a meeting in real life,” Eltoweissy said.

 - Ashlie Walter, December 2018 Institute Report