Virtual Reality Training Helps First Responders

by Julie Cooper

Texas State and Austin-Travis County develop technology that saves time and money

An interdisciplinary team of Texas State University faculty and students is helping to train first responders using augmented reality (AR) and virtual reality (VR) through Just-in-Time VR Training.

Working with the Austin-Travis County EMS (ATCEMS), the Texas State group developed a VR training system that helps medics learn about the AmBus, a 20-bed ambulance. The system allows EMTs to train any time in their station house.

The Austin AmBus is currently one of 13 in Texas, with more expected to be added in 2019, says ATCEMS Commander Keith Noble. “The new ones are not buses at all but big tractor-trailers.” The AmBus has been used for recent mass casualty and large-scale disaster evacuations, including Hurricane Harvey and flooding that impacted a Buda nursing home.

It all began three years ago when Travis County started looking for ways to improve training for EMS cadets while also controlling costs. They held an internal competition with a $5,000 grant for the winning suggestion. Noble worked with an IT team from the city of Austin on using 360-degree cameras in the training process, while EMS Chief Ernesto Rodriguez was part of a team looking at VR. “The idea was, what can we learn from VR training with AmBus? We don’t get to use it often and there is a lot of equipment on it,” Noble says.

Next, a Texas State team led by Dr. Scott Smith, associate professor in the School of Social Work and director of the Virtual Reality and Technology Lab, joined the project. The university team also includes Dr. George Koutitas, Ingram School of Engineering and director of the X-Reality Research Lab; Grayson Lawrence, School of Art and Design; Dr. Vangelis Metsis, Department of Computer Science; and Dr. Mark Trahan, School of Social Work. City of Austin team members include Ted Lehr, IT data architect; Marbenn Cayetano, IT business systems senior analyst; and Charles Purma, IT project manager.

Explaining how VR and AR differ, Koutitas says: “In AR you see a virtual environment that is overlaid in the form of a hologram on the physical environment or space. This provides the advantage of improving the physical memory during the training.

“Putting a VR headset over your eyes will leave you blind to the current world but will expand your senses,” Koutitas says. He explains that augmented reality takes the current reality and enhances it, essentially augments it, with a holographic world. “My role is to learn about the needs for training and build an AR training platform application to help the first responders in their mission, which is to save lives.”

Where traditional training can take weeks, AR training can be completed in a few days. Training one class of 80 cadets on the AmBus in the traditional way costs more than $50,000; the VR/AR training can be completed in four days for far less money. This kind of training could also be used in other areas such as hazardous materials, hospital emergency rooms, and fire rescues.

The plummeting cost of VR equipment has also contributed to the project’s success. Smith points out that a few years ago, a headset could cost $30,000. Today, an improved model sells for less than $700. He estimates that in a few years such a headset could be the size of a pair of glasses.

The success of the first tests has been very encouraging. “Not only was our training more effective in reducing the amount of time it took, but they didn’t have as many errors as the traditional group,” Smith says.

Texas State students were also part of the project. “They did everything,” he says. “Design, develop, collaborate. It was really cool to watch them. They got the most hands-on experience they’ve ever had developing a project.”

With AR, the designers have re-created the interior of the AmBus, down to the exact number of steps an EMT would take to find supplies within the bus. VR students included lead developer and lab coordinator Clayton Stamper, James Bellian, Dante Cash, and Elijah Gaytan. The AR students included Shivesh Jadon, Chaitanya Vyas, and Shashwat Vyas.

“I joined the group because it follows two of my research paths,” says Lawrence, associate professor of communication design. “One is interdisciplinary collaboration — as designers, we are working with developers, business people, and engineers — the other is the user experience. That’s studying how people interact with software.”

Recent Texas State graduates Jose M. Banuelos and Kayla Roebuck taught themselves 3D modeling over one summer to be part of Lawrence’s communication design team. Sophomore Chloe Kjosa joined their team because of a love of drawing and video games. Lorena Martinez (B.F.A. ’18) helped recruit students to work on the AmBus project, gathered data, and worked on the first prototype.

In November, key members of the ATCEMS experienced VR training at the Austin Fire Academy. After putting on the headset, Rodriguez saw the interior of the AmBus as he walked the length of a hallway. He opened and closed the fingers on one hand to open and close drawers on the AmBus. Koutitas then instructed the chief to locate various medical supplies. As he moved about, the chief was asked to describe what he was seeing: “I am looking at the bus, I see A/C controls and oxygen masks,” he says.

The people behind Just-in-Time VR Training are taking their project on the road, with stops including Smart Cities/Innovator’s conferences in Kansas City and Las Vegas, emergency healthcare conferences in Texas, and a technologies conference in Greece.

“We are proof that this kind of interdisciplinary collaboration works — electrical and computer engineering, social work, and design,” Koutitas says. “We collaborate on the research and the product that has an impact on people’s lives.”  


UPDATE: 03/04/2020

Recently, the VR lab joined forces with the X-Reality lab to provide a more holistic technological solution to AR/VR. Koutitas says the collaboration gives the lab a broader impact, helping it create new technologies and attract interest. Currently, the team is collaborating with Toyota alongside Dr. Jesus Jimenez, professor in the Ingram School of Engineering, and his High-Performance Systems lab (CHIPS), using AR and IoT to capture the motion of operations and reduce accidents.