Visual reality lab takes flight with the Air Force

In the late 1600s and early 1700s, philosophers Locke and Hume proposed that we know reality only by what our sensory and perceptual systems reveal to us. Today’s digital world of computers and artificial intelligence — as portrayed in the film The Matrix, for example — only helps to strengthen that premise, according to Robert Patterson, WSU associate professor of psychology and neuroscience.

Patterson, who has devoted his career to investigating how humans use vision to interact with their environment, recently established an Educational Partnership Agreement (EPA) between WSU and the Air Force Research Laboratory (AFRL) in Mesa, Ariz. This agreement allows Patterson to continue research on visual cueing in high-performance flight simulators, as well as head-mounted virtual reality displays, which he began in 2002 through a summer faculty fellowship at AFRL. Over the subsequent two summers, as he became increasingly involved with the lab, the door opened to formalizing the EPA, which includes a major state-of-the-art equipment loan to Patterson’s lab at WSU.

“This equipment allows us to open up new lines of research at WSU,” said Patterson. “On top of continuing our basic research on visual depth and motion perception, we can now offer interdisciplinary research in the fields of engineering, psychology and applied vision.”

The four-year EPA also involves the sharing of expertise.

“It’s very open ended,” Patterson said. “In theory, the AFRL engineers could help develop course material or collaborate with other WSU researchers in areas such as applied vision or cognition and memory studies. If all goes well, the agreement could easily be extended beyond the initial four years.”

Students test virtual reality

As a collaborator on two specific lines of research at the AFRL, Patterson realized he could contribute more to the effort if he were able to continue his research during the school year in Pullman. This also provided the advantage of allowing students to participate in the studies as both co-investigators and subjects. And what student could resist trying out the virtual reality displays used in this kind of research?

One key member of Patterson’s lab is his graduate student Jason Rogers, who will perform much of this research as part of his studies.

In their research, two types of head-mounted displays are used. A virtual reality display presents a fully synthetic scene to both eyes, while an augmented reality display is partly transparent and overlays information onto the real world.

“There are many possible applications for augmented reality technology,” said Patterson. “For Air Force pilots, it could mean adding targeting information, flight parameters or synthetic scenes for practice missions. It could also be developed for surgeons — as an overlay showing where to cut on a dotted line — or for race car drivers to show fuel consumption. There are possibilities for nighttime search and rescue operations, as well as coordinated overlays to help pilots fly across an ocean with no visible navigation cues.”

One-eyed monster

Although these technologies have been around for a decade or more, major problems remain in their application. One of the biggest is caused by monocular head-mounted displays, which add virtual information to only one eye.

“Since both eyes typically see the same scene in the real world,” said Patterson, “adding virtual information to just one eye confuses the brain, leading to a perceptual conflict between the eyes known as ‘rivalry.’”

Another problem with head-mounted displays is an effect similar to looking through a digital video camera. As the subject moves his/her head, the displayed information must be constantly updated, and a significant update lag in the electronics can cause perceptual problems.

“People often lose their balance and end up dizzy and nauseous,” Patterson said. “These conflicts occur because our visual system and brain are exquisitely sensitive to detecting spatial misalignment or temporal delays. Engineers are facing a huge task in trying to build a system that will work within the brain’s tolerance levels.”
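The effect of update lag can be put in rough numbers. A minimal sketch, not from the article: during the lag between a head movement and the display catching up, the virtual scene stays misaligned by roughly the head’s angular speed times the delay. The head speed and latency figures below are hypothetical examples, not measurements from Patterson’s lab.

```python
# Illustrative sketch (hypothetical numbers): how display update lag
# turns into spatial misalignment between the real and virtual scene.

def misalignment_deg(head_speed_deg_per_s: float, latency_s: float) -> float:
    """Approximate angular error: how far the head turns before the
    display catches up (head speed in deg/s, latency in seconds)."""
    return head_speed_deg_per_s * latency_s

# A moderate head turn of 100 deg/s with 50 ms of end-to-end lag
# leaves the virtual scene about 5 degrees out of register:
error = misalignment_deg(100.0, 0.050)
print(f"about {error:.1f} degrees of misalignment")
```

Even a few degrees of error of this kind is the sort of spatial misalignment the quote above says the visual system detects readily.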

Cuing up optical flow

Patterson’s second line of research — in collaboration with Brian Dyre at the University of Idaho — focuses on the visual cues people use when navigating, i.e., walking, driving or flying. Here the goal is to determine which navigational cues in synthetic scenes are most critical for flight simulators.

Using large rear-projection screens in his lab, Patterson manipulates satellite imagery to create synthetic scenes that give the subject an illusion of moving forward in his/her environment. In other words, the scenes will contain “optic flow” information.

“As we move forward in our environment,” he explained, “images flow over the retinas of our eyes in an expansion pattern. We see a change in perspective from large to small — like wearing a camera on your head while riding on a roller coaster.”
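The expansion pattern Patterson describes can be sketched in a few lines. This is a simplified illustration, not his experimental software: for pure forward motion toward a scene at roughly constant depth, each image point streams radially outward from a central “focus of expansion,” faster the farther it sits from center.

```python
# Illustrative sketch (simplified model, not the lab's software):
# the "optic flow" expansion pattern for pure forward self-motion.
# Assumes the scene lies at roughly constant depth, so flow speed
# simply grows with distance from the focus of expansion at (0, 0).

def flow_vector(x: float, y: float, speed: float = 1.0):
    """Image-plane flow at point (x, y): directed radially outward,
    with magnitude proportional to distance from the center."""
    return (speed * x, speed * y)

# Points near the center of view barely drift; peripheral points
# stream outward quickly, producing the expanding, roller-coaster feel:
print(flow_vector(0.1, 0.0))  # (0.1, 0.0) - slow drift near center
print(flow_vector(1.0, 1.0))  # (1.0, 1.0) - fast flow in the periphery
```

The radial structure of this field is what gives observers the “moving forward” illusion in the rear-projection scenes described above.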

In the laboratory, subjects will use a joystick to maneuver through their virtual scene while Patterson monitors those elements necessary to help them accurately control their direction of travel.

Navigating dyslexia

As an experimental psychologist, Patterson hopes to apply his research results in down-to-earth ways. Intrigued by recent evidence suggesting that people with dyslexia suffer impaired motion perception, Patterson said one of his long-term goals is to have people with dyslexia try the flight simulator at zero altitude — essentially turning it into a driving simulator — to determine whether their navigation is impaired.

“In the end, our reality, as humans know it, is a neural computation — a brain-computed product,” said Patterson. “Mental illness, psychoactive drugs, brain surgery — all of them alter a person’s sense of reality. We see this in virtual reality: people feel a sense of immersion as if they were really there.”
