NASA Flight Deck Concepts Set For New Tests
NASA's vision of the future flight deck is an intelligent cockpit that forms an integral and interactive part of the airspace system. It will be aware of the aircraft, the crew operating it and the surrounding environment. It will sense hazards, evaluate them and provide timely and appropriate responses to the crew.
The same vision is shared by other research agencies, academia and industry, which are pursuing a raft of new technologies and concepts to bring it to fruition. Approaches range from near-term avionics and visual systems to enhance situational awareness, to wholesale systems engineering research and technology efforts (see pp. 47-56).
NASA's road map is encapsulated in the Integrated Intelligent Flight Deck (IIFD), one of four projects in the agency's evolving Aviation Safety Program. Now into its fourth year, the effort builds on the technology foundations laid by programs a decade ago that spawned the enhanced and synthetic vision systems (EVS/SVS) now entering service on business jets.
"We see EVS/SVS integration in the near term, and we think that's going to happen," says IIFD Principal Investigator Steve Young. "We're going to build on that, and we're starting with the notion of a visual and virtual environment, which is what it will provide."
The future flight deck will go way beyond enhanced visuals, he adds. Features could include systems that portray a full image of the weather environment, rather than today's basic radar-derived image. "It could also include the ability to receive updated Notams [notices to airmen] about new airspace restrictions, closed runways and so on as you fly," says Young. Another prospect is a system to visualize wake vortices in-trail ahead of the aircraft.
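To make the in-flight Notam idea concrete, here is a minimal Python sketch of how such a feature might filter a data-linked feed: incoming notices are matched against the active flight plan so only relevant ones reach the crew. The Notam structure, field names and matching rule are assumptions for illustration, not anything from NASA's design.

```python
from dataclasses import dataclass

@dataclass
class Notam:
    location: str   # affected airport or airspace identifier
    subject: str    # e.g. "RWY 13L/31R CLOSED"
    active: bool

def relevant_notams(notams, flight_plan_fixes):
    """Return only the active Notams that touch the current route.

    A real system would also match on time windows and geometry;
    this sketch filters on location identifiers alone.
    """
    route = set(flight_plan_fixes)
    return [n for n in notams if n.active and n.location in route]

# Example: a runway closure at the destination surfaces; the rest is suppressed.
feed = [
    Notam("KJFK", "RWY 13L/31R CLOSED", True),
    Notam("KORD", "TWY B LIGHTING U/S", True),
]
print(relevant_notams(feed, ["KBOS", "KJFK"]))
```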
In coming months, the Aviation Safety Program is expected to change the names of its projects, but Young adds that "our general goals and vision for the flight deck and flight-deck research are not changing. It's more of a restating or clarification of our focus based on what we've learned from the research and industry developments." The work will therefore remain aimed at two over-arching safety challenges: increasing situational awareness and improving human-automation interaction. To avoid the sort of mode confusion that has caused accidents in the past, "we're trying to come up with better procedures and define what is the right role for the pilot," he says.
The safety targets are driven by what NASA believes will be the most significant implications of NextGen, the FAA's Next-Generation Air Transportation System plan to modernize the National Airspace System through 2025. Statistically, the highest safety risk is in the terminal area, which is therefore the focus for researchers. Based on accident data from 1998 to 2007, NASA says ground operations, takeoffs and initial climbs accounted for 31% of fatal accidents and 29% of onboard fatalities, while landings and initial and final approaches accounted for 43% of fatal accidents and 22% of onboard fatalities.
The IIFD effort comprises five main elements, one of which is focused on robust automation-human systems (RAHS) concepts. This element defines the flight-deck automation functions necessary for the operational environment that will come with NextGen, as well as those nearer-term functions that may be possible within current avionics architectures. Focus areas include continuous-descent arrivals, closely spaced parallel approaches and departures, metroplex operations, merging and spacing, and low-visibility arrivals and departures.
"What are the right roles for the pilot and the automated systems, and what are the responsibilities for each?" asks Young. RAHS is also looking at new types of automation that will come with NextGen, such as trajectory-based operations in which the crew and air traffic control will effectively negotiate a flight path. "Air traffic control will send up a 4D [the fourth dimension being time] path which they'd want the crew to follow. The question is, will the crew simply monitor this, or will they, as we think, negotiate with air traffic over which parts they will follow?" he says.
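As a rough illustration of that negotiation, the sketch below models a 4D trajectory as a list of fixes with required times of arrival, and splits an ATC proposal into accepted and countered segments. The message shapes, field names and feasibility check are hypothetical, invented for this example rather than drawn from any NASA or FAA specification.

```python
from dataclasses import dataclass

@dataclass
class Fix4D:
    name: str
    lat: float
    lon: float
    alt_ft: int
    rta_utc: str   # required time of arrival -- the fourth dimension

def negotiate(proposed, crew_accepts):
    """Split an ATC-proposed 4D path into accepted and countered segments.

    `crew_accepts` is a per-fix predicate standing in for the crew's
    (or flight management system's) feasibility check -- fuel,
    performance, weather and so on.
    """
    accepted = [f for f in proposed if crew_accepts(f)]
    countered = [f for f in proposed if not crew_accepts(f)]
    return accepted, countered

path = [
    Fix4D("WAVEY", 40.1, -73.5, 11000, "14:32:00Z"),
    Fix4D("SHIPP", 40.0, -73.1, 7000, "14:41:30Z"),
]
# Hypothetical rule: the crew counters RTAs it cannot meet below 10,000 ft.
accepted, countered = negotiate(path, lambda f: f.alt_ft >= 10000)
print(len(accepted), "accepted,", len(countered), "countered")
```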
A second element, dubbed displays and decision support, takes on three main goals: methods of conveying massive amounts of data to the crew without overwhelming them, improved information management and integrity, and more effective communication and collaboration among decision-makers. "We'll be including things the pilot can't see today," says Young, who again refers to SVS/EVS as a building block.
To ground these goals, the program highlights operational challenges that require new display and decision-support solutions and that NASA says are of critical interest to its Aviation Safety Program. These include achieving a better-than-visual flight operations capability, providing integrated alerting and notification (IAN) and enabling a highly collaborative working environment for crews.
The third major IIFD element is research into enabling avionics for the intelligent flight deck. NASA stresses that this is not aimed at developing black boxes or specific components but rather integrated high-level functions and systems. The focus, in particular, is experimental development of an IAN concept that continuously monitors information from all available sources to evaluate hazard potential. Young says Ohio University and Boeing are among those that have been working on concepts for an IAN that would draw information from onboard sensors, databases and the crew, as well as from off-board sources via data link.
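A toy Python version of the IAN idea as the article describes it: poll every available source, collect the hazard reports, and rank them centrally rather than letting each box alert on its own. The source interfaces, severity scores and ranking rule here are placeholders, not the actual concept under development.

```python
def evaluate_hazards(sources):
    """Core of an integrated alerting and notification (IAN) loop:
    fuse hazard reports from many sources and rank them.

    `sources` is a list of zero-argument callables, each returning
    (hazard_name, severity) tuples -- standing in for onboard sensors,
    databases, crew inputs and data-linked off-board reports.
    """
    reports = []
    for read in sources:
        reports.extend(read())
    # Highest severity first; a real system would also weigh
    # source integrity and time-to-impact.
    return sorted(reports, key=lambda r: r[1], reverse=True)

# Hypothetical feeds standing in for weather radar, TAWS and a data link.
wx = lambda: [("convective cell ahead", 0.7)]
taws = lambda: [("terrain closure", 0.9)]
datalink = lambda: [("wake report, preceding heavy", 0.4)]

for hazard, severity in evaluate_hazards([wx, taws, datalink]):
    print(f"{severity:.1f}  {hazard}")
```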
"There is a lot more information available to the crews today," says Young. "But too much information is not necessarily a good thing," he adds. "It is analogous to the way we use cell phones. There's so much information available on today's smart phones that there is the temptation to use them in inappropriate ways, like when we're driving. So what is the right way to present that information to the crew, and what is the best way to let them use it in an intuitive way?"
A subset of the avionics element covers research into advanced means of sensing, signal processing and hazard characterization. Until recently, all hazard information has been collected by specific systems or black boxes, with the pilot acting as the information integrator. Systems such as current weather radars, radar altimeters, data links, EVS, traffic alert and collision avoidance, and terrain awareness and warning are essentially stand-alone, independent functions. Focus areas have included improved means of forward-looking remote sensing, advanced image processing and detecting hazards such as wake turbulence and icing.
Young highlights a forward-looking interferometer (FLI) project as an example of a potential multi-role sensor. "It is a system that looks out ahead of the aircraft to pick up some of the things that sensors currently cannot see," he explains.
Led by Georgia Tech Research Institute, the focus is development of a passive infrared radiometer based on high-resolution Fourier transform spectrometry technology originally developed for satellite remote sensing. These instruments, previously used to detect aerosols and gases in the air from space but never flown on aircraft, can detect specific environmental hazards by characterizing each hazard's distinct, tell-tale infrared spectral signature. "In this way they will help with wake visualization," says Young. The FLI is also being evaluated for detecting the presence and severity of clear-air turbulence, volcanic ash, wind shear and icing.
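Reduced to a toy example, the detection principle is signature matching: compare the measured infrared spectrum against a library of known hazard signatures and report the closest match. The four-band signatures and the cosine-similarity measure below are invented stand-ins, far coarser than anything GTRI's actual processing would use.

```python
import math

def cosine(a, b):
    """Cosine similarity between two spectra sampled on the same bands."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb)

# Toy 4-band "spectral signatures" -- hypothetical values for illustration.
SIGNATURES = {
    "volcanic ash": [0.9, 0.2, 0.1, 0.6],
    "icing":        [0.1, 0.8, 0.7, 0.2],
}

def classify(measured, threshold=0.9):
    """Return the best-matching hazard, or None if nothing is close enough."""
    best = max(SIGNATURES, key=lambda h: cosine(measured, SIGNATURES[h]))
    return best if cosine(measured, SIGNATURES[best]) >= threshold else None

print(classify([0.85, 0.25, 0.15, 0.55]))  # -> "volcanic ash"
```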
The remaining elements are dedicated to human-related studies of operator performance, as well as to developing tools for flight-deck system design and evaluation. Research here is aimed at advances in visual and aural/speech interface technologies and the multi-modal integration of novel interface technologies. "A lot of the work is looking at developing a model for how humans apply attention, and how long they can apply it for. The tools will be good for predicting performance, and that will then be used to compare predictions with test results," Young adds.
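A caricature of that workflow in Python: a simple model predicts how attention decays with time on task, and the predictions are then scored against measured results, the kind of prediction-versus-experiment check Young describes. The exponential form, the decay constant and the observed numbers are all assumptions made up for this sketch.

```python
import math

def predicted_attention(t_minutes, tau=20.0):
    """Toy vigilance model: attention decays exponentially with time on
    task. tau (decay constant, minutes) is a made-up parameter; a real
    model would be fitted to human-subject data.
    """
    return math.exp(-t_minutes / tau)

def mean_abs_error(predictions, measured):
    """Score model predictions against test results."""
    return sum(abs(p - m) for p, m in zip(predictions, measured)) / len(measured)

times = [0, 10, 20, 30]
model = [predicted_attention(t) for t in times]
observed = [1.00, 0.62, 0.40, 0.21]   # hypothetical measured scores
print(mean_abs_error(model, observed))
```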
Other related research tasks include studies into modeling pilot-automation interaction, as well as the effects of stimulus and task demands. Other studies have focused on specific technologies such as smart-sensor processing for automatic runway-hazard detection and a microphysics-based detection system using phased-array radar. Under the IIFD program, Honeywell has investigated (and subsequently patented) a head-worn display system designed to help crews perceive and identify a potential target or conflict. The study, building on military helmet-mounted display work, indicated that monocular displays seem to have an advantage over bi-ocular alternatives.
Once the various technologies are sufficiently mature for testing, the selected concepts move forward for evaluation in NASA's two main advanced cockpit simulation facilities: Langley's Integrated Flight-Deck Simulator and the Advanced Concepts Flight Simulator in NASA Ames Research Center's Crew-Vehicle Systems Research Facility. Testing at Langley is focused on displays and decision-support technology. "We're in the middle of that right now and it will finish by the end of the month," Young comments. The next test phase will take place at Ames in early 2011, when the center will evaluate new pilot-automation interaction technologies.
Photo Credit: NASA