Sensory System for the Transition from Aided to Unaided Vision During Flight to Mitigate Spatial Discordance
Navy SBIR 2015.1 - Topic N151-007
NAVAIR - Ms. Donna Moore - [email protected]
Opens: January 15, 2015 - Closes: February 25, 2015 6:00am ET

N151-007 TITLE: Sensory System for the Transition from Aided to Unaided Vision During Flight to Mitigate Spatial Discordance

TECHNOLOGY AREAS: Air Platform, Human Systems

ACQUISITION PROGRAM: JSF-Sus

The technology within this topic is restricted under the International Traffic in Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and import of defense-related material and services, including export of sensitive technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774, which controls dual-use items. Offerors must disclose any proposed use of foreign nationals (FNs), their country(ies) of origin, the type of visa or work permit possessed, and the statement of work (SOW) tasks intended for accomplishment by the FN(s) in accordance with section 5.4.c.(8) of the solicitation. Offerors are advised that foreign nationals proposed to perform on this topic may be restricted due to the technical data under US Export Control Laws.

OBJECTIVE: Develop a system that enables pilots to transition seamlessly from aided to unaided vision while performing night operations.

DESCRIPTION: When pilots transition from aided to unaided vision during flight, the number of visual cues that can be used as a reference for aircraft attitude is greatly reduced. If this occurs on nights with very low ambient light, spatial discordance can result. A rapid transition from aided to unaided sight reduces the number of peripheral visual cues from many to few, which can lead to spatial disorientation and unsafe flight. Dark adaptation, the eye's ability to perceive low-level light, can take as long as half an hour [2]. Other cues that indicate the attitude of the aircraft must therefore be provided to mitigate the effect of night-vision aids on the visual system, in which a light-adapted eye must quickly adjust to extremely dark conditions.
A lack of sufficient peripheral visual orientation cues may lead to a number of spatial discordance issues (e.g., the black-hole effect) [4]. Peripheral visual cues are reduced during dark-night or white-out (atmospheric or blowing snow) conditions. In either case, it is the lack of peripheral visual cues that leads to disorientation. Another situation in which pilots require peripheral visual cues is when approaching and closing in on another aircraft (e.g., in-flight refueling). Pilots use peripheral cues to estimate their position relative to the Earth and to the aircraft they are approaching [4]. Without this peripheral information, as occurs in extremely dark conditions, closing in on another aircraft becomes significantly more challenging and potentially dangerous.

Currently, pilots experiencing spatial discordance rely on the aircraft's attitude indicator, a visual representation of the aircraft's position relative to the horizon. This visual cue provides information to the foveal visual field and does not take advantage of the benefits of cueing peripheral sensory receptors. Although this information is quite salient in the foveal visual field, pilots report dismissing it because the vestibular cues they experience provide more compelling evidence of their (incorrect) spatial orientation. As previously mentioned, peripheral visual cues are a major contributor to maintaining straight and level flight and avoiding spatial discordance. More recent research, however, has demonstrated that spatial information can be improved with multimodal (i.e., visual, auditory, tactile) stimulus presentation [1]. With the appropriate combination of more than one stimulus modality, humans can orient themselves more quickly and accurately than with the activation of a single sensory modality alone [1].
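The multimodal cueing concept above can be illustrated with a minimal sketch: given the aircraft's roll and pitch, redundant cue intensities are computed for notional tactile and auditory channels so that attitude information reaches the pilot outside the foveal visual field. The function and channel names below are hypothetical assumptions for illustration only, not part of any fielded avionics interface or the solicited design.

```python
def attitude_to_cues(roll_deg, pitch_deg, max_angle=60.0):
    """Map aircraft attitude to normalized multimodal cue intensities.

    Hypothetical sketch: roll drives left/right tactile cues, pitch
    drives rising/falling audio cues. All channel names are
    illustrative; outputs are clamped to the range [0, 1].
    """
    # Clamp attitude to the assumed display range and normalize to [-1, 1].
    roll = max(-max_angle, min(max_angle, roll_deg)) / max_angle
    pitch = max(-max_angle, min(max_angle, pitch_deg)) / max_angle
    return {
        "tactile_left": max(0.0, -roll),   # left-wing-down -> left-side cue
        "tactile_right": max(0.0, roll),   # right-wing-down -> right-side cue
        "audio_up": max(0.0, pitch),       # nose-up -> rising tone intensity
        "audio_down": max(0.0, -pitch),    # nose-down -> falling tone intensity
    }
```

In straight and level flight all channels stay silent, so the cues only become salient as the aircraft departs from level attitude; redundant presentation across modalities reflects the finding that combined stimuli support faster, more accurate orientation than a single modality [1].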
Technology is needed that can provide a pilot transitioning from aided to unaided flight with additional stimuli to maintain straight, level, and safe flight. This technology should be capable of activation at the pilot's discretion and be suitable for different platforms with different requirements and constraints. At a minimum, however, this project should be applicable to Navy 5th-generation fighter aircraft. Since the only 5th-generation fighter in the current inventory is the F-35 Lightning II, this technology should be compatible with the current cockpit design and successfully integrate with the baseline pilot-vehicle interface (PVI). If possible, the technology should extend to previous-generation fighters and other aircraft (e.g., helicopters). Collaboration with original equipment manufacturers (OEMs) in all phases is highly encouraged to assist in defining aircraft integration and commercialization requirements and in providing test platforms. The stimulation of more than one sensory system (e.g., vision, hearing) is not required; it is offered only as an example.

PHASE I: Based upon the stated needs in the description, develop an approach that demonstrates the ability of a pilot to orient more quickly and accurately than current technology allows. Provide documentation that demonstrates the suitability of the design for integration into representative platforms and mission environments. A proof-of-concept demonstration should be performed, along with a Technology Readiness Level (TRL)/Manufacturing Readiness Level (MRL) assessment.

PHASE II: Develop the system into a prototype, perform further testing in a relevant environment, and demonstrate performance in a simulated or actual flight environment. During this phase, the performer should engage the appropriate PMA, PEO, and/or appropriate contract support (e.g., Joint Program Office (JPO), Lockheed Martin (LM), etc.) to discuss options for in-flight testing in Navy aircraft.
If this is cost- or time-prohibitive, testing in commercial aircraft is acceptable. Tests during this phase should demonstrate the superiority of the new system compared to the standard avionics used during spatial discordance. Feasibility of aircraft/fighter integration should also be demonstrated. The TRL/MRL assessment should be updated.

PHASE III: Transition the system into the Fleet by providing the system to appropriate testing-and-evaluation (T&E) programs. The contacts described in Phase II should be aware of the technology by Phase III and should provide in-flight T&E during this phase. Concurrent with in-flight T&E, the performer should develop commercialization plans for the private sector.

PRIVATE SECTOR COMMERCIAL POTENTIAL/DUAL-USE APPLICATIONS: This system would be useful in the private sector, as spatial discordance has been found to be a large contributor to civilian mishaps as well.

REFERENCES:

2. Bear, M. F., Connors, B. W., & Paradiso, M. A. (2006). Neuroscience: Exploring the Brain, 3rd Edition. Lippincott, Williams, & Wilkins.

3. Bertelson, P. & Radeau, M. (1981). Cross-modal bias and perceptual fusion with auditory-visual spatial discordance. Perception & Psychophysics, 29(6), 578-584.

4. Gillingham, K. K. & Previc, F. H. (1993). Spatial orientation in flight. (No. AL-TR-1993-0022). Armstrong Lab, Brooks AFB, TX.

5. DoD 5000.2-R, Appendix 6, pg. 204, Technology Readiness Levels and Their Definitions. http://www.acq.osd.mil/ie/bei/pm/ref-library/dodi/p50002r.pdf

6. Manufacturing Readiness Level (MRL) Deskbook, May 2011. http://www.dodmrl.com/MRL_Deskbook_V2.pdf

KEYWORDS: spatial orientation; spatial discordance; peripheral cues; vision; multisensory; sensory system