Optically-Aided, Non-Global Positioning System (GPS) for Aircraft Navigation Over Water
Navy SBIR 2019.1 - Topic N191-003
NAVAIR - Ms. Donna Attick - [email protected]
Opens: January 8, 2019 - Closes: February 6, 2019 (8:00 PM ET)
TECHNOLOGY AREA(S): Air Platform, Information Systems

ACQUISITION PROGRAM: PMA266 Navy and Marine Corps Multi-Mission Tactical UAS

The
technology within this topic is restricted under the International Traffic in
Arms Regulation (ITAR), 22 CFR Parts 120-130, which controls the export and
import of defense-related material and services, including export of sensitive
technical data, or the Export Administration Regulation (EAR), 15 CFR Parts 730-774,
which controls dual use items. Offerors must disclose any proposed use of
foreign nationals (FNs), their country(ies) of origin, the type of visa or work
permit possessed, and the statement of work (SOW) tasks intended for
accomplishment by the FN(s) in accordance with section 3.5 of the Announcement.
Offerors are advised that foreign nationals proposed to perform on this topic may be
restricted due to the technical data under U.S. Export Control Laws.

OBJECTIVE:
Design and develop a capability using optically-sensed features of the
environment and ocean as external references for augmenting aircraft navigation
when flying over water without the use of the Global Positioning System (GPS).

DESCRIPTION:
Optical sensors have long been used for navigation. Existing visual navigation
solutions have been fielded in multiple weapon and military aircraft
applications, but they are limited to use over land and require detailed
knowledge of the terrain: features identified in radar or camera images are
correlated with terrain, or map, features to estimate position. In addition,
horizontal positioning from down-facing cameras is becoming the industry
standard on high-quality commercial unmanned aerial vehicles (UAVs), especially
for indoor use where satellite navigation is not available.
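
As a point of reference only (not a required design), the minimal sketch below
illustrates how a down-facing camera is commonly used for horizontal velocity
estimation: dense optical flow between consecutive frames is scaled by the range
to the surface using pinhole-camera geometry. It assumes a calibrated camera
pointed straight down at a roughly planar surface, a known altitude from another
sensor, and OpenCV's Farneback optical flow; the function name and all parameter
values are illustrative placeholders.

import cv2
import numpy as np


def ground_velocity_from_flow(prev_gray, curr_gray, altitude_m, focal_px, dt_s):
    """Estimate horizontal ground velocity (m/s) from two consecutive frames
    of a down-facing camera using dense optical flow.

    Assumes a pinhole camera looking straight down at a roughly planar
    surface, with altitude above that surface and focal length (in pixels)
    known from other sensors and calibration.
    """
    # Per-pixel image displacement (pixels) between the two grayscale frames.
    flow = cv2.calcOpticalFlowFarneback(
        prev_gray, curr_gray, None,
        pyr_scale=0.5, levels=3, winsize=21,
        iterations=3, poly_n=5, poly_sigma=1.1, flags=0)

    # Median flow is a crude but robust summary of apparent scene motion;
    # it partially rejects outliers such as sun glint or breaking wave crests.
    median_flow_px = np.median(flow.reshape(-1, 2), axis=0)

    # Pinhole geometry: ground displacement = pixel displacement * altitude / focal length.
    ground_disp_m = median_flow_px * (altitude_m / focal_px)

    # The camera moves opposite to the apparent image motion.
    return -ground_disp_m / dt_s

Over open water, the "features" driving such a measurement are transient (wave
texture, whitecaps, glint), which is the central challenge this topic addresses.
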
PHASE I:
Conceptually develop one or more optically-based solutions that show the
feasibility of a new capability in using external (e.g., ocean, sky, and/or any
temporary features of opportunity) characteristics to augment aircraft
navigation technologies. Provide documentation that demonstrates the suitability
of the design for typical aircraft operations and mission environments, and the
potential impacts of operating without GPS aiding. Aircraft operational and mission
environment information will be provided to Phase I performers. Perform a proof
of concept demonstration to show the scientific and technical merit, along with
a Technology Readiness Level (TRL)/Manufacturing Readiness Level (MRL)
assessment. The Phase I effort will include prototype plans to be developed
under Phase II.

PHASE II:
Develop the optically-based concept into a prototype, perform testing and
demonstrate performance of the prototype in a representative flight environment
over water with varying sea and atmospheric conditions; aircraft operational
and mission environment information will be provided to Phase II performers.
Perform tests that demonstrate and validate the superiority of the
optically-aided navigation compared to traditional aircraft navigation without
external aiding (i.e., using only on-board aircraft systems, such as air
data and inertial sensors). Show the feasibility of aircraft integration. Update the
TRL/MRL assessment based on prototype advancements and test results.
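
For context on the Phase II comparison against navigation without external
aiding, the toy simulation below shows why a velocity-level optical fix bounds
position drift relative to dead reckoning on a biased inertial velocity. It is a
simplified 1-D illustration only; the bias, noise, update-rate, and filter-gain
values are assumed placeholders, not performance figures from this topic.

import numpy as np

rng = np.random.default_rng(0)
dt, steps = 0.1, 6000            # 10 Hz propagation for 600 s of flight
true_vel = 60.0                  # m/s ground speed (constant for simplicity)
imu_bias, imu_noise = 0.3, 0.05  # m/s velocity bias and noise (assumed values)
opt_noise = 0.5                  # m/s noise on the optical velocity fix (assumed)
gain = 0.05                      # first-order filter gain for bias estimation

pos_true = pos_ins = pos_aided = 0.0
bias_est = 0.0

for k in range(steps):
    pos_true += true_vel * dt
    imu_vel = true_vel + imu_bias + rng.normal(0.0, imu_noise)

    # Inertial-only dead reckoning: the uncorrected bias integrates into a
    # position error that grows without bound.
    pos_ins += imu_vel * dt

    # Optically aided: a 1 Hz optical velocity fix observes the difference
    # between IMU-indicated and actual velocity, so the bias can be
    # estimated and removed before integration.
    if k % 10 == 0:
        opt_vel = true_vel + rng.normal(0.0, opt_noise)
        bias_est += gain * ((imu_vel - opt_vel) - bias_est)
    pos_aided += (imu_vel - bias_est) * dt

print(f"position error, inertial only:   {abs(pos_ins - pos_true):8.1f} m")
print(f"position error, optically aided: {abs(pos_aided - pos_true):8.1f} m")
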
PHASE III DUAL USE APPLICATIONS:
Identify requirements for transitioning to U.S. Navy aircraft with the support
of any appropriate PMA. Expand the prototype solution to satisfy the identified
hardware and software requirements for application to the U.S. Navy fleet of
aircraft, which may include manned, unmanned, fixed-wing, or rotary-wing
platforms. Perform final testing of a fleet-representative solution for at-sea
aircraft navigation. The developed concepts may be further integrated with
other navigation solutions to provide a more comprehensive capability for
aircraft operating in regions of the globe where GPS is degraded or
unavailable, with coverage in both land and sea environments. The general
technology can be applied in new and emerging ways to commercial applications
in both the large and small aircraft industries (e.g., small UAVs operating
over rivers and streams without drift).

REFERENCES:
1. Chahl, J., Rosser, K., and Mizutani, A. "Bioinspired Optical Sensors for Unmanned Aerial Systems." Bioinspiration, Biomimetics, and Bioreplication, SPIE Proceedings, 2011. https://spie.org/Publications/Proceedings/Paper/10.1117/12.880703
2. Chao, H., Gu, Y., Gross, J., Guo, G., Fravolini, M., and Napolitano, M. "A Comparative Study of Optical Flow and Traditional Sensors in UAV Navigation." 2013 American Control Conference, Washington, DC, IEEE. https://ieeexplore.ieee.org/document/6580428/
3. Chao, H., Gu, Y., and Napolitano, M. "A Survey of Optical Flow and Robotics Navigation Applications." Journal of Intelligent and Robotic Systems, 2014, pp. 361-372. https://link.springer.com/article/10.1007/s10846-013-9923-6
4. Chao, H., Gu, Y., and Napolitano, M. "A Survey of Optical Flow for UAV Navigation Applications." 2013 International Conference on Unmanned Aircraft Systems, Atlanta, IEEE. https://ieeexplore.ieee.org/abstract/document/6564752/
5. Chao, H., Gu, Y., Gross, J., Rhudy, M., and Napolitano, M. "Flight-Test Evaluation of Navigation Information in Wide-Field Optical Flow." Journal of Aerospace Information Systems, 2016, 13(11), pp. 419-432. https://arc.aiaa.org/doi/10.2514/1.I010482
6. Rhudy, M.B., et al. "Unmanned Aerial Vehicle Navigation Using Wide-Field Optical Flow and Inertial Sensors." Journal of Robotics, Volume 2015, Article ID 251379, pp. 1-12. https://web.statler.wvu.edu/~irl/Unmanned%20Aerial%20Vehicle%20Navigation%20Using%20Wide-Field%20Optical%20Flow%20and%20Inertial%20Sensors.pdf
7. Rhudy, M., Chao, H., and Gu, Y. "Wide-field Optical Flow Aided Inertial Navigation for Unmanned Aerial Vehicles." 2014 IEEE/RSJ International Conference on Intelligent Robots and Systems, Chicago. https://ieeexplore.ieee.org/document/6942631/citations?part=1
8. Trittler, M., Rothermel, T., and Fichter, W. "Autopilot for Landing Small Fixed-Wing Unmanned Aerial Vehicles with Optical Sensors." Journal of Guidance, Control, and Dynamics, 2016, 39(9), pp. 2011-2021. https://www.researchgate.net/publication/305925697_Autopilot_for_Landing_Small_Fixed-Wing_Unmanned_Aerial_Vehicles_with_Optical_Sensors?_sg=5LKoAjFrO4ccPixP4oLcb8VnUcFVTx_ceplrQubvohrjhRLgiyKpvM4gAeuQ1cmJ93zPmlofLCOTbd33hkVzwKGeMZN2
9. Zhang, J., and Singh, S. "Visual-Lidar Odometry and Mapping: Low-drift, Robust, and Fast." 2015 International Conference on Robotics and Automation, Seattle. http://www.frc.ri.cmu.edu/~jizhang03/Publications/ICRA_2015.pdf

KEYWORDS: GPS Denied; Navigation; UAS; Visually-Aided Inertial Navigation System (INS); Optical Flow; Visual-Lidar Odometry; Unmanned Aerial Systems