Future Military Vertical-Flight Operations
Tomorrow’s innovations will be driven by present-day needs for more payload and speed in rotorcraft.
The next 50 years will be marked by a mix of conventional rotorcraft, compounds, tiltrotors, hybrid-electrics and electric lift-fan vertical-takeoff-and-landing (VTOL) aircraft. The need for lightweight mission systems will be felt most acutely in future vertical-lift aircraft, whose payload margins, narrowed at the altar of speed, will demand low-drag, multi-purpose sensors and lightweight weapons. Low-speed and electric aircraft will benefit from those innovations, and possibly vice versa.
This article focuses on mission and weapons developments in manned and optionally manned, traditional military vertical flight, and on how those advancements will extend operational envelopes and improve safety over the next 50 years. These evolutions assume tacit acceptance of emerging technologies in sensors, automation and computing, with a human-factors emphasis on improving situational awareness through workload reduction and information management. Reducing mundane hands-on flying duties, targeting, countermeasure maneuvers and weapons employment will be the principal foci of augmented automation, leaving greater brain-byte overhead margins for those uncertain, fog-of-war mission scenarios where humans excel over machines.
One area in which I expect remarkable advancements is sensors. In the next 50 years, the Venn-diagram lines between mission systems and air vehicle systems will blur. Let’s think of them in two groups: degraded visual environment (DVE) navigation/pilotage sensors and defensive/offensive system sensors. All the services want to own not just the night, but also the weather and threat-choked airspace, while sustaining the ability to operate in remote landing zones without DVE losses. The future standard will include DVE sensors whose imagery is fused with, or overlaid on, a synthetic-vision 3-D terrain backbone and/or electro-optical/infrared (EO/IR) imagery.
Active terrain-following/terrain-avoidance radars and flight directors will be a thing of the past, replaced by obstacle sensors that plot obstructions in a 3-D evidence grid and by real-time (no-latency) imagers that let crews see through all obscurants, including self-induced brownout. Femto light-speed 3-D imaging will allow us to “see around” things blocking direct view.
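For the curious, here is a minimal sketch of how an obstacle sensor might accumulate returns into such a 3-D evidence grid, using the classic log-odds occupancy update; the cell size, evidence weights and threshold are illustrative assumptions, not values from any fielded system.

```python
import numpy as np

class EvidenceGrid3D:
    """Minimal 3-D evidence (log-odds occupancy) grid for obstacle mapping."""

    def __init__(self, shape=(200, 200, 50), cell_m=1.0):
        self.cell_m = cell_m
        self.log_odds = np.zeros(shape)     # 0 means unknown (p = 0.5)
        self.l_hit = np.log(0.7 / 0.3)      # evidence added per sensor return
        self.l_free = np.log(0.3 / 0.7)     # evidence removed per pass-through
        self.l_min, self.l_max = -4.0, 4.0  # clamp so the grid stays responsive

    def _index(self, xyz_m):
        return tuple((np.asarray(xyz_m) / self.cell_m).astype(int))

    def integrate_return(self, hit_xyz_m, free_xyz_m=()):
        """Fold one return, and any cells the beam traversed, into the grid."""
        i = self._index(hit_xyz_m)
        self.log_odds[i] = np.clip(self.log_odds[i] + self.l_hit,
                                   self.l_min, self.l_max)
        for f in free_xyz_m:                # beam passed through these cells
            j = self._index(f)
            self.log_odds[j] = np.clip(self.log_odds[j] + self.l_free,
                                       self.l_min, self.l_max)

    def is_obstacle(self, xyz_m, p_thresh=0.8):
        l = self.log_odds[self._index(xyz_m)]
        return 1.0 / (1.0 + np.exp(-l)) > p_thresh  # log-odds to probability

# Example: two returns on the same tower cell raise its occupancy belief.
grid = EvidenceGrid3D()
grid.integrate_return((120.0, 80.0, 15.0))
grid.integrate_return((120.0, 80.0, 15.0))
print(grid.is_obstacle((120.0, 80.0, 15.0)))        # True
```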
The real breakthroughs will be in how the environment is presented to the crews. Two schools of thought prevail: either cue the pilot with tactile inputs, 3-D audio and abstract symbology on a multi-function display or head-up display to avoid collisions, or present the fused terrain/obstacle imagery as realistically as possible to allow normal visual perception through a wide-field-of-view helmet-mounted display or windscreen-projected synthetic vision. The visual approach requires head tracking and provides the peripheral-vision cues essential for smoothly controlled transitions.
Owning the night and the weather is a multi-spectral problem. I predict a move toward multi-spectral sensor arrays – EO/IR, image-intensifying, millimeter-wave (MMW), illuminated terahertz and femto camera – mounted on the skin of the aircraft in an advanced distributed aperture system, providing a spherical sense-and-display capability to the head-tracked pilot helmet for flight and weapons engagement.
System integrators recognize the need to multi-task sensors so they “earn their weight” aboard the high-speed platforms. Sensor suites will be tailored to provide not only visual sensing but also navigational sensing, because we can no longer rely on GPS in anti-access, area-denial (A2AD) environments. Light detection and ranging (LIDAR) and MMW radars will be multi-tasked to build a 3-D point cloud for obstacle avoidance while providing real-time, highly accurate Doppler-velocity or terrain-contour-matching (TERCOM) measurements to a sophisticated, tightly coupled, Kalman-filtered inertial measurement system, which will sustain self-contained inertial navigation system (INS) operation after GPS signal loss. Technology leaps in INS gyros and algorithms will shrink nav-system components while improving gyro accuracy.
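As a toy illustration of that aiding loop, consider a per-axis Kalman filter in which INS-propagated velocity drifts under an accelerometer bias until periodic LIDAR Doppler-velocity fixes pull it back; all noise values, rates and the bias here are invented for the sketch.

```python
import numpy as np

class VelocityAidedINS:
    """Per-axis Kalman filter: INS-propagated velocity corrected by
    LIDAR/MMW Doppler-velocity fixes. All noise values are illustrative."""

    def __init__(self, q_accel=0.05, r_doppler=0.02):
        self.v = 0.0        # velocity estimate (m/s)
        self.p = 1.0        # estimate variance
        self.q = q_accel    # process noise: accel/gyro error growth rate
        self.r = r_doppler  # Doppler measurement noise variance

    def propagate(self, accel_mps2, dt):
        """INS prediction step: integrate acceleration; uncertainty grows."""
        self.v += accel_mps2 * dt
        self.p += self.q * dt

    def update_doppler(self, v_meas_mps):
        """Fuse one Doppler-velocity fix; uncertainty shrinks."""
        k = self.p / (self.p + self.r)       # Kalman gain
        self.v += k * (v_meas_mps - self.v)  # correct toward the measurement
        self.p *= (1.0 - k)

# Example: after GPS loss a biased accelerometer would walk the velocity
# off by 2 m/s over 20 s; 1-Hz Doppler aiding keeps it pinned near truth.
ins = VelocityAidedINS()
ins.v = 50.0                               # true velocity at GPS loss
rng = np.random.default_rng(0)
for step in range(200):                    # 20 s at a 10-Hz INS rate
    ins.propagate(accel_mps2=0.1, dt=0.1)  # 0.1 m/s^2 accelerometer bias
    if step % 10 == 0:                     # 1-Hz Doppler fixes
        ins.update_doppler(50.0 + rng.normal(0.0, 0.1))
print(round(ins.v, 1))                     # stays near 50, not drifted to 52
```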
Defensive and offensive weapons systems will be similarly enhanced. The same hyperspectral sensing capability that provides vision or navigation augmentation will provide spectral exploitation for threat detection, avoidance, declaration, targeting and countermeasures. The same LIDAR used to find wires, towers and trees and to update the INS can also, in the same raster scan, detect threat optics that may be guiding weapons against us.
Countermeasures will slowly move away from expendables and toward sensor-based active jammers with unobstructed views. Closed-loop technology will jam IR threats without having to guess what it’s fighting. That distributed-aperture IR sensor will pull double duty, seeking friction-heated bullets to determine their source and accuracy and to provide fire-control solutions to weapons or an optical dazzler. Conformal, multispectral, beam-forming transmit/receive antennae will be incorporated for threat detection and geolocation, and those same antennae will be “time-shared” for jamming and data links.
Coherent synthetic-aperture cooperative detection within multi-ship formations will make geolocation and targeting faster and more accurate. Directed-energy weapons employed against helicopters will draw airborne laser and pulsed-RF retaliatory engagements at the speed of light. For kinetic engagements, who can resist suggesting self-steering bullets to improve the probability of kill against moving and hiding targets?
A big concern for operators is their signature — whether acoustic, IR, visual or RF. Ownship signature awareness and signature management to keep “spikes” away from detected threats will be a standard feature, especially as flight-control automation handles countermeasure procedure maneuvers. Automation will be central to modernization in combat helicopter operations.
As we’ve learned over the past decade or so, there are things at which computers excel and others where the human still performs like a boss. Let the human be the Jedi of uncertain wartime situations.
Mission planning is the essential element in streamlining automation and shifting workload.
As aircraft grow more complex, mission planning systems will, by necessity, be simplified. Future aircraft technology promises more system-setup complexity, so planning will be streamlined with electronically coded standard operating procedures driving smart default settings for multi-ship operations, or populated directly from electronic fragmentary orders, air tasking orders and the OPTASK LINK.
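What might an electronically coded SOP look like? One possibility, sketched below, treats unit standards as data that the planner expands into per-aircraft defaults instead of asking crews for them; every field name and value here is hypothetical.

```python
# Hypothetical electronically coded SOP: unit standards become data,
# so the planner pre-fills multi-ship settings rather than querying for them.
SOP_ATTACK_TEAM = {
    "formation": "combat_cruise",
    "radio_presets": {"interflight": "FM1", "c2": "UHF2"},
    "laser_codes_by_position": {"lead": 1688, "wing": 1711},
    "ir_jammer": "auto_on_at_takeoff",
}

def build_mission_defaults(sop: dict, ship_count: int) -> list[dict]:
    """Expand a coded SOP into per-aircraft default settings for the flight."""
    positions = ["lead"] + ["wing"] * (ship_count - 1)
    return [
        {
            "position": pos,
            "formation": sop["formation"],
            "radios": sop["radio_presets"],
            "laser_code": sop["laser_codes_by_position"][pos],
            "ir_jammer": sop["ir_jammer"],
        }
        for pos in positions
    ]

# A two-ship flight gets its settings without a single planner prompt.
print(build_mission_defaults(SOP_ATTACK_TEAM, ship_count=2))
```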
Weather will dovetail with the planning environment, which will incorporate TurboTax-like queries for employment decisions: “Do you want me to enable automatic blade and inlet de-ice, or cue you in flight?” Pre-mission planning systems will mimic aircraft capabilities, allowing easy en route mission re-tasking or automated route replanning around threats (including weather). Replanning algorithms will include real-time ownship and wingman performance (climb, turn, speed) and fuel-limit computations to check re-routing feasibility. Cabin occupants will have easy-to-use virtual sand tables, allowing mission briefing and contingency planning en route. Airborne implementation of the Global Information Grid will ensure tailored data gets to the right recipients at the right time, and smart publish-and-subscribe service-based architectures will allow better data management to and from warfighters.
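The re-routing feasibility check might reduce to something as simple as the following gate, which rejects any route that would put a flight member below its fuel reserve; the burn rates, speeds and reserve figure are illustrative, not type-specific.

```python
# Hypothetical feasibility gate for an automated re-route: accept a new
# route only if every ship in the flight lands with its fuel reserve intact.
def route_is_feasible(legs_km, ships, reserve_kg=200.0):
    """legs_km: leg distances of the proposed route.
    ships: list of (fuel_onboard_kg, cruise_kmh, burn_kg_per_hr) tuples."""
    distance_km = sum(legs_km)
    for fuel_kg, speed_kmh, burn_kgh in ships:
        time_hr = distance_km / speed_kmh
        if fuel_kg - time_hr * burn_kgh < reserve_kg:
            return False            # a flight member would land below reserve
    return True

# Example: a 30-km threat detour is rejected because the wingman,
# lower on fuel, would dip below the 200-kg reserve.
flight = [(900.0, 260.0, 540.0),    # lead: fuel, cruise speed, burn rate
          (650.0, 260.0, 540.0)]    # wing: lower fuel state
print(route_is_feasible([80.0, 120.0], flight))        # True
print(route_is_feasible([80.0, 120.0, 30.0], flight))  # False
```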
Lastly, flight-control automation will be a big enabler. For manned flight, mission-tailorable handling qualities can also provide an emergency return to base in the event of pilot casualty. Decision aiding is an adjunct of flight-control automation; the RAH-66 Comanche’s cognitive decision-aiding dreams will be realized and enhanced. And as we execute missions, onboard fault detection and isolation will run in the background, watching for mechanical, vibratory or electronic faults and any airframe exceedances while forwarding maintenance or parts-support requests via high-bandwidth data links.
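A background exceedance monitor of the kind described could be as simple as this sketch, which checks parameter samples against limits and queues anything out of bounds for data-link forwarding; the parameter names and limits are invented.

```python
# Illustrative background monitor: stream telemetry against limits and
# queue a maintenance request the moment an exceedance is logged.
LIMITS = {
    "rotor_rpm_pct": (95.0, 103.0),   # invented limits for the sketch
    "xmsn_oil_temp_c": (0.0, 110.0),
    "vib_ips": (0.0, 0.5),
}

def monitor(sample: dict, maintenance_queue: list) -> None:
    """Check one telemetry sample; log exceedances for data-link forwarding."""
    for name, value in sample.items():
        lo, hi = LIMITS[name]
        if not lo <= value <= hi:
            maintenance_queue.append(
                {"fault": name, "value": value, "limit": (lo, hi)})

queue: list = []
monitor({"rotor_rpm_pct": 101.0, "xmsn_oil_temp_c": 118.0, "vib_ips": 0.3},
        queue)
print(queue)  # flags only the transmission oil over-temperature
```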
The mission system complexity discussed thus far is extraordinary and will demand remarkable parallel processing. Micro-cloud architectures aboard the aircraft will divide computing among small processors at the skin sensors, both for redundancy and to reduce the processing burden on centralized multi-core mission computers. These architectures will have redundant fiber-optic “photonic” backbones for flight and mission management, and we may see migration to massively parallel processing using multi-frequency laser-optical CPUs to accommodate intense graphics processing and cooling demands. We’ll see industry-standard partitioned architectures for sensor control/display and weaponry, allowing enhancements to systems with minimal regression testing.
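The partitioned, publish-and-subscribe data management mentioned above might look like this minimal bus, in which display and weapons partitions consume the same sensor topic independently, so one partition can be upgraded with little regression risk to the others; the topic and message fields are invented for illustration.

```python
from collections import defaultdict
from typing import Callable

class Bus:
    """Minimal publish/subscribe bus: partitions exchange data through
    topics rather than direct calls, decoupling their upgrade cycles."""

    def __init__(self):
        self._subs: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable) -> None:
        self._subs[topic].append(handler)

    def publish(self, topic: str, msg: dict) -> None:
        for handler in self._subs[topic]:  # deliver to every subscriber
            handler(msg)

bus = Bus()
# Display and weapons partitions each consume threat tracks independently.
bus.subscribe("threat_tracks", lambda m: print("display overlay:", m))
bus.subscribe("threat_tracks", lambda m: print("fire control cue:", m))
bus.publish("threat_tracks", {"id": 7, "bearing_deg": 245, "range_km": 6.2})
```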
Interestingly, everything discussed here is in development at some stage today, making for a safer, more effective tomorrow. RWI