The Dryden Flight Research Center (DFRC) has been a partner in many uninhabited-aerial-vehicle (UAV) test programs. Our participation has largely been in the areas of technical oversight and range safety, but the X-36 program provided an opportunity to get some "stick and throttle" experience in this unique type of flying. It was also an excellent opportunity to evaluate firsthand the elements that make a successful UAV test program. I participated in the program as a member of an independent review team, as a chase pilot, and as a project pilot. In November 1997, I piloted two parameter-identification and flight-envelope-expansion sorties - they proved to be some of the most intense flight testing of my career.

The X-36 is a 28-percent-scale, remotely piloted research aircraft (see figure) designed to develop tailless, high-angle-of-attack fighter agility with a stealth design. It uses a control system consisting of canards, split ailerons, leading- and trailing-edge flaps, and thrust vectoring. The aircraft length is nearly 18 ft (5.5 m), the wingspan is about 11 ft (3.4 m), and the takeoff weight is approximately 1,250 lb (567 kg). It is powered by a Williams International F-112 turbojet engine producing approximately 700 lb (318 kgf) of thrust. Flight operations were conducted from a control trailer, which contained both control-room stations and the pilot's cockpit.

The X-36 UAV is being rolled out of a hangar for a flight test.

The cockpit controls and displays were designed to emulate a standard fighter-type aircraft cockpit. The controls included a displacement stick, rudder pedals, a throttle quadrant, and several instrument-panel pushbutton switches. All normal flight functions could be controlled from the buttons on the stick and throttle (HOTAS). Two 20-in. (0.51-m) color displays presented information to the pilot. One showed the "out-the-nose" picture from the on-board video camera, with head-up display (HUD) symbology overlaid to obtain a true 1:1 correlation with the video imagery. The second screen showed a map indicating the location of the aircraft in the test area, along with numerous system-status and warning indicators. Finally, a microphone located inside the aircraft provided valuable audio information on aircraft systems and engine performance.

Piloting the X-36 is an intensely visual task. Gone are the large field of regard, the subtle "seat-of-the-pants" inputs, and the numerous audio cues that normally allow the test pilot to have better situational awareness than anyone else on the test team. Instead, the pilot must rely on a rapid and focused cross-check of his/her displays and precise communications with the test conductor. To aid concentration, the pilot was isolated from the rest of the control-room area by a curtain, and his communications were restricted to the flight director and radio traffic using a separate "flight loop" intercom setup. Behind the pilot was a position that allowed another pilot or engineer to view the displays. This position proved useful both for pilot training and for providing "copilot type" assistance to the flying pilot.

The display symbology needed to meet two conflicting requirements. First, it had to provide enough information to the pilot, in an intuitive format, to compensate for the lack of audio and physical cues discussed previously. Second, the symbology needed to be uncluttered enough to allow the pilot to quickly find and assess key flight parameters. During rapid maneuvering, the pilot often needed to track airspeed, angle of attack, bank angle, and normal acceleration precisely, while watching for any indication of sideslip or angle-of-attack excursions, engine-compressor stalls, and the like. The X-36 symbology was developed primarily by the chief contractor pilot and reflected his experience with F-15 and F-18 aircraft. I found the symbology very complete but often too cluttered, and a large portion of my training was devoted to finding the correct cross-check for each maneuver block. The tailoring of symbology is an issue in all types of aircraft, but for a remotely piloted aircraft (RPA), the correct symbology set is often critical to flight safety and mission success.

The key to my success on these sorties was the 10 hours of high-fidelity simulation training I received prior to the first takeoff. I "piloted" the aircraft from the actual RPA cockpit with the most current aircraft model and a simulated out-the-nose visual presentation. Both normal and emergency operations were practiced until I was proficient. The simulator training culminated in a full practice of the test mission prior to the actual flight. Additional training included an actual engine start and a high-speed taxi test to 70 kn on the lakebed runway.

The test flights closely followed the simulator training. The takeoff was accomplished from lakebed runway 15, and the aircraft quickly climbed into the test airspace. Basic aircraft control was easily accomplished using the HUD symbology and the video image. The test maneuvers included a series of control-stick and rudder-pedal rolling maneuvers at various airspeeds and angles of attack. Several level accelerations were accomplished to expand the speed envelope of the aircraft. The aircraft exhibited excellent flying qualities throughout the flight. All the rolls were rapid, and the bank-angle and angle-of-attack targets were precisely tracked. The aircraft was able to change speed rapidly, and the lack of any audio or physical speed cues required the pilot to pay extra attention to the airspeed display.

Perhaps the most challenging part of the flight was the landing pattern. Airspace restrictions forced us to use a continuous turn to final, which meant that the runway was not in sight until very late in the approach. The turn was made using the moving-map display to ensure the correct offset. This maneuver required the use of almost all the information presented to the pilot. The landing was accomplished by simply establishing an on-speed descent on a glidepath of about 1 degree. The aircraft touched down smoothly, and ground control and braking were very easy.
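A 1-degree glidepath is far shallower than a typical 3-degree airliner approach, which is why the touchdown was so gentle. A rough sketch of the geometry (the 100-kn approach speed below is an assumed, illustrative value; the article does not give the X-36 approach speed):

```python
import math

KNOT_TO_FT_PER_MIN = 6076.12 / 60.0  # 1 kn of along-path speed ≈ 101.27 ft/min

def sink_rate_fpm(airspeed_kn: float, glide_angle_deg: float) -> float:
    """Vertical speed (ft/min) for a steady, no-wind descent at the
    given airspeed and flight-path angle."""
    return airspeed_kn * KNOT_TO_FT_PER_MIN * math.sin(math.radians(glide_angle_deg))

# Assumed 100-kn approach: a 1-degree path gives a very gentle descent,
# roughly a third of the sink rate of a standard 3-degree approach.
print(round(sink_rate_fpm(100.0, 1.0)))  # ~177 ft/min
print(round(sink_rate_fpm(100.0, 3.0)))  # ~530 ft/min
```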

My experience with this program confirmed my belief that successful development and flight testing of a UAV requires the same discipline and expertise as any other aircraft. Years of flight test experience have defined a set of "flight test best practices." Simply stated, these embody the attitudes and processes that have proven critical to mission success. A few of the key points are:

  1. Robust Designs and Quality Construction - A major "benefit" of going with a UAV is the ability to simplify the systems of the vehicle. However, simple, nonredundant systems demand careful design to reduce the effects of failures. A "graceful degradation" of system performance is desired. Critical "single string" systems (such as flight controls) are only successful when supported by high-quality parts and construction. "Off-the-shelf" technologies integrated into a new vehicle do not ensure low-risk flight operations.
  2. Hazard Analysis - The failure of the UAV prior to meeting its mission objectives is unacceptable. The hazard analysis must properly identify the probability and the severity of the hazards the vehicle may encounter. Once identified, steps must be taken to mitigate each of the hazards to the lowest level possible. Only then can the program present to management the true level of risk that must be accepted during flight test.

    Also, the program must avoid the "expendability mind-set," which accepts that UAVs fail at a higher rate than other aircraft. This attitude may result in accepting a substandard system or procedure. Today's UAVs are often not cheap, throw-away aircraft. They are sophisticated, expensive, and often one-of-a-kind aircraft, the damage or loss of which has a major impact on the mission success of the program.

  3. Configuration Control - The hardware and software of the system will always change during the course of development. A well-designed and well-built vehicle cannot maintain its high standard without a process for identifying and controlling changes to the baseline.
  4. Test Planning and Test Mission Conduct - Programs need to recognize that flight test personnel can make valuable inputs to the design of the entire system. Often, the best engineering choice is the one that satisfies the operational requirements for the vehicle, and the flight tester usually has the most experience in this area. Also, it is important to recognize that it is normally unacceptable to take shortcuts in the flight test process. Flight safety and mission success are seriously impacted by the omission of any of the "best practices" discussed.

Overall, the X-36 program was both challenging and educational. In my opinion, it is an excellent example of how to conduct UAV developmental flight test.

This work was done by Dana D. Purifoy of Dryden Flight Research Center. DRC-97-55


This article first appeared in the June 1999 issue of NASA Tech Briefs Magazine.
