Technical seminar highlights progress in integrating design of experiments at AEDC

  • By Philip Lorenz III
  • AEDC/PA
A recent lunch-and-learn Technical Excellence seminar on the concept behind "design of experiments" (DOE) was the latest salvo in AEDC leadership's push to integrate that approach into ground testing at the world's largest ground testing complex.

For approximately three years, an effort has been underway at AEDC to formally bring DOE into the picture - an effort that has included seminars, classes and case-by-case applications of the approach to testing, test planning and the processes that support testing.

Jerry Kitchen, AEDC's deputy director of engineering and technical management, is charged with guiding the introduction and incorporation of DOE into ground testing at the base.

According to Kitchen and other advocates of the approach, DOE can be applied to ground testing at AEDC, enabling engineers to determine simultaneously both the individual and interactive effects of the many factors that could affect the results of a given test and the effect those factors will have on the system being tested.

Glen Lazalier, an aerospace engineer whose AEDC career has spanned 46 years, explained how DOE works.

"DOE is the application of a rigorous process to the selection of the test points used to produce optimal information from a given set of test resources," he said. "DOE includes ways to investigate both the direct and interactive effects of multiple input variables on desired output variables by simultaneously varying the inputs in a disciplined and mathematically appropriate manner."

In 1976, Lazalier was one of the first engineers at Arnold to apply DOE to a test.

While working on enhanced efficiencies for the testing and evaluation of gas turbine aircraft engines, he found that DOE was the best approach for the project. Already familiar with DOE from his undergraduate studies, he saw an ongoing J57 engine evaluation project as an ideal opportunity to develop the test plan and execute the test using DOE.

"The results of the application of DOE showed a significant increase in information produced from a specified set of test resources," Lazalier said. "However, in subsequent years proposed uses of DOE were often met with reluctance by the user community who were more comfortable with a one-at-a-time test parameter variation methodology.

"While many DOE methodologies are limited by an inability to address significant discontinuities in parametric variations, their use for situations in which there is a 'smooth' variation will provide a marked multiplier on information returned for a given resource use."

More recently, an effort has been underway at AEDC to formally incorporate DOE in ground testing whenever it is appropriate and applicable.

"We've been applying DOE on everything we can at APTU (Aerodynamic and Propulsion Test Unit)," said Dusty Vaughn, an APTU project engineer.

Vaughn's team initially tried to incorporate DOE into work on the facility's combustion air heater (CAH). This effort proved to be unsuccessful, but appropriate uses for DOE were subsequently found and explored.

"At the conclusion of the CAH activation project, we completed projects scoped to characterize three of the APTU fixed area ratio nozzles in preparation for the FaCET (Falcon Combined Cycle Engine Test) test," he said. "With help from Dr. (Doug) Garrard we were able to develop and apply DOE to characterize the FaCET inlet capture area of the nozzle exit flow.

"We were then able to use that correlation to set FaCET desired conditions with 100 percent success. The approach was verified during the FaCET runs where the data they collected agreed well within the required specifications they requested of the desired set point."

The most recent application of DOE to a ground test at AEDC, and to a subsequent flight test series, centered on the development and validation of a Towed Airborne Plume Simulator (TAPS) for the DoD Center for Countermeasures, based in White Sands, N.M.

When AEDC's Dr. Robert Hiers was helping his team with the design and characterization of the device that evolved into TAPS, he became aware of DOE through a directive written by Dr. J. Michael Gilmore, the director of operational test and evaluation for the Office of the Secretary of Defense.

Dr. Hiers said, "That directive filtered down [to us] at the same time we were designing our flight test series, and we had been struggling with what approach to take to characterize TAPS."

Kitchen points out that DOE is not always the appropriate approach to a given ground test at AEDC.

"Design of experiments is a planned approach for determining real relationships between inputs and outputs of any process or system that is measurable," he said. "[However], DOE is just another tool in our toolbox."

Dr. Hiers, who agrees with Kitchen's assessment, said, "You have to ask the right questions to determine whether DOE is going to be a valid [and] appropriate tool."

Gregg Hutto, Wing Operations Analyst with the 46th Test Wing at Eglin AFB, Fla., has been one of the leading advocates for the Air Force's use of DOE in both ground and flight testing.

In an American Institute of Aeronautics and Astronautics report, "Application of Design of Experiments to Flight Test: A Case Study," Hutto wrote, "In times of enormously expensive flight test programs, the efficiencies realized through the application of designed experiments to flight test could mean the difference between the timely delivery of a needed capability to the warfighter; an over cost, late, under-performing system; or outright cancellation of the system. Design of experiments has the capability to make flight test safer and more efficient."

Hutto said his advocacy for a DOE approach to testing is based on practical considerations.

"Our passion for experimental design is easily explained," he said. "With a test constructed according to the principles of DOE; we are assured that we will rarely fail to discover deficiencies wherever they exist. That is, our tests are effective in uncovering flaws.

"At the same time, we can 'right-size' the experimental effort - calling for no more, or no fewer trials than are required to learn the truth. A designed experiment is efficient - the least cost for what we must know. We experiment well so our warriors do not have to. The ultimate cost of poor testing is failure in combat."