AEDC adapts propulsion technology for wind tunnel tests of F-35
By Darbie Sizemore
/ Published May 02, 2007
ARNOLD AIR FORCE BASE, Tenn. --
A recent series of successful F-35 Joint Strike Fighter (JSF) aerodynamic tests in Arnold Engineering Development Center's (AEDC) 16-foot transonic wind tunnel (16T) demonstrated the center's ability to adapt propulsion test technologies for use in wind tunnel tests.
"We developed and implemented technologies necessary to analyze high-speed F-35 inlet data in near-real time within two weeks of meeting with the Lockheed Martin F-35 team members in Fort Worth," said Dr. Donald J. Malloy, Aerospace Testing Alliance's project manager for an Air Force test and evaluation improvement program with Air Force Flight Test Center at Edwards AFB, Calif. "This demonstration highlights the fact that there is often a better way to do the things we do."
Moreover, this test marked the first time that automated data quality checks were performed in near-real time on high-speed data in 16T.
"Our goal in developing these data analysis tools for dynamic measurements is to ensure the highest level of data quality in near-real time by assessing instrumentation system performance and making the data accessible to the analysts," said Dr. Charlie Vining, AEDC's aeropropulsion technology manager and Naval Air Systems (NAVAIR) research and engineering associate fellow.
"Partnership with the Air Force Flight Test Center has been critical to the successful development of these tools. We are looking forward to developing a partnership with the Naval Air Systems Command Aircraft Division to further develop these analysis tools with potential application to F-35 flight testing."
According to Dr. Malloy, the AEDC team relied heavily on the expertise of senior programmer Tommie Heard to interface with the portable data acquisition system used to acquire the high-speed F-35 inlet data. The results were impressive: 25,000 data points acquired in seven days, with 50 billion samples of data screened in near-real time.
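The article does not describe the screening software itself, but handling sample counts of that magnitude in near-real time implies processing data in blocks as they are acquired rather than after the run. A minimal sketch of that pattern, with made-up block sizes and an illustrative threshold, might look like:

```python
import numpy as np

def stream_screen(chunks, limit=6.0):
    """Screen an incoming stream chunk-by-chunk, as a near-real-time
    system would, rather than loading an entire run into memory.

    chunks: iterable of 1-D sample arrays (one block per acquisition cycle)
    limit:  absolute threshold for flagging a sample (illustrative only)
    """
    screened = flagged = 0
    for block in chunks:
        block = np.asarray(block, dtype=float)
        screened += block.size
        flagged += int(np.count_nonzero(np.abs(block) > limit))
    return screened, flagged

# Simulate 10 acquisition blocks of 100,000 samples each, one bad sample
rng = np.random.default_rng(1)
blocks = [rng.normal(0.0, 1.0, 100_000) for _ in range(10)]
blocks[7][123] = 9.9           # a brief transient the screen should catch
screened, flagged = stream_screen(blocks)
print(screened, flagged)
```

Running totals per block keep memory use constant, which is what makes screening feasible while the test is still underway.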
"Most of the anomalies can be detected by analysts if they are looking for them," Dr. Malloy said. "The first problem is that there is so much data that it is impossible for someone to look at all of it and as a result, intermittent anomalies may go undetected. The second problem is that you have to have the test technologies and computational resources necessary to detect the anomalies."
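The article does not name the detection algorithms. One common kind of automated check consistent with Dr. Malloy's description, flagging out-of-range samples and brief spikes that an analyst scanning billions of samples could easily miss, can be sketched as follows (the function name and thresholds are illustrative, not AEDC's):

```python
import numpy as np

def screen_channel(samples, lo, hi, spike_sigma=6.0):
    """Flag out-of-range samples and intermittent spikes in one channel.

    samples:     1-D array of raw readings from a single transducer
    lo, hi:      valid physical range for the measurement
    spike_sigma: deviations from the channel mean, in standard
                 deviations, that count as a spike (illustrative)
    """
    samples = np.asarray(samples, dtype=float)
    out_of_range = (samples < lo) | (samples > hi)
    resid = samples - samples.mean()
    sigma = resid.std()
    spikes = (np.abs(resid) > spike_sigma * sigma) if sigma > 0 \
        else np.zeros_like(out_of_range)
    return out_of_range | spikes

# A clean channel with one injected intermittent anomaly
rng = np.random.default_rng(0)
data = rng.normal(10.0, 0.1, 100_000)
data[42_000] = 25.0            # a single bad sample among 100,000 good ones
flags = screen_channel(data, lo=5.0, hi=15.0)
print(int(flags.sum()), np.flatnonzero(flags))
```

The point of the example is the second problem Dr. Malloy raises: the check is simple, but only a machine can apply it to every sample of every channel.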
For an inlet test, a minimum of 40 measurements is required at the aerodynamic interface plane between the aircraft inlet and the engine to characterize the flow entering the engine.
"If there are too many invalid measurements or, in the worst possible scenario, there are invalid measurements that you are unaware of, you can't characterize the distortion level entering the engine and what the effect of that distortion is on the engine's ability to operate continuously without surging," Dr. Malloy said.
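In industry practice (SAE ARP1420), the 40 measurements are total-pressure probes arranged in rakes at the aerodynamic interface plane, commonly eight rakes of five probes each. A simplified, illustrative computation of a circumferential distortion descriptor from such an array, not the F-35 program's actual metric, might look like:

```python
import numpy as np

# Synthetic 8-rake x 5-ring array of total pressures at the
# aerodynamic interface plane (Pa): a uniform field with one
# low-pressure sector, the kind of pattern a high angle-of-attack
# maneuver might produce.
p = np.full((8, 5), 101_000.0)   # 8 circumferential rakes, 5 radial rings
p[2:4, :] -= 4_000.0             # depressed sector spanning two rakes

p_avg = p.mean()                 # face-average total pressure
ring_avg = p.mean(axis=0)        # average pressure on each radial ring
ring_min = p.min(axis=0)         # lowest pressure on each radial ring

# Simple per-ring circumferential distortion descriptor:
# (ring average - ring minimum) / face average
dpcp = (ring_avg - ring_min) / p_avg
print(dpcp.round(4))             # → [0.03 0.03 0.03 0.03 0.03]
```

It also makes Dr. Malloy's concern concrete: if the probes in the depressed sector were invalid and silently dropped, the computed distortion would collapse toward zero and the surge risk would be understated.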
The flow field entering the engine is highly dynamic and susceptible to distortion from various sources, including high angle-of-attack maneuvers, heavy crosswinds and wakes from aircraft and missiles.
"To ensure reliable engine and airframe integration, it is critical to fully assess this flow field for temporal and spatial variations," said Maj. Kurt Rouser, deputy chief of AEDC's Aeropropulsion Systems Test Division.
Dr. Malloy said the team was confident they could develop the propulsion capability into a technique that could be used in a wind tunnel. "We have a lot of experience applying state-of-the-art test technologies to validate propulsion system data in ground test or flight test," he said.
"We've always known that these technologies could be extended to aero data - either internal aero data from a sub-scale inlet test or external aero data from a sub-scale aircraft model."
During the test, Dr. Malloy said the goal was to ensure that the data quality was consistent with pre-test estimates. "We wanted to look at enough of the data to verify that the data was of the quality we promised the customer," he said.
"We were hoping that there would be no anomalies and that, if there were, they could be quickly repaired or that there would be enough good measurements that we wouldn't have to stop the test."
There was only one measurement anomaly, and the analysis software alerted analysis engineers the instant the measurement failed.