Date of Award

Spring 2004

Document Type

Thesis - Open Access

Degree Name

Master of Aerospace Engineering

Department

Aerospace Engineering

Committee Chair

Eric v. K. Hill

Committee Member

Yi Zhao

Committee Member

David J. Sypeck

Abstract

The research presented herein demonstrates the feasibility of predicting ultimate strengths in composite beams subjected to 3-point bending using a neural network analysis of acoustic emission (AE) amplitude distribution data. Fifteen unidirectional fiberglass/epoxy beams were loaded to failure in a 3-point bend test fixture in an MTS load frame. AE data were recorded from the onset of loading until failure. After acquisition, the AE data were filtered to include only data acquired up to 80 percent of the average ultimate load.
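As a rough illustration of this preprocessing step (not the author's code), the following Python sketch filters AE hits at the 80 percent load cutoff and bins their amplitudes into a 61-channel distribution; the (load, amplitude) data layout and the 40-100 dB amplitude range are assumptions.

```python
import numpy as np

def filter_ae_hits(hits, avg_ultimate_load, cutoff_fraction=0.80):
    """Keep only AE hits recorded at or below 80 percent of the
    average ultimate load. `hits` is assumed to be an array of
    (load, amplitude_dB) rows; the layout is illustrative only."""
    hits = np.asarray(hits, dtype=float)
    return hits[hits[:, 0] <= cutoff_fraction * avg_ultimate_load]

def amplitude_distribution(filtered_hits, n_bins=61, db_range=(40.0, 101.0)):
    """Bin hit amplitudes into a 61-channel event-frequency distribution,
    matching the 61 network inputs described below. The 40-100 dB span
    (1 dB channels) is an assumed, typical AE amplitude range."""
    counts, _ = np.histogram(filtered_hits[:, 1], bins=n_bins, range=db_range)
    return counts
```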

A backpropagation neural network was constructed to predict the ultimate failure load from these AE amplitude distribution data. Architecturally, the network consisted of an input layer of 61 processing elements (one for each event frequency in the amplitude distribution), a hidden layer of 13 processing elements for mapping, and a single processing element output layer for predicting the ultimate load. The network, trained on seven beams, was able to predict the ultimate loads of the remaining eight beams with a worst-case error of +4.34 percent, which was within the desired goal of ±5 percent.
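One way to realize a comparable 61-13-1 backpropagation network is sketched below using scikit-learn's MLPRegressor; this is illustrative only, not the software used in the thesis, and the training data shown are placeholders.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)
X_train = rng.random((7, 61))            # 7 training beams x 61 amplitude channels (placeholder data)
y_train = rng.uniform(900.0, 1100.0, 7)  # placeholder ultimate loads

# 61 inputs -> 13 hidden processing elements -> 1 output (ultimate load).
net = MLPRegressor(hidden_layer_sizes=(13,),
                   activation="logistic",   # sigmoid units, as in classic backpropagation
                   solver="lbfgs",
                   max_iter=5000,
                   random_state=0)
net.fit(X_train, y_train)

X_test = rng.random((8, 61))             # 8 held-out beams (placeholder data)
predicted_loads = net.predict(X_test)
# Percent error would then be computed against each test beam's measured ultimate load.
```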

A second analysis was performed using a Kohonen self-organizing map (SOM) combined with multivariate statistical analysis. The SOM was used to classify the AE data into four failure mechanisms, and multivariate statistical analysis was then performed on the number of hits associated with each failure mechanism to develop a prediction equation. The prediction equation was able to predict the ultimate failure load with a worst-case error of -11.34 percent, which was well outside the desired goal of ±5 percent. This was thought to be the result of noisy or sparse data, since statistical predictions are inherently sensitive to both, whereas backpropagation neural networks are not.
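A minimal sketch of this second approach, assuming the third-party MiniSom library and per-hit AE features such as amplitude, duration, energy, and counts (these feature choices are assumptions, not the thesis's documented inputs): a 2x2 Kohonen map yields four clusters, and an ordinary least-squares fit on the per-beam hit counts stands in for the multivariate prediction equation.

```python
import numpy as np
from minisom import MiniSom  # third-party Kohonen SOM library, used here only as a stand-in

def classify_hits(hit_features, n_rows=2, n_cols=2, iterations=5000):
    """Assign each AE hit to one of four map nodes (failure-mechanism classes)."""
    som = MiniSom(n_rows, n_cols, hit_features.shape[1],
                  sigma=1.0, learning_rate=0.5, random_seed=0)
    som.train_random(hit_features, iterations)
    return np.array([n_cols * i + j for i, j in (som.winner(x) for x in hit_features)])

def fit_prediction_equation(hit_counts_per_class, ultimate_loads):
    """Least-squares fit of load ~ b0 + b1*x1 + ... + b4*x4, where x1..x4 are
    the per-beam hit counts for each failure mechanism."""
    X = np.column_stack([np.ones(len(ultimate_loads)), hit_counts_per_class])
    coeffs, *_ = np.linalg.lstsq(X, ultimate_loads, rcond=None)
    return coeffs
```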
