Date of Award

Spring 2011

Document Type

Thesis - Open Access

Degree Name

Master of Aerospace Engineering

Department

Graduate Studies

Committee Chair

Dr. Eric v. K. Hill

Committee Member

Dr. William C. Barott

Committee Member

Christopher D. Hess

Abstract

The purpose of this project was to investigate how accurately an artificial neural network could predict the ultimate compressive loads of impact-damaged 24-ply graphite-epoxy coupons from ultrasonic C-scan images. The 24-ply graphite-epoxy laminates were manufactured from bidirectional preimpregnated tape and cut into 21 coupons, each 4 inches by 6 inches. The coupons were impacted at known impact energies of 10, 12, 14, 16, 18, and 20 Joules in order to create barely visible impact damage (BVID). The coupons were then scanned with an ultrasonic C-scan system to create an image of the damaged area. Each coupon was then compressed to failure to determine its ultimate compressive load.

Numeric values for each pixel were determined from the C-scan image. Since the image was represented as a red-green-blue (RGB) map, each pixel had three numbers associated with it, one for each of the three colors. To make the image readable to the artificial neural network, the columns of the resulting matrix were summed, and these sums were used as inputs to a backpropagation neural network (BPNN) to generate predictions of the ultimate compressive loads. The BPNN was trained and optimized on 15 of the 21 sample data sets and tested on the remaining 6 sample data sets. The optimized BPNN was able to produce ultimate compression after impact (CAI) load predictions for the BVID composite coupons with a worst case error of -8.98%. This was within the ±10% goal for this research and comfortably within the B-basis allowables commonly applied to composite structures.
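The column-summing step described above can be sketched as follows. This is a minimal illustration, not the thesis's actual code: the function name, image shape, and feature ordering are assumptions, and only the reduction from an RGB pixel matrix to a per-column input vector matches the description in the abstract.

```python
import numpy as np

def cscan_to_inputs(image):
    """Collapse an RGB C-scan image (rows x cols x 3) into a feature
    vector by summing down each column of each color layer, as the
    abstract describes. Shapes and ordering here are illustrative."""
    image = np.asarray(image, dtype=float)
    # Sum over the rows: one value per (column, color-layer) pair.
    col_sums = image.sum(axis=0)        # shape: (cols, 3)
    # Flatten layer by layer: all R sums, then G, then B.
    return col_sums.T.ravel()

# Synthetic 4x5 "image" of ones: every column sum is 4.0.
features = cscan_to_inputs(np.ones((4, 5, 3)))
```

A vector like `features` (one entry per column per color layer) would then serve as one input pattern to the BPNN, with the measured ultimate compressive load as the training target.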

The ultrasonic C-scan images were then preprocessed using Fast Fourier Transforms (FFTs) in an effort to remove any image noise present. The results of the BPNN that was trained and tested on the green color data only were then compared to the results yielded by the BPNN trained and tested on the images that were processed through the FFT. It was found that the FFT processed images had a worst case BPNN prediction error of 8.65%, which was only slightly lower in magnitude than the -8.98% error generated by the unprocessed green-layer-only C-scan image data. This marginal improvement in worst case error suggested that the added work involved in FFT preprocessing was not as productive as had been hoped, leading to a few suggestions for future noise removal research. It also reinforced the notion that BPNNs, being iterative optimization schemes, can provide accurate predictions in the presence of at least small amounts of noise. Thus, image filtering methods coupled with the iterative optimization technique that comprises a BPNN have demonstrated the ability to generate accurate CAI load predictions in composite coupons that have experienced BVID.
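One common form of FFT-based noise removal consistent with the paragraph above is a 2-D low-pass filter: transform a color layer, zero the high-frequency coefficients, and invert. This is a hedged sketch only; the thesis does not specify the filter design, and the function name and `keep_fraction` cutoff below are assumptions introduced for illustration.

```python
import numpy as np

def fft_denoise(layer, keep_fraction=0.1):
    """Low-pass filter one color layer of a C-scan image via a 2-D FFT.

    Keeps only the lowest-frequency `keep_fraction` of coefficients
    along each axis (the four corners of the unshifted spectrum) and
    zeros the rest, suppressing fine-grained image noise.
    """
    spectrum = np.fft.fft2(np.asarray(layer, dtype=float))
    rows, cols = spectrum.shape
    kr, kc = int(rows * keep_fraction), int(cols * keep_fraction)
    # Low frequencies live in the corners of an unshifted FFT.
    mask = np.zeros(spectrum.shape, dtype=bool)
    mask[:kr, :kc] = True
    mask[:kr, -kc:] = True
    mask[-kr:, :kc] = True
    mask[-kr:, -kc:] = True
    # Zero high frequencies and invert; the result is real-valued
    # up to numerical round-off.
    return np.real(np.fft.ifft2(np.where(mask, spectrum, 0.0)))

# A constant layer has only a DC component, so it passes through
# essentially unchanged.
smoothed = fft_denoise(np.full((8, 8), 5.0), keep_fraction=0.25)
```

The filtered layer would then be reduced to column sums and fed to the BPNN in the same way as the unprocessed green-layer data.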
