Blade Fatigue Life Assessment With Application to VAWTS

Author and Article Information
P. S. Veers

Sandia National Laboratories, Division 5523, Albuquerque, N.M. 87185

J. Sol. Energy Eng 104(2), 106-111 (May 01, 1982) (6 pages) doi:10.1115/1.3266281 History: Received March 04, 1982; Online November 11, 2009


A cursory analysis of the stress history of wind turbine blades indicates that a single stress level at each wind speed does not adequately describe the blade stress history. A statistical description is required. Blade stress data collected from the DOE/ALCOA Low Cost experimental turbines indicate that the Rayleigh probability density function adequately describes the distribution of vibratory stresses at each wind speed. The Rayleigh probability density function allows the distribution of vibratory stresses to be described by the RMS of the stress versus time signal. With the RMS stress level described for all wind speeds, the complete stress history of the turbine blades is known. Miner’s linear cumulative damage rule is used as a basis for summing the fatigue damage over all operating conditions. An analytical expression is derived to predict blade fatigue life. Input to the blade life expression includes a basic blade S-N curve, RMS stress versus wind speed data, the probability density function of vibratory stress, and the probability density function which describes the wind speed distribution. The implications of the assumptions and the limitations of this approach are discussed.
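The damage-summation approach described in the abstract can be sketched numerically. The snippet below is an illustrative reconstruction, not the paper's exact closed-form expression: it assumes a power-law S-N curve N(S) = K·S⁻ᵇ, Rayleigh-distributed vibratory stress amplitudes whose scale follows the signal RMS at each wind speed (narrow-band assumption), a Rayleigh wind-speed distribution, and a constant cycle rate. The function names and parameters (`rms_stress`, `cycle_rate_hz`, `K`, `b`) are hypothetical placeholders for the measured inputs the paper calls for.

```python
import numpy as np
from math import gamma

def fatigue_life_years(rms_stress, cycle_rate_hz, mean_wind, K, b):
    """Estimate blade fatigue life (years) via Miner's linear damage rule.

    Illustrative assumptions (not the paper's exact expression):
      - S-N curve: N(S) = K * S**(-b), cycles to failure at amplitude S.
      - Vibratory stress amplitude at wind speed V is Rayleigh with
        scale equal to the signal RMS given by rms_stress(V).
      - Wind speed V is Rayleigh distributed with mean `mean_wind`.
    """
    # Discretize wind speed (m/s) for numerical integration.
    V = np.linspace(0.1, 60.0, 600)
    dV = V[1] - V[0]

    # Rayleigh wind-speed pdf with the requested mean.
    sigma_v = mean_wind * np.sqrt(2.0 / np.pi)
    p_v = (V / sigma_v**2) * np.exp(-V**2 / (2.0 * sigma_v**2))

    # Expected damage per cycle at each wind speed:
    # E[1/N(S)] = E[S**b] / K, and for Rayleigh(scale=sigma),
    # E[S**b] = sigma**b * 2**(b/2) * Gamma(1 + b/2)  (closed form).
    sigma_s = np.asarray([rms_stress(v) for v in V])
    damage_per_cycle = sigma_s**b * 2.0**(b / 2.0) * gamma(1.0 + b / 2.0) / K

    # Damage accumulation rate per second, averaged over the wind pdf.
    damage_rate = np.sum(p_v * cycle_rate_hz * damage_per_cycle) * dV

    # Miner's rule: failure when cumulative damage reaches 1.
    return 1.0 / (damage_rate * 3600.0 * 24.0 * 365.0)
```

Because the per-cycle damage scales as 1/K, the predicted life is directly proportional to the S-N constant K, which makes the result easy to sanity-check against the closed-form expression.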

Copyright © 1982 by ASME