Approximately thirty years ago, the design methodology used by structural engineers was deterministic. Discrete values were used to define loads/demands (D) and resistances/capacities (R). In the Working Stress Design (WSD) method, a design was considered satisfactory if the ratio of R to D (the nominal resistance and nominal demand, respectively) was greater than or equal to a prescribed factor of safety, whose minimum value was based on engineering experience and varied depending on the structural element being designed. There are drawbacks to the WSD method*, one of which is that it does not adequately account for the variability inherent in R and D.
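As a minimal sketch (all values below are hypothetical, chosen only for illustration), the WSD check reduces to a single ratio comparison:

```python
# Working Stress Design (WSD) check with hypothetical values.
R_nominal = 500.0   # nominal resistance, e.g. kN
D_nominal = 200.0   # nominal demand, e.g. kN
FS_min = 2.0        # prescribed minimum factor of safety (element-dependent)

factor_of_safety = R_nominal / D_nominal
print(f"FS = {factor_of_safety:.2f}: "
      f"{'OK' if factor_of_safety >= FS_min else 'NOT OK'}")
```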
Today, most structural engineers incorporate probabilistic design into a new design philosophy called Limit State Design (LSD). From my graduate class notes:
“[LSD] is a design philosophy that requires the designer to recognize the various limit states for his/her structure and design to reduce the probability of each of these limit states being exceeded to an acceptably low level.”
It is now widely recognised that there are uncertainties in determining both R and D. Uncertainties in D (loads/demand) arise because loads vary with location and time (e.g. there are no snow loads in summer, and a building's use may change). Different types of loads acting in combination further increase the amount of uncertainty. Three main factors affect uncertainties in R (resistance/capacity): geometry, material properties, and theory. The geometry of the member may differ from what was assumed during design (this is especially commonplace for cast-in-place concrete members). Similarly, material properties, such as strength, may differ from the values assumed. Finally, the strength of the member is calculated using simplified equations that may not exactly represent its actual behaviour, which can be quite complex.
The ranges of values for R and D are assumed to be represented by lognormal distribution curves (see Figure 1). This assumption has been found to be reasonable because of the quality control of material properties and the positive skewness of known load distribution curves. When R and D are plotted on the same graph, the region where the two curves overlap (D > R) is where there is a non-zero probability of failure. This method of determining the likelihood of failure is known as statistical interference.
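To make statistical interference concrete, here is a minimal Monte Carlo sketch that estimates the overlap probability P(D > R) for lognormal R and D. The means and coefficients of variation are hypothetical, chosen only for illustration:

```python
import numpy as np

rng = np.random.default_rng(42)

def lognormal_params(mean, cov):
    """Convert a mean and coefficient of variation to the underlying
    normal parameters (mu, sigma) of a lognormal distribution."""
    sigma2 = np.log(1.0 + cov**2)
    return np.log(mean) - 0.5 * sigma2, np.sqrt(sigma2)

# Hypothetical statistics for resistance R and demand D (illustration only)
mu_R, s_R = lognormal_params(mean=500.0, cov=0.10)
mu_D, s_D = lognormal_params(mean=300.0, cov=0.25)

n = 1_000_000
R = rng.lognormal(mu_R, s_R, n)  # sampled resistances
D = rng.lognormal(mu_D, s_D, n)  # sampled demands

# Failure corresponds to the overlap region: wherever demand exceeds resistance
p_fail = np.mean(D > R)
print(f"Estimated P(D > R) = {p_fail:.2e}")
```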
A convenient way to assess the probability of failure (D > R) is to consider a single distribution curve representing the safety margin S, where S = R – D (see Figure 2). The portion of the curve on the negative side of the graph (D > R) represents the probability of failure. The distance from the mean of S to this failure boundary is βσ_S, where β is the safety (or reliability) index and σ_S is the standard deviation of S. Figure 2 implies that the higher β is, the lower the probability of failure will be.
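A short sketch of the relationship between β and the probability of failure, assuming a normal approximation for S (the classic first-order formulation, β = μ_S/σ_S; all numbers hypothetical):

```python
from scipy.stats import norm

# If the safety margin S = R - D is approximately normal with mean mu_S
# and standard deviation sigma_S, then beta = mu_S / sigma_S and
# P(failure) = P(S < 0) = Phi(-beta).
mu_S = 200.0     # hypothetical mean of S, e.g. kN
sigma_S = 60.0   # hypothetical standard deviation of S

beta = mu_S / sigma_S
print(f"beta = {beta:.2f}  ->  P_f = {norm.cdf(-beta):.2e}")

# Raising beta drives the failure probability down rapidly:
for b in (2.0, 3.0, 4.0):
    print(f"beta = {b:.1f}  ->  P_f = {norm.cdf(-b):.2e}")
```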
However, most engineers (including myself) prefer to use deterministic values rather than probability distributions in their design calculations. Thus, the LSD method uses nominal values of D and R that are multiplied by load and resistance factors, respectively, calibrated to an acceptably low probability of failure. For a design to pass, the factored resistance must be greater than or equal to the factored demand: φR ≥ αD, where φ is the resistance factor (generally less than 1) and α is the load factor (generally greater than 1). The equations for φ and α are derived from the distribution curve of S (Figure 2) using statistical mathematics and algebra. Thus, when engineers use LSD, they can work with discrete (nominal) values for R and D while accounting for uncertainties through load and resistance factors that were developed using probabilistic design theory.
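The resulting deterministic check is then trivial to apply. A sketch with hypothetical factor values (actual φ and α vary by design code, material, and load type):

```python
# Limit State Design check with hypothetical factors (illustration only).
phi = 0.90      # resistance factor (generally < 1)
alpha = 1.50    # load factor (generally > 1)

R_nominal = 500.0   # nominal resistance, e.g. kN
D_nominal = 280.0   # nominal demand, e.g. kN

if phi * R_nominal >= alpha * D_nominal:
    print("Design passes: phi*R >= alpha*D")
else:
    print("Design fails: increase capacity or reduce demand")
```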
Now that we've seen how structural engineers make use of probability theory in their designs, let's shift gears a bit. Is it plausible that the engineering method used to design life's biodiversity was based on probabilistic design? A piece of supporting evidence is that several evolutionary mechanisms tend to be stochastic processes, which means their behaviour could potentially be represented by distribution curves (normal, lognormal, or other). Thus, these curves could potentially be incorporated into an engineering design methodology: designing the first life form so that it evolves according to a preconceived plan (i.e. a design objective).
I’m not alone in suggesting probabilistic design could have been part of the engineering design method used by a front-loading engineer.
“Life’s designer may have also made an intelligent use of chance. …the bait could have been the entire cell, or set of heterogeneous cells. What the blind watchmaker could subsequently find was then constrained by the carefully chosen initial conditions. …life’s initial conditions [may] have been rigged by the design of the cell’s architecture and the choice of which components to employ.” (emphasis mine)
The Design Matrix, Chapter 7, p. 153
The terms “bait” or “baiting evolution” are mentioned (in one form or another) several times in Chapter 7. IMO, this suggests that a front-loading engineer could use stochastic properties of evolutionary mechanisms to design the structure of the first life forms to achieve certain design objectives (my apologies to Mike if this was not his intended message).
Let me be clear. I am not suggesting that probabilistic design was actually used by a front-loading engineer. I am only saying this is an intriguing, yet plausible, option that deserves a closer look.
*This is not to say that structures designed using WSD were unsafe and ready to collapse at any moment. Quite the opposite: WSD generally leads to "overly" safe, robust, and (more often than not) costly structures compared to structures designed using the LSD method.
"I am only saying this is an intriguing, yet plausible, option that deserves a closer look."
Sure. Sounds good.
HOW do you propose to go about giving this a "closer look", from a scientific perspective?
Ditto to what Dave said.
I posed a possible way here at TT.