The Proposed Framework
As an engineer, I see a designed object as incorporating three basic but essential parameters:
- Material,
- Assembly, and
- Function
For example, take the floor system of a typical building. The "main" materials of this floor system consist of a concrete slab supported by a thin steel deck, which is itself supported by structural steel beams and/or trusses. The beams/trusses are supported by structural steel columns. To assemble the floor, the columns must be installed first, followed by the beams/trusses, followed by the steel deck installed on top of the beams/trusses; concrete is then poured on the steel deck to form the floor. A function of the floor is to "hold up", or resist, the loads applied by the materials' self-weight, people and various equipment (partitions, desks, carpet, etc.).
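To keep this example handy, here is a quick sketch (my own illustration in Python; the names are made up and not part of the framework itself) of how the floor system maps onto the three parameters:

```python
# My own illustration, not part of the framework: the floor-system example
# laid out under the three parameters of material, assembly and function.
floor_system = {
    "material": ["concrete slab", "steel deck", "steel beams/trusses", "steel columns"],
    "assembly": [  # steps in the order they must occur
        "install columns",
        "install beams/trusses on the columns",
        "install steel deck on the beams/trusses",
        "pour concrete on the steel deck",
    ],
    "function": ["resist self-weight", "resist loads from people and equipment"],
}

for parameter, items in floor_system.items():
    print(parameter, "->", items)
```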
Parameters of the Framework
Material:
The material information parameter, I(m), has two features:
- Material always contains a minimum amount of information, I(m) > 0, and
- A specific material can itself potentially be a designed object.
To illustrate this second point, take, for example, the structural steel beam mentioned above. The steel beam (material) is fabricated* by rolling/extruding a steel ingot to obtain the shape of an "I". The ingot (material) is the product of combining molten iron-ore with other specific elements/compounds (carbon, manganese, chromium, etc.). It is at the solid iron-ore stage where one can say the "raw material" is found (i.e. no design).
*Please note that this is a highly simplified description of what goes into making a steel beam. The purpose of the description was to provide an example of how a material can be designed.
Now let's look at the example from an information point-of-view. At the "raw material" stage, the iron-ore is at the "lowest-end" of the information scale. At each step (iron-ore to ingot, ingot to structural steel), information is added in the form of "assembly" of extra materials into a new shape, and of function(s) beyond those of the original "raw material" (ex: in the case of the structural steel beam, resisting applied loads in a floor system).
This is how materials themselves can be designed. However, those levels of design within a material are "noise" for now. From this point on, when referring to materials, let's treat them as "raw material" (i.e. no design), carrying only the minimum amount of information, I(m) > 0.
To the best of my knowledge, Information Theory is currently able to objectively measure the amount of information of an event, but not of an object. Therefore, we do not have an absolute measure (at this time) for I(m). Thus, I propose a relative measure until an absolute measure is developed. For the sake of discussion, let's say that "raw material" has I(m) = 1.
Assembly:
Let's say a design is composed of only "raw materials", or I(m) = 1. When these materials are arranged (or assembled) in a specific manner, information is increased. The value of the assembly information parameter, I(a), depends on at least three variables**:
a. Natural law: If it can be shown that the "assembly" of the material is one that follows the laws of science, then I(a) = 1 (ex: crystallisation and vortices).
b. Degree of assembly: The more "minimum steps of assembly" required, the higher I(a) becomes (ex: a "sand castle" consisting of a single bucket of sand vs. an actual medieval castle).
c. Various assembly pathways: I(a) increases as the number of ways to assemble the object in an orderly fashion decreases (i.e. an object that can be assembled in only one way would contain more information than an object that can be assembled in multiple ways).
Both b and c imply that if the assembly process is beyond natural law, then I(a) > 1.
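As a rough illustration of how these three variables might combine, here is a toy Python sketch. To be clear, the formula inside it is an assumption of mine for illustration only, not something derived from the framework:

```python
def assembly_information(follows_natural_law, minimum_steps, orderly_pathways):
    """Toy estimate of I(a); the way the inputs combine is an assumption."""
    if follows_natural_law:
        return 1.0  # variable a: assembly fully explained by natural law
    # variable b: more "minimum steps of assembly" -> higher I(a)
    # variable c: fewer orderly assembly pathways  -> higher I(a)
    return 1.0 + minimum_steps / orderly_pathways

# Illustrative numbers only: a single-bucket "sand castle" vs. a medieval castle.
print(assembly_information(False, minimum_steps=1, orderly_pathways=10))   # 1.1
print(assembly_information(False, minimum_steps=500, orderly_pathways=2))  # 251.0
```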
Function:
A toddler could assemble a "structure" of sticks, stones and mud, but almost no one would consider it a designed object because it lacks function. The value of the function information parameter, I(fn), depends on at least two variables**:
a. Natural law: This refers to the natural properties of the object. If the function of the object goes beyond the properties, then the amount of "functional information" increases [I(fn) > 1].
b. Multiple functions: The more functions beyond natural properties, the more information is contained in the object [I(fn) increases]. This could be useful in determining the level of design in an object.
**It should be noted that there could be other variables for determining I(a) and I(fn). These are just the basic ones.
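In the same spirit, a toy sketch for I(fn); again, the "+1 per function beyond natural properties" rule is just an illustrative assumption of mine:

```python
def function_information(functions_beyond_natural_properties=0):
    """Toy estimate of I(fn); the +1 per extra function is an assumption."""
    # variable a: if every function follows from natural properties, I(fn) stays at 1
    # variable b: each function beyond natural properties adds information
    return 1.0 + functions_beyond_natural_properties

print(function_information(0))  # e.g. the toddler's pile of sticks, stones and mud: 1.0
print(function_information(3))  # an object with three functions beyond its properties: 4.0
```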
Minimum Threshold of Design
All three parameters - I(m), I(a) and I(fn) - contribute to a final value used to determine design. A simple equation can be stated:
I(object) = I(m) x I(a) x I(fn)
The minimum value of each parameter is set to 1. This means that if all parameters contain the minimum value of information, then I(object) = 1. I propose the following statement for a reasonable minimum threshold for detecting design:
If the amount of information contained in an object is greater than the minimum amount of information contained in raw material (or 1), then the object is considered designed.
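To make the threshold concrete, here is a small sketch. Flooring each parameter at 1 is my reading of "the minimum value of each parameter is set to 1", and the beam's numbers are placeholders, not measured values:

```python
def design_information(i_m, i_a, i_fn):
    """I(object) = I(m) x I(a) x I(fn), with each parameter floored at its minimum of 1."""
    return max(i_m, 1.0) * max(i_a, 1.0) * max(i_fn, 1.0)

def exceeds_design_threshold(i_object, threshold=1.0):
    """Proposed minimum threshold: I(object) > 1 counts as designed."""
    return i_object > threshold

raw_iron_ore = design_information(i_m=1.0, i_a=1.0, i_fn=1.0)  # 1.0
steel_beam   = design_information(i_m=1.0, i_a=3.0, i_fn=2.0)  # 6.0 (placeholder values)

print(exceeds_design_threshold(raw_iron_ore))  # False: raw material is not flagged as designed
print(exceeds_design_threshold(steel_beam))    # True: the beam clears the threshold
```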
Issues Facing the Framework
There is an obvious issue with the above statement: at what point in a natural object do we define I(m) = 1, or what is the "raw material" in nature?
I propose the amino acid as the "raw material". The immediate problem with doing this is that potentially any protein, DNA, cellular structure, etc. could surpass the minimum design threshold when it may have no business doing so. If true, this could be a potential fatal weakness of the framework.
However, there are checks and balances that can be incorporated through I(a) and I(fn). For example, if it could be shown that proteins organise according to some "law of nature", then I(a) = 1 since the "assembly" of the protein can be fully described by natural law. Also, if it could be shown that the function of proteins "naturally flows" from the function of amino acids (i.e. it is a natural property of amino acids to form functioning proteins), then I(fn) = 1. To the best of my knowledge, neither assembly nor "flow of function" has been shown to be subject to natural law; however, I do concede that this may change with some future discovery. That said, scientists cannot depend on future discovery; they (and we) must work with what we've got at present. Therefore, if presently I(a) and I(fn) > 1 for proteins, then so be it.
There are several other challenges for this framework to be a suitable tool for objectively recognising design. They include, but are not limited to:
- Developing a suitable "measuring stick" for determining values (absolute or relative) of I(m).
- Determining what (if any) other variables affect the value of I(a) and I(fn).
- Further investigation into the minimum threshold for design (currently set at I(object) > 1).
- Determining the effect of other (if any) parameters beyond I(m), I(a) and I(fn).
For Public Comment...
Whew! Got all that?
Ladies and gentlemen, I present to you my proposal for a framework to objectively recognise design in nature. Keep in mind, this is simply a conceptual framework. There is a lot of work to be done to hash out the details, as described in the list above. If this framework is valid, I fully expect revisions to be made.
So what do you think? Is this framework a good first step or cr*p? Let the discussion/critiques begin...
After re-reading the analogy of a toddler's pile of sticks, stones and mud, I realise that the case would give I(a) > 1 and I(fn) = 1, which would give a "false positive" with regard to the toddler's pile.
This can be corrected by connecting I(a) and I(fn) into a larger parameter. The new equation would be:
I(object) = I(m) x I(a-fn)
where I(a-fn) = I(a) x I(fn)
and I(a),I(fn) > 0
Thus, for the toddler's pile, I(a) > 1, while I(fn) < 1, and I(a) x I(fn) = 1 (or something like that).
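To put rough numbers on that (purely illustrative, and only one of many combinations that would satisfy the condition):

```python
# Hypothetical values for the toddler's pile under the corrected equation,
# now allowing I(a) and I(fn) below 1 per the revised condition I(a), I(fn) > 0.
i_m, i_a, i_fn = 1.0, 2.0, 0.5

i_a_fn   = i_a * i_fn    # I(a-fn) = I(a) x I(fn) = 1.0
i_object = i_m * i_a_fn  # I(object) = 1.0, i.e. not past the design threshold

print(i_a_fn, i_object)
```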
It would appear that assembly and function are partially dependent upon one another. Either that, or I have to amend the definitions of assembly and function.
Another way of correcting the equation would be to raise the threshold.
Yeesh! Criticisms already, and it's ME!!! ;P
After thinking (not drinking) some more on the subject:
1. Perhaps Information Entropy does play a role in the Conceptual Design Framework (CDF). The way I've defined I(a) and I(fn) would suggest that if an object's assembly and function are explained fully by natural law, then there is no uncertainty; thus I(a) and I(fn) can remain separate and independent parameters.
2. Back to what I am now calling the "mudpie" (the pile of sticks, stones and mud made by a toddler). I am thinking now that I left out an important part of the CDF. Not only am I proposing the CDF be used to detect design, but also degrees/complexity of design. This would mean two things:
a. The minimum threshold of design could remain at I(object) > 1
b. Different thresholds describing different levels of design would have to be derived.
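Just to illustrate what tiered thresholds could look like, here is a sketch with completely made-up cut-offs; the real values would have to be derived, as noted above:

```python
def design_level(i_object):
    """Purely made-up tiers to illustrate the idea of multiple thresholds."""
    if i_object <= 1.0:
        return "no design detected"
    elif i_object <= 10.0:
        return "simple design"
    else:
        return "complex design"

print(design_level(1.0), design_level(5.0), design_level(50.0))
```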
More thinking (and maybe drinking) to do...
More random, not-so-random thoughts:
Re-reading the Information Theory Primer William Wallace gave me, if something is contingent upon, or subject to, natural law (100% probability), then the amount of information it contains is equal to zero bits. Regardless of whether one uses absolute or relative measurements, zero is zero. Thus, for assemblies and functions subject to natural law, it is logical to define I(a) & I(fn) = 0.
If this is the case, then that affects the structure of the equation. This would mean that I(a) and I(fn) would literally add information to I(m), thus giving us:
I(object) = I(m) + I(a) + I(fn)
and we're back to the task of defining design thresholds.
That said, IMO, the additive equation is better suited to absolute measurements than relative measurements. Thus, I will continue to work with the original equation:
I(object) = I(m) x I(a) x I(fn)
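For what it's worth, here is a side-by-side sketch of the two conventions with placeholder values; the only point is that both give the same baseline of 1 for an object whose assembly and function are fully explained by natural law:

```python
i_m = 1.0  # "raw material" on the relative scale

# Additive form: law-governed assembly and function contribute zero bits.
i_object_additive = i_m + 0.0 + 0.0

# Original multiplicative form: the minimum value of each parameter is 1.
i_object_multiplicative = i_m * 1.0 * 1.0

print(i_object_additive, i_object_multiplicative)  # both sit exactly at the threshold of 1
```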
The mudpie is still troubling me...