Friday, March 12, 2010

Reply to Paul Nelson

Glenn Branch has kindly pointed out that Paul Nelson thinks that my counterexample to the claims in Meyer's Signature in the Cell is a "fluffy confection" and "sophistry".

Recall that Meyer claimed that only minds can create information. I gave a simple counterexample: weather prediction. Meteorologists gather the information needed to make their predictions from the natural world - things like wind speed, wind direction, temperature, and so forth - and then use this information to make their predictions. Nelson is unconvinced. But his reply is unconvincing.

Nelson starts by claiming that you can't determine quantities like barometric pressure and temperature without looking at a measuring instrument. This is both (a) false and (b) not relevant. Nelson must spend all his time indoors, because I can (and do all the time) estimate the temperature quite accurately (typically within 2-3 degrees C) just by sticking my head out the door.

Next, Nelson claims that you can't predict the weather accurately without a complex analytical model. Again, this is both (a) false and (b) not relevant. Even the Boy Scout manual gives some basic techniques that can be understood by teenagers.

Nelson also fails to recognize that answering a simple question like "Is it warm enough to go around without a coat?" gives you information about the weather. If the answer is yes, you can be pretty damn sure that it is not going to snow in the next hour. That's not a lot of information, perhaps, but it certainly constitutes information.

Third, Nelson claims that "there is complexity aplenty in the data, but, as SITC explains, that complexity is unspecified". This is the standard creationist ploy: admit that weather observations constitute "information", but just claim it is not the right kind of information. Never mind that this "right kind of information" is not recognized by anybody who actually studies information theory for a living - we should listen to Nelson because of his great credentials in mathematics.

However, weather observations do constitute information according to Meyer's own definition in Signature in the Cell, because these data are images of the underlying physical systems that cause the weather.
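The information content of such observations can even be quantified in the standard Shannon sense. Here is a minimal sketch of my own (the observation log is hypothetical, for illustration only) measuring the entropy of a week of sky observations:

```python
from collections import Counter
from math import log2

def entropy_bits(observations):
    """Shannon entropy, in bits per symbol, of the empirical distribution."""
    counts = Counter(observations)
    n = len(observations)
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A hypothetical week of observations: four equally likely sky states
# carry log2(4) = 2 bits per observation.
log = ["clear", "cloudy", "rain", "snow"] * 7
print(entropy_bits(log))  # 2.0
```

No mind created these bits; the weather did. The observer merely recorded them.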

Finally, Nelson claims that the raw data (which he claims is not "specified information") suddenly becomes "specified" after it passes through an algorithm that does weather prediction. Here Nelson scores an own goal, because according to Dembski, computer algorithms cannot produce specified complexity. Even if Nelson is going to claim that the "specification" is contained in the algorithm, that doesn't explain how weather prediction algorithms can go on, day after day, producing weather forecast after weather forecast, each forecast with its own new amount of "specified information", from only the input data and a finite program.

Nelson's response is completely without merit. Look for more similar creationist attacks in the future, because this simple example of weather prediction is so devastating to their bogus claims.

20 comments:

manuel moe g said...

Paul Nelson gets nutty (nuttier) at the end.

> One final point. In other writings, Shallit has indicated his hostility to the notion of human agency. In light of this, it’s perhaps not surprising that Shallit reduces the creative intellectual activity of meteorologists...

> Support your local meteorologist.

What a lather Paulie worked himself into!

Nelson and Meyer are defending a theory of information in which, when "bits" of information are ruminated over by a human mind, they acquire infectious "cooties" that can be passed to other "bits" through proximity.

The presence of these "brain cooties" lets you tell "real" information from fake information.

Readings from weather instruments can only acquire the cooties when a human meteorologist looks at the instrument (and only if he looks hard enough, while wearing corrective eyewear, and considers the instrument's markings closely).

By tearing a sheet of paper covered with "real" information into small pieces and putting it in a blender with a cup of water and egg whites, you can blend the whole mass into a paste, thus ridding it of the brain cooties. Now the molecules only contain "fake" information.

These "brain cooties" can be seen with the same spectacles that render the Holy Spirit visible.

At this point, I have developed a theory of information with more novelty, specificity, and substance than Meyer's.

Anonymous said...

I believe that should be "quantum cooties." You should always get the quantum in there.

Diogenes said...

Ugggh. This is the reason why the very first post on my blog was called "Intelligent Design Mysticism and the Rape of Information Theory". (By the way, the Diogenes' Lamp blog is mirrored at Blogspot and Wordpress.)

In that post, I specifically criticized Nelson for a thread at UD, where a poster (Gedanken) compares human-made art like Mount Rushmore to rocks that look like human faces (Old Man of the Mountain etc.) and asks: is that specified? And what's the specification?

And the only hierophant from the DI that writes in is Paul Nelson, who says: the specification is "the anatomical form of Homo Sapiens". But OOPS! The Old Man of the Mountain was made by natural forces!

So Nelson moves his goal posts and adds that that doesn't count. They're not specified, because you can only see the Old Man of the Mountain from some angles and not others. So Nelson jury-rigs and gerrymanders his "specification" to exclude everything produced by natural forces.

The whole idea of "specification" is pure argument from authority: the hierophants of the DI have the priestly authority to say specification is whatever they say it is. And so, "The Rape of Information Theory" continues. It never ends!! It never ends!!

Anonymous said...

Paul Nelson is dangerously close to shooting himself in the foot. Darn, there is information at every stage of the weather data collection and forecasting process. Raw data, readings, models, prediction. Is this the twaddle these guys are peddling?

Truti

Anonymous said...

Paul Nelson is an unimaginably dense maroon. Does he actually believe that the parameters that are used to characterise weather such as pressure, temperature, wind speed etc. come into existence only when put through human-made instruments, and then compiled? These entities are physical and their existence owes nothing to us. Nelson reminds me of your review of Phil Johnson's trash - every page is an opportunity to play "is he a fool or a knave".

Truti

RBH said...

While your weather example is a good one, I tend to use a different example in conversations. I ask the ID proponent (who around here tends to be YEC parroting the "information" argument) to consider a rapids in the river that runs through a valley in this county. I ask them to predict the distribution of gravel sizes below the rapids: "What data would you get if you sampled and measured the gravel size every foot for the 50 yards downstream from the rapids." They're perfectly willing to concede that larger chunks of gravel will tend to be concentrated near the foot of the rapids while smaller chunks will be further downstream and that the distribution is ordered and non-random. But that pattern--that non-random distribution--embodies "specified" information created by a purely natural physical system.
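RBH's rapids can even be put into a toy simulation. The model below is my own illustrative sketch, not RBH's measurements: the only rule is the purely physical one that larger grains drop out of the current sooner, yet the resulting size-versus-distance pattern is ordered and non-random.

```python
import random

random.seed(1)

def settle_distance(size_cm):
    # Toy physics: larger gravel drops out of the current sooner, so its
    # travel distance varies inversely with size, plus turbulence noise.
    return 50.0 / size_cm + random.uniform(0.0, 10.0)

grains = [random.uniform(1.0, 10.0) for _ in range(1000)]  # sizes in cm
pairs = [(settle_distance(s), s) for s in grains]

near = [s for d, s in pairs if d < 15.0]   # near the foot of the rapids
far = [s for d, s in pairs if d >= 15.0]   # further downstream

print(sum(near) / len(near) > sum(far) / len(far))  # True: big gravel stays near
```

No intelligence chose where each grain landed, yet the output distribution is exactly the kind of ordered pattern RBH describes.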

Anonymous said...

RBH,

You have earned your Evo Bio Science spurs in the "badlands" of Southern Ohio. That is as stunning an argument as any I have heard!

Truti

Mark said...

The only natural information is microinformation. The major kinds of information were poofed into existence 6,000 years ago.

If Intelligent Creative Minds were not available to create new information, seismic waves would not behave the way they do.

Anonymous said...

Forget about the Boy Scout Manual:
Matthew 16:2
He replied, "When evening comes, you say, 'It will be fair weather, for the sky is red.'"

KeithB

Filipe Calvario (from Brazil) said...

"(...)because I can (and do all the time) estimate the temperature quite accurately (typically within 2-3 degrees C) just by sticking my head out the door."
Wow.

Ewan said...

The Medium is Not the Message

Information is knowledge which requires a sender and a receiver. Looking at something and gathering facts about it does not make it into information. You're attempting to redefine information as materialism so that evolution of DNA seems plausible. This cannot be done.

Information exists prior to the formation of its medium. Information is not bound to its medium. This is empirical proof that life was both designed and Specially Created.

Jeffrey Shallit said...

Ewan:

You're extremely confused.

First, you depend on lawyers like Barry Arrington for your understanding of information theory at your peril.

Second, you've got it exactly backwards. Information and information theory are well understood by mathematicians, who have developed theories like Shannon's and Kolmogorov's. It is the creationists who are redefining "information", not I.

Information exists prior to the formation of its medium.

Can you point to any information at all that exists outside a physical medium of representation?

Ewan said...

Can you point to any functionally embedded information algorithm-operating, interdependent code that has arisen by molecules randomly banging about?

Darwinism vs Facts
Definitions:
Information: For this entry we're talking about biologically meaningful information, or semantic information or more specifically still biosemiotics. Shannon information is useful in biology as well but not at the level required for ID. That is, both descriptive info and prescriptive info.

Complexity: Here ID refers to specified complexity - and this is not an IDist invention - it was first used by Leslie Orgel. Complexity alone is insufficient. A long string of random letters for example is complex but not specified. A string of letters from a Shakespearean sonnet is both complex and specified.

Here I quote Dr David L. Abel; The Origin of Life Science Foundation:

Semantic (meaningful) information has two subsets: Descriptive and Prescriptive. Prescriptive Information (PI) instructs or directly produces nontrivial formal function (Abel, 2009a). Merely describing a computer chip does not prescribe or produce that chip. Thus mere description needs to be dichotomized from prescription. Computationally halting cybernetic programs and linguistic instructions are examples of Prescriptive Information. “Prescriptive Information (PI) either tells us what choices to make, or it is a recordation of wise choices already made.” (Abel, 2009a)

Not even Descriptive semantic information is achievable by inanimate physicodynamics (Pattee, 1972, 1995, 2001). Measuring initial conditions in any experiment and plugging those measurements appropriately into equations (e.g., physical “laws”) is formal, not physical. Cybernetic programming choices and mathematical manipulations are also formal.
...
DNA strings are formed through the selection of one of four nucleotides at each locus in a string. These programming choices at quaternary decision nodes in DNA sequences must be made prior to the existence of any selectable phenotypic fitness (The GS Principle, (Abel, 2009b). Natural selection cannot explain the programming of genetic PI that precedes and prescribes organismic existence.

No one has ever observed PI flow in reverse direction from inanimate physicodynamics to the formal side of the ravine—the land of bona fide formal pragmatic “control.” The GS Principle states that selection for potential function must occur at the molecular-genetic level of nucleotide selection and sequencing, prior to organismic existence (Abel, 2009b, d). Differential survival/reproduction of already-programmed living organisms (natural selection) is not sufficient to explain molecular evolution or life-origin (Abel, 2009b). Life must be organized into existence and managed by prescriptive information found in both genetic and epigenetic regulatory mechanisms. The environment possesses no ability to program linear digital folding instructions into the primary structure of biosequences and biomessages. The environment also provides no ability to generate Hamming block codes (e.g. triplet codons that preclude noise pollution through a 3-to-1 symbol representation of each amino acid) (Abel and Trevors, 2006a, 2007). The environment cannot decode or translate from one arbitrary language into another. The codon table is arbitrary and physicodynamically indeterminate. No physicochemical connection exists between resortable nucleotides, groups of nucleotides, and the amino acid that each triplet codon represents. Although instantiated into a material symbol system, the prescriptive information of genetic and epigenetic control is fundamentally formal, not physical.


If you understood that then you'll realize that the above facts already by themselves refute Darwinism at the most fundamental level - encoded meaningful information.

Watch and learn.

Jeffrey Shallit said...

Ewan:

You remain extremely confused.

Yes, Orgel used the words "specified complexity" once in a popular science book. But his definition is not the same as the one used by Dembski and Meyer, and furthermore he did not give a formal definition of his term.

I see you have completely evaded the fact that information theorists do not use the definition of "information" used by creationists.

Abel's "paper" consists of pure assertions and bafflegab. It has had no impact whatsoever on science - just consult Science Citation Index.

RBH said...

Ewan wrote

Can you point to any functionally embedded information algorithm-operating, interdependent code that has arisen by molecules randomly banging about?

Since no one claims a direct single-link causal relationship between that code and molecules banging about, the question is a non sequitur. As to the evolution of complicated computer code from primitive instructions 'randomly banging about', see Genetic programming for an introduction, and then see The evolutionary origin of complex features for some research germane to your claim.

R0b said...

Apparently the DI was impressed with Nelson's response, as they're including it in their new book, Signature of Controversy. It's nice of the DI to do our work for us, shooting down Dembski's LCI in their own publication.

Anonymous said...

About 7 years ago, I was active on a discussion board that a friend of Nelson's* was active on. The subject of molecular phylogenetics came up and his friend claimed that she had been told by a "biggie" at the Discovery Institute that such analyses were no good because of investigator bias - you can just arrange sequence alignments to reflect what you want the outcome to be.
OK, so I took an alignment that I had used in several analyses, removed and coded the names of the taxa, rearranged them in the alignment, and removed all gaps and spaces. I sent the file to Nelson via his friend. He replied that he was 'too busy' to do anything with it but that he might have one of his associates take a look at it, you know, to prove their claim that investigator bias dictates the outcomes of such analyses.

7 years later, my list of taxa and their new codes gathers dust on my bulletin board.

These people sound so convincing because they refuse to find out if their assertions actually have merit.


*it may have been Wells, but they are pretty much the same...

Fair Shake said...

Anonymous (Truti) above wrote: "Paul Nelson is dangerously close to shooting himself in the foot. Darn, there is information at every stage of the weather data collection and forecasting process. Raw data, readings, models, prediction. Is this the twaddle these guys are peddling?"--
Jeff or Diogenes: Why didn't you gently correct Anonymous for misrepresenting Nelson's position? Nelson's response to Anonymous would be, "Duh, you moron, of course the readings, models, and prediction -- all intelligently designed -- are information! Were you paying attention?"
(That wasn't a defense of Nelson; it was a critique of you.)

Umlaut said...

Even if Nelson is going to claim that the "specification" is contained in the algorithm, that doesn't explain how weather prediction algorithms can go on, day after day, producing weather forecast after weather forecast, each forecast with its own new amount of "specified information", from only the input data and a finite program.

Does it have to? Doesn’t all that matters, to Nelson, is the fact that the program is needed?

“according to Dembski, computer algorithms cannot produce specified complexity.”

I searched this up but couldn't find him saying this. Rather, I found him saying that evolutionary algorithms cannot produce specified complexity. Far be it from you to misquote him.

Jeffrey Shallit said...

Rather, I found him saying that evolutionary algorithms cannot produce specified complexity.

Well, you see, I teach about algorithms for a living. Currently I am teaching CS 341, "Introduction to Algorithms". Somehow we manage to teach an entire course and not mention "evolutionary algorithms" at all. The point is that there is no firm definition of when an algorithm is "evolutionary" or not.

Typically we think of an evolutionary algorithm as one that uses a source of random numbers and performs operations analogous to creating a population, evaluating a fitness function, and so forth. There is no really rigorous definition, any more than there is a rigorous definition of when an algorithm uses "dynamic programming" or the "greedy algorithm". It's just a rough classification.
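To make that rough classification concrete, here is a toy example of my own devising (it appears in no ID text): a source of random numbers, a population of bitstrings, a fitness function, selection, and mutation. Whether this counts as "evolutionary" under Dembski's usage is exactly the question at issue.

```python
import random

random.seed(0)

def fitness(bits):
    """Count of 1-bits: the 'function' the population is selected for."""
    return sum(bits)

def evolve(n_bits=20, pop_size=30, generations=100):
    # Random initial population of bitstrings.
    pop = [[random.randint(0, 1) for _ in range(n_bits)] for _ in range(pop_size)]
    for _ in range(generations):
        # Selection: keep the fitter half (the best individual always survives).
        pop.sort(key=fitness, reverse=True)
        survivors = pop[: pop_size // 2]
        # Mutation: each child is a survivor with one random bit flipped.
        children = []
        for parent in survivors:
            child = parent[:]
            child[random.randrange(n_bits)] ^= 1
            children.append(child)
        pop = survivors + children
    return max(fitness(b) for b in pop)

print(evolve())  # typically climbs to the maximum fitness, 20
```

Delete the selection step and this becomes plain random search; keep it and the population climbs toward the all-ones string. Nowhere along that continuum is there an obvious line where the ability to produce "specified complexity" switches off.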

What I'd like to know is, how "evolutionary" does an algorithm A have to be before it suddenly loses the ability to generate specified complexity? Does it have to use the fitness function once or twice or a million times, or what? For every answer, I can easily generate a weather prediction program that behaves like that.

You might look at No Free Lunch, p. 207. There Dembski says "There is only one known generator of specified complexity, and that is intelligence." But we know that there are natural "algorithms" that don't quite fit the rough classification of evolutionary algorithms very well: algorithms like the rain cycle, or orogeny, for example. Doesn't Dembski's statement seem to rule out the generation of specified complexity by these algorithms, too?

Doesn’t all that matters, to Nelson, is the fact that the program is needed?

Look, if there's a nontrivial probability that a particular algorithm could arise naturally, and that algorithm then has the property that it can produce arbitrarily large amounts of specified complexity over time, then the game is up for intelligent design. Maybe you didn't understand that.