The Onion Test

The onion test has been used to batter creationists for many years. It is based on the fact that the onion genome is about five times the length of the human genome, and it challenges creationists to explain why a designer would make it that way.

A lot of ID arguments are based on information. Why, then, does an onion, a very mundane plant, need five times as much information in its DNA as humans do?

This has become more of a hot topic with the results of the ENCODE project being published, and the realisation that much of what was previously thought of as "junk" DNA is not actually junk. IDists say this is what ID predicted, though they seem very reluctant to say exactly what that prediction was - exactly how much junk do they predict?

They also claim 80% junk DNA is a failed prediction of evolution, which, as far as I can tell, is simply not true. The 80% figure was what earlier observation seemed to indicate, not what was predicted by evolution. Evolution could explain that figure, but it can also explain 10%.

However, none of this helps IDists with the onion test. It is still the case that the genome of the onion is five times the size of the human genome, and it is still the case that IDists cannot explain it.

An IDist at CARM suggested:

But one answer to your question might be that the human genome is more densely packed information than that of the onion. That is human DNA might have 3 layers of information which would mean that the information is not 3 Gig (or 6 Gig in bits) but rather 3^3 gigs or 27 Gigs which is greater than the 15 Gigs for the onion. This is all hypothetical but it is you who brought in the unknown onion genome.

That leaves us with two questions:

  • Why did the purported designer use different data compression techniques for the two organisms?
  • Why are ID scientists not studying this?

In real science, scientists would be all over this - see here, for example, where real scientists are studying genome size with respect to evolution. The data compression issue is hugely important because it potentially gives us insight into the thinking of the designer. What other organisms have onion-type data compression, what other organisms have human-type data compression? 
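To put the raw numbers in perspective, here is a back-of-the-envelope sketch. The figures are the usual approximate ones (roughly 3.1 billion base pairs for the human genome, roughly 16 billion for the onion), and two bits per base pair is simply the naive encoding of four possible bases - it says nothing about any real "compression", which is exactly the point at issue:

```python
# Naive information content of a genome, assuming 2 bits per base pair
# (4 possible bases = log2(4) = 2 bits). Genome sizes are approximate.
HUMAN_BP = 3.1e9   # ~3.1 billion base pairs
ONION_BP = 16.0e9  # ~16 billion base pairs

def raw_bits(base_pairs):
    """Uncompressed information content in bits."""
    return 2 * base_pairs

print(f"Human genome: ~{raw_bits(HUMAN_BP) / 8e9:.1f} GB raw")
print(f"Onion genome: ~{raw_bits(ONION_BP) / 8e9:.1f} GB raw")
print(f"Onion/human ratio: {ONION_BP / HUMAN_BP:.1f}x")  # ~5.2x
```

On this naive measure the onion simply carries about five times the raw information, and any "denser packing" in humans is an extra hypothesis that would itself need evidence.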

Let us dig a little deeper. This web page has a graph showing the spread of genome sizes for some clades.

Turns out amphibians have a spread of over two orders of magnitude! That is, the largest genome size is more than 100 times that of the smallest! Just looking at salamanders, their genomes range from 10 to 120 pg (see here). Why do some salamanders have twelve times as much DNA as others?
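The spread quoted above is easy to express in orders of magnitude. The 10 pg and 120 pg figures are the salamander range from the post; everything else is just arithmetic:

```python
import math

# Salamander genome sizes quoted in the post (picograms of DNA per cell).
smallest_pg = 10.0
largest_pg = 120.0

fold_difference = largest_pg / smallest_pg
orders_of_magnitude = math.log10(fold_difference)

print(f"Fold difference: {fold_difference:.0f}x")        # 12x
print(f"Orders of magnitude: {orders_of_magnitude:.2f}")  # 1.08
```

By the same measure, the amphibian spread of "over two orders of magnitude" means a largest-to-smallest ratio above 100x.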

IDists usually talk of "common design", but here we have exactly the reverse of that; here we have the designer using vastly different designs for data compression for very similar organisms. If you want to compare DNA to a computer program, as IDists often do, this is like God devising a data compression algorithm for one salamander species, then devising a different one for the next species, then devising another for the next, and so on. And some of his better efforts are twelve times better than his worst, which makes me wonder why he even implemented the bad ones.

How many creationists will tell you the salamander was one "kind" on the ark?

Bony fish and fungi have a similar spread, by the way, while the spreads for bacteria and insects are not much smaller. The onion test is really just scratching the surface. The problem for IDists runs much, much deeper.
