
4. That biologists and ecologists misunderstand basic scientific concepts

Any self-respecting biologist or ecologist must surely be disturbed to read Karsai and Kampis’ assessment of education in their discipline as given in The Crossroads between Biology and Mathematics: The Scientific Method as the Basics of Scientific Literacy:

Studies such as that by Lombrozo and colleagues (2006) indicate that students of various institutions carry several misconceptions about science and lack a sufficient understanding of how scientific views differ from everyday opinions or even religious claims.

The misrepresentation of science and the incorrect use of the scientific method has generated various myths and distorted views of science …. (Karsai and Kampis, 2010).

The authors’ particular interest is how biologists understand—and not simply use—scientific and mathematical information. They also say:

Scientific literacy doesn’t necessarily call for deep understanding of difficult concepts such as the Nernst equations or the precise conditions of the Hardy-Weinberg equilibrium, but it does require a general understanding of basic scientific notions and the nature of scientific inquiry. (Karsai and Kampis, 2010).

We must begin our investigations somewhere. Harte’s Maximum Entropy and Ecology: A Theory of Abundance, Distribution and Energetics is as good a place as any. It gives a cogent example of the misunderstandings to which biologists and ecologists are all too prone:

With this in mind, consider the formula defining a change in entropy (S), in terms of a change in heat content (Q) at temperature (T), that was originally introduced to science by Sadi Carnot in the nineteenth century:

ΔS = ΔQ/T (Harte, 2011, p. 118).

Carnot, unfortunately, did not discover the stated formula. It was, rather, discovered by Rudolf Clausius. Carnot, for his part, believed in the caloric theory of heat. This long-discarded theory insisted that heat was the manifestation of a weightless, invisible fluid, and it was therefore antithetical to the emerging energy doctrine. Given his beliefs, Carnot could only state the broad outlines of his important discovery. He noted the existence of the upper bound that eventually became the second law of thermodynamics … but he could not properly quantify it. He gave no formula. Furthermore, since he believed that heat, in the form of caloric, was always conserved, he incorrectly denied its ability to change into other forms. He therefore could not have conceived of entropy, and so could not have discovered the above formula attributed to him. He did not know how to relate work to heat, and so could not generalize these concepts; he could not formalize the equality of temperatures; and so he could not describe how an arbitrarily large number of systems could be brought together, and their states specified, to produce the monotonic and increasing function, and cyclic integral, that state the second law of thermodynamics. It was again Clausius who recognized energy’s transformation capability, gave its formula, and successfully measured it.
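Clausius’ relation is, at least, straightforward to state and to apply. A minimal numerical sketch, using standard textbook values for melting ice (the figures here are illustrative approximations, not drawn from Harte):

```python
# Clausius' relation dS = dQ/T for a reversible, isothermal process,
# where it reduces to  delta_S = Q / T.
# Example: melting 1 kg of ice at its melting point.

latent_heat_fusion = 334_000.0  # J/kg, approximate latent heat of fusion for water ice
mass = 1.0                      # kg of ice melted
T_melt = 273.15                 # K, melting point of ice at 1 atm

Q = mass * latent_heat_fusion   # heat absorbed during melting, in joules
delta_S = Q / T_melt            # entropy change, in J/K

print(f"Q = {Q:.0f} J, delta_S = {delta_S:.1f} J/K")
```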

The difficulties that ensue for biology and ecology when such basic scientific concepts are improperly applied are made clearer in Harte et al.’s Maximum Entropy and the State Variable Approach to Macroecology:

The approach proposed here for achieving that goal builds on the concept of state variables. These are properties of a system that comprise the conditions whose specification is necessary to implement theory, but whose determination lies outside the theory. … We have chosen area A0 as the first state variable because it is the obvious measure of the physical scale of the system, in analogy with the state variable, volume, in thermodynamics. (Harte et al, 2008).

This claim to an analogy must be questioned. A state variable in information theory need be no more than an ‘information symbol’, or similar: it references a probability set of something conveyed, as a “message” is conveyed. State variables in thermodynamics, however, fulfill a rather different purpose. Volume’s entire purpose within thermodynamics, for example, is to force a count of that system’s molecules, and this is the general intent of all thermodynamic state variables. It is a system’s molecules and their motions that give that system its pressure and its temperature. It is therefore unlikely that area is a state variable “in analogy with volume”, for it neither specifies a molecular count nor provides any similar mechanism for the reckoning of molecules.
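The point can be made concrete with the ideal gas, in which pressure follows directly from the molecular count that volume helps to fix. A minimal sketch (the function name and figures are illustrative only):

```python
# In the ideal-gas relation P = N*k*T/V, pressure follows directly from the
# molecular count N, the temperature T, and the volume V: the state variables
# exist to pin down the molecules and their motions.

k_B = 1.380649e-23  # Boltzmann constant, J/K (exact, 2019 SI)

def pressure(N, T, V):
    """Ideal-gas pressure from molecular count N, temperature T (K), volume V (m^3)."""
    return N * k_B * T / V

# One mole of gas at 273.15 K in 22.4 litres gives roughly one atmosphere:
N_A = 6.02214076e23         # Avogadro's number
P = pressure(N_A, 273.15, 0.0224)
print(f"P = {P:.0f} Pa")    # close to 101325 Pa
```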

Granted that thermodynamic entropy’s principal purpose is to predict molecular behaviour, where, for example, is the pressure implied by this proposed area? The closest we get is when the authors say:

We have chosen total abundance and metabolic energy rate as the remaining two state variables because they scale additively, increasing linearly with area in complete nested designs (that is, when the data from all nonoverlapping cells of a specified area are averaged). Moreover, the individual organism and its energy requirements are of fundamental importance in biology and so those two state variables are intuitively reasonable ones to base theory upon. They also share a close analogy with the number of molecules and the total internal energy in thermodynamic systems. (Harte et al, 2008).

But surely … total abundance and metabolic energy have to do something rather more than simply “scale additively” before they can be presumed to interact with area in the way that pressure, volume, temperature and entropy interact in thermodynamic systems. And … it may be an undeniable fact that individual organisms and their energy requirements are of “fundamental importance” to biology, but scientific and mathematical rigour demand something rather more substantial than simply being “intuitively reasonable” before a close analogy with anything is acceptable, never mind with numbers of molecules and internal energy.

There are further difficulties in the choice of variables:

Consider an ecosystem of area A0 and within it a group of organisms such as the trees on a 50-ha plot. We assume prior specification of four state variables: total area, A0, of the ecosystem; total number of individual trees, N0, within A0; summed metabolic energy rate of all the trees, E0, within A0; and total number of tree species, S0, within A0. ….

We have chosen the state variable S0 because of the central role that species richness plays in ecology and in macroecological metrics, although we could apply the same theoretical methods to the total number of genera, G0, and then derive abundance distributions over the genera as well as genera–area relationships.

In our theory, A0, N0, and E0 are extensive variables, but S0 is intermediate between an intensive and an extensive variable; it neither adds linearly nor is it averaged when systems are adjoined and thus has no analogy in thermodynamics. Pursuing this further, in thermodynamics the intensive variable temperature emerges as the inverse of the Lagrange multiplier in the MaxEnt derivation of the Boltzmann energy distribution; in our theory, the inverses of the Lagrange multipliers are neither intensive nor extensive, and we do not know if they can be associated with a generalized ecological ‘‘temperature’’ (Harte et al, 2008).

The thought that a proposed state variable for species abundance could perhaps be extended, without difficulty, to genera abundance is certainly encouraging. However … what is to be made of the fact that this extremely important variable, S0, is “intermediate between an intensive and an extensive variable”? There is no damage done to the theory if it is made clear exactly what that means, but nothing is made clear.

An intensive variable within thermodynamics is tightly defined: it is a homogeneous function of degree zero of the extensive variables (Callen, 1960). Each, such as temperature, can be expressed as a first partial derivative of either the energy or the entropy of the system (Chandler, 1987). Thus temperature obeys the rule T = T(S, V, N) = T(λS, λV, λN), where λ is a scaling factor. In other words, intensive variables do not scale with the size of the system, so that in any system at thermal equilibrium the temperature of any subset or subsystem is the same as that of the system itself.
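This degree-zero behaviour can be checked numerically. A minimal sketch, using the monatomic ideal gas for which E = (3/2)NkT, so that T = 2E/(3Nk) (the figures are illustrative; V is retained only to mirror the T = T(S, V, N) form):

```python
# Degree-zero homogeneity of an intensive variable: scaling every extensive
# argument by lambda leaves the temperature unchanged.

k_B = 1.380649e-23  # Boltzmann constant, J/K

def temperature(E, V, N):
    # For the monatomic ideal gas, E = (3/2) N k T, so T = 2E / (3 N k).
    # V drops out here, but is kept to match the T = T(S, V, N) form.
    return 2.0 * E / (3.0 * N * k_B)

E, V, N = 3739.0, 0.0224, 6.022e23   # roughly one mole at about 300 K
lam = 7.0

T1 = temperature(E, V, N)
T2 = temperature(lam * E, lam * V, lam * N)
print(T1, T2)  # the two agree: an intensive variable does not scale with system size
```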

Extensive variables are also tightly defined. They are additive over subsystems, and therefore scale with system size. For this reason they are represented by homogeneous functions of degree one. Since entropy, for example, is generally extensive and a function of energy, number of molecules, and volume, it is represented by the function S(λE, λV, λN) = λS(E, V, N), where λ is again a scaling factor. Any function obeying this is extensive, and any extensive function obeys it immediately. These are rigorous definitions. Since no definition or behavioural template is offered for the “intermediate property” to which the authors allude, it is difficult to know what to make either of the proposed property or of the analogy on which it is based. None of this, of course, prevents area from being a coherent conveyor of relevant ecological information, and so a state variable in that more specific information-theory sense. It simply fails to meet the criteria for a state variable in the thermodynamic sense, and so is not “in analogy”.
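The degree-one behaviour can likewise be checked numerically. A minimal sketch using the Sackur–Tetrode entropy of a monatomic ideal gas (chosen here purely for illustration; the argument above does not depend on this particular formula):

```python
import math

# Degree-one homogeneity of an extensive variable: scaling all arguments of
# S(E, V, N) by lambda scales S by the same lambda.
# The Sackur-Tetrode entropy of a monatomic ideal gas is used as the test case.

k_B = 1.380649e-23    # Boltzmann constant, J/K
h   = 6.62607015e-34  # Planck constant, J s
m   = 6.6464731e-27   # kg, mass of a helium-4 atom (illustrative choice)

def sackur_tetrode(E, V, N):
    """Entropy of a monatomic ideal gas as a function of energy, volume, count."""
    arg = (V / N) * (4 * math.pi * m * E / (3 * N * h**2)) ** 1.5
    return N * k_B * (math.log(arg) + 2.5)

E, V, N = 3739.0, 0.0224, 6.022e23   # roughly one mole of helium at about 300 K
lam = 3.0

S1 = sackur_tetrode(E, V, N)
S2 = sackur_tetrode(lam * E, lam * V, lam * N)
print(S2 / S1)  # very close to 3.0: an extensive variable scales linearly with size
```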

A further issue in their treatment arises with their proposed definition of attributes through “inverses of the Lagrange multipliers”. This is all very well, but temperature plays an overwhelming role in thermodynamics. It is virtually how the subject is defined. Thermodynamics is centred on the ideal gas law, PV = nRT. It is no exaggeration to say that without temperature there simply is no thermodynamics. Thanks to Maxwell, temperature is now rigorously defined in terms of entropy. Thus for the authors to say, at the conclusion of their paper, that they “do not know” how species abundance relates to temperature … and even to doubt that there is an association … surely calls the whole procedure into doubt. If rigour and a move away from the ad hoc in theories about species abundance are indeed the interest, then is this relationship not the first thing that should have been settled, rather than virtually the last thing to be mentioned, and then only as an aside?
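The rigorous definition alluded to can be stated compactly: the thermodynamic temperature is the reciprocal of the rate at which entropy changes with energy, at fixed volume and molecular count:

```latex
\[
  \frac{1}{T} \;=\; \left(\frac{\partial S}{\partial E}\right)_{V,N}
\]
```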

It is important to be clear that the above issues are distinct from the more general issues raised by the MaxEnt theory on which Harte and Harte et al. base their work. MaxEnt is in its turn based on Shannon’s unification of entropy with information to produce his information theory. The “Jaynes principle” then proceeds from that: it is a statistical interpretation of entropy through the Boltzmann theorem, based on an algorithm first introduced by Gibbs. The Jaynes principle turns thermodynamics into an expression of the calculus of variations, using Lagrange multipliers, and is again based on Shannon’s information theory (Biró, 2011, pp. 56-58; Gull, 1991; Stewart, 2007, pp. 970–977). Entropy in information theory becomes a measure of the amount of information missing from a signal before it is received. Following Shannon, Jaynes recognized, in brief, that the signalling of information requires the movements of molecules and photons, which in its turn leads to a far more general definition of entropy in terms of discrete sets of probabilities. Information theory and information entropy can thus be broadened to embrace the more traditional thermodynamic entropy, concerned as that is with the more prosaic volumes, temperatures and pressures, always notwithstanding that the movements of molecules are common to both:

The function H is called the entropy, or, better, the information entropy of the distribution {pi}. This is an unfortunate terminology, which now seems impossible to correct. We must warn at the outset that the major occupational disease of this field is a persistent failure to distinguish between the information entropy, which is a property of any probability distribution, and the experimental entropy of thermodynamics, which is instead a property of a thermodynamic state as defined, for example by such observed quantities as pressure, volume, temperature, magnetization, of some physical system. They should never have been called by the same name; the experimental entropy makes no reference to any probability distribution, and the information entropy makes no reference to thermodynamics. Many textbooks and research papers are flawed fatally by the author’s failure to distinguish between these entirely different things, and in consequence proving nonsense theorems (Jaynes, 2003, p. 351).

Whether some link exists between the information and the thermodynamic entropies remains a much-debated topic. Some authorities declare a link, while others deny it. While perhaps true that the areas in which species live should not be expected to behave ‘like’ volumes, the fact remains that the analogies being drawn, with state variables, are direct. They are being explicitly stated. MaxEnt also states its Lagrange multipliers coherently and rigorously. An ecological theory based on state variables and its associated concepts simply cannot do less than these two theories on which it declares itself based.
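To see what it is that MaxEnt itself states coherently and rigorously, consider a minimal sketch of Jaynes’ procedure: maximise the information entropy subject to normalisation and a fixed mean energy. The solution is the Boltzmann form p_i = exp(-βE_i)/Z, where β is the Lagrange multiplier and its inverse is the thermodynamic temperature in units of k. The energy levels and target mean below are illustrative assumptions only:

```python
import math

# Jaynes' MaxEnt procedure: maximise H = -sum(p_i ln p_i) subject to
# sum(p_i) = 1 and a fixed mean energy. The maximising distribution is
# the Boltzmann form p_i = exp(-beta*E_i)/Z, with beta the Lagrange
# multiplier; we solve for beta by bisection on the mean-energy constraint.

energies = [0.0, 1.0, 2.0, 3.0]   # illustrative energy levels
target_mean = 1.2                 # illustrative constrained mean energy

def boltzmann(beta):
    weights = [math.exp(-beta * e) for e in energies]
    Z = sum(weights)              # the partition function
    return [w / Z for w in weights]

def mean_energy(beta):
    return sum(p * e for p, e in zip(boltzmann(beta), energies))

# Mean energy decreases monotonically with beta, so bisection converges.
lo, hi = -50.0, 50.0
for _ in range(200):
    mid = (lo + hi) / 2
    if mean_energy(mid) > target_mean:
        lo = mid                  # beta too small: mean energy still too high
    else:
        hi = mid
beta = (lo + hi) / 2

p = boltzmann(beta)
print(f"beta = {beta:.4f}, 'temperature' 1/beta = {1 / beta:.4f}")
print("distribution:", [round(x, 4) for x in p])
```

The multiplier is well defined and its inverse is unambiguous; it is the ecological interpretation of that inverse, not the machinery itself, that Harte et al. leave open.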

For a second example we can turn to Jørgensen and Fath’s Application of thermodynamic principles in ecology:

All ecosystems are open systems embedded in an environment from which they receive energy–matter input and discharge energy–matter output. From a thermodynamic point of view, this principle is a prerequisite for the ecological processes (Jørgensen and Fath, 2004).

Here we have a complete misunderstanding of ‘open’ and ‘closed’ as used in thermodynamics. All ecosystems are certainly not open. In point of fact, and as we shall see shortly, most ecosystems are—technically—closed. ‘Open’ and ‘closed’ are rigorously defined in thermodynamics, and we misunderstand them at our peril. There is a difference between geometric and thermodynamic closure. Indeed, we cannot properly understand ‘open’ without first understanding mass, volume, and what it takes for the two to be associated:

Problems treated in thermodynamics begin with a description of a system under study. The system is generally an object or a collection of objects, where the objects may be macroscopic or microscopic and may or may not have mass. This very broad use of the term “object” allows us to apply thermodynamical concepts even to electromagnetic radiation (photons) in a cavity.

Systems are defined either by specifying the objects directly or by specifying the location of the objects. For instance, we could say the system is this textbook. … Or, we could say the system is the collection of all gas molecules in a room. Implicit in this second description is the notion of the system boundary. In that example, the walls of the room played the role of the boundary. The boundary need not be a physical object, as long as it is well defined geometrically. … We avoid [this] potential for confusion in thermodynamics by insisting that our boundaries always be geometrically closed.

We have just been describing geometric closure. We now introduce the notion of thermodynamic closure …. a system is closed if no mass associated with the system can cross the boundary of the system. (Bold emphasis by original author, italic emphasis by this author). (Thomsen, 2011).

The emphasis on ‘mass associated with the system’ is critical. William James, for example, revolutionized psychology by applying ideas of mass and inertia to … ideas:

We may then lay it down for certain that every representation of a movement awakens in some degree the actual movement which is its object; and awakens it in a maximum degree whenever it is not kept from so doing by an antagonistic representation present simultaneously to the mind.

The express fiat, or act of mental consent to the movement, comes in when the neutralization of the antagonistic and inhibitory idea is required. But that there is no express fiat needed when the conditions are simple, the reader ought now to be convinced. (James, 1902).

Until the system’s creator has specified what is to be regarded as mass and inertia within that specified system of enquiry, it is not wise to believe that we already know what they are. And by the same token … just because physics and chemistry have very specific and convincing ideas of mass, it does not immediately follow that those requirements are either met, or need to be met, by biology and ecology, which should instead investigate closely whether or not those particular ideas of mass and inertia are relevant within their disciplines.

The core issue is complacency and ignorance. It is all too easy for each of us, individually, to believe ourselves immune. But when complacency and ignorance regarding basic concepts are pervasive throughout an entire discipline, it creates something of a problem:

The great unifying principle of 19th century science was the doctrine of energy. The conception of energy had its roots far back in the dynamics of the 17th century, but the final formulation of the principle of the conservation of energy was due to the labours in the 19th century of Carnot, Joule, Kelvin and Helmholtz. The simplification introduced by this principle was one which could be easily grasped—and which brought with it a feeling of satisfaction and security.

Towards the close of the 19th century the prevalent idea, even in the universities, was that the great battle of science was almost won …. There was an air of finality, too, about the two great principles of invariance—the conservation of energy and the conservation of mass. The simplicity and grandeur of these doctrines was such that men felt inclined to survey with pardonable complacency the achievements of the past. But men of science were not allowed to slumber for long (Turner, 1981).

As is well known, this period of widespread complacency in science preceded the upheavals of the quantum theory. It is all too easy to believe that this occurs only in “other” fields, and could not be, and is not, happening in biology or ecology. It is certainly not our wish to link any particular biologist or ecologist with the dark days and severe misadventures that Russian science and society went through following Trofim Lysenko’s deliberate rejection of Mendel and his misuse of Darwin’s evolutionary ideas (Milner, 1990; Bragg, 2008). But since complacency and ignorance were again the core issues, it is instructive to note that both the quotes immediately hereunder are from Lysenko:

“Darwin investigated the numerous facts obtained by naturalists in living nature and analysed them through the prism of practical experience”.

“Agricultural practice served Darwin as the material basis for the elaboration of his theory of Evolution, which explained the natural causation of the adaptation we see in the structure of the organic world. That was a great advance in the knowledge of living nature”.

Karsai and Kampis point out, in their somewhat dispiriting review, that the complacency and ignorance found in biology arises from the scientific and mathematical education that so many working biologists and ecologists currently receive:

We believe that for mathematics to make sense in biology education, science should make sense first. The issue has two interrelated aspects, and we deal with them separately: They are the understanding of science and the understanding of mathematics for science (in this case, for biology).

… Using scientific inquiry without first teaching the proper scientific method may generate a complete misunderstanding of how science works. (Karsai and Kampis, 2010).

It is surely surprising that anyone reviewing this field should seriously claim that biologists and ecologists misunderstand how science works. But as Turner points out in her own review of English science education (Turner, 1981), an inappropriate response is simply to smile and believe oneself immune. After all, we have scarcely begun our journey and we have already seen a misuse of the words ‘mass’, ‘open’, ‘closed’, ‘area’, ‘volume’, ‘inertia’, ‘temperature’, ‘intensive’ and ‘extensive’. Surely … a more appropriate response is to resolve to try harder:

If we accept the function of science as a description of experience we are free from the old controversies about the outer world. We judge the conclusions of science accordingly as they are consistent with our own field. A scientific law is a generalization stated by the man of science as a means of summarizing his experience. Future generations will use the law as a means for further research or they will abandon it if it is of no use. Thus the so-called laws of nature are made by man.

Science, however, does not consist of a mass of ad hoc hypotheses and unrelated laws. In the sense that man frames the generalizations, he makes the laws. But these laws can often be fitted into a logical scheme. [I]ndeed our minds demand that they shall fit into a logical whole, and we go on trying until we succeed (Turner, 1981).