Excerpt from the book "Scale: The Universal Laws of Life, Growth, and Death in Organisms, Cities, and Companies" by Geoffrey West.
Published by Penguin Books, 2017.
Pages 108-110.
To varying degrees, all theories and models are incomplete. They need to be continually tested and challenged by increasingly accurate experiments and observational data over wider and wider domains, and modified or extended accordingly. This is an essential ingredient in the scientific method. Indeed, understanding the boundaries of their applicability and the limits to their predictive power, together with the ongoing search for exceptions, violations, and failures, has provoked even deeper questions and challenges, stimulating the continued progress of science and the unfolding of new ideas, techniques, and concepts.
A major challenge in constructing theories and models is to identify the important quantities that capture the essential dynamics at each organizational level of a system. For instance, in thinking about the solar system, the masses of the planets and the sun are clearly of central importance in determining the motion of the planets, but their color (Mars red, the Earth mottled blue, Venus white, etc.) is irrelevant for calculating the details of that motion. Similarly, we don't need to know the color of the satellites that allow us to communicate on our cell phones when calculating their detailed motion.
However, this is clearly a scale-dependent statement: if we look at the Earth from a very close distance of, say, just a few miles above its surface rather than from millions of miles away in space, then what was perceived as its color is revealed as a manifestation of the huge diversity of the Earth's surface phenomena, which include everything from mountains and rivers to lions, oceans, cities, forests, and us. So what was irrelevant at one scale can become dominant at another. The challenge at every level of observation is to abstract the important variables that determine the dominant behavior of the system.
Physicists have coined a concept to help formalize a first step in this approach, which they call a "toy model." The strategy is to simplify a complicated system by abstracting its essential components, represented by a small number of dominant variables, from which its leading behavior can be determined. A classic example is the idea, first proposed in the nineteenth century, that gases are composed of molecules, viewed as hard little billiard balls, that are rapidly moving and colliding with one another and whose collisions with the surface of a container are the origin of what we identify as pressure. What we call temperature is similarly identified as the average kinetic energy of the molecules. This was a highly simplified model that is not strictly correct in detail, yet it captured and explained for the first time the essential macroscopic coarse-grained features of gases, such as their pressure, temperature, heat conductivity, and viscosity. As such, it provided the point of departure for developing our modern, significantly more detailed and precise understanding not only of gases but of liquids and materials, by refining the basic model and ultimately incorporating the sophistication of quantum mechanics. This simplified toy model, which played a seminal role in the development of modern physics, is called the "kinetic theory of gases" and was first proposed independently by two of the greatest physicists of all time: James Clerk Maxwell, who unified electricity and magnetism into electromagnetism, thereby revolutionizing the world with his prediction of electromagnetic waves, and Ludwig Boltzmann, who brought us statistical physics and the microscopic understanding of entropy.
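To make the billiard-ball picture concrete, the standard result it leads to can be written in two lines (a textbook summary supplied here for orientation, not quoted from the excerpt; N denotes the number of molecules, m their mass, \langle v^2 \rangle their mean squared speed, V the volume of the container, T the temperature, and k_B Boltzmann's constant):

\[
PV = \tfrac{1}{3}\, N m \langle v^{2} \rangle,
\qquad
\tfrac{1}{2}\, m \langle v^{2} \rangle = \tfrac{3}{2}\, k_B T
\quad\Longrightarrow\quad
PV = N k_B T .
\]

The toy model's two identifications, pressure as the drumbeat of molecular collisions on the walls and temperature as the average kinetic energy of the molecules, thus combine to give the familiar ideal gas law.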
A concept related to the idea of a toy model is that of a "zeroth order" approximation of a theory, in which simplifying assumptions are similarly made in order to give a rough approximation of the exact result. It is usually employed in a quantitative context as, for example, in the statement that "a zeroth order estimate for the population of the Chicago metropolitan area in 2013 is 10 million people." Upon learning a little more about Chicago, one might make what could be called a "first order" estimate of its population of 9.5 million, which is more precise and closer to the actual number (whose precise value from census data is 9,537,289). One could imagine that after more detailed investigation, an even better estimate would yield 9.54 million, which would be called a "second order" estimate. You get the idea: each succeeding "order" represents a refinement, an improved approximation, or a finer resolution that converges to the exact result based on more detailed investigation and analysis. In what follows, I shall be using the terms "coarse-grained" and "zeroth order" interchangeably.
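The convergence is easy to see in a few lines of Python (an illustration added for this excerpt, not from the book; the three estimates and the census figure are exactly the numbers quoted above):

    # Successive "orders" of approximation to the 2013 population of the
    # Chicago metropolitan area, compared against the census figure.
    census = 9_537_289

    estimates = {
        "zeroth order": 10_000_000,  # rough order-of-magnitude guess
        "first order": 9_500_000,    # refined after learning a little more
        "second order": 9_540_000,   # refined after detailed investigation
    }

    for order, value in estimates.items():
        relative_error = abs(value - census) / census
        print(f"{order:>12}: {value:>10,}  relative error {relative_error:.2%}")

Running it shows the relative error falling from about 4.9 percent at zeroth order to about 0.4 percent at first order and 0.03 percent at second order: each refinement closes most of the remaining gap to the exact figure.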
This was the philosophical framework that Jim, Brian, and I were exploring when we embarked on our collaboration. Could we first construct a coarse-grained zeroth-order theory for understanding the plethora of quarter-power allometric scaling relations based on generic underlying principles that would capture the essential features of organisms? And could we then use it as a point of departure for quantitatively deriving more refined predictions, the higher-order corrections, for understanding the dominant behavior of real biological systems?
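For orientation, the quarter-power scaling relations in question all take the generic form below (a standard statement of the pattern, not a quotation from these pages; Kleiber's law for metabolic rate is its best-known instance):

\[
Y = Y_0 \, M^{k/4},
\]

where Y is a physiological observable, M is the body mass of the organism, Y_0 is a normalization constant, and k is a small integer. For metabolic rate B, Kleiber's law is the case k = 3, i.e. B = B_0 M^{3/4}: doubling an animal's mass raises its metabolic rate not by a factor of 2 but only by 2^{3/4} \approx 1.68.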
I later learned that Jim and Brian were the exception rather than the rule among biologists in appreciating this approach. Despite the seminal contributions that physics and physicists have made to biology, a prime example being the unraveling of the structure of DNA, many biologists appear to retain a general suspicion of, and a lack of appreciation for, theory and mathematical reasoning.