« June 2006 | Main | August 2006 »

July 31, 2006

Criticizing global warming

Dr. Peter Doran, an Antarctic climate researcher at UIC, was the author of one of two studies that the polemicists like to use to dispute global warming. Although he's tried to correct the out-of-control spinning on the topic that certain deniers are wont to do, he's been largely unsuccessful. Politics and news, as always, trump both accuracy and honesty. In a recent article for the Amherst Times (apparently pulled mostly from his review of An Inconvenient Truth, which he gives "two frozen thumbs up"), he discusses this problem, and the facts. From the original:

...back to our Antarctic climate story, we indeed stated that a majority -- 58 percent -- of the continent cooled between 1966 and 2000, but let’s not forget the remainder was warming. One region, the Antarctic Peninsula, warmed at orders of magnitude more than the global average. Our paper did not predict the future and did not make any comment on climate anywhere else on Earth except to say, in our very first sentence, that the Earth’s average air temperature increased by 0.06 degrees Celsius per decade in the 20th century.

New models created since our paper was published have suggested a link between the lack of significant warming in Antarctica to the human-induced ozone hole over the continent. Besides providing a protective layer over the Earth, ozone is a greenhouse gas. The models now suggest that as the ozone hole heals, thanks to world-wide bans on harmful CFCs, aerosols, and other airborne particles, Antarctica should begin to fall in line and warm up with the rest of the planet. These models are conspicuously missing from climate skeptic literature. Also missing is the fact that there has been some debate in the science community over our results. We continue to stand by the results for the period analyzed, but an unbiased coverage would acknowledge the differences of opinion.

Tip to onegoodmove.

posted July 31, 2006 04:18 PM in Global Warming | permalink | Comments (0)

July 29, 2006

Look around you, maths & germs

These are completely ridiculous. My favorite bits were the synthesizer background music (which reminded me of the video games I played on our Atari growing up), the mathematical graffiti, and the Lepidopterus arboratus.

posted July 29, 2006 08:52 PM in Humor | permalink | Comments (0)

July 26, 2006

Models, errors and the methods of science.

A recent posting on the arXiv prompts me to write down some musings about the differences between science and non-science.

On the Nature of Science by B.K. Jennings

A 21st century view of the nature of science is presented. It attempts to show how a consistent description of science and scientific progress can be given. Science advances through a sequence of models with progressively greater predictive power. The philosophical and metaphysical implications of the models change in unpredictable ways as the predictive power increases. The view of science arrived at is one based on instrumentalism. Philosophical realism can only be recovered by a subtle use of Occam's razor. Error control is seen to be essential to scientific progress. The nature of the difference between science and religion is explored.

This can be summarized even more succinctly by George Box's famous remark that "all models are wrong but some models are useful," with the addendum that this recognition is what makes science different from religion (or other non-scientific endeavors), and that sorting out the useful from the useless is what drives science forward.

In addition to giving a relatively succinct introduction to the basic terrain of modern philosophy of science, Jennings describes two common critiques of science. The first is the God of the Gaps idea: basically, science explains how nature works and everything left unexplained is the domain of God. Obviously, the problem is that those gaps have a pesky tendency to disappear over time, taking that bit of God with them. For Jennings, this idea is just a special case of the more general "Proof by Lack of Imagination" critique, which is summarized as "I cannot imagine how this can happen naturally, therefore it does not, or God must have done it." As with the God of the Gaps idea, more imaginative people tend to come along (or have come along before) who can imagine how it could happen naturally (e.g., continental drift). Among physicists who like this idea, things like the precise values of fundamental constants are grist for the mill, but can we really presume that we'll never be able to explain them naturally?

Evolution is, as usual, one of the best examples of this kind of attack. For instance, almost all of the arguments currently put forth by creationists are just a rehashing of arguments made in the mid-to-late 1800s by religious scientists and officials. Indeed, Darwin's biggest critic was the politically powerful naturalist Sir Richard Owen, who objected to evolution because he preferred the idea that God used archetypical forms to derive species. The proof, of course, was in the overwhelming weight of evidence in favor of evolution and, in the end, in Darwin being much more clever than Owen.

Being the bread and butter of science, this may seem quite droll. But I think non-scientists have a strong degree of cognitive dissonance when faced with such evidential claims. That is, what distinguishes scientists from non-scientists is our conviction that knowledge about the nature of the world is purely evidential, produced only by careful observations, models and the control of our errors. For the non-scientist, this works well enough for the knowledge required to see to the basics of life (eating, moving, etc.), but it conflicts with (and often loses out to) the knowledge given to us by social authorities. In the West before Galileo, the authorities were the Church or Aristotle - today, Aristotle has been replaced by talk radio, television and cranks pretending to be scientists. I suspect that it's this conflicting relationship with knowledge that explains several of the lay public's problems with science. Let me connect this with my current reading material, to make the point clearer.

Deborah Mayo's excellent (and, I fear, vastly under-read) Error and the Growth of Experimental Knowledge is a dense and extremely thorough exposition of a modern philosophy of science, based on the evidential model I described above. As she reinterprets Kuhn's analysis of Popper, she implicitly points to an explanation for why science so often clashes with non-science, and why these clashes often leave scientists shaking their heads in confusion. Quoting Kuhn discussing why astrology is not a science, she says

The practitioners of astrology, Kuhn notes, "like practitioners of philosophy and some social sciences [AC: I argue also many humanities]... belonged to a variety of different schools ... [between which] the debates ordinarily revolved about the implausibility of the particular theory employed by one or another school. Failures of individual predictions played very little role." Practitioners were happy to criticize the basic commitments of competing astrological schools, Kuhn tells us; rival schools were constantly having their basic presuppositions challenged. What they lacked was that very special kind of criticism that allows genuine learning - the kind where a failed prediction can be pinned on a specific hypothesis. Their criticism was not constructive: a failure did not genuinely indicate a specific improvement, adjustment or falsification.

That is, criticism that does not focus on the evidential basis of theories is what non-sciences engage in. In Kuhn's language, this is called "critical discourse" and is what distinguishes non-science from science. In a sense, critical discourse is a form of logical jousting, in which you can only disparage the assumptions of your opponent (thus undercutting their entire theory) while championing your own. Marshaling anecdotal evidence in support of your assumptions is to pseudo-science, I think, what stereotyping is to racism.

Since critical discourse is the norm outside of science, is it any wonder that non-scientists, attempting to resolve the cognitive dissonance between authoritative knowledge and evidential knowledge, resort to the only form of criticism they understand? This leads me to be extremely depressed about the current state of science education in this country, and about the possibility of politicians ever learning from their mistakes.

posted July 26, 2006 11:26 PM in Scientifically Speaking | permalink | Comments (1)

July 20, 2006

One more hurdle, cleared.

Today, I passed my dissertation defense, with distinction. Thank you to everyone who helped me get to, and past, this point in my career.

To celebrate, this weekend I'm off to Las Vegas to complete my two-part homage to the gods of chance. Each part of the homage involves a pilgrimage to a holy site and a ritualistic sacrifice of 20 units of currency on the altar of probability. The first part was two summers ago, when I offered 20 euros to the gods at Monte Carlo, who rewarded me with a repayment in kind. This weekend, I'll offer 20 dollars to the gods of Las Vegas, and see if they're as nice as their French counterparts.

Update, July 24: The gods of chance who reside in Las Vegas are decidedly more callous than their fellows in Monte Carlo, which is to say that I lost my $20.

posted July 20, 2006 11:28 AM in Self Referential | permalink | Comments (3)

July 17, 2006

Uncertainty about probability

In the past few days, I've been reading about different interpretations of probability, i.e., the frequentist and bayesian approaches (for a primer, try here). This has, of course, led me back to my roots in physics, since both quantum physics (QM) and statistical mechanics rely on probabilities to describe the behavior of nature. Amusingly, I must not have been paying much attention while I was taking QM at Haverford: Niels Bohr once said, "If quantum mechanics hasn't profoundly shocked you, you haven't understood it yet," and back then I was neither shocked nor confused by things like the uncertainty principle, quantum indeterminacy or Bell's Theorem. Today, however, it's a different story entirely.

John Baez has a nice summary and selection of news-group posts that discuss the idea of frequentism versus bayesianism in the context of theoretical physics. This, in turn, led me to another physicist's perspective on the matter. The late Ed Jaynes has an entire book on probability from a physics perspective, but I most enjoyed his discussion of the physics of a "random experiment", in which he notes that quantum physics differs sharply in its use of probabilities from macroscopic sciences like biology. I'll just quote Jaynes on this point, since he describes it so eloquently:

In biology or medicine, if we note that an effect E (for example, muscle contraction) does not occur unless a condition C (nerve impulse) is present, it seems natural to infer that C is a necessary causative agent of E... But suppose that condition C does not always lead to effect E; what further inferences should a scientist draw? At this point the reasoning formats of biology and quantum theory diverge sharply.

... Consider, for example, the photoelectric effect (we shine a light on a metal surface and find that electrons are ejected from it). The experimental fact is that the electrons do not appear unless light is present. So light must be a causative factor. But light does not always produce ejected electrons... Why then do we not draw the obvious inference, that in addition to the light there must be a second causative factor...?

... What is done in quantum theory is just the opposite; when no cause is apparent, one simply postulates that no cause exists; ergo, the laws of physics are indeterministic and can be expressed only in probability form.

... In classical statistical mechanics, probability distributions represent our ignorance of the true microscopic coordinates - ignorance that was avoidable in principle but unavoidable in practice, but which did not prevent us from predicting reproducible phenomena, just because those phenomena are independent of the microscopic details.

In current quantum theory, probabilities express the ignorance due to our failure to search for the real causes of physical phenomena. This may be unavoidable in practice, but in our present state of knowledge we do not know whether it is unavoidable in principle.

Jaynes goes on to describe how current quantum physics may simply be in a rough patch, where our experimental methods are too inadequate to appropriately isolate the physical causes of the apparent indeterministic behavior of our physical systems. But I don't quite understand how this idea could square with the refutations of such hidden variable theories after Bell's Theorem basically laid local realism to rest. It seems to me that Jaynes and Baez, in fact, invoke similar interpretations of all probabilities, i.e., that they only represent our (human) model of our (human) ignorance, which can be about either the initial conditions of the system in question, the rules that cause it to evolve in certain ways, or both.

It would be unfair to those statistical physicists who work in the field of complex networks to say that they share the same assumption of no-causal-factor that their quantum physics colleagues may accept. In statistical physics, as Jaynes points out, the reliance on statistical methodology is forced on physicists by our measurement limitations. Similarly, in complex networks, it's impractical to know the entire developmental history of the Internet, the evolutionary history of every species in a foodweb, etc. But unlike statistical physics, in which experiments are highly repeatable, every complex network has a high degree of uniqueness, and is thus more like biological and climatological systems, where there is only one instance to study. To make matters even worse, complex networks are also quite small, typically having between 10^2 and 10^6 parts; in contrast, most systems that concern statistical physics have 10^22 or more parts. In such large systems, it's probably not terribly wrong to take a frequentist perspective and assume that relative frequencies behave like probabilities. But when you only have a few thousand or million parts, such claims seem less tenable, since it's hard to argue that you're close to asymptotic behavior. Bayesianism, being more capable of dealing with data-poor situations in which many alternative hypotheses are plausible, seems to offer the right way to deal with such problems. But, perhaps owing to the history of the field, few people in network science seem to use it.
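
To make that last point a bit more concrete, here's a minimal sketch (my own toy example, with made-up numbers) of how slowly empirical frequencies settle down to the underlying probability: the fluctuations shrink only as one over the square root of the number of parts, so a system with 10^2 or even 10^6 parts is nowhere near the asymptotic comfort zone that a system with 10^22 parts enjoys.

    # A toy illustration (mine, not from any paper): how big are the fluctuations
    # of an observed frequency around the true probability at different system sizes?
    import numpy as np

    rng = np.random.default_rng(0)
    p = 0.3  # hypothetical "true" probability of some microscopic event

    for n in (10**2, 10**4, 10**6):
        # repeat the "experiment" 200 times and measure the spread of the
        # observed frequency across repetitions
        freqs = rng.binomial(n, p, size=200) / n
        print(f"N = {n:>9,d}   std of observed frequency = {freqs.std():.5f}")

    # The spread shrinks only as 1/sqrt(N): negligible for ~10^22 parts, but
    # not for the 10^2 - 10^6 parts typical of complex networks.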

For my own part, I find myself being slowly seduced by the Bayesians' siren call of mathematical rigor and the notion of principled approaches to these complicated problems. Yet, there are three things about the bayesian approach that make me a little uncomfortable. First, given that with enough data it doesn't matter what your original assumption about the likelihood of any outcome is (i.e., your "prior"), shouldn't bayesian and frequentist arguments lead to the same inferences in a limiting, or simply very large, set of identical experiments? If this is right, then it seems reasonable that statistical physicists have been using frequentist approaches for years with great success. Second, in the case where we are far from the limiting set of experiments, doesn't being able to choose an arbitrary prior amount to a kind of scientific relativism? Perhaps this worry is misplaced, because the manner in which you update your prior, given new evidence, is what distinguishes the approach from certain crackpot theories.
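
On the first point, a quick back-of-the-envelope sketch (again my own toy example, assuming a simple Beta prior on a coin's bias) shows the prior washing out: the posterior mean is (a + heads) / (a + b + n), so two wildly different priors give nearly identical answers once n is large, and both agree with the frequentist estimate heads / n.

    # A toy illustration (mine): for a Beta(a, b) prior and `heads` successes in
    # n coin flips, the posterior mean is (a + heads) / (a + b + n).
    import numpy as np

    rng = np.random.default_rng(1)
    true_p = 0.7
    priors = {"skeptical Beta(1, 9)": (1, 9), "credulous Beta(9, 1)": (9, 1)}

    for n in (10, 1_000, 100_000):
        heads = rng.binomial(n, true_p)
        posterior_means = {name: (a + heads) / (a + b + n) for name, (a, b) in priors.items()}
        print(n, {name: round(m, 3) for name, m in posterior_means.items()})

    # As n grows, both posterior means and the frequentist estimate heads / n
    # converge on true_p, so the choice of prior stops mattering.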

Finally, choosing an initial prior seems highly arbitrary, since one can always recurse a level and ask what prior on priors you might take. Here, I like the idea of a uniform prior, i.e., assuming everything is equally plausible, and of using the principle of maximum entropy (MaxEnt; a generalization of Laplace's principle of indifference). Entropy is a nice way to connect this approach with certain biases in physics, and may say something very deep about the behavior of our incomplete description of nature at the quantum level. But it's not entirely clear to me (or, apparently, others: see here and here) how to use maximum entropy in the context of previous knowledge constraining our estimates of the future. Indeed, one of the main things I still don't understand is how, if we model the absorption of knowledge as a sequential process, we can update our understanding of the world in a rigorous way while guaranteeing that the order in which we see the data doesn't matter.
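
The one case where I do understand the guarantee is sketched below (my own toy example, a conjugate Beta-Bernoulli model, not MaxEnt itself): sequential updating just accumulates counts, so any permutation of the data gives exactly the same posterior. The order-independence comes from the likelihood factorizing over observations; what I don't see is how to get the same property in more general settings.

    # A toy illustration (mine): Beta-Bernoulli updating, one observation at a time.
    import random

    def sequential_posterior(data, a=1.0, b=1.0):
        """Start from a uniform Beta(1, 1) prior and update on each observation."""
        for x in data:       # x is 1 (success) or 0 (failure)
            a += x
            b += 1 - x
        return a, b          # parameters of the Beta posterior

    data = [1, 0, 1, 1, 0, 1, 0, 0, 1, 1]
    shuffled = data[:]
    random.shuffle(shuffled)

    print(sequential_posterior(data))      # (7.0, 5.0)
    print(sequential_posterior(shuffled))  # identical, whatever the order

    # The guarantee is a property of the model (the likelihood factorizes over
    # observations), not something the choice of prior provides for free.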

Update July 17: Cosma points out that Jaynes's Bayesian formulation of statistical mechanics leads to unphysical implications, like a backwards arrow of time. Although it's comforting to know that statistical mechanics cannot be reduced to mere Bayesian crank-turning, it doesn't resolve my confusion about just what it means that the quantum state of matter is best expressed probabilistically! His article also reminds me that there are good empirical reasons to use a frequentist approach, reasons based on Mayo's arguments that should be familiar to any scientist who has actually worked with data in the lab. Interested readers should refer to Cosma's review of Mayo's Error, in which he summarizes her critique of Bayesianism.

posted July 17, 2006 03:30 PM in Scientifically Speaking | permalink | Comments (0)

July 10, 2006

That career thing

I'm sure this piece of advice to young scientists by John Baez (of quantum gravity fame) is old news now (3 years on). But, seeing as it was written before I was paying attention to this kind of stuff myself, and it seems like quite good advice, here it is, in a nutshell:

1. Read voraciously, ask questions, don't be scared of "experts", and figure out what are the good problems to work on in your field.
2. Go to the most prestigious school, and work with the best possible advisor.
3. Publish often and publish stuff people will want to read (and cite).
4. Go to conferences and give good, memorable talks.

Looking back over my successes so far, I think I've done a pretty good job on most of these things. His advice about going to a prestigious place seems to be more about getting a good advisor - I suppose that in physics, being a very old field, the best advisors can only be found at the most prestigious places. But I'm not yet entirely convinced that this is true for the interdisciplinary mashup that includes complex networks and the other things I like to study...

posted July 10, 2006 12:50 AM in Simply Academic | permalink | Comments (3)

July 08, 2006

Electronic Frontier Foundation

For many years, I've been interested in the evolution of intellectual property law and of our civil liberties in a fully-wired world (where privacy is non-existent, by default). Typically, I get my updates on these fronts from Ars Technica, a techie site that follows these things closely. Net Neutrality (see here and here) is a big issue right now, and Congress is holding hearings about just how democratic it wants online speech to be (which is to say, some very deep pockets like Comcast and AT&T are trying to make it very undemocratic). One of the champions of keeping the digital world sane (by any reasonable measure) is the Electronic Frontier Foundation (EFF), who recently sued AT&T (new slogan: Your world. Delivered. To the NSA.) over their collusion with the NSA to violate the civil liberties of millions of Americans. MSNBC has an article that discusses a bit of the EFF's history and ongoing work that's worth a quick read. After that, I highly recommend popping over to the EFF's website, and, if you are so motivated, give them a little support.

posted July 8, 2006 01:35 AM in Political Wonk | permalink | Comments (0)

July 07, 2006

The popularity contest

Congratulations to friends Cosma Shalizi over at Three-Toed Sloth and Dave Bacon at Quantum Pontiff for making the top 50 of Nature News' ranking of scientists' blogs! I'm duly impressed with you both, and simply hope that you continue to write interesting stuff at your regular prolific rates.

Cosmic Variance, which I occasionally frequent for my fix of cosmological weirdness, also made the list at #4.

posted July 7, 2006 02:50 AM in Reviews | permalink | Comments (0)

July 06, 2006

An ontological question about complex systems

Although I've been reading Nature News for several years now (as part of my daily trawl for treasure in the murky waters of science), I first came to recognize one of their regular writers, Philip Ball, when he wrote about my work on terrorism with Maxwell Young. His essay, now hidden behind Nature's silly subscription-only barrier, sounded an appropriately cautionary note about using statistical patterns of human behavior to predict the future, and was even titled "Don't panic, it might never happen."

The idea that there might be statistical laws that govern human behavior can be traced, as Ball does in his essay, back to the English philosopher Thomas Hobbes (1588-1679) in The Leviathan and to the French positivist philosopher Auguste Comte (1798-1857; known as the father of sociology, and who also apparently coined the term "altruism"), who were inspired by the work of physicists in mechanizing the behavior of nature to try to do the same with human societies.

It seems, however, that somewhere between then and now, much of sociology has lost interest in such laws. A good friend of mine in graduate school for sociology (who shall remain nameless to protect her from the politics of academia) says that her field is obsessed with the idea that context, or nurture, drives all significant human behavior, and that it rejects the idea that overarching patterns or laws of society might exist. These, apparently, are the domain of biology, and thus Not Sociology. I'm kind of stunned that any field that takes itself seriously would so thoroughly cling to the nearly medieval notion of the tabula rasa (1) in the face of unrelenting scientific evidence to the contrary. But, if this territory has been abandoned by sociologists (2), it has recently, and enthusiastically, been claimed by physicists (who may or may not recognize the similarity of their work to a certain idea in science fiction).

Ball's background is originally in chemistry and statistical physics, and having spent many years as an editor at Nature, he apparently now has a broad perspective on modern science. But, what makes his writing so enjoyable is the way he places scientific advances in their proper historical context, showing both where the inspiration may have come from, and how other scientists were developing similar or alternative ideas concurrently. These strengths are certainly evident in his article about the statistical regularity of terrorism, but he puts them to greater use in several books and, in particular, one on physicists' efforts to create something he calls sociophysics. As it turns out, however, this connection between physics and sociology is not a new one, and the original inspiration for statistical physics (one of the three revolutionary ideas in modern physics; the other two are quantum mechanics and relativity) is owed to social scientists.

In the mid-1800s, James Clerk Maxwell, one of the fathers of statistical physics, read Henry Thomas Buckle's lengthy History of Civilization. Buckle was a historian by trade, and a champion of the idea that society's machinations are bound by fundamental laws. Maxwell, struggling with the question of how to describe the various motions of particles in a gas, was struck by Buckle's descriptions of the statistical nature of studies of society. Such studies sought not to describe each individual and their choices exactly, but instead to represent the patterns of behavior statistically, and often pointed to surprising regularities, e.g., the near-stable birth or suicide rates in a particular region. As a result, Maxwell abandoned the popular approach of describing gas particles only using Newtonian mechanics, i.e., an attempt to describe every particle's position and motion exactly, in favor of a statistical approach that focused on the distribution of velocities.

It was the profound success of these statistical descriptions that helped cement this approach as one of the most valuable tools available to physicists, and brought about some major shifts in our understanding of gases, materials and even astrophysics. So, it seems fitting that statistical physicists are now returning to their roots by considering statistical laws of human behavior. Alas, I doubt that most such physicists appreciate this fact.

These efforts, which Ball surveys in "Critical Mass" (Farrar, Straus and Giroux, 2004) via a series of well-written case studies, have dramatically altered our understanding of phenomena as varied as traffic patterns (which have liquid, gaseous, solid and meta-stable states, along with the corresponding phase transitions), voting patterns in parliamentary elections (which display nice heavy-tailed statistics), the evolution of pedestrian traffic trails across a university quad, and the economics and statistics of businesses and markets; social networks, alas, get only a very shallow discussion. Although his exposition is certainly aimed at the layman, he does not shy away from technical language when appropriate. Pleasantly, he even reproduces figures from the original papers when it serves his explanations. Given that these phenomena were drawn from a burgeoning field of interdisciplinary research, it's easy to forgive him for omitting some of my favorite topics, treating others only shallowly, and mercifully leaving out the hobby horses of cellular automata, genetic algorithms and artificial life.

Now, after seeing that list of topics, you might think that "Critical Mass" is a book about complex systems, and you might be right. But you might be wrong, too, which is the problem when there's no strict definition of a term. So, let's assume he has indeed written a book about complex systems, and see what this offers in terms of clarifying the corresponding ontological question. For one thing, Ball's choices suggest that perhaps we do not need other ill-defined properties like emergence, self-organization or robustness (3) to define a complex system. Instead, perhaps when we say we are studying a "complex system," we simply mean that it has a highly heterogeneous composition that we seek to explain using statistical mechanisms. To me, the former means that I, because of my limited mental capacity to grasp complicated equations, relationships or a tremendously large configuration space, pretty much have to use a statistical characterization that omits most of the detailed structure of the system; also, I say heterogeneous because homogeneous systems are much easier to explain using traditional statistical mechanics. The latter means that I'm not merely interested in describing the system, which can certainly be done using traditional statistics, but rather in explaining the rules and laws that govern the formation, persistence and evolution of that structure. For me, this definition is attractive both for its operational and utilitarian aspects and because it doesn't require me to wave my hands, use obfuscating jargon or otherwise change the subject.

In general, it's the desire to establish laws that reflects complex systems' roots in physics, and it is this that distinguishes the field from traditional statistics and machine learning. In those areas, the focus seems to me to be more on predictive power ("Huzzah! My error rate is lower than yours.") and less on mechanisms. My machine learning friends tell me that people are getting more interested in the "interpretability" of their models, but I'm not sure this is the same thing as building models that reflect the true mechanical nature of the underlying system... of course, one fundamental difference between much of statistical learning and what I've described above is that for many systems, there's no underlying mechanism! We shouldn't expect problems like keeping the spam out of my inbox to exhibit nice mechanistic behavior, and there are a tremendous number of such problems out there today. Fortunately, I'm happy to leave those to people who care more about error rates than mechanisms, and I hope they're happy to leave studying the (complex) natural world, mechanisms and all, to me.

Updates, July 7

(1) The notion of the tabula rasa is not antithetical to the idea that there are patterns in social behavior, but patterns per se are not the same as the kind of societal laws that the founders of sociology were apparently interested in, i.e., sociology apparently believes these patterns to be wholly the results of culture and not driven by things that every human shares like our evolutionary history as a species. I suppose there's a middle ground here, in which society has created the appearance of laws, which the sociophysicists then discover and mistake for absolutes. Actually, I'm sure that much of what physicists have done recently can be placed into this category.

(2) It may be the case that it is merely the portion of sociology that my friend is most familiar with that expresses this odd conviction, and that there are subfields that retain the idea that true mechanistic laws do operate in social systems. For all I know, social network analysis people may be of this sort; it would be nice to have an insider's perspective on this.

(3) Like the notions of criticality and universality, these terms actually do have precise, technical definitions in their proper contexts, but they've recently been co-opted in imprecise ways and are now, unfortunately and in my opinion, basically meaningless in most of the complex systems literature.

posted July 6, 2006 07:09 PM in Reviews | permalink | Comments (0)