Fundamental Research Funding

Michael Nielsen, who’s so smart it’s like he’s posting from tomorrow, offers a couple of provocative questions about the perception of a crisis in funding for basic science:

First, how much funding is enough for fundamental research? What criterion should be used to decide how much money is the right amount to spend on fundamental research?

Second, the human race spent a lot more on fundamental research in the second half of the twentieth century than it did in the first. It’s hard to get a good handle on exactly how much, in part because it depends on what you mean by fundamental research. At a guess, I’d say at least 1000 times as much was spent in the second half of the twentieth century. Did we learn 1000 times as much? In fact, did we learn as much, even without a multiplier?

Well, at least there’s nothing controversial there…

These are excellent questions, but they’re also uncomfortable questions. They’re not unaskable, but it’s almost unsportsmanlike to ask them directly of most scientists. They’re questions that really need to be confronted, though, and I think a good deal of the image problem that science in general has at the moment can be traced to a failure to grapple more directly with issues of funding and the justification of funding.

Taking these in reverse, I think it’s hard to quantify the amount of “learning” that went on in the 20th Century– are you talking only about fundamental discoveries (the Standard Model, the accelerating universe), or do experimental realizations of old ideas (Bose-Einstein Condensation) count? In the latter half of the 20th century, we probably worked out the quantum details of 1000 times as many physical systems as in the first half, but that sort of thing feels a little like stamp collecting– adding one new element to a mixture and then re-measuring the band structure of the resulting solid doesn’t really seem to be on the same level as, say, the Schrödinger equation, but I’m at a loss for how to quantify the difference.

If we think only about really fundamental stuff, I think it’s interesting to look at the distribution of funding, which has become much more centralized out of necessity. It’d be hard to argue that the increase in fundamental knowledge has kept pace with the increase in funding, but to some degree, complaining about that is a little like a first-year grad student grumbling that it was much easier to get your name on an equation back in 1930. All the easy problems have been done, meaning that you have to sink a lot of resources into research these days to make even incremental progress.

Experiments have gotten more expensive, and as a result, the number of places they can be done has gotten smaller– when the LHC finally comes on line, it will pretty much be the only game in town. That necessarily limits the total amount of stuff you can hope to discover– if there’s only one facility in the world at which you can do some experiment, you’re not going to be able to make as many discoveries as you could with 1000 different facilities doing the same sorts of experiments.

This isn’t restricted to high-energy physics, either. Somebody at DAMOP this year remarked that we seem to be asymptotically approaching particle physics– the number of lasers and gadgets involved in a typical BEC experiment is increasing every year, the author lists are getting longer, and fewer groups are able to really compete.

The more important question, though, is whether we should really expect or demand that learning be proportional to funding. And what, exactly, do we as a society expect to get out of fundamental research?

For years, the argument has been based on technology– that fundamental research is necessary to understand how to build the technologies of the future, and put a flying car in every garage. This has worked well for a long time, and it’s still true in a lot of fields, but I think it’s starting to break down in the really big-ticket areas. You can make a decent case that, say, a major neutron diffraction facility will provide materials science information that will allow better understanding of high-temperature superconductors, and make life better for everyone. It’s a little harder to make that case for the Higgs boson, and you’re sort of left with the Tang and Velcro argument– that building the next generation of whopping huge accelerators will lead to spin-off technologies that benefit large numbers of people. It’s not clear to me that this is a winning argument– we’ve gotten some nice things out of CERN, the Web among them, but I don’t know that the return on investment really justifies the expense.

And this is where the image problem comes in– I think science suffers in the popular imagination in part because people see vast sums of money being spent for minimal progress on really esoteric topics, and they start to ask whether it’s really worth it. And the disinclination of most scientists to really address the question doesn’t help.

Of course, it’s not like I have a sure-fire argument. Like most scientists, I think that research is inherently worth funding– it’s practically axiomatic. Science is, at a fundamental level, what sets us apart from other animals. We don’t just accept the world around us as inscrutable and unchangeable, we poke at it until we figure out how it works, and we use that knowledge to our advantage. No matter what poets and musicians say, it’s science that makes us human, and that’s worth a few bucks to keep going. And if it takes millions or billions of dollars, well, we’re a wealthy society, and we can afford it.

We really ought to have a better argument than that, though.

As for the appropriate level of funding, I’m not sure I have a concrete number in mind. If we’ve got half a trillion to piss away on misguided military adventures, though, I think we can throw a few billion to the sciences without demanding anything particular in return.