WATERLOO, Ontario—One thing that’s both disconcerting and exhilarating about physics is how many seemingly simple questions remain unanswered. When you hear the questions that physicists struggle with, you sometimes say to yourself, Wait, you mean they don’t even know that? Physics might be defined as the subject that tries to figure out why the world may look incomprehensibly complex at first, but on closer examination is governed by simple laws. Those laws, applied repeatedly, build up the complexity. From this definition, you’d presume that physicists have at least sorted out what they mean by “law”.
Why should nature be governed by laws? Why should those laws be expressible in terms of mathematics? Why should they be formulated within space and time? These were the questions posed at a fascinating workshop two weeks ago at the Perimeter Institute, the sequel to a workshop held at Arizona State University in December 2008. One of the participants, Sabine Hossenfelder, talked about it yesterday at Backreaction, one of the most consistently thoughtful of all physics blogs. The bottom line is that the organizers had better start planning on more sequels, because the questions seem as intractable as ever.
I don’t think I’d ever been to a conference quite like this workshop. Where else could I have heard a derivation of the theory of quantum mechanics, an argument against polytheism, and a trick for giving directions to a place you don’t know, all in the span of a couple of hours? The participants were a mix of physicists and academic philosophers. These two communities were very close in the days of Einstein, but then drifted apart. In his book Dreams of a Final Theory, particle physicist Steven Weinberg had a chapter entitled “Against Philosophy” that summarized the disdain physicists of his generation felt for the subject. But as I wrote in an essay several years ago, times are changing, largely because many physicists think their search for a unified theory is stalling for a failure to think through philosophical questions. At meetings where the two groups come together, they strike me as quite compatible. The philosophers in attendance tend to have training in physics, and the physicists, even if they can’t tell their Hegel from their Heidegger, are eager to learn.
Their main difference is style. Physicists tend to speak rather loosely and rely on mathematics to back them up, whereas philosophers are more meticulous rhetorically (sometimes to a fault). Physicists also have a tendency to interrupt speakers with questions early and often, preventing the philosophers from ever getting to their point. “Philosophers are much more civilized than physicists are,” mused (physicist) Niayesh Afshordi.
What really made this workshop odd, though, is that, with a few exceptions, the talks were long on assertions and short on arguments. It was essentially a three-day brainstorming session, meant to provoke and send the participants home with new ideas they might eventually weave into their work, rather than convey concrete results. That very quality makes writing a blog post about it as challenging as summarizing Proust. Pour yourself a cup of coffee and settle back.
What are laws?
The first several speakers went straight for the question of what laws are. Among them was philosopher John Roberts of the University of North Carolina at Chapel Hill. (Video of his talk is here.) A law not only describes a pattern in nature, but distinguishes between patterns that arise by chance and those that are always there, independent of the particulars of a situation. What this means is frustratingly tricky to pin down, and it gets worse when you talk about the entire universe. If the universe is all there is, how could it have been any different? If it couldn’t have, then what’s the difference between a chance pattern and an inherent one?
Roberts reviewed some leading philosophical schools of thought, found them wanting, and argued that the concept of a law is inseparable from the way that physicists discover laws. Their main tool is a controlled experiment, which, by its very nature, looks for patterns that hold whatever the specific conditions might be. I confess that I didn’t understand how Roberts’s approach helps with the questions we most care about: Why is nature patterned rather than chaotic? Why does a law gleaned from one situation (say, falling apples) work in unrelated situations (orbiting planets)? But it does seem useful to acknowledge that our laws, even if they capture some objective reality, are conditioned by our process of discovery.
The following speaker, physicist Marcelo Gleiser of Dartmouth, made a similar point in a physics-y way: science, he said, is very tool-dependent. But he then went off in a different direction, arguing that a final theory is a false dream because new tools invariably mean new discoveries. Backreaction has more commentary on his talk, which you can watch for yourself, and there’s always Gleiser’s new book. Throughout the workshop, the participants kept returning to the concern that there may be no final, unified theory, but only a patchwork of theories.
My own reaction was that although it’s useful to caution against clinging to preconceived ideas about a final theory, Gleiser was too insistent on seeing the glass of physics as half-empty. We may be ignorant of much, but we know a remarkable amount, too, and everything we see indicates that nature is governed by simple laws. Observers are making new discoveries all the time, but new discoveries don’t mean new laws. The vast bulk of what they find can be understood using existing laws, and the exceptions arise in situations where laws come into conflict, suggesting that reconciling them will provide an explanation for the exceptions, too.
In the question-and-answer period after Gleiser’s talk, astrophysicist and novelist Janna Levin of Columbia University made a good point: although physicists often draw a contrast between observation and pure thought, our minds are shaped by the physical world, so our thoughts, too, represent an indirect form of observation.
A later speaker, physicist David Wolpert of the NASA Ames Research Center, also sounded some caution. The main purpose of laws is to make reliable predictions, but this goal might be inherently unachievable. Using a variant of the argument that Kurt Gödel used to prove his incompleteness theorems, Wolpert showed that there are predictions that physicists can never guarantee to be correct. One cute implication is the “monotheism theorem”: there can be at most one omniscient god. If there were two, they’d be able to read each other’s minds and run into paradoxes of circularity. For more, watch the talk, read the paper, or, even better, reread your John Milton.
How do physicists choose laws and test them?
Philosopher Chris Smeenk of the University of Western Ontario picked up the question of how to formulate a law of the entire universe. A law typically applies to multiple situations such as reproducible experiments, yet there is only one universe. But he argued that a unique universe still has multiple levels of approximation. Physicists routinely start with a crude guess for planetary orbits or particle behavior and gradually refine it. Each of these steps of refinement, Smeenk suggested, is a distinct situation that allows you to test the laws. Watch the video here.
In the Q&A period, English physicist Julian Barbour said that people overstate the role of reproducible experiments in classical physics, anyway. In practice, a single experiment can be plenty. Only in quantum physics does repetition become essential, because quantum theory is probabilistic and probability implies multiple instances.
I found common ground between Smeenk’s talk and a later one by Carnegie Mellon philosopher Kevin Kelly. Kelly sought to explain Occam’s razor: the precept that the best law is the simplest one that fits the data. The razor is one of those ideas that physicists use all the time without really thinking why — or whether it causes them to see simplicity that isn’t necessarily there. The leading argument for why the razor works comes from probability theory and holds that simple theories really are more likely to be correct than complex ones; American Scientist magazine had a fantastic article in 1991 that laid out the case. But Kelly put forward a different explanation: the razor works because a simpler law is less likely than a complex one to be retracted. It can be considered the first step in a series of successive approximations. It might need to be enhanced and augmented, but it’s less likely to be flat-out wrong.
Kelly compared it to giving road directions. Suppose a driver comes up to you and asks for directions to a place you don’t know. You want to be helpful and not admit your ignorance. What should you do? The trick is to choose the route that leads to the most places — probably the nearest freeway or the road leading downtown. That way, you’ve got the best chance of sending them the right way and sparing them from having to double back. Kelly said Occam’s razor puts physicists on the best route to the right law, even if it can’t pick out that law. Video of his talk is here.
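Kelly’s trick amounts to a simple graph search: from where you stand, pick the outgoing road from which the most destinations remain reachable. The sketch below is purely illustrative — the toy town map and the function names are my invention, not anything Kelly presented:

```python
from collections import deque

def reachable(graph, start):
    """Return the set of intersections reachable from `start` (breadth-first search)."""
    seen = {start}
    queue = deque([start])
    while queue:
        node = queue.popleft()
        for nxt in graph.get(node, []):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return seen

def best_direction(graph, here):
    """Pick the outgoing road that keeps the most destinations in play."""
    return max(graph[here], key=lambda road: len(reachable(graph, road)))

# A made-up town: the freeway leads on to far more places than the side street.
town = {
    "here":        ["side_street", "freeway"],
    "side_street": ["dead_end"],
    "dead_end":    [],
    "freeway":     ["downtown", "suburb", "airport"],
    "downtown":    ["harbor"],
    "suburb":      [],
    "airport":     [],
    "harbor":      [],
}

print(best_direction(town, "here"))  # freeway
```

Like Occam’s razor in Kelly’s telling, the heuristic doesn’t know the destination; it just minimizes the chance of a retraction — of having to turn the driver around.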
Is time illusory or real?
The real fireworks at the workshop came from disagreements over time — not over whether the speakers were running behind schedule and cutting into the coffee breaks, but over whether time itself is a derived concept or a fundamental one. Does time emerge from something deeper or is it an irreducible part of the natural world? In our current issue, philosopher Craig Callender of U.C. San Diego lays out the case for the first option, based partly on Barbour’s ideas.
Barbour has a typically English understated sense of humor. “I’m happy to let go of time,” he told the workshop participants. “I did it about 40 years ago.” His talk, which you can watch here, was masterful, if unconventional. It did not lay out a scientific argument in the usual sense: a pile of data and formulas that bludgeon even the hardest skeptic into grudging acceptance. Rather, Barbour took us on a guided walk through the forest of his fertile mind.
For instance, as a metaphor for the universe, he drew a circle (see photo above) with 24 red and blue dots. There was a logic to the coloring: it maximized the variety of color sequences around the circumference. If you didn’t know the color of a dot, you could deduce it by looking at all the others and figuring out which color would maximize the variety of the whole. It reminded me of one of those logic puzzles where you don’t know what color hat you have on your head, but can figure it out by seeing what hats everyone else is wearing.
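Barbour’s circle can be played with directly. Here is a toy version, with an 8-dot ring instead of his 24 (brute force over 24 dots would be slow), and with “variety” interpreted — my assumption, not Barbour’s definition — as the number of distinct contiguous color sequences that appear anywhere around the circle:

```python
from itertools import product

def variety(ring):
    """Count distinct contiguous color patterns around a circular ring.

    'Variety' here is a guess at Barbour's notion: the number of distinct
    color sequences of every length found somewhere on the circle.
    """
    n = len(ring)
    doubled = ring + ring  # unroll the circle so windows can wrap around
    patterns = set()
    for length in range(1, n + 1):
        for start in range(n):
            patterns.add(tuple(doubled[start:start + length]))
    return len(patterns)

def most_varied_ring(n):
    """Brute-force the red/blue coloring of an n-dot ring with maximal variety."""
    return max(product("RB", repeat=n), key=variety)

best = most_varied_ring(8)
print("".join(best), variety(best))
```

A perfectly alternating ring scores poorly — every window of a given length looks like every other — while the winning colorings mix runs of different sizes, which is what lets a single missing dot be deduced from all the rest.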
The universe, Barbour suggested, is a bit like this. The particles that constitute it do not have built-in properties such as spatial position. Instead those properties arise from the relations among the particles. A particle is ascribed a certain position by virtue of what relations it bears to all the other components. He described how particle relations might be categorized geometrically and how such basic concepts as position, length, duration, and simultaneity can then be derived. The only geometric property he had to presume, rather than derive, was angles between lines. In fact, if you think about it, when do you ever observe a length? You always infer length from angles, such as the angle between light rays subtended at your eyes.
Barbour is not the only physicist to argue that the fundamental laws of nature are “conformally invariant,” meaning that they have no built-in sense of scale, but do involve angles. In relativity theory, those angles represent cause-effect relations.
The main trouble I had with the talk was that I didn’t see how the abstract ideas related to the world we experience. Time seems so real. How did it arise? Why is the world structured in the very special way that is needed to give rise to time? In short, what do we really gain by saying that time isn’t real?
Talks by two other physicists put some flesh on the bones of the emergent-time idea. Kevin Knuth of the University at Albany showed how you can begin with a network of cause-effect relations and recover space and time from them. For more details, read his paper or roll the video. Philip Goyal of Perimeter showed that you can even recover the entire theory of quantum mechanics from such a network. His talk is here.
An irony is that Barbour used to be a lonely voice for this option, but it is becoming the mainstream view. It is now more radical to suggest that time is foundational. That is what the unlikely pairing of physicist Lee Smolin of Perimeter and political philosopher Roberto Unger of Harvard Law School did. Like Barbour, they didn’t really present an argument, but a manifesto.
Smolin took issue with what he called the Newtonian Paradigm, the conceptual division of nature into two elements: (a) the state of the world, and (b) the laws of physics. The state of the world is defined in space. In classical mechanics, such as the rules that govern a pool table, the state consists of the positions and velocities of objects. The laws of physics operate in time. They take one state to the next. Smolin suggested that this framework, though it works well for everyday situations, runs off the rails when applied to the entire universe. It leads to conclusions that he and Unger called absurd, such as the “block universe” — the proposition that all times, past and future, are equally real. Watch their tag-team talk here.
The most tangible idea I took away is that if time is real and the future is genuinely open, then the laws of physics might themselves change. You can either take the laws of nature as fixed, in which case time emerges, or take time as fundamental, in which case the laws of nature evolve. To me, this sounds like a restatement of a controversial argument made by the American philosopher Charles Sanders Peirce a century ago. Smolin and Unger find the latter view more natural, but it runs into two immediate problems: What evidence is there that the laws have ever changed? And if they do change, are those changes themselves subject to laws? If so, you’re either caught in an endless regress of laws, meta-laws, meta-meta-laws, and on and on, or you have to suppose that some laws really are fixed.
Ultimately, the manifesto will prove its worth only if it leads to a fleshed-out theory. The best attempts so far are Smolin’s own cosmological natural selection, Fotini Markopoulou’s quantum graphity model for spacetime, and Petr Horava’s emergent-space proposal.
Could black holes be responsible for the accelerating universe?
By this point, my head was throbbing with deep thoughts. Niayesh Afshordi’s talk came as a relief. It had equations! It cited observations! It made predictions! It didn’t tell me that everything I thought I knew was wrong! All it tried to do was explain away dark energy. At any other conference, that would have counted as dauntingly radical. Here, it seemed positively modest.
The model he presented took off from two speculative but plausible ideas. First, space is filled with an invisible fluid — an aether — as predicted by some proposed quantum theories of gravity, such as Horava’s. Second, black holes give off feeble radiation, as predicted by almost every quantum gravity theory. Afshordi calculates that the radiation should heat the aether and, like bringing a pot of water to a boil, generate a (negative) pressure throughout the cosmos. Such a pressure is the quintessential attribute of dark energy and has the consequence of speeding up cosmic expansion.
In other words, quantum gravitational effects might mimic dark energy. This model neatly explains why cosmic acceleration began several billion years ago rather than all the way back at the big bang: it took a while for black holes to form and heat up the aether. The paper is worth reading, and the talk isn’t bad, either.
If you’ve made it all the way to the end of this blog post, you have truly proven yourself an aficionado of physics, and I’d love to hear from you in the comments section! You’ll be pleased to learn that I hope to invite many of these researchers to present their ideas in the print magazine over the coming years.
Photograph of Julian Barbour by George Musser