Sean Carroll
Blog post with audio player, show notes, and transcript: https://www.preposterousuniverse.com/podcast/2019/03/04/episode-36-david-albert-on-quantum-measurement-and-the-problems-with-many-worlds/
Patreon: https://www.patreon.com/seanmcarroll
Quantum mechanics is our best theory of how reality works at a fundamental level, yet physicists still can’t agree on what the theory actually says. At the heart of the puzzle is the “measurement problem”: what actually happens when we observe a quantum system, and why do we apparently need separate rules when it happens? David Albert is one of the leading figures in the foundations of quantum mechanics today, and we discuss the measurement problem and why it’s so puzzling. Then we dive into the Many-Worlds version of quantum mechanics, which is my favorite (as I explain in my forthcoming book Something Deeply Hidden). It is not David’s favorite, so he presents the case as to why you should be skeptical of Many-Worlds. (The philosophically respectable case, that is, not a vague unease at all those other universes.)
David Albert received his Ph.D. in physics from Rockefeller University. He is currently the Frederick E. Woodbridge Professor of Philosophy at Columbia University. His research involves a number of topics within the foundations of physics, including the arrow of time (coining the phrase “Past Hypothesis” for the low-entropy state of the early universe) and quantum mechanics. He is the author of a number of books, including Time and Chance, Quantum Mechanics and Experience, and After Physics.
This is what you get when starting from postulates instead of working out the dynamics of a realistic quantum measurement: no clue where to go, and over an hour of mumbling about misconceptions. For your interest: a macroscopic system cannot be in a pure state (cannot have "its" wave function), hence the Schrödinger-cat problem can at best be discussed for a microscopic cat.
Sean, why not dedicate one podcast to what is learned from solving the dynamics of realistic measurements and the ensuing statistical interpretation of quantum mechanics?
This conversation was like science porn for me.
Ok, I listened to the whole thing and felt absolutely no temptation to embrace Many-Worlds. I don't think the best objections were really given any air time, though. I still think it's problematic that the number of branches at a decision point appears to depend on the superposition probabilities. And because of the way numbers behave, a tiny, tiny change in the probability split could vastly change that number. For example, let a two-possibility event have a 50/50 probability. Clearly we get two new branches. But what if the probability is 50.001/49.999? Now it looks like we get 100,000 branches. An absolutely trivial change in initial conditions has VASTLY changed the outcome. The only way I see to remove the variance is to ALWAYS create a huge number of branches. Ok, how many? And why? Where is the theory that explains all of that?
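A toy calculation (my own, not from the episode) makes the commenter's arithmetic concrete: if every branch must carry equal weight, the number of branches needed to realize a two-outcome split is the denominator of the probability in lowest terms, so a nudge from 1/2 to 50001/100000 inflates the count by almost five orders of magnitude.

```python
# Toy illustration of the branch-counting worry: equal-weight branches
# force the branch count to be the denominator of the probability
# expressed as a fraction in lowest terms.
from fractions import Fraction

def equal_weight_branches(p):
    """Minimal number of equal-weight branches realizing a two-outcome
    split with probability p (given as a string like '1/2') for outcome A."""
    return Fraction(p).limit_denominator(10**6).denominator

print(equal_weight_branches("1/2"))           # 50/50 split -> 2 branches
print(equal_weight_branches("50001/100000"))  # 50.001/49.999 -> 100000 branches
```

This is only a restatement of the comment's arithmetic, not a claim about how any serious Everettian treatment actually counts branches.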
No offense intended, Dr. Carroll, and I have nothing but the highest respect for your skills and abilities, but no matter how hard I try I can't seem to keep this one from having a very foul odor. Just doesn't have the "ring of rightness" to it.
56:10 – I don't see that as strange at all. You're just stating that there is a perfectly definite fact of the matter about everything IN THE PAST. Of course there is.
46:00 – Ok. But what about the state in which some other observable altogether was measured, and the associated instruments are pointing toward the possible values of that one, and the human observer's brain is in the state of seeing that? A quantum system's state vector has one value. The decision to regard it as a linear combination of other values (eigenvectors of some observable) is an entirely arbitrary one. What keeps those universes from appearing? Also, let's say I make a measurement right here in Houston right now. And the universe splits. How does that split propagate? Are there instantly two Andromeda galaxies? Or does the split ripple outward at the speed of light?
Also, let's say we have a quantum event about to occur that has two possible outcomes, A and B. And let's say our pre-conditioning has created a condition where there is a 37% chance of A and a 63% chance of B. What exactly happens when things split? Do we get two universes? Or do we get a hundred, with 37 of them showing A and 63 of them showing B? If the answer is "two," then what do the Born probabilities mean in this interpretation?
This whole interpretation, in addition to being unprovable and therefore non-scientific, raises as many questions as it answers. Honestly I fail, and I mean utterly fail, to see how any serious-minded scientist can consider it for more than a few seconds.
Wait – expel someone from the PhD program for READING A BOOK????? Ok, after listening to that, I just want to say that I consider behavior of the sort exhibited by "the department" at Rockefeller as patently CRIMINAL.
51:38 ………. 51:52 there we go 😃
To sum up what I've learned from this:
There is no point for the fact of the matter.
Watched
Jung actually talked a lot about quantum mechanics and thought it might be an explanation for his observation of 'synchronicity'. Jung had some pretty extreme ideas, but his intellect was incredible, so I wouldn't dismiss him as a pseudoscientist.
Sean, please have Niels Bohr on. He seems mighty interesting.
Just listened to a bit again and wanted to vent. His argument about stat mech c. 1:24:00 is just a bunch of hot air. It barely rises above the level of "I don't see why it works, therefore it's wrong." The reason you get, e.g., thermodynamics almost "for free" from stat mech is that you've already defined temperature as the thing you can measure with a thermometer. Maybe he's not equating "I have no clue about the microstates" with "therefore I have a broad probability distribution over them"? Then Sean points out that "It does work, though," and his proposed explanations-away are vague poppycock! Okay, venting over.
I'm a fairly smart person…I have no clue what the hell David is talking about through this whole thing.
David is always fantastic and logical, although I prefer "many worlds".
I would love to hear some good arguments against Everett. Sadly I didn't understand a thing David was saying, especially how it relates to Everett. Maybe Sean would be willing to summarize and steelman it, and perhaps offer a counterargument?
I can’t necessarily follow all of his philosophical reasoning, but I like Albert, and it just gives me a reason to listen through the whole conversation multiple times, rather than just clicking on another introductory lecture on quantum mechanics that will normally only cover material I’ve already heard a million times.
As an analytic philosopher, I really enjoy your podcasts. Thanks for bringing the disciplines of science and philosophy together for the rest of the public. I find the intersection of the two fascinating. We could all use some more education on social media too. Cheers.
I wish this kept going for another hour or two.
My favourite guest so far. Classy conversation.
Sorry, but I just can't understand David's objection to MWI. Can someone explain it to me?
I'm somewhat confused by David's explanation of the decision-theoretic problems of MWI, but it reminds me of the problem we see in this scenario:
Let's say God comes out of the clouds and makes me the following offer: I can throw this fair quantum coin and if heads all people become immortal but if tails all people die immediately. Now I need to decide whether I want to throw the coin (let's imagine I really like immortal humanity and don't want humans to go extinct). In the classical world or in the world of probabilistic QM I might consider this too risky because of 50% chance of extinction. However, under MW I'm certain to have immortal humans in 50% of branches and I'm also certain that nobody's going to be in the rest of the branches. This seems to be a very different choice from the classical one.
So we are left with the questions like: "should I care about the outcomes in the branches that won't have me if I can influence them? how much should I care?" instead of the questions like "at what odds should I play these kinds of risky lotteries?"
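A minimal sketch of the commenter's coin dilemma (the utility numbers are my own assumptions, purely illustrative): the classical expected-utility verdict is clear, but under a naive "both branches certainly occur" reading the answer depends on how you aggregate utility across branches, which is the point in dispute.

```python
# Toy utilities for the quantum-coin dilemma (assumed values, not from the comment).
U_IMMORTAL = 100.0    # utility of immortal humanity
U_EXTINCT = -1000.0   # utility of extinction
U_STATUS_QUO = 0.0    # utility of declining the offer
P_HEADS = 0.5

# Single-world reading: the toss is a gamble, weighted by chance.
eu_toss = P_HEADS * U_IMMORTAL + (1 - P_HEADS) * U_EXTINCT   # -450.0

# Naive Many-Worlds readings where both branches certainly occur:
avg_over_branches = (U_IMMORTAL + U_EXTINCT) / 2   # -450.0: matches the gamble
sum_over_branches = U_IMMORTAL + U_EXTINCT         # -900.0: a different rule, different scale

print(eu_toss > U_STATUS_QUO)   # False: on the classical reading, don't toss
```

The Wallace-style decision-theoretic program argues the Born-weighted average is the uniquely rational rule; the comment's worry is precisely whether that choice of aggregation can be justified.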
On the multiverse and probabilities.
I really enjoyed the discussion and David Albert's analysis and objections to the "decision theoretic" approach (I am reading David Wallace's "The Emergent Multiverse").
I feel that this "problem of probabilities" hints at an understanding of the multiverse that misplaces its relation to QM, specifically to the epistemic formulation that constitutes "QM as we use it".
I find this misplacement astonishingly evident (to myself, of course) in the decision approach, but there are perhaps less blatant hints in other discussions as well (e.g. see the recent paper by P. Tappenden, "Everett's multiverse and the World as Wavefunction", arXiv:1912.06821).
The root of the problem, as I see it, is that the multiverse is understood under an ontology that does not reflect its connection to the epistemic formulation of QM. I think this understanding of the multiverse is in fact inconsistent with that relation. The probability problem seems to me more a symptom than the malady.
Let's start by asking why and how we get to the notion of the multiverse.
Everett had an amazing intuition, but that intuition was framed by the knowledge available in his time. We live in a different time, and I think there are reasons today leading to the notion of the multiverse that no one had at their disposal when Everett wrote his thesis.
I wrote in my other comment on the measurement problem that I consider decoherence to be at the core of the foundations of the interpretation of QM; what decoherence does to place it in this role is briefly written there. For the present argument, what is important to notice is that decoherence occurs within entanglement. The very (logical) nature of decoherence implies that alternatives are not excluded or selected; instead they are "combed" by the globalizing consistency enforced by interactions and encoded in entanglement. It is this consideration of the nature of decoherence that today should be the concrete motivation for the notion of the multiverse. But it is the very same decoherence that, in our epistemic formulation of QM, yields the emergent classical logic that implies both our solely epistemic probabilities and the objective determination that legitimizes a realistic reading of our physical experience.
This shows that the multiverse is inseparable from probabilities already in its genesis. And this is one of the reasons why I emphasize a difference between our motivation and Everett's possible motivations: we should not forget how we arrived at the multiverse. It stems from decoherence, from our epistemic probabilities, as a way to better understand the structure of QM.
But it is also true that this does not completely excuse the problem. The multiverse is not a simple, direct analytical deduction; it is more like a model of the theory, a structure by which we hope to obtain terms and notions that would make the analytical narration of the structure of QM simpler and more intuitive while reflecting it accurately. For this reason it is a valid question whether this model is satisfied by QM, and one way to disprove it is to show that the multiverse is inconsistent with the very probabilities that are its original motivation.
I think what is inconsistent with probabilities is the ontological intention used to narrate the multiverse, which does not correspond to its origin in decoherence and our probabilities. This results in an "ontological misplacement": an unawareness of the necessity of employing modal logic, which would instead reflect the multiverse's constitutive relation to our epistemic formulation of QM.
Modality here means that "being" or "existing" cannot be expressed without qualification: it is necessary to specify the "modus essendi" when we predicate existence.
The first kind of modality is the one that interests us most: "actuality". This is our way of being, and it applies to our experience and, thanks to QM's decoherence, to its implication of a determination satisfying realism. But the moment you move outside the scope of determination, you can no longer talk in terms of actuality. The way decoherence, in our epistemic formulation of QM, provides us with solely epistemic probabilities is such that the fundamental workings of QM in the proximity of our emergent actuality should be described as "propensity", i.e. as an objective feed to our epistemic probabilities. Or "potency", to use a classical term. But even these two connotations of the modus essendi of QM seem diminutive when regarded from the perspective of the multiverse. The only term I can find to describe it, reflecting its relation to us and conscious of how decoherence unfolds alternatives by mere consistency, is omni-potency. I don't use this term lightly, given its grandeur, and only with much reluctance for something that may sound somehow religious (it has nothing to do with that other notion, omnipotence); I am led to it simply because I can't find any better term for what the fundamental logic of QM is about, and because it seems to me to reflect well the relation of the multiverse to us.
What all this imports for our problem is this: as soon as one realizes that the modus essendi of the multiverse is impossibly different from ours (notwithstanding the fact that ours is emergent within it, but crucially not the other way around), one would never consider even abstractly placing oneself in the shoes of the multiverse and, from that impossible perspective, weighing one's expectations in the form of probabilities in order to argue the inconsistency of that perspective with ours. A reasonable modal narration of QM immediately disallows this perspective, as the way we can say that that exists has nothing to do with the way we say that we exist.
But note that the force of this argument, apparently based on modality, in fact stems from what the narration is intended to reflect and represent: once again, the relation of the general structure and logic of QM to our epistemic formulation and its interpretation, with the mediation between them relying on decoherence.
It is the choice to follow an ontological narration of our analysis of this structure and logic that leads us to arguments of this form; but then we have to choose very carefully the way we speak, somewhat allegorically, of very formal notions, because one point is clear: if our narration is not an accurate representation of the structure of QM, we will very easily be led astray. Modality is a way to support this necessary accuracy, I think, and that is the motivation for its importance here.
In conclusion, my advice is to remember that knowledge precedes ontology; set your perspective accordingly, and don't forget where you come from.
Then, I think there won't be a probability problem for the multiverse.
@28:52 "Wigner has this idea that it was precisely conscious agency that caused departures from the standard quantum mechanical equations of motion that caused the collapse of the wave function" is, I think, partly incorrect with respect to Wigner's "Remarks on the Mind-Body Question", though it could describe other parts of Wigner's thinking; and it doesn't change the general discussion, which is about how some physicists (Wigner was not alone) used to invoke "minds & stuff" to make up for the inability to figure out QM.
… which perhaps contributed to the lull in reasoning on the interpretational foundations of QM?!? I mean, it really makes sense to react with "shut up and compute!" if on the opposite side of your desk they are debating whether a mouse can collapse the wave function…
That was the article with the famous "Wigner's friend" scenario.
Wigner's observation is that the statistics that are correct for him to describe his friend making an observation, of which Wigner himself does not know the result, are not achievable by the pure-state formulation of QM; they require a mixture, a statistical operator.
The departure was not so much that of the projection rule from unitary evolution as that of the mixed-state formulation from the pure-state formulation of QM (which he calls "orthodox QM").
He puzzles about this, and it is this that he proposes to be due to his friend's consciousness (with which we can't really agree).
I find interesting a couple of observations he makes in the article.
That the pure-state description of his friend plus object won't give statistics that are physically correct; the mixture does.
That in the statistical operator formulation of QM, the evolution from a pure state to the required mixture can be continuous.
It is not completely clear to me why the statistical operator formulation was deemed puzzling.
Obviously, though, Wigner could not rely on the work done in later years on decoherence.
The Wigner's-friend problem is, at least in part, connected to the problem of the correct statistics for an unobserved measurement: how do you represent your knowledge that a (specific) measurement has occurred, of which you don't know the result? The reasoning about the physically correct statistics is the same as Wigner's, but of course consciousness can't apply here, since the measurement device has none.
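The pure-state-vs-mixture point above can be seen in a few lines (my own construction, not Wigner's notation): the coherent superposition and the 50/50 mixture assign identical statistics to the measured observable, but they disagree on an interference observable.

```python
# Pure state vs mixture for a two-outcome measurement:
# same statistics on the "result" observable Z, different on interference X.
import numpy as np

ket0 = np.array([1.0, 0.0])
ket1 = np.array([0.0, 1.0])
plus = (ket0 + ket1) / np.sqrt(2)

rho_pure = np.outer(plus, plus)                                 # |+><+|
rho_mixed = 0.5 * np.outer(ket0, ket0) + 0.5 * np.outer(ket1, ket1)

Z = np.diag([1.0, -1.0])                 # "which result" observable
X = np.array([[0.0, 1.0], [1.0, 0.0]])   # interference observable

print(np.trace(rho_pure @ Z), np.trace(rho_mixed @ Z))  # both 0.0: indistinguishable
print(np.trace(rho_pure @ X), np.trace(rho_mixed @ X))  # 1.0 vs 0.0: distinguishable
```

This is why describing the unobserved friend by a mixture changes the physics in principle: an interference measurement on friend-plus-object would, in theory, tell the two descriptions apart.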
Ref. @ 15:00 ca., when David Albert describes the "measurement problem".
I'm trying to synthesize this formulation of the measurement problem as: since determinism fails in QM and indeterminism is "contagious" (by entanglement) at all levels, how do we reconcile QM with our determined experience?
Another form of the measurement problem is the question of the legitimacy of the projection rule for observations: whether and how its discontinuity reconciles with the continuous linear evolution of QM, and whether in general it is really consistent with locality, relativity, causality, etc.
The former is perhaps the wider formulation of the measurement problem, bordering on the whole interpretation problem. But the two are strictly connected.
To see this, consider the quantum logic (of probabilities): the projection rule is the conditionalization of quantum logic. Conditionalization is very much a requirement for a logic of probabilities intended as expectations. Now, conditionalization is in principle non-problematic IF the probabilities being conditionalized are solely epistemic, i.e. if their uncertainty is due solely to lack of knowledge. I say "in principle" because it still has to be consistent with the rest of our physical theory (which I think turns out to be the case for the projection rule and QM).
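For concreteness, the projection rule can be written as a conditionalization map on density matrices, the standard textbook form often called the Lüders rule (the qubit example below is my own):

```python
# Projection rule as conditionalization: rho -> P rho P / tr(P rho P).
import numpy as np

def conditionalize(rho, P):
    """Lüders-rule update of density matrix rho on the projector P
    corresponding to the observed measurement outcome."""
    post = P @ rho @ P
    return post / np.trace(post)

plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(plus, plus)     # the state |+><+|
P0 = np.diag([1.0, 0.0])       # projector onto outcome "0"

print(conditionalize(rho, P0))  # collapses to |0><0|
```

The comment's question is exactly whether this update may be read like Bayesian conditioning on newly learned information, which is unproblematic only if the prior uncertainty was purely epistemic.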
The probabilities of QM are probabilities about the results of observations; moreover, about determined results, according to the nature of our experience (or our naive understanding of it).
(Note that we are starting from "QM as we use it", which prescribes us to use this naive understanding, deemed "classical" understanding by CI.)
That is, we have solely epistemic probabilities IF uncertainty is entirely due to our lack of knowledge about the determination, more explicitly, the implication of solely epistemic probabilities is determination. Essentially, solely epistemic probabilities means a probability logic satisfying the ignorance interpretation of probabilities.
So the two formulations of the measurement problem are linked: determination would both correspond to our experience and allow an un-problematic conditionalization.
The general problem then stems from the failure of determinism in QM (or, QM does not satisfy determinism): uncertainty is not solely epistemic, it has an objective content in the sense that it is not that we don't know the determined value of the physical quantity, it is that the physical quantity is not determined beyond some level. It is the impossibility of the infinite removal of uncertainty that constitutes the failure of determinism in QM.
We know that determinism is satisfied by classical mechanics, and more specifically by probabilities that satisfy classical logic, because we can get to a non-probabilistic formulation of the mechanics as the limit of infinite removal of uncertainty. This assures us (analytically!) that classical probability logic satisfies the ignorance interpretation. In this case we are fine: the determined quantities of the theory match our experience, and conditionalization is solely epistemic: it does not affect a system in any way whether I peep or not.
The same criterion fails for QM: the limit of infinite removal of uncertainty does not exist; it does not converge within the theory.
BUT, even if CM were right, we never experience that limit: all the determinations we experience have a limit of resolution (we don't distinguish pi from all other nearby real numbers in our observations). Determinism is an infinite limit, and moreover a metaphysical one. So we don't actually need classical logic to remain valid all the way to the infinite, metaphysical limit to match our physical experience; we only need it to be valid within a scope, and then we can let it break down when we reach some (hopefully objective) resolution limit.
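One concrete instance of this objective, non-removable residue of uncertainty is the Heisenberg relation for position and momentum, which bounds the joint removal of uncertainty from below:

$$\sigma_x \, \sigma_p \;\ge\; \frac{\hbar}{2}$$

No limiting procedure within the theory drives both standard deviations to zero at once, which is one precise sense of the failure of the "infinite removal of uncertainty" described above.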
Both formulations of the measurement problem can be solved if QM gives us a scope of determination that matches our physical experience; within that scope, conditionalization would also be solely epistemic and for this reason unproblematic.
But determination is not fundamental, i.e. determinism fails. So how can we get to the needed determination if we can't get it directly, as a formulation of the mechanics that avoids probabilities, which again would be the determinism we can't have?
We get the necessary scope of determination by implication from our epistemic and probabilistic formulation of QM; we don't need anything more than this to solve the problem.
And the way we get this implication is already sketched above: determination is implied by the ignorance interpretation of probabilities, which in turn is analytically implied by classical logic.
If quantum logic has a feature by which, in our epistemic formulation of QM, classical logic is emergent (still within quantum logic), then we have the implied scope of determination that is all we need to solve the two versions of the measurement problem.
It has it, and it's decoherence.
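A toy dephasing model (entirely my own illustration, not from the comment) shows the mechanism being invoked: interaction with an environment exponentially suppresses the off-diagonal interference terms of the density matrix, leaving a diagonal state that behaves like a classical probability distribution; this is the sense in which classical logic is said to be emergent.

```python
# Toy dephasing: coherences decay exponentially, diagonal probabilities survive.
import numpy as np

plus = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(plus, plus)     # coherent superposition |+><+|

def dephase(rho, gamma, t):
    """Solution of a simple dephasing master equation: off-diagonal
    elements decay at rate gamma; diagonal elements are untouched."""
    out = rho.copy()
    out[0, 1] *= np.exp(-gamma * t)
    out[1, 0] *= np.exp(-gamma * t)
    return out

late = dephase(rho, gamma=1.0, t=50.0)
# Diagonal (the "classical" probabilities) remains 0.5/0.5; coherences ~ 0.
print(np.round(late, 6))
```

At late times the state is operationally indistinguishable from a classical coin, which is the decoherence-based route from quantum logic to an emergent classical scope of determination sketched in the comment.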
I think this is all the reasoning required to give the foundations of the interpretation of QM.
The very non-trivial rest (a modal-logic narration, the implied realism, the multiverse "ontology", how the orthodox formulation and even the quantum-classical complementarity of CI are both justified as practices and revolutionized in their fundamental meaning, possible implications for an emergent space-time extension, …) is the analysis of the consequences of this interpretation.
I really wish there was another, longer episode of you two talking about these topics. Fascinating, time just flies by so fast listening to it.
about 15:15 Seinfeld lives!
Sean is very smart at physics and I’ve read lots of his work, but I am starting to become disillusioned with his dead-reckoning approach. The many-worlds theory is an endless rabbit hole; each time we get closer to understanding what reality is, it makes itself even more complex. There is no bedrock to reality: our dreams don’t have a physical weight, there is no space for the universe to expand into, and if you travelled at light speed through a town and somehow were able to measure the size of your bedroom, it would be the size of a matchbox. Totally ridiculous. The universe is not there; we are all collapsing the wave function inside a simulation.
Do not listen to this while driving. It’s not boring but it is soothing. It’s like being a kid and hearing adults talk in the other room. You are not the audience here, you’re eavesdropping on two really smart guys talking. My guess is they are pretty good friends.
I want more!!!!
so much respect for your approach to all this Sean <3 just so on point
Forgive me, but I know quantum mathematics is very successful, you guys are extremely clever people, and I am just an average person. Listening to the logic behind the theory, it surprises me how successful it is. I like the idea that everything is a wave function, but shouldn’t it be a 3D wave function, and shouldn’t the interactions be wave interactions? What I mean is: isn’t a question such as "Where is the electron?" the wrong kind of question? Shouldn’t it be: how do the various wave functions react to each other, for example an electron and a proton, or two electrons, or a quark? All the components must be 3D wave functions too, and not treated as having a certain strength and a certain spin. I’m sorry if I’m not very clear. Thank you for your podcasts; they are very interesting.