Quantum decoherence

In classical scattering of a target body by environmental photons, the motion of the target body will not be changed by the scattered photons on the average. In quantum scattering, the interaction between the scattered photons and the superposed target body will cause them to be entangled, thereby delocalizing the phase coherence from the target body to the whole system, rendering the interference pattern unobservable.

Quantum decoherence is the loss of quantum coherence. In quantum mechanics, particles such as electrons are described by a wave function, a mathematical representation of the quantum state of a system; a probabilistic interpretation of the wave function is used to explain various quantum effects. As long as there exists a definite phase relation between different states, the system is said to be coherent. A definite phase relationship is necessary to perform quantum computing on quantum information encoded in quantum states. Coherence is preserved under the laws of quantum physics.

If a quantum system were perfectly isolated, it would maintain coherence indefinitely, but it would be impossible to manipulate or investigate it. If it is not perfectly isolated, for example during a measurement, coherence is shared with the environment and appears to be lost with time, a process called quantum decoherence. As a result of this process, quantum behavior is apparently lost, just as energy appears to be lost by friction in classical mechanics.

Decoherence was first introduced in 1970 by the German physicist H. Dieter Zeh[1] and has been a subject of active research since the 1980s.[2] Decoherence has been developed into a complete framework, but it does not solve the measurement problem, as the founders of decoherence theory admit in their seminal papers.[3]

Decoherence can be viewed as the loss of information from a system into the environment (often modeled as a heat bath),[4] since every system is loosely coupled with the energetic state of its surroundings. Viewed in isolation, the system's dynamics are non-unitary (although the combined system plus environment evolves in a unitary fashion).[5] Thus the dynamics of the system alone are irreversible. As with any coupling, entanglements are generated between the system and environment. These have the effect of sharing quantum information with—or transferring it to—the surroundings.

Decoherence has been used to understand the collapse of the wave function in quantum mechanics. Decoherence does not generate actual wave-function collapse. It only provides an explanation for apparent wave-function collapse, as the quantum nature of the system "leaks" into the environment. That is, components of the wave function are decoupled from a coherent system and acquire phases from their immediate surroundings. A total superposition of the global or universal wavefunction still exists (and remains coherent at the global level), but its ultimate fate remains an interpretational issue. Specifically, decoherence does not attempt to explain the measurement problem. Rather, decoherence provides an explanation for the transition of the system to a mixture of states that seem to correspond to those states observers perceive. Moreover, our observation tells us that this mixture looks like a proper quantum ensemble in a measurement situation, as we observe that measurements lead to the "realization" of precisely one state in the "ensemble".

Decoherence represents a challenge for the practical realization of quantum computers, since such machines are expected to rely heavily on the undisturbed evolution of quantum coherences. Simply put, they require that the coherence of states be preserved and that decoherence is managed, in order to actually perform quantum computation. The preservation of coherence, and mitigation of decoherence effects, are thus related to the concept of quantum error correction.

Mechanisms

To examine how decoherence operates, an "intuitive" model is presented. The model requires some familiarity with quantum theory basics. Analogies are made between visualisable classical phase spaces and Hilbert spaces. A more rigorous derivation in Dirac notation shows how decoherence destroys interference effects and the "quantum nature" of systems. Next, the density matrix approach is presented for perspective.

Quantum superposition of states and decoherence measurement through Rabi oscillations

Phase-space picture

An N-particle system can be represented in non-relativistic quantum mechanics by a wave function $\psi(x_1, x_2, \dots, x_N)$, where each $x_i$ is a point in 3-dimensional space. This has analogies with the classical phase space. A classical phase space contains a real-valued function in 6N dimensions (each particle contributes 3 spatial coordinates and 3 momenta). Our "quantum" phase space, on the other hand, involves a complex-valued function on a 3N-dimensional space. The position and momenta are represented by operators that do not commute, and $\psi$ lives in the mathematical structure of a Hilbert space. Aside from these differences, however, the rough analogy holds.

Different previously isolated, non-interacting systems occupy different phase spaces. Alternatively we can say that they occupy different lower-dimensional subspaces in the phase space of the joint system. The effective dimensionality of a system's phase space is the number of degrees of freedom present, which—in non-relativistic models—is 6 times the number of a system's free particles. For a macroscopic system this will be a very large dimensionality. When two systems (and the environment would be a system) start to interact, though, their associated state vectors are no longer constrained to the subspaces. Instead the combined state vector evolves along a path through the "larger volume", whose dimensionality is the sum of the dimensions of the two subspaces. The extent to which two vectors interfere with each other is a measure of how "close" they are to each other (formally, their overlap or Hilbert-space scalar product) in the phase space. When a system couples to an external environment, the dimensionality of, and hence "volume" available to, the joint state vector increases enormously. Each environmental degree of freedom contributes an extra dimension.

The original system's wave function can be expanded in many different ways as a sum of elements in a quantum superposition. Each expansion corresponds to a projection of the wave vector onto a basis. The basis can be chosen at will. Let us choose an expansion where the resulting basis elements interact with the environment in an element-specific way. Such elements will—with overwhelming probability—be rapidly separated from each other by their natural unitary time evolution along their own independent paths. After a very short interaction, there is almost no chance of any further interference. The process is effectively irreversible. The different elements effectively become "lost" from each other in the expanded phase space created by coupling with the environment; in phase space, this decoupling is monitored through the Wigner quasi-probability distribution. The original elements are said to have decohered. The environment has effectively selected out those expansions or decompositions of the original state vector that decohere (or lose phase coherence) with each other. This is called "environmentally-induced superselection", or einselection.[6] The decohered elements of the system no longer exhibit quantum interference between each other, as in a double-slit experiment. Any elements that decohere from each other via environmental interactions are said to be quantum-entangled with the environment. The converse is not true: not all entangled states are decohered from each other.

Any measuring device or apparatus acts as an environment, since at some stage along the measuring chain, it has to be large enough to be read by humans. It must possess a very large number of hidden degrees of freedom. In effect, the interactions may be considered to be quantum measurements. As a result of an interaction, the wave functions of the system and the measuring device become entangled with each other. Decoherence happens when different portions of the system's wave function become entangled in different ways with the measuring device. For two einselected elements of the entangled system's state to interfere, both the original system and the measuring device in both elements must significantly overlap, in the scalar-product sense. If the measuring device has many degrees of freedom, it is very unlikely for this to happen.

As a consequence, the system behaves as a classical statistical ensemble of the different elements rather than as a single coherent quantum superposition of them. From the perspective of each ensemble member's measuring device, the system appears to have irreversibly collapsed onto a state with a precise value for the measured attributes, relative to that element.

Dirac notation

Using Dirac notation, let the system initially be in the state

$|\psi\rangle = \sum_i |i\rangle \langle i|\psi\rangle,$

where the $|i\rangle$ form an einselected basis (environmentally induced selected eigenbasis[6]), and let the environment initially be in the state $|\epsilon\rangle$. The vector basis of the combination of the system and the environment consists of the tensor products of the basis vectors of the two subsystems. Thus, before any interaction between the two subsystems, the joint state can be written as

$|\text{before}\rangle = \sum_i |i\rangle |\epsilon\rangle \langle i|\psi\rangle,$

where $|i\rangle |\epsilon\rangle$ is shorthand for the tensor product $|i\rangle \otimes |\epsilon\rangle$. There are two extremes in the way the system can interact with its environment: either (1) the system loses its distinct identity and merges with the environment (e.g. photons in a cold, dark cavity get converted into molecular excitations within the cavity walls), or (2) the system is not disturbed at all, even though the environment is disturbed (e.g. the idealized non-disturbing measurement). In general, an interaction is a mixture of these two extremes that we examine.
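
As an illustration of the joint-state construction above, the following Python sketch (illustrative only; the two-level system, the specific amplitudes and the two-dimensional environment are assumptions, not from the article) builds $|\text{before}\rangle$ with NumPy's Kronecker product and checks that it equals $|\psi\rangle \otimes |\epsilon\rangle$:

```python
import numpy as np

# Assumed illustrative states: a qubit system and a two-dimensional environment.
psi = np.array([0.6, 0.8], dtype=complex)    # |psi> = 0.6|0> + 0.8|1>
eps = np.array([1.0, 0.0], dtype=complex)    # environment state |epsilon>
basis = np.eye(2, dtype=complex)             # einselected basis |0>, |1>

# |before> = sum_i |i>|epsilon><i|psi>, built term by term with the tensor product.
before = sum(psi[i] * np.kron(basis[i], eps) for i in range(2))

# It coincides with the direct product |psi> (x) |epsilon>: no entanglement yet.
assert np.allclose(before, np.kron(psi, eps))
print(before)
```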

System absorbed by environment

If the environment absorbs the system, each element of the total system's basis interacts with the environment such that

$|i\rangle |\epsilon\rangle$

evolves into

$|\epsilon_i\rangle,$

and so

$|\text{before}\rangle = \sum_i |i\rangle |\epsilon\rangle \langle i|\psi\rangle$

evolves into

$|\text{after}\rangle = \sum_i |\epsilon_i\rangle \langle i|\psi\rangle.$

The unitarity of time evolution demands that the total state basis remains orthonormal, i.e. the scalar or inner products of the basis vectors must vanish, since $\langle i|j\rangle = \delta_{ij}$:

$\langle \epsilon_i | \epsilon_j \rangle = \delta_{ij}.$

This orthonormality of the environment states is the defining characteristic required for einselection.[6]

System not disturbed by environment

In an idealised measurement, the system disturbs the environment, but is itself undisturbed by the environment. In this case, each element of the basis interacts with the environment such that

$|i\rangle |\epsilon\rangle$

evolves into the product

$|i\rangle |\epsilon_i\rangle,$

and so

$|\text{before}\rangle = \sum_i |i\rangle |\epsilon\rangle \langle i|\psi\rangle$

evolves into

$|\text{after}\rangle = \sum_i |i\rangle |\epsilon_i\rangle \langle i|\psi\rangle.$

In this case, unitarity demands that

$\langle\text{after}|\text{after}\rangle = \sum_i |\langle i|\psi\rangle|^2 \, \langle \epsilon_i | \epsilon_i \rangle = 1,$

so that $\langle \epsilon_i | \epsilon_i \rangle = 1$, where $\langle i|j\rangle = \delta_{ij}$ was used. Additionally, decoherence requires, by virtue of the large number of hidden degrees of freedom in the environment, that

$\langle \epsilon_i | \epsilon_j \rangle \approx \delta_{ij}.$

As before, this is the defining characteristic for decoherence to become einselection.[6] The approximation becomes more exact as the number of environmental degrees of freedom affected increases.

Note that if the system basis $|i\rangle$ were not an einselected basis, then the last condition is trivial, since the disturbed environment is not a function of $i$, and we have the trivial disturbed-environment basis $|\epsilon_j\rangle = |\epsilon'\rangle$. This would correspond to the system basis being degenerate with respect to the environmentally defined measurement observable. For a complex environmental interaction (which would be expected for a typical macroscale interaction) a non-einselected basis would be hard to define.
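
The assumption $\langle\epsilon_i|\epsilon_j\rangle \approx \delta_{ij}$ is plausible because states of a large environment live in a very high-dimensional Hilbert space, where independently chosen vectors are almost orthogonal. A small numerical illustration (the dimensions and sample count are arbitrary assumptions, not from the article):

```python
import numpy as np

rng = np.random.default_rng(0)

def random_state(dim):
    """A random pure state: complex Gaussian vector, normalized."""
    v = rng.normal(size=dim) + 1j * rng.normal(size=dim)
    return v / np.linalg.norm(v)

# The typical overlap |<eps_i|eps_j>| of two independent states shrinks
# roughly like 1/sqrt(dim) as the environment dimension grows.
for dim in [2, 16, 256, 4096]:
    overlaps = [abs(np.vdot(random_state(dim), random_state(dim))) for _ in range(100)]
    print(dim, np.mean(overlaps))
```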

Loss of interference and the transition from quantum to classical probabilities

The utility of decoherence lies in its application to the analysis of probabilities, before and after environmental interaction, and in particular to the vanishing of quantum interference terms after decoherence has occurred. If we ask what is the probability of observing the system making a transition from $|\psi\rangle$ to $|\phi\rangle$ before $|\psi\rangle$ has interacted with its environment, then application of the Born probability rule states that the transition probability is the squared modulus of the scalar product of the two states:

$\text{prob}_\text{before}(\psi \to \phi) = |\langle\psi|\phi\rangle|^2 = \left| \sum_i \psi_i^* \phi_i \right|^2 = \sum_i |\psi_i^* \phi_i|^2 + \sum_{ij;\, i \neq j} \psi_i^* \psi_j \phi_j^* \phi_i,$

where $\psi_i = \langle i|\psi\rangle$, $\phi_i = \langle i|\phi\rangle$, and $\psi_i^* = \langle\psi|i\rangle$ etc.

The above expansion of the transition probability has terms that involve $i \neq j$; these can be thought of as representing interference between the different basis elements or quantum alternatives. This is a purely quantum effect and represents the non-additivity of the probabilities of quantum alternatives.

To calculate the probability of observing the system making a quantum leap from $|\psi\rangle$ to $|\phi\rangle$ after $|\psi\rangle$ has interacted with its environment, application of the Born probability rule states that we must sum over all the relevant possible states $|\epsilon_j\rangle$ of the environment before squaring the modulus:

$\text{prob}_\text{after}(\psi \to \phi) = \sum_j \left| \langle\epsilon_j| \langle\phi| \; |\text{after}\rangle \right|^2 = \sum_j \left| \sum_i \psi_i \phi_i^* \langle\epsilon_j|\epsilon_i\rangle \right|^2.$

The internal summation vanishes when we apply the decoherence/einselection condition $\langle\epsilon_i|\epsilon_j\rangle \approx \delta_{ij}$, and the formula simplifies to

$\text{prob}_\text{after}(\psi \to \phi) \approx \sum_j |\psi_j \phi_j^*|^2 = \sum_i |\psi_i^* \phi_i|^2.$

If we compare this with the formula we derived before the environment introduced decoherence, we can see that the effect of decoherence has been to move the summation sign $\sum_i$ from inside the modulus signs to outside. As a result, all the cross- or quantum interference terms

$\sum_{ij;\, i \neq j} \psi_i^* \psi_j \phi_j^* \phi_i$

have vanished from the transition-probability calculation. The decoherence has irreversibly converted quantum behaviour (additive probability amplitudes) to classical behaviour (additive probabilities).[6][7][8]

In terms of density matrices, the loss of interference effects corresponds to the diagonalization of the "environmentally traced-over" density matrix.[6]
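
The vanishing of the cross terms can be checked numerically. The sketch below (an illustration with assumed two-level states, not taken from the article) compares the coherent transition probability $|\langle\phi|\psi\rangle|^2$ with the decohered sum $\sum_i |\psi_i^* \phi_i|^2$ and verifies that their difference is exactly the interference term:

```python
import numpy as np

# Assumed illustrative amplitudes in a two-dimensional einselected basis.
psi = np.array([0.6, 0.8], dtype=complex)
phi = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)

# Before interaction: amplitudes add first, then the modulus is squared.
prob_before = abs(np.vdot(phi, psi)) ** 2

# After decoherence with <eps_i|eps_j> = delta_ij: probabilities add.
prob_after = sum(abs(np.conj(psi[i]) * phi[i]) ** 2 for i in range(2))

# The difference is the cross (interference) term.
cross = sum(np.conj(psi[i]) * psi[j] * np.conj(phi[j]) * phi[i]
            for i in range(2) for j in range(2) if i != j)
print(prob_before, prob_after, cross.real)
assert np.isclose(prob_before, prob_after + cross.real)
```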

Density-matrix approach

The effect of decoherence on density matrices is essentially the decay or rapid vanishing of the off-diagonal elements of the partial trace of the joint system's density matrix, i.e. the trace, with respect to any environmental basis, of the density matrix of the combined system and its environment. The decoherence irreversibly converts the "averaged" or "environmentally traced-over"[6] density matrix from a pure state to a reduced mixture; it is this that gives the appearance of wave-function collapse. Again, this is called "environmentally induced superselection", or einselection.[6] The advantage of taking the partial trace is that this procedure is indifferent to the environmental basis chosen.

Initially, the density matrix of the combined system can be denoted as

$\rho_\text{before} = |\text{before}\rangle\langle\text{before}| = |\psi\rangle\langle\psi| \otimes |\epsilon\rangle\langle\epsilon|,$

where $|\epsilon\rangle$ is the state of the environment. Then if the transition happens before any interaction takes place between the system and the environment, the environment subsystem has no part and can be traced out, leaving the reduced density matrix for the system:

$\rho_\text{sys} = \operatorname{Tr}_\text{env}(\rho_\text{before}) = |\psi\rangle\langle\psi| = \sum_{i,j} \psi_i \psi_j^* |i\rangle\langle j|.$

Now the transition probability will be given as

$\text{prob}_\text{before}(\psi \to \phi) = \langle\phi|\rho_\text{sys}|\phi\rangle = \sum_i |\psi_i^* \phi_i|^2 + \sum_{ij;\, i \neq j} \psi_i^* \psi_j \phi_j^* \phi_i,$

where $\psi_i = \langle i|\psi\rangle$, $\phi_i = \langle i|\phi\rangle$, and $\psi_i^* = \langle\psi|i\rangle$ etc.

Now consider the case where the transition takes place after the interaction of the system with the environment. The combined density matrix will be

$\rho_\text{after} = |\text{after}\rangle\langle\text{after}| = \sum_{i,j} \psi_i \psi_j^* |i\rangle\langle j| \otimes |\epsilon_i\rangle\langle\epsilon_j|.$

To get the reduced density matrix of the system, we trace out the environment and employ the decoherence/einselection condition $\langle\epsilon_i|\epsilon_j\rangle \approx \delta_{ij}$ and see that the off-diagonal terms vanish (a result obtained by Erich Joos and H. D. Zeh in 1985):[9]

$\rho_\text{sys} = \operatorname{Tr}_\text{env}(\rho_\text{after}) = \sum_{i,j} \psi_i \psi_j^* \langle\epsilon_j|\epsilon_i\rangle\, |i\rangle\langle j| \approx \sum_i |\psi_i|^2 |i\rangle\langle i|.$

Similarly, the final reduced density matrix after the transition will be

$\sum_j |\phi_j|^2 |j\rangle\langle j|.$

The transition probability will then be given as

$\text{prob}_\text{after}(\psi \to \phi) = \sum_{i,j} |\psi_i|^2 |\phi_j|^2\, |\langle i|j\rangle|^2 = \sum_i |\psi_i^* \phi_i|^2,$

which has no contribution from the interference terms

$\sum_{ij;\, i \neq j} \psi_i^* \psi_j \phi_j^* \phi_i.$

The density-matrix approach has been combined with the Bohmian approach to yield a reduced-trajectory approach, taking into account the system reduced density matrix and the influence of the environment.[10]
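
The diagonalization of the environmentally traced-over density matrix can also be seen directly in a small numerical example (illustrative only; the amplitudes and the two-dimensional environment are assumptions). With orthogonal environment record states, the partial trace of $|\text{after}\rangle\langle\text{after}|$ has no off-diagonal elements:

```python
import numpy as np

# Assumed amplitudes psi_i = <i|psi> and orthogonal environment "record" states.
psi = np.array([0.6, 0.8], dtype=complex)
eps = [np.array([1, 0], dtype=complex),     # |eps_0>
       np.array([0, 1], dtype=complex)]     # |eps_1>, orthogonal to |eps_0>
sys_basis = np.eye(2, dtype=complex)

# |after> = sum_i psi_i |i> (x) |eps_i>, and rho_after = |after><after|
after = sum(psi[i] * np.kron(sys_basis[i], eps[i]) for i in range(2))
rho_after = np.outer(after, after.conj())

# Partial trace over the environment: rho_sys[s, s'] = sum_e rho[s, e, s', e]
rho_sys = np.einsum('aebe->ab', rho_after.reshape(2, 2, 2, 2))
print(np.round(rho_sys.real, 3))   # diag(0.36, 0.64): the off-diagonal terms vanish
```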

Operator-sum representation

Consider a system S and environment (bath) B, which are closed and can be treated quantum-mechanically. Let $\mathcal{H}_S$ and $\mathcal{H}_B$ be the system's and bath's Hilbert spaces respectively. Then the Hamiltonian for the combined system is

$\hat{H} = \hat{H}_S \otimes \hat{I}_B + \hat{I}_S \otimes \hat{H}_B + \hat{H}_I,$

where $\hat{H}_S, \hat{H}_B$ are the system and bath Hamiltonians respectively, $\hat{H}_I$ is the interaction Hamiltonian between the system and bath, and $\hat{I}_S, \hat{I}_B$ are the identity operators on the system and bath Hilbert spaces respectively. The time-evolution of the density operator of this closed system is unitary and, as such, is given by

$\rho_{SB}(t) = \hat{U}(t)\, \rho_{SB}(0)\, \hat{U}^\dagger(t),$

where the unitary operator is $\hat{U}(t) = e^{-i \hat{H} t/\hbar}$. If the system and bath are not entangled initially, then we can write $\rho_{SB}(0) = \rho_S(0) \otimes \rho_B(0)$. Therefore, the evolution of the system becomes

$\rho_{SB}(t) = \hat{U}(t)\, [\rho_S(0) \otimes \rho_B(0)]\, \hat{U}^\dagger(t).$

The system–bath interaction Hamiltonian can be written in a general form as

$\hat{H}_I = \sum_i \hat{S}_i \otimes \hat{B}_i,$

where $\hat{S}_i \otimes \hat{B}_i$ is the operator acting on the combined system–bath Hilbert space, and $\hat{S}_i, \hat{B}_i$ are the operators that act on the system and bath respectively. This coupling of the system and bath is the cause of decoherence in the system alone. To see this, a partial trace is performed over the bath to give a description of the system alone:

$\rho_S(t) = \operatorname{Tr}_B\big[\hat{U}(t)\, [\rho_S(0) \otimes \rho_B(0)]\, \hat{U}^\dagger(t)\big].$

$\rho_S(t)$ is called the reduced density matrix and gives information about the system only. If the bath is written in terms of its set of orthogonal basis kets, that is, if it has been initially diagonalized, then $\rho_B(0) = \sum_j a_j |j\rangle\langle j|$. Computing the partial trace with respect to this (computational) basis gives

$\rho_S(t) = \sum_l \hat{A}_l\, \rho_S(0)\, \hat{A}_l^\dagger,$

where the $\hat{A}_l$ are defined as the Kraus operators and are represented as (with $l = (j, k)$)

$\hat{A}_l = \sqrt{a_j}\, \langle k|\hat{U}(t)|j\rangle.$

This is known as the operator-sum representation (OSR). A condition on the Kraus operators can be obtained by using the fact that $\operatorname{Tr}[\rho_S(t)] = 1$; this then gives

$\sum_l \hat{A}_l^\dagger \hat{A}_l = \hat{I}_S.$

This restriction determines whether decoherence will occur or not in the OSR. In particular, when there is more than one term present in the sum for $\rho_S(t)$, then the dynamics of the system will be non-unitary, and hence decoherence will take place.
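
As a concrete sketch of the operator-sum representation (illustrative only; the phase-damping channel is a standard textbook example and the strength p is an assumption, not something specified in this article), the code below checks the completeness condition and applies the map to a superposition state:

```python
import numpy as np

p = 0.3   # assumed dephasing probability

# Kraus operators of the phase-damping channel (standard textbook example).
A0 = np.sqrt(1 - p) * np.eye(2)
A1 = np.sqrt(p) * np.diag([1.0, 0.0])
A2 = np.sqrt(p) * np.diag([0.0, 1.0])
kraus = [A0, A1, A2]

# Completeness condition: sum_l A_l^dagger A_l = I
assert np.allclose(sum(A.conj().T @ A for A in kraus), np.eye(2))

# Apply the operator-sum map to a pure superposition state.
psi = np.array([1.0, 1.0]) / np.sqrt(2)
rho = np.outer(psi, psi.conj())
rho_out = sum(A @ rho @ A.conj().T for A in kraus)
print(np.round(rho_out, 3))   # the off-diagonal elements shrink by a factor (1 - p)
```

Because more than one Kraus operator appears in the sum, the map is non-unitary and the coherences decay, in line with the criterion stated above.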

Semigroup approach

A more general consideration for the existence of decoherence in a quantum system is given by the master equation, which determines how the density matrix of the system alone evolves in time (see also the Belavkin equation[11][12][13] for the evolution under continuous measurement). This uses the Schrödinger picture, where evolution of the state (represented by its density matrix) is considered. The master equation is

$\frac{d}{dt} \rho_S(t) = -\frac{i}{\hbar} \big[\tilde{H}_S, \rho_S(t)\big] + L_D[\rho_S(t)],$

where $\tilde{H}_S$ is the system Hamiltonian along with a (possible) unitary contribution from the bath, and $L_D$ is the Lindblad decohering term.[5] The Lindblad decohering term is represented as

$L_D[\rho_S(t)] = \frac{1}{2} \sum_{\alpha, \beta = 1}^{M} b_{\alpha\beta} \Big( \big[\mathbf{F}_\alpha, \rho_S(t)\, \mathbf{F}_\beta^\dagger\big] + \big[\mathbf{F}_\alpha\, \rho_S(t), \mathbf{F}_\beta^\dagger\big] \Big).$

The $\mathbf{F}_\alpha$ are basis operators for the M-dimensional space of bounded operators that act on the system Hilbert space $\mathcal{H}_S$ and are the error generators.[14] The matrix elements $b_{\alpha\beta}$ represent the elements of a positive semi-definite Hermitian matrix; they characterize the decohering processes and, as such, are called the noise parameters.[14] The semigroup approach is particularly nice, because it distinguishes between the unitary and decohering (non-unitary) processes, which is not the case with the OSR. In particular, the non-unitary dynamics are represented by $L_D$, whereas the unitary dynamics of the state are represented by the usual Heisenberg commutator. Note that when $L_D[\rho_S(t)] = 0$, the dynamical evolution of the system is unitary. The conditions for the evolution of the system density matrix to be described by the master equation are:[5]

  1. the evolution of the system density matrix is determined by a one-parameter semigroup,
  2. the evolution is "completely positive" (i.e. probabilities are preserved),
  3. the system and bath density matrices are initially decoupled.
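
A rough numerical sketch of the master equation for a single qubit (illustrative only; the single error generator $\mathbf{F} = \sigma_z$, the rate $\gamma$, and the simple Euler integration are assumptions, not taken from the article):

```python
import numpy as np

# Assumed parameters for a pure-dephasing example.
hbar = 1.0
gamma = 0.5                                 # noise parameter (dephasing rate)
H = np.diag([0.0, 1.0])                     # system Hamiltonian
sz = np.diag([1.0, -1.0])                   # single error generator F = sigma_z

def master_rhs(rho):
    """d(rho)/dt = -i/hbar [H, rho] + gamma (sz rho sz - rho)  (pure dephasing)."""
    unitary = (-1j / hbar) * (H @ rho - rho @ H)
    decohering = gamma * (sz @ rho @ sz - rho)
    return unitary + decohering

# Euler-integrate from an equal superposition.
psi = np.array([1.0, 1.0], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())
dt, steps = 0.001, 2000
for _ in range(steps):
    rho = rho + dt * master_rhs(rho)

# The coherence |rho_01| decays roughly as exp(-2*gamma*t); the populations stay fixed.
t = dt * steps
print(abs(rho[0, 1]), 0.5 * np.exp(-2 * gamma * t))
```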

Examples of non-unitary modelling of decoherence

Decoherence can be modelled as a non-unitary process by which a system couples with its environment (although the combined system plus environment evolves in a unitary fashion).[5] Thus the dynamics of the system alone, treated in isolation, are non-unitary and, as such, are represented by irreversible transformations acting on the system's Hilbert space $\mathcal{H}$. Since the system's dynamics are represented by irreversible transformations, any information present in the quantum system can be lost to the environment or heat bath. Alternatively, the decay of quantum information caused by the coupling of the system to the environment is referred to as decoherence.[4] Thus decoherence is the process by which information of a quantum system is altered by the system's interaction with its environment (which together form a closed system), hence creating an entanglement between the system and heat bath (environment). As such, since the system is entangled with its environment in some unknown way, a description of the system by itself cannot be made without also referring to the environment (i.e. without also describing the state of the environment).

Rotational decoherence

Consider a system of N qubits that is coupled to a bath symmetrically. Suppose this system of N qubits undergoes a rotation around the eigenstates of $J_z$. Then under such a rotation, a random phase $\phi$ will be created between the eigenstates $|0\rangle$, $|1\rangle$ of $J_z$. Thus these basis qubits $|0\rangle$ and $|1\rangle$ will transform in the following way:

$|0\rangle \to |0\rangle, \qquad |1\rangle \to e^{i\phi} |1\rangle.$

This transformation is performed by the rotation operator

$R_z(\phi) = |0\rangle\langle 0| + e^{i\phi} |1\rangle\langle 1|.$

Since any qubit in this space can be expressed in terms of the basis qubits, then all such qubits will be transformed under this rotation. Consider a qubit in a pure state $|\psi\rangle = a|0\rangle + b|1\rangle$. This state will decohere, since it is not "encoded" with the dephasing factor $e^{i\phi}$. This can be seen by examining the density matrix averaged over all values of the random phase $\phi$:

$\rho = \int R_z(\phi)\, |\psi\rangle\langle\psi|\, R_z^\dagger(\phi)\; p(\phi)\, d\phi,$

where $p(\phi)$ is a probability density. If $p(\phi)$ is given as a Gaussian distribution

$p(\phi) = \frac{1}{\sqrt{2\pi}\,\sigma}\, e^{-\phi^2/(2\sigma^2)},$

then the density matrix is

$\rho = \begin{pmatrix} |a|^2 & a b^* e^{-\sigma^2/2} \\ a^* b\, e^{-\sigma^2/2} & |b|^2 \end{pmatrix}.$

Since the off-diagonal elements—the coherence terms—decay for increasing $\sigma$, then the density matrices for the various qubits of the system will be indistinguishable. This means that no measurement can distinguish between the qubits, thus creating decoherence between the various qubit states. In particular, this dephasing process causes the qubits to collapse onto the $z$ axis. This is why this type of decoherence process is called collective dephasing, because the mutual phases between all qubits of the N-qubit system are destroyed.
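
A Monte Carlo version of this averaging (illustrative only; the amplitudes, the width of the phase distribution, and the sample size are assumptions) reproduces the $e^{-\sigma^2/2}$ suppression of the off-diagonal elements:

```python
import numpy as np

rng = np.random.default_rng(1)
a, b = 0.6, 0.8      # assumed real amplitudes of |psi> = a|0> + b|1>
sigma = 1.5          # assumed standard deviation of the random phase

# Average R_z(phi)|psi><psi|R_z(phi)^dagger over Gaussian-distributed phases phi.
phis = rng.normal(0.0, sigma, size=200_000)
coherence = np.mean(np.exp(-1j * phis))      # estimates <e^{-i phi}> = e^{-sigma^2/2}
rho_avg = np.array([[a * a,                      a * b * coherence],
                    [a * b * np.conj(coherence), b * b]])

# The off-diagonal element is suppressed by e^{-sigma^2/2} relative to a*b.
print(abs(rho_avg[0, 1]), a * b * np.exp(-sigma**2 / 2))   # both approximately 0.156
```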

Depolarizing

Depolarizing is a non-unitary transformation on a quantum system that maps pure states to mixed states. This is a non-unitary process, because any transformation that reverses this process will map states out of their respective Hilbert space, thus not preserving positivity (i.e. the original probabilities are mapped to negative probabilities, which is not allowed). The 2-dimensional case of such a transformation would consist of mapping pure states on the surface of the Bloch sphere to mixed states within the Bloch sphere. This would contract the Bloch sphere by some finite amount, and the reverse process would expand the Bloch sphere, which cannot happen.
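
A short sketch of depolarization acting on one qubit (illustrative only; the Kraus form used is the standard depolarizing channel, and the strength p is an assumption). A pure state on the surface of the Bloch sphere is mapped to a mixed state strictly inside it:

```python
import numpy as np

p = 0.3   # assumed depolarizing strength

# Pauli matrices.
X = np.array([[0, 1], [1, 0]], dtype=complex)
Y = np.array([[0, -1j], [1j, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)

def depolarize(rho, p):
    """Standard depolarizing channel: rho -> (1-p) rho + (p/3)(X rho X + Y rho Y + Z rho Z)."""
    return (1 - p) * rho + (p / 3) * (X @ rho @ X + Y @ rho @ Y + Z @ rho @ Z)

def bloch_vector(rho):
    return np.real([np.trace(rho @ X), np.trace(rho @ Y), np.trace(rho @ Z)])

# The |+> state sits on the Bloch sphere surface; the channel contracts its Bloch vector.
plus = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(plus, plus.conj())
print(np.linalg.norm(bloch_vector(rho)))                  # 1.0  (pure state, on the surface)
print(np.linalg.norm(bloch_vector(depolarize(rho, p))))   # 1 - 4p/3 = 0.6  (mixed, inside)
```

Reversing the map would require expanding Bloch vectors of length 1 beyond the unit sphere, which is exactly the loss of positivity described above.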

Dissipation

Dissipation is a decohering process by which the populations of quantum states are changed due to entanglement with a bath. An example of this would be a quantum system that can exchange its energy with a bath through the interaction Hamiltonian. If the system is not in its ground state and the bath is at a temperature lower than that of the system, then the system will give off energy to the bath, and thus higher-energy eigenstates of the system Hamiltonian will decay to the ground state after cooling. Because the bath carries away the information about which excited state the system occupied, different initial states become indistinguishable after the decay, and thus this process is irreversible (non-unitary).
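
As a hedged numerical sketch of such energy exchange (illustrative only; the zero-temperature amplitude-damping channel and the rate $\gamma$ are textbook stand-ins, not the article's specific model), repeated contact with a cold bath drives the population into the ground state while the coherences also decay:

```python
import numpy as np

gamma = 0.4   # assumed probability of losing the excitation per contact with the bath

# Amplitude-damping Kraus operators (standard textbook form, zero-temperature bath).
K0 = np.array([[1, 0], [0, np.sqrt(1 - gamma)]], dtype=complex)
K1 = np.array([[0, np.sqrt(gamma)], [0, 0]], dtype=complex)

def amplitude_damp(rho):
    return K0 @ rho @ K0.conj().T + K1 @ rho @ K1.conj().T

# Start in an equal superposition of ground |0> and excited |1>.
psi = np.array([1, 1], dtype=complex) / np.sqrt(2)
rho = np.outer(psi, psi.conj())
for _ in range(5):                 # repeated weak contacts with the cold bath
    rho = amplitude_damp(rho)
print(np.round(rho.real, 3))       # population accumulates in the ground state
```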

Timescales

Decoherence represents an extremely fast process for macroscopic objects, since these are interacting with many microscopic objects, with an enormous number of degrees of freedom, in their natural environment. The process explains why we tend not to observe quantum behaviour in everyday macroscopic objects. It also explains why we do see classical fields emerge from the properties of the interaction between matter and radiation for large amounts of matter. The time taken for off-diagonal components of the density matrix to effectively vanish is called the decoherence time. It is typically extremely short for everyday, macroscale processes.[6][7][8]

Measurement

The discontinuous "wave-function collapse" postulated in the Copenhagen interpretation to enable the theory to be related to the results of laboratory measurements cannot be understood as an aspect of the normal dynamics of quantum mechanics via the decoherence process. Decoherence is an important part of some modern refinements of the Copenhagen interpretation. Decoherence shows how a macroscopic system interacting with many microscopic systems (e.g. collisions with air molecules or photons) moves from being in a pure quantum state—which in general will be a coherent superposition (see Schrödinger's cat)—to being in an incoherent improper mixture of these states. The weighting of each outcome in the mixture in case of measurement is exactly that which gives the probabilities of the different results of such a measurement.

However, decoherence by itself may not give a complete solution of the measurement problem, since all components of the wave function still exist in a global superposition, which is explicitly acknowledged in the many-worlds interpretation. All decoherence explains, in this view, is why these coherences are no longer available for inspection by local observers. To present a solution to the measurement problem in most interpretations of quantum mechanics, decoherence must be supplied with some nontrivial interpretational considerations (as for example Wojciech Zurek tends to do in his existential interpretation). However, according to Everett and DeWitt, the many-worlds interpretation can be derived from the formalism alone, in which case no extra interpretational layer is required.

Mathematical details

We assume for the moment that the system in question consists of a subsystem A being studied and the "environment" $\epsilon$, and the total Hilbert space is the tensor product of a Hilbert space $\mathcal{H}_A$ describing A and a Hilbert space $\mathcal{H}_\epsilon$ describing $\epsilon$, that is,

$\mathcal{H} = \mathcal{H}_A \otimes \mathcal{H}_\epsilon.$

This is a reasonably good approximation in the case where A and $\epsilon$ are relatively independent (e.g. there is nothing like parts of A mixing with parts of $\epsilon$ or conversely). The point is, the interaction with the environment is for all practical purposes unavoidable (e.g. even a single excited atom in a vacuum would emit a photon, which would then go off). Let's say this interaction is described by a unitary transformation U acting upon $\mathcal{H}$. Assume that the initial state of the environment is $|\text{in}\rangle$, and the initial state of A is the superposition state

$c_1 |\psi_1\rangle + c_2 |\psi_2\rangle,$

where $|\psi_1\rangle$ and $|\psi_2\rangle$ are orthogonal, and there is no entanglement initially. Also, choose an orthonormal basis $\{|e_i\rangle\}_i$ for $\mathcal{H}_A$. (This could be a "continuously indexed basis" or a mixture of continuous and discrete indexes, in which case we would have to use a rigged Hilbert space and be more careful about what we mean by orthonormal, but that's an inessential detail for expository purposes.) Then, we can expand

$U\big(|\psi_1\rangle \otimes |\text{in}\rangle\big)$

and

$U\big(|\psi_2\rangle \otimes |\text{in}\rangle\big)$

uniquely as

$\sum_i |e_i\rangle \otimes |f_{1i}\rangle$

and

$\sum_i |e_i\rangle \otimes |f_{2i}\rangle$

respectively. One thing to realize is that the environment contains a huge number of degrees of freedom, a good number of them interacting with each other all the time. This makes the following assumption reasonable in a handwaving way, which can be shown to be true in some simple toy models. Assume that there exists a basis for $\mathcal{H}_\epsilon$ such that $|f_{1i}\rangle$ and $|f_{1j}\rangle$ are all approximately orthogonal to a good degree if $i \neq j$, and the same thing for $|f_{2i}\rangle$ and $|f_{2j}\rangle$, and also for $|f_{1i}\rangle$ and $|f_{2j}\rangle$ for any $i$ and $j$ (the decoherence property).

This often turns out to be true (as a reasonable conjecture) in the position basis, because how A interacts with the environment would often depend critically upon the position of the objects in A. Then, if we take the partial trace over the environment, we find that the density state is approximately described by

$\rho_A \approx \sum_i \big( |c_1|^2 \langle f_{1i}|f_{1i}\rangle + |c_2|^2 \langle f_{2i}|f_{2i}\rangle \big) |e_i\rangle\langle e_i|;$

that is, we have a diagonal mixed state, there is no constructive or destructive interference, and the "probabilities" add up classically. The time it takes for U(t) (the unitary operator as a function of time) to display the decoherence property is called the decoherence time.
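
A toy model in the same spirit (entirely illustrative; the qubit environment, the imperfect "record" states and the parameter values are assumptions, not the article's): a system qubit imprints which-branch information on N environment qubits, and the off-diagonal element of the reduced density matrix shrinks exponentially with N, which is one way to see why the decoherence time is so short when the environment has many degrees of freedom:

```python
import numpy as np

# Assumed toy model: each environment qubit records the system's basis index
# imperfectly, with overlap <e1|e0> = cos(theta) between its two record states.
c1, c2 = 1 / np.sqrt(2), 1 / np.sqrt(2)
theta = 0.3
e0 = np.array([1.0, 0.0])                        # record for system state |psi_1>
e1 = np.array([np.cos(theta), np.sin(theta)])    # record for system state |psi_2>

for N in [1, 4, 8, 12]:
    env0, env1 = np.array([1.0]), np.array([1.0])
    for _ in range(N):                           # product states of N environment qubits
        env0, env1 = np.kron(env0, e0), np.kron(env1, e1)
    # Full state: c1|psi_1>|env0> + c2|psi_2>|env1>; the reduced coherence is
    # c1 c2* <env1|env0>, which decays as cos(theta)^N.
    coherence = c1 * np.conj(c2) * np.vdot(env1, env0)
    print(N, abs(coherence), 0.5 * np.cos(theta) ** N)
```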

Experimental observations

Quantitative measurement

The decoherence rate depends on a number of factors, including temperature or uncertainty in position, and many experiments have tried to measure it depending on the external environment.[15]

The process of a quantum superposition being gradually obliterated by decoherence was quantitatively measured for the first time by Serge Haroche and his co-workers at the École Normale Supérieure in Paris in 1996.[16] Their approach involved sending individual rubidium atoms, each in a superposition of two states, through a microwave-filled cavity. The two quantum states both cause shifts in the phase of the microwave field, but by different amounts, so that the field itself is also put into a superposition of two states. Due to photon scattering on cavity-mirror imperfections, the cavity field loses phase coherence to the environment.

Haroche and his colleagues measured the resulting decoherence via correlations between the states of pairs of atoms sent through the cavity with various time delays between the atoms.

Reducing environmental decoherence

In July 2011, researchers from the University of British Columbia and the University of California, Santa Barbara were able to reduce the environmental decoherence rate "to levels far below the threshold necessary for quantum information processing" by applying high magnetic fields in their experiment.[17][18][19]

Criticism

Criticism of the adequacy of decoherence theory to solve the measurement problem has been expressed by Anthony Leggett: "I hear people murmur the dreaded word 'decoherence'. But I claim that this is a major red herring".[20] Concerning the experimental relevance of decoherence theory, Leggett has stated: "Let us now try to assess the decoherence argument. Actually, the most economical tactic at this point would be to go directly to the results of the next section, namely that it is experimentally refuted! However, it is interesting to spend a moment enquiring why it was reasonable to anticipate this in advance of the actual experiments. In fact, the argument contains several major loopholes".[21]

In interpretations of quantum mechanics

Before an understanding of decoherence was developed, the Copenhagen interpretation of quantum mechanics treated wave-function collapse as a fundamental, a priori process. Decoherence provides an explanatory mechanism for the appearance of wave-function collapse and was first developed by David Bohm in 1952, who applied it to Louis de Broglie's pilot-wave theory, producing Bohmian mechanics,[22][23] the first successful hidden-variables interpretation of quantum mechanics. Decoherence was then used by Hugh Everett in 1957 to form the core of his many-worlds interpretation.[24] However, decoherence was largely ignored for many years (with the exception of Zeh's work),[1] and not until the 1980s[25][26] did decoherence-based explanations of the appearance of wave-function collapse become popular, with the greater acceptance of the use of reduced density matrices.[9][7] The range of decoherence-based interpretations has subsequently been extended around the idea, such as consistent histories. Some versions of the Copenhagen interpretation have been modified to include decoherence.

Decoherence does not claim to provide a mechanism for the actual wave-function collapse; rather it puts forth a reasonable mechanism for the appearance of wave-function collapse. The quantum nature of the system is simply "leaked" into the environment so that a total superposition of the wave function still exists, but exists, at least for all practical purposes,[27] beyond the realm of measurement.[28] Of course, by definition, the claim that a merged but unmeasurable wave function still exists cannot be proven experimentally. Decoherence explains why a quantum system begins to obey classical probability rules after interacting with its environment (due to the suppression of the interference terms when applying Born's probability rules to the system).

References

  1. ^ a b H. Dieter Zeh, "On the Interpretation of Measurement in Quantum Theory", Foundations of Physics, vol. 1, pp. 69–76, (1970).
  2. ^ Schlosshauer, Maximilian (2005). "Decoherence, the measurement problem, and interpretations of quantum mechanics". Reviews of Modern Physics. 76 (4): 1267–1305. arXiv:quant-ph/0312059. Bibcode:2004RvMP...76.1267S. doi:10.1103/RevModPhys.76.1267.
  3. ^ Joos and Zeh (1985) state "Of course no unitary treatment of the time dependence can explain why only one of these dynamically independent components is experienced." And in a recent review on decoherence, Joos (1999) states "Does decoherence solve the measurement problem? Clearly not. What decoherence tells us is that certain objects appear classical when observed. But what is an observation? At some stage we still have to apply the usual probability rules of quantum theory." Adler, Stephen L. (2003). "Why decoherence has not solved the measurement problem: a response to P.W. Anderson". Studies in History and Philosophy of Science Part B: Studies in History and Philosophy of Modern Physics. 34 (1): 135–142. arXiv:quant-ph/0112095. Bibcode:2003SHPMP..34..135A. doi:10.1016/S1355-2198(02)00086-2.
  4. ^ a b Bacon, D. (2001). "Decoherence, control, and symmetry in quantum computers". arXiv:quant-ph/0305025.
  5. ^ a b c d Lidar, Daniel A.; Whaley, K. Birgitta (2003). "Decoherence-Free Subspaces and Subsystems". In Benatti, F.; Floreanini, R. (eds.). Irreversible Quantum Dynamics. Springer Lecture Notes in Physics. Vol. 622. Berlin: Springer. pp. 83–120. arXiv:quant-ph/0301032. Bibcode:2003LNP...622...83L. doi:10.1007/3-540-44874-8_5. ISBN 978-3-540-40223-7.
  6. ^ a b c d e f g h i Zurek, Wojciech H. (2003). "Decoherence, einselection, and the quantum origins of the classical". Reviews of Modern Physics. 75 (3): 715. arXiv:quant-ph/0105127. Bibcode:2003RvMP...75..715Z. doi:10.1103/revmodphys.75.715.
  7. ^ a b c Wojciech H. Zurek, "Decoherence and the transition from quantum to classical", Physics Today, 44, pp. 36–44 (1991).
  8. ^ a b Zurek, Wojciech (2002). "Decoherence and the Transition from Quantum to Classical—Revisited" (PDF). Los Alamos Science. 27. arXiv:quant-ph/0306072. Bibcode:2003quant.ph..6072Z.
  9. ^ a b E. Joos and H. D. Zeh, "The emergence of classical properties through interaction with the environment", Zeitschrift für Physik B, 59(2), pp. 223–243 (June 1985): eq. 1.2.
  10. ^ A. S. Sanz, F. Borondo: A quantum trajectory description of decoherence, quant-ph/0310096v5.
  11. ^ V. P. Belavkin (1989). "A new wave equation for a continuous non-demolition measurement". Physics Letters A. 140 (7–8): 355–358. arXiv:quant-ph/0512136. Bibcode:1989PhLA..140..355B. doi:10.1016/0375-9601(89)90066-2.
  12. ^ Howard J. Carmichael (1993). An Open Systems Approach to Quantum Optics. Berlin Heidelberg New-York: Springer-Verlag.
  13. ^ Michel Bauer; Denis Bernard; Tristan Benoist. Iterated Stochastic Measurements (Technical report). arXiv:1210.0425. Bibcode:2012JPhA...45W4020B. doi:10.1088/1751-8113/45/49/494020.
  14. ^ a b * Lidar, D. A.; Chuang, I. L.; Whaley, K. B. (1998). "Decoherence-Free Subspaces for Quantum Computation". Physical Review Letters. 81 (12): 2594–2597. arXiv:quant-ph/9807004. Bibcode:1998PhRvL..81.2594L. doi:10.1103/PhysRevLett.81.2594.
  15. ^ Dan Stahlke. "Quantum Decoherence and the Measurement Problem" (PDF). Retrieved 23 July 2011.
  16. ^ "Observing the Progressive Decoherence of the "Meter" in a Quantum Measurement". Phys. Rev. Lett. 77 (24): 4887–4890. 9 December 1996. Bibcode:1996PhRvL..77.4887B. doi:10.1103/PhysRevLett.77.4887. PMID 10062660. {{cite journal}}: Unknown parameter |authors= ignored (help)
  17. ^ "Discovery may overcome obstacle for quantum computing: UBC, California researchers". University of British Columbia. 20 July 2011. Retrieved 23 July 2011. Our theory also predicted that we could suppress the decoherence, and push the decoherence rate in the experiment to levels far below the threshold necessary for quantum information processing, by applying high magnetic fields. (...)Magnetic molecules now suddenly appear to have serious potential as candidates for quantum computing hardware", said Susumu Takahashi, assistant professor of chemistry and physics at the University of Southern California. "This opens up a whole new area of experimental investigation with sizeable potential in applications, as well as for fundamental work".
  18. ^ "USC Scientists Contribute to a Breakthrough in Quantum Computing". University of California, Santa Barbara. 20 July 2011. Retrieved 23 July 2011.
  19. ^ "Breakthrough removes major hurdle for quantum computing". ZDNet. 20 July 2011. Retrieved 23 July 2011.
  20. ^ Nobel Symposium 2001. "Probing quantum mechanics towards the everyday world: where do we stand?"
  21. ^ Leggett, A. J. (2002). "Testing the limits of quantum mechanics: Motivation, state of play, prospects". Journal of Physics: Condensed Matter. 14 (15): R415–R451. doi:10.1088/0953-8984/14/15/201.
  22. ^ David Bohm, A Suggested Interpretation of the Quantum Theory in Terms of "Hidden Variables", I, Physical Review, (1952), 85, pp. 166–179.
  23. ^ David Bohm, A Suggested Interpretation of the Quantum Theory in Terms of "Hidden Variables", II, Physical Review, (1952), 85, pp. 180–193.
  24. ^ Hugh Everett, Relative State Formulation of Quantum Mechanics, Reviews of Modern Physics, vol. 29, (1957) pp. 454–462.
  25. ^ Wojciech H. Zurek, Pointer Basis of Quantum Apparatus: Into what Mixture does the Wave Packet Collapse?, Physical Review D, 24, pp. 1516–1525 (1981).
  26. ^ Wojciech H. Zurek, Environment-Induced Superselection Rules, Physical Review D, 26, pp. 1862–1880, (1982).
  27. ^ Roger Penrose (2004), The Road to Reality, pp. 802–803: "...the environmental-decoherence viewpoint [...] maintains that state vector reduction [the R process] can be understood as coming about because the environmental system under consideration becomes inextricably entangled with its environment. [...] We think of the environment as extremely complicated and essentially 'random' [...], accordingly we sum over the unknown states in the environment to obtain a density matrix [...] Under normal circumstances, one must regard the density matrix as some kind of approximation to the whole quantum truth. For there is no general principle providing an absolute bar to extracting information from the environment. [...] Accordingly, such descriptions are referred to as FAPP [for all practical purposes]".
  28. ^ Huw Price (1996), Times' Arrow and Archimedes' Point, p. 226: "There is a world of difference between saying 'the environment explains why collapse happens where it does' and saying 'the environment explains why collapse seems to happen even though it doesn't really happen'."

Further reading

  • Schlosshauer, Maximilian (2007). Decoherence and the Quantum-to-Classical Transition (1st ed.). Berlin/Heidelberg: Springer.
  • Joos, E.; et al. (2003). Decoherence and the Appearance of a Classical World in Quantum Theory (2nd ed.). Berlin: Springer.
  • Omnes, R. (1999). Understanding Quantum Mechanics. Princeton: Princeton University Press.
  • Zurek, Wojciech H. (2003). "Decoherence and the transition from quantum to classical – REVISITED", arXiv:quant-ph/0306072 (An updated version of PHYSICS TODAY, 44:36–44 (1991) article)
  • Schlosshauer, Maximilian (23 February 2005). "Decoherence, the Measurement Problem, and Interpretations of Quantum Mechanics". Reviews of Modern Physics. 76 (2004): 1267–1305. arXiv:quant-ph/0312059. Bibcode:2004RvMP...76.1267S. doi:10.1103/RevModPhys.76.1267.
  • J. J. Halliwell, J. Perez-Mercader, Wojciech H. Zurek, eds, The Physical Origins of Time Asymmetry, Part 3: Decoherence, ISBN 0-521-56837-4
  • Berthold-Georg Englert, Marlan O. Scully & Herbert Walther, Quantum Optical Tests of Complementarity, Nature, Vol 351, pp 111–116 (9 May 1991) and (same authors) The Duality in Matter and Light Scientific American, pg 56–61, (December 1994). Demonstrates that complementarity is enforced, and quantum interference effects destroyed, by irreversible object-apparatus correlations, and not, as was previously popularly believed, by Heisenberg's uncertainty principle itself.
  • Mario Castagnino, Sebastian Fortin, Roberto Laura and Olimpia Lombardi, A general theoretical framework for decoherence in open and closed systems, Classical and Quantum Gravity, 25, pp. 154002–154013, (2008). A general theoretical framework for decoherence is proposed, which encompasses formalisms originally devised to deal just with open or closed systems.