The discovery by John Bell in 1964 that nature is nonlocal is arguably one of the most dramatic in physics. Yet, surprisingly, and with very few notable exceptions, it was largely ignored by the physics community for a long time. When I first became interested in the subject, almost three decades after Bell's result, I was shocked to realise that even the most fundamental questions about this phenomenon had not been asked. First and foremost: which states have nonlocal properties? All that was known was that a few particular states, such as the singlet state of two spin-½ particles, are nonlocal. But is nonlocality a generic feature of quantum mechanics, or do only a few, very exceptional states have this property? My first work on nonlocality was to raise, and answer, this question. In [6], in collaboration with Rohrlich, and simultaneously and independently of Gisin, I showed that nonlocality is generic: almost every quantum state, more precisely every entangled pure state of two or more spatially separated particles, is nonlocal. This paper established nonlocality as a central, if not the central, aspect of quantum mechanics. The result is now viewed as so basic, so much part of the ABC of the subject, that entanglement and nonlocality are very often taken to mean the same thing, and these early papers are barely cited anymore!
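As a minimal illustration of what such a violation looks like (a standard textbook example, in my own notation, not the general construction of [6]): for the singlet state of two spin-½ particles, local spin measurements along directions a and b yield the correlator E(a,b) = -a.b, and for four coplanar measurement directions spaced 45° apart the CHSH combination reaches

\[ |S| = |E(\mathbf{a},\mathbf{b}) + E(\mathbf{a},\mathbf{b}') + E(\mathbf{a}',\mathbf{b}) - E(\mathbf{a}',\mathbf{b}')| = 2\sqrt{2} > 2, \]

whereas any local hidden-variable model obeys |S| ≤ 2. The content of [6] is that an analogous violation of some Bell inequality can be exhibited for every entangled pure state, not only for the singlet.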
After establishing that non-locality is a generic qualitative feature of quantum states, the natural next question was quantitative: are some states more non-local than others? In collaboration with Bennett et al., I established the quantitative description of nonlocality and entanglement. We defined measures of non-locality as quantities that cannot be generated or increased by local operations and classical communication (LOCC). In particular, we introduced the procedures of entanglement concentration and dilution [24] and entanglement purification [23]. We showed that different non-local states can be inter-converted by means of purely local actions and classical communication; states that can be converted into each other by such procedures must contain the same amount of non-locality [23,24,33]. This allowed us to introduce a natural quantitative measure of non-locality. (The unit of quantum non-locality derived in these papers is now customarily called the e-bit, i.e. a bit of entanglement; it is the non-local counterpart of the well-known unit of information, the bit.) These papers constitute the basis of the modern view of non-locality and entanglement, namely as a resource which, very much like energy, can be stored, transformed from one form into another, and consumed to perform useful tasks.
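For pure bipartite states the resulting measure can be stated compactly (written here in my own notation): the entanglement of a state |ψ⟩ shared between A and B is the von Neumann entropy of either reduced density matrix,

\[ E(\psi) = S(\rho_A) = -\mathrm{Tr}\,\rho_A \log_2 \rho_A, \qquad \rho_A = \mathrm{Tr}_B\,|\psi\rangle\langle\psi|, \]

with one e-bit being the entanglement of a maximally entangled pair of qubits (E = 1). Concentration and dilution show that, in the limit of many copies, n copies of |ψ⟩ can be reversibly converted by LOCC into approximately nE(ψ) singlets, which is what justifies treating E as the quantitative measure.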
In 1993, in a pioneering paper, Bennett and collaborators described a communication method they called teleportation. In this process the information contained in a quantum state is first disassembled into a purely classical part, which is communicated to the receiver by ordinary means (such as a telephone), and a purely quantum part that is teleported: this information instantaneously jumps from the transmitter to the receiver without being anywhere in between. At present, teleportation is considered to be the paradigmatic quantum communication method. The experimental verification of this phenomenon posed very difficult challenges. I proposed a simplified quantum optical scheme that succeeded in avoiding the main difficulties. This led us [36] to the first experimental realisation of teleportation, arguably one of the best-known experiments in quantum information.
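The logic of the protocol is easy to make concrete in a short simulation. The sketch below is my own illustration (Python/NumPy) of the standard two-qubit Bell-measurement version of teleportation; it is not a model of the simplified optical scheme used in [36]. An arbitrary qubit state is transferred to Bob using one shared entangled pair plus two classical bits.

import numpy as np
from functools import reduce

def kron(*ops):
    return reduce(np.kron, ops)

I2 = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])

# Unknown state to be teleported (qubit 1) and a maximally entangled
# Bell pair (|00> + |11>)/sqrt(2) shared by Alice (qubit 2) and Bob (qubit 3).
psi = np.array([0.6, 0.8j])                 # any normalised qubit state
bell = np.array([1, 0, 0, 1]) / np.sqrt(2)
state = kron(psi, bell)                     # full 3-qubit state vector

# Alice: CNOT from qubit 1 onto qubit 2, then Hadamard on qubit 1.
state = kron(CNOT, I2) @ state
state = kron(H, I2, I2) @ state

# Alice measures qubits 1 and 2; each outcome (m1, m2) occurs with prob. 1/4.
rng = np.random.default_rng()
probs = [np.linalg.norm(state.reshape(2, 2, 2)[m1, m2])**2
         for m1 in (0, 1) for m2 in (0, 1)]
m1, m2 = divmod(rng.choice(4, p=probs), 2)

# Bob's conditional state, followed by the correction Z^m1 X^m2 that the
# two classically communicated bits (m1, m2) tell him to apply.
bob = state.reshape(2, 2, 2)[m1, m2]
bob = bob / np.linalg.norm(bob)
bob = np.linalg.matrix_power(Z, m1) @ np.linalg.matrix_power(X, m2) @ bob

print(np.abs(np.vdot(psi, bob))**2)         # fidelity with the input state: 1.0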
That nonlocality can exist at all, given the constraints imposed by relativistic causality, is an extraordinary fact. To understand it better, together with Rohrlich, I asked whether the nonlocality arising from quantum mechanics is the only possible form of nonlocality [11]. Surprisingly, we found that even stronger nonlocal correlations are possible in principle, without contradicting relativity. The correlations we discovered are now considered to be the basic unit of nonlocality and are known as Popescu-Rohrlich correlations (or PR boxes). Whether or not such correlations exist in nature is an open experimental question. If they exist, quantum mechanics is wrong and has to be replaced. If they do not exist, why not? What is the fundamental physical principle that forbids them? These questions initiated an entirely new field of research, one of the most active in present-day quantum information.
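Explicitly, a PR box is a device shared by two parties with binary inputs x, y and binary outputs a, b, whose statistics are (in the now-standard notation)

\[ P(a,b\,|\,x,y) = \tfrac{1}{2} \ \text{ if } \ a \oplus b = x\cdot y, \qquad 0 \ \text{ otherwise}. \]

Each party's output is locally completely random, so no signal can be sent; yet the CHSH expression attains its algebraic maximum of 4, beyond the value 2\sqrt{2} (Tsirelson's bound) attainable with quantum states, and beyond the value 2 attainable classically.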
In [29] we used the idea of entanglement distillation to design quantum privacy amplification, a new cryptographic protocol, and used it to obtain the first proof of absolute security of quantum cryptography in realistic settings. (All previous proofs were valid only for ideal, completely noiseless channels.)
Another fundamental question that I addressed is how to determine the state of a quantum particle. If one is given a single quantum particle in an unknown state, there is no way, even in principle, to determine its state. Given the probabilistic nature of quantum measurements, we need to perform measurements on an infinite ensemble of identically prepared particles in order to accumulate enough statistics to identify the state reliably. But if we have only a few copies, what is the optimal procedure? Together with S. Massar, I found [19] that the optimal procedure requires measurements performed collectively on all the particles together; separate measurements on the individual particles are not optimal. This is now one of the basic results in the new area of “quantum state estimation” and raises fundamental questions about how information is stored in quantum systems.
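Concretely, in the setting of [19], for N spin-½ particles all polarised along the same unknown direction, the optimal mean fidelity with which that direction can be estimated is

\[ F(N) = \frac{N+1}{N+2}, \]

and this value is attained by a suitable collective measurement on all N spins together; separate measurements on the individual spins are strictly suboptimal. (For example, F(2) = 3/4.)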
Another of my main research subjects has been the study of pre- and post-selected quantum ensembles, an area initiated by Y. Aharonov in collaboration with L. Vaidman, D. Rohrlich and myself. By post-selection one can effectively impose final conditions on the evolution of a quantum system, in addition to and independent of the initial conditions. Pre- and post-selected quantum ensembles are therefore the most “pure”, the most “refined”, ensembles; in this sense, one may regard them as the fundamental type of quantum ensemble. On the one hand, our work led to the discovery of an entire class of novel quantum effects, as well as to new insights into some of the classic paradoxes of quantum mechanics and the discovery of many new ones. In particular, I mention uncovering new aspects of Hardy’s paradox [64] and of quantum tunneling [10], the concept of a quantum time-translation machine [2], the “Quantum Cheshire Cat” effect [122], the “Quantum Pigeonhole Principle” [130] and the discovery of super-oscillations [87]. On the other hand, this work has a more speculative aspect: very recently we proposed a new framework for quantum mechanics in which the basic objects are multiple-time states [97], and we raised fundamental questions about the nature of time in quantum mechanics [123]. Whilst pre- and post-selection is not yet a widely studied area, despite the basic ideas having been formulated about twenty years ago, interest is now increasing significantly. Novel experiments and practical applications, in particular an extremely powerful amplification method, are emerging, and entire conferences are now dedicated to the subject.
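A central formula in this area, written here only to fix notation, is the weak value: for a system pre-selected in the state |ψ⟩ and post-selected in |φ⟩, the outcome of a sufficiently weak measurement of an observable A is governed by

\[ A_w = \frac{\langle \varphi | A | \psi \rangle}{\langle \varphi | \psi \rangle}, \]

which can lie far outside the spectrum of A when the pre- and post-selected states are nearly orthogonal; this is what underlies the amplification method mentioned above.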
The main problem facing the foundations of statistical mechanics is that, in principle, all the features of the theory should be derived from the basic equations of motion, Newton's or Schrödinger's. This has hitherto been impossible; instead, many basic features were simply postulated and the rest of the theory was then derived from the postulates. Furthermore, subjective lack of knowledge needed to be invoked. My aim was to prove the postulates from the basic laws. That every physical system, when left undisturbed, eventually reaches thermal equilibrium is one of the most fundamental facts of nature. As a first step towards showing this, in [89], in collaboration with Short and Winter, I showed that, given a sufficiently small subsystem of a large closed system, almost every individual state of the whole system is such that the subsystem is approximately in a canonical state. As this is a property of individual states, no ensemble averaging or time averaging is required, and hence the “equal a priori probability” postulate of statistical mechanics is redundant. Building upon the concepts formulated in [89], in collaboration with Linden, Short and Winter [99], I proved that, in virtually complete generality (i.e. for almost all Hamiltonians and almost all initial states, including non-typical ones), reaching equilibrium is a universal property of quantum systems: in a large closed system, almost any subsystem will reach an equilibrium state and remain close to it for almost all times. We have thus proven, from first principles, the postulate of equilibration.
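Schematically, and in my notation here, the result of [89] reads as follows: take a large closed system restricted to a subspace H_R of dimension d_R (for instance by a global constraint such as the total energy), and split it into a small subsystem S and the rest, E. Then for almost every pure state |ψ⟩ in H_R,

\[ \rho_S = \mathrm{Tr}_E\,|\psi\rangle\langle\psi| \;\approx\; \Omega_S \equiv \mathrm{Tr}_E\!\left(\mathbb{1}_R / d_R\right), \]

i.e. the subsystem looks as if the whole system were in the maximally mixed state on H_R; when the constraint is a narrow energy shell and the coupling is weak, Ω_S is the familiar canonical (thermal) state.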
In quantum thermodynamics I raised the question of whether there are limitations on the size of thermal machines: could thermal machines be built with only a small number of quantum states? And if they could, would they reach maximal (Carnot) efficiency, or is there a trade-off between size and efficiency? We first presented the smallest possible refrigerator [107], consisting of only two qubits (two-state systems, such as spin-½ particles), and proved that this refrigerator can reach maximal efficiency [110]. Following on from this work we introduced [113] the notion of virtual temperatures, which effectively gives a new approach to thermodynamics. More recently, using this approach, we showed that the laws of thermodynamics, originally found to apply to large ensembles of particles, apply also to individual quantum ones; in particular, one can define a notion of “free energy” for individual particles, very similar to the ordinary free energy, and the work one can extract from one particle is equal to the change of its free energy [128].
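The idea of virtual temperatures can be stated in one line (schematically, in the notation I use here): a pair of levels with energies E_1 < E_2 and occupation probabilities p_1, p_2 behaves, towards anything it couples to, like a bath at the virtual temperature

\[ T_v = \frac{E_2 - E_1}{k_B \ln(p_1/p_2)}, \]

so a machine cools, heats, or does work on a target depending on how the virtual temperatures of its transitions compare with the target's own temperature. (For a transition equilibrated with a bath at temperature T, T_v = T, recovering the usual Boltzmann populations.)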
Work with Bennett et al. established the framework for describing multi-partite entanglement and showed that there are irreducible types of entanglement that cannot be reversibly converted into one another [51].
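The canonical example of this irreducibility (a standard example, stated here only for concreteness) is the three-party GHZ state

\[ |\mathrm{GHZ}\rangle = \tfrac{1}{\sqrt{2}}\big(|000\rangle + |111\rangle\big): \]

no LOCC protocol can reversibly convert GHZ states, even asymptotically on many copies, into bipartite singlets shared between the pairs of parties, so genuinely three-party entanglement is not simply built out of two-party entanglement.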
In [80], with Groisman and Winter, I determined the total amount of correlations in a bipartite state and provided the first operational meaning of the concept of quantum mutual information.
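In formulas: the quantum mutual information of a bipartite state ρ_AB is

\[ I(A{:}B) = S(\rho_A) + S(\rho_B) - S(\rho_{AB}), \]

with S the von Neumann entropy, and the operational meaning established in [80] is that I(A:B) quantifies the minimal amount of local noise (randomness) that must be injected into the state in order to erase all correlations between A and B.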
In [14] I showed that the whole notion of nonlocality needed to be revisited: the nonlocality revealed by violations of Bell inequalities is qualitatively different from that employed for teleportation. Paper [20] introduced the notion of “hidden” nonlocality, extending for the first time the scope of Bell inequalities beyond ideal von Neumann measurements.
The Collins-Gisin-Linden-Massar-Popescu Bell inequality [61] was the first generalisation of the standard Clauser-Horne-Shimony-Holt inequality to arbitrary dimensions and is now recognised as the standard Bell inequality for bipartite systems of arbitrary dimension.
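For reference (quoted here from memory, so the notation may differ slightly from [61]): for two parties who each choose between two measurements with d outcomes, the inequality reads

\[ I_d \equiv \sum_{k=0}^{\lfloor d/2 \rfloor - 1} \left(1 - \frac{2k}{d-1}\right) \Big\{ \big[ P(A_1 = B_1 + k) + P(B_1 = A_2 + k + 1) + P(A_2 = B_2 + k) + P(B_2 = A_1 + k) \big] - \big[ P(A_1 = B_1 - k - 1) + P(B_1 = A_2 - k) + P(A_2 = B_2 - k - 1) + P(B_2 = A_1 - k - 1) \big] \Big\} \le 2, \]

where the outcome relations are understood modulo d; for d = 2 it reduces to the CHSH inequality.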
With N. Gisin, I showed that two spin-½ particles polarised in anti-parallel directions are a better direction indicator than two parallel ones [47]. This result forms the basis of the new subject of aligning frames of reference by quantum means.
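Quantitatively (values quoted here from memory): the optimal fidelity for indicating an unknown direction n with two parallel spins is F_par = 3/4, whereas with two anti-parallel spins (one along n, one along -n) it is

\[ F_{\mathrm{anti}} = \frac{3+\sqrt{3}}{6} \approx 0.79 > \frac{3}{4}. \]

The counter-intuitive gain comes from the fact that parallel pairs lie entirely in the three-dimensional symmetric subspace, while anti-parallel pairs span the full four-dimensional Hilbert space of the two spins.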
In the 1990s a leading contender for building a quantum computer, to which considerable human and financial effort was dedicated, was the so-called “pseudo-pure state” liquid-state NMR method. We showed that, even in principle, this method cannot lead to quantum computers with exponential speed-up [46,55]. The method was subsequently abandoned.
Finally, with H. Briegel I showed the possibility of long-lived entanglement in biological systems, one of the first results in the new field of quantum biology [106].
Last but not least, in 1997 I co-organised at Hewlett-Packard Bristol one of the first courses on quantum information, and I then co-edited and co-authored the first textbook on quantum information (Introduction to Quantum Computation and Information, H.-K. Lo, T. Spiller and S. Popescu, eds.). More recently I co-authored the first textbook in the new area of quantum biology (Quantum Effects in Biology, Mohseni et al., eds.).