This is a brief note on the current status of the quantum revolution. Since our University is planning a complexity economics centre (JSGP), a novel interface between the natural and social sciences in which the author is deeply involved, this is a good time to initiate a broader discussion along the same lines.
Quantum mechanics [QM] remains an enigma to this day: a century after its foundations were laid, its 'meaning', or its correspondence with 'reality', still eludes us. It goes without saying that this stupendously successful enterprise has undergone numerous interpretive changes. However, as the old French proverb goes, the more things change, the more they stay the same.
The breathtaking power of QM and its descendants, quantum field theory, quantum electrodynamics, quantum chromodynamics and the like, lies in their enormous predictive power in the microworld. That predictive power has brought breakthroughs across numerous spheres of knowledge, from atomic physics to quantum chemistry to modern technology. The quantum information and computing revolution now under way has the capacity to change human civilization fundamentally. Very strange, in Einstein's words 'spooky', features such as entanglement have been shown to be real, and are now routinely used by information scientists to strengthen cryptography in a fundamental way. On the theoretical front, quantum gravity, the cherished unification of general relativity [GR] with the standard model of particle physics, still eludes scientists, though a huge enterprise is devoted to it.
If this were not enough, QM, or rather its mathematical and logical foundations, has found a fascinating application in the modelling of human cognition. It is striking that where age-old decision theory based on standard Boolean logic fails to accommodate various real-life choice-making scenarios, the quantum probability model (based on the Hilbert space set-up) comes to the rescue. This opens up an entirely new front of research, and also raises deep philosophical questions about the mind itself.
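The flavour of the quantum probability model can be conveyed with a minimal numerical sketch. The hallmark feature is that "answering question A, then question B" and the reverse order need not have the same probability when the questions are modelled as non-commuting projectors on a belief state. The angles and the belief vector below are purely illustrative, not fitted to any experimental data.

```python
import numpy as np

def projector(theta):
    """Rank-1 projector onto the ray at angle theta in the plane."""
    v = np.array([np.cos(theta), np.sin(theta)])
    return np.outer(v, v)

# Hypothetical belief state, and 'yes' answers to questions A and B
# modelled as projectors onto two different (non-orthogonal) rays
psi = np.array([1.0, 0.0])
P_A = projector(np.pi / 4)
P_B = projector(np.pi / 2)

# Probability of 'yes to A, then yes to B' versus the reverse order:
# sequential probabilities are squared norms of projected states
p_AB = np.linalg.norm(P_B @ P_A @ psi) ** 2   # answer A first
p_BA = np.linalg.norm(P_A @ P_B @ psi) ** 2   # answer B first
```

Because P_A and P_B do not commute, p_AB and p_BA differ (here 0.25 versus 0), an "order effect" that a single classical probability distribution over fixed answers cannot produce.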
In the current exploration, then, we focus on the central unsolved features of QM and their possible philosophical implications.
The so-called Copenhagen interpretation and responses to it
Quantum physics proper began with Planck and Einstein. Planck provided a fundamentally different, and successful, explanation of black-body radiation based on the concept of quanta (packets of energy) and the constant "h", now interpreted as the quantum of action. (Incidentally, this result was later rederived using statistical arguments by the Indian genius S. N. Bose; with Einstein's collaboration, the approach came to be known as Bose–Einstein statistics.) Einstein then took the idea a step further, treating the photoelectric effect in terms of quanta (photons, the particles of electromagnetic radiation) and their radiation, propagation and absorption, thereby giving birth to quantum physics. It is noteworthy that in the breakthrough paper on the photoelectric effect itself, Einstein expressed some scepticism about the particle nature and quantization of electromagnetic radiation, possibly kicking off his lifelong struggle with the actual meaning of the QM that would emerge later.
Soon after, following the path-breaking thinking of Bohr and later Heisenberg (from whose paper on the harmonic oscillator the term "quantum mechanics" sprang into life), a fundamentally different school of thought emerged. Copenhagen was the breeding ground for the ideas of founders such as Bohr, Heisenberg and Born, and later of many other luminaries; hence the name "Copenhagen interpretation" stood justified. Erwin Schrödinger soon became another hero of the emerging paradigm, with his ever-famous wave equation.
Summarizing the work of Heisenberg, Born and Schrödinger: until one 'measures' a quantum state (whatever that may be), it evolves according to the wave equation in a completely deterministic way; it is only when one makes a projective measurement on the state, with the help of an apparatus, that the picture randomly 'collapses' to one specific value (whereas the initial quantum state may well be in a 'superposition' over a complete set of eigenstates). Hence the name "collapse of the wave function", with the probability of a specific measurement outcome given by Born's famous rule. Later, a superb mathematical harmony was found between Heisenberg's operator/matrix approach and Schrödinger's wave mechanics (Dirac, who would emerge as a giant of QM himself, was instrumental here).
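The deterministic-evolution-plus-random-collapse picture can be sketched in a few lines. The amplitudes below are an arbitrary illustrative choice; the point is only that Born's rule turns amplitudes into outcome probabilities, and a projective measurement then selects one eigenstate at random.

```python
import numpy as np

# A qubit in superposition: |psi> = 0.6|0> + 0.8|1> (already normalized)
psi = np.array([0.6, 0.8])

# Born's rule: probability of outcome k is |<k|psi>|^2
probs = np.abs(psi) ** 2          # -> [0.36, 0.64]

# A projective measurement picks one outcome at random with these
# probabilities; the state then 'collapses' to the matching eigenstate
rng = np.random.default_rng(seed=1)
outcome = rng.choice([0, 1], p=probs)
collapsed = np.zeros(2)
collapsed[outcome] = 1.0
```

Before measurement the evolution of psi (under the Schrödinger equation) is fully deterministic; the randomness enters only at the `rng.choice` step, which is exactly the conceptual rupture the measurement problem is about.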
Nevertheless, all was not well in the state of Denmark. How could a coherent theory accommodate both deterministic evolution and pure randomness in measurement? This question, dubbed the MEASUREMENT PROBLEM, haunted the greatest minds then and still does [Penrose, 3]. Thorny questions were raised: who or what is responsible for collapsing the wave function? What is even meant by pure randomness? Until then we had only a notion of epistemic randomness which, as in Bayesian modelling, treats probabilities as subjective degrees of belief: ignorant of the full description of the world's initial state, we keep updating our beliefs by a specific procedure. Ontic randomness, by contrast, would mean that probabilities are basic features of the real world and that our existence is itself probabilistic.
Well, there were numerous attempts to make sense of this 'madness'. Bohr and Heisenberg developed the notion of a divide between the observed quantum state and the observer's classical apparatus, but the distinction was ad hoc, and another great theoretician, J. Bell, later termed it a 'shifty' divide.
Bohr and Einstein had famous debates which are read as fables to this day. To make things worse, Einstein, with his collaborators, posed a very puzzling and deep challenge to the QM formalism in their EPR paper, one of the most widely cited papers in the history of physics: how can entanglement be accommodated within the special theory of relativity [SR]? Two space-like separated systems can never communicate with each other, since that would require superluminal signalling, a form of action at a distance prohibited by SR. (Entanglement, which has a huge literature of its own, is a description of a total quantum system composed of non-interacting subsystems that cannot be written in factorized form, i.e. as a product of individual state descriptions; there are many measures of entanglement, such as concurrence. It remains one of the hardest nuts to crack: the so-called separability problem, deciding whether an arbitrary quantum state is a product state or entangled, is highly non-trivial.) Yet the QM formalism allows that if Alice and Bob share entangled particles, then just by measuring her own particle's state (a random outcome), Alice would immediately know the state of Bob's particle. Is this not a breakdown of SR?
Such was the fear of Einstein, and also of Schrödinger. It has since been convincingly shown, however, that entanglement, the so-called non-local correlations, coexists with the no-signalling theorem: no superluminal communication in spacetime is possible. Back then, Einstein proposed that the quantum state or wave function might be only an incomplete statistical description of the system, with some underlying, 'hidden variable' description that is actual or real. A large number of hidden variable theories were suggested to resolve such issues and make QM sensible again to 'classically' accustomed minds. David Bohm became the central figure here, providing an alternative hidden variable theory in which the wave function is in a sense real (recently some have suggested it is nomological, or law-like) and guides particles along, thus restoring a deterministic description of QM; but this complex theory has its own problem of non-locality.
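On the separability question raised above: while deciding separability for general mixed states is hard, for pure bipartite states there is a clean criterion, which a small sketch can illustrate. A pure two-qubit state with coefficient matrix c (psi = sum over i,j of c[i,j] |i>|j>) is a product state exactly when c has Schmidt rank 1.

```python
import numpy as np

def schmidt_rank(c):
    """Number of non-zero Schmidt coefficients of a pure bipartite state."""
    return np.linalg.matrix_rank(c)

# Coefficient matrices of two pure two-qubit states
bell = np.array([[1.0, 0.0],
                 [0.0, 1.0]]) / np.sqrt(2)      # (|00> + |11>)/sqrt(2)
product = np.outer([1.0, 0.0], [0.0, 1.0])      # |0>|1>, a product state

# bell -> rank 2 (entangled); product -> rank 1 (separable)
```

The Schmidt rank is read off from the singular values of c, which is why the pure-state case is easy; no comparably simple test survives for arbitrary mixed states, which is what makes the general separability problem so non-trivial.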
Later, in a landmark paper, Bell suggested a formidable way of testing local hidden variable theories with the help of so-called Bell-type inequalities, later modified by many researchers. The basic idea is that, in an EPR-like set-up, any local hidden variable theory must satisfy such inequalities; if experiments violate them, then quantum predictions can never be faithfully reproduced by any local hidden variable theory. This line of research has its own history, and since 2015 there have been milestone loophole-free experiments which suggest that local hidden variable theory as a concept is untenable, or so the standard perception goes.
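The best-known Bell-type inequality, the CHSH version, bounds a particular combination S of correlations by 2 for any local hidden variable theory, while QM predicts |S| = 2√2 for a maximally entangled state at suitably chosen angles. A small numerical sketch of the quantum prediction:

```python
import numpy as np

def spin(theta):
    """Spin observable in the x-z plane: cos(t)*sigma_z + sin(t)*sigma_x."""
    return np.array([[np.cos(theta),  np.sin(theta)],
                     [np.sin(theta), -np.cos(theta)]])

# Singlet (maximally entangled) two-qubit state: (|01> - |10>)/sqrt(2)
singlet = np.array([0.0, 1.0, -1.0, 0.0]) / np.sqrt(2)

def E(a, b):
    """Correlation <psi| A(a) (x) B(b) |psi> for the singlet state."""
    return singlet @ np.kron(spin(a), spin(b)) @ singlet

# CHSH combination at the standard optimal measurement angles
a, a2, b, b2 = 0.0, np.pi / 2, np.pi / 4, -np.pi / 4
S = E(a, b) + E(a, b2) + E(a2, b) - E(a2, b2)
# |S| = 2*sqrt(2) ~ 2.83, above the local hidden variable bound of 2
```

Experiments measure these same correlations on real entangled pairs; the 2015 loophole-free results found |S| > 2, in line with the quantum prediction computed above.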
Other kinds of measurements?
Such experiments may sound like confirmations of the orthodox QM view, including ontic randomness or fundamental uncertainty, but that may not be the endgame yet. Since the mid-1990s there have been attempts to devise measurements other than projective ones. Recall that a projective measurement fundamentally involves a back-action by the apparatus on the system being measured, so the state of the system is significantly perturbed; this is the source of pure randomness in measurement. But what if we could design a measurement in which the state always remains protected from such back-action? Would a fully deterministic measurement then be possible, one completely determined by some underlying actual state of the system? The jury is still out; interested readers can look into some good reviews.
Other epistemic views
Across the whole murky history of the foundations of QM, the central debate hovers around the epistemic versus ontic status of the wave function: is the quantum state part of a bigger reality, or our only reality, or is it just a statistical device for computing probabilities?
QBism is the latest significant line of thinking in the epistemic direction. Its founders and supporters hold that the quantum state is a fully personal, subjective state of knowledge or belief, which can be updated by a Bayesian-like mechanism. In a way, QM becomes everyone's theory of decision making! A significant amount of discussion has followed; the central claim is that if measurement and updating are purely epistemic matters, then the original collapse postulate is redundant.
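The "Bayesian-like mechanism" can be made concrete in a minimal sketch. When the relevant observables commute (here, everything diagonal), the quantum state-update rule (the Lüders rule) reduces exactly to classical Bayesian conditioning, which is the sense in which epistemicists read collapse as mere belief updating. The prior and event below are arbitrary illustrative numbers.

```python
import numpy as np

# Classical Bayes: prior over 3 hypotheses, condition on the event {0, 1}
prior = np.array([0.2, 0.3, 0.5])
event = np.array([1.0, 1.0, 0.0])                    # indicator of the event
posterior = prior * event / (prior * event).sum()    # -> [0.4, 0.6, 0.0]

# Quantum analogue: encode the same beliefs in a diagonal density matrix
# and update with the Lüders rule  rho -> P rho P / tr(P rho P)
rho = np.diag(prior)
P = np.diag(event)                                   # projector onto the event
rho_post = P @ rho @ P / np.trace(P @ rho @ P)

# In this commuting case the two updates coincide exactly
```

For non-commuting observables the Lüders rule is a genuine generalization of Bayes' rule; the QBist reading is that in both cases the update concerns an agent's beliefs, not a physical collapse.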
QBism has some interesting connection to the emerging paradigm of quantum cognition modelling.
Concluding remarks: new ventures
Applications of QM, at least, have come a long way, even as the 'meaning' debate continues. Recently there have been breakthroughs in applying the mathematical set-up to areas not fundamentally tied to physics: cognition, contextuality modelling, and financial market modelling; extending so-called quantum logic to social science as a whole promises to be a very interesting paradigm.
The current author wrote a brief series on this theme in this journal earlier. The analogy between QM and the human mind may be more than purely operational; deeper philosophical implications may lie beneath. Recently, QM mathematics has been used to describe neural workings better than the available complex, nonlinear differential equation techniques. It will be exciting to see whether the human brain is indeed quantum!
Dr Sudip Patra is a Visiting Scholar at Aston University, UK, and Memorial University, Canada. He holds a PhD from Glasgow University, Scotland. His research focuses on quantum cognition modelling and its applications in economics, finance and social science.