
Summary: Einstein, Podolsky & Rosen (1935), Can Quantum-Mechanical Description of Physical Reality Be Considered Complete?

David Emms summarised today’s discussion:

A very interesting paper to read, and for one that was foundational to the future understanding of Quantum Mechanics it was also quite straightforward to follow. Despite being considered one of the founders of QM thanks to his paper on the photoelectric effect in 1905, Einstein was never quite happy with the theory, famously declaring “God doesn’t play dice” (to which Bohr, the main proponent of the Copenhagen interpretation, replied “Stop telling God what to do with his dice!”). The EPR paper cut to the heart of the matter by pinpointing an aspect of quantum theory that appears so against our physical intuition (entanglement). At the time the paper was written the Heisenberg uncertainty principle was known, but generally thought of in a physically intuitive way whereby making a measurement of one property physically disturbs the system and thereby causes another property to become uncertain, for example position and momentum. The EPR paper led to the realisation that things were stranger than that. They showed that the uncertainty principle applied even without a particle being disturbed by a measurement on the particle itself. This was done by way of a thought experiment using a pair of particles that were entangled (though they didn’t use the term themselves, this was the paper that first discussed this distinctly quantum behaviour).

The paper begins by arguing that a ‘complete’ theory must be able to account for every element of physical reality, and that an ‘element of reality’ is any physical quantity that can be predicted with certainty without disturbing the system. They recap that if the momentum of a particle is known in QM then its position is unknown and vice-versa. Hence, when the momentum is known then the position has no physical reality. This applies to any observables in QM whose operators don’t commute. Thus, if it turns out that the momentum and position of a particle both simultaneously had ‘physical reality’ then QM can’t be complete.
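The non-commutativity at the heart of this argument can be made concrete in finite dimensions. A minimal sketch of my own (using spin observables, the Pauli matrices, as a stand-in for the paper’s position and momentum operators):

```python
# Two observables are simultaneously 'real' in the EPR sense only if their
# operators commute. The Pauli matrices sigma_x and sigma_z do not.
def matmul(A, B):
    # 2x2 matrix product, kept dependency-free.
    return [[sum(A[i][k] * B[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

sx = [[0, 1], [1, 0]]    # spin along x
sz = [[1, 0], [0, -1]]   # spin along z

print(matmul(sx, sz))  # [[0, -1], [1, 0]]
print(matmul(sz, sx))  # [[0, 1], [-1, 0]]  -- the products differ
```

Since the two products disagree, no quantum state assigns definite values to both observables at once, which is exactly the situation EPR exploit with position and momentum.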

Setting up the thought experiment, they allow a pair of particles (particle 1 and 2) to interact and then become separated. QM provides the wave function for the complete system but doesn’t give the individual state of either of the two particles. Two different measurements of particle 1 leave particle 2 (which no longer interacts with particle 1) in two different states. However, no real change has occurred to the system. Thus, as they see it, it is possible to assign two different wave functions to the same reality. This is discussed, and the maths worked through, in the context of the position and momentum of a pair of particles. They demonstrate that (as they see it) they are able to know both properties of a particle by carrying out one of the two measurements on the other, entangled particle. Thus, both properties (position and momentum) are known with certainty and so are elements of physical reality. They are not both given by QM and so QM must be incomplete.

The view of Einstein, Podolsky and Rosen was that QM is incomplete and that what is often referred to as a ‘hidden variables’ theory is required. The QM resolution of the EPR paradox is that the principle of locality or the concept of realism has to be abandoned. The violation of locality may seem to be impossible because it could in turn cause problems for causality, since signals appear to travel faster than the speed of light. In fact, it is impossible to construct a situation that uses quantum entanglement to send information, and so causality is not lost. John Bell in 1964 showed that there were inequalities linking the correlations between measurements that had to be obeyed by any local hidden variables theory and which were violated by QM. Subsequent experiments, starting in the 1970s and with a notable one last year, showed that these inequalities were violated and so a local hidden variables theory could not describe reality. Despite this, the various thought experiments devised by Einstein to challenge aspects of QM are rightly credited with strengthening the understanding of the theory and identifying those aspects that most strongly clash with our physical intuition about the world.
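The Bell-inequality violation mentioned above is easy to compute from the standard quantum prediction for the singlet state, E(a, b) = −cos(a − b). A small sketch of my own (using the textbook CHSH form and its optimal measurement angles, not anything from the EPR paper itself):

```python
import math

# Quantum prediction for the correlation between spin measurements along
# directions a and b (angles in radians) on a singlet state.
def E(a, b):
    return -math.cos(a - b)

# CHSH combination: any local hidden-variables theory obeys |S| <= 2.
a, ap = 0.0, math.pi / 2          # Alice's two settings
b, bp = math.pi / 4, 3 * math.pi / 4  # Bob's two settings

S = E(a, b) - E(a, bp) + E(ap, b) + E(ap, bp)
print(abs(S))  # 2*sqrt(2) ~ 2.828, violating the classical bound of 2
```

The quantum value 2√2 exceeds the bound of 2 that Bell derived for local hidden-variables theories, which is what the experiments from the 1970s onwards confirmed.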

Summary: Fisher (1925) Theory of Statistical Estimation

[Summary by Jotun Hein]

Again, extremely worthwhile!!

Fisher clearly produced a series of pioneering papers from 1918 and the following years/decades, and even if the one we discussed today is not the one that introduces the most new concepts, that is not so essential: a series of ideas that have since found their way into textbooks are (almost) first introduced in these papers.

He seems to suffer a bit from not having an established framework of probability, and on the first page he has what seems a very convoluted formulation of the law of large numbers, needing 6 vectors to define it. He eventually states that if a probability vector in the hidden model is close to the true model, then the probability fractions in the finite observations from the two models will also be close to each other. And then he states that this could probably be proven, but decides not to!!
He then moves on to define estimation, which is a mapping from observations to parameter space, and he then exemplifies the problems you can get into if trying to estimate the mid-point of the Cauchy distribution by the empirical mean. And he doesn’t refer to Cauchy either. Then consistent estimators (converging to the true parameter as the number of observations increases), and if you ask for more you also want efficiency (the asymptotic variance is minimal). He goes on to prove that the maximum likelihood estimator is efficient under mild conditions. He proves that two efficient estimators will have asymptotic correlation of 1, which must mean one is an affine function of the other as far as I can see. He shows how you can rectify the bias (converging to the wrong value) of an efficient estimator. He defines the Fisher Information matrix but doesn’t call it that. Very interesting discussion of the inherent noise in the data (unavoidable) and the errors caused by inefficient estimators (avoidable).
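Fisher’s Cauchy example is easy to see numerically. A minimal sketch of my own (not Fisher’s notation): the sample mean of Cauchy observations does not settle down as n grows, while the sample median does converge to the mid-point.

```python
import math
import random
import statistics

random.seed(0)

def cauchy_sample(n, midpoint=0.0):
    # Inverse-CDF sampling: tan(pi * (U - 1/2)) is standard Cauchy.
    return [midpoint + math.tan(math.pi * (random.random() - 0.5))
            for _ in range(n)]

# The empirical mean is not a consistent estimator of the midpoint
# (the Cauchy distribution has no finite mean), while the sample
# median converges to it.
for n in (100, 10_000):
    xs = cauchy_sample(n)
    print(n, "mean:", statistics.mean(xs), "median:", statistics.median(xs))
```

On repeated runs the mean jumps around wildly (a single huge observation can dominate it), whereas the median stays near 0, which is the point of Fisher’s example.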
Clearly one issue is the contrast between asymptotic properties and properties of estimators for finite data.
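The efficiency claim can also be checked at finite n. A sketch of my own (a Bernoulli setup, not Fisher’s worked example): the MLE is the sample mean, the Fisher information per observation is 1/(p(1−p)), and the simulated variance of the MLE sits essentially at the Cramér–Rao bound p(1−p)/n.

```python
import random
import statistics

random.seed(1)
p, n, reps = 0.3, 200, 2000

# MLE for Bernoulli(p) is the sample mean; the Fisher information per
# observation is 1/(p*(1-p)), so the Cramer-Rao bound for n observations
# is p*(1-p)/n.
estimates = []
for _ in range(reps):
    xs = [1 if random.random() < p else 0 for _ in range(n)]
    estimates.append(sum(xs) / n)

emp_var = statistics.variance(estimates)
crb = p * (1 - p) / n
print("empirical variance:", emp_var, "Cramer-Rao bound:", crb)
```

Here the bound is attained exactly even at finite n; for most models it is only reached asymptotically, which is the contrast the summary mentions.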
He introduces sufficient statistics (a function of the data that contains ALL the information about the parameter) and ancillary statistics (functions of the data that contain NO information about the parameter). Nothing about minimal sufficient statistics or the non-existence of maximally ancillary statistics.

I hope I didn’t mess up too much in my summary – it was very much worth reading and recapitulated a series of key concepts in likelihood statistics. We got a bit tired over the last 10 pages. The ½ page letter by Rev. Bayes was the wrong one so we will return to that.

The writing style is a bit peculiar: he only has 3 references, and they are to himself. Nothing on hypothesis testing in this paper. He makes some intuitive jumps, and in some places a curve or illustration would have helped, but in general very enjoyable. He uses the phrase “without antecedent knowledge” – it almost sounds like “Don’t use the word prior, please”.

Summary: Chomsky (1959) On certain formal properties of grammars

[Summary by Jotun Hein]

I think it was an incredible paper.

We only managed to discuss p137-154 in detail and then we tried to guess what he did in the remaining 15 pages.

NC starts out with 4 pages of quite general considerations about recursive functions and grammars, but then he becomes more specific and defines his 4 classes of grammars, which today would be called general [type 0], context sensitive [type 1], context free [type 2] and regular [type 3]. The last three are defined in terms of restrictions on the grammatical rules they use, which implies diminishing power in how large a class of languages they can generate.
About 2 pages are used to find languages that are in one type but not in the next, so the inclusions are proper.
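The standard textbook witnesses for these separations (not necessarily the ones Chomsky uses in the paper) can be sketched in a few lines: aⁿbⁿ is context-free but not regular, and aⁿbⁿcⁿ is context-sensitive but not context-free.

```python
import re

def is_anbn(s):
    # Recognise { a^n b^n : n >= 1 }: counting the a's against the b's
    # needs a pushdown stack, which is beyond any finite-state machine.
    m = re.fullmatch(r"(a+)(b+)", s)
    return bool(m) and len(m.group(1)) == len(m.group(2))

def is_anbncn(s):
    # Recognise { a^n b^n c^n : n >= 1 }: matching three counts at once
    # is beyond any context-free grammar.
    m = re.fullmatch(r"(a+)(b+)(c+)", s)
    return bool(m) and len(m.group(1)) == len(m.group(2)) == len(m.group(3))

print(is_anbn("aaabbb"), is_anbn("aabbb"))      # True False
print(is_anbncn("aabbcc"), is_anbncn("aabbc"))  # True False
```

The recognisers use unbounded counters, which is exactly what the more restricted grammar classes lack.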

There is very little proper linguistics – max 1 page in total. NC seems to conclude that real languages are between type 1 and 2, but seems to say there is no natural restriction on the allowed grammatical rules that will define them. If I had more time I should like to read more on this.

NC uses the term Markov Finite State Machines, but there is no probability in his definition.