Arguably the most onerous philosophical question attached to Quantum Mechanics (QM) is that of Measurement. The standard (a.k.a. Copenhagen) Interpretation of QM says that our very act of conscious, intelligent observation and measurement determines the outcome of the measurement in the quantum (microcosmic) realm. The wave function (which describes the co-existing, superposed states of the system) collapses following a measurement. It seems that just by knowing the results of a measurement we determine its outcome, determine the state of the system and, by implication, the state of the Universe as a whole. This notion is so counter-intuitive that it has fostered a raging debate, ongoing now for more than seven decades.
But could we have turned the question (and, inevitably, the answer) on its head? Is it the measurement that brings about the collapse, or are we, perhaps, capable of measuring only collapsed results? Maybe our very ability to measure, to design measurement methods and instrumentation, to conceptualize measurement and so on is so limited as to yield only the collapsed solutions of the wave function?
Superpositions are notoriously unstable. Even in the quantum realm they should last but a vanishingly small interval of time. Our measurement apparatus is not refined enough to capture a superposition for long enough to justify the title of "measurement" or "observation". By contrast, collapses are sufficiently stable to last, to be observed and to be measured. This is why we measure collapses.
But in which sense (excluding longevity, which, in any case, is a dubious matter in the quantum world) are collapse events measurable? What makes them so? Collapse events are not necessarily the most probable – some of them are associated with low probabilities, and still they occur and are measured. Ex definitione, the more probable states will tend to be measured more often (the wave function will collapse more often into high-probability states). But this does not exclude the less probable states of the quantum system from materializing upon measurement.
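The statistical claim in this paragraph – more probable states collapse more often, yet improbable states still get measured – follows directly from the Born rule and can be sketched in a few lines. The two-state system and its amplitudes below are hypothetical, chosen only for illustration.

```python
import random
from collections import Counter

# Minimal sketch (standard Born rule, hypothetical numbers): a superposition
# collapses to basis state |i> with probability |amplitude_i|**2, so improbable
# states are measured less often, but they do still materialize.

amplitudes = {"up": (0.6, 0.0), "down": (0.8, 0.0)}  # (real, imaginary) parts

def born_probabilities(amps):
    """Map each basis state to |amplitude|^2 (assumes a normalized state)."""
    return {state: re * re + im * im for state, (re, im) in amps.items()}

def measure(amps, trials=10_000, seed=0):
    """Simulate repeated collapse by sampling outcomes with Born-rule weights."""
    probs = born_probabilities(amps)
    rng = random.Random(seed)
    return Counter(rng.choices(list(probs), weights=list(probs.values()), k=trials))

counts = measure(amplitudes)
# "down" (p = 0.64) dominates, yet "up" (p = 0.36) is still measured
# thousands of times in 10,000 trials: low probability is not zero.
```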
The other possibility is that the collapse events are carefully "selected" for some purpose, within a certain pattern and in a certain sequence. What could that purpose be? Probably the extension and enhancement of order in the Universe. That this is so is easily substantiated: order increases all the time. This is doubly true if we adopt the anthropocentric view of the Copenhagen Interpretation (conscious, intelligent observers determine the outcomes of measurements in the quantum realm). Humans are notoriously associated, by their very being, with negentropy (the decrease of entropy and the increase of order). This is not to say that entropy cannot increase locally (with order decreasing, or low-energy states being attained). But it is to say that such low-energy states and local entropy increases are disturbances, and that the overall order in the Universe must increase even as local pockets of disorder are created. The overall increase of order in the Universe should, therefore, be a paramount constraint in any formalism.
Yet, surely we cannot say that a specific measurement (collapse) increases order (with the exception of the knowledge – itself a form of order – added to the observer's brain). In order to say that a certain collapse event contributes to the increase of order (as an extensive parameter) in the Universe, we must assume the existence of some "Grand Plan" or "Grand Design" within which such a statement would make sense (would be logically compatible). Such a Grand Design must incorporate tools to gauge the amount of order extant at any given moment (for instance, before and after the collapse). Humans are such tools (special tools, because they design and implement additional tools).
But how does the quantum system "know" about the Grand Scheme, its details and its own place in it? Should we choose to say that it has no way of knowing such things (assuming that a Grand Design does exist), then how is the information transferred from the Grand Plan to the quantum system and/or to the measurement system (there is not much of a distinction between those two systems, really)? It must be communicated superluminally (at a speed greater than the speed of light), because the reactions are instantaneous and simultaneous, while the information about the Grand Design must come from near and far. A second issue: what are the transmission and reception mechanisms and channels? What is the receiver, where is the transmitter, what is the form of the information, and what is the vehicle carrying it (we would probably have to postulate yet another particle to account for this last one)?
The next, no less crucial, question relates to the apparent arbitrariness of the selection process of the collapse event. All the "parts" of the superposition constitute potential collapse events and, therefore, can be measured. The issue is: why is only a certain event measured in a specific measurement? Why is it "selected" to be the collapse? Why does it retain a privileged status vis-à-vis the measurement apparatus or act?
Since the order in the Universe does increase, it would seem that the preferred status has to do with this inexorable process (i.e., the increase of order). Had other collapses been selected, order would have diminished. The proof is again in the pudding: order increases – therefore, collapse events increase order. There is a process of negative, order-oriented selection: collapse events which are order-decreasing are filtered out and avoided. They will not be measurable, because they will be annulled in the collapse.
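The claim that a collapse "increases order" can at least be given a toy quantification. In the sketch below – a hedged illustration, not part of any established formalism – order is read as low Shannon entropy over measurement outcomes: before collapse, the outcome distribution carries positive entropy; after collapse, one outcome is certain and the entropy is zero. All numbers are hypothetical.

```python
from math import log2

# Toy reading of "collapse increases order": treat order as the negative of
# Shannon entropy over measurement outcomes. The probabilities are illustrative.

def shannon_entropy(probs):
    """Shannon entropy in bits; zero-probability outcomes contribute nothing."""
    return sum(-p * log2(p) for p in probs if p > 0)

before = [0.36, 0.64]  # hypothetical pre-measurement distribution over outcomes
after = [1.0, 0.0]     # post-collapse: one outcome is now certain

print(shannon_entropy(before))  # ≈ 0.943 bits of "disorder"
print(shannon_entropy(after))   # 0.0, i.e. maximal order in this toy sense
```

Note that this toy measure only tracks the observer's uncertainty about one system; it says nothing, by itself, about the entropy balance of the Universe as a whole, which is precisely the gap the text's "Grand Design" is invoked to fill.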
We are back to the same conclusion: there is, at a minimum, a guiding principle (that of the statistical increase of order). This guiding principle is not communicated to quantum systems (see the problems that we mentioned earlier). The only logical conclusion is that all the information relevant to the decrease of entropy and to the increase of order in the Universe is stored in each and every part of the Universe, no matter how minuscule and how fundamental. Very much as in living organisms, all the information would be stored in a Physical DNA (PDNA). The unfolding of this PDNA takes place in the physical world, during interactions between physical systems (one of which is known to us as measurement). The Biological DNA contains all the information about the living organism and is replicated trillions of times over, stored in the basic unit of the organism, the cell. What reason is there to assume that nature deviated from this (very pragmatic) principle in other realms of existence? Why not repeat the winning formula in quarks? The Biological DNA requires a context to translate itself into an organism – an environment made up of amino acids, etc. The PDNA should also require a context: the physical world, as interfaced through the act of measurement.
The information stored in the physical particle is structural – because order has to do with structure. Very much like a fractal (or a hologram), the particle should reflect the whole Universe accurately, and the same laws of nature should apply to both. The distinction between functional (operational) and structural information is superfluous and superficial. A "function" is the name that we, humans, give to an increase in order. There is a magnitude bias here: we are macrocosmic, and at our level form and function look distinct. But if we accept the afore-suggested definition of function (the process of increasing order), then the distinction is cancelled, because the only way to measure an increase of order is structurally. We measure functioning (= the increase in order) using structural methods (the alignment or arrangement of units).
Still, the information should encompass, at least, the relevant (close, non-negligible and non-cancelable) parts of the Universe. This is a tremendous amount of data. How is it stored in tiny corpuscles?
Either it is stored utilizing methods and processes which we are far from even guessing – or else the relevant information is infinitesimally (almost vanishingly) small. It could be somehow linked to (even equal to) the number of possible quantum states, to the superposition itself, or to the collapse event. It may well be that the whole Universe can be adequately described by an unbelievably minute, negligibly tiny amount of information, incorporated in those quantum supercomputers that today, for lack of better understanding, we call "particles".
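The storage question can be made concrete with a back-of-the-envelope count – a sketch under textbook assumptions, not a claim about any "PDNA", and with hypothetical function names. Describing a generic superposition of n two-state particles classically requires 2^n complex amplitudes, while naming a single collapsed outcome requires only n bits.

```python
# Back-of-the-envelope count for the storage question above (illustrative only):
# a generic superposition of n two-state particles has 2**n basis states, hence
# 2**n complex amplitudes; a single collapsed outcome is one basis label (n bits).

def amplitudes_needed(n_particles: int) -> int:
    """Dimension of the joint state space of n two-state particles."""
    return 2 ** n_particles

def bits_for_one_outcome(n_particles: int) -> int:
    """Bits needed to name one definite (collapsed) outcome."""
    return n_particles

for n in (1, 10, 300):
    print(n, amplitudes_needed(n), bits_for_one_outcome(n))
```

Already at n = 300 the amplitude count (2^300, roughly 10^90) exceeds the estimated number of atoms in the observable Universe, while a collapsed outcome still fits in 300 bits – which is precisely the contrast between "tremendous" and "vanishingly small" descriptions that the two paragraphs above gesture at.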