Information arises out of a communication game played between a sender and receiver of signals. For physical systems, the players may be designated as Nature and the scientist. The average information obtained from a quantum system is given by the von Neumann entropy, which represents a generalization of thermodynamic entropy that is in perfect accord with common sense when considering a mixed state. While testing multiple copies of such a mixed state can reveal information about the choice made by the sender, the entropy of an unknown pure state is zero. A related puzzle is how the total information in the universe appears to be increasing over time.^{1}
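The contrast can be made concrete numerically. The sketch below (my illustration, not part of the letter; the helper names are mine) computes the von Neumann entropy S(ρ) = −Tr ρ log₂ ρ for a two-level system, using the closed-form eigenvalues of a 2×2 Hermitian matrix: a pure state has zero entropy, while the maximally mixed state carries one full bit.

```python
import math

def eigenvalues_2x2(rho):
    """Eigenvalues of a 2x2 Hermitian density matrix rho given as a
    nested tuple ((a, b), (conj(b), d))."""
    a = rho[0][0].real
    d = rho[1][1].real
    b = rho[0][1]
    mean = (a + d) / 2.0
    spread = math.sqrt((a - d) ** 2 / 4.0 + abs(b) ** 2)
    return mean + spread, mean - spread

def von_neumann_entropy(rho):
    """S(rho) = -sum_i p_i log2(p_i) over eigenvalues p_i, in bits."""
    return sum(-p * math.log2(p) for p in eigenvalues_2x2(rho) if p > 1e-12)

# Pure state |+> = (|0> + |1>)/sqrt(2): zero entropy, even though an
# unknown pure state still encodes the sender's choice of amplitudes.
pure = ((0.5 + 0j, 0.5 + 0j), (0.5 + 0j, 0.5 + 0j))
# Maximally mixed state I/2: one full bit of entropy.
mixed = ((0.5 + 0j, 0j), (0j, 0.5 + 0j))

print(von_neumann_entropy(pure))   # 0.0
print(von_neumann_entropy(mixed))  # 1.0
```

The zero entropy of the pure state is exactly the puzzle the letter addresses: the measure assigns no information to a state that, from the sender's standpoint, encodes a definite choice.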

Zero entropy for an unknown pure state is reasonable inasmuch as once that state has been identified, no further information may be gained from examining its copies. But such an idea will not be reasonable if the game between sender and receiver consists of the former choosing one of a certain number of polarization states (say, for a photon) and supplying several copies to the receiver. In this case, measurements made on the copies reveal information regarding the sender's choices. If the set of choices is infinite, the ‘information’ generated by the source is unbounded. From the point of view of the preparer of the states, information in the pure state is limited by the ‘relationship’ between source and receiver, and by the precision of the receiver's measurement apparatus. If the sender chose a polarization state with which the apparatus was synchronized, the receiver could readily recognize the state.
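The polarization game can be pictured with a toy single-photon simulation based on Malus's law (probability cos² of the angle between the photon's polarization and the analyzer). This is my own sketch, not the letter's; the angles and function names are illustrative.

```python
import math
import random

def passes_analyzer(theta_state, theta_basis, rng):
    """Single-photon Malus's law: a photon linearly polarized at angle
    theta_state passes an analyzer set at theta_basis with probability
    cos^2(theta_state - theta_basis)."""
    return rng.random() < math.cos(theta_state - theta_basis) ** 2

rng = random.Random(42)

# Synchronized apparatus: analyzer aligned with the sender's choice,
# so every copy passes and the state is readily recognized.
aligned = [passes_analyzer(math.pi / 3, math.pi / 3, rng) for _ in range(1000)]

# Misaligned by pi/4: each copy passes with probability cos^2(pi/4) = 1/2,
# so each individual outcome is maximally uncertain.
skewed = [passes_analyzer(math.pi / 3, math.pi / 12, rng) for _ in range(1000)]

print(all(aligned))          # True
print(sum(skewed) / 1000)    # close to 0.5
```

The aligned case is the "synchronized" apparatus of the letter: the receiver learns the sender's choice with certainty, while a misaligned basis leaves residual uncertainty in every copy.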

In recent papers,^{2,3} I have been investigating information obtainable from an unknown pure state within the framework of communication between source and receiver. I propose a measure of entropy that covers both pure and mixed states. In general, entropy has two components. One is informational, related to the pure components of the quantum state, and can vary from receiver to receiver. The other component is thermodynamic and independent of the receiver. The increase of information over time is a consequence of the interplay between unitary and non-unitary evolution, which makes it possible to transform one type of information into the other. Such complementarity indicates that a fundamental duality is essential for information.

For a two-component elementary mixed state, the most information to be obtained from each measurement is one bit. Each further measurement of identically prepared states will also yield one bit. For an unknown pure state, information represents the choice made by the source from an infinity of choices related to the values of probability amplitudes with respect to the basis components of the receiver's measurement apparatus. For a two-component pure state, each measurement also provides a maximum of one bit of information. If the source has made available an unlimited number of identically prepared states, the receiver can obtain additional information from each measurement until the probability amplitudes have been correctly estimated. Once that has occurred, unlike the case with a mixed state, no further information will be obtained from testing additional copies of this pure state.
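The estimation process for a two-component pure state can be sketched as follows (a toy simulation of my own; `estimate_p0` and the chosen amplitude are illustrative assumptions). Measuring n identically prepared copies in the computational basis estimates |a|² with statistical error of order 1/√n, so each batch of copies adds information until the amplitudes are pinned down.

```python
import math
import random

def estimate_p0(alpha, n, rng):
    """Measure n identically prepared copies of |psi> = a|0> + b|1> in the
    computational basis and return the fraction of |0> outcomes, which
    estimates |a|^2 with statistical error of order 1/sqrt(n)."""
    p0 = abs(alpha) ** 2
    return sum(rng.random() < p0 for _ in range(n)) / n

rng = random.Random(7)
alpha = math.cos(math.pi / 8)        # sender's choice: |a|^2 is about 0.854

for n in (10, 100, 10000):
    # Each copy yields at most one bit; the estimate sharpens with n
    # until the probability amplitudes have been correctly estimated.
    print(n, estimate_p0(alpha, n, rng))
```

Once the estimate has converged, further copies reproduce the known statistics and, as the letter argues, yield no additional information.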

These ideas have potential application in the field of quantum computing. Estimates by the receiver can be made by adjusting the basis vectors to move closer to the unknown pure state. In the most general case, the information obtainable from such a state in repeated experiments is potentially infinite. But if the observer learns the value of the pure state, the information associated with the state vanishes. This suggests a fundamental divide between objective and subjective information.
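One way to picture such basis adjustment (again my own sketch, assuming a linear polarization angle restricted to [0, π/2] so that inverting Malus's law is unambiguous): the receiver first estimates the unknown angle from copies measured in a fixed basis, then re-aligns the apparatus with the estimate, after which outcomes become nearly deterministic and the state's information for that receiver effectively vanishes.

```python
import math
import random

def pass_fraction(theta_state, theta_basis, n, rng):
    """Fraction of n photon copies, polarized at theta_state, that pass
    an analyzer set at theta_basis (single-photon Malus's law)."""
    p = math.cos(theta_state - theta_basis) ** 2
    return sum(rng.random() < p for _ in range(n)) / n

rng = random.Random(1)
theta_unknown = 0.9      # sender's choice, hidden from the receiver

# Step 1: measure many copies in a fixed basis and invert Malus's law.
# Restricting angles to [0, pi/2] keeps the inversion unambiguous.
p_hat = pass_fraction(theta_unknown, 0.0, 5000, rng)
theta_est = math.acos(math.sqrt(min(1.0, p_hat)))

# Step 2: re-align the basis with the estimate. Outcomes are now nearly
# deterministic, so further copies yield almost no new information.
print(pass_fraction(theta_unknown, theta_est, 5000, rng))   # near 1.0
```

Before alignment the observer could in principle keep extracting bits; after the state is learned, the measurement record becomes redundant, which is the objective/subjective divide the letter points to.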

This approach is consistent with the positivist view that one cannot speak of information associated with a system except in relation to an experimental arrangement together with the protocol for measurement. The experimental arrangement is thus integral to the amount of information that can be obtained.

The informational measure I have proposed resolves the puzzle of entropy increase. We can suppose that, in the beginning, the universe had immense informational entropy associated with a pure state, a portion of which, during the physical evolution of the universe, has been transformed into thermodynamic entropy.

Subhash Kak

Department of Computer Science

Oklahoma State University

Stillwater, OK

Subhash Kak is professor and head of the Department of Computer Science at Oklahoma State University, Stillwater. He was formerly Donald C. and Elaine T. Delaune Distinguished Professor of Electrical and Computer Engineering at Louisiana State University, Baton Rouge.