Numbers in this sense are the contents of a 'register' of 8 bits: 1 for the sign, and 7 for the ones and zeros.
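A minimal Python sketch of that sign-and-magnitude layout (one sign bit plus seven magnitude bits, so values from -127 to +127); the function names here are my own, purely for illustration:

def encode_sign_magnitude(n):
    """Pack an integer into an 8-bit sign-and-magnitude register:
    bit 7 is the sign (1 = negative), bits 0-6 hold the magnitude."""
    if not -127 <= n <= 127:
        raise ValueError("magnitude must fit in 7 bits (-127..127)")
    return (0b10000000 if n < 0 else 0) | abs(n)

def decode_sign_magnitude(byte):
    """Recover the signed integer from such a register."""
    magnitude = byte & 0b01111111
    return -magnitude if byte & 0b10000000 else magnitude

# Example: -5 is stored as 0b10000101 (sign bit set, magnitude 5).
assert encode_sign_magnitude(-5) == 0b10000101
assert decode_sign_magnitude(0b10000101) == -5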
Thank you for your contribution, Christopher. You may then consider an entropy of information whose value is k ln 2, or approximately 0.693 k, to know whether you should read from right to left or the other way round. For a fuller explanation, I will quote here from "Science and Absurdity", the work of my father, a physicist:
"It was in the second half of last century that Maxwell, perhaps in all seriousness, perhaps with tongue in cheek, postulated his little demon, who, astraddle a sluice to which two types of molecules are fed, diverts nimbly the type A molecules to a left compartment, while diverting the type B molecules to a right compartment, the volumes of the two compartments being respectively proportional to the relative numbers of the A and B type molecules. At the end of this sorting operation the two compartments are in temperature and pressure equilibrium, but the difference between the two types of molecules permits one to extract energy from the system at the expense of the heat fed to it, as we have seen earlier, in order to keep the two gas volumes in temperature equilibrium with their surroundings and with each other
[see figs. 1 and 2]. And Maxwell asked: Is not the second law violated by the little fellow?
No one really thought that the system would work, or could work, yet no one for a long time gave a logically satisfying answer why Maxwell's demon just could not be. The thoughtful explanation which Maxwell's thoughtful paradox deserved, and the clever exorcism which his clever demon also deserved, did not come until over half a century later when, a little like the little boy exclaiming: "But the king has no clothes", the physicist Szilard said, in 1929, "But Maxwell's demon does not have the information he needs to time the opening and closing of his trapdoor". That did it.
It took some time, though, for Szilard's paper to be generally appreciated, and there is a question whether Szilard himself realized its full import. At any rate his paper was gathering dust on his desk and, had it not been for friends of his who noticed it and persuaded him to submit it for publication, it might have remained unpublished. But eventually it was noticed, and Rube Goldbergish machines were devised to show with thought experiments that the most efficient thinkable process of determining the favorable right and left shunting epochs of the demon's trapdoor involved increases in entropy no smaller than those the demon could offset with his discrimination. Actually, the inventors of these machines could have saved themselves the trouble for, even if the information about the molecules' arrivals at the trapdoor from both sides were available at no expense of entropy increase, the very process of communicating this information to the demon over an ideal communication link would entail at least as much entropy increase as he could offset. The strong connection between the second law of thermodynamics and the attendant concept of entropy on the one hand, and the concept of information content on the other, was brought forward fully in 1948,
19 years after Szilard's original paper. It was then that Shannon introduced the concept of the entropy of information (see Appendix II) and developed formulae entirely similar to those of entropy, the only two differences being, firstly, one of sign: information is equivalent to an entropy decrease, and accordingly the term negentropy was adopted to designate the measure of order brought about by information; and, secondly, a constant factor: a "bit" of information, the dimensionless information content which decides one out of two choices, is equivalent to an entropy of value k ln 2, or approximately 0.693 k, where k, Boltzmann's constant, designates the unit of entropy."
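To put a number on that last equivalence, here is a quick back-of-the-envelope check in Python, using the 2019 SI value of Boltzmann's constant (this is my own addition, not part of the quoted text):

import math

k_B = 1.380649e-23  # Boltzmann's constant, J/K (exact by the 2019 SI definition)

# Entropy equivalent of one bit of information: k * ln 2
entropy_per_bit = k_B * math.log(2)
print(f"ln 2   = {math.log(2):.4f}")           # ~ 0.6931, the 0.693 factor above
print(f"k ln 2 = {entropy_per_bit:.3e} J/K")   # ~ 9.57e-24 J/K per bit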
By the way, Christopher, since you replied to my earlier post: do you know what the chances are for a solicited manuscript to be published?