There is much said about the difference between Data and Information. Some of it even makes sense. Personally, I have found that people who have grappled with the concept of entropy (as in Shannon entropy) have thought about this more deeply than most. For example, here are some thoughts on information from Ariel Caticha, a professor of physics at the University at Albany.
The need to update from one state of belief to another is driven by the conviction that not all probability assignments are equally good; some beliefs are preferable to others in the very pragmatic sense that they enhance our chances to successfully navigate this world.
The idea is that, to the extent that we wish to be called rational, we will improve our beliefs by revising them when new information becomes available: Information is what forces a change of rational beliefs. Or, to put it more explicitly: Information is a constraint on rational beliefs.
Entropic Inference
Ariel Caticha
Department of Physics, University at Albany-SUNY,
Albany, NY 12222, USA.
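Caticha's slogan that "information is a constraint on rational beliefs" has a concrete computational reading: given a prior distribution and a new constraint (say, a known expectation value), the updated beliefs are the distribution closest to the prior that satisfies the constraint, where "closest" is measured by relative entropy. Here is a minimal sketch of that update for the classic loaded-die example (uniform prior over six faces, new information that the mean is 4.5); the function names and the bisection approach are my own illustration, not from Caticha's paper.

```python
import math

def tilt(prior, values, lam):
    # Exponentially tilt the prior: q_i proportional to p_i * exp(lam * x_i).
    # This is the form of the minimum-relative-entropy solution under
    # a single expectation constraint.
    w = [p * math.exp(lam * x) for p, x in zip(prior, values)]
    z = sum(w)
    return [wi / z for wi in w]

def max_entropy_update(prior, values, target_mean, lo=-10.0, hi=10.0, iters=100):
    # The mean of the tilted distribution increases monotonically with
    # lam, so we can bisect on the Lagrange multiplier until the
    # expectation constraint is met.
    for _ in range(iters):
        lam = (lo + hi) / 2
        q = tilt(prior, values, lam)
        mean = sum(qi * x for qi, x in zip(q, values))
        if mean < target_mean:
            lo = lam
        else:
            hi = lam
    return q

faces = [1, 2, 3, 4, 5, 6]
prior = [1 / 6] * 6                     # uniform prior beliefs
posterior = max_entropy_update(prior, faces, 4.5)
```

The point of the sketch is that the "information" (the constraint mean = 4.5) does not dictate the posterior outright; it selects, from all distributions consistent with it, the one that departs least from what we believed before.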