I have read quite a bit about information over the years (and entropy and noise and inference, all of which are related). It's a tricky thing to understand, and there are surprisingly few plain-English descriptions of what information, bits, and Claude Shannon are all about. James Gleick's book The Information: A History, a Theory, a Flood is very good. But also pretty long.
HERE is a well-written and readable shorter piece by Rob Goodman and Jimmy Soni that explains the early work of Claude Shannon and his 'invention' of information theory.