Part One: Information is bits
https://plus.maths.org/content/information-birth-bit
EXCERPT: We send information over the internet every day. All that information, whether it's a birthday email or your homework, is encoded into sequences of 0s and 1s. What's the best way of doing this? The way that uses the fewest 0s and 1s and therefore requires the smallest amount of computer memory? . . .
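The question the excerpt poses — using the fewest 0s and 1s — is the starting point of coding theory, and Huffman coding is the classic answer for symbol-by-symbol codes: give frequent symbols short bit strings and rare symbols longer ones. Here is a minimal Python sketch (not taken from the article; the function name and the sample string are illustrative):

```python
import heapq
from collections import Counter

def huffman_code(text):
    """Build a Huffman code for the characters in `text`.
    Returns a dict mapping each character to its bit string."""
    freq = Counter(text)
    # Heap entries are (frequency, tiebreaker, tree); a tree is either
    # a leaf character or a (left, right) pair of subtrees.
    heap = [(f, i, ch) for i, (ch, f) in enumerate(freq.items())]
    heapq.heapify(heap)
    if len(heap) == 1:  # degenerate case: a single symbol gets code "0"
        return {heap[0][2]: "0"}
    count = len(heap)
    while len(heap) > 1:
        f1, _, t1 = heapq.heappop(heap)
        f2, _, t2 = heapq.heappop(heap)
        heapq.heappush(heap, (f1 + f2, count, (t1, t2)))
        count += 1
    codes = {}
    def walk(tree, prefix):
        if isinstance(tree, tuple):  # internal node: branch on 0/1
            walk(tree[0], prefix + "0")
            walk(tree[1], prefix + "1")
        else:                        # leaf: record the finished code word
            codes[tree] = prefix
    walk(heap[0][2], "")
    return codes

text = "abracadabra"
codes = huffman_code(text)
encoded = "".join(codes[ch] for ch in text)
print(codes)
print(f"Huffman: {len(encoded)} bits vs. fixed-length: {8 * len(text)} bits")
```

For the skewed letter frequencies of "abracadabra" this comes to 23 bits against 88 for a naive 8-bits-per-character encoding, which is exactly the kind of saving the article is after.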
Part Two: Information is surprise
https://plus.maths.org/content/information-surprise
EXCERPT: It's not very often that a single paper opens up a whole new science. But that's what happened in 1948 when Claude Shannon published his A Mathematical Theory of Communication. Its title may seem strange at first — human communication is anything but mathematical. But Shannon wasn't thinking about people talking to each other. Instead, he was interested in the various ways of transmitting information long-distance, including telegraphy, telephones, radio and TV. It's that kind of communication his theory was built around.
Shannon wasn't the first to think about information. Harry Nyquist and Ralph Hartley had already made inroads into the area in the 1920s, but their ideas needed refining. That's what Shannon set out to do, and his contribution was so great that he has become known as the father of information theory....
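Behind the title: Shannon quantified the information of a single outcome as its surprisal, -log2 p, so rarer outcomes carry more bits, and entropy is the average surprisal over a whole distribution. A minimal Python sketch of the two formulas (the probabilities below are made-up examples):

```python
import math

def surprisal(p):
    """Information content of an outcome with probability p, in bits."""
    return -math.log2(p)

def entropy(dist):
    """Shannon entropy: the average surprisal of a distribution, in bits."""
    return sum(p * surprisal(p) for p in dist if p > 0)

# A fair coin is maximally surprising on average; a biased one less so.
print(surprisal(0.5))        # 1.0 bit
print(entropy([0.5, 0.5]))   # 1.0 bit
print(entropy([0.9, 0.1]))   # ~0.47 bits: the lopsided coin tells you less
```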
Part Three: Information is complexity
https://plus.maths.org/content/information-complexity
EXCERPT: How much information is there in this article? It's hard to tell. Its length clearly isn't an indicator because that depends on my choice of words. One thing you could do to measure information is to condense the article into bullet points. If you end up with many of them, then that's because the article is pretty dense. If there's only one, then the information it conveys is rather basic, or at least not very complex....
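The bullet-point thought experiment gestures at Kolmogorov complexity: the information in a text is the length of the shortest description that reproduces it. That quantity is uncomputable in general, but off-the-shelf compression gives a crude upper bound. A minimal Python sketch using zlib (the sample strings are illustrative, not from the article):

```python
import random
import zlib

def compressed_size(text):
    """Crude upper bound on a string's information content: the size
    in bytes of its zlib-compressed form (smaller = more regular)."""
    return len(zlib.compress(text.encode("utf-8"), level=9))

random.seed(0)
repetitive = "ab" * 500                                    # one obvious rule
noisy = "".join(random.choice("ab") for _ in range(1000))  # no obvious rule

print(compressed_size(repetitive))  # small: the pattern has a short description
print(compressed_size(noisy))       # larger: randomness resists compression
```

A text governed by a single simple rule compresses to almost nothing, just as an article that boils down to one bullet point does; a text with no pattern to exploit stays long, like an article dense with independent points.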