The science of information is the most influential, yet perhaps least appreciated field in science today. Never before in history have we been able to acquire, record, communicate, and use information in so many different forms. Never before have we had access to such vast quantities of data of every kind. This revolution goes far beyond the limitless content that fills our lives, because information also underlies our understanding of ourselves, the natural world, and the universe. It is the key that unites fields as different as linguistics, cryptography, neuroscience, genetics, economics, and quantum mechanics. And the fact that information bears no necessary connection to meaning makes it a profound puzzle that people with a passion for philosophy have pondered for centuries.
Table of Contents
LECTURE 1 The Transformability of Information
LECTURE 2 Computation and Logic Gates
LECTURE 3 Measuring Information
LECTURE 4 Entropy and the Average Surprise
LECTURE 5 Data Compression and Prefix-Free Codes
LECTURE 6 Encoding Images and Sounds
LECTURE 7 Noise and Channel Capacity
LECTURE 8 Error-Correcting Codes
LECTURE 9 Signals and Bandwidth
LECTURE 10 Cryptography and Key Entropy
LECTURE 11 Cryptanalysis and Unraveling the Enigma
LECTURE 12 Unbreakable Codes and Public Keys
LECTURE 13 What Genetic Information Can Do
LECTURE 14 Life's Origins and DNA Computing
LECTURE 15 Neural Codes in the Brain
LECTURE 16 Entropy and Microstate Information
LECTURE 17 Erasure Cost and Reversible Computing
LECTURE 18 Horse Races and Stock Markets
LECTURE 19 Turing Machines and Algorithmic Information
LECTURE 20 Uncomputable Functions and Incompleteness
LECTURE 21 Qubits and Quantum Information
LECTURE 22 Quantum Cryptography via Entanglement
LECTURE 23 It from Bit: Physics from Information
LECTURE 24 The Meaning of Information
A truly exceptional lecture series on information theory from one of John Wheeler's graduate students. Schumacher boils down the history of information to make it extremely digestible. You will be treated to the brilliance of one of my favorite people ever to live, Claude Shannon, as well as others in the field, and be fed information theory in a very digestible, bit-by-bit form. Schumacher gives you the usual history -- everything can be boiled down to logic gates -- and traces the understanding of information theory through the encoding, decoding, and compressing of information. He provides an extremely clear lecture -- one of the clearest I have heard to date -- on entropy as the average surprise of information. You will be taken all the way through to quantum computing and even hear about some debates Schumacher had with Wheeler.
This is one of those lecture series that I will not delete from my phone even though I have finished it, because I enjoyed it so much; I will probably listen again while doing yard work.
Let's imagine a regular multifaceted 3D object, where each face has a center. Project a perpendicular line from each center: we get vectors with unique destinations. Information theory (the 3D object), as formulated by Claude Shannon, laid the ground for wide-ranging and far-reaching indispensable applications (the vectors) that shape the information era we live in now.
Now let's stop imagining. Information theory, like relativity, chaos, and quantum mechanics, is one of the hallmark achievements of the 20th century and one of its most elegant theories. Claude Shannon, its father, was a genius no less than Albert Einstein (at least, I think so). There is hardly a problem in information science that was not initially addressed, albeit not always solved, by him; and, at first, he worked ALONE! because he thought that others would find his work too abstract and theoretical (seriously?).
Side, insignificant note: In the documentary commemorating him, The Bit Player, Shannon said that he used to see Einstein at Princeton, where they both worked for a time; he was fascinated by him and waved to him every time they passed each other. (Isn't it weird that geniuses and great scientists and thinkers tend to be spatially distributed in condensed geographical clusters? I imagine it could be related to paranormal or implicit information transmission!)
First of all, Shannon wondered: what is communication? As a faithful mathematician, he was not thinking about its meaning but its formal, abstract essence. He figured out that, regardless of the content, communication in general consists of: an information source, a transmitter, a channel (where noise may intrude), a receiver, and a destination.
This abstract sketch is why information is transformable (patterns can be carried by any means) and distinguishable (every message communicated is one particular message out of a set of many possible messages). So, how could we implement this? Since the smallest possible set of alternative messages has just two members, yes or no, he applied this to electrical switching elements (relays and, later, transistors) as communication channels bearing codes; this is digital information.
Moreover, he used entropy as a measure of the amount of information contained in a message (given that the possible messages are equally probable: the base-2 logarithm of the number of possible messages). However, the fascinating part is when the possible messages are not equally probable. He used probability: the less probable a message is, the more information it conveys. Accordingly, his first theorem used these mathematical concepts to devise codes that transmit the message efficiently by squeezing out redundancy. The second theorem is about how to code against noise to deliver a reliable message, because eliminating redundancy increases errors. So we have lower and upper limits that can be approached by block coding (Block coding? I try to stay away from the technical jargon used in the lectures for no reason other than the fact that I've forgotten it!) and other methods to reduce errors.
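The "base-2 logarithm" and "average surprise" ideas above can be sketched in a few lines of Python (the function names are mine, not the course's):

```python
import math

def surprisal(p):
    """Information conveyed by a message of probability p, in bits."""
    return -math.log2(p)

def entropy(probs):
    """Shannon entropy: the average surprise over all possible messages."""
    return sum(p * surprisal(p) for p in probs if p > 0)

# Four equally probable messages: log2(4) = 2 bits per message.
print(entropy([0.25] * 4))        # 2.0

# A rare message carries more information than a common one...
print(surprisal(0.01))            # ~6.64 bits
# ...and unequal probabilities pull the average below the equiprobable maximum.
print(entropy([0.7, 0.2, 0.1]))   # ~1.16 bits (vs. log2(3) ~ 1.58)
```

When the entropy falls below log2 of the number of messages, the gap is redundancy that a good code can squeeze out, which is what the first theorem promises.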
How are bits represented in different ways, such as sounds, pictures, and videos? These use perceptual coding, which exploits our inability to distinguish small differences in the frequencies of sounds and waves; this allows fewer bits to be used, partly by coding only the differences between samples. How is wireless, or analog, information transmitted? It is accomplished by detecting patterns of waves from the information source in the electromagnetic field... This illustrates how information itself is neither electrical nor energy-dependent; what requires those is the carrier of the information.
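The "coding only differences" trick mentioned above is the heart of delta encoding. Here is a toy, lossless Python sketch (real perceptual codecs additionally discard detail we cannot hear or see):

```python
def delta_encode(samples):
    """Store the first sample, then only the successive differences."""
    diffs = [samples[0]]
    for prev, cur in zip(samples, samples[1:]):
        diffs.append(cur - prev)
    return diffs

def delta_decode(diffs):
    """Rebuild the original samples by accumulating the differences."""
    samples = [diffs[0]]
    for d in diffs[1:]:
        samples.append(samples[-1] + d)
    return samples

audio = [100, 102, 101, 101, 104]    # a smoothly varying signal
encoded = delta_encode(audio)        # [100, 2, -1, 0, 3]
assert delta_decode(encoded) == audio
```

Because neighboring samples of sound or image data tend to be close in value, the differences are small numbers that need fewer bits to store than the raw samples.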
Information science has ubiquitous and important applications across many disciplines: cryptography (e.g., Bitcoin), genetics, neuroscience, thermodynamics, quantum mechanics, and even economics.
Its application in neuroscience quite fascinated me. Neuronal information is digital: the all-or-none law of the action potential. What is the code to this pattern? There are two: rate coding, where the message is the frequency of spikes within a given length of time; and temporal coding, where the message is the distribution of spikes in time. Then comes memory with its three types: sensory, short-term, and long-term. (This reminded me of Shannon's game: the mouse in the maze, one of the early thinking machines that learned by trial and error.)
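The two neural codes named above can be contrasted with a toy Python sketch (the spike times are invented for illustration):

```python
def firing_rate(spike_times, window):
    """Rate code: only the spike count per unit time carries the message."""
    return len(spike_times) / window

def interspike_intervals(spike_times):
    """Temporal code: the spacing between spikes carries the message."""
    return [round(b - a, 3) for a, b in zip(spike_times, spike_times[1:])]

# Two spike trains with identical rates but different timing patterns.
train_a = [0.01, 0.02, 0.03, 0.50]   # an early burst
train_b = [0.10, 0.25, 0.40, 0.55]   # evenly spread

# A rate code cannot tell them apart...
assert firing_rate(train_a, 1.0) == firing_rate(train_b, 1.0)

# ...but a temporal code can.
print(interspike_intervals(train_a))  # [0.01, 0.01, 0.47]
print(interspike_intervals(train_b))  # [0.15, 0.15, 0.15]
```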
These very informative and interesting lectures by Prof. Schumacher are intended for a general, non-specialist audience; he explains information science's fundamental ideas in a simple way, employing extensive examples and some mathematical and geometrical demonstrations that cover a wide variety of its central topics and uses. (Subjective aside: this was my first time seeing mathematics at work, and it was really important to me.)
I consider it one of the fundamental sciences that should be taught in schools, because it is a pity to use mobiles, computers, the internet, and TV and not know a little bit about the foundational concepts of how they work.
The Science of Information is a rather unusual course. Its subject matter is immensely important, yet it is not a topic addressed much in popular science books.
This must be at least partly because of the subject's difficult nature: plain-language discussions of information theory are more challenging than even the densest, most esoteric concepts of quantum physics.
The works of Claude Shannon and his successors have revolutionized how we live, and the professor unequivocally establishes the role played by their pioneering contributions. The lectures are uneven, mainly because the topics themselves are uneven: not all are equally amenable to non-mathematical description. Some concepts, like quantum cryptography and error correction, come across lucidly; other, simpler ones, on information entropy and data compression, less so.
Readers will have to work hard to get through the sessions, and the effort put in will have a significant bearing on how much one is able to understand. The work is essential and unique, but not easy; that is why it required a genius like Shannon to get humanity on the right path. For the curious, the book provides a great starting point. Most readers will need more books to gain a decent understanding of the concepts, so from that viewpoint this book is not complete. And that cannot be anything but a matter-of-fact observation, as no book on this topic could be.
It was way better than I expected: informative and interesting. Sometimes it is too hard to follow without any visual aid, but a comprehensive PDF accompanies it. I'll definitely come back to it later.
Another great book from "The Great Courses," and all the more rewarding for me, as I have been dedicated to information technology for many years.
It ranges from the most basic to the most complicated concepts: “The Science of Information: From Language to Black Holes” by Benjamin Schumacher.
Concepts as varied as language, the principles of computing, and how to measure information form the beginning of the course.
And then, once again, principles of physics, like the laws of thermodynamics and entropy.
Later: data compression, how to save images and sounds, how to avoid noise generated in communications. Bandwidths. Cryptography and codes.
And then other topics related to biology, DNA, and biological information.
And in the end, topics as varied as quantum information, black holes and the "disappearance of information."
I think the issue that got me thinking the most is how we will pass information to our descendants. Will they understand it? Could we understand, and be understood by, aliens if they exist?
The Science of Information: From Language to Black Holes by Benjamin Schumacher is the Great Courses at its best: complicated information conveyed simply by a professor who is in love with his subject. I learned quite a bit, and I often found myself thinking through problems and implications during many of the lectures. If there was one problem with the book, it would be that Schumacher goes a little too far in the direction of saying everything is information. Just because something can be expressed as information doesn't quite mean it follows information theory. Otherwise, it really was quite good, and it is a decent primer for anyone trying to get more involved in information technology. I could easily see this as a first- or second-year course in a CS program or an elective for most majors.
I didn't really enjoy this course at the beginning. I'm not sure if it was because Schumacher referenced several diagrams that I couldn't see because I was listening to it in my car, or if the subject matter just seemed a bit dull -- I really don't know. Regardless, it kicked in about midway through; and while downloading a PDF version of the aforementioned diagrams probably helped a bit, I think it was more that Schumacher proceeded to expand the scope of the science of information to include so many cool and cutting-edge ideas that the last several sessions became philosophical lessons on how, at the quantum level, matter might really just be information. Yeah, he goes there. Stick it out; it is worth it.
I think I was less interested in information science for its own sake, and so I didn't really give this a fair shot. I skipped most of the lectures, gathering just enough to discern the concept of reducing complex information down to "bits" of 1s and 0s, "on" and "off." I listened to a bit of the segment on cryptography, but really I downloaded the course because of the mention of black holes. I was hoping to get a lot more on how quantum theory (qubits) pertains to information, but there was only one lecture on that. Fortunately the same professor has another Great Course, on gravitational physics, so I'm going there next.
I've worked in IT for 15 years, majored in it, and been a techie and science geek my whole life. I did not know the science of information was even a field.
Many of the concepts were already familiar to me, but many others I'd never even heard of. Some of them are brain melting.
There was quite a bit of heavy math content in some sections, which is pretty rough to listen to while driving a car. And I still don't grasp quantum mechanics, despite multiple encounters with the subject by now. The course guide PDF would probably help patch up some of these lectures.
HOW THE F#*K IS THE INFORMATION THEORY OF ENTROPY SUPPOSED TO BE TRUE WHEN PARTICLES LOSE INFORMATION DUE TO QUANTUM UNCERTAINTY ALL THE TIME!!! :@ :s SERIOUSLY! IF QUANTUM EFFECTS PREVENT YOU FROM PREDICTING THE FUTURE STATE OF THE UNIVERSE BEYOND A CERTAIN LEVEL OF ACCURACY, THEY ALSO MAKE THE PAST UNRECOVERABLE TO THE SAME EXTENT!!!! :@ :@ :@
Also, the chapter on public-key cryptography left too many questions unanswered.
This was a really fascinating insight into information theory and a wonderful tour of such varied and interesting fields. However, while I did enjoy the series, I found that it was intended to be visualized: the lecturer consistently referenced visual material such as graphs, which made it very frustrating to consume as an audiobook. I was left imagining a lot of the formulas he was discussing, which made it really hard to listen to.
This is an excellent tour of an underappreciated subject. Information science underpins most of the big changes humanity is going through at the moment. It's quite mathsy, and I think it'll help to have some passing familiarity with the topic to really enjoy it. Something about this subject just makes me happy. It has the satisfaction of elegant mathematics with real-world implications behind every step of exposition. And the lecturer is very engaging and accomplished.
I was expecting a more philosophical approach to this theme; instead I got a literal science dump of information. There are a lot of formulae and mathematics involved, so if you are listening to this, you had better be focused and not doing it as a leisurely activity.
Don't get me wrong, it is well written/presented, and while I retained only a small fraction of it, someone with basic knowledge can get a lot more out of it.
I had no idea how connected it was to modern physics. I knew the bare bones just from being around Information Theory, but I had no idea that the theories were intertwined with the idea of entropy. I kept hearing about information in relation to popular physics, but didn't understand what that meant. This book has clarified a lot for me.
The math in these lectures is insane, and I have a degree in Computer Science. I'm giving this two stars because I did enjoy the non-brain-melting lectures that didn't involve long-winded logarithmic equations, such as the ones on DNA computers and the link between physics and information.
In general I was hoping for a 101 course and I got a 301.
This is a series of lectures about information theory that goes from the very basics to advanced concepts and their applications in physics and modern technology. For me it was a good review of details I had learned in my days in engineering school, many of which I had forgotten or which had become a bit foggy.
I'm loving information theory. It is not my area of expertise by any standard, so a lot of it flies over my head, but when I understand the insights shared, whether through other books and media or this course, I am deeply impressed. I recommend this course to anybody interested in the subject matter, but there are other sources that could offer the same or more.
Truly enjoyable! Not sure if information theory is normally taught like this, but he ties together the theories with narratives and makes the lectures memorable.
This is an excellent book that goes from broad knowledge to specific and technical pretty fast. It's a great read, but more oriented toward engineering- or science-minded people.