What is the essence of information? We explore the history of communication technology leading to the modern field of information theory. We’ll build up towards Claude Shannon’s measure of information entropy, one step at a time.
42 thoughts on “What is Information Theory? (Information Entropy)”
Don't worry, it was probably Twilight.
You should have put a screenshot of your youtube page in a bucket and dropped it on the scale. This is a form of exchanging information as well! (Recently became a big fan! Thanks for starting a new series)
I'm sure they found a bad book to rip up. No one over there is tearing up copies of Principia.
Maybe it WAS Principia then! haha
By the way, great job with this series; I learned a lot from the encryption episodes. Keep it up!!
These videos explain such mind-bogglingly complex ideas so much better than any textbook or lecture ever could. Thank you!
Amazing
I have a few technical issues with this video. The biggest:
* Pounds is not a unit of mass.
* Bits and (Shannon) entropy only make sense for discrete symbol sets. So, "information, no matter the form, can be measured using a fundamental unit" isn't really true (see the Wikipedia article on Shannon entropy; a sketch follows this list).
* Describing the alphabet as the "most powerful invention in human history" made me cringe. Cultural imperialism, much? Chinese writing system, for example.
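On that second point: a minimal sketch of Shannon entropy over a discrete symbol set, in standard-library Python (the function name and example distributions are illustrative, not from the video):

```python
import math

def shannon_entropy(probabilities):
    """H = -sum(p * log2 p) in bits, defined for a discrete distribution."""
    return -sum(p * math.log2(p) for p in probabilities if p > 0)

print(shannon_entropy([0.5, 0.5]))  # 1.0 bit: a fair coin
print(shannon_entropy([0.9, 0.1]))  # ~0.469 bits: a biased coin surprises less
```

The definition presupposes a countable set of symbols with probabilities attached, which is exactly why extending the claim to "information, no matter the form" needs qualification.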
Two months now; hopefully there is more to come.
well I certainly want to know more now lol
Watching the series from the beginning again.
Such a great series. Thank you.
Talking drums
I'm rediscovering youtube thanks to KHANACADEMY and everyone involved in this awesome project !
Who else is with me ?!
me too!
fuck man shit was deep, love these, i am high after watching the first video…
The entropy of the phrase "Language of Coins" is much higher for a newcomer than the author intends. I would have expected an explanation of the meaning of this phrase in part 1.
Nice video. But does information theory depend on binary logic?
Shouldn't it work irrespective of the representation system?
The definition should be broader to accommodate non-binary representations.
What if we start using quantum computers, where each bit has multiple states?
ditto that
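On the binary question: nothing in the definition requires base 2; changing the logarithm base only changes the unit (bits, trits, nats). A minimal sketch under the standard definition (the helper name `entropy` is illustrative):

```python
import math

def entropy(probabilities, base=2):
    """Entropy of a discrete distribution; the log base sets the unit:
    base 2 -> bits, base 3 -> trits, base e -> nats."""
    return -sum(p * math.log(p, base) for p in probabilities if p > 0)

dist = [0.5, 0.25, 0.25]
print(entropy(dist, base=2))       # 1.5 bits
print(entropy(dist, base=3))       # ~0.946 trits (same information, different unit)
print(entropy(dist, base=math.e))  # ~1.040 nats
```

Qubits are a genuinely different case: quantum information theory handles them with the von Neumann entropy rather than a simple relabeling of the states.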
Information theory does not indicate how much information is in a page or a book; it indicates the maximum amount of information that can *potentially* be embedded in a medium, whether DNA or a bit-stream. This is like temperature or mass in that such attributes indicate one very narrow and averaged attribute of matter; many more attributes are necessary to say what that matter actually is, and how it interacts with other matter. The physics of the medium, and what its states represent, matter.
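A back-of-the-envelope sketch of that "potential versus actual" distinction, with hypothetical numbers (3,000 characters per page, a 64-symbol alphabet):

```python
import math

chars_per_page = 3000   # hypothetical page size
alphabet_size = 64      # hypothetical symbol set

# Ceiling: each character can carry at most log2(alphabet_size) bits,
# so this is what the page could hold, not what any real text conveys.
max_bits = chars_per_page * math.log2(alphabet_size)
print(max_bits)  # 18000.0
```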
Great. I look forward to it.
Cheers.
How neatly done! Thanks for making this great video.
Lost me at "measure of surprise." I don't know why he said that; it just pops up in the middle of things, and I have no idea what it's connected to. It comes out of nowhere and then disappears. This is why I can't learn from lectures or videos: I need to stop the person and ask questions. Learning has to be student-led; otherwise it's just infotainment and resume padding for the producers.
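For anyone else thrown by that phrase: "surprise" here is shorthand for self-information, I(x) = -log2 p(x), so rarer events carry more bits. A minimal sketch of the idea (the examples are illustrative, not from the video):

```python
import math

def surprisal(p):
    """Self-information of an event with probability p, in bits."""
    return -math.log2(p)

print(surprisal(0.5))   # 1.0 bit: a fair coin landing heads
print(surprisal(1/36))  # ~5.17 bits: double sixes, rarer so more surprising
print(surprisal(1.0))   # 0.0 bits: a certain event carries no surprise
```

Entropy is then just the average of this surprisal over the whole distribution.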
Entropy = Information | Objections
Ever since 1948, with the publication of Shannon's paper, there has been growth in the assumed equivalence of heat-engine entropy and the entropy of a message, as well as growth in the objections to this point of view. In 1999, to cite one example, American chemistry professor Frank Lambert, who for many years taught a course for non-science majors called "Enfolding Entropy" at Occidental College in Los Angeles, stated that another major source of confusion about entropy change as the result of simply rearranging macro objects comes from the information-theory "entropy" of Claude Shannon. [12] In Shannon's 1948 paper, as discussed, the word "entropy" was adopted at the suggestion of von Neumann. This step, according to Lambert, was "wryly funny for that moment," but "Shannon's unwise acquiescence has produced enormous scientific confusion due to the increasingly widespread usefulness of his equation and its fertile mathematical variations in many fields other than communications". [13]
According to Lambert, "certainly most non-experts hearing of the widely touted information entropy would assume its overlap with thermodynamic entropy. However, the great success of information 'entropy' has been in areas totally divorced from experimental chemistry, whose objective macro results are dependent on the behavior of energetic microparticles. Nevertheless, many instructors in chemistry have the impression that information 'entropy' is not only relevant to the calculations and conclusions of thermodynamic entropy but may change them." This impression, according to Lambert, is mistaken. [12] In sum, according to Lambert, information "entropy" in all of its myriad non-physicochemical forms, as a measure of information or abstract communication, has no relevance to the evaluation of thermodynamic entropy change in the movement of macro objects, because such information "entropy" does not deal with microparticles whose perturbations are related to temperature. Even very competent chemists and physicists have become confused when they have melded or mixed information "entropy" into their consideration of physical thermodynamic entropy, as shown by the results in textbooks and by the lectures of professors found on the Internet.
In the 2007 book A History of Thermodynamics, for instance, German physicist Ingo Müller summarizes his opinion on the matter of von Neumann's naming suggestion:
"No doubt Shannon and von Neumann thought that this was a funny joke, but it is not, it merely exposes Shannon and von Neumann as intellectual snobs. Indeed, it may sound philistine, but a scientist must be clear, as clear as he can be, and avoid wanton obfuscation at all cost. And if von Neumann had a problem with entropy, he had no right to compound that problem for others, students and teachers alike, by suggesting that entropy had anything to do with information."
Müller clarifies the matter by stating that "for level-headed physicists, entropy (or order and disorder) is nothing by itself. It has to be seen and discussed in conjunction with temperature and heat, and energy and work. And, if there is to be an extrapolation of entropy to a foreign field, it must be accompanied by the appropriate extrapolations of temperature, heat, and work." [11]
——————————————————————————————————————-
Exactly. So it is an untenable extrapolation of ideas that mean something different in another field. And from that, even wilder extrapolations by others to far broader ideas of man and his concerns. The root untenability is compounded exponentially. Wikipedia does not reveal much, but a searching, sceptical article critical of IIT is the one by Scott Aaronson at
http://www.scottaaronson.com/blog/?p=1799
and a full treatment in
http://www.scientificamerican.com/article/a-theory-of-consciousness/?page=1
in both cases with some very searching and intelligent critiques by a variety of well-informed and deep-thinking people.
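For reference, the two formulas whose formal resemblance fuels this whole dispute, in their standard textbook forms (not taken from the quoted passages):

```latex
% Boltzmann's thermodynamic entropy: k_B times the log of the number
% of microstates W compatible with the macrostate.
S = k_B \ln W

% Shannon's information entropy: the expected surprisal of a symbol
% drawn from a discrete distribution p_1, \dots, p_n.
H = -\sum_{i=1}^{n} p_i \log_2 p_i
```

For W equiprobable outcomes the second reduces to H = log2 W, so the two differ only by a constant and a change of log base; Lambert's and Müller's point is that this formal parallel by itself says nothing about temperature, heat, or work.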
Great video, you hooked me into watching more, good job!
Excellent job, congratulations!
The alphabet is the most powerful invention in human history? The Chinese beg to differ.
The moment at 2:17, when you tear the page from the book, would have made me burst into tears as a child. I am surely not the only nerd who was taught as a child to treat books as sacrosanct, to see tearing a page from a book on the moral level of tearing a limb from a living animal. Even now, the scene made the blood drain from my face. Much as I overall like this video, and the others in the series I have seen so far, I could not in good conscience show it to a child with that scene in it.
These videos are absolutely amazing! Thanks so much for sharing this!
Mind blown!!! Thanks for the upload
Call me coinage, for I be alpha-betting, Kaizen. ;D nice intro
The bottom line is that information requires agency 100% of the time, no exceptions: period. At 1:25 in this video the line between matter/energy and information is falsely bridged. The video falsely claims at 1:50 that, no matter the form of communication, it can be measured in units. The fundamental particle of information is not a heads or tails, one or zero, on or off, true or false, light or dark, or whatever attribute one decides to use in parsing information. Information, in part or whole, is an attribute of mind, 100% of the time, no exceptions. To analyze or use information requires agency. The information in our DNA required agency to place it there, and information theory cannot bridge the infinite gap between crystallography and biology; Shannon and Yockey be damned.
Interesting video, but if this is a series, don't you think that numbers giving us the order in which these videos should be played would be good information to have?
What a beautiful and entirely inaccurate description
"But first, we need to talk about parallel universes."
entropy is the meaning of the universe.. of it all
what we know as the universe.. all existence.. is from one entropy point.
Thanks. By the way, it felt like I was watching howtobasic for a couple of seconds.
This explains little to nothing about entropy. It's only clear to those who have already been confused by it for years, not to me, who never thought about it and am just starting to grasp its meaning. Sorry, but the explanation is so fast that it couldn't hold my attention, and it isn't a work of art, because it just left me confused.
Typical academic approach. I do it the opposite way: I start with the nuts and bolts. If I write "Anna banana golf god woods" and then write "i am a dirty boy"… the first sentence contains more information: high entropy. The second sentence contains less information. And how do you decide? The sentence that raises the most new questions contains the most information. "I am a dirty boy" may mean being a pervert or something. "Anna banana golf god woods" raises a lot more new questions; therefore it carries more information.
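Worth noting that Shannon's measure is narrower than the "raises more questions" test: it depends only on symbol statistics, not on meaning. A minimal sketch that estimates per-character entropy of the two example strings (estimates from tiny samples, so treat the numbers loosely):

```python
from collections import Counter
import math

def char_entropy(text):
    """Empirical per-character Shannon entropy of a string, in bits."""
    counts = Counter(text)
    n = len(text)
    return -sum((c / n) * math.log2(c / n) for c in counts.values())

print(char_entropy("Anna banana golf god woods"))  # the nonsense string
print(char_entropy("i am a dirty boy"))            # the ordinary sentence
```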
What is the essence of information?
The essence of information is a single electron
(and no entropy)
Is it possible to store information into a single electron?
Of course a single electron is a keeper of information.
Why?
An electron has six (6) formulas:
E = hf and e² = αℏc,
+E = Mc² and -E = Mc²,
E = -me⁴/2ℏ² = -13.6 eV, and E = …
and obeys five (5) laws:
a) The Law of conservation and transformation energy/ mass
b) The Heisenberg Uncertainty Principle / Law
c) The Pauli Exclusion Principle/ Law
d) Fermi-Dirac statistics
e) Maxwell/Lorentz EM law
It means that in different interactions an electron must know six different formulas and must obey five laws. To behave in such different conditions, a single electron must itself be a keeper of information.
===============