This is very cool but the use of the terms "Information Entropy" together as if they were a separate thing is maybe the furthest that any "ATM-machine"-type phrase has rustled my jimmies.
It is a cool article, just, wowzers, what a phrase.
They're two names which reference the same concept, but Shannon entropy is not the only type of entropy. I mean, maybe everything really is just Shannon entropy if you look deeply enough and have a sufficiently powerful model of physics, but we're not there today.
I mean the term "information entropy". Both of those words are generally used interchangeably to mean the same thing (including the different forms of entropy, as I understand it :') ) <3
There are other kinds of 'entropy' with distinct lineages and contexts from 'information entropy', but information is arguably the basis for everything physical, so if you squint a bit you can make any non-information entropy look like information entropy.
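For anyone following along: the "Shannon entropy" being discussed is just H(X) = −Σ p·log₂(p), measured in bits. A minimal sketch (function name is mine, not from the article):

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H(X) = -sum(p * log2(p)) in bits.

    Terms with p == 0 contribute nothing (lim p*log p -> 0),
    so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally uncertain: exactly 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A biased coin is more predictable, hence lower entropy.
print(shannon_entropy([0.9, 0.1]))   # ~0.469
```

The thermodynamic (Boltzmann/Gibbs) formula has the same shape, which is exactly why the two lineages keep getting conflated.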