
This is very cool but the use of the terms "Information Entropy" together as if they were a separate thing is maybe the furthest that any "ATM-machine"-type phrase has rustled my jimmies.

It is a cool article, just, wowzers, what a phrase.



are signal and noise the same thing?


grabs water spritzer

Stop that, now. Bad cat.


Yes, also no.


To be fair, I think "information entropy" is more self-descriptive than "Shannon entropy".


Corporate has asked you to find the differences between these two pictures.


They're two names that refer to the same concept, but Shannon entropy is not the only type of entropy. I mean, maybe everything really is just Shannon entropy if you look deeply enough and have a sufficiently powerful model of physics, but we're not there today.
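For anyone who hasn't seen the definition: Shannon entropy measures the average surprise of a distribution, H = -Σ p log2(p), in bits. A minimal sketch computing it from the empirical symbol frequencies of a string (function name and approach are mine, not from the thread):

```python
from collections import Counter
from math import log2

def shannon_entropy(data):
    """Shannon entropy (in bits) of the empirical symbol distribution of `data`."""
    counts = Counter(data)
    n = len(data)
    # H = -sum(p * log2(p)) over observed symbols; unseen symbols contribute 0.
    return -sum((c / n) * log2(c / n) for c in counts.values())

# A uniform distribution over two symbols carries exactly 1 bit per symbol.
print(shannon_entropy("aabb"))  # → 1.0
```

A string of one repeated symbol has entropy 0: there is no uncertainty, hence no information per symbol.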


I mean the term "information entropy". Both of those words are generally used interchangeably to mean the same thing (including the different forms of entropy, as I understand it). <3 :'))))


Oh, no, I was talking about e.g. https://en.m.wikipedia.org/wiki/Entropy_(classical_thermodyn...

There are other kinds of 'entropy' with distinct lineages and contexts from 'information entropy', but information is arguably the basis for everything physical, so if you squint a bit you can make any non-information entropy look like information entropy.
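The squint can be made concrete. The Gibbs form of thermodynamic entropy and Shannon's formula have the same shape over a probability distribution, differing only by Boltzmann's constant and the base of the logarithm (a standard observation, not something the thread derives):

```latex
S_{\text{thermo}} = -k_B \sum_i p_i \ln p_i
\qquad
H_{\text{Shannon}} = -\sum_i p_i \log_2 p_i
```

So thermodynamic entropy over microstates is, up to a constant factor $k_B \ln 2$, Shannon entropy measured in bits.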


Yes, that's where the term came from. They're (effectively) the same thing, as I understand it.


I mean, it's obviously not relevant here, but technically there is also thermodynamic (heat) entropy, isn't there? It's not always information.



