Physicist: The term “entropy” shows up in both thermodynamics and information theory, so (since thermodynamics called dibs) I’ll call thermodynamic entropy “entropy” and information-theoretic entropy “information”. Entropy is extensive: if one bucket has entropy E, you’d like two buckets to have entropy 2E. For example, water expands by a factor of around 1,000 when it boils, and its entropy increases roughly 1,000-fold. That’s why it’s easy to boil water in a pot (it increases entropy) and difficult to condense water in a pot (it decreases entropy). The bridge between information and entropy lies in how hard it is to describe a physical state or process.
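The “one bucket has entropy E, two buckets have 2E” property can be checked directly on the information side: for two independent systems, Shannon entropy is additive. Here is a minimal sketch (the four-state toy distribution is just an illustrative choice, not anything from the physics above):

```python
from itertools import product
from math import log2

def shannon_entropy(probs):
    """Shannon entropy in bits: H = -sum(p * log2(p))."""
    return -sum(p * log2(p) for p in probs if p > 0)

# One "bucket": a toy 4-state system with unequal probabilities.
bucket = [0.5, 0.25, 0.125, 0.125]
H1 = shannon_entropy(bucket)

# Two independent buckets: each joint state's probability is the
# product of the individual states' probabilities.
two_buckets = [p * q for p, q in product(bucket, bucket)]
H2 = shannon_entropy(two_buckets)

print(H1)  # 1.75 bits
print(H2)  # 3.5 bits -- exactly double
```

The doubling isn’t a coincidence: log of a product is a sum, so independent systems contribute their entropies additively, which is exactly the extensivity we want from thermodynamic entropy.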