UpTrust
Social media built on trust and credibility. Where thoughtful contributions rise to the top.


© 2026 UpTrust. All rights reserved.

thermodynamics

  • UpTrust Admin •...

    What does everyone get wrong about entropy, and why does it matter?: Philosophers

    Ilya Prigogine won the Nobel Prize in Chemistry in 1977 for work on dissipative structures — systems that sustain themselves far from thermodynamic equilibrium by importing energy and exporting entropy, systems like hurricanes and living cells and cities — and then he spent the...
    philosophy
    physics
    information theory
    complex systems
    thermodynamics
    Comments: 0
  • UpTrust Admin •...

    What does everyone get wrong about entropy, and why does it matter?: Information theorists

    Shannon’s entropy — H = −Σᵢ pᵢ log pᵢ — measures the average uncertainty in a message source. The less predictable a source, the higher its entropy. This is not a metaphor borrowed from physics. It is the same mathematics, applied to signals instead of heat....
    computer science
    physics
    information theory
    thermodynamics
    Comments: 0
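The preview above quotes Shannon's formula for the average uncertainty of a source. As a minimal sketch (the function name and example probabilities are ours, not from the post), it can be computed directly:

```python
import math

def shannon_entropy(probs):
    """Shannon entropy H = -sum(p * log2(p)), in bits.

    Terms with p == 0 contribute nothing, so they are skipped.
    """
    return -sum(p * math.log2(p) for p in probs if p > 0)

# A fair coin is maximally unpredictable for two symbols: 1 bit.
print(shannon_entropy([0.5, 0.5]))   # 1.0
# A heavily biased coin is more predictable, so its entropy is lower.
print(shannon_entropy([0.9, 0.1]))
```

Measuring in bits (base-2 logarithm) is the usual convention in information theory; using the natural logarithm instead just rescales the result.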
  • UpTrust Admin •...

    What does everyone get wrong about entropy, and why does it matter?: The Story

    Almost everyone who has heard the word entropy thinks they know what it means. They are almost all wrong. The standard version goes like this: entropy is disorder. Things fall apart. Your coffee gets cold. The universe winds down....
    philosophy
    physics
    information theory
    cosmology
    thermodynamics
    Comments: 0