熵 (shāng) - Entropy — a concept in thermodynamics and information theory

Tone 1

shāng | 15 strokes | radical: 火

Entropy — a concept in thermodynamics and information theory measuring disorder, randomness, or uncertainty in a system.

Usage highlights

Entropy increase · Information entropy · Entropy reduction · Entropy law · Thermodynamic entropy · Entropy generation

Usage & contexts

Examples

  • The second law of thermodynamics states that entropy increases (熵增).
  • Information entropy measures data uncertainty (信息熵); a computed example follows this list.
  • High entropy systems are more disordered (高熵系统).
  • Entropy is a key concept in statistical mechanics (统计力学).
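
The information-entropy example above can be made concrete with a short computation. A minimal sketch in Python of Shannon's measure (the function name shannon_entropy and the sample distributions are illustrative assumptions, not part of the entry):

  import math

  def shannon_entropy(probabilities):
      # Shannon entropy in bits: H = -sum over p of p * log2(p),
      # skipping zero-probability outcomes (their contribution is 0).
      return -sum(p * math.log2(p) for p in probabilities if p > 0)

  # A fair coin is maximally uncertain for two outcomes: 1.0 bit.
  print(shannon_entropy([0.5, 0.5]))   # 1.0
  # A biased coin is more predictable, so its entropy is lower.
  print(shannon_entropy([0.9, 0.1]))   # ~0.47

Uniform distributions maximize entropy, which matches the sense of 熵 as disorder or uncertainty: the less predictable the outcome, the higher the value.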

Collocations

  • Entropy increase (熵增)
  • Information entropy (信息熵)
  • Entropy reduction (熵减)
  • Entropy law (熵定律)
  • Thermodynamic entropy (热力学熵)
  • Entropy generation (熵产)

Idioms

  • Entropy increase principle (熵增原理)
  • Law of entropy increase (熵增定律)
  • Information entropy theory (信息熵理论)

Cultural background

  • The character 熵 was coined in the 20th century specifically for scientific use, representing a modern Chinese character creation.
  • It combines the 'fire' radical 火, marking the thermal context, with 商 (shāng), which supplies the pronunciation; 商 also means 'quotient' in mathematics, echoing entropy's definition as the quotient of heat over temperature.
  • Entropy represents a fundamental concept in physics describing the arrow of time and irreversibility of natural processes.
  • In information theory, entropy quantifies the amount of uncertainty or information content in data; the standard formulas are summarized below.
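
For reference, the three standard definitions these points rest on, in conventional LaTeX notation (k_B is Boltzmann's constant, W the number of microstates, p_i the outcome probabilities):

  dS = \frac{\delta Q_{\mathrm{rev}}}{T}    (Clausius, thermodynamic entropy)
  S = k_B \ln W                             (Boltzmann, statistical mechanics)
  H(X) = -\sum_i p_i \log_2 p_i             (Shannon, information entropy)

The Clausius form is the quotient of heat over temperature alluded to in the character's construction, and the Boltzmann form underlies the statistical-mechanics usage in the examples above.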
