Entropy

Entropy - Thermodynamics

What is entropy? Entropy (S) is a thermodynamic quantity originally defined as a criterion for predicting the evolution of thermodynamic systems.

Entropy is an extensive state function. In an isolated system, the value of entropy grows in the course of any process that occurs naturally. Entropy describes how irreversible a thermodynamic process is.

The word entropy comes from the Greek and means evolution or transformation.

Entropy in the world of physics

In physics, entropy is the thermodynamic magnitude that measures the part of the heat energy that cannot be used to produce work. Physical entropy, in its classical form, is defined for a reversible process by the equation

$$dS = \frac{\delta Q_{\text{rev}}}{T}$$

or, integrated between two states 1 and 2,

$$\Delta S = \int_1^2 \frac{\delta Q_{\text{rev}}}{T}$$
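As a concrete illustration of this definition, here is a minimal sketch (in Python, with made-up values) that numerically integrates δQ/T for a body of constant heat capacity heated reversibly from T1 to T2, and compares the result with the closed form C ln(T2/T1) that the integral yields in this particular case:

```python
import math

def entropy_change_heating(C, T1, T2, steps=100_000):
    """Numerically integrate dS = dQ/T for reversible heating.

    For a body with constant heat capacity C, each small temperature
    step dT absorbs dQ = C * dT, so dS = C * dT / T.
    """
    dT = (T2 - T1) / steps
    S = 0.0
    T = T1
    for _ in range(steps):
        S += C * dT / (T + dT / 2)  # midpoint rule
        T += dT
    return S

# Illustrative values: roughly 1 kg of water (C ≈ 4186 J/K), heated 300 K -> 350 K.
C = 4186.0
numeric = entropy_change_heating(C, 300.0, 350.0)
closed_form = C * math.log(350.0 / 300.0)  # ΔS = C ln(T2/T1)
print(f"numeric: {numeric:.2f} J/K, closed form: {closed_form:.2f} J/K")
```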

or, more simply, if the temperature remains constant in the process 1 → 2 (isothermal process):

$$\Delta S = \frac{Q}{T}$$

Thus, if a hot body at temperature T1 loses a quantity of heat Q1, its entropy decreases by Q1 / T1. If it gives this heat to a cold body at temperature T2 (lower than T1), the entropy of the cold body increases by more than the entropy of the hot body has decreased, because

$$\frac{Q_1}{T_2} > \frac{Q_1}{T_1} \qquad (T_2 < T_1)$$

so the entropy of the system as a whole increases.
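A short numerical check of this entropy bookkeeping, assuming two reservoirs large enough that their temperatures stay constant while the heat flows (the temperatures and Q1 below are arbitrary illustrative values):

```python
# Heat Q1 flows from a hot reservoir at T1 to a cold reservoir at T2 < T1.
# Reservoir temperatures are assumed constant (illustrative values only).
T1, T2 = 500.0, 300.0   # kelvin
Q1 = 1000.0             # joules

dS_hot = -Q1 / T1       # the hot body loses entropy Q1 / T1
dS_cold = Q1 / T2       # the cold body gains entropy Q1 / T2
dS_total = dS_hot + dS_cold

print(f"hot: {dS_hot:.2f} J/K, cold: {dS_cold:+.2f} J/K, total: {dS_total:+.2f} J/K")
# dS_total > 0: the cold body gains more entropy than the hot body loses.
```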

A reversible machine can, therefore, transform a part of this heat energy into work, but not all.

The performance of the reversible machine (which is the maximum that any machine can give) is:

$$\eta = 1 - \frac{T_2}{T_1}$$

For all the heat energy to be transformed into work, either the hot reservoir would have to be at an infinite temperature or the cold reservoir at zero kelvin; in any other case, the thermodynamic performance of the reversible machine is less than 1.
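To see how the performance behaves at these limits, here is a minimal sketch that evaluates η = 1 − T2 / T1 for a few reservoir temperature pairs (arbitrary values chosen for illustration):

```python
def carnot_efficiency(T_hot, T_cold):
    """Maximum (reversible) performance between two reservoirs, temperatures in kelvin."""
    return 1.0 - T_cold / T_hot

# Illustrative reservoir pairs (arbitrary values):
for T_hot, T_cold in [(400.0, 300.0), (1000.0, 300.0), (1e9, 300.0), (400.0, 1.0)]:
    eta = carnot_efficiency(T_hot, T_cold)
    print(f"T1 = {T_hot:>10.0f} K, T2 = {T_cold:>5.0f} K -> eta = {eta:.4f}")
# eta approaches 1 only as T_hot -> infinity or T_cold -> 0 K.
```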

The expression of entropy is a logical consequence of the second law of thermodynamics and of the way in which temperature is measured.

The second law of thermodynamics says that, if no work is consumed, heat always passes from hot bodies to cold bodies, whether directly by conduction or through any machine.

The temperature must be measured on a thermodynamic scale; otherwise, the expression of entropy is not as elegant and depends on the thermometric substance used to build the thermometer. When defining the thermodynamic temperature scale, there is a degree of freedom that can be chosen arbitrarily. If it is imposed that there are 100 degrees between the freezing and boiling temperatures of water, the Kelvin scale is obtained, and it turns out that the freezing temperature of water must be 273 K.
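Under that convention, converting between the Celsius and thermodynamic (Kelvin) scales is just a fixed offset; a trivial sketch (using the precise offset of 273.15 K, which the article rounds to 273 K):

```python
def celsius_to_kelvin(t_celsius):
    """Shift by the freezing point of water on the Kelvin scale (273.15 K)."""
    return t_celsius + 273.15

# Freezing and boiling points of water are 100 degrees apart on both scales.
print(celsius_to_kelvin(0.0))    # 273.15
print(celsius_to_kelvin(100.0))  # 373.15
```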

Entropy and energy

Assuming that the whole universe is an isolated system, that is, a system that cannot exchange matter or energy with its surroundings, the first and second laws of thermodynamics can be summarized as follows: "The total energy of the universe is constant, and the total entropy increases continuously until it reaches equilibrium." This means that energy can be neither created nor destroyed, nor can it be completely transformed from one form to another without a part being dissipated in the form of heat.

History of entropy

The concept of entropy was developed in response to the observation of a certain fact: part of the energy released in combustion reactions is always lost through dissipation or friction. This lost energy is not transformed into useful work.

The first thermal engines, such as Thomas Savery's steam pump (1698), the Newcomen engine (1712) and Cugnot's three-wheeled steam wagon (1769), were inefficient: of the input energy, only 2% was converted into useful energy. A large amount of useful energy was dissipated or lost in what seemed like a state of immeasurable randomness. Over the next two centuries, physicists investigated this enigma of lost energy; the result of these studies led scientists to the concept of entropy.

The physicist Rudolf Clausius was the first to introduce it, in 1865. Since then, several definitions of entropy have appeared.


Last review: October 20, 2017
