The entropy formula

Tamás Sándor Biró, Vice Director at the Wigner Research Centre for Physics, discusses the current status of entropy formula research

The word “entropy” was coined in 1865 by Rudolf Clausius, a German professor of physics, as a deliberate analogy to “energy”: from the Greek “en-”, meaning in or towards, and “tropē”, a turning or transformation. It describes motion in an abstract parameter space, a trend, establishing the asymmetry between past and future. Entropy is an integral quantity characterising the totality of the motion, much as “profit” characterises the outcome of several complex processes in a single number. The second law of thermodynamics dictates that the total entropy of a closed system never decreases.

Entropy and evolution

Here, the restriction to closed systems is important. The Earth, and the evolution of life upon it, is not a closed system; nor is society. In energy technology, owing to the interpretation of heat as molecular motion, entropy revealed itself as a concept describing complexity in general. Ludwig Boltzmann’s permutation entropy is the logarithm of the number of interchanges on the microscopic level that leave the macroscopic appearance unchanged. This lends entropy a deep meaning: it makes equilibrium the optimal state of big systems built from many simple elements by simple rules. Our own contemporary research reveals that entropy is also useful in describing processes with dynamic equilibria, where detailed balance – the equality of rates of microscopic changes – fails, but a total balance of pluses and minuses is maintained. The statistics of such dynamic processes stand the test of time, as they deliver stationary probability density functions.
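In formulas – quoted here in standard textbook notation rather than from the research itself – the permutation count W and its probabilistic (Gibbs–Shannon) counterpart read

S = k_B \ln W, \qquad S = -k_B \sum_i p_i \ln p_i ,

where the second expression reduces to the first when all W microscopic arrangements are equally probable, p_i = 1/W.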

Chaos

Entropy is related to chaos in three ways: i) an entropy formula expresses how the value of entropy depends on the probabilities of alternative states; ii) the growth of entropy while elementary quantities change back and forth reflects its convexity; and iii) chaotic motion is a genuine producer of entropy. Chaos poses limitations on predictability; the most cited example is the weather changed by a butterfly. Financial markets are comparable to the capricious behaviour of our atmosphere.
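As a toy illustration of points (i) and (iii) – a minimal sketch of our own, not code from the research described here – one can coarse-grain the orbit of the logistic map into bins and watch the formula S = -Σ p_i ln p_i respond to the onset of chaos; the map, the bin count and the parameter values are illustrative choices:

import numpy as np

def shannon_entropy(p):
    """Boltzmann-Gibbs-Shannon entropy -sum p_i ln p_i (with 0 ln 0 := 0)."""
    p = p[p > 0]
    return -np.sum(p * np.log(p))

def logistic_orbit(r, x0=0.37, n=100_000, burn=1_000):
    """Iterate x -> r x (1 - x), discarding an initial transient."""
    x = x0
    for _ in range(burn):
        x = r * x * (1.0 - x)
    xs = np.empty(n)
    for i in range(n):
        x = r * x * (1.0 - x)
        xs[i] = x
    return xs

for r in (3.2, 4.0):  # periodic regime vs fully developed chaos
    counts, _ = np.histogram(logistic_orbit(r), bins=50, range=(0.0, 1.0))
    p = counts / counts.sum()
    print(f"r = {r}: coarse-grained entropy = {shannon_entropy(p):.3f} nats")

The periodic orbit occupies only a couple of bins (entropy near ln 2), while the chaotic one spreads over almost all of them.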

Exactly at the edge of chaos, where a dynamical system is just becoming chaotic, the classical logarithmic formula for entropy fails. It also fails when the system under study is too small. Such “finiteness effects” are light-heartedly neglected by theorists who discuss atomic systems. However, if one intends to generalise the concept of entropy to more complex systems – such as random networks where “small world” effects occur – then such modifying terms can no longer be neglected.

Generalised entropy

Indeed, entropy and its very formula have long been studied by many researchers: both the general validity of the concept and its generalised forms built from probabilities have been, and remain, objects of long-term study. Any new suggestion must contain the classical formula as a special case, and it must be clarified which of the original assumptions is being relaxed. Beyond this, a demonstration in the real world is required. We do not wish to construct a whole new game with new rules, but to understand how Nature and human societies function.

A well-known generalisation of Boltzmann’s formula is Alfréd Rényi’s suggestion. Rényi constructed his formula to satisfy additivity, but it is not an expectation value. Remarkably, a transformation of this formula leads to an expectation value, although no longer an additive one. This result, due to Constantino Tsallis in 1987, contains a further parameter, q, beyond the probabilities, and is named the q-entropy. For q = 1, the Boltzmann formula is recovered.
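In standard notation, with probabilities p_i and a real parameter q, the two formulas read

S_q^{R} = \frac{1}{1-q} \ln \sum_i p_i^q \quad (\text{Rényi}), \qquad S_q^{T} = \frac{1}{q-1} \Big( 1 - \sum_i p_i^q \Big) \quad (\text{Tsallis}),

and both reduce to the Boltzmann–Gibbs–Shannon form -\sum_i p_i \ln p_i in the limit q \to 1.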

So, which is more important: additivity or being an expectation value? In fact, more complicated formulas with even more adjustable parameters have also been suggested, and some authors argue that their formula is more natural than those of others. So how should we find concepts leading beyond the entropy formula – concepts which guide us in classifying complexity itself?
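The trade-off can be stated precisely. For two independent subsystems A and B, the Rényi entropy is additive, while the Tsallis q-entropy composes with a cross term:

S_q^{R}(A{+}B) = S_q^{R}(A) + S_q^{R}(B), \qquad S_q^{T}(A{+}B) = S_q^{T}(A) + S_q^{T}(B) + (1-q)\, S_q^{T}(A)\, S_q^{T}(B).

Conversely, the Tsallis form is the expectation value of the deformed logarithm \ln_q x = (x^{1-q}-1)/(1-q) evaluated at 1/p_i, a property the Rényi formula lacks.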

Entropic distance

When deciding between alternative strategies, entropy differences are important. Two probability density functions (PDFs), which help us in deliberating our actions, can be closer to or further from each other: their “distance” is described by an entropic divergence formula. The entropy itself is just the distance of the PDF under consideration from the uniform distribution. And here lies the connection to informatics: whoever must calculate with a uniform distribution reveals maximal ignorance – the maximal lack of information about the problem being estimated.
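For n states, taking the Kullback–Leibler divergence as the entropic distance turns this statement into a one-line identity:

D(p \,\|\, u) = \sum_i p_i \ln \frac{p_i}{1/n} = \ln n - S(p), \qquad S(p) = -\sum_i p_i \ln p_i ,

so the maximal entropy \ln n is reached exactly at zero distance from the uniform distribution u_i = 1/n.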

Following these ideas, it is not difficult to construct our own entropy formula. The classical result emerges when two effects counterbalance each other: the finite heat capacity of our container, establishing the temperature, on the one hand, and the nature and size of random fluctuations on the other. If the fluctuations are much larger than classically assumed, we obtain a new, parameter-free formula; if small-world effects gain the upper hand, the Tsallis and Rényi formulas emerge. The mathematical theory behind deformed additivity uses the group entropy. Interestingly enough, other addition formulas have also been modified during the history of physics; the most renowned is the velocity addition formula, modified in relativistic theories.
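A sketch of the parallel, in standard notation: a deformed addition rule can be mapped to ordinary addition by a “formal logarithm” L, just as relativistic velocities add simply once rewritten as rapidities:

x \oplus y = x + y + a\,x\,y \;\Rightarrow\; L(x) = \tfrac{1}{a}\ln(1 + a x), \qquad L(x \oplus y) = L(x) + L(y);

u \oplus v = \frac{u + v}{1 + uv/c^2} \;\Rightarrow\; L(u) = \tfrac{c}{2}\ln\frac{1 + u/c}{1 - u/c} .

With a = 1 - q, the first rule is exactly the Tsallis composition law quoted above.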

Economic inequalities and entropy

Still unsatisfied with pure mathematical experiments? So are we. We are looking for real-world principles for identifying non-classical entropies; most prominently, the evolution of the probabilities of alternatives is the most suggestive tool. In a simple model describing random growth in small steps, punctuated by rare but big jumps back to the ground state, a number of phenomena can be studied. As long as the change of probability is linear in the probability factors, the classical formula is preferable; in such cases, an exponential distribution occurs in the stationary state. Reversing the question, however, a non-exponential stationary PDF must stem either from a nonlinear evolution of randomness or from constraints on complicated expressions of the observed variable. Our newest approach connects this to the inequality measure for income and wealth distributions established by Gini, advertised in the famous 80/20 rule: 20% of the population owns 80% of the wealth. We find that, calculating the Gini coefficient as an expectation value, non-exponential distributions suggest new formulas. Pandora’s box is now open.
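A self-contained sketch of this kind of model – with illustrative parameters of our own choosing, not the authors’ – simulates additive random growth with rare resets to the ground state and evaluates the Gini coefficient of the stationary sample; an exponential stationary PDF gives exactly G = 1/2:

import numpy as np

rng = np.random.default_rng(42)

def growth_reset_sample(n_agents=50_000, n_steps=2_000,
                        step=0.01, reset_prob=0.005):
    """Ensemble of agents: small additive random growth, rare reset to zero."""
    x = np.zeros(n_agents)
    for _ in range(n_steps):
        x += step * rng.exponential(size=n_agents)  # small random growth steps
        reset = rng.random(n_agents) < reset_prob   # rare jump to the ground state
        x[reset] = 0.0
    return x

def gini(sample):
    """Gini coefficient from a sorted sample: G = 2 sum(i x_i)/(n sum x) - (n+1)/n."""
    x = np.sort(sample)
    n = x.size
    return 2.0 * np.sum(np.arange(1, n + 1) * x) / (n * x.sum()) - (n + 1.0) / n

x = growth_reset_sample()
print(f"Gini = {gini(x):.3f}  (exponential stationary PDF would give G = 0.5)")

Making the growth step or the reset rate depend on the current value deforms the stationary PDF away from the exponential, and with it the Gini coefficient moves away from 1/2.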

Contributor Profile

Tamás Sándor Biró
Vice Director
Wigner Research Centre for Physics, Budapest
Phone: +36 20 435 1283
