entropy


en·tro·py

 (ĕn′trə-pē)
n. pl. en·tro·pies
1. Symbol S For a closed thermodynamic system, a quantitative measure of the amount of thermal energy not available to do work.
2. A measure of the disorder or randomness in a closed system.
3. A measure of the loss of information in a transmitted message.
4. The tendency for all matter and energy in the universe to evolve toward a state of inert uniformity.
5. Inevitable and steady deterioration of a system or society.

[German Entropie : Greek en-, in; see en-2 + Greek tropē, transformation; see trep- in Indo-European roots.]

en·tro′pic (ĕn-trō′pĭk, -trŏp′ĭk) adj.
en·tro′pi·cal·ly adv.
American Heritage® Dictionary of the English Language, Fifth Edition. Copyright © 2016 by Houghton Mifflin Harcourt Publishing Company. Published by Houghton Mifflin Harcourt Publishing Company. All rights reserved.

entropy

(ˈɛntrəpɪ)
n, pl -pies
1. (General Physics) a thermodynamic quantity that changes in a reversible process by an amount equal to the heat absorbed or emitted divided by the thermodynamic temperature. It is measured in joules per kelvin. Symbol: S See also law of thermodynamics
2. (General Physics) a statistical measure of the disorder of a closed system expressed by S = k log P + c, where P is the probability that a particular state of the system exists, k is the Boltzmann constant, and c is another constant
3. lack of pattern or organization; disorder
4. (Communications & Information) a measure of the efficiency of a system, such as a code or language, in transmitting information
[C19: from en-2 + -trope]
Collins English Dictionary – Complete and Unabridged, 12th Edition 2014 © HarperCollins Publishers 1991, 1994, 1998, 2000, 2003, 2006, 2007, 2009, 2011, 2014
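The statistical definition above, S = k log P + c, can be made concrete with a small numeric sketch. The snippet below uses the closely related Boltzmann form S = k ln W for a system of W equally likely microstates; the additive constant c only fixes the zero point and is dropped here. The function name and example are illustrative, not part of any dictionary entry.

```python
import math

# Boltzmann constant in joules per kelvin (CODATA value).
K_B = 1.380649e-23

def statistical_entropy(num_microstates: int) -> float:
    """Entropy of a system with the given number of equally likely
    microstates, using the Boltzmann form S = k ln W."""
    return K_B * math.log(num_microstates)

# Doubling the number of accessible microstates adds k ln 2 of entropy,
# regardless of how many microstates the system started with.
delta = statistical_entropy(2) - statistical_entropy(1)
print(delta)
```

A single microstate (W = 1) gives zero entropy, matching the convention that a perfectly ordered system carries no statistical disorder.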

en•tro•py

(ˈɛn trə pi)

n.
1. a function of thermodynamic variables, as temperature or pressure, that is a measure of the energy that is not available for work in a thermodynamic process. Symbol: S
2. (in data transmission and information theory) a measure of the loss of information in a transmitted signal.
3. (in cosmology) a hypothetical tendency for the universe to attain a state of maximum homogeneity in which all matter is at a uniform temperature.
4. a state of disorder, as in a social system, or a hypothetical tendency toward such a state.
[< German Entropie (1865); see en-2, -tropy]
en•tro•pic (ɛnˈtroʊ pɪk, -ˈtrɒp ɪk) adj.
en•tro′pi•cal•ly, adv.
Random House Kernerman Webster's College Dictionary, © 2010 K Dictionaries Ltd. Copyright 2005, 1997, 1991 by Random House, Inc. All rights reserved.

en·tro·py

(ĕn′trə-pē)
A measure of the amount of disorder in a system. Entropy increases as the system's temperature increases. For example, when an ice cube melts and becomes liquid, the energy of the molecular bonds which formed the ice crystals is lost, and the arrangement of the water molecules is more random, or disordered, than it was in the ice cube.
The American Heritage® Student Science Dictionary, Second Edition. Copyright © 2014 by Houghton Mifflin Harcourt Publishing Company. Published by Houghton Mifflin Harcourt Publishing Company. All rights reserved.
Thesaurus
Noun 1. entropy - (communication theory) a numerical measure of the uncertainty of an outcome; "the signal contained thousands of bits of information"
communication theory, communications - the discipline that studies the principles of transmitting information and the methods by which it is delivered (as print or radio or television etc.); "communications is his major field of study"
information measure - a system of measurement of information based on the probabilities of the events that convey information
2. entropy - (thermodynamics) a thermodynamic quantity representing the amount of energy in a system that is no longer available for doing mechanical work; "entropy increases as matter and energy in the universe degrade to an ultimate state of inert uniformity"
physical property - any property used to characterize matter and energy and their interactions
conformational entropy - entropy calculated from the probability that a state could be reached by chance alone
thermodynamics - the branch of physics concerned with the conversion of different forms of energy
Based on WordNet 3.0, Farlex clipart collection. © 2003-2012 Princeton University, Farlex Inc.
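The communication-theory sense above, a numerical measure of the uncertainty of an outcome, is the Shannon entropy H = -Σ p log2 p over the symbol probabilities. A minimal sketch, using empirical symbol frequencies of a string as the probabilities (the function name is my own):

```python
import math
from collections import Counter

def shannon_entropy(message: str) -> float:
    """Shannon entropy of a message in bits per symbol:
    H = -sum(p * log2(p)) over empirical symbol probabilities."""
    counts = Counter(message)
    total = len(message)
    return sum(-(c / total) * math.log2(c / total)
               for c in counts.values())

# A uniform two-symbol message carries exactly 1 bit per symbol,
# while a constant message carries no information at all.
print(shannon_entropy("abab"))  # 1.0
print(shannon_entropy("aaaa"))  # 0.0
```

Maximum entropy corresponds to maximum uncertainty: every symbol equally likely.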
Translations
entropie
entropia

entropy

[ˈentrəpɪ] n entropía f
Collins Spanish Dictionary - Complete and Unabridged 8th Edition 2005 © William Collins Sons & Co. Ltd. 1971, 1988 © HarperCollins Publishers 1992, 1993, 1996, 1997, 2000, 2003, 2005

entropy

[ˈɛntrəpi] n entropie f
Collins English/French Electronic Resource. © HarperCollins Publishers 2005

entropy

n Entropie f
Collins German Dictionary – Complete and Unabridged 7th Edition 2005. © William Collins Sons & Co. Ltd. 1980 © HarperCollins Publishers 1991, 1997, 1999, 2004, 2005, 2007

entropy

[ˈɛntrəpɪ] n entropia
Collins Italian Dictionary 1st Edition © HarperCollins Publishers 1995

en·tro·py

n. entropía, disminución de la capacidad de convertir la energía en trabajo.
English-Spanish Medical Dictionary © Farlex 2012
References in periodicals archive
We observed that the choice of p values, p_v, for valid transfer entropies affects results.
Smith Jr, "An investigation of the dependence of Shannon information entropies and distance measures on molecular geometry," International Journal of Quantum Chemistry, vol.
For example, mutual information, a combination of entanglement entropies of two regions, gives an upper bound on all possible connected two-point functions between operators in the two regions [13].
References [60, 61, 67] proposed that Rényi entropies may also be used as a tool for studying dynamical systems and are closely related to the thermodynamic entropy of the system, the Shannon entropy.
Table 1: Entropies.

Entropies   A        B
E1          0.5570   0.65866
E2          0.675    0.7
E3          0.6      0.525
E4          0.675    0.75
E5          0.5125   0.6
E6          0.7171   0.672

Since E3(A) > E3(B) and E6(A) > E6(B), E3 and E6 are consistent with the intuition.
Calculation of the ratio between entropies before WPDR (E(i_a)) and after WPDR (E(i_a(2,1)));
Zhang [36] compared the performances of the proposed entropies E1, E2, E3, E4, E5, and E6 to those of EL [30], EZJ [31], EWλ [34], EZM [33], and EY [35].
We have found that GSLT holds for all cases of entropies as well as horizons.
It may be seen that the mean codeword length (17) was generalized parametrically by Campbell [15], and its bounds were studied in terms of generalized measures of entropies. Here we give another generalization of (17) and study its bounds in terms of generalized entropy of order ξ.
For the results on data set [20], the computed entropies for the candidate patch closest to the optic disk area are more significant than those of the other patches in the image; this holds for all 35 images, as shown in Figure 8.
A review paper on the application of entropy methods to the recognition of epilepsy from EEG signals was presented in [10].
To be able to compare entropies with different n, the following relation is defined [9, 10]: