Assuming that in a crystal structure there exist $N_1 + N_2$ sites which these molecules
can fill, one can calculate the multiplicity of the random distributions that may occur.
Thus, the first molecule can occupy any of the $N_1 + N_2$ sites, while the second only
has the option of $(N_1 + N_2 - 1)$ empty sites, the third $(N_1 + N_2 - 2)$, and so on.
The total number of possibilities is $(N_1 + N_2)!$.
However, as molecules of the same class are mutually interchangeable, one must divide by $N_1!$,
that is, the number of possible exchanges of molecules of the first class, and by $N_2!$
for the second class. So the number of distinguishable arrangements of the $N_1 + N_2$
molecules will be:

$$P_{mix} = \frac{(N_1 + N_2)!}{N_1!\,N_2!} \tag{9.10}$$
Defining $S_{mix}$ as the expression $k \ln P_{mix}$, one obtains:

$$S_{mix} = k \ln P_{mix} = k \ln \frac{(N_1 + N_2)!}{N_1!\,N_2!} \tag{9.11}$$
which vanishes if only one type of molecule exists, since $N_2 = 0$ and $\ln(N_1!/N_1!) = \ln 1 = 0$.
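To make the count concrete, the following minimal sketch (in Python; the values of N1 and N2 are arbitrary illustrative choices) evaluates Eq. (9.10) and the single-species limit just mentioned:

from math import comb, factorial

N1, N2 = 6, 2  # illustrative site counts

# Eq. (9.10): distinguishable arrangements of N1 + N2 molecules
P_mix = factorial(N1 + N2) // (factorial(N1) * factorial(N2))
assert P_mix == comb(N1 + N2, N2)  # equivalently, a binomial coefficient
print(P_mix)  # 28

# Single-species limit (N2 = 0): P_mix = 1, hence ln P_mix = 0
assert factorial(N1) // (factorial(N1) * factorial(0)) == 1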
When $N$ is large enough, these factorials can be evaluated using the
Stirling approximation:

$$\ln N! \approx N \ln N - N \tag{9.12}$$
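As a quick numerical check of Eq. (9.12) (a minimal sketch in Python; math.lgamma gives $\ln N!$ without overflow):

from math import lgamma, log

for N in (10, 100, 1_000_000):
    exact = lgamma(N + 1)        # ln N!
    stirling = N * log(N) - N    # Eq. (9.12)
    print(N, exact, stirling, (exact - stirling) / exact)

The relative error falls from about 14% at N = 10 to under 1% at N = 100 and becomes negligible at molecular-scale N, which justifies the approximation here.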
Applying Eq. (9.12) to Eq. (9.11), and noting that the linear terms $-(N_1 + N_2) + N_1 + N_2$ cancel, $S_{mix}$ becomes:

$$S_{mix} = k \left[(N_1 + N_2) \ln(N_1 + N_2) - N_1 \ln N_1 - N_2 \ln N_2\right] = -k_0 \left[x_1 \ln x_1 + x_2 \ln x_2\right] \tag{9.13}$$

with $k_0 = k (N_1 + N_2)$ and molar fractions $x_1 = N_1/(N_1 + N_2)$ and $x_2 = N_2/(N_1 + N_2)$. Note that $S_{mix}$ coincides with the isobaric and isothermal entropy generation of the two ideal gases, $S^i_{mix}$, when the constant $k_0$ is equal to $R = 8.314$ J/(mol K). In the case of solid solutions, the values of $x_2$ for the replacing molecule are commonly very low, resulting in very small values of entropy generation.
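The magnitudes involved can be illustrated with a short sketch (in Python; the function names are ad hoc, and R and N_A are the usual gas and Avogadro constants). It compares the exact statistical value $k \ln P_{mix}$ for one mole of sites, evaluated through lgamma, with the ideal expression of Eq. (9.13):

from math import lgamma, log

R = 8.314        # gas constant, J/(mol K)
N_A = 6.022e23   # Avogadro constant, 1/mol
k = R / N_A      # Boltzmann constant, J/K

def s_mix_ideal(x2):
    # Eq. (9.13) per mole: -R (x1 ln x1 + x2 ln x2)
    x1 = 1.0 - x2
    return -R * (x1 * log(x1) + x2 * log(x2))

def s_mix_exact(x2, N=N_A):
    # k ln P_mix from Eq. (9.10); lgamma(n + 1) = ln n!, with the Gamma
    # function generalising the factorial to the non-integer counts here
    N2 = x2 * N
    N1 = N - N2
    return k * (lgamma(N + 1) - lgamma(N1 + 1) - lgamma(N2 + 1))

for x2 in (0.5, 0.1, 1e-4):
    print(x2, s_mix_ideal(x2), s_mix_exact(x2))

Both expressions agree: about 5.76 J/(mol K) at x2 = 0.5, but only about 0.0085 J/(mol K) at x2 = 1e-4, the dilute regime typical of solid solutions.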
Eq. (9.13) enables the identification of the relationship between entropy and
disorder. The “evolution of systems towards their maximum possible entropy” is
equivalent to “reaching their maximum disorder”, subject to process constraints,
with this state being not only quantifiable (through entropy) but also the
most probable. Order and organisation thus become synonyms for low entropy.
Likewise, information can be linked to entropy. In fact, information is measured
through the probability of a given event occurring (Carter, 2011). As
is well known, the basis of information technologies is the conversion of messages
into bit sequences. A bit is a binary decision between two items, each of which
therefore has probability 1/2. Simply explained, an 8-bit sequence (0 0 0 0 0 0 0 0) gives
no information if only zeros can ever be expressed, whereas a sequence such as (0 1 0 0
1 1 0 1) can convey a message if each decision may be either 0 or 1. An “amount
of information” can thus be linked to a “degree of randomness”. Information
defined in terms of probability therefore parallels conventional entropy. In fact,
it shares the same properties as entropy: it is never negative and is additive for
independent events.
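This parallel is easy to verify numerically (a minimal sketch in Python; shannon_entropy is an ad hoc helper, not a library function):

from math import log2

def shannon_entropy(probabilities):
    # H = -sum(p log2 p) in bits; terms with p = 0 or p = 1 contribute nothing
    return sum(-p * log2(p) for p in probabilities if 0.0 < p < 1.0)

print(shannon_entropy([1.0]))           # 0.0: only zeros can be expressed
print(shannon_entropy([0.5, 0.5]))      # 1.0 bit per binary decision
print(8 * shannon_entropy([0.5, 0.5]))  # 8.0 bits for an 8-bit sequence

As with entropy, H is never negative and adds over independent decisions.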
 