adding more decimal digits, but we will never get there unless we somehow round the
results. As we have seen earlier, even that won't give us the 1 we desire.
Our expression of 1/3 is equal to the infinite repeating decimal .333333333333…,
which I can't show here no matter how many pages of paper I have. The 3s just keep on
going, just like the Energizer bunny. On the other hand, the computer will give us .3, .33,
.333 or something approaching 1/3, but never exactly that. Computer math is finite, but we
are talking about the infinite, so there will be differences and confusion at times. You
may have written code in any language expecting certain results, only to get something a
bit different. The reason could be the way a computer does calculations and the limits it
works within.
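You can see this for yourself with a minimal sketch, here using Python's decimal module (my choice of language; it is not the pseudocode used in this chapter). No matter how many digits of precision we allow, the quotient is only a finite approximation of 1/3, and multiplying it back by 3 never restores the 1 we started with:

from decimal import Decimal, getcontext

# Try more and more digits of precision. The quotient is always a
# finite approximation of 1/3, so the product never gets back to 1.
for digits in (1, 2, 3, 10, 30):
    getcontext().prec = digits
    third = Decimal(1) / Decimal(3)   # .3, .33, .333, ...
    print(digits, third, third * 3)   # the product is .9, .99, .999, ...

Whether that last digit is truncated or rounded, the stored value is still not exactly one third.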
Consider the following statements:
define a decimal (3.1)
define b decimal (3.2)
define c decimal (3.3)
define u decimal (3.1)
define v decimal (3.1)
define w decimal (3.2)
define x decimal (3.2)
define y decimal (3.3)
define z decimal (3.3)
u = 1/3
v = 5/8
a = u * v
w = 1/3
x = 5/8
b = w * x
y = 1/3
z = 5/8
c = y * z
The symbol used in the assignments to the first two variables above, and in a few others,
represents division. Recalling that the asterisk represents multiplication, what will be
the values of a, b and c? In the calculation of the first of those variables, note that
u will be .3 and v will be .6, because of truncation. Thus a will be .1, since the product
of .3 and .6 is .18 and a has only one decimal place.
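For readers who want to experiment, here is a minimal sketch of the same calculations, again assuming Python's decimal module rather than the chapter's pseudocode, and assuming that decimal (3.1), (3.2) and (3.3) mean fields truncated, not rounded, to one, two and three places after the decimal point; the helper trunc is my own name, not part of the pseudocode:

from decimal import Decimal, ROUND_DOWN

def trunc(value, places):
    # Emulate a fixed-point field by truncating to the given number
    # of decimal places (ROUND_DOWN chops, it does not round).
    return value.quantize(Decimal(1).scaleb(-places), rounding=ROUND_DOWN)

one_third    = Decimal(1) / Decimal(3)   # .333333... to the default precision
five_eighths = Decimal(5) / Decimal(8)   # .625 exactly

u, v = trunc(one_third, 1), trunc(five_eighths, 1)   # .3 and .6
w, x = trunc(one_third, 2), trunc(five_eighths, 2)   # .33 and .62
y, z = trunc(one_third, 3), trunc(five_eighths, 3)   # .333 and .625

a = trunc(u * v, 1)   # the product must fit back into a one-place field
b = trunc(w * x, 2)
c = trunc(y * z, 3)
print(a, b, c)        # three different answers for the "same" product

The three results differ, even though every one of them is supposedly 1/3 times 5/8; the answer depends entirely on how many decimal places each field carries.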