presented on-off dynamics. This first numerical work also suggested that the systemic metabolic structure could be an intrinsic characteristic of metabolism, common to all living systems (De la Fuente et al. 1999a).
During 2004 and 2005, several studies applying flux balance analysis to experimental data contributed evidence for a systemic functional structure (Almaas et al. 2004, 2005). Specifically, it was observed that a set of metabolic reactions belonging to different anabolic processes remained active under all investigated growth conditions, whereas the remaining enzymatic reactions, not belonging to this metabolic core, were only intermittently active. These global catalytic processes were verified for E. coli, H. pylori, and S. cerevisiae (Almaas et al. 2004, 2005).
The systemic functional structure appears as a robust dynamical system (De la Fuente et al. 2009), in which self-organization, self-regulation, and long-term memory properties emerge (De la Fuente et al. 2010). Long-term correlations have been observed in different experimental studies, e.g., quantification of DNA patchiness (Viswanathan et al. 1997), physiological time series (Eke et al. 2002; Goldberger et al. 2002), NADPH series (Ramanujan et al. 2006), DNA sequences (Allegrini et al. 1988; Audit et al. 2004), K+ channel activity (Kazachenko et al. 2007), mitochondrial processes (Aon et al. 2008), and neural electrical activity (De la Fuente et al. 2006; Mahasweta et al. 2003).
8.1.5 Introduction to Effective Functional Connectivity
Understanding functional coordination and spontaneous synchronization between enzymatic processes requires a quantitative measure of effective connectivity. In the field of information theory, transfer entropy (TE) has been proposed as a rigorous, robust, and self-consistent method for the causal quantification of functional information flow between nonlinear processes (Schreiber 2000). TE quantifies the reduction in uncertainty about the future of one variable that is obtained by adding knowledge of another, allowing the functional influence between two variables to be calculated in terms of effective connectivity.
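To make this concrete, the following is a minimal Python sketch of a plug-in TE estimator, written for illustration rather than taken from the cited work: it assumes both series are discretized into a few histogram bins and uses a history length of one, whereas Schreiber's formulation admits longer histories and more refined estimators. The function name transfer_entropy, the parameter n_bins, and the toy coupled series are all illustrative.

import numpy as np
from collections import Counter

def transfer_entropy(y, x, n_bins=4):
    # Plug-in estimate of TE from Y to X with history length 1, in bits:
    # TE_{Y->X} = sum p(x_{t+1}, x_t, y_t)
    #             * log2[ p(x_{t+1} | x_t, y_t) / p(x_{t+1} | x_t) ]
    xd = np.digitize(x, np.histogram_bin_edges(x, n_bins)[1:-1])
    yd = np.digitize(y, np.histogram_bin_edges(y, n_bins)[1:-1])

    triples = Counter(zip(xd[1:], xd[:-1], yd[:-1]))  # (x_{t+1}, x_t, y_t)
    pairs_xy = Counter(zip(xd[:-1], yd[:-1]))         # (x_t, y_t)
    pairs_xx = Counter(zip(xd[1:], xd[:-1]))          # (x_{t+1}, x_t)
    singles = Counter(xd[:-1])                        # x_t
    n = len(xd) - 1

    te = 0.0
    for (x1, x0, y0), c in triples.items():
        p_joint = c / n                               # p(x_{t+1}, x_t, y_t)
        p_cond_xy = c / pairs_xy[(x0, y0)]            # p(x_{t+1} | x_t, y_t)
        p_cond_x = pairs_xx[(x1, x0)] / singles[x0]   # p(x_{t+1} | x_t)
        te += p_joint * np.log2(p_cond_xy / p_cond_x)
    return te

# Toy check: x follows y with a one-step lag, so TE(y -> x) >> TE(x -> y).
rng = np.random.default_rng(1)
y = rng.normal(size=10_000)
x = np.roll(y, 1) + 0.5 * rng.normal(size=10_000)
print(transfer_entropy(y, x))   # clearly positive
print(transfer_entropy(x, y))   # close to zero (up to estimator bias)

Because y here is an uncorrelated driver, knowing x adds nothing about y's future, which is why the influence is detected in one direction only.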
The roots of TE lie in the Shannon entropy, which measures the average information required to determine a random variable X (Cover and Thomas 1991). The Shannon entropy is defined as
H(X) = -\sum_{x} p(x) \log p(x),        (8.1)
where x is one of the possible states which characterize the dynamics of the variable X. For instance, in tossing a coin, the only two possible states are heads or tails. p(x) defines the probability (or normalized occurrence) of measuring the variable X in the state x.
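As a simple illustration of Eq. 8.1, the following Python sketch estimates H(X) by approximating p(x) with the normalized occurrence of each observed state, exactly as the text describes; the function name and the binary coin example are otherwise illustrative.

import numpy as np

def shannon_entropy(states):
    # H(X) = -sum_x p(x) log p(x) (Eq. 8.1), with p(x) taken as the
    # normalized occurrence of each observed state x (log base 2 -> bits).
    _, counts = np.unique(states, return_counts=True)
    p = counts / counts.sum()
    return -np.sum(p * np.log2(p))

# Fair coin toss: two equiprobable states, so H(X) = 1 bit.
rng = np.random.default_rng(0)
coin = rng.integers(0, 2, size=100_000)
print(shannon_entropy(coin))   # ~1.0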