4.3.5.3 The Bottom Line
Chemical shift data yield a restraint energy function that is rugged and not very instructive unless the sampled structures are already highly accurate (<4 Å). Owing to the mostly local nature of chemical shifts, one can derive torsional restraints (e.g., TALOS) or filter sets of short, 5-10 residue fragments of protein backbone (see below).
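The fragment-filtering idea can be sketched in a few lines (a hedged illustration, not the chapter's method: the data layout, shift values, and function names are all hypothetical): candidate backbone fragments are ranked by how well their back-calculated chemical shifts agree with experiment, and only the best-scoring fragments are kept.

```python
import math

def shift_rmsd(predicted, experimental):
    """RMSD between back-calculated and experimental chemical shifts
    (e.g., backbone CA shifts, one value per residue)."""
    assert len(predicted) == len(experimental)
    return math.sqrt(
        sum((p - e) ** 2 for p, e in zip(predicted, experimental))
        / len(predicted))

def filter_fragments(fragments, experimental, keep=2):
    """Rank candidate fragments by chemical-shift agreement and keep
    the best-scoring ones. `fragments` maps a fragment id to its
    back-calculated shifts (hypothetical data layout)."""
    scored = sorted(fragments.items(),
                    key=lambda item: shift_rmsd(item[1], experimental))
    return [frag_id for frag_id, _ in scored[:keep]]

# Toy example: 5-residue fragments with back-calculated CA shifts (ppm).
experimental = [58.1, 56.3, 62.0, 54.9, 57.5]
fragments = {
    "frag_a": [58.0, 56.5, 61.8, 55.0, 57.4],  # close match
    "frag_b": [52.0, 50.1, 49.9, 51.2, 50.5],  # poor match
    "frag_c": [58.3, 56.0, 62.4, 54.5, 57.9],  # decent match
}
print(filter_fragments(fragments, experimental))  # → ['frag_a', 'frag_b'] excluded
```

In practice the scoring would use a predictor such as TALOS rather than a plain RMSD, but the filtering step has this shape: score each short fragment against the measured shifts, then discard the poorly matching ones.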
4.4 Optimisation Methods
4.4.1 Challenges for Optimisation Methods
All-atom force-fields and high-resolution restraint potentials, such as chemical shifts, are short-range and thus do not yield significant guidance towards the correct structure in de novo structure calculations (Table 4.1). In this section, we discuss methods for structure calculation when the data are guidance-sparse, i.e., when the available instructive data (e.g., NOE data) are not sufficient to constrain the conformational search to the native energy basin (see previous section). A sampling method applicable to such sparse NMR data is thus required to find and identify the native energy basin within a much larger accessible conformational space.
Two important characteristics of methods for global optimisation are their efficiency, i.e., how much computer time is required to find the native energy basin, and their thoroughness, i.e., whether they always find the global low-energy region or, in the case of near-degenerate energy basins, all low-energy regions. Obviously, there is a trade-off between these two characteristics. The most thorough method would be one that enumerates all possible conformations, which is clearly not efficient enough for most protein targets of interest. On the algorithmic side this compromise is reflected by a trade-off between intensification and exploration. Exploration is required to find new, unknown territory of conformational space, whereas intensification is required to evaluate competing low-energy regions despite the ruggedness and noisiness of the energy landscape. 24,99
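The exploration/intensification trade-off can be made concrete with simulated annealing on a toy rugged landscape (a minimal sketch, not a method from the chapter; the energy function, cooling schedule, and all parameters are assumptions): at high temperature uphill moves are accepted and the walker explores, while the decaying temperature progressively intensifies the search within the current basin.

```python
import math
import random

def energy(x):
    # Toy rugged 1-D "energy landscape": a broad global basin at small x
    # with superimposed sinusoidal local minima.
    return x * x + 2.0 * math.sin(8.0 * x)

def anneal(steps=20000, t_start=5.0, t_end=0.01, seed=1):
    """Minimal simulated-annealing sketch. High temperature favours
    exploration (uphill moves often accepted); low temperature favours
    intensification (refinement within the current basin)."""
    rng = random.Random(seed)
    x = rng.uniform(-5.0, 5.0)
    e = energy(x)
    best_x, best_e = x, e
    for i in range(steps):
        # Geometric cooling schedule from t_start down to t_end.
        t = t_start * (t_end / t_start) ** (i / steps)
        x_new = x + rng.gauss(0.0, 0.5)
        e_new = energy(x_new)
        # Metropolis criterion: always accept downhill, sometimes uphill.
        if e_new < e or rng.random() < math.exp((e - e_new) / t):
            x, e = x_new, e_new
            if e < best_e:
                best_x, best_e = x, e
    return best_x, best_e

bx, be = anneal()
print(f"best x = {bx:.2f}, best energy = {be:.2f}")
```

Enumerating all conformations would be perfectly thorough but hopelessly inefficient; the cooling schedule here is one simple way of spending a fixed move budget on exploration first and intensification later.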
What is a reasonable computational effort? Currently, 12 h of computing on 512 compute cores would cost ca. $100 in total on a commercial on-demand cloud-computing platform.{ Thus calculations requiring up to 10 000 CPU hours are economically viable, whereas calculations of 100 000 CPU hours or more would require exceptional justification at current computing prices.
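The estimate above can be checked with a few lines of arithmetic, using the spot price quoted in the footnote and treating one EC2 compute unit as roughly one core (an assumption consistent with the footnote's description):

```python
# Back-of-envelope reproduction of the cost estimate in the text:
# 12 h on 512 cores at the quoted 2011 EC2 spot price of $0.537/hour
# for an instance rated at 33.5 EC2 compute units.

price_per_instance_hour = 0.537   # USD, from the footnote
units_per_instance = 33.5         # EC2 compute units per instance
cores = 512
hours = 12

price_per_core_hour = price_per_instance_hour / units_per_instance
cpu_hours = cores * hours
total_cost = cpu_hours * price_per_core_hour

print(cpu_hours)            # 6144 CPU hours
print(round(total_cost))    # 98, i.e. "ca. $100 in total"
```

At roughly $0.016 per core-hour, the 10 000 CPU-hour threshold mentioned in the text corresponds to about $160, which matches the "economically viable" framing.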
For the development of computational methods (that will be used in the future), however, one should also keep in mind that the cost of computing has for many decades closely followed Moore's Law of exponential decline in
{ AMAZON INC.; EC2 spot price for a quadruple extra large cluster compute instance with 33.5 EC2 units (e.g. 33.5 standard units with one virtual core…): $0.537/hour, from http://aws.amazon.com/ec2/#pricing on 6.8.2011.