and response surface methods are commonly used to explore a search space. An
initial design such as a fractional two-level factorial with replicate centre points
might be sufficient to indicate where an optimal area might exist for closer
examination by full factorial or response surface methods. There are many choices
of design, and the selection of the number and location of test conditions must
strike a balance between the depth of information gained on one hand and the time
needed to gather the data on the other [19].
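As an illustration of such an initial screening design, the sketch below builds a 2^(3-1) two-level fractional factorial with replicate centre points; the factor names, ranges, and replicate count are hypothetical and not taken from the text:

```python
# Sketch of a two-level fractional factorial design with replicate
# centre points, as an initial screen of a search space.
# Factors and ranges below are invented for illustration.
from itertools import product

def fractional_factorial_2_3_1(centre_replicates=3):
    """2^(3-1) design: vary A and B over {-1, +1}, alias C = A*B,
    then append replicate centre points (0, 0, 0)."""
    runs = []
    for a, b in product((-1, 1), repeat=2):
        runs.append((a, b, a * b))          # generator relation C = AB
    runs.extend([(0, 0, 0)] * centre_replicates)
    return runs

def to_real_units(coded, lows, highs):
    """Map coded levels (-1, 0, +1) onto real factor ranges."""
    return tuple(lo + (c + 1) / 2 * (hi - lo)
                 for c, lo, hi in zip(coded, lows, highs))

design = fractional_factorial_2_3_1()
# Illustrative ranges only: pH 5-7, salt 0-500 mM, load 10-50 g/L
conditions = [to_real_units(run, (5, 0, 10), (7, 500, 50)) for run in design]
```

The four factorial runs screen the three factors in half the runs of a full 2^3 design, while the replicate centre points give an estimate of pure error and a check for curvature.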
Ideally, experimental designs should minimise the total number of conditions
that need to be evaluated, allowing a researcher to arrive at an optimum by
selecting test conditions judiciously while still delivering sufficient process
knowledge to enable efficient chromatographic operation [30]. One way to do this
is to 'step' through a design space using the information gathered from previously
tested conditions in order to determine the most suitable direction in which to
continue searching for an upward gradient in the underlying response surface
towards a good operating region. This can help to drive the experimental focus
towards more desirable regions and so minimise the total number of experiments
that are conducted before an optimum is found. One such technique is the simplex
algorithm, which offers a rapid alternative to more conventional factorial designs
during the very early stages of development. Design of experiments (DoE)
involves fitting regression models to data, but if the fit of the regressed
equation proves statistically inadequate, additional laboratory work has to be
carried out to supplement the existing data set and so provide enough
information to identify a more suitable regression model. This
requirement for additional work can be problematic during very early
development, when feed material availability is highly limited. Such resources
are wasted in particular if many of the tested conditions turn out to
be so sub-optimal that they are of no use whatsoever in defining a potential
bioprocess design space for subsequent exploration at larger scales. The simplex
method is different in that it uses accumulated information from conditions that
have been tested already to direct its focus away from poor areas of the search
space and instead towards better areas. A recent study described the
use of the simplex method in a chromatography case study [6]. When conventional
two-level and face-centred central composite design models were fitted to the
data sets, they fit the data poorly, prompting further experiments to
provide additional data in order to verify the location of the robust regions in the
search space. By comparison, the simplex algorithm identified a good operating
point using up to 70% fewer conditions. Such methods therefore have some value
during the very early stages of process development, when resources are highly
constrained and where it is necessary to extract as much value from the
experimental studies as possible. An additional point to note is that the simplex
method is sequential and therefore does not benefit directly from the parallel
processing capabilities of a high-throughput screening device, but it is relevant for
systems where analytical timescales are long and it is therefore advantageous for
the majority of tested conditions to head successively towards a viable
potential operating point. It should be noted, however, that the simplex method