When the state is not completely observed, one has first to reconstruct
the state using a filter before implementing the control. It can be shown that
observability and controllability of the completely observed system is a sufficient
condition of controllability [Kwakernaak et al. 1972]. Controllability assumptions
are more difficult to state in a nonlinear framework. Some algebraic
concepts are necessary which are beyond the scope of this topic.
In real-life systems, one cannot implement arbitrary control laws, because
the magnitude of acceptable control is bounded by the physical limitations
of the actuators. The control must obey such constraints. The set of controls
that obey the constraints is called the set of feasible controls. Prior to actually
applying a control law designed in a linear framework, one must check whether
it is feasible. If the control law saturates the actuators, the system is no longer
linear.
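The saturation effect described above can be sketched numerically. The gain, state values, and actuator bound below are illustrative assumptions, not taken from the text: a linear feedback law u = −K·x is clipped to the actuator limits, and far from the origin the applied control differs from the linear one, so the closed-loop system is no longer linear.

```python
import numpy as np

# Hypothetical actuator bound and feedback gain (illustrative values only).
U_MAX = 1.0                     # |u| <= U_MAX defines the set of feasible controls
K = np.array([2.0, 0.5])        # assumed linear feedback gain

def saturated_control(x):
    """Linear control law u = -K x, clipped to the actuator limits."""
    u = -K @ x
    return float(np.clip(u, -U_MAX, U_MAX))

# Near the origin the linear law is feasible and applied unchanged.
x_small = np.array([0.1, 0.0])
print(saturated_control(x_small), -K @ x_small)   # both -0.2

# Far from the origin the linear law saturates: the applied control
# is clamped at -U_MAX, and the system ceases to behave linearly.
x_large = np.array([3.0, 1.0])
print(saturated_control(x_large), -K @ x_large)   # -1.0 vs -6.5
```

Checking for such saturation before deploying a control law designed in the linear framework is exactly the feasibility test mentioned above.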
5.1.3 Stability of Controlled Dynamical Systems
The most important property of a control law is that it guarantees the stabil-
ity of the controlled dynamical system. We explained in the previous chapter
that a controlled dynamical system with a closed-loop control law behaves
just like a usual dynamical system without any control. Let us recall some de-
finitions about stability of discrete-time nonlinear dynamical systems. In this
section, discrete-time dynamical systems are considered, with the following
state equation:
x ( k +1)= f [ x ( k )] .
A state x such that f ( x )= x is called an equilibrium state . x is also
said to be a fixed point of f .
An equilibrium x is said to be stable if, for every ε > 0, there exists η > 0 such that
‖x(0) − x‖ ≤ η ⇒ ∀k, ‖x(k) − x‖ ≤ ε.
An equilibrium x is said to be asymptotically stable, with an attraction basin Ω, if
for any initial condition in Ω, the state trajectory originating from that initial
condition converges to the fixed point x.
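As an illustration of these definitions (the map is an assumption chosen for this sketch, not taken from the text), the map f(x) = cos x has a unique fixed point x* ≈ 0.739085, and iterating the state equation x(k+1) = f[x(k)] from an initial condition in its attraction basin converges to it:

```python
import math

def f(x):
    """Illustrative map: f(x) = cos(x) has an asymptotically stable fixed point."""
    return math.cos(x)

x = 1.5                          # initial condition in the attraction basin
for k in range(100):
    x = f(x)                     # iterate the state equation x(k+1) = f[x(k)]

print(x)                         # converges to x* ≈ 0.739085
print(abs(f(x) - x) < 1e-8)      # numerically a fixed point: f(x*) = x*
```

The trajectory settles on the fixed point regardless of small perturbations of the initial condition, which is precisely the asymptotic stability described above.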
The stability properties of linear dynamical systems x(k+1) = A·x(k)
can be easily derived from the spectral properties of the matrix A. The point
0 is a fixed point of the linear system. If the eigenvalues of A lie strictly
inside the open unit disc, the equilibrium is stable and asymptotically
stable. If there exists an eigenvalue whose modulus is strictly larger than 1,
then the equilibrium 0 is neither stable nor asymptotically stable. The critical case
of eigenvalues with modulus equal to 1 deserves a specific analysis.
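The eigenvalue criterion can be checked directly; the matrices below are illustrative assumptions, not examples from the text. The spectral radius (largest eigenvalue modulus) of A decides the stability of the equilibrium 0:

```python
import numpy as np

def spectral_radius(A):
    """Largest modulus of the eigenvalues of A."""
    return max(abs(ev) for ev in np.linalg.eigvals(A))

# Illustrative matrices (assumed for this sketch).
A_stable = np.array([[0.5, 0.1],
                     [0.0, 0.8]])    # eigenvalues 0.5 and 0.8: inside the unit disc
A_unstable = np.array([[1.2, 0.0],
                       [0.3, 0.9]])  # eigenvalue 1.2: outside the unit disc

print(spectral_radius(A_stable) < 1)    # True:  0 is asymptotically stable
print(spectral_radius(A_unstable) < 1)  # False: 0 is unstable
```

A spectral radius exactly equal to 1 is the critical case mentioned above and requires a finer analysis than this simple test.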
That simple characterization of linear dynamical systems is the basis of
the methodology for building control laws for linear controlled dynamical systems
by locating the poles of transfer functions [Kwakernaak et al. 1972]. That
methodology is traditional in control theory, and is very popular in real-world
applications.