Stability and Lyapunov's Method
Stability criteria for linear systems. Lyapunov's direct method for general nonlinear systems. Linearization for stability analysis. Discrete-time stability.
Introduction
In many engineering applications, one wants to make sure the system behaves well in the long run, without worrying too much about the particular path the system takes (so long as these paths are acceptable). Stability is the characterization ensuring that a system behaves well in the long run. We will make this statement precise in the following.
Stability Criteria
Consider $\dot{x}(t) = f(x(t))$, with initial condition $x(0) = x_0$. Let $f(0) = 0$. We will consider three definitions of stability:
Local stability: For every $\epsilon > 0$, there exists $\delta > 0$ such that $\|x_0\| < \delta$ implies that $\|x(t)\| < \epsilon$, $t \geq 0$. This is also known as stability in the sense of Lyapunov.
Intuition: Think of a ball resting at the bottom of a bowl. If you nudge it slightly, it stays close to the bottom -- it does not fly away. Local stability says exactly this: small perturbations produce small deviations, forever. Note that the ball does not have to return to the bottom; it just cannot wander far. A frictionless pendulum at its lowest point is locally stable (it oscillates nearby) but not asymptotically stable (it never stops oscillating).
Local asymptotic stability: There exists $\delta > 0$ such that $\|x_0\| < \delta$ implies that $\lim_{t \to \infty} x(t) = 0$.
Intuition: Now add friction to the ball-in-a-bowl picture. The ball not only stays near the bottom but actually returns to it over time. Local asymptotic stability means small perturbations eventually die out completely. The "local" qualifier matters: there is only a neighborhood around the equilibrium from which the system returns. Push the ball too far (over the rim of the bowl), and all bets are off.
Global asymptotic stability: For every $x_0$, $\lim_{t \to \infty} x(t) = 0$. Hence, here, for any initial condition, the system converges to 0.
Intuition: Global asymptotic stability is the strongest guarantee: no matter where you start in the entire state space, the system eventually returns to the equilibrium. Think of it as a bowl that extends infinitely in all directions with friction everywhere -- no matter how far you throw the ball, it always rolls back to the bottom. For linear systems, local and global asymptotic stability are equivalent, but for nonlinear systems they can differ dramatically.
It should be added that stability does not necessarily need to be defined with regard to 0; that is, stability can be defined around any $x^*$ with $f(x^*) = 0$. In this case, the norms above should be replaced with $\|x(t) - x^*\|$ (so that, for example, in the asymptotic stability case, $x(t)$ will converge to $x^*$).
One could consider an inverted pendulum as an example of a system which is not locally stable.
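The ball-in-a-bowl picture can be seen in a small simulation. The following sketch (an illustrative example, not taken from the text) Euler-integrates the oscillator $\ddot{x} = -x - c\dot{x}$: without damping the state stays near the origin but never settles (stability in the sense of Lyapunov), while with damping it converges to the equilibrium (asymptotic stability).

```python
import numpy as np

def simulate(damping, x0=1.0, v0=0.0, dt=1e-3, T=20.0):
    """Euler-integrate x'' = -x - damping * x' and return the state norms."""
    x, v = x0, v0
    norms = []
    for _ in range(int(T / dt)):
        # simultaneous update: the new v uses the old x
        x, v = x + dt * v, v + dt * (-x - damping * v)
        norms.append(np.hypot(x, v))
    return norms

# Undamped oscillator: trajectories stay bounded but never converge.
undamped = simulate(damping=0.0)
# Damped oscillator: trajectories decay to the equilibrium.
damped = simulate(damping=0.5)

print(max(undamped), undamped[-1], damped[-1])
```

The undamped run illustrates why local stability does not imply asymptotic stability: the norm stays near its initial value for all time, while the damped run shrinks to essentially zero.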
Linear Systems
Consider an initial value problem $\dot{x}(t) = A x(t)$, with initial condition $x(0) = x_0$. As studied earlier, the solution is
$x(t) = e^{At} x_0,$
where
$e^{At} = \sum_{k=0}^{\infty} \frac{(At)^k}{k!}$
is the matrix exponential (see Exercise). We now briefly review how to compute matrix exponentials with a focus on stability properties.
Case: $A = I$ (Identity Matrix). In this case, $A^k = I$, and
$e^{At} = \left( \sum_{k=0}^{\infty} \frac{t^k}{k!} \right) I = e^{t} I.$
Case: $A$ diagonal. With similar arguments, if $A$ is diagonal with
$A = \mathrm{diag}(\lambda_1, \lambda_2, \ldots, \lambda_n),$
we obtain
$e^{At} = \mathrm{diag}(e^{\lambda_1 t}, e^{\lambda_2 t}, \ldots, e^{\lambda_n t}).$
Hence, it is very easy to compute the exponential when the matrix is diagonal.
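The diagonal case is easy to check numerically. The following sketch (numpy-only, illustrative) compares exponentiating the diagonal entries directly against a truncated Taylor series for $e^{At}$:

```python
import numpy as np

def expm_series(M, terms=30):
    """Truncated Taylor series for the matrix exponential (fine for small ||M||)."""
    out = np.eye(M.shape[0])
    term = np.eye(M.shape[0])
    for k in range(1, terms):
        term = term @ M / k   # term = M^k / k!
        out = out + term
    return out

lams = np.array([-1.0, -2.0, 0.5])
t = 0.7
A = np.diag(lams)

direct = np.diag(np.exp(lams * t))  # exponentiate the diagonal entries
series = expm_series(A * t)         # generic power series

print(np.allclose(direct, series))
```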
Commutativity property. Note now that if $AB = BA$, that is, if $A$ and $B$ commute, then (see Exercise)
$e^{A+B} = e^{A} e^{B}.$
Case: $A = J$ in Jordan form. We will use commutativity to compute the matrix exponential in the case where the matrix is in a Jordan form. Let us write a Jordan block
$J = \begin{pmatrix} \lambda & 1 & & \\ & \lambda & \ddots & \\ & & \ddots & 1 \\ & & & \lambda \end{pmatrix}$
as $J = \Lambda + N$, where
$\Lambda = \lambda I, \qquad N = J - \lambda I$
has 1s on the super-diagonal and 0s elsewhere. We note that $\Lambda N = N \Lambda$, for $\Lambda$ is the identity matrix multiplied by a scalar number. Hence, $e^{Jt} = e^{\Lambda t} e^{Nt} = e^{\lambda t} e^{Nt}$. All we need to compute is $e^{Nt}$, as we have already discussed how to compute $e^{\Lambda t}$. Here, one should note that $N$ is nilpotent.
More generally, for a Jordan matrix where the number of 1s off the diagonal of a Jordan block is $m$, the $(m+1)$th power $N^{m+1}$ is equal to 0.
Therefore, the infinite series
$e^{Nt} = \sum_{k=0}^{\infty} \frac{(Nt)^k}{k!}$
becomes the finite sum
$e^{Nt} = I + Nt + \frac{(Nt)^2}{2!} + \cdots + \frac{(Nt)^m}{m!}.$
Hence,
$e^{Jt} = e^{\lambda t} \left( I + Nt + \frac{(Nt)^2}{2!} + \cdots + \frac{(Nt)^m}{m!} \right).$
General matrix. Now that we know how to compute the exponential of a Jordan form, we can proceed to study a general matrix. Let $A = P J P^{-1}$, where $J$ is in a Jordan form. Then,
$A^k = P J^k P^{-1}.$
Finally,
$e^{At} = \sum_{k=0}^{\infty} \frac{(P J P^{-1})^k t^k}{k!} = P \left( \sum_{k=0}^{\infty} \frac{(Jt)^k}{k!} \right) P^{-1}$
and
$e^{At} = P e^{Jt} P^{-1}.$
Hence, once we obtain a diagonal matrix or a Jordan form matrix $J$, we can compute the exponential very efficiently.
The main insight here is that the eigenvalues determine whether the system remains bounded or not. In case we have a repeated eigenvalue with zero real part, the Jordan form determines whether the solution remains bounded or not. We state the following theorem.
For a linear differential equation
$\dot{x}(t) = A x(t),$
the solution is locally and globally asymptotically stable if and only if
$\mathrm{Re}(\lambda_i) < 0 \quad \text{for all } i,$
where $\mathrm{Re}(\cdot)$ denotes the real part of a complex number, and $\lambda_i$ denotes the $i$th eigenvalue of $A$.
Intuition: For linear systems, stability is entirely determined by the eigenvalues of . Each eigenvalue corresponds to a natural "mode" of the system, and the real part of the eigenvalue determines whether that mode grows or decays exponentially. If every mode decays (all eigenvalues in the open left half-plane), the entire system is stable -- and this works globally, not just locally. This is why pole placement is such a central concept in control design: moving eigenvalues to the left half-plane is equivalent to stabilizing the system.
This is one of the most important results in linear systems theory. It says that for a linear system, all three notions of stability coincide and reduce to a simple eigenvalue check. If all eigenvalues are in the open left half-plane, the system is globally asymptotically stable; if any eigenvalue has positive real part, the system is unstable.
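The eigenvalue check is a one-liner in practice. The sketch below (illustrative matrices, not from the text) tests whether all eigenvalues lie in the open left half-plane:

```python
import numpy as np

def is_hurwitz(A):
    """All eigenvalues in the open left half-plane => global asymptotic stability."""
    return bool(np.all(np.linalg.eigvals(A).real < 0))

stable = np.array([[0.0, 1.0], [-2.0, -3.0]])   # eigenvalues -1 and -2
unstable = np.array([[0.0, 1.0], [2.0, 1.0]])   # eigenvalues 2 and -1

print(is_hurwitz(stable), is_hurwitz(unstable))
```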
For a linear differential equation
$\dot{x}(t) = A x(t),$
the system is locally stable if and only if the following two conditions hold:
(i) $\mathrm{Re}(\lambda_i) \leq 0$ for all $i$,
(ii) if $\mathrm{Re}(\lambda_i) = 0$, for some $i$, the algebraic multiplicity of this eigenvalue should be the same as the geometric multiplicity.
Intuition: This theorem relaxes the asymptotic stability condition to mere boundedness. Eigenvalues on the imaginary axis are allowed (corresponding to sustained oscillations that neither grow nor decay), but only if they have no Jordan blocks larger than $1 \times 1$. A Jordan block for a purely imaginary eigenvalue introduces polynomial growth factors like $t e^{j\omega t}$, which grow without bound even though $\mathrm{Re}(\lambda) = 0$. Think of a frictionless pendulum: simple oscillation is fine (simple eigenvalue), but resonance-like behavior (repeated eigenvalue with a Jordan block) would make the amplitude grow.
Condition (ii) prevents polynomial growth from Jordan blocks. If a purely imaginary eigenvalue has a Jordan block of size greater than 1, the terms $t e^{\lambda t}$, $t^2 e^{\lambda t}$, etc. will grow without bound even though $\mathrm{Re}(\lambda) = 0$.
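The polynomial growth is visible in the simplest nilpotent example. For $N$ a $2 \times 2$ Jordan block with eigenvalue 0, the series terminates and $e^{Nt} = I + Nt$, so the first state component grows linearly in $t$ even though both eigenvalues have zero real part (an illustrative sketch, not from the text):

```python
import numpy as np

# Double eigenvalue at 0 with a single 2x2 Jordan block.
N = np.array([[0.0, 1.0], [0.0, 0.0]])

def expm_nilpotent(Nt):
    # N^2 = 0, so the exponential series terminates after two terms.
    return np.eye(2) + Nt

x0 = np.array([1.0, 1.0])
x_at_10 = expm_nilpotent(N * 10.0) @ x0    # x1(t) = x1(0) + t * x2(0)
x_at_100 = expm_nilpotent(N * 100.0) @ x0

print(x_at_10, x_at_100)
```

The second component stays constant, but the first grows without bound, so the system is not (even locally) stable.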
In practice, many systems are not linear, and hence the above theorems are not applicable.
A General Approach: Lyapunov's Method
A very versatile and effective approach to stability analysis is via Lyapunov functions (this approach is often called Lyapunov's second method, the first one being an analysis based on linearization, to be considered after this section). Let $D$ be an open set containing the equilibrium point, taken to be 0 here without any loss.
A function $V : D \to \mathbb{R}$ is called a Lyapunov function if
- $V(0) = 0$, $V(x) \geq 0$, $x \in D$,
- $V(x) > 0$, if $x \neq 0$,
- $V$ is continuous, and has continuous partial derivatives.
Intuition: A Lyapunov function is a generalized notion of "energy" for a system. Just as physical energy is always non-negative and equals zero only at rest, $V$ is positive everywhere except at the equilibrium. The key idea is that you do not need to solve the differential equation to prove stability -- you just need to find an appropriate energy-like function and show it never increases (or strictly decreases) along the system's trajectories. For mechanical systems, the actual energy often works as a Lyapunov function, but for abstract systems you may need to be creative.
A Lyapunov function serves as a generalized notion of "energy" for a system. The idea is that if we can find a positive-definite function that never increases along trajectories of the system, then the trajectories must remain bounded (stability). If the function is strictly decreasing, the system must converge to the equilibrium (asymptotic stability).
First we present results on local asymptotic stability. As above, let $D$ be an open set containing 0.
a) For a given differential equation $\dot{x}(t) = f(x(t))$ with $f(0) = 0$, and continuous $f$, if we can find a Lyapunov function $V$ such that
$\dot{V}(x) := \nabla V(x) \cdot f(x) \leq 0$
for all $x \in D$, then, the system is locally stable (stable in the sense of Lyapunov).
b) For a given differential equation $\dot{x}(t) = f(x(t))$ with $f(0) = 0$, and continuous $f$, if we can find a Lyapunov function $V$ such that
$\dot{V}(x) < 0$
for $x \in D \setminus \{0\}$, the system is locally asymptotically stable.
c) If b) holds for $D = \mathbb{R}^n$ so that $\dot{V}(x) < 0$ for all $x \neq 0$, and
$V(x) \to \infty \quad \text{as } \|x\| \to \infty,$
then the system is globally asymptotically stable.
Intuition: Lyapunov's direct method is like proving a ball will reach the bottom of a valley without tracking its exact path. Part (a) says: if the system's "energy" never increases, the state stays close (stability). Part (b) says: if the energy strictly decreases, the state must converge to equilibrium (asymptotic stability). Part (c) adds the global guarantee: if the energy function grows to infinity in all directions (so there are no "escape routes"), then asymptotic stability holds everywhere. The power of this method is that it avoids solving the differential equation entirely -- you only need a suitable energy function.
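The decrease of $V$ along trajectories can be observed numerically without ever solving the equation in closed form. The sketch below (an illustrative system chosen for this purpose, not from the text) integrates $\dot{x}_1 = -x_1 + x_2$, $\dot{x}_2 = -x_1 - x_2^3$ and records $V(x) = x_1^2 + x_2^2$, for which the cross terms cancel and $\dot{V} = -2x_1^2 - 2x_2^4 \leq 0$:

```python
import numpy as np

def f(x):
    # illustrative nonlinear system; with V = x1^2 + x2^2 the cross terms cancel
    return np.array([-x[0] + x[1], -x[0] - x[1] ** 3])

def V(x):
    return float(x @ x)   # candidate Lyapunov function V(x) = x1^2 + x2^2

x = np.array([1.0, -0.8])
dt = 1e-3
values = [V(x)]
for _ in range(20000):     # Euler steps over t in [0, 20]
    x = x + dt * f(x)
    values.append(V(x))

print(values[0], values[-1])
```

The recorded values never exceed the initial energy, and the final value is essentially zero, exactly as parts (b) and (c) predict.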
Let us, in addition to the conditions noted in Theorem (b), further impose that for some $c > 0$, the sublevel set $\{x : V(x) \leq c\}$ is a bounded set and $\{x : V(x) \leq c\} \subset D$, where $D$ satisfies Theorem (b). Then, every solution of the system with initial state $x_0 \in \{x : V(x) \leq c\}$ converges to the equilibrium.
Intuition: This theorem gives you a concrete, computable estimate of the "basin of attraction" -- the set of initial conditions from which the system is guaranteed to converge. The sublevel set $\{x : V(x) \leq c\}$ acts like a fence: since $V$ is decreasing along trajectories, once the state is inside this set, it can never leave. In practice, you pick $c$ as large as possible while keeping the sublevel set within the region where $\dot{V} < 0$, giving you the largest provable region of attraction.
By following (and slightly modifying) the proof of Theorem (b), we can conclude that $\{x : V(x) \leq c\}$ is a region of attraction for the equilibrium point, which is defined as a set of initial states whose corresponding solutions converge to the equilibrium point: $\{x_0 : \lim_{t \to \infty} x(t) = 0\}$.
For local stability, by restricting the analysis to $D$, we can allow the Lyapunov function to take even negative values outside $D$ or not necessarily be continuous outside $D$. In the theorem, we used such properties of $V$ only on $D$.
Show that is locally asymptotically stable, by picking as a Lyapunov function. Is this solution globally asymptotically stable as well?
We compute for all . Since as , by Theorem(c), the system is globally asymptotically stable.
Show that is locally asymptotically stable, by picking as a Lyapunov function. Is this solution globally asymptotically stable? Find a region of attraction for local stability.
We compute . This is negative when , i.e., when . So the system is locally asymptotically stable with region of attraction . The system is not globally asymptotically stable since for , .
One should note that BIBO stability and the stability notions considered in this chapter have very different contexts; BIBO stability is concerned with the input-output behaviour of systems, whereas the criteria considered in this chapter concern the effects of initial conditions (also called internal stability). The conditions are also slightly different for the linear setup: for continuous-time linear systems, BIBO stability requires all the eigenvalues to have strictly negative real parts, whereas stability itself may hold under more relaxed conditions.
Revisiting the linear case
Recall that an $n \times n$ real matrix $M$ is positive definite if $M$ is symmetric and $x^T M x > 0$ for all $x \neq 0$. It is positive semi-definite if $x^T M x \geq 0$ for all $x$. Note that $M$ being symmetric is part of the definition.
All eigenvalues of a square matrix $A$ have negative real parts if and only if for any given positive definite $N$, the (Lyapunov) equation
$A^T P + P A = -N$
has a unique solution $P$, where the solution is positive definite.
Intuition: The Lyapunov equation converts the problem of checking eigenvalue locations (a nonlinear problem) into solving a system of linear equations for $P$. If you can find a positive definite $P$ satisfying $A^T P + P A = -N$, then $V(x) = x^T P x$ is a Lyapunov function that certifies stability. In MATLAB, this is a single command (lyap). The theorem says this always works for stable linear systems: for any "target" energy dissipation rate $N$, there is a unique energy function that achieves it. This makes stability verification for linear systems completely algorithmic.
The Lyapunov equation is a linear matrix equation -- given $A$ and $N$, solving for $P$ is a system of linear equations. In MATLAB, the command lyap(A', N) solves this directly. This theorem converts a nonlinear eigenvalue problem (checking eigenvalue locations) into a linear algebra problem (solving a matrix equation and checking positive definiteness).
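In Python the same computation is available through SciPy. The sketch below (an illustrative matrix, not from the text) solves $A^T P + P A = -N$; note that SciPy's solve_continuous_lyapunov(a, q) solves $a X + X a^T = q$, so the transpose of $A$ is passed, mirroring lyap(A', N):

```python
import numpy as np
from scipy.linalg import solve_continuous_lyapunov

A = np.array([[0.0, 1.0], [-2.0, -3.0]])   # Hurwitz: eigenvalues -1 and -2
N = np.eye(2)                              # any positive definite choice works

# Solve A^T P + P A = -N.
P = solve_continuous_lyapunov(A.T, -N)

# P should be symmetric positive definite, certifying stability.
eigs = np.linalg.eigvalsh(P)
residual = A.T @ P + P @ A + N
print(eigs, np.abs(residual).max())
```

Positive eigenvalues of $P$ together with a negligible residual certify that $A$ is stable, without ever computing its eigenvalues.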
Non-Linear Systems and Linearization
Let $f$ be continuously differentiable. Consider $\dot{x}(t) = f(x(t))$ and let $f(x^*) = 0$ for some $x^*$. Let $A$ be the Jacobian of $f$ at $x^*$ (that is, $A_{ij} = \frac{\partial f_i}{\partial x_j}(x^*)$, with the linearization $\dot{z}(t) = A z(t)$ with $z = x - x^*$). Let $\lambda_1, \ldots, \lambda_n$ be the eigenvalues of $A$. If $\mathrm{Re}(\lambda_i) < 0$ for $i = 1, \ldots, n$, then $x^*$ is locally asymptotically stable.
Intuition: This theorem says that near an equilibrium, a nonlinear system behaves like its linearization. If the linearized system (the Jacobian) is stable, then the nonlinear system is locally stable too -- the higher-order nonlinear terms are too small near the equilibrium to overcome the stabilizing effect of the linear part. This is why linearization is such a workhorse in engineering: you can design controllers using linear theory and trust that they will work locally around the operating point. The caveat is that if the linearization has eigenvalues on the imaginary axis, the nonlinear terms decide stability and the linearization is inconclusive.
The above shows that linearization can be a very effective method. However, when the linearization leads to a matrix with an eigenvalue having a zero real part, the analysis (above, based on linearization) is inconclusive and further analysis would be required. To make this observation explicit, consider the two systems
$\dot{x}(t) = -x(t)^3 \qquad \text{and} \qquad \dot{x}(t) = x(t)^3,$
which have the same linearization around 0. By a Lyapunov stability argument taking $V(x) = x^2$, the first system can be shown to be locally and globally stable, whereas the second one is not (which can be verified by solving the differential equation directly: show that the solution blows up in finite time!).
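Linearization is easy to carry out numerically. The sketch below (an illustrative damped pendulum, not from the text) computes a finite-difference Jacobian at the equilibrium and checks its eigenvalues:

```python
import numpy as np

def f(x):
    # damped pendulum (illustrative): x1' = x2, x2' = -sin(x1) - 0.5*x2
    return np.array([x[1], -np.sin(x[0]) - 0.5 * x[1]])

def jacobian(f, x, eps=1e-6):
    """Finite-difference Jacobian of f at x."""
    n = len(x)
    J = np.zeros((n, n))
    fx = f(x)
    for j in range(n):
        step = np.zeros(n)
        step[j] = eps
        J[:, j] = (f(x + step) - fx) / eps
    return J

A = jacobian(f, np.zeros(2))       # linearization at the downward equilibrium
print(np.linalg.eigvals(A).real)   # all negative => locally asymptotically stable
```

Both eigenvalues have real part about $-0.25$, so the theorem above guarantees local asymptotic stability of the hanging position.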
Discrete-time Setup
The stability results presented for continuous-time linear systems have essentially identical generalizations for the discrete-time setup.
For a linear difference equation $x_{t+1} = A x_t$, the solution is locally and globally asymptotically stable if and only if
$|\lambda_i(A)| < 1 \quad \text{for all } i,$
where $\lambda_i(A)$ denotes the $i$th eigenvalue of $A$. That is, all eigenvalues must be strictly inside the unit disk.
For a linear difference equation $x_{t+1} = A x_t$, the system is locally stable if and only if:
(i) $|\lambda_i(A)| \leq 1$ for all $i$,
(ii) if $|\lambda_i(A)| = 1$ for some $i$, the algebraic multiplicity of this eigenvalue should be the same as the geometric multiplicity (i.e., the Jordan form corresponding to an eigenvalue on the unit circle is strictly diagonal).
Intuition: In discrete time, the unit circle plays the role that the imaginary axis plays in continuous time. Eigenvalues inside the unit disk correspond to decaying modes ($|\lambda|^k \to 0$), eigenvalues outside to growing modes, and eigenvalues on the unit circle to sustained oscillations. The Jordan block condition for local stability prevents polynomial growth: $k \lambda^{k-1}$ grows polynomially in $k$ when $|\lambda| = 1$ and the Jordan block has size greater than 1.
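The discrete-time eigenvalue test mirrors the continuous one; the sketch below (an illustrative matrix, not from the text) checks the spectral radius and confirms decay by iterating the map:

```python
import numpy as np

def is_schur(A):
    """All eigenvalues strictly inside the unit disk => asymptotic stability."""
    return bool(np.max(np.abs(np.linalg.eigvals(A))) < 1.0)

A = np.array([[0.5, 1.0], [0.0, 0.8]])   # eigenvalues 0.5 and 0.8

x = np.array([1.0, 1.0])
for _ in range(200):                      # iterate x_{t+1} = A x_t
    x = A @ x

print(is_schur(A), np.linalg.norm(x))
```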
In this case, we require the eigenvalues to be strictly inside the unit disk for asymptotic stability (local and global); and for local stability we additionally have the relaxation that the Jordan form corresponding to an eigenvalue on the unit circle is to be strictly diagonal: Any Jordan form block of size $m \times m$, with eigenvalue $\lambda$, can be written as
$J = \lambda I + N,$
where $N$ is a matrix which has all terms zero, except the super-diagonal (the points right above the diagonal), at which points the value is 1. The second term is such that $N^m = 0$. Finally, we use the power expansion and the fact that any matrix commutes with the identity matrix:
$J^k = (\lambda I + N)^k = \sum_{j=0}^{k} \binom{k}{j} \lambda^{k-j} N^j.$
Since $N^m = 0$, we have
$J^k = \sum_{j=0}^{m-1} \binom{k}{j} \lambda^{k-j} N^j.$
One can have discrete-time generalizations of Lyapunov functions.
Consider
$x_{t+1} = A x_t.$
All eigenvalues of $A$ have magnitudes strictly less than 1 if and only if for any given positive definite matrix $N$, or for $N = C^T C$ where $C$ is any given matrix with $(A, C)$ observable, the discrete Lyapunov equation
$A^T P A - P = -N$
has a unique solution $P$ which is also positive definite.
Intuition: This is the discrete-time counterpart of the continuous Lyapunov equation. Instead of checking that eigenvalues are in the left half-plane (continuous-time), you check that they are strictly inside the unit circle (discrete-time). The Lyapunov function decreases at each step: . Think of it as verifying that the system's "energy" drops with every discrete time step, guaranteeing that the state spirals inward to the origin.
The solution in the theorem statement is $P = \sum_{k=0}^{\infty} (A^T)^k N A^k$.
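The series form of the solution can be verified directly. The sketch below (an illustrative matrix, not from the text) solves the discrete Lyapunov equation with SciPy and cross-checks against a truncated series; note that SciPy's solve_discrete_lyapunov(a, q) solves $a X a^T - X + q = 0$, so the transpose of $A$ is passed to obtain $A^T P A - P = -N$:

```python
import numpy as np
from scipy.linalg import solve_discrete_lyapunov

A = np.array([[0.5, 0.2], [0.0, 0.7]])   # Schur: eigenvalues 0.5 and 0.7
N = np.eye(2)

# Solve A^T P A - P = -N.
P = solve_discrete_lyapunov(A.T, N)

# Cross-check against the series P = sum_k (A^T)^k N A^k.
P_series = sum(np.linalg.matrix_power(A.T, k) @ N @ np.linalg.matrix_power(A, k)
               for k in range(200))

print(np.allclose(P, P_series), np.all(np.linalg.eigvalsh(P) > 0))
```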
Exercises
Let $A$ be a square matrix. Show that $\frac{d}{dt} e^{At} = A e^{At}$.
Solution. Let $t, s$ be scalars. Since $At$ and $As$ commute, we have that $e^{A(t+s)} = e^{At} e^{As}$. Then,
$\frac{e^{A(t+s)} - e^{At}}{s} = \frac{e^{As} - I}{s} \, e^{At} = \left( A + \sum_{k=2}^{\infty} \frac{A^k s^{k-1}}{k!} \right) e^{At}.$
The final line follows from the fact that the sum $\sum_{k=2}^{\infty} \frac{A^k s^{k-1}}{k!}$ converges to zero as $s \to 0$: let $|A|$ be the matrix consisting of the absolute values of the entries of $A$; then each entry of this sum is dominated in absolute value by the corresponding entry of $|s| \sum_{k=2}^{\infty} \frac{|A|^k |s|^{k-2}}{k!}$, which vanishes as $s \to 0$.
The result follows.
Show that for square matrices $A$ and $B$ which commute, that is $AB = BA$, it follows that
$e^{A+B} = e^{A} e^{B}.$
Solution. Recall that
$e^{A+B} = \sum_{n=0}^{\infty} \frac{(A+B)^n}{n!},$
with the definition that $(A+B)^0 = I$. It follows that
$e^{A+B} = \sum_{n=0}^{\infty} \frac{1}{n!} \sum_{k=0}^{n} \binom{n}{k} A^k B^{n-k} = \sum_{k=0}^{\infty} \sum_{m=0}^{\infty} \frac{A^k}{k!} \frac{B^m}{m!} = e^{A} e^{B}.$
In the above, the substitution $m = n - k$ was used, the re-indexing of the double sum follows from re-expressing the summation, and the binomial theorem step uses the fact that $AB = BA$ to establish
$(A+B)^n = \sum_{k=0}^{n} \binom{n}{k} A^k B^{n-k}.$
This last statement is proved by induction. Clearly for $n = 1$, $(A+B)^1 = A + B$. Suppose this is true for $n$. Then for $n+1$:
$(A+B)^{n+1} = (A+B) \sum_{k=0}^{n} \binom{n}{k} A^k B^{n-k} = \sum_{k=0}^{n} \binom{n}{k} A^{k+1} B^{n-k} + \sum_{k=0}^{n} \binom{n}{k} A^k B^{n-k+1},$
where the second equality uses $AB = BA$ to move $A$ past the powers of $B$. Let us separate out the terms involving $A^k$ for $0 \leq k \leq n+1$. It turns out that we obtain:
$(A+B)^{n+1} = \sum_{k=0}^{n+1} \left[ \binom{n}{k-1} + \binom{n}{k} \right] A^k B^{n+1-k}.$
Now,
$\binom{n}{k-1} + \binom{n}{k} = \binom{n+1}{k}.$
This completes the proof.
Let satisfy
Is (locally) asymptotically stable?
Using , we get for . By Theorem(b), the system is locally asymptotically stable. Since as , it is also globally asymptotically stable by part (c).
Consider . Is this system asymptotically stable?
Hint: Convert this into a system of first-order differential equations, via and , . Then apply as a candidate Lyapunov function.
Solution. Setting , , the system becomes:
This is with .
Consider . First, note that is positive definite: writing for .
Computing :
for . Since as , by Theorem(c), the system is globally asymptotically stable.
Alternatively, the eigenvalues of are , which both have negative real part . By Theorem, the system is globally asymptotically stable.
Prove the following theorem:
Consider
$\dot{x}(t) = f(x(t)),$
where $f$ is continuous and $f(0) = 0$. Let $V$ be continuously differentiable. Suppose there exists a continuous function $W \geq 0$ such that
$\dot{V}(x) \leq -W(x).$
Then, provided $x(t)$ remains bounded, $W(x(t)) \to 0$.
Hint: Write
$V(x(t)) = V(x(0)) + \int_0^t \dot{V}(x(s)) \, ds \leq V(x(0)) - \int_0^t W(x(s)) \, ds,$
and conclude that for all $t$, by the non-negativity of $V$, we have that $\int_0^\infty W(x(s)) \, ds < \infty$. From here, we want to establish that $W(x(t)) \to 0$, provided that (by hypothesis) $x(t)$ remains bounded. Complete the proof.
Prove and use Barbalat's lemma: Let $g : [0, \infty) \to \mathbb{R}$ be uniformly continuous over $[0, \infty)$. Then, if $\lim_{t \to \infty} \int_0^t g(s) \, ds$ exists and is finite, then $\lim_{t \to \infty} g(t) = 0$.
Note: The above result also implies an important stability theorem known as LaSalle's invariance principle.
Consider a network of $n$ agents which are connected over a graph. We say that $A$ is an adjacency matrix if $A_{ij} = 1$ if Agent $i$ and Agent $j$ are connected and $A_{ij} = 0$ otherwise. For each agent $i$, define $d_i = \sum_j A_{ij}$ to be the degree of the agent. Now define $L = D - A$, where $D$ is a diagonal matrix with $D_{ii} = d_i$. Such a matrix $L$ is called a Laplacian.
Now, suppose that the agents update their states by the following equation:
$\dot{x}(t) = -L x(t).$
Observe that $L$ is a positive semi-definite matrix and, if the graph is connected, the only eigenvector corresponding to the zero eigenvalue is $\mathbf{1} = (1, 1, \ldots, 1)^T$. This you can see by noting that $x^T L x = \frac{1}{2} \sum_{i,j} A_{ij} (x_i - x_j)^2$.
In this case, define the following Lyapunov function:
$V(x) = \frac{1}{2} x^T x.$
Then,
$\dot{V}(x) = -x^T L x \leq 0.$
The above ensures that $x(t)$ remains bounded. Now, invoke Barbalat's Lemma to conclude that $x(t)^T L x(t) \to 0$. Since the only eigenvector corresponding to $x^T L x = 0$ is $\mathbf{1}$, and throughout the updates the sum $\sum_i x_i(t)$ is a constant (as the sum does not change), we have that every $x_i(t)$ converges to the average of the initial states.
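The consensus behavior is easy to reproduce. The sketch below (an assumed 4-agent path graph, chosen only for illustration) builds the Laplacian, Euler-integrates $\dot{x} = -Lx$, and checks that every agent converges to the average of the initial states while the sum stays constant:

```python
import numpy as np

# Path graph on 4 agents: 1 - 2 - 3 - 4 (assumed example topology).
Adj = np.array([[0, 1, 0, 0],
                [1, 0, 1, 0],
                [0, 1, 0, 1],
                [0, 0, 1, 0]], dtype=float)
L = np.diag(Adj.sum(axis=1)) - Adj    # graph Laplacian L = D - A

x = np.array([4.0, -2.0, 7.0, 1.0])
average = x.mean()                    # invariant of the dynamics

dt = 0.01
for _ in range(5000):                 # Euler steps of x' = -L x over t in [0, 50]
    x = x - dt * (L @ x)

print(x, average)
```

All four states end up at the initial average, illustrating that the consensus value is determined by the conserved sum.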
Consider
$\dot{x}(t) = a x(t) + u(t).$
Suppose that our goal is to have $x(t) \to 0$. We know that if we select $u(t) = -k x(t)$ for any $k > a$, the system is stable. In particular, let $k = a + 1$.
In many engineering applications, the value of $a$ is unknown.
Adaptive control theory is the sub-field of control theory studying such problems. The goal is to allow the controller to learn the system to be able to achieve the desired goal.
Suppose that the controller runs the following policy:
$u(t) = -(\hat{a}(t) + 1) x(t),$
which leads to $\dot{x}(t) = (a - \hat{a}(t)) x(t) - x(t)$, where $\hat{a}(t)$ is an estimate of $a$. Suppose that we take
$\dot{\hat{a}}(t) = x(t)^2.$
In this case, consider the Lyapunov function:
$V(t) = \frac{1}{2} x(t)^2 + \frac{1}{2} (a - \hat{a}(t))^2.$
We compute $\dot{V} = x \dot{x} - (a - \hat{a}) \dot{\hat{a}} = (a - \hat{a}) x^2 - x^2 - (a - \hat{a}) x^2 = -x(t)^2 \leq 0$.
Since $V$ is non-increasing, $x(t)$ and $\hat{a}(t)$ remain bounded. By Barbalat's Lemma (Theorem), $x(t) \to 0$.
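The argument can be watched in simulation. The sketch below is a plausible instance only: the plant $\dot{x} = ax + u$, the control $u = -(\hat{a} + 1)x$, and the adaptation law $\dot{\hat{a}} = x^2$ are all assumed for illustration, with the true parameter $a$ hidden from the controller:

```python
# Scalar adaptive regulator sketch (all specifics assumed, not from the text):
# plant x' = a x + u with a unknown to the controller,
# certainty-equivalence control u = -(a_hat + 1) x,
# adaptation law a_hat' = x^2.
a = 2.0                  # true parameter, unknown to the controller
x, a_hat = 1.0, 0.0
dt = 1e-3

for _ in range(30000):   # Euler steps over t in [0, 30]
    u = -(a_hat + 1.0) * x
    x, a_hat = x + dt * (a * x + u), a_hat + dt * x**2

print(abs(x), a_hat)
```

The state initially grows (the estimate starts too low), the estimate rises until the effective gain exceeds $a$, and the state then decays to zero, while $\hat{a}$ settles at some bounded value; note that $\hat{a}$ need not converge to the true $a$, only $x(t) \to 0$ is guaranteed.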