Szidarovszky, F. and Bahill, A.T., "Stability Analysis," The Electrical Engineering Handbook, ed. Richard C. Dorf, Boca Raton: CRC Press LLC, 2000.
© 2000 by CRC Press LLC

12 Stability Analysis

Ferenc Szidarovszky, University of Arizona
A. Terry Bahill, University of Arizona

12.1 Introduction
12.2 Using the State of the System to Determine Stability
12.3 Lyapunov Stability Theory
12.4 Stability of Time-Invariant Linear Systems
    Stability Analysis with State-Space Notation • The Transfer Function Approach
12.5 BIBO Stability
12.6 Physical Examples

12.1 Introduction

In this chapter, which is based on Szidarovszky and Bahill [1992], we first discuss stability in general and then present four techniques for assessing the stability of a system: (1) Lyapunov functions, (2) finding the eigenvalues for state-space notation, (3) finding the location in the complex frequency plane of the poles of the closed-loop transfer function, and (4) proving bounded outputs for all bounded inputs.

Proving stability with Lyapunov functions is very general: it works for nonlinear and time-varying systems. It is also good for doing proofs. Proving the stability of a system with Lyapunov functions is difficult, however, and failure to find a Lyapunov function that proves a system is stable does not prove that the system is unstable. The next techniques we present, finding the eigenvalues or the poles of the transfer function, are sometimes difficult, because they require factoring high-degree polynomials. Many commercial software packages are now available for this task, however. We think most engineers would benefit by having one of these computer programs. Jamshidi et al. [1992] and advertisements in technical publications such as the IEEE Control Systems Magazine and IEEE Spectrum describe many appropriate software packages. The last technique we present, bounded-input, bounded-output stability, is also quite general.

Let us begin our discussion of stability and instability of systems informally. In an unstable system the state can have large variations, and small inputs or small changes in the initial state may produce large variations in the output. A common example of an unstable system is illustrated by someone pointing the microphone of a public address (PA) system at a speaker; a loud high-pitched tone results. Often instabilities are caused by too much gain, so to quiet the PA system, decrease the gain by pointing the microphone away from the speaker.

Discrete systems can also be unstable. A friend of ours once provided an example. She was sitting in a chair reading and she got cold, so she went over and turned up the thermostat on the heater. The house warmed up. She got hot, so she got up and turned down the thermostat. The house cooled off. She got cold and turned up the thermostat. This process continued until someone finally suggested that she put on a sweater (reducing the gain of her heat loss system). She did, and was much more comfortable. We modeled this as a discrete system, because she seemed to sample the environment and produce outputs at discrete intervals about 15 minutes apart.
12.2 Using the State of the System to Determine Stability

The stability of a system is defined with respect to a given equilibrium point in state space. If the initial state x0 is selected at an equilibrium state x̄ of the system, then the state will remain at x̄ for all future time. When the initial state is selected close to an equilibrium state, the system might remain close to the equilibrium state or it might move away. In this section we introduce conditions that guarantee that whenever the system starts near an equilibrium state, it remains near it, perhaps even converging to the equilibrium state as time increases. For simplicity, only time-invariant systems are considered in this section. Time-variant systems are discussed in Section 12.5.

Continuous, time-invariant systems have the form

    ẋ(t) = f(x(t))    (12.1)

and discrete, time-invariant systems are modeled by the difference equation

    x(t + 1) = f(x(t))    (12.2)

Here we assume that f: X → Rⁿ, where X ⊆ Rⁿ is the state space. We also assume that function f is continuous; furthermore, for arbitrary initial state x0 ∈ X, there is a unique solution of the corresponding initial value problem x(t0) = x0, and the entire trajectory x(t) is in X. Assume furthermore that t0 denotes the initial time period of the system.

It is also known that a vector x̄ ∈ X is an equilibrium state of the continuous system, Eq. (12.1), if and only if f(x̄) = 0, and it is an equilibrium state of the discrete system, Eq. (12.2), if and only if x̄ = f(x̄). In this chapter the equilibrium of a system will always mean the equilibrium state, if it is not specified otherwise. In analyzing the dependence of the state trajectory x(t) on the selection of the initial state x0 nearby the equilibrium, the following stability types are considered.

Definition 12.1

1. An equilibrium state x̄ is stable if there is an ε0 > 0 with the following property: For all ε1, 0 < ε1 < ε0, there is an ε > 0 such that if ‖x̄ − x0‖ < ε, then ‖x̄ − x(t)‖ < ε1, for all t > t0.
2. An equilibrium state x̄ is asymptotically stable if it is stable and there is an ε > 0 such that whenever ‖x̄ − x0‖ < ε, then x(t) → x̄ as t → ∞.
3. An equilibrium state x̄ is globally asymptotically stable if it is stable and with arbitrary initial state x0 ∈ X, x(t) → x̄ as t → ∞.

The first definition says an equilibrium state x̄ is stable if the entire trajectory x(t) is closer to the equilibrium state than any small ε1, if the initial state x0 is selected close enough to the equilibrium state. For asymptotic stability, in addition, x(t) has to converge to the equilibrium state as t → ∞. If an equilibrium state is globally asymptotically stable, then x(t) converges to the equilibrium state regardless of how the initial state x0 is selected. These stability concepts are called internal, because they represent properties of the state of the system. They are illustrated in Fig. 12.1.

In the electrical engineering literature, sometimes our stability definition is called marginal stability, and our asymptotic stability is called stability.

FIGURE 12.1 Stability concepts. (Source: F. Szidarovszky and A.T. Bahill, Linear Systems Theory, Boca Raton, Fla.: CRC Press, 1992, p. 168. With permission.)
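The distinctions in Definition 12.1 can be seen numerically. The following sketch is not from the chapter; the three scalar maps are illustrative assumptions, each a discrete system x(t + 1) = f(x(t)) with equilibrium x̄ = 0 (since f(0) = 0 in every case):

```python
def iterate(f, x0, steps=50):
    """Trajectory x(0), x(1), ..., x(steps) of the discrete system x(t+1) = f(x(t))."""
    traj = [x0]
    for _ in range(steps):
        traj.append(f(traj[-1]))
    return traj

asym = iterate(lambda x: 0.5 * x, 1.0)    # x(t) -> 0: asymptotically stable
osc  = iterate(lambda x: -x, 1.0)         # |x(t)| stays at 1: stable, not asymptotically
blow = iterate(lambda x: 2.0 * x, 1e-6)   # unbounded growth from a tiny perturbation: unstable

print(abs(asym[-1]))   # ~9e-16
print(abs(osc[-1]))    # 1.0
print(abs(blow[-1]))   # ~1.1e9
```

The second map illustrates why the chapter's "stable" is sometimes called marginal stability: the trajectory never leaves a small neighborhood of the equilibrium, but it never converges to it either.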
12.3 Lyapunov Stability Theory

Assume that x̄ is an equilibrium state of a continuous or discrete system, and let Ω denote a subset of the state space X such that x̄ ∈ Ω.

Definition 12.2

A real-valued function V defined on Ω is called a Lyapunov function, if

1. V is continuous;
2. V has a unique global minimum at x̄ with respect to all other points in Ω;
3. for any state trajectory x(t) contained in Ω, V(x(t)) is nonincreasing in t.

The Lyapunov function can be interpreted as the generalization of the energy function in electrical systems. The first requirement simply means that the graph of V has no discontinuities. The second requirement means that the graph of V has its lowest point at the equilibrium, and the third requirement generalizes the well-known fact of electrical systems, that the energy in a free electrical system with resistance always decreases, unless the system is at rest.

Theorem 12.1

Assume that there exists a Lyapunov function V on the spherical region

    Ω = {x : ‖x − x̄‖ < ε0}    (12.3)

where ε0 > 0 is given; furthermore Ω ⊆ X. Then the equilibrium state is stable.

Theorem 12.2

Assume that in addition to the conditions of Theorem 12.1, the Lyapunov function V(x(t)) is strictly decreasing in t, unless x(t) = x̄. Then the equilibrium state is asymptotically stable.

Theorem 12.3

Assume that the Lyapunov function is defined on the entire state space X, V(x(t)) is strictly decreasing in t unless x(t) = x̄; furthermore, V(x) tends to infinity as any component of x gets arbitrarily large in magnitude. Then the equilibrium state is globally asymptotically stable.

Example 12.1

Consider the differential equation

    ẋ = [  0   ω ] x + [ 0 ]
        [ −ω   0 ]     [ 1 ]

The stability of the equilibrium state (1/ω, 0)^T can be verified directly by using Theorem 12.1 without computing the solution. Select the Lyapunov function

    V(x) = (x − x̄)^T (x − x̄) = ‖x − x̄‖₂²

where the Euclidean norm is used. This is continuous in x; furthermore, it has its minimal (zero) value at x = x̄. Therefore, to establish the stability of the equilibrium state we have to show only that V(x(t)) is nonincreasing. Simple differentiation shows that

    d/dt V(x(t)) = 2(x − x̄)^T · ẋ = 2(x − x̄)^T (Ax + b)
with

    A = [  0   ω ] ,   b = [ 0 ]
        [ −ω   0 ]         [ 1 ]

That is, with x = (x1, x2)^T,

    d/dt V(x(t)) = 2[(x1 − 1/ω)(ωx2) + x2(−ωx1 + 1)] = 2(ωx1x2 − x2 − ωx1x2 + x2) = 0

Therefore, function V(x(t)) is a constant, which is a (not strictly) decreasing function. That is, all conditions of Theorem 12.1 are satisfied, which implies the stability of the equilibrium state.

Theorems 12.1, 12.2, and 12.3 guarantee, respectively, the stability, asymptotic stability, and global asymptotic stability of the equilibrium state, if a Lyapunov function is found. Failure to find such a Lyapunov function does not mean that the system is unstable or that the stability is not asymptotic or globally asymptotic. It only means that you were not clever enough to find a Lyapunov function that proved stability.

12.4 Stability of Time-Invariant Linear Systems

This section is divided into two subsections. In the first subsection the stability of linear time-invariant systems given in state-space notation is analyzed. In the second subsection, methods based on transfer functions are discussed.

Stability Analysis with State-Space Notation

Consider the time-invariant continuous linear system

    ẋ = Ax + b    (12.4)

and the time-invariant discrete linear system

    x(t + 1) = Ax(t) + b    (12.5)

Assume that x̄ is an equilibrium state, and let φ(t, t0) denote the fundamental matrix.

Theorem 12.4

1. The equilibrium state x̄ is stable if and only if φ(t, t0) is bounded for t ≥ t0.
2. The equilibrium state x̄ is asymptotically stable if and only if φ(t, t0) is bounded and tends to zero as t → ∞.

We use the symbol s to denote complex frequency, i.e., s = σ + jω. For specific values of s, such as eigenvalues and poles, we use the symbol λ.

Theorem 12.5

1. If for at least one eigenvalue of A, Re λi > 0 (or |λi| > 1 for discrete systems), then the system is unstable.
2. Assume that for all eigenvalues λi of A, Re λi ≤ 0 in the continuous case (or |λi| ≤ 1 in the discrete case), and all eigenvalues with the property Re λi = 0 (or |λi| = 1) have single multiplicity; then the equilibrium state is stable.
3. The stability is asymptotic if and only if for all i, Re λi < 0 (or |λi| < 1).
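Theorem 12.5 can be cross-checked against Example 12.1 numerically. The sketch below is an illustration, not part of the chapter; it assumes ω = 2.0 and uses NumPy. The eigenvalues of A come out as ±jω, so Re λi = 0 with single multiplicity: the equilibrium is stable but not asymptotically stable, in agreement with the constant Lyapunov function found above.

```python
import numpy as np

omega = 2.0  # illustrative value; any nonzero omega behaves the same way
A = np.array([[0.0, omega],
              [-omega, 0.0]])
b = np.array([0.0, 1.0])

# Equilibrium state: solve A x + b = 0, which gives (1/omega, 0)^T.
x_eq = np.linalg.solve(A, -b)
print(x_eq)                       # [0.5 0. ]

# Theorem 12.5, case 2: eigenvalues are purely imaginary (+/- j*omega),
# each of single multiplicity -> stable, but not asymptotically stable.
lams = np.linalg.eigvals(A)
print(sorted(lams.imag))          # [-2.0, 2.0]
print(np.allclose(lams.real, 0))  # True

# Along trajectories, y = x - x_eq obeys ydot = A y; since A is skew-symmetric
# the solution is a pure rotation, so V(x) = ||x - x_eq||^2 stays constant.
y0 = np.array([0.3, -0.4])
for t in (0.5, 1.0, 5.0):
    c, s = np.cos(omega * t), np.sin(omega * t)
    y_t = np.array([[c, s], [-s, c]]) @ y0   # exact solution exp(A t) y0
    assert np.isclose(y_t @ y_t, y0 @ y0)    # V is constant, as in Example 12.1
```

The same eigenvalue test would report asymptotic stability as soon as any damping is added, e.g., replacing the zero diagonal of A with small negative entries moves both eigenvalues into the open left half plane.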