14. Stochastic Processes


  1. 14. Stochastic Processes

Introduction

Let ξ denote the random outcome of an experiment. To every such outcome suppose a waveform X(t, ξ) is assigned. The collection of such waveforms {X(t, ξ)} forms a stochastic process. The set of outcomes {ξₖ} and the time index t can be continuous or discrete (countably infinite or finite) as well.

[Fig. 14.1: sample realizations X(t, ξ₁), X(t, ξ₂), …, X(t, ξₙ) plotted against t, with the values at fixed times t₁ and t₂ marked.]

For fixed ξᵢ ∈ S (the set of all experimental outcomes), X(t, ξᵢ) is a specific time function. For fixed t₁, X₁ = X(t₁, ξᵢ) is a random variable. The ensemble of all such realizations X(t, ξ) over time represents the stochastic process X(t). (PILLAI/Cha)

  2. process X(t) (see Fig. 14.1). For example,

    X(t) = a cos(ω₀t + φ),

where φ is a uniformly distributed random variable in (0, 2π), represents a stochastic process. Stochastic processes are everywhere: Brownian motion, stock-market fluctuations, and various queuing systems all represent stochastic phenomena.

If X(t) is a stochastic process, then for fixed t, X(t) represents a random variable. Its distribution function is given by

    F_X(x, t) = P{X(t) ≤ x}    (14-1)

Notice that F_X(x, t) depends on t, since for a different t we obtain a different random variable. Further,

    f_X(x, t) ≜ dF_X(x, t)/dx    (14-2)

represents the first-order probability density function of the process X(t).
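As a quick sanity check on (14-1), the first-order distribution of the random-phase cosine process can be estimated by sampling the ensemble at a fixed t. This is an illustrative sketch; the amplitude, frequency, and sample count are assumed values, not taken from the text:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed parameters for the random-phase cosine example
a, w0 = 1.0, 2 * np.pi          # amplitude and angular frequency
n_realizations = 100_000

phi = rng.uniform(0, 2 * np.pi, n_realizations)   # phi ~ U(0, 2*pi)

def F_X(x, t):
    """Empirical first-order distribution F_X(x, t) = P{X(t) <= x}."""
    samples = a * np.cos(w0 * t + phi)            # X(t) as a random variable
    return np.mean(samples <= x)

# At any fixed t, a*cos(w0*t + phi) is symmetric about 0, so F_X(0, t)
# should come out near 0.5 regardless of t.
print(F_X(0.0, t=0.3))
```

Because the same phase samples are reused for every t, the function also illustrates the "fixed t gives a random variable" viewpoint directly.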

  3. For t = t₁ and t = t₂, X(t) represents two different random variables X₁ = X(t₁) and X₂ = X(t₂), respectively. Their joint distribution is given by

    F_X(x₁, x₂, t₁, t₂) = P{X(t₁) ≤ x₁, X(t₂) ≤ x₂}    (14-3)

and

    f_X(x₁, x₂, t₁, t₂) ≜ ∂²F_X(x₁, x₂, t₁, t₂) / (∂x₁ ∂x₂)    (14-4)

represents the second-order density function of the process X(t). Similarly, f_X(x₁, x₂, …, xₙ, t₁, t₂, …, tₙ) represents the nth-order density function of the process X(t). Complete specification of the stochastic process X(t) requires knowledge of f_X(x₁, x₂, …, xₙ, t₁, t₂, …, tₙ) for all tᵢ, i = 1, 2, …, n, and for all n (an almost impossible task in reality).

  4. Mean of a Stochastic Process:

    μ_X(t) ≜ E{X(t)} = ∫₋∞⁺∞ x f_X(x, t) dx    (14-5)

represents the mean value of the process X(t). In general, the mean of a process can depend on the time index t.

The autocorrelation function of a process X(t) is defined as

    R_XX(t₁, t₂) ≜ E{X(t₁) X*(t₂)} = ∫∫ x₁ x₂* f_X(x₁, x₂, t₁, t₂) dx₁ dx₂    (14-6)

and it represents the interrelationship between the random variables X₁ = X(t₁) and X₂ = X(t₂) generated from the process X(t).

Properties:

1. R_XX(t₁, t₂) = R_XX*(t₂, t₁) = [E{X(t₂) X*(t₁)}]*    (14-7)

2. R_XX(t, t) = E{|X(t)|²} > 0.  (Average instantaneous power)
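In practice, (14-5) and (14-6) are often estimated by averaging across an ensemble of realizations rather than integrating against a known density. The sketch below assumes a toy process (random slope plus noise, not from the text) whose mean visibly depends on t, as the text notes it can:

```python
import numpy as np

rng = np.random.default_rng(1)

# Ensemble of realizations (rows) sampled at times t (columns).
# Assumed toy process: X(t) = A*t + noise, with random slope A ~ N(1, 0.2).
t = np.linspace(0.0, 1.0, 50)
A = rng.normal(1.0, 0.2, size=(10_000, 1))
X = A * t + 0.1 * rng.standard_normal((10_000, t.size))

# (14-5): mu_X(t) = E{X(t)}, estimated by averaging across the ensemble
mu = X.mean(axis=0)

# (14-6): R_XX(t1, t2) = E{X(t1) X*(t2)}, estimated as a Gram matrix
# (the process here is real-valued, so conj() is a no-op)
R = (X.T @ X.conj()) / X.shape[0]

# The mean grows roughly like t, so it is time-dependent
print(mu[0], mu[-1])
```

Property (14-7) reduces to symmetry of R for a real process, which the Gram-matrix estimate satisfies by construction.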

  5. 3. R_XX(t₁, t₂) represents a nonnegative definite function, i.e., for any set of constants {aᵢ}, i = 1, 2, …, n,

    Σᵢ₌₁ⁿ Σⱼ₌₁ⁿ aᵢ aⱼ* R_XX(tᵢ, tⱼ) ≥ 0.    (14-8)

Eq. (14-8) follows by noticing that E{|Y|²} ≥ 0 for Y = Σᵢ₌₁ⁿ aᵢ X(tᵢ).

The function

    C_XX(t₁, t₂) = R_XX(t₁, t₂) − μ_X(t₁) μ_X*(t₂)    (14-9)

represents the autocovariance function of the process X(t).

Example 14.1: Let

    z = ∫₋T^T X(t) dt.

Then

    E[|z|²] = ∫₋T^T ∫₋T^T E{X(t₁) X*(t₂)} dt₁ dt₂
            = ∫₋T^T ∫₋T^T R_XX(t₁, t₂) dt₁ dt₂.    (14-10)
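The nonnegative-definiteness property (14-8) can be checked numerically: a sample autocorrelation matrix is a Gram matrix, so every quadratic form built from it equals an estimate of E{|Y|²} and cannot be negative. The process below (white noise passed through a 3-point moving average) is an assumed example, not from the text:

```python
import numpy as np

rng = np.random.default_rng(2)

# Ensemble of an assumed toy process: white noise smoothed by a
# 3-point running average, which introduces correlation across time.
X = rng.standard_normal((20_000, 40))
X = (X[:, :-2] + X[:, 1:-1] + X[:, 2:]) / 3.0

# Empirical autocorrelation matrix: R[i, j] ~ R_XX(t_i, t_j)
R = (X.T @ X) / X.shape[0]

# (14-8): for any constants a_i, sum_i sum_j a_i a_j* R(t_i, t_j) >= 0,
# because it equals E{|Y|^2} with Y = sum_i a_i X(t_i).
for _ in range(5):
    a = rng.standard_normal(R.shape[0])
    q = a @ R @ a          # the quadratic form of (14-8)
    assert q >= 0.0        # nonnegative by the Gram-matrix construction
print("quadratic forms nonnegative")
```

Equivalently, all eigenvalues of such an estimated R are nonnegative (up to floating-point rounding).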

  6. Example 14.2:

    X(t) = a cos(ω₀t + φ),  φ ~ U(0, 2π).    (14-11)

This gives

    μ_X(t) = E{X(t)} = a E{cos(ω₀t + φ)}
           = a cos ω₀t · E{cos φ} − a sin ω₀t · E{sin φ} = 0,    (14-12)

since E{cos φ} = (1/2π) ∫₀²ᵖⁱ cos φ dφ = 0 = E{sin φ}.

Similarly,

    R_XX(t₁, t₂) = a² E{cos(ω₀t₁ + φ) cos(ω₀t₂ + φ)}
                 = (a²/2) E{cos ω₀(t₁ − t₂) + cos(ω₀(t₁ + t₂) + 2φ)}
                 = (a²/2) cos ω₀(t₁ − t₂).    (14-13)
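Both closed-form results of Example 14.2 are easy to verify by Monte Carlo: average over many draws of φ and compare against (14-12) and (14-13). The parameter values below are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

a, w0 = 2.0, 1.5                     # assumed amplitude and frequency
phi = rng.uniform(0, 2 * np.pi, 1_000_000)

def X(t):
    """One draw of X(t) = a*cos(w0*t + phi) for every sampled phase."""
    return a * np.cos(w0 * t + phi)

t1, t2 = 0.7, 0.2

# (14-12): mu_X(t) = 0 for every t
mu_hat = X(t1).mean()

# (14-13): R_XX(t1, t2) = (a^2 / 2) * cos(w0 * (t1 - t2))
R_hat = np.mean(X(t1) * X(t2))
R_true = (a**2 / 2) * np.cos(w0 * (t1 - t2))

print(mu_hat, R_hat, R_true)
```

With a million samples, mu_hat should sit near 0 and R_hat near R_true to within Monte Carlo noise.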

  7. Stationary Stochastic Processes

Stationary processes exhibit statistical properties that are invariant to a shift in the time index. Thus, for example, second-order stationarity implies that the statistical properties of the pairs {X(t₁), X(t₂)} and {X(t₁ + c), X(t₂ + c)} are the same for any c. Similarly, first-order stationarity implies that the statistical properties of X(tᵢ) and X(tᵢ + c) are the same for any c.

In strict terms, the statistical properties are governed by the joint probability density function. Hence a process is nth-order Strict-Sense Stationary (S.S.S.) if

    f_X(x₁, x₂, …, xₙ, t₁, t₂, …, tₙ) ≡ f_X(x₁, x₂, …, xₙ, t₁ + c, t₂ + c, …, tₙ + c)    (14-14)

for any c, where the left side represents the joint density function of the random variables X₁ = X(t₁), X₂ = X(t₂), …, Xₙ = X(tₙ), and the right side corresponds to the joint density function of the random variables X₁′ = X(t₁ + c), X₂′ = X(t₂ + c), …, Xₙ′ = X(tₙ + c). A process X(t) is said to be strict-sense stationary if (14-14) is true for all tᵢ, i = 1, 2, …, n, for all n = 1, 2, …, and for any c.

  8. For a first-order strict-sense stationary process, from (14-14) we have

    f_X(x, t) ≡ f_X(x, t + c)    (14-15)

for any c. In particular, c = −t gives

    f_X(x, t) = f_X(x)    (14-16)

i.e., the first-order density of X(t) is independent of t. In that case

    E[X(t)] = ∫₋∞⁺∞ x f(x) dx = μ, a constant.    (14-17)

Similarly, for a second-order strict-sense stationary process we have from (14-14)

    f_X(x₁, x₂, t₁, t₂) ≡ f_X(x₁, x₂, t₁ + c, t₂ + c)

for any c. For c = −t₂ we get

    f_X(x₁, x₂, t₁, t₂) ≡ f_X(x₁, x₂, t₁ − t₂)    (14-18)

  9. i.e., the second-order density function of a strict-sense stationary process depends only on the difference of the time indices t₁ − t₂ = τ. In that case the autocorrelation function is given by

    R_XX(t₁, t₂) ≜ E{X(t₁) X*(t₂)}
                 = ∫∫ x₁ x₂* f_X(x₁, x₂, τ = t₁ − t₂) dx₁ dx₂
                 = R_XX(t₁ − t₂) ≜ R_XX(τ) = R_XX*(−τ),    (14-19)

i.e., the autocorrelation function of a second-order strict-sense stationary process depends only on the difference of the time indices τ = t₁ − t₂.

Notice that (14-17) and (14-19) are consequences of the stochastic process being first- and second-order strict-sense stationary. On the other hand, the basic conditions for first- and second-order stationarity, Eqs. (14-16) and (14-18), are usually difficult to verify. In that case, we often resort to a looser definition of stationarity, known as Wide-Sense Stationarity (W.S.S.), by making use of

  10. (14-17) and (14-19) as the necessary conditions. Thus, a process X(t) is said to be Wide-Sense Stationary if

    (i)  E{X(t)} = μ    (14-20)

and

    (ii) E{X(t₁) X*(t₂)} = R_XX(t₁ − t₂),    (14-21)

i.e., for wide-sense stationary processes, the mean is a constant and the autocorrelation function depends only on the difference between the time indices. Notice that (14-20)-(14-21) do not say anything about the nature of the probability density functions; they deal instead with the average behavior of the process. Since (14-20)-(14-21) follow from (14-16) and (14-18), strict-sense stationarity always implies wide-sense stationarity. However, the converse is not true in general, the only exception being the Gaussian process. This follows since, if X(t) is a Gaussian process, then by definition X₁ = X(t₁), X₂ = X(t₂), …, Xₙ = X(tₙ) are jointly Gaussian random variables for any t₁, t₂, …, tₙ, whose joint characteristic function is given by
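The two W.S.S. conditions can be probed empirically for the random-phase cosine of Example 14.2: the estimated autocorrelation should be unchanged when both time arguments are shifted by the same c, matching (14-21). Parameter values here are assumed for illustration:

```python
import numpy as np

rng = np.random.default_rng(4)

# Random-phase cosine process of Example 14.2, which is wide-sense
# stationary: constant mean (14-20), shift-invariant R_XX (14-21).
a, w0, c = 1.0, 2.0, 0.9            # assumed amplitude, frequency, time shift
phi = rng.uniform(0, 2 * np.pi, 500_000)

def X(t):
    return a * np.cos(w0 * t + phi)

def R(s1, s2):
    """Monte Carlo estimate of R_XX(s1, s2) = E{X(s1) X(s2)}."""
    return np.mean(X(s1) * X(s2))

t1, t2 = 0.3, 1.1

# (14-21): R_XX depends only on t1 - t2, so shifting both times by c
# leaves the estimate unchanged (up to sampling noise); both values
# should be near (a**2 / 2) * cos(w0 * (t1 - t2)), per (14-13)
print(R(t1, t2), R(t1 + c, t2 + c))
```

A strict-sense check would require comparing whole joint densities, which is exactly why the looser W.S.S. conditions are the ones verified in practice.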
