A first course in statistics for signal analysis

Wojbor A. Woyczyński
A First Course in Statistics for Signal Analysis
Second Edition

Department of Statistics and Center for Stochastic and Chaotic Processes in Sciences and Technology, Case Western Reserve University, 10900 Euclid Avenue, Cleveland, OH 44106, USA. waw@case.edu, http://stat.case.edu/~Wojbor/

ISBN 978-0-8176-8100-5
e-ISBN 978-0-8176-8101-2
DOI 10.1007/978-0-8176-8101-2
Springer New York Dordrecht Heidelberg London
Library of Congress Control Number: 2010937639
Mathematics Subject Classification (2010): 60-01, 60G10, 60G12, 60G15, 60G35, 62-01, 62M10, 62M15, 62M20

© Springer Science+Business Media, LLC 2011. All rights reserved. This work may not be translated or copied in whole or in part without the written permission of the publisher (Springer Science+Business Media, LLC, 233 Spring Street, New York, NY 10013, USA), except for brief excerpts in connection with reviews or scholarly analysis. Use in connection with any form of information storage and retrieval, electronic adaptation, computer software, or by similar or dissimilar methodology now known or hereafter developed is forbidden. The use in this publication of trade names, trademarks, service marks, and similar terms, even if they are not identified as such, is not to be taken as an expression of opinion as to whether or not they are subject to proprietary rights.

Printed on acid-free paper. www.birkhauser-science.com

This book is dedicated to my children: Martin Wojbor, Gregory Holbrook, and Lauren Pike. They make it all worth it.

Contents

Foreword to the Second Edition
Introduction
Notation
1 Description of Signals
  1.1 Types of Random Signals
  1.2 Characteristics of Signals
  1.3 Time-Domain and Frequency-Domain Descriptions of Periodic Signals
  1.4 Building a Better Mousetrap: Complex Exponentials
  1.5 Problems and Exercises
2 Spectral Representation of Deterministic Signals: Fourier Series and Transforms
  2.1 Complex Fourier Series for Periodic Signals
  2.2 Approximation of Periodic Signals by Finite Fourier Sums
  2.3 Aperiodic Signals and Fourier Transforms
  2.4 Basic Properties of the Fourier Transform
  2.5 Fourier Transforms of Some Nonintegrable Signals; Dirac's Delta Impulse
  2.6 Discrete and Fast Fourier Transforms
  2.7 Problems and Exercises
3 Random Quantities and Random Vectors
  3.1 Discrete, Continuous, and Singular Random Quantities
  3.2 Expectations and Moments of Random Quantities
  3.3 Random Vectors, Conditional Probabilities, Statistical Independence, and Correlations
  3.4 The Least-Squares Fit, Linear Regression
  3.5 The Law of Large Numbers and the Stability of Fluctuations Law
  3.6 Estimators of Parameters and Their Accuracy; Confidence Intervals
  3.7 Problems, Exercises, and Tables
4 Stationary Signals
  4.1 Stationarity and Autocovariance Functions
  4.2 Estimating the Mean and the Autocovariance Function; Ergodic Signals
  4.3 Problems and Exercises
5 Power Spectra of Stationary Signals
  5.1 Mean Power of a Stationary Signal
  5.2 Power Spectrum and Autocovariance Function
  5.3 Power Spectra of Interpolated Digital Signals
  5.4 Problems and Exercises
6 Transmission of Stationary Signals Through Linear Systems
  6.1 Time-Domain Analysis
  6.2 Frequency-Domain Analysis and System's Bandwidth
  6.3 Digital Signal, Discrete-Time Sampling
  6.4 Problems and Exercises
7 Optimization of Signal-to-Noise Ratio in Linear Systems
  7.1 Parametric Optimization for a Fixed Filter Structure
  7.2 Filter Structure Matched to Input Signal
  7.3 The Wiener Filter
  7.4 Problems and Exercises
8 Gaussian Signals, Covariance Matrices, and Sample Path Properties
  8.1 Linear Transformations of Random Vectors
  8.2 Gaussian Random Vectors
  8.3 Gaussian Stationary Signals
  8.4 Sample Path Properties of General and Gaussian Stationary Signals
  8.5 Problems and Exercises
9 Spectral Representation of Discrete-Time
Stationary Signals and Their Computer Simulations
  9.1 Autocovariance as a Positive-Definite Sequence
  9.2 Cumulative Power Spectrum of Discrete-Time Stationary Signal
  9.3 Stochastic Integration with Respect to Signals with Uncorrelated Increments
  9.4 Spectral Representation of Stationary Signals
  9.5 Computer Algorithms: Complex-Valued Case
  9.6 Computer Algorithms: Real-Valued Case
  9.7 Problems and Exercises
Solutions to Selected Problems and Exercises
Bibliographical Comments
Index

Solutions to Selected Problems and Exercises (excerpt)

Chapter 6

(f) Finally, here

  S_X(f) = sin²(πf)/(2πf)²

and

  S_Y(f) = sin²(πf)/(2πf)² + (cos πf + πf sin πf)/(4π²f²).

Problem 6.4.5. Consider the circuit shown in Fig. 6.4.2. Assume that the input, X(t), is the standard white noise.

(a) Find the power spectra S_Y(f) and S_Z(f) of the outputs Y(t) and Z(t).
(b) Find the cross-covariance, σ_YZ(τ) = E[Z(t)Y(t + τ)], between those two outputs.

Solution. (a) Note that X(t) = Y(t) + Z(t). The impulse response function for the "Z" circuit is

  h_Z(t) = (1/(RC)) e^{−t/RC},  t ≥ 0,

and

  Z(t) = ∫₀^∞ h_Z(s) X(t − s) ds.

So the impulse response function for the "Y" circuit is

  h_Y(t) = δ(t) − (1/(RC)) e^{−t/RC},  t ≥ 0,

since

  Y(t) = X(t) − Z(t) = ∫₀^∞ [δ(s) − (1/(RC)) e^{−s/RC}] X(t − s) ds.

The Fourier transform of h_Y(t) will give us the transfer function:

  H_Y(f) = ∫₀^∞ [δ(t) − (1/(RC)) e^{−t/RC}] e^{−j2πft} dt = 1 − 1/(1 + j2πRCf) = j2πRCf/(1 + j2πRCf).

For the standard white noise input X(t), the power spectrum of the output is equal to the power transfer function of the system. Indeed,

  S_Y(f) = |H_Y(f)|² = 4π²R²C²f²/(1 + 4π²R²C²f²).

The calculation of S_Z(f) has been done before, as the "Z" circuit represents the standard RC filter.

(b)

  σ_YZ(τ) = E[Y(t) Z(t + τ)]
    = E[ ∫₀^∞ X(t − s) h_Y(s) ds · ∫₀^∞ X(t + τ − u) h_Z(u) du ]
    = ∫₀^∞ ∫₀^∞ E[X(t − s) X(t + τ − u)] h_Y(s) h_Z(u) ds du
    = ∫₀^∞ ∫₀^∞ δ(τ + s − u) h_Y(s) h_Z(u) ds du
    = ∫₀^∞ h_Y(s) h_Z(τ + s) ds
    = ∫₀^∞ [δ(s) − (1/(RC)) e^{−s/RC}] (1/(RC)) e^{−(τ+s)/RC} ds
    = (1/(RC)) e^{−τ/RC} − (1/(RC)²) e^{−τ/RC} ∫₀^∞ e^{−2s/RC} ds
    = (1/(RC)) e^{−τ/RC} − (1/(2RC)) e^{−τ/RC}
    = e^{−τ/RC}/(2RC),  τ ≥ 0.

Chapter 7

Problem 7.4.2. A signal of the form x(t) = 5e^{−(t+2)}u(t) is to be detected in the presence
of white noise with a flat power spectrum of 0.25 V²/Hz using a matched filter.

(a) For t₀ = 2, find the values of the impulse response of the matched filter at t = 0, 2, 4.
(b) Find the maximum output signal-to-noise ratio that can be achieved if t₀ = ∞.
(c) Find the detection time t₀ that should be used to achieve an output signal-to-noise ratio equal to 95% of the maximum signal-to-noise ratio found in part (b).
(d) The signal x(t) = 5e^{−(t+2)}u(t) is combined with white noise having a power spectrum of V²/Hz. Find the value of RC such that the signal-to-noise ratio at the output of the RC filter is maximal at t = 0.01 s.

Solution. (a) The impulse response function of the matched filter is of the form

  h(s) = x(t₀ − s) = 5 exp[−(t₀ − s + 2)] u(t₀ − s) = 5 e^{−(4−s)} u(2 − s),

where t₀ = 2 is the detection time and u(t) is the usual unit step function. Therefore,

  h(0) = 5e⁻⁴,  h(2) = 5e⁻²,  h(4) = 0.

(b) The maximum signal-to-noise ratio at detection time t₀ is

  (S/N)_max(t₀) = (1/N₀) ∫₀^∞ x²(t₀ − s) ds = (1/0.25) ∫₀^{t₀} 25 e^{−2(t₀−s+2)} ds = 50 e⁻⁴ (1 − e^{−2t₀}).

So

  (S/N)_max(t₀ = ∞) = 50 e⁻⁴.

(c) The sought detection time t₀ can thus be found by numerically solving the equation

  50 e⁻⁴ (1 − e^{−2t₀}) = 0.95 · 50 e⁻⁴,

which yields, approximately, t₀ = −(log 0.05)/2 ≈ 1.5.

Chapter 8

Problem 8.5.1. A zero-mean Gaussian random signal has the autocovariance function of the form

  γ_X(τ) = e^{−|τ|} cos 2πτ.

Plot it. Find the power spectrum S_X(f). Write the covariance matrix for the signal sampled at four time instants separated by 0.5 s. Find its inverse (numerically; use any of the familiar computing platforms, such as Mathematica, Matlab, etc.).
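Before the Mathematica computation in the solution below, the covariance-matrix part of this problem can be cross-checked independently. The following is a minimal NumPy sketch, not from the book (which works in Mathematica); it assumes the ACvF actually used in the printed solution, γ_X(τ) = e^{−|τ|} cos 2πτ, and the 0.5-s sampling grid t = 0, 0.5, 1, 1.5:

```python
import numpy as np

# Autocovariance function used in the printed solution of Problem 8.5.1
def acvf(tau):
    return np.exp(-abs(tau)) * np.cos(2 * np.pi * tau)

# Four sampling instants separated by 0.5 s
t = np.arange(4) * 0.5

# Covariance matrix of the sampled signal: gamma[i, j] = acvf(t[i] - t[j])
gamma = np.array([[acvf(ti - tj) for tj in t] for ti in t])

print(np.round(gamma, 6))               # Toeplitz matrix with entries e^{-|k|/2} cos(pi k)
print(round(np.linalg.det(gamma), 5))   # ~0.25258, matching Out[4] below
inv = np.linalg.inv(gamma)
print(np.round(inv, 6))                 # tridiagonal, matching Out[5] below
```

The inverse comes out tridiagonal because e^{−|τ|} is the covariance of a Markov (Ornstein-Uhlenbeck-type) process; the cosine factor only flips signs, which is why the off-diagonal entries of the inverse are positive here.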
Solution. We will use Mathematica to produce the plots and symbolic calculations, although it is fairly easy to calculate S_X(f) by direct integration. The plot of γ_X(τ) follows.

[Plot of γ_X(τ) for −4 ≤ τ ≤ 4: a cosine oscillation of unit period damped by the factor e^{−|τ|}, with values between −1 and 1.]

The power spectrum S_X(f) is the Fourier transform of the ACvF, so

In[1]:= GX[t_] := Exp[-Abs[t]]*Cos[2*Pi*t];
In[2]:= FourierTransform[GX[t], t, 2*Pi*f]
Out[2]= 1/(Sqrt[2 Pi] (1 + 4 (-1 + f)^2 Pi^2)) + 1/(Sqrt[2 Pi] (1 + 4 (1 + f)^2 Pi^2))

Note that the Fourier transform in Mathematica is defined as a function of the angular velocity variable ω = 2πf; hence the above substitution. The plot of the power spectrum is next.

[Plot of S_X(f) for −3 ≤ f ≤ 3: two symmetric peaks centered near f = ±1, of height about 0.4.]

Problem 8.5.3. Find the joint p.d.f. of the signal from Problem 8.5.1 at t₁ = 1, t₂ = 1.5, t₃ = 2, and t₄ = 2.5. Write the integral formula for

  P(−2 ≤ X(1) ≤ 2, −1 ≤ X(1.5) ≤ 4, −1 ≤ X(2) ≤ 1, 0 ≤ X(2.5) ≤ 3).

Evaluate the above probability numerically.

Solution. Again, we use Mathematica to carry out all the numerical calculations. First, we calculate the relevant covariance matrix:

In[3]:= CovGX = N[{{GX[0], GX[0.5], GX[1], GX[1.5]}, {GX[0.5], GX[0], GX[0.5], GX[1]}, {GX[1], GX[0.5], GX[0], GX[0.5]}, {GX[1.5], GX[1], GX[0.5], GX[0]}}] // MatrixForm
Out[3]=
   1.        -0.606531   0.367879  -0.22313
  -0.606531   1.        -0.606531   0.367879
   0.367879  -0.606531   1.        -0.606531
  -0.22313    0.367879  -0.606531   1.

Its determinant and its inverse are

In[4]:= Det[CovGX]
Out[4]= 0.25258

In[5]:= ICovGX = Inverse[CovGX] // MatrixForm
Out[5]=
  1.58198   0.959517  0         0
  0.959517  2.16395   0.959517  0
  0         0.959517  2.16395   0.959517
  0         0         0.959517  1.58198

(the off-tridiagonal entries returned by Mathematica, of order 10⁻¹⁶ to 10⁻¹⁷, are numerical zeros). Thus, the corresponding 4D Gaussian p.d.f. is

In[6]:= f[x1_, x2_, x3_, x4_] = (1/((2*Pi)^2*Sqrt[Det[CovGX]])) * Exp[-(1/2)*{x1, x2, x3, x4}.ICovGX.{x1, x2, x3, x4}]
Out[6]= 0.05 E^(-0.79 x1^2 - 0.96 x1 x2 - 1.08 x2^2 - 0.96 x2 x3 - 1.08 x3^2 - 0.96 x3 x4 - 0.79 x4^2)

Note the quadratic form in the four variables x1, x2, x3, x4 in the exponent. The calculation of the sought probability requires evaluation of the 4D integral

  P(−2 ≤ X(1) ≤ 2, −1 ≤ X(1.5) ≤ 4, −1 ≤ X(2) ≤ 1, 0 ≤ X(2.5) ≤ 3)
    = ∫₋₂² ∫₋₁⁴ ∫₋₁¹ ∫₀³ f(x₁, x₂, x₃, x₄) dx₄ dx₃ dx₂ dx₁,

which can be done only numerically:

In[7]:= NIntegrate[f[x1, x2, x3, x4], {x1, -2, 2}, {x2, -1, 4}, {x3, -1, 1}, {x4, 0, 3}]
Out[7]= 0.298126

Problem 8.5.4. Show that if a 2D Gaussian random vector Y⃗ = (Y₁, Y₂) has uncorrelated components Y₁, Y₂, then those components are statistically independent random quantities.

Solution. Recall the p.d.f. of a general zero-mean 2D Gaussian random vector (Y₁, Y₂) [see (8.2.9)]:

  f_Y⃗(y₁, y₂) = 1/(2π σ₁ σ₂ √(1 − ρ²)) exp[ −(1/(2(1 − ρ²))) ( y₁²/σ₁² − 2ρ y₁ y₂/(σ₁ σ₂) + y₂²/σ₂² ) ].

If the two components are uncorrelated, then ρ = 0, and the formula takes the following simplified shape:

  f_Y⃗(y₁, y₂) = 1/(2π σ₁ σ₂) exp[ −(1/2) ( y₁²/σ₁² + y₂²/σ₂² ) ];

it factors into the product of the marginal densities of the two components of the random vector Y⃗:

  f_Y⃗(y₁, y₂) = [ (1/(√(2π) σ₁)) exp(−y₁²/(2σ₁²)) ] · [ (1/(√(2π) σ₂)) exp(−y₂²/(2σ₂²)) ] = f_{Y₁}(y₁) f_{Y₂}(y₂),

which proves the statistical independence of Y₁ and Y₂.

Chapter 9

Problem 9.7.8. Verify that the additivity property (9.3.7) of any continuous function forces its linear form (9.3.8).

Solution. Our assumption is that the function C(v) satisfies the functional equation

  C(v + w) = C(v) + C(w)    (S.9.1)

for any real numbers v, w. We will also assume that it is continuous, although the proof is also possible (but harder) under the weaker assumption of measurability. Taking v = 0, w = 0 gives

  C(0) = C(0) + C(0) = 2C(0),

which implies that C(0) = 0. Furthermore, taking w = −v, we get

  C(0) = C(v) + C(−v) = 0,

so that C(v) is necessarily an odd function. Now, iterating (S.9.1) n times, we get that for any real number v,

  C(nv) = nC(v);

choosing v = 1/n, we see that C(1) = nC(1/n) for any positive integer n. Replacing n by m in the last equality and combining it with the preceding equality with v = 1/m, we get that for any positive integers
n, m,

  C(n/m) = (n/m) C(1).

Finally, since any real number can be approximated by rational numbers of the form n/m, and since C was assumed to be continuous, we get that for any real number v,

  C(v) = v C(1);

that is, C(v) is necessarily a linear function.

Bibliographical Comments

The classic modern treatise on the theory of Fourier series and integrals which influenced much of the harmonic analysis research in the second half of the twentieth century is

[1] A. Zygmund, Trigonometric Series, Cambridge University Press, Cambridge, UK, 1959.

More modest in scope, but perhaps also more usable for the intended reader of this text, are

[2] H. Dym and H. McKean, Fourier Series and Integrals, Academic Press, New York, 1972.
[3] T. W. Körner, Fourier Analysis, Cambridge University Press, Cambridge, UK, 1988.
[4] E. M. Stein and R. Shakarchi, Fourier Analysis: An Introduction, Princeton University Press, Princeton, NJ, 2003.
[5] P. P. G. Dyke, An Introduction to Laplace Transforms and Fourier Series, Springer-Verlag, New York, 1991.

The above four books are now available in paperback.

The Schwartz distributions (generalized functions), such as the Dirac delta impulse and its derivatives, with special emphasis on their applications in engineering and the physical sciences, are explained in

[6] F. Constantinescu, Distributions and Their Applications in Physics, Pergamon Press, Oxford, UK, 1980.
[7] T. Schucker, Distributions, Fourier Transforms and Some of Their Applications to Physics, World Scientific, Singapore, 1991.
[8] A. I. Saichev and W. A. Woyczyński, Distributions in the Physical and Engineering Sciences, Vol. 1: Distributional and Fractal Calculus, Integral Transforms and Wavelets, Birkhäuser Boston, Cambridge, MA, 1997.
[9] A. I. Saichev and W. A. Woyczyński, Distributions in the Physical and Engineering Sciences, Vol. 2: Linear, Nonlinear, Fractal and Random Dynamics in Continuous Media, Birkhäuser Boston, Cambridge, MA, 2005.

Good elementary introductions to probability theory, and accessible reads for the engineering and physical sciences audience, are

[10] J. Pitman, Probability, Springer-Verlag, New York, 1993.
[11] S. M. Ross, Introduction to Probability Models, Academic Press, Burlington, MA, 2003.

On the other hand,

[12] M. Denker and W. A. Woyczyński, Introductory Statistics and Random Phenomena: Uncertainty, Complexity, and Chaotic Behavior in Engineering and Science, Birkhäuser Boston, Cambridge, MA, 1998,

deals with a broader issue of how randomness appears in diverse models of natural phenomena and with the fundamental question of the meaning of randomness itself.

More ambitious, mathematically rigorous treatments of probability theory, based on measure theory, can be found in

[13] P. Billingsley, Probability and Measure, Wiley, New York, 1983.
[14] O. Kallenberg, Foundations of Modern Probability, Springer-Verlag, New York, 1997.
[15] M. Loève, Probability Theory, Van Nostrand, Princeton, NJ, 1961.

All three also contain a substantial account of the theory of stochastic processes. Readers more interested in the general issues of statistical inference and, in particular, parametric estimation, should consult

[16] G. Casella and R. L. Berger, Statistical Inference, Duxbury, Pacific Grove, CA, 2002,

or

[17] D. C. Montgomery and G. C. Runger, Applied Statistics and Probability for Engineers, Wiley, New York, 1994.

The classic texts on the general theory of stationary processes (signals) are

[18] H. Cramer and M. R. Leadbetter, Stationary and Related Stochastic Processes: Sample Function Properties and Their Applications, Dover Books, New York, 2004.
[19] A. M. Yaglom, Correlation Theory of Stationary and Related Random Functions, Vols. I and II, Springer-Verlag, New York, 1987.

However, the original,

[20] N. Wiener, Extrapolation, Interpolation, and Smoothing of Stationary Time Series, MIT Press and Wiley, New York, 1950,

still reads very well.

Statistical tools in the spectral analysis of stationary discrete-time random signals (also known as time series) are explored in

[21] P. Bloomfield, Fourier Analysis of Time Series: An Introduction, Wiley, New York, 1976.
[22] P. J. Brockwell and R. A. Davis, Time Series: Theory and Methods, Springer-Verlag, New York, 1991.

and difficult issues in the analysis of nonlinear and nonstationary random signals are tackled in

[23] M. B. Priestley, Non-linear and Non-stationary Time Series Analysis, Academic Press, London, 1988.
[24] W. J. Fitzgerald, R. L. Smith, A. T. Walden, and P. C. Young, eds., Nonlinear and Nonstationary Signal Processing, Cambridge University Press, Cambridge, UK, 2000.

The latter is a collection of articles, by different authors, on the current research issues in the area.

A more engineering approach to random signal analysis can be found in a large number of sources, including

[25] A. Papoulis, Signal Analysis, McGraw-Hill, New York, 1977.
[26] R. G. Brown and P. Y. Hwang, Introduction to Random Signal Analysis and Kalman Filtering, Wiley, New York, 1992.

A general discussion of transmission of signals through linear systems can be found in

[27] M. J. Roberts, Signals and Systems: Analysis of Signals Through Linear Systems, McGraw-Hill, New York, 2003.
[28] B. D. O. Anderson and J. B. Moore, Optimal Filtering, Dover Books, New York, 2005.

Gaussian stochastic processes are thoroughly investigated in

[29] I. A. Ibragimov and Y. A. Rozanov, Gaussian Random Processes, Springer-Verlag, New York, 1978.
[30] M. A. Lifshits, Gaussian Random Functions, Kluwer Academic Publishers, Dordrecht, the Netherlands, 1995.

and for a review of the modern mathematical theory of not necessarily second-order and not necessarily Gaussian stochastic integrals, we refer to

[31] S. Kwapień and W. A. Woyczyński, Random Series and Stochastic Integrals: Single and Multiple, Birkhäuser Boston, Cambridge, MA, 1992.

Index

A: additive noise; additivity property of probabilities, 53; adhesion model; analog-to-digital conversion; Anderson, B. D. O., 255; angular velocity, 21
approximation of periodic signals, 34ff (at each time separately, 31; at jump points, 32; by Cesàro averages, 32; Gibbs phenomenon, 33; in power, 31; mean-square error, 31; uniform, 31); ARMA system, 157; Arzelà–Ascoli theorem, 198; autocorrelation function (ACF), 108ff (as a positive-definite function, 195); autocovariance function, 106 (normalized, 107)

B: band-limited noise, 133; bandwidth (equivalent noise, 142, 152; half-power, 142, 152; of finite-time integrating circuit, 154); Bayes' formula, 80; Berger, R. L., 254; Berry–Esseen theorem, 95; Billingsley, P., 199, 254; binomial formula, 54; Bloomfield, P., 254; Branicky, Mike, x; Brockwell, P. J., 121, 254; Brown, R. G., 254; Brownian motion; Burgers' equation, turbulence

C: Casella, G., 254; Cauchy criterion of convergence, 185; Cauchy–Schwartz inequality, 82, 102; causal system, 145; central limit theorem, 60, 91, 175 (error of approximation in, 92; sketch of proof of, 103); Cesàro average, 32; chaotic behavior; Chebyshev's inequality, 188, 191; circuit (integrating, 145; RC, 149); complex exponentials, 14, 18 (orthogonality of, 18); complex numbers, 15; computational complexity of the fast Fourier transform, 45; computer algorithms, 211ff; conditional probability, 79 (reverse, 80); confidence intervals, 96ff, 122 (for means, 96ff; for variance, 98); Constantinescu, F., 253; control function, cumulative, 201; convergence in mean-square, 184 (Cauchy criterion for, 185); Cooley, J. W., 45; correlation coefficient, 82; covariance, 82; covariance matrix, 180, 182; Cramer, H., 254; cross-correlation, 171; cross-covariance, 159; Crutchfield, James P.; cumulative control function, 200; cumulative distribution function (c.d.f.), 52; cumulative power spectrum, 196; Czekajewski, Jan, x

D: Davis, R. A., 121, 254; de Moivre's formula, 14, 15; Denker, M., 3, 56, 63, 95, 104, 175, 197, 253; "devil's staircase," 62; diffusion equation, 49; Dirac delta impulse, 41ff (calculus of, 43; "probing" property of, 44; Fourier transform of, 44); discrete Fourier transform, 45; discrete-time sampling, 155; distributions in the sense of Schwartz, 43; Dyke, P. P. G., 253; Dym, H., 253

E: Edwards, Robert, x; EEG signals, 107, 108, 122, 132; ergodic behavior, 7, 122; ergodic signal, 119; estimation (of parameters, 92ff; of power spectrum, 128; of the autocorrelation, 122ff; of the mean, 119ff, consistency of, 120); expected value (expectation) of r.q., 71 (linear scaling of, 73)

F: fast Fourier transform (FFT), 44ff (computational complexity of, 45); filter (causal, 172; matched, 167ff; RC, 149, 150, 162; Wiener, 170ff); filtering noise out of signal, 117; Fitzgerald, W. J., 254; "floor" function; Folland, G. B., 198, 203; Fourier analysis, 23, 253; Fourier coefficient, 22; Fourier expansion, 23, 25 (pure cosine, 26; pure sine, 27); Fourier, Jean-Baptiste, 49; Fourier series, 21ff (complex, 21ff); Fourier transform (FT), 35ff (basic properties, 37ff; discrete, 44ff; fast (FFT), 44ff, computational complexity of, 45; inverse (IFT), 36; linearity of, 37; of convolution, 39; of nonintegrable signals, 41; table of, 43; table of properties of, 40); fractional dimension, 61; frequency spectrum, 13; frequency-domain description, 12; fundamental frequency, 13, 18; fundamental theorem of calculus, 57

G: gamma function, 95; Gauss, Carl Friedrich, 45; Gibbs' phenomenon, 33ff

H: harmonic analysis, 21ff; heat equation, 49; Herglotz's theorem, 196; histogram, 51; Hwang, P. Y., 254

I: Ibragimov, I. A., 255; impulse response function, 144 (causal, 144; realizable, 144); integrating circuit, 145; inverse Fourier transform, 36

K: Kallenberg, O., 114, 254; kinetic energy, 65; Kolmogorov's theorem (on infinite sequences of r.q.s, 199; on sample path continuity, 170); Körner, T. W., 31, 253; Kronecker delta, 22; Kwapień, S., 255

L: Landau's asymptotic notation, 120; Laplace transform, 172; law of large numbers (LLN), 89; Leadbetter, M. R., 254; least-squares fit, 89ff; Lévy process; Lifshits, M. A., 255; Loève, M., 187, 254; Loparo, Ken, x

M: marginal probability distribution, 78; matching filter, 168; McKean, H., 253; mean power, 127; moments of r.q.s, 71; Montgomery, D. C., 254; Moore, J. B., 255; moving average (autoregressive (ARMA), 157; general, 124; interpolated, 139; of white noise, 111; to filter noise, 117)

N: nabla operator; noise (additive; white, 110, 134); normal equations, 87; normalization condition, 55

O: optimal filter, 169ff; orthonormal basis, 22 (in 3D space, 24; of complex exponentials, 22); orthonormality, 18 (of complex exponentials, 18)

P: Papoulis, A., 172, 254; parameter estimation, 96ff; Parseval's formula, 24, 25, 49 (extended, 24, 25, 49); passive tracer; period of the signal (infinite, 36); periodogram, 131; Petrov, V. V., 95; Piryatinska, A., x, 108, 122, 132; Pitman, J., 253; Poisson distribution, 55; polarization identity, 49; power spectral density, 135, 208; power spectrum, 134ff (cumulative, 196; of interpolated digital signal, 137); power transfer function, 152; Priestley, M. B., 254; probability density function (p.d.f.), 57ff (joint, of random vector, 75; normalization condition for, 59); probability distribution, 58ff (absolutely continuous, 56; Bernoulli, 54; binomial, 54; chi-square, 66, 98, 100, table of, 102; conditional, 75; continuous, 56; cumulative, 52; exponential, 58; Gaussian (normal), 59, 91, calculations with, 60, mean and variance of, 74, table of, 100; joint, 75; marginal, 78; mixed, 61; n-point, 175; of function of r.q., 63; of kinetic energy, 65; of square of Gaussian r.q., 66; Poisson, 55; quantiles of, 96; singular, 61; Student's t, 95, 100, table of, 102; uniform, 57); probability theory, 51, 237 (measure-theoretic, 254; paradoxes in, 51); Pythagorean theorem, 24, 25

Q: quantiles of probability distribution, 96 (table of chi-square, 100)

R: random errors, 80; random harmonic oscillations, 109, 133 (superposition of, 109, 133); random interval, 94; random numbers, 19; random phase, 108; random quantities (r.q.), 54ff (absolute moments of, 69; continuous, 62ff; correlation of, 81; discrete, 61ff; expectation of, 71ff; function of, 64ff; linear transformation of Gaussian, 64; moments of, 71; singular, 63; standard deviation of, 73; standardized, 74; statistical independence of, 73ff; variance of, 64); random switching signal, 100; random variable, 48; random vectors, 67ff (covariance matrix of, 164; Gaussian, 68, 162ff; 2-D, 68, 164; joint probability distribution of, 67; linear transformation of, 160; moments of, 73); random walk; randomness of signals; RC filter, 149ff, 162; rectangular waveform, 26ff; regression line, 89ff; REM sleep, 108; resolution; reverse conditional probability, 80; Roberts, M. I., 255; Ross, S. M., 253; Rozanov, Y. A., 255; Rudin, W., 186; Runger, G. C., 254

S: Saichev, A. I., 43, 253; sample paths (continuity with probability 1 of, 187ff; differentiability of, 184ff; mean-square continuity of, 184ff; mean-square differentiability of, 184ff); sampling period; scalar product, 22; scatterplot, 86; Schucker, T., 253; Schwartz distributions, 43–44, 253; Shakarchi, R., 253; signals (analog; aperiodic, 1, 18, 35ff; characteristics of, 175; delta-correlated, 135; description of, 1ff; deterministic, spectral representation of, 21ff; digital, 1, 140, discrete sampling of, 157, interpolated, 137ff; Dirac delta impulse, 41ff; energy of; filtering of, 171; Gaussian, 175ff, stationary, 171; jointly stationary, 161; mean power of, 127ff; nonintegrable, 36; periodic, 8, 23, power of, 31; random, 4, 105ff, 200ff; stationary, 106, 196ff, discrete, 196ff, Gaussian, 181ff, power spectra of, 127ff, strictly, 105, second-order, weakly, 106, simulation of, 124, 210ff, spectral representation of, 204ff; stochastic, 2, 199; switching, 113, 135; time average of, 19; transmission of binary, 80; types of, 1ff); signal-to-noise ratio, 163ff (in matching filter, 168ff; in RC filter, 149; optimization of, 161ff); simulation (of stationary signals, 124, 193ff; of white noise, 112); Smith, R. L., 254; spectral representation theorem, 195; stability of fluctuations law, 89; standard deviation of r.q., 73; stationary conditions; statistical independence, 75; statistical inference, 254; Stein, E. M., 253; stochastic difference equation, 116, 158 (consistency of mean estimation in, 119); stochastic integration, 199; Gaussian stochastic integration, 187;
isometric property of stochastic integration, 203; stochastic integration for signals with uncorrelated increments, 199ff; stochastic processes, 105, 254ff (stationary, 254; Gaussian, 254; Lévy; Wiener, 4, 202); system bandwidth, 151

T: time-domain description; time series, 105, 254 (nonstationary, 254; nonlinear, 254); total probability formula, 79; trajectory; transfer function, 151; transmission of binary signals, 80 (in presence of random errors, 80; through linear systems, 143ff); trigonometric formulas, 14; trigonometric series, 216; Tukey, O. W., 45

U: uncertainty principle; Usoltsev, Alexey, x

V: variance of r.q., 72 (invariance under translation of, 73; quadratic scaling of, 73)

W: Walden, A. T., 254; waveform, rectangular, 26; white noise, 109, 193 (band-limited, 133; continuous-time, 135, 193; discrete-time, 109; filtered, 207; integrals, 195; interpolated, 137; moving average of, 111; simulation of, 124); Wiener filter, 170ff (acausal, 170; causal, 172); Wiener, N., 172, 202; Wiener process, 5, 204, 207; Wiener–Hopf equation, 172; Woyczyński, W. A., 8, 43, 63, 95, 104, 122, 163, 197, 253, 255

Y: Yaglom, A. M., 254; Young, P. C., 254

Z: Zygmund, A., 23, 253

[...] readable book of less than 200 pages, countering the recent trend toward fatter and fatter textbooks. Since Fourier series and transforms are of fundamental importance in random signal analysis and processing, this material is developed from scratch in Chap. 2, emphasizing the time-domain vs. frequency-domain duality. Our experience showed that although harmonic analysis is normally included in the calculus [...] with a signal processing course that is entirely devoted to practical applications and software implementation. This one–two-punch approach has been working well, and the engineers seem to appreciate the fact that all probability/statistics/Fourier analysis foundations are developed within the book; adding extra mathematical courses to a tight undergraduate engineering curriculum is almost impossible. A gaggle of graduate students in applied mathematics, statistics, and assorted engineering areas also regularly enrolls. They are often asked to make in-class presentations of special topics included in the book but not required of the general undergraduate audience. Finally, by popular demand, there is now a large appendix which contains solutions of selected problems from each of the nine chapters.

[...] primary interest.

The time average of the signal: For analog, continuous-time signals, the time average is defined by the formula

  AV_x = lim_{T→∞} (1/T) ∫₀^T x(t) dt.    (1.2.1)

Fig. 1.1.7. Some deterministic signals (in this case, the images) transformed by deterministic systems can appear random. Above is a series of iterated transformations of the original image via a fixed linear [...]

[...] published in September 2009 in the Journal of the American Statistical Association by Charles Boncelet was particularly thorough and insightful.

Cleveland, May 2010
Wojbor A. Woyczyński
http://stat.case.edu/~Wojbor

Introduction

This book was designed as a text for a first, one-semester course in statistical signal analysis for students in engineering and the physical sciences. It had been developed over the last few years as lecture notes used by the author in classes mainly populated by electrical, systems, computer, and biomedical engineering juniors/seniors, and graduate students in sciences and engineering who have not been previously exposed to this material. It was also used for industrial audiences as educational and training materials, and for an introductory time-series analysis class. The only [...]

1 Description of Signals

Signals are everywhere. Literally. The universe is bathed in the background radiation, the remnant of the original Big Bang, and as your eyes scan this page, a signal is being transmitted to your brain, where different sets of neurons analyze and process it. All human activities are based on the processing and analysis of sensory signals, but the goal of this book is somewhat narrower. The signals [...] To help the reader visualize the great variety of random signals appearing in the physical sciences and engineering, it is worthwhile reviewing a gallery of pictures of random signals, both experimental and simulated, presented in Figs. 1.1.4–1.1.8. The captions explain the context in each case. The signals shown in Figs. 1.1.4 and 1.1.5 are, obviously, not stationary and have a diffusive character. However, their increments (differentials) are stationary and, in Chap. 9, they will play an important role in the construction of the spectral representation of stationary signals themselves. The signal shown in Fig. 1.1.4 can be interpreted as a trajectory, or sample path, of a random walker moving, in discrete-time steps, up or down a certain distance with equal probabilities 1/2 and 1/2. However, in the picture these trajectories are [...]
