000 11341nam a2200325 i 4500
008 110628s2012 flu b 001 0 eng
010 _a2011025090
020 _a9781439818374
040 _aDLC
_cDLC
049 _aBAUN_MERKEZ
050 0 4 _aQA280
_b.W66 2012
082 0 0 _223
100 1 _aWoodward, Wayne A.
245 1 0 _aApplied time series analysis /
_cWayne A. Woodward, Henry L. Gray, and Alan C. Elliott.
264 1 _aBoca Raton :
_bChapman and Hall/CRC,
_c2012.
300 _axxiii, 540 pages :
_billustrations,
_c25 cm
336 _atext
_btxt
_2rdacontent
337 _aunmediated
_bn
_2rdamedia
338 _avolume
_bnc
_2rdacarrier
490 0 _aStatistics: textbooks and monographs
504 _aIncludes bibliographical references and index
505 0 0 _tContents
_t Preface
_t Acknowledgments
_t1. Stationary Time Series
_t1.1. Time Series
_t1.2. Stationary Time Series
_t1.3. Autocovariance and Autocorrelation Functions for Stationary Time Series
_t1.4. Estimation of the Mean, Autocovariance, and Autocorrelation for Stationary Time Series
_t1.4.1. Estimation of μ
_t1.4.1.1. Ergodicity of X̄
_t1.4.1.2. Variance of X̄
_t1.4.2. Estimation of γk
_t1.4.3. Estimation of ρk
_t1.5. Power Spectrum
_t1.6. Estimating the Power Spectrum and Spectral Density for Discrete Time Series
_t1.7. Time Series Examples
_t1.7.1. Simulated Data
_t1.7.2. Real Data
_t1.A. Appendix
_t Exercises
_t2. Linear Filters
_t2.1. Introduction to Linear Filters
_t2.1.1. Relationship between the Spectra of the Input and Output of a Linear Filter
_t2.2. Stationary General Linear Processes
_t2.2.1. Spectrum and Spectral Density for a General Linear Process
_t2.3. Wold Decomposition Theorem
_t2.4. Filtering Applications
_t2.4.1. Butterworth Filters
_t2.A. Appendix
_t Exercises
_t3. ARMA Time Series Models
_t3.1. Moving Average Processes
_t3.1.1. MA(1) Model
_t3.1.2. MA(2) Model
_t3.2. Autoregressive Processes
_t3.2.1. Inverting the Operator
_t3.2.2. AR(1) Model
_t3.2.3. AR(p) Model for p ≥ 1
_t3.2.4. Autocorrelations of an AR(p) Model
_t3.2.5. Linear Difference Equations
_t3.2.6. Spectral Density of an AR(p) Model
_t3.2.7. AR(2) Model
_t3.2.7.1. Autocorrelations of an AR(2) Model
_t3.2.7.2. Spectral Density of an AR(2)
_t3.2.7.3. Stationary/Causal Region of an AR(2)
_t3.2.7.4. ψ-Weights of an AR(2) Model
_t3.2.8. Summary of AR(1) and AR(2) Behavior
_t3.2.9. AR(p) Model
_t3.2.10. AR(1) and AR(2) Building Blocks of an AR(p) Model
_t3.2.11. Factor Tables
_t3.2.12. Invertibility/Infinite-Order Autoregressive Processes
_t3.2.13. Two Reasons for Imposing Invertibility
_t3.3. Autoregressive-Moving Average Processes
_t3.3.1. Stationarity and Invertibility Conditions for an ARMA(p,q) Model
_t3.3.2. Spectral Density of an ARMA(p,q) Model
_t3.3.3. Factor Tables and ARMA(p,q) Models
_t3.3.4. Autocorrelations of an ARMA(p,q) Model
_t3.3.5. ψ-Weights of an ARMA(p,q)
_t3.3.6. Approximating ARMA(p,q) Processes Using High-Order AR(p) Models
_t3.4. Visualizing Autoregressive Components
_t3.5. Seasonal ARMA(p,q) × (P,Q)s Models
_t3.6. Generating Realizations from ARMA(p,q) Processes
_t3.6.1. MA(q) Model
_t3.6.2. AR(2) Model
_t3.6.3. General Procedure
_t3.7. Transformations
_t3.7.1. Memoryless Transformations
_t3.7.2. Autoregressive Transformations
_t3.A. Appendix: Proofs of Theorems
_t Exercises
_t4. Other Stationary Time Series Models
_t4.1. Stationary Harmonic Models
_t4.1.1. Pure Harmonic Models
_t4.1.2. Harmonic Signal-plus-Noise Models
_t4.1.3. ARMA Approximation to the Harmonic Signal-plus-Noise Model
_t4.2. ARCH and GARCH Processes
_t4.2.1. ARCH Processes
_t4.2.1.1. The ARCH(1) Model
_t4.2.1.2. The ARCH(q0) Model
_t4.2.2. The GARCH(p0,q0) Process
_t4.2.3. AR Processes with ARCH or GARCH Noise
_t Exercises
_t5. Nonstationary Time Series Models
_t5.1. Deterministic Signal-plus-Noise Models
_t5.1.1. Trend-Component Models
_t5.1.2. Harmonic Component Models
_t5.2. ARIMA(p,d,q) and ARUMA(p,d,q) Processes
_t5.2.1. Extended Autocorrelations of an ARUMA(p,d,q) Process
_t5.2.2. Cyclical Models
_t5.3. Multiplicative Seasonal ARUMA(p,d,q) × (P,D,Q)s Process
_t5.3.1. Factor Tables for Seasonal Models of the Form (5.17) with s = 4 and s = 12
_t5.4. Random Walk Models
_t5.4.1. Random Walk
_t5.4.2. Random Walk with Drift
_t5.5. G-Stationary Models for Data with Time-Varying Frequencies
_t Exercises
_t6. Forecasting
_t6.1. Mean Square Prediction Background
_t6.2. Box-Jenkins Forecasting for ARMA(p,q) Models
_t6.3. Properties of the Best Forecast Ẑt0(ℓ)
_t6.4. π-Weight Form of the Forecast Function
_t6.5. Forecasting Based on the Difference Equation
_t6.6. Eventual Forecast Function
_t6.7. Probability Limits for Forecasts
_t6.8. Forecasts Using ARUMA(p,d,q) Models
_t6.9. Forecasts Using Multiplicative Seasonal ARUMA Models
_t6.10. Forecasts Based on Signal-plus-Noise Models
_t6.A. Appendix
_t Exercises
_t7. Parameter Estimation
_t7.1. Introduction
_t7.2. Preliminary Estimates
_t7.2.1. Preliminary Estimates for AR(p) Models
_t7.2.1.1. Yule-Walker Estimates
_t7.2.1.2. Least Squares Estimation
_t7.2.1.3. Burg Estimates
_t7.2.2. Preliminary Estimates for MA(q) Models
_t7.2.2.1. Method-of-Moment Estimation for an MA(q)
_t7.2.2.2. MA(q) Estimation Using the Innovations Algorithm
_t7.2.3. Preliminary Estimates for ARMA(p,q) Models
_t7.2.3.1. Extended Yule-Walker Estimates of the Autoregressive Parameters
_t7.2.3.2. Tsay-Tiao (TT) Estimates of the Autoregressive Parameters
_t7.2.3.3. Estimating the Moving Average Parameters
_t7.3. Maximum Likelihood Estimation of ARMA(p,q) Parameters
_t7.3.1. Conditional and Unconditional Maximum Likelihood Estimation
_t7.3.2. ML Estimation Using the Innovations Algorithm
_t7.4. Backcasting and Estimating σa2
_t7.5. Asymptotic Properties of Estimators
_t7.5.1. Autoregressive Case
_t7.5.1.1. Confidence Intervals: Autoregressive Case
_t7.5.2. ARMA(p,q) Case
_t7.5.2.1. Confidence Intervals for ARMA(p,q) Parameters
_t7.5.3. Asymptotic Comparisons of Estimators for an MA(1)
_t7.6. Estimation Examples Using Data
_t7.7. ARMA Spectral Estimation
_t7.8. ARUMA Spectral Estimation
_t Exercises
_t8. Model Identification
_t8.1. Preliminary Check for White Noise
_t8.2. Model Identification for Stationary ARMA Models
_t8.2.1. Model Identification Based on AIC and Related Measures
_t8.3. Model Identification for Nonstationary ARUMA(p,d,q) Models
_t8.3.1. Including a Nonstationary Factor in the Model
_t8.3.2. Identifying Nonstationary Component(s) in a Model
_t8.3.3. Decision between a Stationary or a Nonstationary Model
_t8.3.4. Deriving a Final ARUMA Model
_t8.3.5. More on the Identification of Nonstationary Components
_t8.3.5.1. Including a Factor (1 - B)d in the Model
_t8.3.5.2. Testing for a Unit Root
_t8.3.5.3. Including a Seasonal Factor (1 - Bs) in the Model
_t8.A. Appendix: Model Identification Based on Pattern Recognition
_t Exercises
_t9. Model Building
_t9.1. Residual Analysis
_t9.1.1. Check Sample Autocorrelations of Residuals versus 95% Limit Lines
_t9.1.2. Ljung-Box Test
_t9.1.3. Other Tests for Randomness
_t9.1.4. Testing Residuals for Normality
_t9.2. Stationarity versus Nonstationarity
_t9.3. Signal-plus-Noise versus Purely Autocorrelation-Driven Models
_t9.3.1. Cochrane-Orcutt, ML, and Frequency Domain Methods
_t9.3.2. A Bootstrapping Approach
_t9.3.3. Other Methods for Trend Testing
_t9.4. Checking Realization Characteristics
_t9.5. Comprehensive Analysis of Time Series Data: A Summary
_t Exercises
_t10. Vector-Valued (Multivariate) Time Series
_t10.1. Multivariate Time Series Basics
_t10.2. Stationary Multivariate Time Series
_t10.2.1. Estimating the Mean and Covariance for Stationary Multivariate Processes
_t10.2.1.1. Estimating μ
_t10.2.1.2. Estimating Γ(k)
_t10.3. Multivariate (Vector) ARMA Processes
_t10.3.1. Forecasting Using VAR(p) Models
_t10.3.2. Spectrum of a VAR(p) Model
_t10.3.3. Estimating the Coefficients of a VAR(p) Model
_t10.3.3.1. Yule-Walker Estimation
_t10.3.3.2. Least Squares and Conditional Maximum Likelihood Estimation
_t10.3.3.3. Burg-Type Estimation
_t10.3.4. Calculating the Residuals and Estimating Γa
_t10.3.5. VAR(p) Spectral Density Estimation
_t10.3.6. Fitting a VAR(p) Model to Data
_t10.3.6.1. Model Selection
_t10.3.6.2. Estimating the Parameters
_t10.3.6.3. Testing the Residuals for White Noise
_t10.4. Nonstationary VARMA Processes
_t10.5. Testing for Association between Time Series
_t10.5.1. Testing for Independence of Two Stationary Time Series
_t10.5.2. Testing for Cointegration between Nonstationary Time Series
_t10.6. State-Space Models
_t10.6.1. State Equation
_t10.6.2. Observation Equation
_t10.6.3. Goals of State-Space Modeling
_t10.6.4. Kalman Filter
_t10.6.4.1. Prediction (Forecasting)
_t10.6.4.2. Filtering
_t10.6.4.3. Smoothing Using the Kalman Filter
_t10.6.4.4. H-Step Ahead Predictions
_t10.6.5. Kalman Filter and Missing Data
_t10.6.6. Parameter Estimation
_t10.6.7. Using State-Space Methods to Find Additive Components of a Univariate Autoregressive Realization
_t10.6.7.1. Revised State-Space Model
_t10.6.7.2. ψ Real
_t10.6.7.3. ψ Complex
_t10.A. Appendix: Derivation of State-Space Results
_t Exercises
_t11. Long-Memory Processes
_t11.1. Long Memory
_t11.2. Fractional Difference and FARMA Processes
_t11.3. Gegenbauer and GARMA Processes
_t11.3.1. Gegenbauer Polynomials
_t11.3.2. Gegenbauer Process
_t11.3.3. GARMA Process
_t11.4. k-Factor Gegenbauer and GARMA Processes
_t11.4.1. Calculating Autocovariances
_t11.4.2. Generating Realizations
_t11.5. Parameter Estimation and Model Identification
_t11.6. Forecasting Based on the k-Factor GARMA Model
_t11.7. Modeling Atmospheric CO2 Data Using Long-Memory Models
_t Exercises
_t12. Wavelets
_t12.1. Shortcomings of Traditional Spectral Analysis for TVF Data
_t12.2. Window-Based Methods That Localize the "Spectrum" in Time
_t12.2.1. Gabor Spectrogram
_t12.2.2. Wigner-Ville Spectrum
_t12.3. Wavelet Analysis
_t12.3.1. Fourier Series Background
_t12.3.2. Wavelet Analysis Introduction
_t12.3.3. Fundamental Wavelet Approximation Result
_t12.3.4. Discrete Wavelet Transform for Data Sets of Finite Length
_t12.3.5. Pyramid Algorithm
_t12.3.6. Multiresolution Analysis
_t12.3.7. Wavelet Shrinkage
_t12.3.8. Scalogram: Time-Scale Plot
_t12.3.9. Wavelet Packets
_t12.3.10. Two-Dimensional Wavelets
_t12.4. Concluding Remarks on Wavelets
_t12.A. Appendix: Mathematical Preliminaries for This Chapter
_t Exercises
_t13. G-Stationary Processes
_t13.1. Generalized-Stationary Processes
_t13.1.1. General Strategy for Analyzing G-Stationary Processes
_t13.2. M-Stationary Processes
_t13.2.1. Continuous M-Stationary Process
_t13.2.2. Discrete M-Stationary Process
_t13.2.3. Discrete Euler(p) Model
_t13.2.4. Time Transformation and Sampling
_t13.3. G(λ)-Stationary Processes
_t13.3.1. Continuous G(p;λ) Model
_t13.3.2. Sampling the Continuous G(λ)-Stationary Processes
_t13.3.2.1. Equally Spaced Sampling from G(p;λ) Processes
_t13.3.3. Analyzing TVF Data Using the G(p;λ) Model
_t13.3.3.1. G(p;λ) Spectral Density
_t13.4. Linear Chirp Processes
_t13.4.1. Models for Generalized Linear Chirps
_t13.5. Concluding Remarks
_t13.A. Appendix
_t Exercises
_t References
_t Index
650 0 _aTime-series analysis
700 1 _aGray, Henry L.
700 1 _aElliott, Alan C.,
_d1952-
900 _a34755
_bpurchase
942 _2lcc
_cKT
999 _c31966
_d31966