- Format
- Paperback / softback
- Language
- German
- Number of pages
- 397
- Publication date
- 2012-05-31
- Edition
- Softcover reprint of the original 1st ed. 1996
- Publisher
- Springer-Verlag
- Original language
- German
- Illustrations
- 42 illustrations, black and white; XIII, 397 pp., 42 figs.
- Dimensions
- 244 x 170 x 22 mm
- Weight
- 658 g
- Number of components
- 1
- Components
- 1 Paperback / softback
- ISBN
- 9783322927743
More books by Saeed V Vaseghi
- Multimedia Signal Processing
- Saeed V Vaseghi
Multimedia Signal Processing is a comprehensive and accessible text on the theory and applications of digital signal processing (DSP). The applications of DSP are pervasive and include multimedia systems, cellular communication, adaptive network m...
Table of Contents
1 Introduction
  1.1 Signals and Information
  1.2 Signal Processing Methods
    1.2.1 Non-parametric Signal Processing
    1.2.2 Model-based Signal Processing
    1.2.3 Bayesian Statistical Signal Processing
    1.2.4 Neural Networks
  1.3 Applications of Digital Signal Processing
    1.3.1 Adaptive Noise Cancellation and Noise Reduction
    1.3.2 Blind Channel Equalisation
    1.3.3 Signal Classification and Pattern Recognition
    1.3.4 Linear Prediction Modelling of Speech
    1.3.5 Digital Coding of Audio Signals
    1.3.6 Detection of Signals in Noise
    1.3.7 Directional Reception of Waves: Beamforming
  1.4 Sampling and Analog to Digital Conversion
    1.4.1 Time-Domain Sampling and Reconstruction of Analog Signals
    1.4.2 Quantisation
2 Stochastic Processes
  2.1 Random Signals and Stochastic Processes
    2.1.1 Stochastic Processes
    2.1.2 The Space or Ensemble of a Random Process
  2.2 Probabilistic Models of a Random Process
  2.3 Stationary and Nonstationary Random Processes
    2.3.1 Strict Sense Stationary Processes
    2.3.2 Wide Sense Stationary Processes
    2.3.3 Nonstationary Processes
  2.4 Expected Values of a Stochastic Process
    2.4.1 The Mean Value
    2.4.2 Autocorrelation
    2.4.3 Autocovariance
    2.4.4 Power Spectral Density
    2.4.5 Joint Statistical Averages of Two Random Processes
    2.4.6 Cross Correlation and Cross Covariance
    2.4.7 Cross Power Spectral Density and Coherence
    2.4.8 Ergodic Processes and Time-averaged Statistics
    2.4.9 Mean-ergodic Processes
    2.4.10 Correlation-ergodic Processes
  2.5 Some Useful Classes of Random Processes
    2.5.1 Gaussian (Normal) Process
    2.5.2 Multi-variate Gaussian Process
    2.5.3 Mixture Gaussian Process
    2.5.4 A Binary-state Gaussian Process
    2.5.5 Poisson Process
    2.5.6 Shot Noise
    2.5.7 Poisson-Gaussian Model for Clutters and Impulsive Noise
    2.5.8 Markov Processes
  2.6 Transformation of a Random Process
    2.6.1 Monotonic Transformation of Random Signals
    2.6.2 Many-to-one Mapping of Random Signals
  Summary
3 Bayesian Estimation and Classification
  3.1 Estimation Theory: Basic Definitions
    3.1.1 Predictive and Statistical Models in Estimation
    3.1.2 Parameter Space
    3.1.3 Parameter Estimation and Signal Restoration
    3.1.4 Performance Measures
    3.1.5 Prior and Posterior Spaces and Distributions
  3.2 Bayesian Estimation
    3.2.1 Maximum a Posteriori Estimation
    3.2.2 Maximum Likelihood Estimation
    3.2.3 Minimum Mean Squared Error Estimation
    3.2.4 Minimum Mean Absolute Value of Error Estimation
    3.2.5 Equivalence of MAP, ML, MMSE and MAVE
    3.2.6 Influence of the Prior on Estimation Bias and Variance
    3.2.7 The Relative Importance of the Prior and the Observation
  3.3 Estimate-Maximise (EM) Method
    3.3.1 Convergence of the EM Algorithm
  3.4 Cramer-Rao Bound on the Minimum Estimator Variance
    3.4.1 Cramer-Rao Bound for Random Parameters
    3.4.2 Cramer-Rao Bound for a Vector Parameter
  3.5 Bayesian Classification
    3.5.1 Classification of Discrete-valued Parameters
    3.5.2 Maximum a Posteriori Classification
    3.5.3 Maximum Likelihood Classification
    3.5.4 Minimum Mean Squared Error Classification
    3.5.5 Bayesian Classification of Finite State Processes
    3.5.6 Bayesian Estimation of the Most Likely State Sequence
  3.6 Modelling the Space of a Random Signal
    3.6.1 Vector Quantisation of a Random Process
    3.6.2 Design of a Vector Quantiser: K-Means Algorithm
    3.6.3 Design of a Mixture Gaussian Model
    3.6.4 The EM Algorithm for Estimation of Mixture Gaussian Densities
  Summary
4 Hidden Markov Models
  4.1 Statistical Models for Nonstationary Processes
  4.2 Hidden Markov Models
    4.2.1 A Physical Interpretation of Hidden Markov Models
    4.2.2 Hidden Markov Model as a Bayesian Method
    4.2.3 Parameters of a Hidden Markov Model
    4.2.4 State Observation Models
    4.2.5 State Transition Probabilities
    4.2.6 State-Time Trellis Diagram
  4.3 Training Hidden Markov Models
    4.3.1 Forward-Backward Probability Computation
    4.3.2 Baum-Welch Model Re-Estimation