Statistical Pattern Recognition; Andrew R. Webb; 2002
Used

Statistical Pattern Recognition, 2nd Edition

by Andrew R. Webb

  • Edition: 2nd edition
  • Published: 2002
  • ISBN: 9780470845141
  • Pages: 514
  • Publisher: John Wiley & Sons
  • Format: Paperback
  • Language: English

About the book

Table of contents:

Preface. Notation.
1 Introduction to statistical pattern recognition. 1.1 Statistical pattern recognition. 1.1.1 Introduction. 1.1.2 The basic model. 1.2 Stages in a pattern recognition problem. 1.3 Issues. 1.4 Supervised versus unsupervised. 1.5 Approaches to statistical pattern recognition. 1.5.1 Elementary decision theory. 1.5.2 Discriminant functions. 1.6 Multiple regression. 1.7 Outline of book. 1.8 Notes and references. Exercises.
2 Density estimation - parametric. 2.1 Introduction. 2.2 Normal-based models. 2.2.1 Linear and quadratic discriminant functions. 2.2.2 Regularised discriminant analysis. 2.2.3 Example application study. 2.2.4 Further developments. 2.2.5 Summary. 2.3 Normal mixture models. 2.3.1 Maximum likelihood estimation via EM. 2.3.2 Mixture models for discrimination. 2.3.3 How many components? 2.3.4 Example application study. 2.3.5 Further developments. 2.3.6 Summary. 2.4 Bayesian estimates. 2.4.1 Bayesian learning methods. 2.4.2 Markov chain Monte Carlo. 2.4.3 Bayesian approaches to discrimination. 2.4.4 Example application study. 2.4.5 Further developments. 2.4.6 Summary. 2.5 Application studies. 2.6 Summary and discussion. 2.7 Recommendations. 2.8 Notes and references. Exercises.
3 Density estimation - nonparametric. 3.1 Introduction. 3.2 Histogram method. 3.2.1 Data-adaptive histograms. 3.2.2 Independence assumption. 3.2.3 Lancaster models. 3.2.4 Maximum weight dependence trees. 3.2.5 Bayesian networks. 3.2.6 Example application study. 3.2.7 Further developments. 3.2.8 Summary. 3.3 k-nearest-neighbour method. 3.3.1 k-nearest-neighbour decision rule. 3.3.2 Properties of the nearest-neighbour rule. 3.3.3 Algorithms. 3.3.4 Editing techniques. 3.3.5 Choice of distance metric. 3.3.6 Example application study. 3.3.7 Further developments. 3.3.8 Summary. 3.4 Expansion by basis functions. 3.5 Kernel methods. 3.5.1 Choice of smoothing parameter. 3.5.2 Choice of kernel. 3.5.3 Example application study. 3.5.4 Further developments. 3.5.5 Summary. 3.6 Application studies. 3.7 Summary and discussion. 3.8 Recommendations. 3.9 Notes and references. Exercises.
4 Linear discriminant analysis. 4.1 Introduction. 4.2 Two-class algorithms. 4.2.1 General ideas. 4.2.2 Perceptron criterion. 4.2.3 Fisher's criterion. 4.2.4 Least mean squared error procedures. 4.2.5 Support vector machines. 4.2.6 Example application study. 4.2.7 Further developments. 4.2.8 Summary. 4.3 Multiclass algorithms. 4.3.1 General ideas. 4.3.2 Error-correction procedure. 4.3.3 Fisher's criterion - linear discriminant analysis. 4.3.4 Least mean squared error procedures. 4.3.5 Optimal scaling. 4.3.6 Regularisation. 4.3.7 Multiclass support vector machines. 4.3.8 Example application study. 4.3.9 Further developments. 4.3.10 Summary. 4.4 Logistic discrimination. 4.4.1 Two-group case. 4.4.2 Maximum likelihood estimation. 4.4.3 Multiclass logistic discrimination. 4.4.4 Example application study. 4.4.5 Further developments. 4.4.6 Summary. 4.5 Application studies. 4.6 Summary and discussion. 4.7 Recommendations. 4.8 Notes and references. Exercises.
5 Nonlinear discriminant analysis - kernel methods. 5.1 Introduction. 5.2 Optimisation criteria. 5.2.1 Least squares error measure. 5.2.2 Maximum likelihood. 5.2.3 Entropy. 5.3 Radial basis functions. 5.3.1 Introduction. 5.3.2 Motivation. 5.3.3 Specifying the model. 5.3.4 Radial basis function properties. 5.3.5 Simple radial basis function. 5.3.6 Example application study. 5.3.7 Further developments. 5.3.8 Summary. 5.4 Nonlinear support vector machines. 5.4.1 Types of kernel. 5.4.2 Model selection. 5.4.3 Support vector machines for regression. 5.4.4 Example application study. 5.4.5 Further developments. 5.4.6 Summary. 5.5 Application studies. 5.6 Summary and discussion. 5.7 Recommendations. 5.8 Notes and references. Exercises.
6 Nonlinear discriminant analysis - projection methods. 6.1 Introduction. 6.2 The multilayer perceptron. 6.2.1 Introduction. 6.2.2 Specifying the multilayer perceptron structure. 6.2.3 Determining the multilayer perceptron weights. 6.2.4 Properties. 6.2.5 Example application study. 6.2.6 Further developments. 6.2.7 Summary. 6.3 Projection pursuit. 6.3.1 Introduction. 6.3.2 Projection pursuit for discrimination. 6.3.3 Example application study. 6.3.4 Further developments. 6.3.5 Summary. 6.4 Application studies. 6.5 Summary and discussion. 6.6 Recommendations. 6.7 Notes and references. Exercises.
7 Tree-based methods. 7.1 Introduction. 7.2 Classification trees. 7.2.1 Introduction. 7.2.2 Classifier tree construction. 7.2.3 Other issues. 7.2.4 Example application study. 7.2.5 Further developments. 7.2.6 Summary. 7.3 Multivariate adaptive regression splines. 7.3.1 Introduction. 7.3.2 Recursive partitioning model. 7.3.3 Example application study. 7.3.4 Further developments. 7.3.5 Summary. 7.4 Application studies. 7.5 Summary and discussion. 7.6 Recommendations. 7.7 Notes and references. Exercises.
8 Performance. 8.1 Introduction. 8.2 Performance assessment. 8.2.1 Discriminability. 8.2.2 Reliability. 8.2.3 ROC curves for two-class rules. 8.2.4 Example application study. 8.2.5 Further developments. 8.2.6 Summary. 8.3 Comparing classifier performance. 8.3.1 Which technique is best? 8.3.2 Statistical tests. 8.3.3 Comparing rules when misclassification costs are uncertain. 8.3.4 Example application study. 8.3.5 Further developments. 8.3.6 Summary. 8.4 Combining classifiers. 8.4.1 Introduction. 8.4.2 Motivation. 8.4.3 Characteristics of a combination scheme. 8.4.4 Data fusion. 8.4.5 Classifier combination methods. 8.4.6 Example application study. 8.4.7 Further developments. 8.4.8 Summary. 8.5 Application studies. 8.6 Summary and discussion. 8.7 Recommendations. 8.8 Notes and references. Exercises.
9 Feature selection and extraction. 9.1 Introduction. 9.2 Feature selection. 9.2.1 Feature selection criteria. 9.2.2 Search algorithms for feature selection. 9.2.3 Suboptimal search algorithms. 9.2.4 Example application study. 9.2.5 Further developments. 9.2.6 Summary. 9.3 Linear feature extraction. 9.3.1 Principal components analysis. 9.3.2 Karhunen-Loeve transformation. 9.3.3 Factor analysis. 9.3.4 Example application study. 9.3.5 Further developments. 9.3.6 Summary. 9.4 Multidimensional scaling. 9.4.1 Classical scaling. 9.4.2 Metric multidimensional scaling. 9.4.3 Ordinal scaling. 9.4.4 Algorithms. 9.4.5 Multidimensional scaling for feature extraction. 9.4.6 Example application study. 9.4.7 Further developments. 9.4.8 Summary. 9.5 Application studies. 9.6 Summary and discussion. 9.7 Recommendations. 9.8 Notes and references. Exercises.
10 Clustering. 10.1 Introduction. 10.2 Hierarchical methods. 10.2.1 Single-link method. 10.2.2 Complete-link method. 10.2.3 Sum-of-squares method. 10.2.4 General agglomerative algorithm. 10.2.5 Properties of a hierarchical classification. 10.2.6 Example application study. 10.2.7 Summary. 10.3 Quick partitions. 10.4 Mixture models. 10.4.1 Model description. 10.4.2 Example application study. 10.5.

Access codes and digital supplementary material are not guaranteed with used books

More about Statistical Pattern Recognition (2002)

Statistical Pattern Recognition by Andrew R. Webb was released in July 2002. It is the 2nd edition of the textbook. It is written in English and runs to 514 pages. The publisher is John Wiley & Sons, headquartered in Hoboken.

Buy Statistical Pattern Recognition on Studentapan and save money.

Citing Statistical Pattern Recognition (2nd Edition)

Harvard

Webb, A. R. (2002). Statistical Pattern Recognition. 2nd ed. John Wiley & Sons.

Oxford

Webb, Andrew R., Statistical Pattern Recognition, 2nd edn (John Wiley & Sons, 2002).

APA

Webb, A. R. (2002). Statistical Pattern Recognition (2nd ed.). John Wiley & Sons.

Vancouver

Webb AR. Statistical Pattern Recognition. 2nd ed. John Wiley & Sons; 2002.