Overview

This book develops advanced statistical techniques for classification. Specifically, it explains Support Vector Machines (SVM), Discriminant Analysis, Logistic Regression, Naive Bayes, Nearest Neighbor (kNN), and pattern recognition through neural networks.

The Naive Bayes classifier is designed for use when predictors are independent of one another within each class, but it appears to work well in practice even when that independence assumption is not valid. It classifies data in two steps:

1. Training step: Using the training data, the method estimates the parameters of a probability distribution, assuming predictors are conditionally independent given the class.
2. Prediction step: For any unseen test data, the method computes the posterior probability of that sample belonging to each class, then classifies the test data according to the largest posterior probability.

The class-conditional independence assumption greatly simplifies the training step, since you can estimate the one-dimensional class-conditional density for each predictor individually. Although class-conditional independence between predictors does not hold in general, research shows that this optimistic assumption works well in practice. It also allows the Naive Bayes classifier to estimate the parameters required for accurate classification from less training data than many other classifiers need, which makes it particularly effective for data sets containing many predictors.

Given a set X of n points and a distance function, k-nearest neighbor (kNN) search lets you find the k points in X closest to a query point or set of points Y. The kNN search technique and kNN-based algorithms are widely used as benchmark learning rules: the relative simplicity of the kNN search technique makes it easy to compare the results from other classification techniques to kNN results.
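The two Naive Bayes steps above can be sketched outside any toolbox. The following is a minimal Python illustration (not the book's MATLAB code; the function names `train_gnb` and `predict_gnb` are invented for this sketch), assuming Gaussian class-conditional densities for each predictor:

```python
import math
from collections import defaultdict

def train_gnb(X, y):
    """Training step: estimate a per-class mean and variance for each predictor,
    treating predictors as conditionally independent given the class."""
    by_class = defaultdict(list)
    for xi, yi in zip(X, y):
        by_class[yi].append(xi)
    n = len(X)
    stats = {}
    for c, rows in by_class.items():
        cols = list(zip(*rows))
        means = [sum(col) / len(col) for col in cols]
        # Small floor on the variance avoids division by zero.
        varis = [sum((v - m) ** 2 for v in col) / len(col) + 1e-9
                 for col, m in zip(cols, means)]
        stats[c] = (math.log(len(rows) / n), means, varis)
    return stats

def predict_gnb(stats, x):
    """Prediction step: return the class with the largest posterior,
    computed as log prior plus a sum of per-predictor log likelihoods."""
    def log_posterior(c):
        log_prior, means, varis = stats[c]
        return log_prior + sum(
            -0.5 * math.log(2 * math.pi * v) - (xj - m) ** 2 / (2 * v)
            for xj, m, v in zip(x, means, varis))
    return max(stats, key=log_posterior)
```

Because the densities are one-dimensional per predictor, the training step is a simple per-column pass over the data, which is exactly the simplification the independence assumption buys.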
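The kNN search just described can likewise be sketched in a few lines. This is a hypothetical Python illustration (brute-force Euclidean search; `knn_search` and `knn_classify` are names invented here), not the toolbox implementation:

```python
import math
from collections import Counter

def knn_search(X, query, k):
    """Return the indices of the k points in X closest to query
    under the Euclidean distance."""
    dists = [(math.dist(p, query), i) for i, p in enumerate(X)]
    return [i for _, i in sorted(dists)[:k]]

def knn_classify(X, y, query, k):
    """Benchmark learning rule: majority vote among the labels
    of the k nearest neighbors."""
    votes = Counter(y[i] for i in knn_search(X, query, k))
    return votes.most_common(1)[0][0]
```

A sketch this short is part of why kNN serves as a benchmark: its predictions are easy to reproduce and compare against more elaborate classifiers.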
Neural networks have been trained to perform complex functions in various fields, including pattern recognition, identification, classification, speech, vision, and control systems. They can also be trained to solve problems that are difficult for conventional computers or human beings. The toolbox emphasizes neural network paradigms that build up to, or are themselves used in, engineering, financial, and other practical applications.

The choice of network architecture for pattern recognition problems follows guidelines similar to those for function fitting problems. More neurons require more computation and tend to overfit the data when their number is set too high, but they allow the network to solve more complicated problems. More layers also require more computation, but their use may let the network solve complex problems more efficiently. To use more than one hidden layer, enter the hidden layer sizes as elements of an array in the patternnet command.

Full Product Details

Author: C. Perez
Publisher: Independently Published
Imprint: Independently Published
Dimensions: Width: 15.20cm, Height: 0.90cm, Length: 22.90cm
Weight: 0.227kg
ISBN: 9781097467273
ISBN 10: 1097467279
Pages: 150
Publication Date: 08 May 2019
Audience: General/trade, General
Format: Paperback
Publisher's Status: Active
Availability: Available To Order
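The idea of passing hidden layer sizes as an array, as patternnet accepts in MATLAB, can be sketched in plain Python. This is an illustrative analogue only (the helper names `init_mlp` and `forward` are invented here), building one weight matrix per layer from a list of hidden sizes:

```python
import math
import random

def init_mlp(n_inputs, hidden_sizes, n_outputs, seed=0):
    """Analogue of patternnet([10, 5]): each element of hidden_sizes
    adds one hidden layer with that many neurons."""
    rng = random.Random(seed)
    sizes = [n_inputs] + list(hidden_sizes) + [n_outputs]
    layers = []
    for fan_in, fan_out in zip(sizes, sizes[1:]):
        W = [[rng.uniform(-0.5, 0.5) for _ in range(fan_in)]
             for _ in range(fan_out)]
        b = [0.0] * fan_out
        layers.append((W, b))
    return layers

def forward(layers, x):
    """Forward pass with tanh units: more neurons or layers mean
    more weights, and therefore more computation, exactly as the
    architecture guidelines above describe."""
    for W, b in layers:
        x = [math.tanh(sum(w * xi for w, xi in zip(row, x)) + bi)
             for row, bi in zip(W, b)]
    return x
```

For example, `init_mlp(3, [10, 5], 2)` yields three weighted layers (input-to-hidden, hidden-to-hidden, hidden-to-output), mirroring how a two-element hidden size array produces a two-hidden-layer network.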