Overview

This book constitutes the thoroughly refereed post-proceedings of the PASCAL (Pattern Analysis, Statistical Modelling and Computational Learning) Statistical and Optimization Perspectives Workshop on Subspace, Latent Structure and Feature Selection techniques, SLSFS 2005. The 9 revised full papers, presented together with 5 invited papers, reflect the key approaches that have been developed for subspace identification and feature selection, using dimension reduction techniques, subspace methods, and random projection methods, among others.

Full Product Details

Author: Craig Saunders, Marko Grobelnik, Steve Gunn, John Shawe-Taylor
Publisher: Springer-Verlag Berlin and Heidelberg GmbH & Co. KG
Imprint: Springer-Verlag Berlin and Heidelberg GmbH & Co. KG
Edition: 2006 ed.
Volume: 3940
Dimensions: Width: 15.50cm, Height: 1.20cm, Length: 23.50cm
Weight: 0.710kg
ISBN-13: 9783540341376
ISBN-10: 3540341374
Pages: 209
Publication Date: 16 May 2006
Audience: Professional and scholarly, Professional & Vocational
Format: Paperback
Publisher's Status: Active
Availability: In Print
Table of Contents

Invited Contributions:
- Discrete Component Analysis
- Overview and Recent Advances in Partial Least Squares
- Random Projection, Margins, Kernels, and Feature-Selection
- Some Aspects of Latent Structure Analysis
- Feature Selection for Dimensionality Reduction

Contributed Papers:
- Auxiliary Variational Information Maximization for Dimensionality Reduction
- Constructing Visual Models with a Latent Space Approach
- Is Feature Selection Still Necessary?
- Class-Specific Subspace Discriminant Analysis for High-Dimensional Data
- Incorporating Constraints and Prior Knowledge into Factorization Algorithms – An Application to 3D Recovery
- A Simple Feature Extraction for High Dimensional Image Representations
- Identifying Feature Relevance Using a Random Forest
- Generalization Bounds for Subspace Selection and Hyperbolic PCA
- Less Biased Measurement of Feature Selection Benefits

Countries Available: All regions