Sparse Representation, Modeling and Learning in Visual Recognition: Theory, Algorithms and Applications.

This unique text/reference presents a comprehensive review of the state of the art in sparse representations, modeling and learning. The book examines both the theoretical foundations and details of algorithm implementation, highlighting the practical application of compressed sensing research in vi...

Bibliographic Details
Authors and Corporations: Cheng, Hong.
Type of Resource: E-Book
Language: English
Published: London : Springer London, Limited, 2015. ©2015.
Series: Advances in Computer Vision and Pattern Recognition Ser.
Subjects:
Source: Ebook Central
ISBN: 9781447167143
Table of Contents:
  • Intro
  • Preface
  • Contents
  • Mathematical Notation
  • Part I Introduction and Fundamentals
  • 1 Introduction
  • 1.1 Sparse Representation, Modeling, and Learning
  • 1.1.1 Sparse Representation
  • 1.1.2 Sparse Modeling
  • 1.1.3 Sparse Learning
  • 1.2 Visual Recognition
  • 1.2.1 Feature Representation and Learning
  • 1.2.2 Distance Metric Learning
  • 1.2.3 Classification
  • 1.3 Other Applications
  • 1.3.1 Single-Pixel Cameras
  • 1.3.2 Superresolution
  • References
  • 2 The Fundamentals of Compressed Sensing
  • 2.1 Sampling Theorems
  • 2.2 Compressive Sampling
  • 2.2.1 Random Projection and Measurement Matrix
  • 2.2.2 Sparsity
  • 2.2.3 Structured Sparsity
  • 2.3 ℓ0, ℓ1 and ℓ2 Norms
  • 2.4 Spark and Singleton Bound
  • 2.5 Null Space Property
  • 2.6 Uniform Uncertainty Principle, Incoherence Condition
  • 2.7 ℓ1 and ℓ0 Equivalence
  • 2.8 Stable Recovery Property
  • 2.9 Information Theory
  • 2.9.1 K-sparse Signal Model
  • 2.9.2 The Entropy of K-sparse Signals
  • 2.9.3 Mutual Information
  • 2.10 Sparse Convex Optimization
  • 2.10.1 Introduction to Convex Optimization
  • 2.10.2 Gradient, Subgradient, Accelerated Gradient
  • 2.10.3 Augmented Lagrangian Method
  • References
  • Part II Sparse Representation, Modeling and Learning
  • 3 Sparse Recovery Approaches
  • 3.1 Introduction
  • 3.2 Convex Relaxation
  • 3.2.1 Linear Programming Solutions
  • 3.2.2 Second-Order Cone Programs with Log-Barrier Method
  • 3.2.3 ℓ1-Homotopy Methods
  • 3.2.4 Elastic Net
  • 3.3 Greedy Algorithms
  • 3.3.1 MP and OMP
  • 3.3.2 CoSaMP
  • 3.3.3 Iterative Hard Thresholding Algorithm
  • 3.4 Sparse Bayesian Learning
  • 3.4.1 Bayesian Viewpoint of Sparse Representation
  • 3.4.2 Sparse Representation via Relevance Vector Machine
  • 3.4.3 Sparse Bayesian Learning
  • 3.5 ℓ0-Norm Gradient Minimization
  • 3.5.1 Counting Gradient Difference
  • 3.5.2 ℓ0-Norm Sparse Optimization Problems
  • 3.5.3 ℓ0-Norm Sparse Solution
  • 3.5.4 Applications
  • 3.6 The Sparse Feature Projection Approach
  • 3.6.1 Gaussian Process Regression for Feature Transforms
  • 3.6.2 Sparse Projection of Input Feature Vectors
  • References
  • 4 Robust Sparse Representation, Modeling and Learning
  • 4.1 Introduction
  • 4.2 Robust Statistics
  • 4.2.1 Connection Between MLE and Residuals
  • 4.2.2 M-Estimators
  • 4.3 Robust Sparse PCA
  • 4.3.1 Introduction
  • 4.3.2 PCA
  • 4.3.3 Robust Sparse Coding
  • 4.3.4 Robust SPCA
  • 4.3.5 Applications
  • References
  • 5 Efficient Sparse Representation and Modeling
  • 5.1 Introduction
  • 5.1.1 Large-Scale Signal Representation and Modeling
  • 5.1.2 The Computation Complexity of Different Sparse Recovery Algorithms
  • 5.2 The Feature-Sign Search Algorithms
  • 5.2.1 Fixed-Point Continuations for ℓ1-Minimization
  • 5.2.2 The Basic Feature-Sign Search Algorithm
  • 5.2.3 The Subspace Shrinkage and Optimization Algorithm
  • 5.3 Efficient Sparse Coding Using Graphical Models
  • 5.3.1 Graphical Models of CS Encoding Matrix
  • 5.3.2 Bayesian Compressive Sensing
  • 5.3.3 Bayesian Compressive Sensing Using Belief Propagation (CS-BP)
  • 5.4 Efficient Sparse Bayesian Learning
  • 5.4.1 Introduction
  • 5.4.2 Sequential Sparse Bayesian Models
  • 5.4.3 The Algorithm Flowchart
  • 5.5 Sparse Quantization
  • 5.5.1 Signal Sparse Approximation Problems
  • 5.5.2 K-Highest Sparse Quantization
  • 5.6 Hashed Sparse Representation
  • 5.6.1 Hash Functions
  • 5.6.2 Structured Dictionary Learning
  • 5.6.3 Hashing and Dictionary Learning
  • 5.6.4 Flowchart of Algorithm
  • 5.7 Compressive Feature
  • 5.7.1 Generating Compressive
  • 5.7.2 Applications
  • References
  • Part III Visual Recognition Applications
  • 6 Feature Representation and Learning
  • 6.1 Introduction
  • 6.2 Feature Extraction
  • 6.2.1 Feature Representation Using Sparse Coding
  • 6.2.2 Feature Coding and Pooling
  • 6.2.3 Invariant Features
  • 6.3 Dictionary Learning
  • 6.3.1 K-SVD
  • 6.3.2 Discriminative Dictionary Learning
  • 6.3.3 Online Dictionary Learning
  • 6.3.4 Supervised Dictionary Learning
  • 6.3.5 Joint Dictionary Learning and Other Tasks
  • 6.3.6 Applications
  • Image/Video Restoration
  • 6.4 Feature Learning
  • 6.4.1 Dimensionality Reduction
  • 6.4.2 Sparse Support Vector Machines
  • 6.4.3 Recursive Feature Elimination
  • 6.4.4 Minimum Squared Error (MSE) Criterions
  • 6.4.5 Elastic Net Criterions
  • 6.4.6 Sparse Linear Discriminant Analysis
  • 6.4.7 Saliency Feature Mapping Using Sparse Coding
  • References
  • 7 Sparsity-Induced Similarity
  • 7.1 Introduction
  • 7.2 Sparsity-Induced Similarity
  • 7.2.1 The Clustering Condition of Subspaces
  • 7.2.2 The Sparse-Induced Similarity Measure
  • 7.2.3 Nonnegative Sparsity-Induced Similarity
  • 7.2.4 Some Basic Issues in SIS
  • 7.2.5 A Toy Problem
  • 7.3 Application
  • 7.3.1 Label Propagation
  • 7.3.2 Human Activity Recognition
  • 7.3.3 Visual Tracking
  • 7.3.4 Image Categorization
  • 7.3.5 Spam Image Cluster
  • References
  • 8 Sparse Representation and Learning-Based Classifiers
  • 8.1 Introduction
  • 8.2 Sparse Representation-Based Classifiers (SRC)
  • 8.2.1 The SRC Algorithm and Its Invariants
  • 8.2.2 Classification Error Analysis
  • 8.3 Sparse Coding-Based Spatial Pyramid Matching (ScSPM)
  • 8.3.1 Assignment-Based Sparse Coding
  • 8.3.2 The Spatial Pooling
  • 8.3.3 The Sparse Coding-Based Spatial Pyramid Matching
  • 8.4 Sparsity Coding-Based Nearest Neighbor Classifiers (ScNNC)
  • 8.4.1 Sparse Coding-Based Naive Bayes Nearest Neighbor
  • 8.4.2 Sparse Approximated Nearest Points (SANP) Approaches
  • 8.5 Sparse Coding-Based Deformable Part Models (ScDPM)
  • 8.5.1 Deformable Part Models
  • 8.5.2 Sparselet Models
  • 8.5.3 The Flowchart of ScDPM
  • References
  • Part IV Advanced Topics
  • 9 Beyond Sparsity
  • 9.1 Low-Rank Matrix Approximation
  • 9.1.1 Introduction
  • 9.1.2 ℓ2-Norm Wiberg Algorithm
  • 9.1.3 ℓ1-Norm Wiberg Algorithm
  • 9.2 Graphical Models in Compressed Sensing
  • 9.2.1 Inference via Message Passing Algorithm
  • 9.2.2 Inference via Approximate Message Passing Algorithm
  • 9.3 Collaborative Representation-Based Classifiers
  • 9.3.1 Sparse Representation and Collaborative Representation
  • 9.3.2 Collaborative Representation-Based Classification (CRC)
  • 9.4 High-Dimensional Nonlinear Learning
  • 9.4.1 Kernel Sparse Representation
  • 9.4.2 Anchor Points Approaches
  • 9.4.3 Sparse Manifold Learning
  • References
  • Appendix A Mathematics
  • Appendix B Computer Programming Resources for Sparse Recovery Approaches
  • Appendix C The Source Code of Sparsity Induced Similarity
  • Appendix D Derivations
  • Index.