Table of contents
1 Introduction
1.1 Challenges in Extreme Classification
1.1.1 Class Imbalance/Data Scarcity
1.1.2 High dimensionality/Large sample size
1.1.3 Structure and Label Dependence exploitation
1.1.4 Training/Inference Complexity reduction
1.2 Contributions
1.3 Outline
2 Extreme Single Label Classification
2.1 Introduction
2.2 Flat approaches
2.2.1 Machine Learning Reductions
2.2.1.1 Binary Classification
2.2.1.2 One-versus-Rest Classifier
2.2.1.3 Error Correcting Output Codes
2.2.1.4 Discussion
2.2.2 Embedding approaches
2.2.2.1 Sequence of Convex Problems
2.2.2.2 Joint Non-Convex Embedding
2.2.2.3 Discussion
2.2.3 Conclusion
2.3 Hierarchical Approaches
2.3.1 Hierarchical Structure Learning
2.3.1.1 Spectral Clustering
2.3.1.2 Learning Class Hierarchies
2.3.1.3 Discussion
2.3.2 Discriminative Models Learning
2.3.2.1 Independent Optimization of Models: Pachinko Machines
2.3.2.2 Joint Optimization of Models
2.3.2.3 Regularization Based Approaches
Similarity based dependence modeling:
Dissimilarity based dependence modeling:
2.3.2.4 Sequential Learning of Models
2.3.3 Joint Learning of Models and Hierarchical Structure
2.3.3.1 Fast and Balanced Hierarchies (Deng et al., 2011)
2.3.3.2 Relaxed Discriminant Hierarchies (Gao and Koller, 2011a)
2.3.3.3 Discussion
2.3.4 Conclusion
3 Extreme Single Label Classification with Compact Output Coding
3.1 Introduction
3.2 Learning Distributed Representation of Classes (LDR)
3.2.1 Principle
3.2.2 Learning Compact Binary Class-codes
3.2.3 Relations to ECOC
3.2.4 Training and inference complexity
3.3 Experiments
3.3.1 Datasets
3.3.2 Experimental setup
3.3.3 Comparison of the methods
3.3.4 Zero-shot learning
3.4 Conclusion
4 Extreme Multilabel Classification
4.1 Introduction
4.2 In defense of Hamming Loss
4.3 On Binary Relevance
4.4 Early approaches to MLC
4.4.1 Stacking Binary Relevance
4.4.2 Classifier Chains (CC)
4.4.3 Label Powerset and friends
4.5 Scalable approaches to Extreme MLC
4.5.1 Label Selection Methods
4.5.1.1 Label Space Pruning
4.5.1.2 Column Subset Selection Method
4.5.2 Label Transformation Methods
4.5.2.1 Compressed Sensing
4.5.2.2 Principal Label Space Transformation
4.6 Conclusion
5 Extreme Multilabel Classification with Bloom Filters
5.1 Introduction
5.2 Background on Bloom Filters
5.3 Standard Bloom Filters for Multilabel Classification
5.3.1 Encoding and Decoding
5.3.2 Computational Complexity
5.4 Extreme MLC with Robust Bloom Filters
5.4.1 Label Clustering
5.4.2 Encoding and decoding
5.4.2.1 Encoding and Hash functions
5.4.2.2 Decoding and Robustness Proof
5.5 Experiments
5.5.1 Datasets
5.5.2 Evaluation metrics
5.5.3 Baselines and experimental setup
5.5.4 Parameter selection for Standard Bloom Filters
5.5.5 Parameter selection for Robust Bloom Filters
5.5.6 Correlation Decoding (CD) versus Standard Decoding (SD)
5.5.7 Comparative Results
5.5.8 Runtime analysis
5.6 Conclusion
6 Conclusion and Perspectives
Bibliography