Category | Methods |
---|---|
Robust Loss Function | Robust MAE, Generalized Cross Entropy, Symmetric Cross Entropy, Curriculum Loss |
Robust Architecture | Webly Learning, Noise Model, Dropout Noise Model, S-model, C-model, NLNN, Probabilistic Noise Model, Masking, Contrastive-Additive Noise Network |
Robust Regularization | Adversarial Training, Label Smoothing, Mixup, Bilevel Learning, Annotator Confusion, Pre-training |
Loss Adjustment | Gold Loss Correction, Importance Reweighting, Active Bias, Bootstrapping, Dynamic Bootstrapping, D2L, SELFIE |
Sample Selection | Decouple, MentorNet, Co-teaching, Co-teaching+, Iterative Detection, ITLM, INCV, SELFIE, SELF, Curriculum Loss |
Meta Learning | Meta-Regressor, Knowledge Distillation, L2LWS, CWS, Automatic Reweighting, MLNT, Meta-Weight-Net, Data Coefficients |
Semi-supervised Learning | Label Aggregation, Two-Stage Framework, SELF, DivideMix |
- (Robust MAE) Robust Loss Functions under Label Noise for Deep Neural Networks pdf
- (Generalized Cross Entropy) Generalized cross entropy loss for training deep neural networks with noisy labels pdf
- (Symmetric Cross Entropy) Symmetric cross entropy for robust learning with noisy labels pdf
- (Curriculum Loss) Curriculum loss: Robust learning and generalization against label corruption pdf
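
To make the robust-loss entries above concrete, here is a minimal PyTorch sketch of the Generalized Cross Entropy loss; the function name and the default `q=0.7` are illustrative choices, not taken from the paper's code.

```python
import torch
import torch.nn.functional as F

def generalized_cross_entropy(logits, targets, q=0.7):
    # GCE (Zhang & Sabuncu, 2018): loss = (1 - p_y^q) / q.
    # q -> 0 recovers cross entropy; q = 1 gives the noise-robust MAE.
    probs = F.softmax(logits, dim=1)
    p_y = probs.gather(1, targets.unsqueeze(1)).squeeze(1)  # prob. of the given (noisy) label
    return ((1.0 - p_y.clamp(min=1e-7) ** q) / q).mean()

# usage: loss = generalized_cross_entropy(model(x), y)
```
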
- (Webly Learning) Webly supervised learning of convolutional networks pdf
- (Noise Model) Training convolutional networks with noisy labels pdf
- (Dropout Noise Model) Learning deep networks from noisy labels with dropout regularization pdf
- (S-model, C-model) Training deep neural-networks using a noise adaptation layer pdf
- (NLNN) Training deep neural-networks based on unreliable labels pdf
- (Probabilistic Noise Model) Learning from massive noisy labeled data for image classification pdf
- (Masking) Masking: A new perspective of noisy supervision pdf
- (Contrastive-Additive Noise Network) Deep learning from noisy image labels with quality embedding pdf
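
Several of the architecture entries above attach an explicit noise model on top of the base classifier. Below is a minimal sketch in the spirit of the s-model (a learned label-transition layer); the class name and the near-identity initialization are my assumptions, not the paper's code.

```python
import torch
import torch.nn as nn

class NoiseAdaptationLayer(nn.Module):
    # s-model-style layer (Goldberger & Ben-Reuven, 2017): the observed noisy
    # label is the clean prediction pushed through a learned transition matrix,
    # p(noisy = j | x) = sum_c p(clean = c | x) * T[c, j].
    def __init__(self, num_classes):
        super().__init__()
        # start near the identity so training begins close to "no noise"
        self.transition_logits = nn.Parameter(torch.eye(num_classes) * 5.0)

    def forward(self, clean_probs):
        T = torch.softmax(self.transition_logits, dim=1)  # row-stochastic
        return clean_probs @ T

# usage: train with NLL of the *noisy* labels on these adapted probabilities,
# then drop the layer and predict with the base classifier at test time.
```
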
- (Adversarial Training) Explaining and harnessing adversarial examples pdf
- (Label Smoothing) Regularizing neural networks by penalizing confident output distributions pdf
- (Mixup) mixup: Beyond empirical risk minimization pdf
- (Bilevel Learning) Deep bilevel learning pdf
- (Annotator Confusion) Learning from noisy labels by regularized estimation of annotator confusion pdf
- (Pre-training) Using Pre-Training Can Improve Model Robustness and Uncertainty pdf
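
Of the regularizers above, mixup is the simplest to show in code. A minimal sketch; `alpha=0.2` and the helper name are illustrative.

```python
import numpy as np
import torch

def mixup_batch(x, y_onehot, alpha=0.2):
    # mixup (Zhang et al., 2018): train on convex combinations of sample
    # pairs; the interpolation also dilutes the effect of wrong labels.
    lam = float(np.random.beta(alpha, alpha))
    idx = torch.randperm(x.size(0))
    return lam * x + (1 - lam) * x[idx], lam * y_onehot + (1 - lam) * y_onehot[idx]

# usage:
#   x_mix, y_mix = mixup_batch(x, F.one_hot(y, num_classes).float())
#   loss = -(y_mix * F.log_softmax(model(x_mix), dim=1)).sum(dim=1).mean()
```
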
- (Gold Loss Correction) Using trusted data to train deep networks on labels corrupted by severe noise pdf
- (Importance Reweighting) Multiclass learning with partially corrupted labels pdf
- (Active Bias) Active Bias: Training more accurate neural networks by emphasizing high variance samples pdf
- (Bootstrapping) Training deep neural networks on noisy labels with bootstrapping pdf
- (Dynamic Bootstrapping) Unsupervised label noise modeling and loss correction pdf
- (D2L) Dimensionality-driven learning with noisy labels pdf
- (SELFIE) SELFIE: Refurbishing unclean samples for robust deep learning pdf
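
As an example of loss adjustment, here is a sketch of the soft variant of the Bootstrapping entry above; `beta=0.95` follows the paper's soft setting, while the function name and the `detach` are my assumptions.

```python
import torch.nn.functional as F

def soft_bootstrapping_loss(logits, targets, beta=0.95):
    # Soft bootstrapping (Reed et al., 2015): replace the noisy one-hot target
    # with a blend of that target and the model's own current prediction.
    one_hot = F.one_hot(targets, logits.size(1)).float()
    blended = beta * one_hot + (1 - beta) * F.softmax(logits, dim=1).detach()
    return -(blended * F.log_softmax(logits, dim=1)).sum(dim=1).mean()
```
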
- (Decouple) Decoupling “when to update” from “how to update” pdf
- (MentorNet) MentorNet: Learning data-driven curriculum for very deep neural networks on corrupted labels pdf
- (Co-teaching) Co-teaching: Robust training of deep neural networks with extremely noisy labels pdf
- (Co-teaching+) How does disagreement help generalization against label corruption? pdf
- (Iterative Detection) Iterative learning with open-set noisy labels pdf
- (ITLM) Learning with bad training data via iterative trimmed loss minimization pdf
- (INCV) Understanding and utilizing deep neural networks trained with noisy labels pdf
- (SELFIE) SELFIE: Refurbishing unclean samples for robust deep learning pdf
- (SELF) Self: Learning to filter noisy labels with self-ensembling pdf
- (Curriculum Loss) Curriculum loss: Robust learning and generalization against label corruption pdf
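
Most selection methods above share the small-loss criterion. Below is a minimal sketch of one co-teaching step; the fixed forget rate and the function name are simplifying assumptions (the paper anneals the forget rate over epochs).

```python
import torch
import torch.nn.functional as F

def coteaching_losses(logits_a, logits_b, y, forget_rate=0.2):
    # Co-teaching (Han et al., 2018): each network picks its small-loss
    # (likely clean) samples, and its peer is updated on that selection.
    keep = int((1.0 - forget_rate) * y.size(0))
    loss_a = F.cross_entropy(logits_a, y, reduction="none")
    loss_b = F.cross_entropy(logits_b, y, reduction="none")
    idx_a = torch.argsort(loss_a)[:keep]  # samples network A trusts
    idx_b = torch.argsort(loss_b)[:keep]  # samples network B trusts
    # cross-update: A trains on B's picks, B trains on A's picks
    return F.cross_entropy(logits_a[idx_b], y[idx_b]), F.cross_entropy(logits_b[idx_a], y[idx_a])
```
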
- (Meta-Regressor) Noise detection in the meta-learning level pdf
- (Knowledge Distillation) Learning from noisy labels with distillation pdf
- (L2LWS) Learning to Learn from Weak Supervision by Full Supervision pdf
- (CWS) Avoiding your teacher’s mistakes: Training neural networks with controlled weak supervision pdf
- (Automatic Reweighting) Learning to reweight examples for robust deep learning pdf
- (MLNT) Learning to learn from noisy labeled data pdf
- (Meta-Weight-Net) Meta-Weight-Net: Learning an explicit mapping for sample weighting pdf
- (Data Coefficients) Distilling effective supervision from severe label noise pdf
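
Several meta-learning entries above learn a weighting function on a small clean set. Here is a sketch of only the weighting network of Meta-Weight-Net; the bilevel meta-update that actually trains it on clean validation data is elided, and the hidden size is illustrative.

```python
import torch
import torch.nn as nn

class MetaWeightNet(nn.Module):
    # Meta-Weight-Net (Shu et al., 2019): a tiny MLP mapping a sample's loss
    # to a weight in [0, 1]; it is fitted by meta-gradients on clean
    # validation data (that outer loop is omitted in this sketch).
    def __init__(self, hidden=100):
        super().__init__()
        self.net = nn.Sequential(nn.Linear(1, hidden), nn.ReLU(),
                                 nn.Linear(hidden, 1), nn.Sigmoid())

    def forward(self, per_sample_loss):
        return self.net(per_sample_loss.unsqueeze(1)).squeeze(1)

# usage: w = MetaWeightNet()(loss_vec); weighted_loss = (w * loss_vec).mean()
```
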
- (Label Aggregation) Robust semi-supervised learning through label aggregation pdf
- (Two-Stage Framework) A semi-supervised two-stage approach to learning from noisy labels pdf
- (SELF) Self: Learning to filter noisy labels with self-ensembling pdf
- (DivideMix) DivideMix: Learning with noisy labels as semi-supervised learning pdf
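
A sketch of the loss-based split at the heart of the DivideMix entry above: per-sample training losses are fitted with a two-component Gaussian mixture, and the low-mean component is treated as clean. The function name and the 0.5 threshold are illustrative assumptions.

```python
import numpy as np
from sklearn.mixture import GaussianMixture

def split_clean_noisy(per_sample_loss, threshold=0.5):
    # DivideMix (Li et al., 2020) divides the training set by fitting a
    # 2-component GMM to losses; low-loss samples are kept as labeled data,
    # the rest become unlabeled data for semi-supervised training.
    losses = np.asarray(per_sample_loss, dtype=np.float64).reshape(-1, 1)
    gmm = GaussianMixture(n_components=2, reg_covar=5e-4).fit(losses)
    p_clean = gmm.predict_proba(losses)[:, gmm.means_.argmin()]
    return p_clean > threshold, p_clean
```
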
- Robust deep supervised hashing for image retrieval
- Noise-Robust Contrastive Learning pdf
- Robust Curriculum Learning: From Clean Label Detection to Noisy Label Self-Correction pdf
- Robust Learning via Golden Symmetric Loss of (Un)trusted Labels pdf
- Long-Tail Zero and Few-Shot Learning via Contrastive Pretraining on and for Small Data pdf
- A Simple Approach to Define Curricula for Training Neural Networks pdf
- Interactive Weak Supervision: Learning Useful Heuristics for Data Labeling pdf
- Prototypical Representation Learning for Relation Extraction pdf
- Neighbor Class Consistency on Unsupervised Domain Adaptation pdf
- On the Importance of Looking at the Manifold pdf
- Shape or Texture: Understanding Discriminative Features in CNNs pdf
- Learning to Explore with Pleasure pdf