Poster
Dynamic Pseudo Labeling via Gradient Cutting for High-Low Entropy Exploration
Jae Hyeon Park · Joo Hyeon Jeon · Jae Yun Lee · Sangyeon Ahn · MinHee Cha · Min Geol Kim · Hyeok Nam · Sung In Cho
This study addresses the limitations of existing dynamic pseudo-labeling techniques, which typically rely on static or dynamic thresholds for confident sample selection. Traditional methods fail to capture the non-linear relationship between task accuracy and model confidence, particularly under overconfidence, thus limiting learning opportunities for sensitive samples that strongly influence a model's generalization ability. To solve this, we propose a novel gradient pass-based dynamic pseudo-labeling (DPL) technique that incorporates high-entropy samples, which are typically overlooked. Our approach introduces two classifiers, a low gradient pass (LGP) and a high gradient pass (HGP), to derive sensitive dynamic thresholds (SDT) and underconfident dynamic thresholds (UDT), respectively. By effectively combining these thresholds with those from converged and overconfident states, we aim to create a more adaptive and effective learning strategy. Our main contributions highlight the importance of considering both low- and high-confidence samples in enhancing model robustness and generalization for improved pseudo-labeling (PL) performance.
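To make the core idea concrete, the following is a minimal toy sketch of pseudo-label selection that, unlike confidence-only schemes, also admits high-entropy samples. The function names, threshold values, and the simple disjunctive selection rule here are illustrative assumptions for exposition; they are not the paper's actual LGP/HGP classifiers or its SDT/UDT threshold derivation.

```python
import numpy as np

def entropy(probs, eps=1e-12):
    # Shannon entropy per sample from predicted class probabilities
    return -np.sum(probs * np.log(probs + eps), axis=1)

def select_samples(probs, conf_thr=0.9, ent_thr=1.0):
    """Toy selection rule (illustrative, not the paper's method):
    keep conventional high-confidence samples (max probability above
    conf_thr) and, in addition, high-entropy 'sensitive' samples
    (entropy above ent_thr) that threshold-only schemes would discard."""
    conf_mask = probs.max(axis=1) >= conf_thr      # low-entropy, confident
    sens_mask = entropy(probs) >= ent_thr          # high-entropy, sensitive
    keep = conf_mask | sens_mask
    pseudo_labels = probs.argmax(axis=1)           # hard pseudo-labels
    return pseudo_labels[keep], keep
```

Under this toy rule, a sharply peaked prediction is kept via the confidence branch, a near-uniform prediction via the entropy branch, and a moderately peaked prediction satisfying neither criterion is dropped.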