Paper in Workshop: LatinX in Computer Vision Research Workshop
Slot Attention-based Feature Filtering for Few-Shot Learning
Javier Ródenas · Petia Radeva · Eduardo Aguilar Torres
Irrelevant features can significantly degrade few-shot learning performance. Few-shot learning must match query and support images based on meaningful similarities despite the limited data; in this process, non-relevant features such as background elements can easily cause confusion and misclassification. To address this issue, we propose Slot Attention-based Feature Filtering for Few-Shot Learning (SAFF), which leverages slot attention to discriminate and filter out weak features, thereby improving few-shot classification performance. The key innovation of SAFF lies in its integration of slot attention with patch embeddings, unifying class-aware slots into a single attention mechanism that filters irrelevant features effectively. We introduce a similarity matrix, computed across support and query images, to quantify the relevance of the filtered embeddings for classification. Through experiments, we demonstrate that slot attention captures discriminative features while reducing irrelevant information more effectively than other attention mechanisms. We validate our approach through extensive experiments on few-shot learning benchmarks (CIFAR-FS, FC100, miniImageNet and tieredImageNet), outperforming several state-of-the-art methods.
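The abstract does not give the exact formulation, but the core idea (slots compete for patch embeddings, weakly attended patches are filtered out, and the surviving embeddings are compared across query and support images) can be sketched roughly as follows. This is a minimal illustrative sketch, not the authors' method: the simplified slot-attention update follows the general style of Locatello et al., and the function names, the median-based filtering threshold, and the single cosine score are all assumptions made here for illustration.

```python
import numpy as np

def softmax(x, axis):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def slot_attention(patches, slots, n_iters=3):
    """Simplified slot-attention pass: slots compete for each patch via a
    softmax over slots, then each slot is updated as the attention-weighted
    mean of the patches assigned to it."""
    for _ in range(n_iters):
        logits = patches @ slots.T / np.sqrt(patches.shape[1])  # (P, S)
        attn = softmax(logits, axis=1)                # slots compete per patch
        weights = attn / (attn.sum(axis=0, keepdims=True) + 1e-8)
        slots = weights.T @ patches                   # weighted-mean update
    return slots, attn

def filtered_similarity(query_patches, support_patches, n_slots=2, rng=None):
    """Hypothetical SAFF-style score: fit slots on the support patches,
    keep only query patches that attend strongly to some slot (filtering
    weak/background patches), then compare mean embeddings by cosine
    similarity. Threshold choice (median relevance) is an assumption."""
    rng = rng if rng is not None else np.random.default_rng(0)
    init = rng.normal(size=(n_slots, support_patches.shape[1]))
    slots, _ = slot_attention(support_patches, init)
    relevance = (query_patches @ slots.T).max(axis=1)  # per-patch relevance
    keep = relevance >= np.median(relevance)           # drop weak patches
    q = query_patches[keep].mean(axis=0)
    s = support_patches.mean(axis=0)
    return float(q @ s / (np.linalg.norm(q) * np.linalg.norm(s) + 1e-8))
```

In a real few-shot pipeline the score would be computed for every (query, support-class) pair, forming the similarity matrix the abstract refers to, with the class of the highest-scoring support set taken as the prediction.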