

ZePT: Zero-Shot Pan-Tumor Segmentation via Query-Disentangling and Self-Prompting

Yankai Jiang · Zhongzhen Huang · Rongzhao Zhang · Xiaofan Zhang · Shaoting Zhang

Arch 4A-E Poster #167
Thu 20 Jun 10:30 a.m. PDT — noon PDT


The long-tailed distribution problem in medical image analysis reflects a high prevalence of common conditions and a low prevalence of rare ones, which poses a significant challenge to developing a unified model capable of identifying rare or novel tumor categories not encountered during training. In this paper, we propose a new Zero-shot Pan-Tumor segmentation framework (ZePT) based on query-disentangling and self-prompting to segment unseen tumor categories beyond the training set. ZePT disentangles the object queries into two subsets and trains them in two stages. First, it learns a set of fundamental queries for organ segmentation through an object-aware feature grouping strategy that gathers organ-level visual features. It then refines a second set of advanced queries that attend to auto-generated visual prompts for unseen tumor segmentation. Moreover, we introduce query-knowledge alignment at the feature level to enhance each query's discriminative representation and generalizability. Extensive experiments on various tumor segmentation tasks demonstrate that ZePT surpasses previous counterparts and shows promising zero-shot tumor segmentation ability in real-world settings.
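The query-disentangling idea described above can be sketched in a few lines. The following is a minimal, hypothetical illustration (not the authors' code): a learnable query set is split into fundamental organ queries and advanced tumor queries; stage 1 groups pixel features into organ queries via soft assignment, and stage 2 derives a crude visual prompt from the stage-1 output to condition the tumor queries. All names, sizes, and the prompt-update rule are illustrative assumptions.

```python
# Illustrative sketch of query disentangling and self-prompting
# (hypothetical names/shapes; not the actual ZePT implementation).
import numpy as np

rng = np.random.default_rng(0)
NUM_ORGAN, NUM_TUMOR, DIM = 25, 20, 64   # assumed query counts / width

# One query set, disentangled into two subsets.
queries = rng.standard_normal((NUM_ORGAN + NUM_TUMOR, DIM))
organ_q, tumor_q = queries[:NUM_ORGAN], queries[NUM_ORGAN:]

def group_features(q, feats):
    """Object-aware grouping: softly assign each pixel feature to a query."""
    logits = q @ feats.T                              # (Q, P) similarities
    logits -= logits.max(axis=0, keepdims=True)       # numerical stability
    attn = np.exp(logits) / np.exp(logits).sum(axis=0, keepdims=True)
    return attn @ feats                               # (Q, D) grouped features

feats = rng.standard_normal((128, DIM))               # P pixel/voxel features

# Stage 1: fundamental queries gather organ-level visual features.
organ_feats = group_features(organ_q, feats)

# Stage 2 (self-prompting, heavily simplified): build a visual prompt
# from stage-1 outputs and use it to refine the advanced tumor queries.
prompt = organ_feats.mean(axis=0, keepdims=True)      # stand-in prompt
tumor_refined = tumor_q + 0.1 * prompt                # prompt-conditioned update
```

Here `tumor_refined` has shape `(20, 64)`; in the actual framework the refinement would be a learned cross-attention over localized prompts rather than a fixed additive update.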
