

Poster

Non-Exemplar Domain Incremental Learning via Cross-Domain Concept Integration

Qiang Wang · Yuhang He · Songlin Dong · Xinyuan Gao · Shaokun Wang · Yihong Gong

Strong blind review: This paper was not made available on public preprint services during the review process.
Thu 3 Oct 1:30 a.m. PDT — 3:30 a.m. PDT

Abstract:

Existing approaches to Domain Incremental Learning (DIL) address catastrophic forgetting by storing and rehearsing exemplars from old domains. However, exemplar-based solutions are not always viable due to data privacy concerns or storage limitations. Non-Exemplar Domain Incremental Learning (NEDIL) has therefore emerged as a significant paradigm for addressing DIL challenges. Current NEDIL solutions extend the classifier incrementally for new domains to learn new knowledge, but unrestricted extension within the same feature space leads to inter-class confusion. To tackle this issue, we propose a simple yet effective method based on cross-domain concePt INtegrAtion (PINA). We train a Unified Classifier (UC) as a concept container shared across all domains. A Domain Specific Alignment (DSA) module is then trained for each incremental domain, aligning its feature distribution with that of the base domain. During inference, we introduce a Patch Shuffle Selector (PSS) that selects the appropriate DSA parameters for each test image. The proposed patch-shuffling technique disrupts class-dependent information, outperforming domain selectors based on K-Nearest Neighbors or the Nearest Mean Classifier. Extensive experiments demonstrate that our method achieves state-of-the-art performance while reducing the number of additional parameters. The source code will be released at http://XXX.XXX.XX.
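The intuition behind the PSS is that randomly permuting image patches destroys the global spatial layout that class labels depend on while preserving local statistics (texture, color) that characterize a domain, so the shuffled image can be routed to the appropriate DSA module. Below is a minimal sketch of such a patch-shuffling step in PyTorch; the function name, tensor layout, and patch size are illustrative assumptions, not the authors' released implementation.

```python
import torch

def patch_shuffle(images: torch.Tensor, patch_size: int = 16) -> torch.Tensor:
    """Randomly permute non-overlapping patches of each image in a batch.

    Hypothetical helper: images are (B, C, H, W) with H and W divisible
    by patch_size. Shuffling removes global, class-dependent structure
    while keeping local, domain-dependent statistics intact.
    """
    b, c, h, w = images.shape
    gh, gw = h // patch_size, w // patch_size
    # Split each image into a grid of patches: (B, C, H, W) -> (B, gh*gw, C, p, p).
    patches = images.reshape(b, c, gh, patch_size, gw, patch_size)
    patches = patches.permute(0, 2, 4, 1, 3, 5).reshape(
        b, gh * gw, c, patch_size, patch_size
    )
    # Independent random permutation of patch positions per image.
    perm = torch.argsort(torch.rand(b, gh * gw, device=images.device), dim=1)
    patches = patches[torch.arange(b, device=images.device).unsqueeze(1), perm]
    # Reassemble the shuffled patches back into image layout.
    patches = patches.reshape(b, gh, gw, c, patch_size, patch_size)
    return patches.permute(0, 3, 1, 4, 2, 5).reshape(b, c, h, w)

# Example: shuffle a dummy batch before feeding it to a domain selector.
x = torch.randn(4, 3, 224, 224)
x_shuffled = patch_shuffle(x, patch_size=16)
```

Under this reading, the selector operating on shuffled inputs sees domain cues but little class-specific structure, which is consistent with the abstract's claim that patch shuffling outperforms KNN- or NCM-based domain selection.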
