

Poster

Leveraging Frozen Batch Normalization for Co-Training in Source-Free Domain Adaptation

Rui Wang


Abstract:

Source-free domain adaptation (SFDA) aims to adapt a source model, initially trained on a fully labeled source domain, to an unlabeled target domain. Previous works assume that the statistics of the Batch Normalization (BN) layers in the source model capture domain-specific knowledge, and directly replace them with target-domain statistics during training. However, our observations indicate that source-like samples in the target data exhibit less deviation in the feature space of the source model when the source-domain statistics are preserved. In this paper, we propose co-training the source model with frozen Batch Normalization layers as part of the domain adaptation process. Specifically, we combine the source model and the target model to produce more robust pseudo-labels for global class clustering and to identify more precise neighbor samples for local neighbor clustering. Extensive experiments validate the effectiveness of our approach, showcasing its superiority over current state-of-the-art methods on three standard benchmarks. Our code is available at https://github.com/SJTU-dxw/BN-SFDA.
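To make the key idea concrete, the following is a minimal, illustrative sketch (not the paper's code) of the difference between frozen and adaptive BN statistics. A frozen BN layer normalizes target batches with its stored source-domain running statistics and never updates them, whereas a standard training-mode BN layer recomputes batch statistics and drifts toward the target distribution. All class and variable names here are hypothetical.

```python
import math

class BatchNorm1D:
    """Toy 1-D batch normalization (illustrative only).

    When `frozen` is True, the stored running statistics (standing in for
    source-domain statistics) are used for normalization and never updated,
    mimicking a source model whose BN layers are kept frozen during
    target adaptation.
    """
    def __init__(self, running_mean=0.0, running_var=1.0,
                 momentum=0.1, eps=1e-5, frozen=False):
        self.running_mean = running_mean
        self.running_var = running_var
        self.momentum = momentum
        self.eps = eps
        self.frozen = frozen

    def forward(self, batch):
        if self.frozen:
            # Normalize with the stored (source-domain) statistics;
            # do not update them.
            mean, var = self.running_mean, self.running_var
        else:
            # Standard training-mode BN: compute batch statistics and
            # update the running estimates with exponential averaging.
            mean = sum(batch) / len(batch)
            var = sum((x - mean) ** 2 for x in batch) / len(batch)
            self.running_mean = ((1 - self.momentum) * self.running_mean
                                 + self.momentum * mean)
            self.running_var = ((1 - self.momentum) * self.running_var
                                + self.momentum * var)
        return [(x - mean) / math.sqrt(var + self.eps) for x in batch]

# A target-domain batch whose distribution is shifted away from the
# source statistics (mean 0, variance 1).
target_batch = [4.0, 6.0, 5.0, 7.0]

frozen_bn = BatchNorm1D(running_mean=0.0, running_var=1.0, frozen=True)
adaptive_bn = BatchNorm1D(running_mean=0.0, running_var=1.0, frozen=False)

frozen_bn.forward(target_batch)
adaptive_bn.forward(target_batch)

print(frozen_bn.running_mean)    # unchanged: 0.0
print(adaptive_bn.running_mean)  # has moved toward the target batch mean
```

In a real framework such as PyTorch, the analogous operation is switching the BN modules of the source model to evaluation mode so their running statistics are used and left untouched, while the target model's BN layers continue to adapt.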
