Variance-Constrained Distribution Alignment in Few-Shot Models
Abstract
Learning generative models from limited samples remains challenging due to unstable estimation of class-conditional representations. Such instability often causes intra-class distribution drift and degraded generalization in few-sample regimes. To address these challenges, we propose a method that models class-level latent distributions for flexible and efficient few-shot synthesis. Specifically, each input is represented by a learnable conditional latent distribution. Metric-based statistical modeling effectively disentangles latent variables, contracts intra-class variance, and enlarges inter-class margins while enforcing cross-task distributional alignment. Furthermore, we provide a variance-based generalization analysis showing that controlling class-conditional variance tightens generalization bounds in few-sample regimes. Experiments on benchmark datasets demonstrate that our method surpasses prior work in visual quality and diversity, highlighting the benefit of statistical alignment for robust few-shot generative modeling.
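To make the abstract's "contract intra-class variance, enlarge inter-class margins" idea concrete, here is a minimal toy sketch of such an objective. This is our own illustrative construction, not the paper's actual loss: the function name `variance_margin_loss` and the `margin` and `alpha` parameters are assumptions introduced only for this example.

```python
import numpy as np

def variance_margin_loss(z, labels, margin=2.0, alpha=1.0):
    """Toy objective: penalize intra-class spread and reward
    inter-class centroid separation beyond a margin.

    z:      (N, D) array of latent codes.
    labels: (N,) integer class ids.
    Hypothetical illustration; not the paper's exact formulation.
    """
    classes = np.unique(labels)
    means = np.stack([z[labels == c].mean(axis=0) for c in classes])

    # Intra-class term: mean squared distance to each class centroid.
    intra = np.mean([
        np.mean(np.sum((z[labels == c] - means[i]) ** 2, axis=1))
        for i, c in enumerate(classes)
    ])

    # Inter-class term: hinge penalty when centroids lie closer than `margin`.
    inter, pairs = 0.0, 0
    for i in range(len(classes)):
        for j in range(i + 1, len(classes)):
            dist = np.linalg.norm(means[i] - means[j])
            inter += max(0.0, margin - dist) ** 2
            pairs += 1
    inter /= max(pairs, 1)

    return intra + alpha * inter
```

Minimizing this pushes samples toward their class centroid (variance contraction) while pushing centroids apart until they exceed the margin, after which the inter-class term vanishes.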