TkCP: Context-Aware Pooling via Top-$k$\% Activation Selection
Seo-Yeon Choi · Kyungsu Lee
Abstract
Pooling is a core operation in convolutional neural networks (CNNs), enabling spatial reduction and hierarchical abstraction. However, standard methods such as max and average pooling operate locally and often fail to capture global context, leading to under- or over-estimation of features. This limits performance on tasks requiring both fine localization and holistic understanding. To address this, we propose Top-$k$\% Contextual Pooling (TkCP), a framework that preserves informative activations based on contextual importance. TkCP includes two variants: (1) Sparse Contextual Pooling, which selects the top-$k$\% of activations within each local window, and (2) Global Contextual Pooling, which selects the top-$k$\% of activations across the entire feature map. Given a kernel size and target resolution, TkCP deterministically sets the stride and reconstructs outputs without additional parameters. Experiments across classification, detection, tracking, segmentation, and generation show consistent improvements in accuracy and robustness. Additionally, TkCP enhances interpretability by tracing high-activation regions across layers.
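The abstract does not specify implementation details, but the two variants it names can be illustrated with a minimal NumPy sketch. The function names, the choice of averaging the retained top-$k$\% activations in the sparse variant, and the zero-masking in the global variant are all assumptions made for illustration, not the authors' published method.

```python
import numpy as np

def sparse_contextual_pool(x, k_frac, kernel=2, stride=2):
    """Illustrative sketch of Sparse Contextual Pooling: within each
    kernel x kernel window, keep the top-k% largest activations and
    average them (the averaging step is an assumption)."""
    H, W = x.shape
    out_h = (H - kernel) // stride + 1
    out_w = (W - kernel) // stride + 1
    out = np.zeros((out_h, out_w), dtype=x.dtype)
    # Number of activations to retain per window (at least one).
    n_keep = max(1, int(np.ceil(k_frac * kernel * kernel)))
    for i in range(out_h):
        for j in range(out_w):
            win = x[i*stride:i*stride+kernel,
                    j*stride:j*stride+kernel].ravel()
            out[i, j] = np.sort(win)[-n_keep:].mean()
    return out

def global_contextual_mask(x, k_frac):
    """Illustrative sketch of Global Contextual Pooling: keep the
    top-k% activations across the whole feature map, zero the rest."""
    thresh = np.quantile(x, 1.0 - k_frac)
    return np.where(x >= thresh, x, 0.0)
```

For example, on a 4x4 map of the values 0..15 with `k_frac=0.5` and a 2x2 kernel, each window retains its two largest activations, so the top-left output cell averages 4 and 5 to give 4.5; the global variant with the same `k_frac` keeps the eight largest values.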