Scalable Subset Sampling with Neural Conditional Poisson Networks

Explaining text classification by a subset of words

Abstract

A number of problems in learning can be formulated in terms of the basic primitive of sampling k elements out of a universe of n elements. This subset sampling operation cannot be directly included in differentiable models, so approximations are essential. Current approaches are based on order sampling and depend on differentiable approximations of the Top-k operator for selecting the largest k elements from a set. We present a simple alternative method for sampling subsets based on conditional Poisson sampling. Unlike order sampling approaches, the parallel complexity of the proposed method is independent of the subset size, which makes the method scalable to large subset sizes. We adapt the procedure to make it efficient and amenable to discrete gradient approximations for use in differentiable models. Furthermore, the method allows the subset size parameter k itself to be differentiable. We demonstrate our approach on model explanation, image sub-sampling, and stochastic k-nearest neighbor tasks, outperforming existing methods in accuracy, efficiency, and scalability.
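To make the underlying primitive concrete, the sketch below shows plain conditional Poisson sampling by rejection: each element is included independently with its own Bernoulli probability, and a draw is kept only if exactly k elements are selected. This is a minimal illustration of the basic design only, not the efficient, differentiable procedure developed in the paper; the odds-based weight parameterization and the function name are assumptions made for the example.

```python
import numpy as np

def conditional_poisson_sample(weights, k, rng=None, max_tries=10_000):
    """Draw a size-k subset of {0, ..., n-1} by conditional Poisson sampling.

    Each element i is included independently with probability
    p_i = weights[i] / (1 + weights[i]) (an assumed odds parameterization),
    and a draw is accepted only if exactly k elements are selected.
    Rejection sampling is used purely for illustration.
    """
    rng = np.random.default_rng() if rng is None else rng
    w = np.asarray(weights, dtype=float)
    p = w / (1.0 + w)  # inclusion probabilities of the unconditional design
    for _ in range(max_tries):
        mask = rng.random(p.shape) < p  # independent Bernoulli draws
        if mask.sum() == k:             # condition on the subset size being k
            return np.flatnonzero(mask)
    raise RuntimeError("rejection sampling failed; increase max_tries")

# Example: sample 3 of 10 elements, with larger weights favoring later elements.
subset = conditional_poisson_sample(np.linspace(0.1, 2.0, 10), k=3)
print(subset)
```

The rejection loop makes the conditioning explicit but is exponentially wasteful when k is far from the expected Bernoulli count; the paper's contribution is an efficient, differentiable realization of this design.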

Publication
International Conference on Learning Representations (ICLR 2023)