

Poster

Spectral Pruning for Recurrent Neural Networks

Takashi Furuya · Kazuma Suetake · Koichi Taniguchi · Hiroyuki Kusumoto · Ryuji Saiin · Tomohiro Daimon


Abstract:

Recurrent neural networks (RNNs) are a class of neural networks used for sequential tasks. In general, however, RNNs have a large number of parameters and incur enormous computational costs because the recurrent structure is applied repeatedly over many time steps. RNN pruning has attracted increasing attention in recent years as a way to overcome this difficulty, since the resulting reduction in computational cost compounds as the number of time steps grows. However, most existing RNN pruning methods are heuristic. The purpose of this paper is to study a theoretically grounded pruning method for RNNs. We propose a pruning algorithm for RNNs inspired by "spectral pruning", and provide generalization error bounds for the compressed RNNs. We also present numerical experiments that demonstrate our theoretical results and show the effectiveness of our pruning method compared with existing methods.
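Since the paper itself is not reproduced here, the following is only a minimal sketch of the idea behind spectral pruning as applied to a vanilla (Elman) RNN: hidden units are scored via the empirical covariance of the hidden states, a ridge-regularized linear map reconstructs the full state from the kept units, and that map is folded back into the weights. The function name `spectral_prune_rnn`, the greedy subset selection, and all hyperparameters are illustrative assumptions, not the authors' exact algorithm.

```python
import numpy as np

def spectral_prune_rnn(W_hh, W_xh, W_out, hidden_states, k, lam=1e-3):
    """Compress a vanilla RNN from d hidden units to k.

    W_hh: (d, d) recurrent weights; W_xh: (d, n_in) input weights;
    W_out: (n_out, d) readout weights; hidden_states: (T, d) activations
    collected from forward passes on training data.
    """
    d = W_hh.shape[0]
    # Empirical (uncentered) covariance of the hidden activations.
    Sigma = hidden_states.T @ hidden_states / hidden_states.shape[0]

    # Greedily pick the index set J whose ridge-regularized linear
    # reconstruction of the full hidden state leaves the least residual.
    J, remaining = [], list(range(d))
    for _ in range(k):
        best_j, best_score = None, np.inf
        for j in remaining:
            idx = J + [j]
            S_JJ = Sigma[np.ix_(idx, idx)]
            S_FJ = Sigma[:, idx]
            A = np.linalg.solve(S_JJ + lam * np.eye(len(idx)), S_FJ.T).T
            score = np.trace(Sigma - A @ S_FJ.T)  # residual variance
            if score < best_score:
                best_j, best_score = j, score
        J.append(best_j)
        remaining.remove(best_j)

    # Final reconstruction map: h ≈ A @ h[J], fit by ridge regression
    # on the covariance statistics.
    S_JJ = Sigma[np.ix_(J, J)]
    S_FJ = Sigma[:, J]
    A = np.linalg.solve(S_JJ + lam * np.eye(k), S_FJ.T).T  # (d, k)

    # Fold the reconstruction map into the weights so the compressed
    # network runs entirely in the k-dimensional hidden space.
    W_hh_c = W_hh[J, :] @ A   # (k, k) recurrent weights
    W_xh_c = W_xh[J, :]       # (k, n_in) input weights
    W_out_c = W_out @ A       # (n_out, k) readout weights
    return W_hh_c, W_xh_c, W_out_c, J

# Example: compress a 64-unit RNN to 16 units using states from a rollout.
rng = np.random.default_rng(0)
d, n_in, n_out, T = 64, 8, 4, 500
W_hh = 0.1 * rng.standard_normal((d, d))
W_xh = rng.standard_normal((d, n_in))
W_out = rng.standard_normal((n_out, d))
h, states = np.zeros(d), []
for _ in range(T):
    h = np.tanh(W_hh @ h + W_xh @ rng.standard_normal(n_in))
    states.append(h)
W_hh_c, W_xh_c, W_out_c, J = spectral_prune_rnn(
    W_hh, W_xh, W_out, np.array(states), k=16)
```

Because the recurrent matrix shrinks from d x d to k x k, the per-step cost of the compressed network drops quadratically in the kept width, which is why the savings compound over long sequences as the abstract notes.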
