Tag: gated recurrent unit

Research

Exploring Sparsity in Recurrent Neural Networks

To deploy Recurrent Neural Networks (RNNs) efficiently, we propose a technique that reduces a network's parameter count by pruning weights during the initial training of the network.
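As an illustrative sketch only (not the authors' implementation), pruning weights during training typically means zeroing out low-magnitude weights against a threshold that grows as training progresses. The function name and toy schedule below are assumptions for demonstration:

```python
import numpy as np

def prune_weights(weights, threshold):
    """Zero out weights whose magnitude falls below the threshold.

    Returns the pruned weights and the binary keep-mask.
    """
    mask = (np.abs(weights) >= threshold).astype(weights.dtype)
    return weights * mask, mask

if __name__ == "__main__":
    # Toy example: raise the threshold over "training" steps, so
    # progressively more weights are pruned away.
    rng = np.random.default_rng(0)
    W = rng.normal(size=(8, 8))

    for step, threshold in enumerate([0.0, 0.5, 1.0, 1.5]):
        W, mask = prune_weights(W, threshold)
        sparsity = 1.0 - mask.mean()
        print(f"step {step}: threshold={threshold:.1f}, sparsity={sparsity:.2f}")
```

In practice the mask would be reapplied after each gradient update so that pruned weights stay zero; the schedule controlling the threshold is the main design choice.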