Neural network pruning effectively reduces the computation and complexity of neural networks, but the pruning scope is difficult to define. Existing algorithms polarize the scale factors, select a threshold, and then determine the pruning range. However, many candidate points remain when the threshold is selected, and it is difficult to identify the optimal one among them. To address this problem, this paper proposes an improved neural network pruning algorithm based on contrastive loss and simulated annealing to determine the threshold. First, multiple networks are trained on different training data drawn from the same dataset, so that the varied data improve the generalization ability of the networks. Then, each network chooses its own pruning threshold at the same layer: the stationary points close to zero among the candidate points are taken as the candidate set of pruning thresholds, and different thresholds in this range yield corresponding pruned structures. Finally, these structures are compared, and the pruned structure with the smallest difference between the network structures is kept. Experiments on the CIFAR-10 and MNIST datasets show that the proposed method outperforms existing algorithms in accuracy, pruning rate, model storage, and running time.
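To make the threshold-selection step more concrete, the following is a minimal NumPy sketch of the idea, not the authors' implementation: it assumes the scale factors are batch-normalization scaling factors of one layer from several networks trained on different splits of the same dataset, approximates "stationary points close to zero" as local minima of a histogram of the factors, measures the difference between pruned structures as a pairwise Hamming distance, and uses a toy simulated-annealing loop to pick one candidate threshold per network. All function names, parameters, and the toy data are illustrative assumptions.

```python
# Illustrative sketch only; helper names, the histogram-based stationary-point
# search, and the Hamming-distance structure comparison are assumptions.
import numpy as np

def candidate_thresholds(scales, bins=50):
    """Scale values at stationary points (local minima of the histogram)
    close to zero, used as the candidate set of pruning thresholds."""
    hist, edges = np.histogram(np.abs(scales), bins=bins)
    centers = 0.5 * (edges[:-1] + edges[1:])
    # A bin is a stationary point if its count is no larger than both neighbours.
    minima = [centers[i] for i in range(1, bins - 1)
              if hist[i] <= hist[i - 1] and hist[i] <= hist[i + 1]]
    if not minima:                      # fall back to the smallest bin centre
        minima = [centers[0]]
    minima = sorted(minima)
    # Keep only the stationary points closest to zero (assumed: lowest quartile).
    return minima[:max(1, len(minima) // 4)]

def prune_mask(scales, threshold):
    """Binary pruning structure: 1 = channel kept, 0 = channel pruned."""
    return (np.abs(scales) > threshold).astype(int)

def structure_difference(masks):
    """Pairwise Hamming distance between the pruned structures of all networks."""
    return sum(int(np.sum(masks[i] != masks[j]))
               for i in range(len(masks)) for j in range(i + 1, len(masks)))

def select_thresholds(scale_sets, steps=500, temp=1.0, cooling=0.99, seed=0):
    """Simulated-annealing search over one candidate threshold per network,
    minimising the difference between the resulting pruned structures."""
    rng = np.random.default_rng(seed)
    candidates = [candidate_thresholds(s) for s in scale_sets]
    choice = [rng.integers(len(c)) for c in candidates]   # random start
    cost = structure_difference(
        [prune_mask(s, c[i]) for s, c, i in zip(scale_sets, candidates, choice)])
    for _ in range(steps):
        new = choice.copy()
        k = rng.integers(len(new))
        new[k] = rng.integers(len(candidates[k]))          # perturb one network
        new_cost = structure_difference(
            [prune_mask(s, c[i]) for s, c, i in zip(scale_sets, candidates, new)])
        # Accept better moves always, worse moves with annealing probability.
        if new_cost <= cost or rng.random() < np.exp((cost - new_cost) / temp):
            choice, cost = new, new_cost
        temp *= cooling
    return [candidates[k][i] for k, i in enumerate(choice)], cost

if __name__ == "__main__":
    # Toy data: scale factors of the same layer from three networks (assumed).
    rng = np.random.default_rng(1)
    scale_sets = [np.abs(rng.normal(0.0, 0.5, size=256)) for _ in range(3)]
    thresholds, diff = select_thresholds(scale_sets)
    print("per-network thresholds:", thresholds, "structure difference:", diff)
```

In this sketch the annealing search plays the role of choosing among the many candidate thresholds, while the structure-difference objective stands in for the contrastive comparison between pruned networks described in the abstract.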