As the performance of convolutional neural networks (CNNs) has increased, so have their storage and power-consumption costs. Among the methods proposed in the literature, filter pruning is a crucial technique for constructing lightweight networks. However, current filter pruning methods are still hampered by complicated pipelines and training inefficiency. This paper proposes an effective filter pruning method that uses the saliency of the feature map (SFM), i.e., its information entropy, as a theoretical guide to whether a filter is essential. The pruning principle used here is that a filter whose feature map shows weak saliency in the early training stage will not contribute significantly to the final accuracy. Thus, one can efficiently prune non-salient feature maps with low information entropy, together with their corresponding filters. In addition, an over-parameterized convolution method is employed to improve the pruned model's accuracy without adding parameters at inference time. Experimental results show that, without introducing any additional constraints, this method advances the state-of-the-art in FLOPs and parameter reduction at similar accuracy. For example, on CIFAR-10, the pruned VGG-16 loses only 0.39% Top-1 accuracy while reducing parameters by 83.3% and FLOPs by 66.7%. On ImageNet-100, the pruned ResNet-50 loses only 0.76% Top-1 accuracy while reducing parameters by 61.19% and FLOPs by 62.98%.
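To make the saliency criterion concrete, the sketch below scores each channel of a convolutional layer's feature maps by the Shannon entropy of its activations and selects the lowest-entropy filters as pruning candidates. This is a minimal illustration in PyTorch, not the paper's exact procedure: the histogram-based entropy estimator and the `num_bins` and `prune_ratio` parameters are illustrative assumptions.

```python
# Minimal sketch of entropy-based filter saliency (assumed estimator, not
# the paper's exact formulation). Lower entropy = less salient channel.
import torch

def feature_map_entropy(fmap: torch.Tensor, num_bins: int = 32) -> torch.Tensor:
    """Shannon entropy of each channel's activation distribution.

    fmap: (N, C, H, W) feature maps from one convolutional layer.
    Returns a (C,) tensor of per-channel entropies.
    """
    n, c, h, w = fmap.shape
    entropy = torch.zeros(c)
    for ch in range(c):
        values = fmap[:, ch].flatten()
        # Histogram of activation values for this channel (assumed 32 bins).
        hist = torch.histc(values, bins=num_bins,
                           min=values.min().item(), max=values.max().item())
        p = hist / hist.sum()          # empirical bin probabilities
        p = p[p > 0]                   # drop empty bins (0 * log 0 := 0)
        entropy[ch] = -(p * p.log()).sum()
    return entropy

def select_filters_to_prune(fmap: torch.Tensor, prune_ratio: float = 0.5) -> torch.Tensor:
    """Indices of the filters whose feature maps have the lowest entropy."""
    ent = feature_map_entropy(fmap)
    k = int(prune_ratio * ent.numel())
    return torch.argsort(ent)[:k]      # least-salient filters first
```

In line with the abstract's pruning principle, such scores would be gathered early in training, and the filters producing the lowest-entropy feature maps would be removed before training continues.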