Recent research has shown that attention mechanisms can help convolutional networks train and infer more efficiently and accurately. However, current attention mechanisms mainly model relationships among global features and ignore relationships among local features. This paper proposes a feature filtering module (FFM) for convolutional neural networks. FFM applies attention in both the spatial and channel dimensions and fuses the attention feature maps of the two branches into a single 3D attention feature map, helping informative features flow through the network more effectively. Extensive experiments on CIFAR-100 and MS COCO show that FFM improves baseline performance across a variety of models and tasks, demonstrating its versatility.
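As a rough illustration of the idea described above, the following NumPy sketch computes a per-channel attention vector and a per-position spatial attention map from the same input, then broadcasts them together into a 3D attention map that reweights the features. This is a minimal sketch under assumed design choices (global average pooling for the channel branch, a channel-wise mean for the spatial branch, sigmoid gating), not the paper's actual FFM, which presumably uses learned layers in each branch.

```python
import numpy as np

def sigmoid(x):
    return 1.0 / (1.0 + np.exp(-x))

def ffm_sketch(x):
    """Hypothetical feature-filtering step on a feature map x of shape (C, H, W)."""
    # Channel branch: global average pooling -> per-channel gate, shape (C, 1, 1).
    channel_attn = sigmoid(x.mean(axis=(1, 2), keepdims=True))
    # Spatial branch: mean over channels -> per-position gate, shape (1, H, W).
    spatial_attn = sigmoid(x.mean(axis=0, keepdims=True))
    # Fuse the two branches by broadcasting into a 3D attention map (C, H, W).
    attn_3d = channel_attn * spatial_attn
    # Filter the features: element-wise reweighting by the fused attention map.
    return x * attn_3d
```

Fusing into a 3D map means every (channel, position) pair gets its own weight, rather than a single weight per channel or per position as in a purely global scheme.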