Top-down neural attention by excitation backprop

Files
1608.00507.pdf (9.37 MB)
Accepted manuscript
Date
2016
Authors
Zhang, Jianming
Lin, Zhe
Brandt, Jonathan
Shen, Xiaohui
Sclaroff, Stan
Bargal, Sarah Adel
Version
Accepted manuscript
Citation
J Zhang, Z Lin, J Brandt, X Shen, S Sclaroff. 2016. "Top-down neural attention by excitation backprop." European Conference on Computer Vision.
Abstract
We aim to model the top-down attention of a Convolutional Neural Network (CNN) classifier for generating task-specific attention maps. Inspired by a top-down human visual attention model, we propose a new backpropagation scheme, called Excitation Backprop, to pass top-down signals downward through the network hierarchy via a probabilistic Winner-Take-All process. Furthermore, we introduce the concept of contrastive attention to make the top-down attention maps more discriminative. In experiments, we demonstrate the accuracy and generalizability of our method in weakly supervised localization tasks on the MS COCO, PASCAL VOC07, and ImageNet datasets. The usefulness of our method is further validated in the text-to-region association task. On the Flickr30k Entities dataset, we achieve promising performance in phrase localization by leveraging the top-down attention of a CNN model trained on weakly labeled web images.
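The probabilistic Winner-Take-All process described in the abstract can be illustrated for a single fully connected layer. The sketch below is an assumption-laden illustration, not the authors' implementation: it takes a distribution of "winning probabilities" over a layer's output units and redistributes it to the input units in proportion to each input's excitatory (positive-weighted) contribution, which is the core recurrence of Excitation Backprop as commonly summarized. The function name and argument conventions are hypothetical.

```python
import numpy as np

def excitation_backprop_layer(p_top, activations, weights):
    """Propagate winning probabilities through one dense layer via a
    probabilistic Winner-Take-All rule (illustrative sketch).

    p_top       -- probabilities over output (parent) units, shape (n_out,)
    activations -- input (child) activations, shape (n_in,), assumed >= 0
    weights     -- weight matrix, shape (n_out, n_in)
    """
    w_pos = np.maximum(weights, 0.0)            # keep only excitatory connections
    # score of child j competing for parent i: a_j * w_ij^+
    scores = w_pos * activations[None, :]       # shape (n_out, n_in)
    norm = scores.sum(axis=1, keepdims=True)
    norm[norm == 0.0] = 1.0                     # guard against all-zero rows
    cond = scores / norm                        # P(child j wins | parent i)
    return cond.T @ p_top                       # marginalize over parents
```

Applying this rule layer by layer from a chosen output class down to an early layer yields a class-specific attention map; because each step only renormalizes and redistributes probability mass, the total probability is conserved (provided every parent has at least one excitatory input).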