Deep Learning Based Automatic Multiclass Wild Pest Monitoring Approach Using Hybrid Global and Local Activated Features
DOI: 10.1109/TII.2020.2995208
Journal: IEEE Transactions on Industrial Informatics
Keywords: Deep Learning, Object Detection, Computer Application
Abstract: Specialized control of pests and diseases has been a high-priority issue for the agriculture industry in many countries. Owing to their automation and cost-effectiveness, image-based pest recognition systems are widely used in practical crop protection applications. However, because they rely on weak hand-crafted features, current image analysis approaches achieve low accuracy and poor robustness in practical large-scale multiclass pest detection and recognition. To tackle this problem, this article proposes a novel deep learning based automatic approach using hybrid global and local activated features for pest monitoring. In the presented method, we exploit the global information from feature maps to build our global activated feature pyramid network, which extracts highly discriminative pest features across scales at both the depth and position levels. This makes changes in depth- or spatially sensitive features in pest images more visible during downsampling. Next, an improved pest localization module, the local activated region proposal network, is proposed to find precise pest object positions by augmenting contextual and attentional information for feature completion and enhancement at the local level. The approach is evaluated on our seven-year large-scale pest dataset containing 88.6K images (16 types of pests) with 582.1K manually labeled pest objects. The experimental results show that our solution achieves over 75.03% mean average precision (mAP) in industrial circumstances, outperforming two other state-of-the-art methods: Faster R-CNN with mAP up to 70% and the feature pyramid network with mAP up to 72%.
Co-authors: Rui Li, Sud Sudirman, Po Yang, Jie Zhang, Chengjun Xie, Fangyuan Wang
First Author: Liu Liu
Paper Type: Journal Article
Corresponding Author: Rujing Wang
Volume: 17
Issue: 11
Pages: 7589-7598
Translation: No
Publication Date: 2020-05-20
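
The abstract above describes two building blocks: a globally activated feature pyramid level and an attention-enhanced ("local activated") region proposal head. The following is a minimal PyTorch sketch of how such components are commonly realized; it is an illustration under assumptions, not the authors' implementation, and the class names (GlobalActivation, LocalActivatedRPNHead) and layer choices are hypothetical.

# Sketch only: a channel-gating "global activation" applied to one FPN level,
# and an RPN head augmented with dilated context and a spatial attention map.
# All names and design details are assumptions, not taken from the paper.
import torch
import torch.nn as nn

class GlobalActivation(nn.Module):
    """Re-weights channels using globally pooled information (assumed design)."""
    def __init__(self, channels: int, reduction: int = 16):
        super().__init__()
        self.pool = nn.AdaptiveAvgPool2d(1)  # squeeze spatial dimensions
        self.fc = nn.Sequential(
            nn.Conv2d(channels, channels // reduction, kernel_size=1),
            nn.ReLU(inplace=True),
            nn.Conv2d(channels // reduction, channels, kernel_size=1),
            nn.Sigmoid(),
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        return x * self.fc(self.pool(x))  # globally informed channel gate

class LocalActivatedRPNHead(nn.Module):
    """RPN head with dilated context and spatial attention for local enhancement."""
    def __init__(self, in_channels: int, num_anchors: int = 9):
        super().__init__()
        self.context = nn.Conv2d(in_channels, in_channels, 3, padding=2, dilation=2)
        self.attn = nn.Sequential(nn.Conv2d(in_channels, 1, kernel_size=1), nn.Sigmoid())
        self.cls = nn.Conv2d(in_channels, num_anchors, kernel_size=1)      # objectness scores
        self.reg = nn.Conv2d(in_channels, num_anchors * 4, kernel_size=1)  # box regression deltas

    def forward(self, x: torch.Tensor):
        feat = torch.relu(self.context(x))
        feat = feat * self.attn(feat) + x  # attention-weighted features plus residual
        return self.cls(feat), self.reg(feat)

if __name__ == "__main__":
    level = torch.randn(1, 256, 64, 64)            # one hypothetical FPN level
    gated = GlobalActivation(256)(level)
    scores, deltas = LocalActivatedRPNHead(256)(gated)
    print(scores.shape, deltas.shape)              # (1, 9, 64, 64), (1, 36, 64, 64)

In this sketch the channel gate stands in for the global activation over depth, and the dilated convolution plus spatial attention stands in for the contextual and attentional augmentation in the region proposal stage; the paper's exact architecture should be consulted for the published design.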