
Arc welding molten pool image recognition based on attention mechanism and transfer learning

Abstract: During welding, complex time-varying disturbances and changing process conditions tend to blur the boundary features of the molten pool and make its scale information complex and variable, which poses significant challenges to accurate recognition and robust segmentation of the molten pool region. This paper proposes a molten pool image recognition method that combines an attention mechanism with transfer learning. First, residual blocks (RB) are added to the UNet down-sampling path to extract multi-scale low-level features, and coordinate attention blocks (CAB) are introduced in both the down-sampling and up-sampling paths to increase the feature weights of informative regions. Second, a deep convolutional neural network pre-trained on Pascal VOC2012 is transferred into the UNet to realize feature transfer and parameter sharing, alleviating the over-dependence of training performance on the dataset. The proposed TL-RCUNet network achieves good recognition results on MAG and TIG cross-process datasets that were not seen during training, with mean intersection over union (MIoU) of 96.21% and 79.55%, respectively, about 15% and 25% higher than classical semantic segmentation networks. The method provides a feasible solution to the problem that existing semantic segmentation approaches rely on large numbers of training samples and expert pixel-level annotation.
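
The two architectural additions described above are residual blocks in the encoder and coordinate attention in both paths. The following PyTorch code is a minimal sketch of one down-sampling stage under stated assumptions: it uses the standard two-convolution residual block and the common coordinate-attention formulation (directional average pooling along height and width). The channel widths, reduction ratio, and exact module placement in TL-RCUNet are assumptions, since the abstract does not specify them.

```python
import torch
import torch.nn as nn

class CoordinateAttention(nn.Module):
    """Coordinate attention block (CAB): pools along height and width
    separately, so the attention weights retain positional information
    in each direction rather than collapsing it as plain channel
    attention would."""
    def __init__(self, channels, reduction=16):  # reduction ratio is assumed
        super().__init__()
        mid = max(8, channels // reduction)
        self.pool_h = nn.AdaptiveAvgPool2d((None, 1))   # -> (B, C, H, 1)
        self.pool_w = nn.AdaptiveAvgPool2d((1, None))   # -> (B, C, 1, W)
        self.conv1 = nn.Conv2d(channels, mid, 1)
        self.bn = nn.BatchNorm2d(mid)
        self.act = nn.ReLU(inplace=True)
        self.conv_h = nn.Conv2d(mid, channels, 1)
        self.conv_w = nn.Conv2d(mid, channels, 1)

    def forward(self, x):
        _, _, h, w = x.shape
        xh = self.pool_h(x)                              # (B, C, H, 1)
        xw = self.pool_w(x).permute(0, 1, 3, 2)          # (B, C, W, 1)
        y = self.act(self.bn(self.conv1(torch.cat([xh, xw], dim=2))))
        yh, yw = torch.split(y, [h, w], dim=2)
        a_h = torch.sigmoid(self.conv_h(yh))                      # (B, C, H, 1)
        a_w = torch.sigmoid(self.conv_w(yw.permute(0, 1, 3, 2)))  # (B, C, 1, W)
        return x * a_h * a_w                # re-weight the effective regions

class ResidualBlock(nn.Module):
    """Residual block (RB): two 3x3 convolutions with a shortcut;
    a 1x1 convolution matches channel counts when they differ."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.body = nn.Sequential(
            nn.Conv2d(in_ch, out_ch, 3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch), nn.ReLU(inplace=True),
            nn.Conv2d(out_ch, out_ch, 3, padding=1, bias=False),
            nn.BatchNorm2d(out_ch))
        self.skip = (nn.Identity() if in_ch == out_ch
                     else nn.Conv2d(in_ch, out_ch, 1, bias=False))
        self.relu = nn.ReLU(inplace=True)

    def forward(self, x):
        return self.relu(self.body(x) + self.skip(x))

class EncoderStage(nn.Module):
    """One down-sampling stage: residual features, coordinate-attention
    re-weighting, then 2x2 max pooling; the pre-pool tensor is kept as
    the skip connection for the decoder."""
    def __init__(self, in_ch, out_ch):
        super().__init__()
        self.rb = ResidualBlock(in_ch, out_ch)
        self.cab = CoordinateAttention(out_ch)
        self.pool = nn.MaxPool2d(2)

    def forward(self, x):
        skip = self.cab(self.rb(x))
        return self.pool(skip), skip
```

For example, `EncoderStage(1, 64)` applied to a `(2, 1, 256, 256)` batch of grayscale molten pool images returns a `(2, 64, 128, 128)` down-sampled tensor plus a `(2, 64, 256, 256)` skip tensor; per the abstract, the same CAB re-weighting would also be applied on the up-sampling path.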

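The transfer-learning step copies weights from a network pre-trained on Pascal VOC2012 into the UNet so that training does not depend solely on the welding dataset. The sketch below shows one common way to realize such feature transfer and parameter sharing; the checkpoint file name is hypothetical, and the shape-matching copy plus freezing strategy is an assumed generic approach, not necessarily the paper's exact procedure.

```python
import torch

def transfer_pretrained(model, checkpoint_path, freeze=True):
    """Copy parameters from a pre-trained checkpoint into `model`
    wherever names and tensor shapes match, then optionally freeze
    the transferred layers so only the new layers train at first.
    Assumes the checkpoint stores a plain state_dict."""
    pretrained = torch.load(checkpoint_path, map_location="cpu")
    own = model.state_dict()
    matched = {k: v for k, v in pretrained.items()
               if k in own and v.shape == own[k].shape}
    own.update(matched)
    model.load_state_dict(own)
    if freeze:
        for name, p in model.named_parameters():
            if name in matched:
                p.requires_grad = False
    return model

# Hypothetical usage; the model class and checkpoint name are assumptions:
# model = transfer_pretrained(TLRCUNet(), "voc2012_pretrained.pth")
```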
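
The reported metric, mean intersection over union, averages the per-class IoU computed from a confusion matrix. Below is a minimal NumPy sketch of the metric; the two-class background/molten-pool labelling in the comment is an assumption.

```python
import numpy as np

def mean_iou(pred, target, num_classes):
    """MIoU from a confusion matrix. `pred` and `target` are integer
    label maps of equal shape (e.g. 0 = background, 1 = molten pool)."""
    idx = target.astype(np.int64).ravel() * num_classes + pred.astype(np.int64).ravel()
    cm = np.bincount(idx, minlength=num_classes ** 2).reshape(num_classes, num_classes)
    inter = np.diag(cm)                    # correctly labelled pixels per class
    union = cm.sum(0) + cm.sum(1) - inter  # predicted + ground truth - overlap
    valid = union > 0                      # skip classes absent from both maps
    return float((inter[valid] / union[valid]).mean())
```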