
Journal of Graphics ›› 2024, Vol. 45 ›› Issue (6): 1313-1327. DOI: 10.11996/JG.j.2095-302X.2024061313

• Image Processing and Computer Vision •

Automatic reading of pointer meters based on R-YOLOv7 and MIMO-CTFNet

LI Shengtao, HOU Liqun, DONG Yasong

  1. Department of Automation, North China Electric Power University, Baoding, Hebei 071003, China
  • Received: 2024-07-23 Accepted: 2024-09-26 Online: 2024-12-31 Published: 2024-12-24
  • Contact: HOU Liqun
  • About author: LI Shengtao (1998-), master student. His main research interests cover image processing and deep learning. E-mail: lst18315889026@163.com

  • Supported by:
    Natural Science Foundation of Hebei (F2016502104)

Abstract:

To address the problems of current pointer meter reading methods, such as complicated reading procedures, large reading errors, and motion blur caused by camera shake, an automatic reading method based on R-YOLOv7 and MIMO-CTFNet (multi-input multi-output CNN-Transformer fusion network) was proposed. First, the R-YOLOv7 algorithm was constructed to balance accuracy and light weight when detecting the dial and its key information. Then, the MIMO-CTFNet algorithm was designed to restore motion-blurred meter images. Finally, the angle method, based on the extracted small scale marks, was used to compute the meter reading. The experimental results showed that on the dial key-information detection dataset, the parameters, FLOPs, ADT, and mAP50:95 were 12 M, 60.30 G, 17.04 ms, and 86.5%, respectively. The PSNR and SSIM of the improved MIMO-CTFNet algorithm reached 33.05 dB and 0.935 3, respectively. The maximum fiducial error of the proposed reading method was 0.35%, and the reading times for images requiring and not requiring motion-blur removal were 0.561 s and 0.128 s, respectively, validating the effectiveness of the proposed method.
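The angle method mentioned in the abstract can be sketched as follows. This is an illustrative reconstruction, not the paper's implementation: the dial center, the pointer tip, the zero-scale and full-scale tick positions, and the measuring range are all assumed to be available (in the paper they come from the key information detected by R-YOLOv7); the function names and default range are hypothetical.

```python
import math

def angle_method_reading(center, pointer_tip, zero_tick, full_tick,
                         range_min=0.0, range_max=1.6):
    """Estimate a pointer-meter reading by the angle method.

    Illustrative sketch only. All points are (x, y) pixel coordinates;
    range_min/range_max are the meter's measuring range (assumed values).
    """
    def angle(p):
        # Angle of point p around the dial center, in radians.
        # In image coordinates the y-axis points down, so atan2
        # increases in the clockwise direction.
        return math.atan2(p[1] - center[1], p[0] - center[0])

    a_zero = angle(zero_tick)
    a_full = angle(full_tick)
    a_ptr = angle(pointer_tip)

    # Clockwise sweep from the zero-scale tick, wrapped into [0, 2*pi).
    full_sweep = (a_full - a_zero) % (2 * math.pi)
    pointer_sweep = (a_ptr - a_zero) % (2 * math.pi)

    # Linear interpolation of the pointer's angular fraction over the range.
    return range_min + pointer_sweep / full_sweep * (range_max - range_min)
```

For a typical 270° dial with the zero tick at the lower left and the full-scale tick at the lower right, a pointer straight up sits at half the sweep and yields the mid-range value.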

Key words: pointer meters, R-YOLOv7, MIMO-CTFNet, automatic reading, lightweight
