Journal article

Study on image acquisition and camera positioning of depth recognition model in the tobacco curing stage

Abstract

To address the challenges that the digital transformation of the tobacco industry faces in standardizing data acquisition and building an Internet of Things (IoT) system for the tobacco leaf curing stages, deep neural networks are applied in this field to construct an identification model. Data were collected under various conditions, including different camera types (with or without distortion, and with different focal lengths), installation positions, and lighting conditions, to obtain curing-stage data. After preprocessing each type of image data, ten-stage classification recognition datasets were established based on the “three stages and six steps” curing process. Six recognition models were developed using ResNeXt-50, ShuffleNetV2(1.0), MobileNetV3-S, VanillaNet-10, EfficientNetV2-S, and EfficientNetV2-M as backbone networks. The models were evaluated using multiple performance metrics, such as accuracy and F1 score, together with graphical analyses including curve plots, radar charts, bar charts, and confusion matrices. The results indicate that: (1) the ShuffleNetV2(1.0) (MT: 98.71%) and EfficientNetV2 (MT: 99.84%) series networks exhibit superior recognition performance; (2) recognition performance differs between high-temperature and low-temperature areas; (3) combining multiple perspectives improves recognition accuracy; (4) cameras with larger focal lengths, cool white lighting, and distortion are more conducive to recognition; and (5) image-assisted tobacco leaf curing has promising application prospects. Code available at: https://github.com/vontran2021/CuringStage.

Authors

Feng C; Zhu S; Tang M; Zhao H; Yuan Q; Wang B

Journal

Engineering Applications of Artificial Intelligence, Vol. 143

Publisher

Elsevier

Publication Date

March 1, 2025

DOI

10.1016/j.engappai.2024.109992

ISSN

0952-1976
