DEVELOPMENT OF A CASCADE ALGORITHM FOR MONITORING THE MOVEMENT OF PARTS DURING THEIR MANUFACTURE

Abstract

A cascade algorithm has been developed for identifying the contents of production containers. The algorithm consists of two stages: detection of the container cells and classification of the contents of each cell. The proposed algorithm achieves a classification accuracy of 89% while requiring a considerably smaller training sample than a direct part-detection algorithm without the cell-detection stage would need. The algorithm is therefore suitable for use in production-environment monitoring systems in aerospace manufacturing.
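
The abstract describes a two-stage (cascade) pipeline: a detector first locates the individual cells of the container, and a classifier then labels the contents of each detected cell. The Python sketch below illustrates only this structure; the off-the-shelf detector and classifier, the class labels and the confidence threshold are illustrative assumptions, not the authors' configuration, and in practice both models would be trained or fine-tuned on images of the actual containers and parts.

# Minimal sketch of the cascade: stage 1 detects container cells,
# stage 2 classifies the contents of every detected cell.
# Generic torchvision models are used only so the sketch runs; they
# stand in for the cell detector and the cell-content classifier.
import torch
import torchvision
from torchvision import transforms
from PIL import Image

detector = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT").eval()
classifier = torchvision.models.vgg16(weights="DEFAULT").eval()  # would be fine-tuned to part/empty-cell classes

to_tensor = transforms.ToTensor()
cell_preprocess = transforms.Compose([transforms.Resize((224, 224)), transforms.ToTensor()])

CELL_SCORE_THRESHOLD = 0.5  # assumed confidence cut-off for cell detections


def classify_container(image_path):
    """Return a list of (cell bounding box, predicted content class index)."""
    image = Image.open(image_path).convert("RGB")
    with torch.no_grad():
        # Stage 1: detect the cells of the container.
        detections = detector([to_tensor(image)])[0]
        results = []
        for box, score in zip(detections["boxes"], detections["scores"]):
            if score < CELL_SCORE_THRESHOLD:
                continue
            x1, y1, x2, y2 = [int(v) for v in box.tolist()]
            # Stage 2: classify the contents of the cropped cell.
            cell = image.crop((x1, y1, x2, y2))
            logits = classifier(cell_preprocess(cell).unsqueeze(0))
            results.append(((x1, y1, x2, y2), int(logits.argmax(dim=1))))
    return results

One plausible reading of the sample-size result above is that each stage solves a simpler sub-task: the detector only has to find geometrically similar cells, and the classifier only sees tightly cropped cell images, whereas a single direct detector would have to localise and recognise every part type in the full container image at once.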

About the authors

Polina I. Kiseleva

Samara National Research University (Samara University)

Email: kiseleva.pi@ssau.ru

Master's student, group 3202-240405D

34, Moskovskoye shosse, Samara, 443086, Russian Federation

Ekaterina Yu. Pechenina

Samara National Research University (Samara University)

Email: ek-ko@list.ru

Assistant, Department of Engine Production Technologies

34, Moskovskoye shosse, Samara, 443086, Russian Federation

Vadim A. Pechenin

Samara National Research University (Samara University)

Author for correspondence.
Email: v.a.pechenin@ssau.ru

Candidate of Technical Sciences, Associate Professor, Department of Engine Production Technologies

34, Moskovskoye shosse, Samara, 443086, Russian Federation

Copyright (c) 2023 Kiseleva P.I., Pechenina E.Y., Pechenin V.A.

Creative Commons License
This work is licensed under a Creative Commons Attribution-ShareAlike 4.0 International License.

Journal of Dynamics and Vibroacoustics

ISSN 2409-4579 (Online)

Publisher and Founder: Samara National Research University, 34, Moskovskoye shosse, Samara, 443086, Russian Federation.

Editor-in-Chief: Academician of the RAS E. V. Shakhmatov

4 issues per year.

Free price

Editorial address: room 324, 43, Gaya street, Samara, 443086

Address for correspondence: 34, Moskovskoye shosse, Samara, 443086, Russian Federation, Samara National Research University (room 324, building 14)

Phone: 8 (846) 267 47 66

e-mail: dynvibro@ssau.ru

www: https://dynvibro.ru

© Samara University

 
