Multispectral electronic device for autonomous mobile platform of ecological monitoring


DOI: 10.34759/trd-2020-114-14

Authors

Chernetskaya I. E.*, Spevakova S. V.**

South-Western State University, 94, 50-let Oktyabrya str., Kursk, 305040, Russia

*e-mail: white731@yandex.ru
**e-mail: sspev@yandex.ru

Abstract

A special place among applications of robotic systems is occupied by the development of specialized devices for controlling the orientation of autonomous mobile platforms (AMP) operating under constantly changing observation conditions. One such task is the development of an optoelectronic device for controlling an autonomous mobile platform for ecological monitoring. Such platforms can be used under conditions hazardous to human health, for analyzing radiation, chemical, and bacteriological contamination, and for round-the-clock monitoring of geographically remote sites. However, these devices have to operate on territory that has not been prepared in advance, with complex terrain and with obstacles both temporary and permanent, and the data-collection points may lack access roads with road markings. A promising approach is to apply optoelectronic sensors to obtain information on the elements of the work scene along the AMP path; this, in turn, increases the computational complexity of the image recognition algorithms and imposes additional performance requirements on the element base of the control device, which increases the weight of the platform and reduces its autonomy.

The purpose of this article is to improve the performance of the control device for an autonomous mobile environmental monitoring platform. The authors propose employing multispectral video sensors, which provide images of the work scene in several spectral ranges, together with hybrid image processing methods that reduce computational complexity and improve the accuracy of the results. Faster selection of objects located on the path of the platform is achieved by detecting heterogeneous objects in different spectral ranges by color and by classifying them based on the albedo value. The novelty of the developed device lies in calculating the three-dimensional coordinates of the geometric center and the size of objects selected in space from sequences of images obtained under difficult conditions in various spectral ranges from the video sensors and lidar of a mobile surveillance system. This makes it possible to adjust the original AMP route, formed by positioning systems, according to the detected obstacles, thereby increasing the speed and precision of the spatial referencing of the AMP.

As a result of experimental studies, a comparative analysis was performed of devices that process multispectral imagery to search for objects located in the field of view of video surveillance systems, and a structural-functional diagram of a multispectral device for a mobile environmental monitoring platform was proposed. The article describes the software-hardware test bench used in the experimental study to simulate the FPGA-based device. The authors analyzed the speed of the individual blocks, which made it possible to balance the loading of the blocks and optimize the internal FPGA resources, as well as to confirm the mathematical substantiation of the proposed hybrid methods and assess the key characteristics of the device. As a result, the computational error at distances of up to 100 m to the object was reduced to an RMSE of 0.447% and a MAPE of 0.397, and the processing speed was increased: selecting an object and determining its coordinates takes 0.04 seconds.
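The abstract does not reproduce the underlying computation, but the step it describes, recovering the three-dimensional coordinates of an object's geometric center and its size from video-sensor detections fused with lidar returns, can be outlined as a minimal sketch. The code below assumes a pinhole camera model, known lidar-to-camera extrinsics, and a boolean object mask produced by an upstream multispectral detector; all function and variable names are hypothetical and are not taken from the paper.

```python
import numpy as np

def object_center_and_size(lidar_xyz, K, R, t, mask):
    """Estimate the 3-D geometric center and extent of a detected object.

    lidar_xyz -- (N, 3) lidar points in the lidar frame
    K         -- (3, 3) camera intrinsic matrix (pinhole model)
    R, t      -- lidar-to-camera rotation (3, 3) and translation (3,)
    mask      -- (H, W) boolean image mask of the object found by the
                 (hypothetical) upstream multispectral detector
    """
    # Bring lidar points into the camera frame; keep points in front of it.
    cam = lidar_xyz @ R.T + t
    cam = cam[cam[:, 2] > 0]

    # Project onto the image plane with the pinhole model.
    uvw = cam @ K.T
    u = np.round(uvw[:, 0] / uvw[:, 2]).astype(int)
    v = np.round(uvw[:, 1] / uvw[:, 2]).astype(int)

    # Keep only points whose projection lands inside the object mask.
    h, w = mask.shape
    in_img = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    obj = cam[in_img][mask[v[in_img], u[in_img]]]
    if obj.shape[0] == 0:
        return None, None  # no lidar support for this detection

    # Geometric center = centroid; size = axis-aligned extent of the points.
    return obj.mean(axis=0), obj.max(axis=0) - obj.min(axis=0)
```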

The authors propose implementing the presented solution as an FPGA-based device for controlling autonomous mobile platforms. This will increase the performance of such platforms and the reliability of the obtained results.
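The accuracy figures reported above (an RMSE of 0.447% and a MAPE of 0.397 at distances up to 100 m) follow the standard error definitions for range measurements. The sketch below is illustrative only: it assumes the RMSE is expressed as a percentage of the true range, and the sample values are invented rather than taken from the authors' experimental data.

```python
import numpy as np

def rmse_percent(true_m, est_m):
    """Root-mean-square error of the range estimate, as % of the true range."""
    rel = (np.asarray(est_m) - np.asarray(true_m)) / np.asarray(true_m)
    return 100.0 * np.sqrt(np.mean(rel ** 2))

def mape_percent(true_m, est_m):
    """Mean absolute percentage error of the range estimate."""
    true_m, est_m = np.asarray(true_m), np.asarray(est_m)
    return 100.0 * np.mean(np.abs((est_m - true_m) / true_m))

# Invented ground-truth vs. estimated ranges (metres), for illustration only.
truth = [10.0, 25.0, 50.0, 75.0, 100.0]
est = [10.05, 25.09, 49.81, 75.28, 99.62]
print(f"RMSE = {rmse_percent(truth, est):.3f}%, MAPE = {mape_percent(truth, est):.3f}%")
```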

Keywords:

multispectral sensor, controller, autonomous mobile platform, ecological monitoring, image recognition


