Machine learning methods in classification of radio signals

Radio engineering, including TV systems and devices


Malygin I. V.1*, Belkov S. A.1**, Tarasov A. D.1***, Usvyatsov M. R.2****

1. Ural Federal University, 19, Mira str., Ekaterinburg, 620002, Russia
2. Moscow Institute of Physics and Technology, 9, Institutskiy per., Dolgoprudny, Moscow region, 141701, Russia



This paper is devoted to the problem of recognizing received encoded radio signal sequences. Traditionally, correlators or matched filters are used in communication systems to detect and process noise-like signals; both of these models rely on a threshold detection parameter. The use of a neural network is proposed to improve the quality of signal recognition when the noise characteristics of the environment are unknown. A neural network is a non-linear model with a relatively large number of parameters; during the learning phase it processes training examples and attempts to reproduce the relationship between the inputs and the corresponding responses.

To use a neural network, the original problem must be reformulated in terms of optimization: a quality functional to be optimized is introduced, and an optimization method for this functional is chosen. To select the neural network parameters that attain the optimum of the quality functional, the backpropagation algorithm is used; the parameter updates determined by the quality functional are computed with the stochastic gradient descent algorithm.

For convenience, only digital signals are considered in this paper. Nevertheless, the described method can also be applied to a continuous signal: digitizing the continuous signal reduces the task to the signal detection problem formulated above. In each cycle of the receiver's operation, a signal appears at its input.
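The traditional threshold-based approach mentioned above can be sketched as follows. This is a minimal illustration, not the paper's implementation: the 13-chip Barker code standing in for the protocol-defined signal, the Gaussian noise model, and the threshold value are all illustrative assumptions.

```python
import numpy as np

# Illustrative 13-chip Barker code standing in for the protocol-defined signal.
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

def correlator_detect(received, reference, threshold):
    """Threshold detector: correlate the received block with the reference
    sequence and compare the correlation peak against a fixed threshold."""
    corr = np.correlate(received, reference, mode="valid")
    return bool(np.max(np.abs(corr)) >= threshold)

rng = np.random.default_rng(0)
noisy_signal = barker13 + 0.3 * rng.normal(size=13)  # code distorted by noise
pure_noise = rng.normal(size=13)                     # no code present

print(correlator_detect(noisy_signal, barker13, threshold=8.0))
print(correlator_detect(pure_noise, barker13, threshold=8.0))
```

The threshold here is fixed in advance; when the noise statistics are unknown, choosing it well is exactly the difficulty that motivates the learned classifier.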

This signal is a fixed-length sequence and can either be similar to the protocol-defined signal or differ from it significantly. The task can therefore be formulated as binary classification of the signal received in each cycle of the receiver's operation as either a valid signal or noise.
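The binary classification setup described above can be sketched as a small one-hidden-layer network trained by backpropagation and stochastic gradient descent on the cross-entropy functional. The architecture, learning rate, and toy data generator (Barker code plus Gaussian noise vs. pure noise) are assumptions for illustration, not the configuration used by the authors.

```python
import numpy as np

rng = np.random.default_rng(42)
barker13 = np.array([1, 1, 1, 1, 1, -1, -1, 1, 1, -1, 1, -1, 1], dtype=float)

def make_batch(n):
    """Toy data: class 1 = code buried in Gaussian noise, class 0 = pure noise."""
    labels = rng.integers(0, 2, size=n)
    x = rng.normal(0.0, 1.0, size=(n, 13))
    x[labels == 1] += barker13
    return x, labels.astype(float)

# One hidden layer; parameters updated by stochastic gradient descent
# on the binary cross-entropy quality functional.
W1 = rng.normal(0, 0.1, (13, 16)); b1 = np.zeros(16)
W2 = rng.normal(0, 0.1, 16);       b2 = 0.0
lr = 0.05

def forward(x):
    h = np.tanh(x @ W1 + b1)                       # hidden activations
    p = 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))       # sigmoid output probability
    return h, p

for step in range(2000):
    x, y = make_batch(32)
    h, p = forward(x)
    # Backpropagation: gradients of the mean cross-entropy w.r.t. each parameter.
    d_logit = (p - y) / len(y)
    dW2 = h.T @ d_logit; db2 = d_logit.sum()
    d_h = np.outer(d_logit, W2) * (1.0 - h ** 2)   # through tanh
    dW1 = x.T @ d_h; db1 = d_h.sum(axis=0)
    W2 -= lr * dW2; b2 -= lr * db2
    W1 -= lr * dW1; b1 -= lr * db1

x_test, y_test = make_batch(500)
acc = np.mean((forward(x_test)[1] > 0.5) == (y_test == 1))
print(f"test accuracy: {acc:.2f}")
```

Unlike the fixed-threshold correlator, the decision boundary here is learned from examples, so any structure present in the training noise is absorbed into the model.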

It is assumed that the recognition quality will be better than with the traditional methods because, during training, the neural network is able to memorize the characteristic features of the noise and, consequently, exploit the obtained model at the signal classification stage.

A diagram of an experimental test bench that would allow this assumption to be confirmed is also provided in the paper.


Keywords: neural network, signal processing, m-sequence, Barker codes, correlator


