Artificial intelligence and machine learning
Research Article
Recognition of cadastral coordinates using convolutional recurrent neural networks
Igor Victorovich Vinokurov
Financial University under the Government of the Russian Federation, Moscow, Russia
igvvinokurov@fa.ru
Abstract. The article examines the use of convolutional recurrent neural networks (CRNN) for recognizing images of cadastral coordinates of objects in scanned documents of the «Roskadastr» PLC. The combined CRNN architecture, which joins convolutional neural networks (CNN) and recurrent neural networks (RNN), makes it possible to exploit the strengths of each for image processing and for recognizing the continuous digit sequences the images contain. In the experimental studies, images consisting of a given number of digits were generated, and a CRNN model was built and investigated. Images of digit sequences were formed by preprocessing and concatenating images of their constituent digits taken from a custom dataset. Analysis of the loss function and of the Accuracy, Character Error Rate (CER) and Word Error Rate (WER) metrics showed that the proposed CRNN model achieves high accuracy in recognizing cadastral coordinates in their scanned images. (Linked article texts in Russian and in English.)
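Since the abstract describes a CRNN that stacks a convolutional feature extractor on top of a recurrent sequence decoder, a minimal Keras sketch of such an architecture is given below. It is an illustration only, not the author's exact model: the input size, layer widths and the CTC-style output layer (ten digit classes plus a blank symbol) are assumptions.

# Minimal CRNN sketch for digit-sequence images (illustrative assumptions,
# not the model studied in the article).
import tensorflow as tf
from tensorflow.keras import layers, models

NUM_CLASSES = 10 + 1          # ten digits plus a CTC "blank" symbol (assumed)
IMG_HEIGHT, IMG_WIDTH = 32, 256  # assumed size of a coordinate image

inputs = layers.Input(shape=(IMG_HEIGHT, IMG_WIDTH, 1), name="image")

# Convolutional feature extractor (CNN part)
x = layers.Conv2D(32, 3, padding="same", activation="relu")(inputs)
x = layers.MaxPooling2D(pool_size=(2, 2))(x)
x = layers.Conv2D(64, 3, padding="same", activation="relu")(x)
x = layers.MaxPooling2D(pool_size=(2, 2))(x)

# Turn the feature map into a horizontal sequence: one time step per column
x = layers.Permute((2, 1, 3))(x)                       # (width, height, channels)
x = layers.Reshape((IMG_WIDTH // 4, (IMG_HEIGHT // 4) * 64))(x)

# Recurrent sequence decoder (RNN part)
x = layers.Bidirectional(layers.LSTM(128, return_sequences=True))(x)

# Per-time-step class probabilities, suitable for training with a CTC loss
outputs = layers.Dense(NUM_CLASSES, activation="softmax", name="digits")(x)

model = models.Model(inputs, outputs, name="crnn_sketch")
model.summary()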
Keywords: convolutional recurrent neural network, CRNN, image recognition, digital sequences, deep learning, Keras, Python
MSC-2020 68T20; 68T07, 68T45
For citation: Igor V. Vinokurov. Recognition of cadastral coordinates using convolutional recurrent neural networks. Program Systems: Theory and Applications, 2024, 15:1, pp. 3–30. (In Russ., in Engl.). https://psta.psiras.ru/2024/1_3-30.
Full text of bilingual article (PDF): https://psta.psiras.ru/read/psta2024_1_3-30.pdf.
English part of bilingual article (PDF): https://psta.psiras.ru/read/psta2024_1_3-30-en.pdf.
The article was submitted 29.09.2023; approved after reviewing 27.11.2023; accepted for publication 27.11.2023; published online 11.03.2024.