UAVs in rail damage image diagnostics supported by deep-learning networks

cris.lastimport.scopus: 2024-11-06T02:30:54Z
dc.abstract.en: The article uses images from Unmanned Aerial Vehicles (UAVs) for rail diagnostics. The main advantage of this approach over traditional surveys performed with measuring vehicles is that it does not require reducing train traffic. In this study, the authors limit themselves to the diagnosis of hazardous split defects in rails. An algorithm is proposed that detects them with an efficiency of about 81% for defects no smaller than 6.9% of the rail head width. It uses the FCN-8 deep-learning network, implemented in the TensorFlow environment, to extract the rail head by image segmentation. Using this type of network for segmentation makes the algorithm more robust to changes in the brightness of the recorded rail image, which is of fundamental importance under the variable conditions of image recording by UAVs. The defects in the rail head are detected with an algorithm written in Python using the OpenCV library; to locate a defect, it uses the contour of the extracted rail head together with a rectangle circumscribed around it. The use of UAVs together with artificial intelligence to detect split defects is an important element of novelty in this work.
dc.affiliation: Transportu i Informatyki
dc.contributor.author: Piotr Bojarczak
dc.contributor.author: Piotr Lesiak
dc.date.accessioned: 2024-07-04T09:14:05Z
dc.date.available: 2024-07-04T09:14:05Z
dc.date.issued: 2021
dc.identifier.doi: 10.1515/eng-2021-0033
dc.identifier.issn: 2391-5439
dc.identifier.uri: https://repo.akademiawsei.eu/handle/item/364
dc.language: en
dc.pbn.affiliation: information and communication technology
dc.relation.ispartof: Open Engineering
dc.rights: CC-BY
dc.subject.en: Unmanned Aerial Vehicles
dc.subject.en: Split Defect in Rail
dc.subject.en: Deep-Learning Networks
dc.title: UAVs in rail damage image diagnostics supported by deep-learning networks
dc.type: ReviewArticle
dspace.entity.type: Publication
oaire.citation.issue: 1
oaire.citation.volume: 11
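The localization step described in the abstract — taking the segmented rail head, extracting its contour, and circumscribing a rectangle around it — can be sketched as follows. This is a minimal pure-Python stand-in for the OpenCV calls (`cv2.findContours`, `cv2.boundingRect`) the authors would use; the mask representation, the per-row width heuristic, and the function names are illustrative assumptions, not the authors' implementation. The 6.9% figure is taken from the abstract's stated minimum detectable defect width.

```python
def bounding_rect(mask):
    """Axis-aligned rectangle circumscribed around the foreground pixels.

    mask: 2D list of 0/1 values, a stand-in for the FCN-8 rail-head
    segmentation mask.  Returns (x, y, w, h), like cv2.boundingRect.
    """
    rows = [r for r, row in enumerate(mask) if any(row)]
    cols = [c for row in mask for c, v in enumerate(row) if v]
    if not rows:
        return (0, 0, 0, 0)
    x, y = min(cols), min(rows)
    return (x, y, max(cols) - x + 1, max(rows) - y + 1)


def split_defect_rows(mask, min_rel_width=0.069):
    """Flag rows where the rail-head width falls short of the circumscribed
    rectangle's width by at least min_rel_width (6.9% of the head width,
    per the abstract).  An illustrative heuristic, not the paper's detector.
    """
    x, y, w, h = bounding_rect(mask)
    flagged = []
    for r in range(y, y + h):
        width = sum(mask[r][x:x + w])  # foreground pixels inside the rect
        if w > 0 and (w - width) / w >= min_rel_width:
            flagged.append(r)
    return flagged
```

A quick usage example: a 10-pixel-wide synthetic rail head with a 2-pixel split in one row yields a 20% width deficit in that row, well above the 6.9% threshold, so only that row is flagged.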