Image Analysis using Deep Learning for Electrical Overhead Line Tower Management

  • Anicetus Odo

Student thesis: Doctoral Thesis, Doctor of Philosophy


Electricity networks are critical national infrastructure throughout the world, delivering vital energy services and supporting interdependent assets such as transport and hospitals. Network corridors are inspected and refurbished regularly to keep them serviceable. Because of the large number of components and the geographical spread of electricity networks, operating costs can be very high. Inspection parameters include vegetation encroachment, sagging lines, tower paintwork defects and numerous components on the towers; more than fifty inspection parameters are critical along a network segment. The state-of-the-art inspection process involves aerial surveys. Images are acquired and analysed manually, which adds to the high cost of aerial surveys. Besides being costly, this process can suffer from inter-observer and intra-observer variability.

The condition-based risk management model is a popular network asset management model within the industry. The model allows individual components to be rated and then enables a collective economic impact analysis over the medium and long term. The assessment model has two main stages that determine whether a tower requires intervention. The routine analysis stage aims to rate towers and to select "at-risk" candidates for detailed investigation and refurbishment. The effectiveness of the assessment model depends on how quickly and accurately the routine assessment can highlight areas of need, given the limited resources available during the inspection window.

This thesis focuses on automating tasks within the routine analysis stage that involve image analysis. Specifically, this thesis identified towers at risk of different failure modes using deep learning. The first step in the proposed pipeline classifies towers as suspension or tension types. We found that towers could be categorised automatically by focusing on the configurations at the tower cross-arms. In addition to tower type detection, identifying images of the cross-arm, body and foot serves as a precursor to effective extraction of specific inspection parameters. Components such as anti-climbing devices are found around the tower body, not higher up at the cross-arm or peak of the tower. Hence, classifying tower images by region of interest provides a filter for object detection.
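The region-of-interest filtering described above can be pictured as a routing step: once each image is labelled with a tower region, only the detectors relevant to that region need to run. The sketch below is illustrative only; the region names, the detector names and the mapping between them are assumptions for this example, not taken from the thesis, and the region classifier (a CNN in practice) is replaced by precomputed labels.

```python
# Hypothetical routing table: which object detectors are worth running
# on an image of a given tower region. (Detector names are illustrative;
# e.g. anti-climbing devices sit around the tower body, so only body
# images are sent to that detector.)
REGION_DETECTORS = {
    "body": ["anti_climbing_device"],
    "cross_arm": ["insulator", "u_bolt"],
    "foot": [],
}

def route_images(predictions):
    """Map (image_id, predicted_region) pairs to (image_id, detector) tasks.

    `predictions` stands in for the output of a region classifier; regions
    with no associated detectors produce no tasks, which is the filtering
    effect described in the text.
    """
    plan = []
    for image_id, region in predictions:
        for detector in REGION_DETECTORS.get(region, []):
            plan.append((image_id, detector))
    return plan

# Example: one image per region.
preds = [("img_001", "body"), ("img_002", "cross_arm"), ("img_003", "foot")]
plan = route_images(preds)
```

Here the foot image generates no detection tasks at all, while the cross-arm image fans out to both component detectors, so the expensive detectors only ever see images where their target components can plausibly appear.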

Tower conditions are often associated with the failure modes of the component instances they support. This thesis demonstrated automated detection of insulators and U-bolts as exemplar inspection parameters. Our method classified at-risk towers based on the detected instances but without explicitly labelling those instances. Instead, learning was supervised using only the condition labels of towers in their entirety. This enabled us to use a real-world industry dataset without the fine-grained annotation of thousands of individual components. While existing studies classified component instances on a tower, we classified the tower as a whole and showed that tower labels are adequate for the task. Automated detection and analysis of U-bolts had not been previously reported.
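One way to read this weak-supervision setup is as a multiple-instance formulation: per-instance scores are aggregated into a single tower-level score, which is compared against the tower's condition label, so no individual component ever needs its own annotation. The max-pooling aggregation, the squared-error loss and all numbers below are illustrative assumptions, not the thesis's actual training objective.

```python
# Illustrative multiple-instance-style aggregation under weak supervision:
# only the tower-level label (0 = healthy, 1 = at risk) is known.

def tower_score(instance_scores):
    """Aggregate per-instance defect scores into one tower-level score.

    Max pooling is one common choice: a single strongly defective-looking
    instance is enough to flag the whole tower.
    """
    return max(instance_scores)

def tower_loss(instance_scores, tower_label):
    """Squared error between the aggregated score and the tower label.

    Minimising this pushes instance scores in the right direction even
    though no instance was ever labelled individually.
    """
    return (tower_score(instance_scores) - tower_label) ** 2

# A tower labelled "at risk" needs only one high-scoring instance
# (e.g. one of several detected insulators or U-bolts).
scores = [0.1, 0.2, 0.9]
```

With these numbers, the aggregated score is 0.9, so the loss against an at-risk label of 1 is already small; gradient-based training would raise the score of whichever instance carries the evidence.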

Furthermore, the thesis presents the identification of paintwork deterioration for image-based tower management. We argue that classifying individual tower parts may be costly. Beyond identifying towers in immediate need of intervention, our approach to tower paintwork classification could serve as an early warning system for assets approaching end of life.

The utility of each sub-system has been demonstrated using a real-world industry dataset of over 7k towers and 300k images that are representative of asset failure modes and inspection scenarios.
Date of Award: 2022
Original language: English
Supervisors: Jan Vorstius & Stephen McKenna


  • Asset Management
  • Machine Learning
  • Pattern Recognition
  • Power Distribution
  • Image processing
