
MCS Thesis Examination: Hamid Reza Behizadi

Application of Attention Mechanism in Deep Neural Network Architecture for System Failure Prognostics


Date & time
Friday, May 5, 2023
10 a.m. – 11:30 a.m.
Cost

This event is free

Organization

Department of Computer Science and Software Engineering

Contact

Leila Kosseim

Where

Online

Abstract

    Machine health monitoring and management are essential steps for industry on the path toward smart manufacturing. Intelligent prognostics and health management (PHM) systems have demonstrated remarkable capabilities for industrial use and, consequently, have become an active research area over the last several decades. Machines must be appropriately controlled and monitored so that they run with almost no breakdowns and the downtime caused by a local machine or component failure is reduced. Therefore, adequate sensors should be attached to devices to gather different measurements and monitor the components’ health status in real time. The control centre then monitors and analyses the collected data and chooses and implements the best maintenance technique for each machine and component. Within the scope of PHM, numerous maintenance techniques have been suggested and put into practice for various asset types. Predictive Maintenance (PM) predicts faults or breakdowns in a deteriorating system in order to optimize maintenance efforts, evaluating the system’s status using historical data. In this strategy, the Remaining Useful Life (RUL) of the components is predicted from features that typically include sensor measurements and operational profiles.

    This research evaluates the possibility of predicting the RUL of a system from sensor data by deploying an attention-based deep learning model. RUL prediction based on the attention mechanism is a relatively new approach with promising results. The basic idea is to use an attention mechanism to identify the features or time steps in the input data that are most predictive of the RUL and to weigh them accordingly. One advantage of this approach is that it helps in interpreting the results and understanding the underlying factors contributing to the RUL. Applying an attention mechanism to capture temporal dependencies also improves model performance by detecting the most informative parts of the sequences passed to the prediction model.
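
    To make the idea concrete, the following is a minimal sketch of temporal attention for RUL regression. The announcement does not specify an implementation, so PyTorch is assumed here, and the layer choices, sizes, and names (TemporalAttentionRUL, n_sensors, hidden) are illustrative rather than the thesis architecture: each time step of a sensor window is encoded, scored, and the softmax-weighted steps are pooled into a single RUL estimate.

# Minimal sketch (assumptions: PyTorch; illustrative layer sizes and names,
# not the thesis architecture). Temporal attention scores each time step of a
# multivariate sensor window and pools the weighted steps into an RUL estimate.
import torch
import torch.nn as nn

class TemporalAttentionRUL(nn.Module):
    def __init__(self, n_sensors: int, hidden: int = 64):
        super().__init__()
        # Encode each time step of the sensor sequence.
        self.encoder = nn.GRU(n_sensors, hidden, batch_first=True)
        # One score per time step; softmax turns the scores into attention weights.
        self.score = nn.Linear(hidden, 1)
        # Regression head mapping the pooled representation to a scalar RUL.
        self.head = nn.Linear(hidden, 1)

    def forward(self, x):                              # x: (batch, time, n_sensors)
        h, _ = self.encoder(x)                         # h: (batch, time, hidden)
        weights = torch.softmax(self.score(h), dim=1)  # (batch, time, 1)
        context = (weights * h).sum(dim=1)             # attention-weighted pooling
        return self.head(context).squeeze(-1), weights.squeeze(-1)

# Usage (illustrative shapes): a batch of 8 windows, 30 time steps, 14 sensor channels.
model = TemporalAttentionRUL(n_sensors=14)
rul_pred, attn = model(torch.randn(8, 30, 14))

    Returning the attention weights alongside the prediction is what enables the interpretability mentioned above: the weights indicate which time steps the model relied on for its RUL estimate.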

    In our experiments, adding the attention mechanism to the pipeline has a noticeable impact on the performance of the neural network architecture while keeping the model light in terms of computational resources and training time. Although the proposed architecture does not outperform the most recent hybrid models in the field, it clearly shows the strong impact of the attention mechanism on predicting time series data. The technique can also be incorporated into more complex hybrid architectures to improve neural network performance.

Examining Committee

  • Dr. Juergen Rilling (Chair) 
  • Dr. Leila Kosseim and Dr. Jia Yuan Yu (Supervisors)
  • Dr. Yann-Gaël Guéhéneuc (Examiner)
  • Dr. Juergen Rilling (Examiner)