Construction Delay Prediction Model Using a Relationship-Aware Multihead Graph Attention Network

Fatemeh Mostofi*, Onur Behzat Tokdemir, Vedat Toǧan

*Corresponding author for this work

Research output: Contribution to journal › Article › peer-review

Abstract

Existing machine learning (ML) delay prediction models cannot process dependencies among construction progress records. This study investigates graph attention networks (GAT) with multihead attention mechanisms for predicting construction delays. Through its attention mechanism, a GAT weights the differing significance of nodes in a network and can learn from the input graph configuration. The data set was configured into six networks that linked records according to two dependency criteria: contractual alignment and spatial proximity. Under contractual alignment, predictions for electrical and concrete tasks reached 65% and 76% accuracy, respectively, outperforming the spatial-based predictions. However, the multihead GAT with spatial networks delivered 77% accuracy for insulation tasks, surpassing the 67% of the contractual networks, underscoring the model's sensitivity to task dependencies and its applicability across a range of decision-making contexts. By recognizing the dependencies and shared aspects among construction records, the proposed GAT model better reflects human understanding of construction progress reports, shifting the focus from mere predictive accuracy to representative modeling of construction delay.
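The abstract's core mechanism, multihead graph attention over a network of linked progress records, can be sketched as follows. This is a minimal NumPy illustration of a standard multihead GAT layer, not the authors' implementation: the node features, adjacency pattern, and head dimensions are hypothetical, and the adjacency matrix stands in for links built from a dependency criterion such as spatial proximity.

```python
import numpy as np

rng = np.random.default_rng(0)

def gat_head(X, A, W, a_src, a_dst, leaky=0.2):
    """One graph-attention head over node features X (N, F) and an
    adjacency matrix A (N, N) that includes self-loops."""
    H = X @ W                                    # project features, (N, F')
    # attention logit e_ij = LeakyReLU(a_src . h_i + a_dst . h_j)
    e = (H @ a_src)[:, None] + (H @ a_dst)[None, :]
    e = np.where(e > 0, e, leaky * e)            # LeakyReLU
    e = np.where(A > 0, e, -1e9)                 # mask non-neighbours
    alpha = np.exp(e - e.max(axis=1, keepdims=True))
    alpha /= alpha.sum(axis=1, keepdims=True)    # softmax over neighbours
    return alpha @ H                             # attention-weighted mix, (N, F')

def multihead_gat(X, A, heads):
    """Concatenate the outputs of several independent attention heads,
    as in the hidden layers of a GAT."""
    return np.concatenate([gat_head(X, A, W, s, d) for W, s, d in heads], axis=1)

# Toy example: 4 progress records with 3 features each, linked in a chain
# (a stand-in for, e.g., spatial-proximity dependencies), 2 heads of width 5.
N, F, Fp, n_heads = 4, 3, 5, 2
X = rng.normal(size=(N, F))
A = np.array([[1, 1, 0, 0],
              [1, 1, 1, 0],
              [0, 1, 1, 1],
              [0, 0, 1, 1]], dtype=float)
heads = [(rng.normal(size=(F, Fp)), rng.normal(size=Fp), rng.normal(size=Fp))
         for _ in range(n_heads)]
out = multihead_gat(X, A, heads)
print(out.shape)  # (4, 10): two heads of 5 features, concatenated per record
```

Each head learns its own view of which neighbouring records matter for a node, which is why the abstract's results differ between contractual and spatial network configurations: the attention weights adapt to whichever dependency structure the graph encodes.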

Original language: English
Article number: 04025010
Journal: Journal of Management in Engineering - ASCE
Volume: 41
Issue number: 3
DOIs
Publication status: Published - 1 May 2025

Bibliographical note

Publisher Copyright:
© 2025 American Society of Civil Engineers.

Keywords

  • Construction progress prediction
  • Delay prediction model
  • Graph attention networks (GAT)
  • Multihead attention mechanism
  • Project management
  • Schedule performance
