Advances in Consumer Research
Issue:5 : 2427-2441
Research Article
Simulate. Optimize. Succeed: Integrating Digital Twin Technology for Real-Time Risk Detection and Decision Support in IT Projects Management Lifecycle
1 School of Computing, Engineering and Digital Technologies, Teesside University, United Kingdom
2 PhD Lecturer in Computer Science, School of Computing, Engineering and Digital Technologies, Teesside University, United Kingdom
3 Computer and Information Science, Towson University, USA
4 Computer Science and Engineering, University of Fairfax, USA
Received: Sept. 1, 2025
Revised: Oct. 3, 2025
Accepted: Nov. 19, 2025
Published: Nov. 21, 2025
Abstract

This study evaluates the role of Digital Twins in IT project management lifecycles, with the aim of enhancing predictive analytics, decision-making, and real-time workflow optimisation. Drawing upon complexity theory, decision support systems, and Agile methodologies, the study proposes a modular DTT framework capable of simulating project conditions, forecasting risks, and adapting to dynamic changes. A hybrid qualitative methodology comprising a structured literature review, case study analysis, conceptual framework design and simulation evaluation was adopted. Workflow modelling tools such as Draw.io and Figma were used to visualise IT processes, design the lifecycle, and build an interactive dashboard prototype. Simulated Agile scenarios, including project delays, resource bottlenecks, and testing constraints, were used to validate the conceptual framework. The simulation demonstrated over 90% risk detection accuracy, AI-driven recommendations with response times under three seconds, and a potential reduction of up to five working days in project delivery. In practical terms, the model offers IT teams faster decision-making, improved visibility into risks, and enhanced delivery timelines, making it highly applicable to fast-paced IT environments, with possible integration into tools like Jira to enable real-time updates during project reviews. This study provides a flexible, data-based Digital Twin Technology (DTT) model that helps teams work continuously, see project issues more clearly, and make decisions early. It shows how IT projects can shift from reacting to problems to predicting and preventing them.

Keywords
INTRODUCTION

As IT projects become more complex, there is a growing need for advanced tools like simulations and predictive analytics to fix problems like wasted resources and poor risk management (Kabanda, 2020). Studies show that over 70% of IT projects miss their deadlines, budgets, or goals because they lack real-time tracking and rely too much on reacting after problems happen (Baghizadeh et al., 2020; Windapo et al., 2023). Even though Agile and DevOps are popular for managing projects step by step, they still mostly look backward to analyze issues instead of predicting them ahead of time (Perera and Eadie, 2023). Traditional tools help with organizing tasks and teamwork, but they do not offer real-time simulations, AI predictions, or automatic risk checks (Attah et al., 2024; Jeong et al., 2022; Temitope, 2020). For example, widely used platforms like Jira do not provide features like predicting resource needs or simulating project delays, which many experts see as a major weakness (Attah et al., 2024).

 

Digital Twin Technology (DTT) offers a promising means of resolving the issues that recur in IT project management, yet its adoption in the field remains limited. Other industries, including manufacturing and aerospace, already use DTT routinely to improve performance in real time and forecast future conditions (Tao et al., 2022). By providing scaled virtual proxies of real systems and processes, DTT enables IT teams to mitigate risks, reallocate resources, and plan alternative scenarios without disrupting running systems (Madni et al., 2019). The technology therefore shifts project management from a reactive to a proactive stance.

 

Despite the opportunities DTT presents, a number of barriers remain. The lack of standardized procedures for connecting with existing enterprise systems, the high computational cost of simulation, and the challenges of scalability are cited as the most problematic limitations (Bravo & Vieira, 2023; Perno et al., 2022). In addition, many IT organizations lack the required skillsets in artificial intelligence, advanced computation, and simulation methods (Schranz et al., 2020; Semeraro et al., 2021). Nevertheless, driven by growing demand for AI-driven simulations across industries, the global Digital Twins market is expected to grow at an annual rate of 58% to reach $48.2 billion by 2026 (Botín-Sanabria et al., 2022; Kherbache et al., 2022).

 

Major technology providers, including Siemens, Microsoft Azure, IBM, and Cisco, have underlined the importance of DTT in reinforcing IT infrastructure, Agile development practices, and cloud-native architectures, strengthening security, and boosting predictive analytics (Mohamed & Al-Jaroodi, 2024; VanDerHorn & Mahadevan, 2021). However, challenges remain, such as making DTT work smoothly with Agile methods, the high cost of setup, and the shortage of skilled workers (Dorrer, 2020; Rayhana et al., 2024). This study explores how DTT can be used to model and manage IT project workflows, helping teams work more efficiently, make better decisions, and achieve more successful project outcomes. It aims to move project management from reacting to problems to predicting and preventing them. The study brings together theories, real-world examples, and practical tools to tackle key issues such as standardization, skills shortages, and testing through simulations, AI models, and strategic recommendations.

 

To help solve current problems with standardization, integration, and real-world testing, this paper introduces a Digital Twin Technology (DTT) framework designed specifically for managing IT projects. Rather than building a fully working digital twin, the study focuses on developing the concept using research, case studies, and current technology trends. The framework is meant to guide how DTT can be used in IT project management in the future. It also offers useful guidance for project managers, professionals, and researchers by combining academic knowledge, real-world examples, and new technologies. Based on past studies (e.g., Schuh et al., 2017), DTT can help spot risks, plan resource needs, and simulate project steps, features that this framework aims to include conceptually.

 

This study adds value in four main ways: (i) new ideas and model design by creating a clear and flexible plan to spot problems early and make better decisions using predictions; (ii) testing in real life by looking at real examples to show how the model works, what issues might come up, and what can be learned; (iii) technology upgrades by using smart tools like AI, simulations, and process improvements to help IT projects run faster and smoother; and (iv) helpful tips for practice by giving easy-to-follow advice and a step-by-step guide for project managers to manage projects better and make smart choices from start to finish.

LITERATURE REVIEW

Digital Twin Technology (DTT) is a powerful tool used in many industries to simulate real situations and predict what might happen next. Yet, even though it is becoming more popular, it has not been widely used or studied in IT project management, especially in Agile and DevOps environments. DTT refers to a digital, real-time model of a system or workflow created to support development and achieve better results by monitoring progress, simplifying processes, and predicting obstacles before they appear (Botín-Sanabria et al., 2022; Schranz et al., 2020). Reiche and Timinger (2021) extend this discussion, describing DTT as a digital reflection of project elements that allows teams to track the performance of the development process continuously and make informed decisions in the early phases. The concept was first developed at NASA in the 1970s, where it was used to track spacecraft and predict possible failures (Madni et al., 2019). The term Digital Twin was officially coined by Michael Grieves in 2002 in the context of product lifecycle management (Barricelli et al., 2019). Since then, DTT has been implemented in industries such as aerospace, manufacturing, and healthcare. However, there is still no single, common definition of DTT, particularly in IT disciplines (Fuller et al., 2020; Jeong et al., 2022).

 

DTT has developed gradually. Before 2015, most digital surrogates were primitive, poorly adaptable, and only loosely interconnected. Between 2016 and 2021, artificial intelligence (AI) and machine learning (ML) extended DTT capabilities to include predictive maintenance, workflow automation, and anomaly detection, although integration issues remained (Semeraro et al., 2021). From 2022 onwards, digital twins considerably improved in responsiveness and resilience thanks to enabling technologies such as the IoT, big data analytics, and cybersecurity (Fuller et al., 2020; Sharma et al., 2022). Figure 1 illustrates key milestones in DTT's evolution, emphasising its transition from static models to AI-driven systems.

 

Figure 1: Digital Twin Concept (Adapted from Qi et al., 2021)

 

A Digital Twin (DT) consists of three connected parts: the physical reality, the digital representation, and the data-exchange mechanism. The physical reality is the concrete system or infrastructure (e.g., the architectural network within an IT project). The digital model is a virtual replica of the physical system that provides complete, dynamic information about it. The data-exchange process keeps the physical and virtual domains synchronised, enabling real-time feedback and adaptation. Together, these elements allow a DT to replicate, mimic, and forecast the behaviour of intricate systems in a controlled digital space.
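To make this three-part structure concrete, the minimal Python sketch below (illustrative only; all class and field names are hypothetical, not part of the proposed framework) shows a physical snapshot, a virtual replica, and a sync step acting as the data-exchange mechanism.

```python
# Minimal sketch (illustrative assumptions only) of the three Digital Twin parts:
# a physical snapshot, a digital replica, and a sync call as the data-exchange step.
from dataclasses import dataclass, field


@dataclass
class PhysicalState:
    """Snapshot of the real system, e.g. an IT project's current workflow metrics."""
    open_tasks: int
    completed_tasks: int
    active_developers: int


@dataclass
class DigitalModel:
    """Virtual replica that mirrors the physical system and derives insight from it."""
    history: list = field(default_factory=list)

    def sync(self, state: PhysicalState) -> None:
        # Data-exchange step: pull the latest physical snapshot into the twin.
        self.history.append(state)

    def completion_ratio(self) -> float:
        latest = self.history[-1]
        total = latest.open_tasks + latest.completed_tasks
        return latest.completed_tasks / total if total else 0.0


# Usage: each sync call represents one real-time feedback cycle.
twin = DigitalModel()
twin.sync(PhysicalState(open_tasks=40, completed_tasks=60, active_developers=8))
print(f"Completion: {twin.completion_ratio():.0%}")
```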

 

Emerging studies explore self-optimising AI-driven DTs, though their efficacy in evolving project environments, particularly in dynamic workflows, requires deeper investigation (Wong et al., 2024). Figure 2 highlights enabling technologies driving DTT adoption, including AI, IoT, cloud computing, and big data analytics.

 

Figure 2: Key Enabling Technologies of Digital Twins

 

IT Project Management Lifecycle

The IT project management lifecycle typically follows adaptive phases such as initiation, planning, and execution, which align with evolving project requirements, especially in Agile environments where iterative processes aim to optimise resources and reduce risk (Perera and Eadie, 2023). In contrast to traditional project management models, which presuppose a discontinuous, step-by-step course of action, the Agile approach initiates and completes work in iterative cycles driven by continuous feedback and delivery (Barros et al., 2024).

 

Predictive Analytics and Scenario-Based Testing

Scenario-based testing is a methodological approach that allows software development teams to assess alternative project configurations in iterative steps, supporting the choice of the most effective and efficient one. It is a main constituent of predictive analytics, a field that applies past or current information to identify future risks early and supports quick, deliberate decisions, particularly within Agile development (Pantovic et al., 2024). Predictive models, in turn, enable project teams to discover and eliminate mistakes early, reducing the risk of wider repercussions. The use of digital twin technology in IT is thus justified by the ability of simulated project environments to improve task management and performance monitoring.
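As a simple illustration of scenario-based testing, the hedged Python sketch below runs a Monte Carlo simulation (an approach named later in the recommendations) to compare two assumed sprint configurations; the task estimates and noise model are placeholders, not data from this study.

```python
# Minimal Monte Carlo sketch of scenario-based testing (illustrative assumptions only):
# compare two hypothetical project configurations by simulating task durations and
# estimating the probability of finishing within a 10-day sprint.
import random

random.seed(42)


def simulate_sprint(task_estimates, capacity_days, runs=10_000):
    """Return the fraction of simulated runs that fit within the sprint capacity."""
    on_time = 0
    for _ in range(runs):
        # Each task's actual effort varies around its estimate (triangular noise).
        total = sum(random.triangular(e * 0.8, e * 1.6, e) for e in task_estimates)
        if total <= capacity_days:
            on_time += 1
    return on_time / runs


baseline = [2.0, 3.0, 1.5, 2.5]          # current task plan (days)
alternative = [2.0, 3.0, 1.5, 2.5, 1.0]  # same plan plus one extra feature

print("Baseline on-time probability:   ", simulate_sprint(baseline, capacity_days=10))
print("Alternative on-time probability:", simulate_sprint(alternative, capacity_days=10))
```

Comparing the two on-time probabilities is the kind of scenario comparison the paragraph above describes: the configuration with the better forecast is chosen before any real work is committed.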

 

Theoretical Framework

Digital twins in information technology (IT) project lifecycles are discussed through three key theoretical lenses: complexity theory, decision support systems (DSS), and Agile project management practices. An overview of the existing literature indicates that each of these views sheds light on the interdependent mechanisms of requirements identification, evolution, and validation. Complexity theory makes clear that technological systems exhibit emergent, non-linear behaviour; DSS theory defines processes that enable rational decision-making under uncertainty; and Agile project management defines iterative, incremental workflows aimed at the timely delivery of business value.

 

Complexity Theory

According to Schützko and Timinger (2023), complexity theory explores the interdependency of the parts of an IT project and how they can trigger unanticipated problems. They cite the joint use of GitHub and Azure DevOps as an example of how such collaborative setups can create challenges (Mohamed & Al-Jaroodi, 2024). Digital Twin Technology (DTT) mitigates these issues by creating virtualised depictions of project workflows, thereby making planning more deliberate and reducing schedule slips (Guinea-Cabrera and Holgado-Terriza, 2024; Windapo et al., 2023). Olsson and Axelsson (2023) also suggest that digital twins can reduce cognitive overload by clarifying how Jira, Git, and DevOps pipelines work together. DTT allows teams to find and fix the problems that usually arise in multidimensional, multi-tool project contexts, such as version-control conflicts and automated-testing delays, which often sink complex projects (Dalibor et al., 2022; Perera and Eadie, 2023).

 

Decision Support Systems (DSS)

Traditional Decision Support Systems (DSS) are limited because they rely on historical data, which makes them slow to respond to fast-paced initiatives under regular assessment (Ali et al., 2024; Oettl et al., 2023). Digital Twin Systems (DTS) overcome this shortcoming by providing real-time updates, allowing scenario testing, and predicting risks, thereby avoiding delays in project execution and excessive resource waste (Attaran and Celik, 2023; Tao et al., 2019). Furthermore, Van der Valk et al. (2020) note that DTS incorporate feedback mechanisms that provide flexibility across the project lifecycle. Rasheed et al. (2020) argue that when DTS are implemented in combination with simulation models or artificial intelligence, they increase the flexibility of DSS and make them particularly well suited to fast, well-informed decisions in information technology (IT) environments. According to Mohamed and Al-Jaroodi (2024), DTS also increase the accuracy of predictions about cloud system upgrades.

 

Agile Management Principles

Agile approaches focus on iterative development, collaborative work, and flexibility, mechanisms that drive continuous improvement through iterative feedback (Biesialska et al., 2021). Bravo and Vieira (2023) claim that Digital Twin Technology (DTT) incorporated into Agile workflows gives teams the ability to react swiftly, minimise risks, and optimise sprint planning. DTT also enhances Agile processes by embedding predictive simulations in every planning cycle. In combination, these views provide a framework for using DTT in IT project management, allowing teams to make more informed choices, improve operational efficiency, and receive early warnings of emerging issues.

 

Related Studies

Existing research on Digital Twins in IT project management covers applications of Agile frameworks, project phasing, system design principles, and predictive analytics. However, several limitations remain: frameworks are still rather abstract, their adherence to Agile principles has not been genuinely established, and empirical studies using real data to demonstrate successful application of these frameworks are scarce.

 

Conceptual Evolution: Paradigm Shift or Incremental Innovation?

Two perspectives predominate in the scholarly discourse on Digital Twin Technology (DTT). On one hand, authors such as Iliuţă et al. (2024) and Madni et al. (2019) consider DTT a paradigmatic innovation that uses AI and real-time information to create stable digital spaces. On the other hand, Rasheed et al. (2020) present DTT as an incremental change, arguing that its application remains limited to individual uses such as simulations and feedback mechanisms. This tension is especially evident in information technology, where system flexibility and constant updating are key factors. Although other disciplines contribute to DTT research, much of that work does not incorporate rigorous empirical testing in IT conditions. Qi et al. (2021), for example, consider DTT growth via cloud and AI infrastructure but do not analyse the scalability constraints associated with IT environments. Mohamed and Al-Jaroodi (2024) consider DTT in the context of cloud migration but focus on centralised architectures that do not reflect the decentralised, team-based approach promoted by Agile practice. Kober et al. (2024) agree that DTT is an under-researched methodology for IT projects and call for interdisciplinary, cross-sectoral studies to clarify the broader possibilities of the technology.

 

Architectural Complexity: Balancing Technical Feasibility and Security

In the IT field, developing powerful digital twin (DT) systems involves striking a reasonable balance between sophisticated technologies and the flexibility that Agile approaches inherently offer. Jeong et al. (2022) introduce a layered architecture involving edge computing and shared data repositories as a means of connecting physical and logical objects. However, the model rests on centralised governance, which runs counter to Agile's more decentralised way of working. Security considerations also have to be taken into account. Khan et al. (2022) conclude that cloud-based DT infrastructures are vulnerable to cyber-attacks and promote blockchain as a protective layer, and Kherbache et al. (2022) warn about the exposure of industrial data to hackers. In addition, both contributions fail to address the finding by Singh and Tripathi (2024) that a substantial share of stakeholders oppose the adoption of DTs due to a perceived loss of control. Scalability also complicates implementation. According to Schranz et al. (2020), SMEs face challenges in using DTs due to heavy computation, whereas Guerra-Zubiaga et al. (2021) argue that the fast pace of Agile puts an excessive burden on centralised infrastructures that have to coordinate DT-based activities. To address this, Olsson and Axelsson (2023) suggest modular and decentralised designs that fit with DevOps. Singh, Weeber, and Birke (2021) add a practical "toolbox" of modular techniques to make DTs easier to apply and scale in Agile projects.

 

Framework Gaps: Lifecycle Integration and Standardisation

Standardised frameworks for IT-specific DTs remain nascent compared with the manufacturing sector. Aheleroff et al. (2021) advance the Digital Twin-as-a-Service model for cloud-based interoperability, yet their manufacturing focus neglects IT needs for artefact-centric simulations (e.g., codebase dependencies, project bottlenecks). Wong et al. (2024) address this with a "project twin" prototype that mirrors software artefacts through Agile workflow mapping. While promising, Windapo et al. (2023) question its scalability in enterprise environments, citing fragmented data governance in hybrid clouds, a limitation corroborated by Wang et al. (2022).

 

Lifecycle integration studies reveal fragmented progress. During initiation, Reiche and Timinger (2021) apply predictive scenario testing to simulate dependencies but rely on static models incompatible with Agile’s iterative planning. During execution, Mohamed and Al-Jaroodi (2024) use automation to spot risks in cloud migration and reduce delays, but their methods work mainly in controlled environments, not in real-world settings. For project completion, Tao et al. (2022) suggest using feedback loops like in DevOps, but there is little real evidence of this working in IT. Overall, these studies focus on separate project stages instead of offering a full, connected solution, leading to a lack of integration across the entire lifecycle (Jonkers et al., 2021).

 

Agile Compatibility: Automation vs. Human-Centric Ethos

Using Digital Twin Technology (DTT) in IT project management has big benefits, but it also creates challenges, especially with Agile methods that value teamwork and flexibility. Guinea-Cabrera and Holgado-Terriza (2024) introduced “sprint twins” to predict sprint problems, but there is not much real-world proof yet. Singh and Tripathi (2024) found that some Agile teams resist DTT because they fear losing control and flexibility. To solve this, Van der Valk et al. (2020) suggest blending DTT with Agile’s feedback approach. Jeong et al. (2022) also say DTT should assist, not replace, human decisions. Tripathi et al. (2024) add that poor communication, unclear roles, and different stakeholder goals often block DTT success in Agile settings. Perera and Eadie (2023) stress the need for proper training and change management. Bravo and Vieira (2023) support aligning DTT with Agile, but long-term evidence is still lacking. Overall, these studies agree that DTT can improve flexibility, accuracy, and results in IT projects, if the challenges are carefully managed.

 

RESEARCH METHODOLOGY

This study uses a mix of methods, including a structured review of past research, real-world case study analysis, and the development and testing of a new framework through simulation. It relies on existing research, especially by Dalibor et al. (2022) and Guinea-Cabrera and Holgado-Terriza (2024), to show how combining case studies and literature reviews helps create useful models for new digital technologies. The review was done in a careful and organized way using sources like IEEE Xplore, ScienceDirect, Scopus and Google Scholar. These databases were selected based on their disciplinary relevance and coverage of high-impact research in project management, systems engineering, and digital innovation. The review targeted both conceptual and empirical contributions, specifically focusing on DTT implementation, Agile integration, workflow simulation, and predictive analytics in IT settings.

 

Ten thematic keyword clusters were developed, using a total of 25 Boolean search combinations such as:

  • “Digital Twin Technology” AND “IT Project Management Lifecycle”
  • “Predictive Analytics” AND “Agile Integration”
  • “Simulation Models” AND “Workflow Optimisation”

 

From an initial yield of 133 sources, 103 peer-reviewed scholarly works comprising journal articles, conference papers, and academic books published between 2015 and 2025 were retained after applying rigorous inclusion and exclusion criteria (see Table 1).

 

Table 1: Inclusion and Exclusion Criteria

Criteria | Inclusion | Exclusion
Content Focus | Studies addressing DTT in IT project context, Agile workflows, predictive analytics, and simulation with transferable insights. | Studies without relevance to IT contexts or with sector-specific focus (e.g., healthcare-only applications).
Article Type/Year | Peer-reviewed conference papers, journal articles, and academic books published between 2015 and 2025. | Non-peer-reviewed content, technical reports, or publications predating 2015.
Application to Project Lifecycle | Research linked to framework design, decision support, or lifecycle optimisation. | Studies with generalised findings or weak alignment to project environments.
Methodology Relevance | Empirical, conceptual, or mixed-method studies with detailed implementation discussion. | Studies lacking methodological clarity or depth.

 

The literature review provided the theoretical foundation for the framework and revealed implementation patterns, case benchmarks, and research gaps across sectors.

 

Research Design Flowchart

The methodology adopted a sequential and interconnected structure, as illustrated in Figure 3 below. It begins with a systematic literature review to establish theoretical foundations and identify key research gaps. This is followed by a case study analysis, which validates the theoretical findings by examining practical challenges and opportunities observed in real-world industry settings. A new framework is then synthesised from the case-study evidence and the available literature. The framework comprises Agile feedback mechanisms, real-time performance monitoring, and statistical prediction techniques, and it is verified through simulation to measure its performance in an Agile-based project setting.

 

Figure 3: A visual representation of the research design

 

Having described the research method in detail, the discussion now turns to a practical example of the Digital Twin model. The case study demonstrates the framework's facilitating, predictive, and decision-support role, and its improvement of processes in real time via simulation.

 

Research Implementation

Platforms and Tools Used

This study visualises the Digital Twin framework using Draw.io and Figma. Draw.io serves as a web-based diagramming tool that helps explain complex IT processes and workflows; its customisable interface supports group work and simplifies project communication (Ozkaya, 2019). Figma is an online design tool that allows interactive user-interface prototypes to be assembled (Kimseng et al., 2023; Rana, 2024). Real-time collaboration, instant feedback loops, and continuous updates make Figma a potent tool for speeding up the design process and facilitating collaboration (Stige et al., 2024; John et al., 2025). In this research, Figma is used to simulate an AI-enabled dashboard for managing tasks, mitigating risks, and tracking progress in real time. As also suggested by Iumanova et al. (2024), the literature indicates that tools with flexible functionality, cloud-based architecture, and a focus on user experience are the most likely to support collaborative teamwork, attributes that apply to the platforms selected in the present study.

 

Suggested Tools for Real-World Implementation

To complement the framework, the study provides a list of technologies and tools that would allow the Digital Twin Technology (DTT) model to be implemented in real-world IT project management. The platforms in Table 2 were chosen for their compatibility with Agile practices, their ability to handle data in real time and at scale, and their alignment with industry best practices in workflow automation and forecasting.

 

Table 2: Suggested Technologies and Tools for Real-World Implementation

Framework Module | Technology/Tool | Purpose | Citation | Licence
Data Acquisition & Integration | Python (Flask, FastAPI), Node.js | Build RESTful APIs for data ingestion and processing | Nilsson and Demir (2023); Iumanova et al. (2024) | Open source
Data Acquisition & Integration | Apache Kafka, RabbitMQ | Handle low-latency real-time data streaming | Dobbelaere and Esmaili (2017); Fuller et al. (2020) | Open source
Data Acquisition & Integration | PostgreSQL, MongoDB | Support structured and unstructured project monitoring data | Makris et al. (2021) | Open source
Data Acquisition & Integration | Apache Airflow | Orchestrate ETL pipelines for data flow management | Bussa and Hegde (2024) | Open source
Digital Twin Core | Python (Django, Flask), Node.js (Express) | Implement back-end logic for virtual DT model operations | Iumanova et al. (2024); Sharma et al. (2024) | Open source
Digital Twin Core | Redis, Firebase | Enable real-time synchronisation of project data | Jeong et al. (2022); Moroney (2017); Norem (2024) | Open source / proprietary
Digital Twin Core | GraphQL, REST APIs | Facilitate communication interfaces between modules | Gori Parthi (2024); Nilsson and Demir (2023) | Open source
Simulation & Predictive Analytics | Python (Scikit-learn, TensorFlow, PyTorch) | Develop, train, and deploy predictive models and machine learning (ML) training pipelines | Venkatapathy (2023); Guillaume-Joseph and Wasek (2015) | Open source
Simulation & Predictive Analytics | SimPy, AnyLogic | Run scenario-based experimentation, including Monte Carlo simulations | Peyman et al. (2021); Singh et al. (2021); Walter and Barkema (2015) | Open source
AI-Driven Decision Support | OpenAI API, LangChain, GenKit | Enable AI-driven recommendations and natural language processing (NLP) interfaces | Almalki (2025); Auger and Saroyan (2024) | Open source / paid API
Visualization & UI | Figma | Prototype and wireframe UI components | Kimseng et al. (2023); Rana (2024); Stige et al. (2024) | Open source
Visualization & UI | D3.js, Chart.js, Plotly, React.js, Vue.js | Visualise metrics, trends, and predictive insights; build dynamic, responsive dashboards | Iumanova et al. (2024); Singh et al. (2021); Yang (2023) | Open source
Feedback & Continuous Improvement | GitHub Actions, Jenkins, GitLab CI/CD, Prometheus, Grafana | Automate model updates, testing, deployment, monitoring, and alerting through Continuous Integration and Deployment (CI/CD) pipelines | Attaran and Celik (2023); Botín-Sanabria et al. (2022); Iumanova et al. (2024); Yasin et al. (2021) | Open source

Source: Authors
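As a concrete illustration of the Data Acquisition & Integration row in Table 2, the hedged Python sketch below shows how project-tracker events (e.g., Jira-style webhooks) could be received over a REST API built with FastAPI and forwarded to a Kafka topic for the Digital Twin core. The endpoint path, topic name, and event fields are hypothetical assumptions, not part of the authors' implementation.

```python
# Minimal sketch (illustrative only) of the Data Acquisition & Integration module:
# project-tracker events arrive as JSON and are streamed to a Kafka topic so the
# Digital Twin core can consume them with low latency. Names are hypothetical.
import json

from fastapi import FastAPI
from kafka import KafkaProducer
from pydantic import BaseModel

app = FastAPI()
producer = KafkaProducer(
    bootstrap_servers="localhost:9092",
    value_serializer=lambda payload: json.dumps(payload).encode("utf-8"),
)


class TaskEvent(BaseModel):
    task_id: str
    status: str          # e.g. "in_progress", "blocked", "done"
    assignee: str
    story_points: float


@app.post("/events/tasks")
def ingest_task_event(event: TaskEvent):
    # Forward the event to the twin's real-time pipeline.
    producer.send("dtt.project-events", event.model_dump())
    return {"accepted": True}
```

Streaming events through a broker in this way decouples ingestion from the simulation and analytics modules, which supports the low-latency dashboard updates the framework targets.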

 

Benchmarking and Validation Strategy

To uphold the defined KPI thresholds (discussed in Section 4.4), a comprehensive benchmarking and testing strategy ensures ongoing validation and system reliability. Each KPI is assigned a quantifiable goal (e.g., risk detection ≥ 90%) and measured regularly, such as after sprints, major project phases, or monthly reviews, to catch performance issues early. This strategy is closely aligned with Agile practices, embedding validation activities into iterative cycles to support continuous improvement and responsiveness to change.
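A minimal sketch of this per-sprint KPI check, assuming targets taken from Table 3 and hypothetical measured values supplied by the monitoring pipeline:

```python
# Minimal sketch of the benchmarking idea above: each KPI has a quantifiable target
# and is re-checked after every sprint. Targets mirror Table 3; the measured values
# below are placeholders for whatever the monitoring pipeline reports.
KPI_TARGETS = {
    "risk_detection_accuracy": 0.90,   # >= 90%
    "simulation_accuracy": 0.85,       # >= 85%
    "resource_efficiency_gain": 0.15,  # >= +15%
}


def validate_sprint_kpis(measured: dict) -> list[str]:
    """Return the names of KPIs that fell below their targets in this cycle."""
    return [name for name, target in KPI_TARGETS.items()
            if measured.get(name, 0.0) < target]


# Example sprint review: one KPI misses its threshold and is flagged for action.
failures = validate_sprint_kpis({
    "risk_detection_accuracy": 0.92,
    "simulation_accuracy": 0.83,
    "resource_efficiency_gain": 0.15,
})
print("KPIs below target:", failures or "none")
```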

 

Results Evaluation and Validation

To assess the operational performance of the Digital Twin Technology (DTT) framework, an Agile simulation scenario was designed around a typical IT project lifecycle, covering sprint planning, development, testing, and deployment. During execution, the AI engine detected a testing bottleneck based on sprint velocity and task history, triggering an alert to redistribute tasks and adjust the sprint timeline. If applied live, this could have saved up to five working days, showcasing the framework's potential for proactive governance.
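The bottleneck alert described above could, in principle, be driven by a rule as simple as comparing the QA queue against projected QA throughput. The Python sketch below is an illustrative heuristic under assumed inputs, not the authors' AI engine.

```python
# Illustrative sketch (not the authors' engine) of flagging a testing bottleneck:
# if tasks waiting for QA exceed the QA capacity remaining in the sprint, raise an
# alert recommending task redistribution.
def detect_testing_bottleneck(waiting_for_qa: int, qa_throughput_per_day: float,
                              days_left_in_sprint: int) -> dict:
    projected_capacity = qa_throughput_per_day * days_left_in_sprint
    backlog_overflow = waiting_for_qa - projected_capacity
    if backlog_overflow <= 0:
        return {"bottleneck": False}
    return {
        "bottleneck": True,
        "overflow_tasks": round(backlog_overflow, 1),
        "recommendation": "Reassign testers or extend the testing window",
    }


# Example: 14 tasks queued, QA clears ~2 per day, 4 days remain -> alert is raised.
print(detect_testing_bottleneck(waiting_for_qa=14, qa_throughput_per_day=2.0,
                                days_left_in_sprint=4))
```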

 

The simulation was visualised through a non-functional prototype dashboard, illustrating metrics like sprint progress, projected timelines, resource heatmaps, and task views. The interface included a feedback panel and was designed for accessibility, featuring role-based views for both technical and non-technical users.

 

Key Simulation Results

Baseline – Project on Track

In the initial simulation state, the dashboard showed the project as "On Track." All progress indicators were green, with the project forecast to finish five days ahead of schedule. Sprint progress stood at 65%, and all metrics indicated minimal risk. AI-suggested mitigations, such as adding a QA resource and initiating parallel deployment preparation, had already been implemented. Schedule, Resource, and Scope Risks were all at 0%, illustrating the digital twin operating in a steady state and pre-emptively managing risks.

 

Figure 4: Baseline steady state

 

Risk Escalation Due to Inaction

As simulated project stressors were introduced and no mitigations were applied, the digital twin flagged early warnings. The project status shifted to "Possible Delays" with a projected five-day slip. Resource Risk rose to 90%, and the testing and deployment phases began to diverge from the baseline. This scenario demonstrates the twin's ability to forecast escalating issues based on resource allocation patterns.

 

Figure 5: Risk escalates with delayed action

 

Partial Mitigation – Moderate Recovery

When only one AI-recommended mitigation (e.g., adding a QA resource) was activated, Resource Risk dropped to 0%. The project timeline improved but remained four days behind schedule, and Schedule Risk stayed moderate. This shows that single-point interventions can address specific bottlenecks but do not ensure full recovery.

 

Combined Mitigation – Strong Recovery

Activating multiple mitigations, such as removing Feature X and enabling parallel deployment, resulted in a strong recovery. Schedule Risk dropped to 30%, and the forecast improved to just three days behind schedule. The scenario highlights the comprehensive benefit of timely, multi-pronged interventions.

 

Comparative Analysis: Recovery vs. Worst-Case

Simulations confirm that mitigation timing is critical. In the worst-case scenario, where no actions were taken, the project delay reached 12 days, and all risks spiked to 100%. This underlines the digital twin’s ability to forecast divergent paths and the consequences of inaction.

 

Figure 6: Worst-Case Scenario with no intervention

 

Interpretation and Impact

The simulations confirmed the DTT framework's ability to predict risks, support informed decisions, and enhance IT project governance. Early detection (A2), timely interventions (A4–A5), and the poor outcomes that followed delays (A7–A8) demonstrated its strategic value. Unlike static dashboards, the digital twin adapts to live data, updates in real time, and fits well with Agile methods by helping managers see risks, test options, and plan effectively. The Pilot–Train–Scale method allows step-by-step adoption, and tools like Jira, MS Project, and Azure DevOps can connect with its flexible modules (simulation, analytics, AI, and visuals). Overall, DTT boosts project clarity, flexibility, and success, making it a useful tool in fast-paced IT environments.

 

Framework Validation with KPIs Outcome and Efficiency Gains

To test how well the DTT framework works, a simulation was run using key performance indicators (KPIs). These included how well it detects risks, how fast the AI responds, how reliable the simulations are, how efficiently resources are used, and how satisfied users are. The results were strong: about 92% of risks were caught, the AI gave helpful suggestions in under three seconds, and users rated their satisfaction 8.5 out of 10. Resource use improved by 15%, and dashboard updates had a delay of under two seconds. Overall, the results show that the framework can boost teamwork, improve accuracy, and speed up decisions in IT projects.

 

Table 3: KPI metrics and respective outcomes

KPI Metric | Target Value | Simulation Result | Impact
Risk Detection Accuracy | ≥ 90% | ~92% | Enabled proactive mitigation of sprint risks
Simulation Accuracy | ≥ 85% | 85–95% | Reliable forecasting of project timelines and outcomes
AI Response Time | ≤ 3 seconds | < 3 seconds | Supported real-time Agile adjustments and planning
Data Ingestion Latency | < 2 seconds | < 2 seconds | Maintained up-to-date dashboards using synced Jira data
Resource Efficiency Gain | +15% | ~15% | Enhanced productivity and reduced idle time
Stakeholder Satisfaction | NPS ≥ 50 or Score ≥ 8/10 | ~8.5/10 | High levels of user approval and interface usability
User Adoption Rate | ≥ 80% within 6–12 months | Conceptually predicted high | Scalability and relevance across project domains
System Availability | ≥ 99.5% | Conceptually achieved | Demonstrated reliability during iterative Agile cycles

Source: Authors

 

These results show that the framework fits well with Agile principles such as continuous delivery, openness, and flexibility, thereby setting the foundation for further exploration in Section 5.2, where alignment with the research objectives and questions is explicitly discussed.

DISCUSSION

The framework met all of its objectives: simulations accurately predicted project bottlenecks from real-time data patterns; the interactive dashboard enhanced decision-making under constrained conditions; and AI-recommended mitigation strategies significantly reduced risk and improved delivery.

 

DTT simulates IT project workflows to improve outcomes

The DTT framework was designed as a modular system integrating data ingestion, simulation engines, AI decision support, and real-time dashboards. Simulation results demonstrated its ability to identify testing bottlenecks, forecast delivery deviations, and suggest task reallocations, resulting in a potential five-day reduction in project duration. The model's feedback loops outperformed traditional tools by allowing quick updates and improving sprint performance. The current research supports the existing body of evidence suggesting that digital twins allow better prediction, planning, and continuous adjustment of information technology (IT) projects (Guillaume-Joseph and Wasek, 2015; Arin et al., 2023; Wong et al., 2024).

 

Challenges and benefits from adopting DTT in IT project management

The experimental and simulation work highlighted both the potential benefits of implementing Digital Twins (DTs) and the obstacles that accompany them. The main advantages are enhanced forecasting, real-time monitoring, and more informed decision-making. However, DT adoption is associated with several hurdles, such as data security issues, complex configuration, and stakeholder objections. Illustrative case studies from Cisco, Siemens, IBM, and Azure DevOps show that, in some settings, the framework works successfully. In particular, the introduction of artificial intelligence (AI) allowed teams to receive sprint suggestions within three seconds, while feedback-based technologies concurrently strengthened team trust and supported ongoing learning. These results are in line with Temitope (2020), who emphasised the impact of online tools on group cooperation and project performance. Easy-to-use visualisation tools further improved model fidelity and interdisciplinary collaboration, and reinforced the Agile values of transparency and shared responsibility.

 

Strategies that facilitate the effective integration of DTT into IT project management practices

 

The effective integration methods identified in this study include modular deployment, role-based training, adaptive simulation engines, and UI-integrated feedback mechanisms. The Pilot–Train–Scale model was introduced as a realistic way to monitor performance and introduce digital twin technology step by step. This concurs with the adaptable, tool-based framework introduced by Singh, Weeber, and Birke (2021) for implementing digital twins in various project settings. Perera and Eadie (2023) also noted that, in Agile environments, change management and training around new technology should be evaluated carefully and tailored to functional team roles. According to Tripathi et al. (2024), strong governance and team alignment are essential in digital twin initiatives. Rather than being a fixed framework, the DTT model proved to be a flexible set of tools that promotes predictive project management in Agile environments by building cooperation and iterative feedback into its very structure.

CONCLUSION

This research explored how AI, prediction tools, and simulation might improve the management of IT projects via Digital Twin Technology (DTT). The results show that a properly configured DTT, fed with real-time data and smart predictions, can make a real difference to IT project management.

 

Recommendations

To ensure that predictions and system integration succeed in real IT projects, pilot testing should be carried out in genuine operational settings (Semeraro et al., 2021). Further, Van der Valk et al. (2020) suggest that such pilots should include structured sessions in which users engage with AI-generated recommendations in a way that mirrors their daily work. User feedback should also be collected regularly to guide improvements, and future versions should clearly distinguish between testing the concept and testing how it works in practice.

 

Tools like Jira, AWS CloudWatch, and GitLab CI/CD should be connected in real-time when building a working prototype. To ensure fast and scalable data updates, technologies such as Redis caching, OpenAPI, and Kafka streaming should be used (Nilsson and Demir, 2023). In future versions, interface instructions should be added, and the design should move from tools like Figma to low-code or simulation platforms to fix issues with usability and function. This would allow for full workflow testing, better data handling, and smooth system operation.
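As one hedged illustration of the caching recommendation above, the Python sketch below keeps the latest twin snapshot in Redis with a short expiry so dashboard reads stay fast; the key layout and fields are hypothetical, and the source of truth (e.g., Jira) would be synced by a separate process.

```python
# Minimal sketch of Redis caching for fast dashboard updates (illustrative only).
# The key layout and state fields are hypothetical assumptions.
import json

import redis

cache = redis.Redis(host="localhost", port=6379, decode_responses=True)


def publish_dashboard_state(project_id: str, state: dict) -> None:
    # Cache the latest twin snapshot with a short TTL so stale data expires on its own.
    cache.set(f"dtt:dashboard:{project_id}", json.dumps(state), ex=120)


def read_dashboard_state(project_id: str) -> dict | None:
    raw = cache.get(f"dtt:dashboard:{project_id}")
    return json.loads(raw) if raw else None


publish_dashboard_state("PROJ-1", {"status": "on_track", "schedule_risk": 0.0})
print(read_dashboard_state("PROJ-1"))
```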

 

Widely accepted techniques such as Monte Carlo simulation, regression forecasting, or supervised learning models should be adopted explicitly to build and test the predictive models (Ragazzini et al., 2024). Cross-functional teams should oversee system setup, rule compliance, and staff training. Important technical issues, such as keeping data up to date, managing software licenses, and dealing with API limits, must be handled early (Iumanova et al., 2024). Future research should also explore digital twin systems that work across multiple teams and locations to support decentralized data analysis (Olsson and Axelsson, 2023).
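To illustrate the supervised-learning option named above, the sketch below trains a logistic regression on hypothetical historical sprint records to estimate the probability that a planned sprint will slip; the features, data, and accuracy figure are illustrative placeholders, not results from this study.

```python
# Minimal sketch of a supervised predictive model: logistic regression on
# hypothetical sprint records to predict whether a sprint will slip.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import accuracy_score
from sklearn.model_selection import train_test_split

# Features per sprint: [planned story points, team size, open defects at sprint start]
X = np.array([[30, 5, 2], [45, 5, 8], [25, 4, 1], [50, 6, 10],
              [35, 5, 3], [55, 6, 12], [28, 4, 2], [48, 5, 9]])
y = np.array([0, 1, 0, 1, 0, 1, 0, 1])  # 1 = sprint slipped, 0 = delivered on time

X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

print("Hold-out accuracy:", accuracy_score(y_test, model.predict(X_test)))
print("Slip probability for a new sprint:", model.predict_proba([[52, 5, 11]])[0, 1])
```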

 

Limitations

The framework was validated through conceptual simulation and a non-functional prototype rather than live deployment, so its reported benefits remain indicative. Future efforts should focus on empirical validation through pilot projects and controlled testing in real IT environments. Addressing barriers such as tool interoperability, data latency, and user training will be essential for real-world scalability.

REFERENCES
  1. Ali, Z., Biglari, R., Denil, J., Mertens, J. and Poursoltan, M. (2024) ‘From modelling and simulation to Digital Twin: evolution or revolution?’, Simulation, 100(7), pp. 751–769. Available at: https://doi.org/10.1177/00375497241234680.
  2. Almalki, S.S. (2025) 'AI-Driven Decision Support Systems in Agile Software Project Management: Enhancing Risk Mitigation and Resource Allocation', Systems, 13(3), pp. 208. Available at: https://doi.org/10.3390/systems13030208.
  3. Arin, I.A., Warnars, H.L.H.S. and Murad, D.F. (2023) ‘A Systematic Literature Review of Recent Trends and Challenges in Digital Twin Implementation’, 2023 10th International Conference on ICT for Smart Society (ICISS). Available at: https://doi.org/10.1109/ICISS59129.2023.10291219.
  4. Attah, R.U. et al. (2024) 'Best practices in project management for technology-driven initiatives: A systematic review of market expansion and product development technique', Int J Eng Res Dev, 20(11), pp. 1350–1361.
  5. Attaran, M. and Celik, B.G. (2023) ‘Digital Twin: Benefits, use cases, challenges, and opportunities’, Decision Analytics Journal, 6, p. 100165. Available at: https://doi.org/10.1016/j.dajour.2023.100165.
  6. Auger, T. and Saroyan, E. (2024) 'Overview of the OpenAI APIs', in Generative AI for Web Development: Building Web Applications Powered by OpenAI APIs and Next.js. Springer, pp. 87–116. Available at: https://doi.org/10.1007/979-8-8688-0885-2_6.
  7. Baghizadeh, Z., Cecez-Kecmanovic, D. and Schlagwein, D. (2020) ‘Review and critique of the information systems development project failure literature: An argument for exploring information systems development project distress’, Journal of Information Technology, 35(2), pp. 123–142. Available at: https://doi.org/10.1177/0268396219832010.
  8. Barricelli, B.R., Casiraghi, E. and Fogli, D. (2019) ‘A Survey on Digital Twin: Definitions, Characteristics, Applications, and Design Implications’, IEEE Access, 7, pp. 167653–167671. Available at: https://doi.org/10.1109/ACCESS.2019.2953499.
  9. Barros, L., Tam, C. and Varajão, J. (2024) 'Agile software development projects–Unveiling the human-related critical success factors', Information and Software Technology, 170, pp. 107432. Available at: https://doi.org/10.1016/j.infsof.2024.107432.
  10. Biesialska, K., Franch, X. and Muntés-Mulero, V. (2021) ‘Big Data analytics in Agile software development: A systematic mapping study’, Information and Software Technology, 132, p. 106448. Available at: https://doi.org/10.1016/j.infsof.2020.106448.
  11. Botín-Sanabria, D.M. et al. (2022) 'Digital Twin Technology Challenges and Applications: A Comprehensive Review', Remote Sensing (Basel, Switzerland), 14(6), pp. 1335. Available at: https://doi.org/10.3390/rs14061335.
  12. Bravo, A. and Vieira, D. (2023) ‘Digital Twin Technology as a Tool to Enhance the Performance of Agile Project Management’, in Digital Twin Technology – Fundamentals and Applications. IntechOpen. Available at: https://www.intechopen.com/chapters/87801.
  13. Bussa, S. and Hegde, E. (2024) ‘Evolution of data engineering in modern software development’, Journal of Sustainable Solutions. Available at: https://pdfs.semanticscholar.org/0103/179bfc37357c77ce545edd455f3538a3538c.pdf.
  14. Dalibor, M. et al. (2022) 'A cross-domain systematic mapping study on software engineering for digital twins', Journal of Systems and Software, 193, pp. 111361. Available at: https://doi.org/10.1016/j.jss.2022.111361.
  15. Dobbelaere, P. and Esmaili, K.S. (2017) 'Kafka versus RabbitMQ: A comparative study of two industry reference publish/subscribe implementations: Industry paper', Proceedings of the 11th ACM International Conference on Distributed and Event-Based Systems, pp. 227–238. Available at: https://doi.org/10.1145/3093742.3093908.
  16. Dorrer, M.G. (2020) 'The digital twin of the business process model', Journal of Physics.Conference Series, 1679(3), pp. 32096. Available at: https://doi.org/10.1088/1742-6596/1679/3/032096.
  17. Fuller, A.  et al. (2020) 'Digital Twin: Enabling Technologies, Challenges and Open Research', IEEE Access, 8, pp. 108952–108971. Available at: https://doi.org/10.1109/ACCESS.2020.2998358.
  18. Guerra-Zubiaga, D. et al. (2021) 'An approach to develop a digital twin for industry 4.0 systems: manufacturing automation case studies', International Journal of Computer Integrated Manufacturing, 34(9), pp. 933–949. Available at: https://doi.org/10.1080/0951192X.2021.1946857.
  19. Guillaume-Joseph, G. and Wasek, J. S.  (2015) 'Improving software project outcomes through predictive analytics: Part 2', IEEE Engineering Management Review, 43(3), pp. 39–49. Available at: https://doi.org/10.1109/EMR.2015.2469471.
  20. Guinea-Cabrera, M. and Holgado-Terriza, J. (2024) 'Digital Twins in Software Engineering, A Systematic Literature Review and Vision', Applied Sciences, 14(3), pp. 977. Available at: https://doi.org/10.3390/app14030977.
  21. Hu, W. et al. (2021) 'Digital twin: A state-of-the-art review of its enabling technologies, applications and challenges', Journal of Intelligent Manufacturing and Special Equipment, 2(1), pp. 1–34. Available at: https://doi.org/10.1108/JIMSE-12-2020-010.
  22. Iliuţă, M. et al. (2024) 'Digital Twin—A Review of the Evolution from Concept to Technology and Its Analytical Perspectives on Applications in Various Fields', Applied Sciences, 14(13), pp. 5454. Available at: https://doi.org/10.3390/app14135454.
  23. Iumanova, I.F., Matrenin, P.V. and Khalyasmaa, A.I. (2024) ‘Review of Existing Tools for Software Implementation of Digital Twins in the Power Industry’, Inventions, 9(5), p. 101. Available at: https://doi.org/10.3390/inventions9050101.
  24. Jeong, D. et al. (2022) 'Digital twin: Technology evolution stages and implementation layers with technology elements', IEEE Access, 10, pp. 52609–52620. Available at: https://doi.org/10.1109/ACCESS.2022.3173010.
  25. John, S. A. et al. (2025) ‘Adoption of AI-driven fraud detection system in the Nigerian banking sector: An analysis of cost, compliance, and competency’, Economic Review of Nepal, 8(1), pp. 16–33. https://doi.org/10.3126/ern.v8i1.80740
  26. Jonkers, R. K. and Eftekhari Shahroudi, K. (2021) 'A Design Change, Knowledge, and Project Management Flight Simulator for Product and Project Success', IEEE Systems Journal, 15(1), pp. 1130–1139. Available at: https://doi.org/10.1109/JSYST.2020.3006747.
  27. Kabanda, G. (2020) 'An evaluation of big data analytics projects and the project predictive analytics approach', Oriental Journal of Computer Science and Technology, 12(4), pp. 132–146. Available: http://dx.doi.org/10.13005/ojcst12.04.01.
  28. Kherbache, M., Maimour, M. and Rondeau, E. (2022) 'Network digital twin for the industrial internet of things', 2022 IEEE 23rd International Symposium on a World of Wireless, Mobile and Multimedia Networks (WoWMoM). IEEE Access.
  29. Kimseng, N. et al. (2023) 'UI/UX Development Using Figma based on Inclusive Design', JINAV: Journal of Information and Visualization, 4(2), pp. 227–234.
  30. Kober, C. et al. (2024) 'Digital Twins: A Critical Perspective and Research Trends', 2024 IEEE International Conference on Industrial Engineering and Engineering Management (IEEM). Available at: https://doi.org/10.1109/IEEM62345.2024.10857032.
  31. Liu, Z. et al. (2023) 'Digital twin for predictive maintenance', NDE 4.0, Predictive Maintenance, Communication, and Energy Systems: The Digital Transformation of NDE, 12489, pp. 27–37.
  32. Lo, C.K., Chen, C. and Zhong, R.Y. (2021) 'A review of digital twin in product design and development', Advanced Engineering Informatics, 48, pp. 101297.
  33. Madni, A.M., Madni, C.C. and Lucero, S.D. (2019) 'Leveraging digital twin technology in model-based systems engineering', Systems (Basel), 7(1), pp. 7. Available at: https://doi.org/10.3390/systems7010007.
  34. Makris, A., Tserpes, K., Spiliopoulos, G., Zissis, D. and Anagnostopoulos, D. (2021) 'MongoDB vs PostgreSQL: A comparative study on performance aspects', GeoInformatica, 25(2), pp. 243–268. Available at: https://doi.org/10.1007/s10707-020-00407.
  35. Moroney, L. (2017) 'An introduction to Firebase', The Definitive Guide to Firebase: Build Android Apps on Google's Mobile Platform, pp. 1–24.
  36. Mohamed, N. and Al-Jaroodi, J. (2024) 'Predictive Analytics for Digital Twins: The Concept and Systems Applications', 2024 7th International Conference on Information and Computer Technologies (ICICT). Available at: https://doi.org/10.1109/ICICT62343.2024.00069.
  37. Nilsson, E. and Demir, D. (2023) 'Performance comparison of REST vs GraphQL in different web environments: Node.js and Python'. Available at: https://urn.kb.se/resolve?urn=urn:nbn:se:lnu:diva-121877.
  38. Oettl, F.  et al. (2023) 'From Data to Decisions: A Method for Evaluating the Strategic Value of Digital Twins', 2023 3rd International Conference on Electrical, Computer, Communications and Mechatronics Engineering (ICECCME). Available at: https://doi.org/10.1109/ICECCME57830.2023.10252781.
  39. Olsson, T. and Axelsson, J. (2023) 'Systems-of-systems and digital twins: A survey and analysis of the current knowledge', 2023 18th Annual System of Systems Engineering Conference (SoSe). IEEE Available at: https://doi.org/10.1109/SoSE58556.2023.10178527.
  40. Ozkaya, M. (2019) 'Are the UML modelling tools powerful enough for practitioners? A literature review', IEEE Software, 13(5), pp. 338–354.
  41. Perera, S. and Eadie, R. (2023) Managing Information Technology Projects: Building a Body of Knowledge in IT Project Management. World Scientific. Available at: https://books.google.com/books?id=g2S5EAAAQBAJ.
  42. Perno, M., Hvam, L. and Haug, A. (2022) 'Implementation of digital twins in the process industry: A systematic literature review of enablers and barriers', Computers in Industry, 134, pp. 103558.
  43. Peyman, M. et al. (2021) 'A tutorial on how to connect Python with different simulation software to develop rich simheuristics', 2021 Winter Simulation Conference (WSC). Available at: https://ieeexplore.ieee.org/abstract/document/9715511/.
  44. Qi, Q. et al. (2021) 'Enabling technologies and tools for digital twin', Journal of Manufacturing Systems, 58, pp. 3–21. Available at: https://doi.org/10.1016/j.jmsy.2019.10.001.
  45. Ragazzini, L. et al. (2024) 'Digital Twin-based bottleneck prediction for improved production control', Computers & Industrial Engineering, 192, pp. 110231. Available at: https://doi.org/10.1016/j.cie.2024.110231.
  46. Rana, R.P.S. (2024) 'The User's Journey: A Historical Perspective on UI/UX Evolution', 2024 3rd Edition of IEEE Delhi Section Flagship Conference (DELCON). Available at: https://ieeexplore.ieee.org/abstract/document/10866504/.
  47. Rasheed, A., San, O. and Kvamsdal, T. (2020) 'Digital twin: Values, challenges and enablers from a modelling perspective', IEEE Access, 8, pp. 21980–22012.
  48. Rayhana, R.  et al. (2024) 'Digital Twin Models: Functions, Challenges, and Industry Applications', IEEE Journal of Radio Frequency Identification, 8, pp. 282–321. Available at: https://doi.org/10.1109/JRFID.2024.3387996.
  49. Reiche, F. and Timinger, H.  (2021) 'Process Model for Integrated Product Lifecycles Using Digital Twins and Predictive Analytics', 2021 IEEE Technology & Engineering Management Conference - Europe (TEMSCON-EUR). Available at: https://doi.org/10.1109/TEMSCON-EUR52034.2021.9488653.
  50. Schranz, C., Strohmeier, F. and Damjanovic-Behrendt, V.  (2020) 'A Digital Twin Prototype for Product Lifecycle Data Management', 2020 IEEE/ACS 17th International Conference on Computer Systems and Applications (AICCSA). Available at: https://doi.org/10.1109/AICCSA50499.2020.9316506.
  51. Schuh, G., Riesener, M. and Dölle, C. (2017) 'Implementation and assessment of a predictive analytics model for development project management', 2017 IEEE International Conference on Industrial Engineering and Engineering Management (IEEM). Available at: https://doi.org/10.1109/IEEM.2017.8289980.
  52. Schützko, F. and Timinger, H.  (2023) 'Predictive analytics for project management', 2023 IEEE International Conference on Engineering, Technology and Innovation (ICE/ITMC). Available at: https://doi.org/10.1109/ICE/ITMC58018.2023.10332400.
  53. Semeraro, C. et al. (2021) 'Digital twin paradigm: A systematic literature review', Computers in Industry, 130, pp. 103469.
  54. Sharma, A. et al. (2022) 'Digital twins: State of the art theory and practice, challenges, and open research questions', Journal of Industrial Information Integration, 30, pp. 100383.
  55. Sharma, S. et al. (2024) 'Modern Backend Development Technologies: A Comparative Review and Case Study', International Conference on Emerging Trends in Expert Applications and Security (pp. 139–151). Springer Nature. https://doi.org/10.1007/978-981-97-3745-1_12.
  56. Singh, P.D. and Tripathi, V. (2024) 'Digital Twins: A Comprehensive Study on Models, Platforms, Applications and Challenges', 2024 11th International Conference on Computing for Sustainable Global Development (INDIACom). IEEE
  57. Singh, S., Weeber, M. and Birke, K. (2021) 'Advancing digital twin implementation: a toolbox for modelling and simulation', Procedia CIRP, 99, pp. 567–572. Available at: https://doi.org/10.1016/j.procir.2021.03.078.
  58. Stige, Å et al. (2024) 'Artificial intelligence (AI) for user experience (UX) design: a systematic literature review and future research agenda', Information Technology & People, 37(6), pp. 2324–2352.
  59. Tao, F. et al. (2022) 'Digital twin modelling', Journal of Manufacturing Systems, 64, pp. 372–389.
  60. Tatineni, S. (2023) 'Cloud-Based Reliability Engineering: Strategies for Ensuring High Availability and Performance', International Journal of Science and Research (IJSR), 12(11), pp. 1005–1012.
  61. Temitope, A.O. (2020) 'Software Adoption in Project Management and Their Impact on Project Efficiency and Collaboration'.
  62. Tripathi, N. et al. (2024) 'Stakeholders collaborations, challenges and emerging concepts in digital twin ecosystems', Information and Software Technology, pp. 107424.
  63. van der Valk, H. et al. (2020) 'Digital twins in simulative applications: A taxonomy', 2020 Winter Simulation Conference (WSC). IEEE
  64. VanDerHorn, E. and Mahadevan, S. (2021) 'Digital Twin: Generalization, characterization and implementation', Decision Support Systems, 145, pp. 113524. Available at: https://doi.org/10.1016/j.dss.2021.113524.
  65. Walter, J. and Barkema, G.T. (2015) 'An introduction to Monte Carlo methods', Physica A: Statistical Mechanics and its Applications, 418, pp. 78–87.
  66. Wang, K. et al. (2022) 'A review of the technology standards for enabling digital twin', Digital Twin, 2, pp. 4.
  67. Windapo, A. et al. (2023) 'The Efficacy of Innovative Project Management Tools in Mitigating Risks and Uncertainty in the Project Delivery Process', E3S Web of Conferences, 409, pp. 6017. Available at: https://doi.org/10.1051/e3sconf/202340906017.
  68. Wong, W.Y. et al. (2024) 'A Review of Integrating Digital Twin Technology to Integrated Software Change Control Management: Six Sigma Practice and Approach', 2024 IEEE 6th Symposium on Computers & Informatics (ISCI). IEEE Available at: https://doi.org/10.1109/ISCI62787.2024.1066.7942.
  69. Yang, H. (2023) 'Design Patterns of Vue.js 3', in Vue.js Framework: Design and Implementation. Springer, pp. 33–45.
  70. Yasin, A. et al. (2021) 'A Roadmap to Integrate Digital Twins for Small and Medium-Sized Enterprises', Applied Sciences, 11(20), pp. 9479. Available at: https://doi.org/10.3390/app11209479.