Article

Generic Multi-Layered Digital-Twin-Framework-Enabled Asset Lifecycle Management for the Sustainable Mining Industry

1 Laboratory of Industrial Engineering (LGIIS), Faculty of Science and Technology, University Sultan Moulay Slimane (USMS), Beni Mellal 23000, Morocco
2 Green Tech Institute (GTI), Mohammed VI Polytechnic University (UM6P), Benguerir 43150, Morocco
* Author to whom correspondence should be addressed.
Sustainability 2023, 15(4), 3470; https://doi.org/10.3390/su15043470
Submission received: 15 January 2023 / Revised: 3 February 2023 / Accepted: 7 February 2023 / Published: 14 February 2023
(This article belongs to the Special Issue Industry 4.0 Technologies for Sustainable Asset Life Cycle Management)

Abstract: In the era of digitalization, many technologies are evolving, namely, the Internet of Things (IoT), big data, cloud computing, artificial intelligence (AI), and the digital twin (DT), which has gained significant traction in a variety of sectors, including the mining industry. The use of DT in the mining industry is driven by its potential to improve efficiency, productivity, and sustainability by monitoring performance, simulating results, and predicting errors and yield. Additionally, the increasing demand for individualized products highlights the need for effective management of the entire product lifecycle, from design to development, modeling, simulation, prototyping, maintenance and troubleshooting, commissioning, targeting the market, use, and end-of-life. However, the problem to be overcome is how to successfully integrate the DT into the mining business. This paper intends to shed light on the state of the art of DT case studies focusing on concept, design, and development. The DT reference architecture model in Industry 4.0 and value-lifecycle-management-enabled DT are also discussed, and a DT multi-layered architecture framework for the mining industry is proposed to inspire future case studies.

1. Introduction

Considering the competitiveness of the global market, businesses must create data-oriented interactions to meet sustainability goals, which play a relevant role in all phases of asset lifecycle management [1]. In the early phases of a product’s development, several issues need to be addressed, including the product’s design, manufacture, usage, disposal, and its impact on society. Each of these stages has its own specific sustainability considerations and measurements. As an area of interest in digital transformation, from manufacturing to service and operations, the Fourth Industrial Revolution (Industry 4.0) presents the promise of enhanced flexibility, higher quality, and improved productivity [2].
Every physical asset needs a digital representation in order to fully realize the potential of Industry 4.0. Mirrored digital representations of actual assets can be quite valuable for tackling difficult business problems. One of the most desired qualities in the age of Industry 4.0 is hence the ability to deliver distinctive characteristics at scale. The fast advancement of communication and information technology has made it possible for digitalization to change how business is conducted along industrial value chains; this process is also referred to as “Industry 4.0”, or “the fourth industrial revolution”. The goals include highly customized and optimized manufacturing as well as enhanced automation and adaptability. The industrial sector is already moving toward more sustainable practices by suggesting a number of strategies to improve the effectiveness of environmentally friendly activities. The digital twin (DT) is one of the most promising enabling technologies for achieving these Industry 4.0 ambitions. A DT is a digital representation of a physical entity that maintains a two-way dynamic mapping between the physical thing and its digital model, which is structured as linked parts and meta-information [3]. A framework for digital twin manufacturing is described in the ISO 23247 series as virtual representations of actual manufacturing components such as workers, goods, assets, and process specifications.
The present work aims to provide a new approach based on a multi-layered DT framework for enabling asset lifecycle management that would support sustainability in the mining industry. It answers several challenges faced at the mine, such as big data volumes, the heterogeneous types of the gathered data, a supporting infrastructure that must comply with specific requirements (communication links, distributed IoT components, big data processing, and real-time response), the high cost of maintenance, and the complexity of the process.
The increasing demand for customized products is having an impact on all phases of the product lifecycle, which challenges production. By using engineering and management technologies such as the digital twin, digitization creates a variety of options. This study proposes a general DT architecture framework to fully utilize DT capabilities in response to issues in the mining industry. The framework intends to accomplish sustainable mining by building on RAMI 4.0, managing assets throughout their lifecycles, and creating a collaborative ecosystem that incorporates four services.
As shown in Figure 1, this paper is structured as follows: After the introduction, Section 2 describes the research methodology. Section 3 provides the state of the art in terms of Industry 4.0 and the mining industry. Section 4 discusses the DT reference architecture model in Industry 4.0, the integration levels of DT technology, and its potential for enabling value lifecycle management and creating a Phygital (physical–digital) collaborative environment. Section 5 proposes a DT multi-layered reference architecture for the mining industry as the leading contribution of this study; on top of this, a discussion debates the features to consider when designing a DT framework architecture and the advantages of the proposed approach to creating a DT for the mining industry. The final section concludes and outlines future work.

2. Research Methodology

The methodology followed in this research aims to create a digital twin framework for asset lifecycle management in the mining industry and comprises several steps. Firstly, an exploratory literature review was conducted to examine the prevalent services, concepts, architectures, and frameworks related to DTs within the context of manufacturing, including engineering, production, process, and operations. This literature review served as a foundation for the conceptualization of the proposed digital twin framework. Secondly, focus groups were organized to gather information and feedback on the theme of digitalization within the mining industry. These focus groups were composed of experts and stakeholders from the mining industry, including managers, engineers, and technicians, and the information collected provided valuable insights into the challenges and opportunities related to digitalization within the mining industry. Thirdly, based on the findings of the literature review and focus groups, the study identified the challenges faced by the mining industry in integrating digitalization, specifically digital twin technology, and listed the promising benefits of using this technology for asset lifecycle management. Fourthly, using the information gathered in the previous steps, a conceptual framework architecture of DT-enabled asset lifecycle management was developed. This framework architecture adheres to the standards and concepts of the reference architecture model in Industry 4.0 (RAMI 4.0) and was designed to be highly flexible and adaptable to the specific conditions and challenges of the mining industry. Lastly, to validate the proposed framework, the experimental open pit mine of OCP Benguerir was selected as the case study site. The framework was implemented in the mine, and stakeholders, including managers, engineers, and technicians, were involved throughout the process. The results of the case study provided valuable feedback on the performance and effectiveness of the proposed digital twin framework. In conclusion, this comprehensive methodology provides a robust and reliable approach for the development and validation of the proposed digital twin framework for asset lifecycle management within the mining industry.

3. Mining 4.0: The State of the Art

3.1. Industry 4.0 toward 5.0

The emergence of Industry 5.0 is predicated on the observation, or assumption, that Industry 4.0 is less concerned with the original principles of social justice and sustainability and more concerned with digitalization and AI-driven technologies for enhancing the efficiency and flexibility of production [4]. The DT is viewed as a crucial enabler in Industry 5.0 for developing intelligent, autonomous systems that can interact and communicate with one another, resulting in a highly connected and highly optimized manufacturing process. DTs can be used to simulate and model intricate physical systems, including their individual parts and interactions, and these data can then be used to forecast future performance and pinpoint potential improvement areas. The DT also enables the remote management and control of physical systems, which lowers the demand for human involvement and generally boosts productivity and efficiency. By integrating digital and virtual solutions in multiple industries, smart features such as production prediction, energy efficiency optimization, and better maintenance scheduling become possible; these smart features are leading industry towards 5.0, where processes can become more autonomous at multiple levels, creating a full synergy between humans and robots working together [5]. However, many challenges are faced when implementing these smart features, for example, the lack of skills of the integrators, because the newest technologies require time to be understood and mastered by the technicians and engineers who are responsible for this digital transition in industry. Data protection and cybersecurity also create big challenges in industrial case studies, because networking in process automation is very basic compared to Internet and cloud computing networks; therefore, cyber attackers are often able to find vulnerabilities in industrial networks. As a solution to some vulnerabilities in cyber systems, novel blockchain-based components have been introduced in recent works, for example, smart meters, which are applied in different industries [6]. Other challenges, such as culture change and capital investment, create big obstacles, since those responsible for the industrial process are hard to convince to invest in digital transformation [7].
Industry 4.0 has become a living context in which some industries have successfully implemented most of its features and are ready for a fully autonomous Industry 5.0; in other industries, however, many challenges have recently been discussed and case studies are being developed, such as integrating artificial intelligence, the Internet of Things, and machine learning. Additionally, the DT is a smart feature enabling Industry 4.0 that researchers are currently developing. Aksa [8] has covered a wide range of research topics, from intelligent information management of complex models to building information management and the interaction of building systems, where researchers are becoming more interested in using the DT to manage their information and in developing new research lines focused on data exchange and the interoperability of building information modeling (BIM) and facility management (FM) [9]. On the other hand, Giulio [10] has discussed DT studies and applications in the fields of engineering and computer science, as well as identified research hotspots and emerging trends, which were designed and tested to help operators in both regular and emergency situations and to improve their capacity to regulate safety levels. According to Rui [11], who developed a systematic meta-review of the sustainability requirements of DT-based applications, the currently perceived benefits of DTs are not well understood, DTs across the product lifecycle or the DT lifecycle have not received enough research, and how DTs can help with decision making or cost reduction remains unclear. It is necessary to improve and better integrate DT technology implementation into the IoT.
As a matter of fact, DTs are now being used by 13% of companies implementing IoT projects, while a further 62% are either deploying DTs now or have plans to do so, according to Gartner. The DT market is anticipated to grow from USD 3.8 billion in 2019 to USD 35.8 billion by 2025, at a compound annual growth rate (CAGR) of 37.8%, according to a Markets and Markets report [12].
Table 1 highlights some DT applications in different industrial sectors that researchers around the world are currently developing, with multiple features such as predictive maintenance, monitoring, forecasting, diagnosis, production pace improvement, virtual reality integration, cybersecurity, and cyber-physical system integration.

3.2. Mining Industry: The Experimental Open Pit Mine

According to a TMR analysis, the worldwide smart mining market will increase at a CAGR of 10.2% between 2021 and 2031 [31], and the mining equipment market will reach USD 185 billion by 2030, growing at a 4.1% CAGR due to rapidly rising construction activities in emerging nations (exclusive report by Acumen Research and Consulting [32]); the need to pursue the digital transformation of the mining industry is therefore evident and pressing. The experimental open pit mine was originally built to extract mining goods and optimize output while minimizing energy usage and monitoring grid quality. The maintenance experts employed a strategy of curative and preventative maintenance. The supervisory control and data acquisition (SCADA) system was fully focused on process machining supervision and key production performance indicators, meaning that the field was ready to implement DT solutions in the open pit mine on top of the implemented Industry 3.0 solutions, namely, distributed control systems, industrial automation, and other enterprise resource planning (ERP) and manufacturing execution systems (MESs). In this context, Adila [33] has proposed an architecture to forecast and monitor energy consumption based on a research study on smart energy management as a feature of the digital open pit mine developed by Oussama et al. [34], where the authors described how power meters can be conclusive in an automated open pit mine. Mariya [35] has developed an online diagnostic technique for a jaw crusher, a very critical piece of equipment in the mining industry, that can be fully integrated into a DT by acquiring and pre-processing the measured vibration data. The same approach has been followed in the mining industry using machine learning algorithms for the diagnostics of squirrel cage induction motors using only electrical data [36,37]. In addition to determining energy consumption, data correlated with other smart sensors, such as moisture, temperature, and others, can also be used to assess the health index of the power transformer, one of the key components in all industries, which represents the condition of all machines in the cyber system and contributes to building a DT of all industry components [38,39].
In this case study of the open pit mine of Benguerir, the DT and its features will focus on the fixed installations, which are divided into three essential parts, namely, the destoning unit, the screening unit, and the train loading station. The destoning unit contains belt conveyors, a stone jaw crusher system, stackers, and a wheel bucket reclaimer. The screening unit, which is fed by a 1 km belt conveyor from the first unit, has screeners and a storage system that contains a transborder and a stacker machine. The train loading station has two wheel bucket reclaimer conveyors, a positioning system, and train loading on two railways. In summary, each station comprises different machines; Table 2 classifies the mining machinery in this open pit mine.

4. Digital Twin

4.1. Related Research

Since the first introduction of the terminology in 2003 by Michael Grieves [40], the DT has gained momentum in academia and industry and is currently recognized as a foremost method in the transformation process to Industry 4.0. A DT can be defined as a dynamic virtual representation of physical, economic, and/or social systems and the processes related to the system in question, enabling the tracking, adjusting, and/or predicting of its status in a real-time manner [41]. However, the DT can be considered a polysemous term, since there is no consensus yet on a unique definition of this concept. The most widely accepted definition is given by NASA, which identifies the DT as an “integrated multi-physics, multi-scale, probabilistic simulation of a vehicle or system that uses the best available physical models, sensor updates, fleet history, etc., to mirror the life of its corresponding flying twin” [42]. Beyond this, there are abundant domain- and application-specific definitions, which preclude a common understanding of this concept and impede its evolution. Indeed, the DT has spread across many domains since its first application in aerospace by NASA in 2010 to track the state of a flying spacecraft [43]. Thereafter, the DT was investigated and implemented in many sectors, including smart cities, healthcare, construction, automobiles, agriculture, manufacturing, and the aviation industry, to name but a few.
In the manufacturing context, the DT is a virtual representation that mimics the real-time operation of a physical manufacturing asset (e.g., machine, production line, shop floor, product, worker), enabling decision making, such as its real-time monitoring as well as predicting its future behavior, state, performance, and maintenance needs [24]. In [44], three major application scenarios of DT were stated, namely, supervisory—the DT provides the real-time status of the counterpart physical system, which supports the decision-making process; interactive—the DT transcends supervisory tasks by automatically adjusting a parameter or a set of parameters of the manufacturing asset once a disruption occurs; and predictive—the DT supervises and predicts the future state of the manufacturing asset so as to implement corrective measures that maintain and/or optimize the current performance. Based on these three capabilities, the DT has rejuvenated many manufacturing tasks:
(1) Equipment health management: The DT allows one to enhance the reliability, availability, and safety of manufacturing systems and workers through seamless monitoring and effective maintenance decisions based on the prognosis/diagnosis outputs of the DT. In the literature, abundant studies have leveraged the DT in equipment health management. For instance, [45] presents a model-driven approach to estimate the remaining useful life (RUL) of an equipment component based on data acquired from the physical machine’s controllers and the simulation of virtual models. Ref. [46] presents a data-driven DT for fault diagnosis under data insufficiency which is usable in both the development and maintenance stages. At first, the digital model enables an intelligent design where potential problems can be detected and solved; at the same time, a data-driven fault diagnosis model is trained on the data generated by the physics-based model. In the second stage, the previously trained diagnosis model is adjusted using transfer learning to enable timely monitoring and predictive maintenance (a minimal sketch of this two-stage idea is given after this list).
(2) Production control and optimization: With the dynamic changes and uncertainty in the manufacturing environment, the production process must be continuously monitored. Real-time data provide the DT with the awareness of the current status of the manufacturing asset providing corrective measures to optimize the overall throughput. In the literature, [47] presents a data-driven DT that dynamically optimizes controllable parameters’ values to realize production control optimization in a petrochemical industry environment. Ref. [48] presents an MES-assisted DT which can react to disturbances on the shop floor in a real-time manner, enabling error state management and reactive disassembly of assembled products once quality standards are not met.
(3) Production scheduling: The uncertainty in production processes undermines static production scheduling methods. The DT can dynamically elaborate and/or verify schedules once a disruption occurs on the shop floor. The literature on DTs in production scheduling remains nascent. For example, [49] proposes a DT-assisted dynamic production scheduling framework that allows for detecting disturbances based on the distance between physical and virtual models, as well as predicting the impact of disturbances, thus triggering rescheduling to avoid hazardous and/or cascading effects, along with the performance evaluation of elaborated schedules. Ref. [50] presents data-driven DT-based dynamic scheduling that monitors and schedules tasks for robot manufacturing systems by communicating to the robots the optimal path to complete a specific task.
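To make the two-stage idea from point (1) concrete, the sketch below first trains a diagnosis classifier on data produced by a stand-in "physics-based" generator and then refines it with a small set of real measurements through incremental learning. It is a minimal illustration under assumed feature names and fault rules, not the implementation of Ref. [46].

```python
# Minimal sketch: stage 1 trains on abundant simulated data, stage 2 adapts to scarce real data.
# Feature meanings, fault rules, and dataset sizes are illustrative assumptions.
import numpy as np
from sklearn.linear_model import SGDClassifier
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Stage 1: synthetic features (e.g., vibration RMS, current, temperature, speed) from a stand-in physics model.
X_sim = rng.normal(size=(5000, 4))
y_sim = (X_sim[:, 0] + 0.5 * X_sim[:, 2] > 0.8).astype(int)   # assumed "fault" rule

scaler = StandardScaler().fit(X_sim)
clf = SGDClassifier(random_state=0)
clf.fit(scaler.transform(X_sim), y_sim)                        # diagnosis model learned from the digital model

# Stage 2: a small batch of real measurements refines the pre-trained model (transfer step).
X_real = rng.normal(loc=0.1, size=(200, 4))
y_real = (X_real[:, 0] + 0.6 * X_real[:, 2] > 0.8).astype(int)
clf.partial_fit(scaler.transform(X_real), y_real)              # incremental update on scarce real data

print("accuracy on the real data:", clf.score(scaler.transform(X_real), y_real))
```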
Looking through the lens of the elementary components of a generic DT in the manufacturing sector, a DT must comprise the physical asset in question, a virtual asset, and a real-time two-way information flow between the physical and virtual assets. More specifically, a DT framework is grounded on IoT devices that acquire sensor readings from different sub-components of the physical asset, while maintaining a high-fidelity connection between IoT devices for reliable and timely information exchange. Data collected from different IoT devices and software are another key enabler of a DT; they are fundamental to monitoring the physical asset, maintaining its normal operation, and providing input to the decision model. Storage tools and big data analysis for extracting relevant information from data are also required within the DT. Artificial intelligence, such as machine learning, is also required to predict the future behavior of the physical asset, as well as to identify efficient mitigation strategies in abnormal situations. The privacy and security of data among the various components involved in the DT must be addressed to protect data from both external and internal attacks that may tamper with sensitive data, jeopardizing the safety of the physical asset and the workers.
DT research is growing rapidly, particularly in the manufacturing industry, where DT is viewed as a virtual representation of a physical asset utilized for real-time monitoring, decision making, and behavior prediction. However, DT is a polysemous phrase, and there is no agreement on a single definition, which makes it difficult for it to evolve. A physical asset, a virtual asset, and a real-time, two-way information exchange between the physical and virtual assets are required components of DT in manufacturing. To build shared knowledge of DT and address the difficulties in its use in the industrial sector, additional study is required despite advancements in equipment health management, production control and optimization, and production scheduling.

4.2. Digital Twin Reference Model and Architecture

4.2.1. Digital Twin Reference Architecture Model in Industry 4.0

Shortened product lifecycles, an increasingly globalized value chain, and the need to make pivotal choices quickly make all business units in the manufacturing sector more complex in Industry 4.0 [3].
To successfully integrate cutting-edge technology while navigating this complexity, it is essential to understand the concept of DTs. Furthermore, it is necessary to have a common definition when discussing Industry 4.0 and DTs; the reference architecture model for Industry 4.0 (RAMI 4.0) includes the key components of Industry 4.0 [3]. Using this framework, many perspectives on the DT can be organized methodically.
RAMI 4.0 can assist in simplifying complex tasks into packages that are well-aligned with key Industry 4.0 characteristics [51]. A digital twin reference architecture model for Industry 4.0 is shown in Figure 2. This layered model, which describes all key facets of the DT, is made up of a three-dimensional coordinate system. The first dimension describes the product lifecycle, from the first idea to the decommissioning of the product, according to IEC 62890 (industrial process measurement, control, and automation—lifecycle management for systems and components). The second dimension encapsulates the architecture layers, from the asset layer up to the business layer, in which the DT layer is utilized. The third dimension represents the level of integration of the DT layer, conforming to the data flow.
This method simplifies complex relationships into smaller, more manageable clusters, which include the agile value lifecycle, the Industry 4.0 architecture layers, and the DT integration hierarchy [52].
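As a purely illustrative aid, and not a normative RAMI 4.0 artifact, the sketch below encodes the three dimensions of the reference model as data types so that any mine asset can be tagged with its lifecycle phase, architecture layer, and digital-replica integration level; all names and the example asset are assumptions.

```python
# Illustrative encoding of the three RAMI 4.0 dimensions described above.
from dataclasses import dataclass
from enum import Enum

class LifecyclePhase(Enum):        # horizontal axis (IEC 62890 value lifecycle)
    TYPE_DEVELOPMENT = "type: development"
    TYPE_USAGE = "type: usage/maintenance"
    INSTANCE_PRODUCTION = "instance: production"
    INSTANCE_USAGE = "instance: usage/maintenance"

class RamiLayer(Enum):             # vertical axis (architecture layers)
    ASSET = 1
    INTEGRATION = 2
    COMMUNICATION = 3
    INFORMATION = 4
    FUNCTIONAL = 5
    BUSINESS = 6

class IntegrationLevel(Enum):      # data flow axis (type of digital replica)
    DIGITAL_MODEL = "manual data exchange"
    DIGITAL_SHADOW = "one-way automatic data flow"
    DIGITAL_TWIN = "two-way automatic data flow"

@dataclass
class AssetCoordinate:
    asset_id: str
    phase: LifecyclePhase
    layer: RamiLayer
    integration: IntegrationLevel

# Example: a belt conveyor instance in its usage phase, monitored as a digital shadow.
conveyor = AssetCoordinate("BC-101", LifecyclePhase.INSTANCE_USAGE,
                           RamiLayer.ASSET, IntegrationLevel.DIGITAL_SHADOW)
print(conveyor)
```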

Horizontal Axis: Agile Product Lifecycle

The central axis depicted in Figure 2 is discussed in this section. The left horizontal axis depicts a continuous, iterative lifecycle of product and service value that encapsulates a set of interrelated value creation activities ranging from prototype development to products including design, manufacturing, logistics, sales, and services. Within this axis, types and instances must be distinguished. A type is always created with ideation. This includes preliminary design, simulation, prototype development, and testing. Once design and prototyping have been finished and the manufacturing stage is triggered, a “type” becomes an “instance”.
The agile approach, as a proactive way to anticipate changes across the lifecycle, can be implemented by leveraging the capabilities of the DT (e.g., data-driven individualization) to enable self-directed and continuous learning. The agile DT value lifecycle virtually duplicates the physical world by using relevant data that describe the actual status of things, enabling incremental development. In addition, the agile value lifecycle can predict the future by using past scenarios and observing the behavior of the twin with minimal trial-and-error cost. Hence, the value lifecycle axis has an important role in this reference model, since both agile and twinning concepts are suitable for individualization. On the other hand, the DT can be involved in requirement definition and design improvement through a seamless interaction with customers, capturing their preferences regarding functionality and appearance.

Vertical Axis: RAMI4.0 Layers

The vertical axis describes the combined conceptual layered architecture enclosed by Industry 4.0 hierarchy and the DT layer, comprising the six layers associated with RAMI 4.0, which are as follows:
  • The asset layer: This corresponds to the physical entity where the DT’s physical representation is located.
  • The integration layer: Run-time data and engineering data can be separated in this layer. Run-time data are produced by sensors or events and show the physical entity’s current state; typically, they are time series data. The underlying infrastructure must adapt to application-specific requirements, such as big data processing or real-time reaction, because these data are particularly dynamic. Engineering data are often static and do not change frequently over time; examples include details about a physical component inside a plant or the topology of the plant.
  • The communication layer: Mostly, industrial communication or Industrial Internet of Things (IIoT) protocols can be used. The specific protocol is determined by the demands of the application, such as real-time capabilities or publish–subscribe support. As a result, a combination of several protocols can be envisaged, essentially to ease the data flow into the upper levels.
  • The information layer: Here, the acquired data are semantically processed and related to additional context information; the core element of the information layer is the shared knowledge base, which contains context-sensitive data about resources and services. Linking data to this context information enriches its semantics, and various levels of semantic expressivity can be attained by using different information modeling tools. Data should ideally be kept in suitable databases in their original formats.
  • The functional layer: From an architectural perspective, the functional layer describes functions and services, along with their interactions. The functions are represented without reference to actors or actual physical implementations in applications, systems, and components, while the layer provides a framework for the horizontal integration of diverse functions and allows for the formal description of functions and services. It includes a run-time environment for applications and technical functions, as well as a modeling environment for services supporting business processes.
  • The business layer: This relates to process logic, abstract business models (rules), and assuring the quality of operations along the value chain. This layer facilitates business model mapping, value stream function integrity assurance, and process output assurance. It permits the modeling of the requirements that the process must adhere to and contains the legal and regulatory framework conditions. This layer also establishes a connection between various business processes and organizations.

Data Flow Axis

This axis is defined by the data flow specifications that dictate the integration level of each digital replica (model, shadow, or twin) [30]. It aims to network the different participants across hierarchical levels and to maintain the interaction between the physical and cyber parts of the cyber-physical system (CPS), thereby ensuring the flexibility of the systems, machines, and functions all over the networked entities of the RAMI 4.0 vertical axis layers. More details about this axis are discussed in the next section.

4.2.2. Digital Twin Integration Level

The simplest definition of a “Digital Twin” is the seamless integration of data between the physical layer and the cyber layer of the cyber-physical system (CPS). There are three levels of integration, each defined essentially by the data flow exchanged between the physical plant and the digital system, regardless of whether it is a digital model, digital shadow, or DT. Figure 3 explains the differences between the digital model, shadow, and twin according to their respective levels of integration. For each physical asset, the type of data exchange creates a corresponding digital replica of the asset in the cyber layer [30], and this digital replica can take three possible forms according to its level of integration [30].
While the integration of data across the physical, digital, and cyber layers differs, a “Digital Model”, such as a simulation or mathematical model, does not have an automated real-time connection to or from the physical object. A “Digital Shadow”, by contrast, adds a one-way, real-time data flow from the physical to the digital space, so that a change in the physical object’s state causes an instant change in the state of the digital object; this makes it a significant application for real-time monitoring. The digital replica at the top integration level is transformed into a DT when it combines the five features shown in Figure 3 (a minimal sketch of these levels is given after the feature list below): connectivity, simulatability, active data acquisition, synchronization with the physical asset, and active decision applicability through a real-time link between the DT and the physical twin.
  • Connectivity is the ability to connect systems, services, or application programs. Ideally, these connections are established without requiring many changes to the applications or the systems on which they run.
  • Simulatability: The ability of the system to replicate the model and the behavior of the physical asset with the capacity to deal with all of its properties and parameters.
  • Active data acquisition is the key point to feed the models with data and establish the real-time link between the two sides of the cyber-physical system (CPS).
  • Synchronization governs the sequencing of the data flow between both components of the cyber-physical system and creates a seamless, fully synchronous, real-time exchange for the DT at its third level of integration.
  • Active decision applicability: In our case, the significant benefit of using the DT replica is the bidirectional data flow exchange between the physical part of the system and its DT. This makes it possible to exploit and apply the decisions taken to the physical plant.
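The sketch below summarizes the three integration levels referenced above: only the digital twin level closes the loop by pushing decisions back to the physical asset. It is a minimal, assumed illustration; the device interface and threshold are hypothetical.

```python
# Minimal sketch of the three integration levels (model, shadow, twin).
# "PhysicalAsset" and its actuation method are hypothetical placeholders.
from enum import Enum

class Level(Enum):
    DIGITAL_MODEL = 0    # manual data exchange in both directions
    DIGITAL_SHADOW = 1   # automatic flow from physical to digital only
    DIGITAL_TWIN = 2     # automatic flow in both directions

class DigitalReplica:
    def __init__(self, level: Level):
        self.level = level
        self.state = {}

    def sync_from_physical(self, sensor_readings: dict) -> None:
        """Shadow and twin ingest live data automatically; a model is updated by hand."""
        if self.level in (Level.DIGITAL_SHADOW, Level.DIGITAL_TWIN):
            self.state.update(sensor_readings)

    def push_decision(self, asset) -> None:
        """Only the twin acts on the physical asset (active decision applicability)."""
        if self.level is Level.DIGITAL_TWIN and self.state.get("motor_temp_C", 0) > 80:
            asset.apply_setpoint("conveyor_speed", 0.8)   # hypothetical actuation call

class PhysicalAsset:                                      # stand-in for a real controller interface
    def apply_setpoint(self, name, value):
        print(f"setpoint {name} <- {value}")

twin = DigitalReplica(Level.DIGITAL_TWIN)
twin.sync_from_physical({"motor_temp_C": 86.5})
twin.push_decision(PhysicalAsset())
```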
Above the DT replica, there is the intelligent digital twin (IDT), which combines artificial intelligence with all of the features of a DT to create an independent and mature system [53]. To optimize operations and continuously test what-if scenarios, the intelligent DT can apply machine learning algorithms to the DT’s models and data. This opens the door to predictive maintenance and a more adaptable and efficient production process through plug-and-produce scenarios. In applications such as open pit mines, where a lot of data are entered manually, some design elements are represented as digital shadows in this article. However, the majority of these elements are fully integrated DTs created to utilize programmable logic controllers and automation systems.

4.2.3. Digital-Twin-Enabled Value Lifecycle Management

Companies frequently have to rely on new technologies to find new ways of creating company value. The DT may be a key driver in maximizing the potential of the company by providing a variety of use cases. Using all data and information of the organization, DT technology goes beyond merely representing a real thing virtually; it is capable of providing a dynamic process information flow that may be used in a variety of situations that strongly affect company value [48].
The entire lifecycle is presented in Figure 4; the links between data increase data availability, which simplifies working with model-oriented and partly automated systems and, in turn, allows the opportunities described below to be exploited.
In the area of quality, developers are able to respond more effectively to how consumers use their product [2]. They can also automatically detect trends in product quality defects or even predict them.
In the area of warranty cost and services, it is worth relying upon accurate product descriptions based on the DT [54]. Even after several conversations and maintenance procedures, it is necessary to document the current product configuration at any moment in real time. This enables the company to modify its service in the best possible way.
Operating costs: When operating a product, manufacturers aim to use as few resources as possible. This is not an easy procedure because of the many subsystems being produced nowadays. To solve this issue, however, the DTs can interact with one another to negotiate the ideal production order during manufacturing. This allows additional process flexibility along with simple configuration of the system. We can benefit from DTs not only to synchronize individual machines, but also to enhance the machines’ performance. Enhanced functionality and a higher level of dependability can be achieved, for example, by collecting sensor data and conducting analysis with the use of open, networked models and intelligence.
Additional opportunities associated with DT-enabled value lifecycle management concern record retention and services, which become difficult with products that undergo continuous improvement and expansion, because every product depends on its configuration and its updates. This operation therefore requires the continuous tracking of every step of the product lifecycle. Fortunately, the DT provides a detailed description of the product with its current setup in the field and the history of every iteration of the product configuration.
Finally, there are revenue growth opportunities, that is, chances for a rise in sales. The DT, in the first place, improves the availability of information, which is valuable because looking for information occupies a large part of working time that could instead be used for development. As shown above, products can be made and maintained through a more efficient and improved process. The DT also offers the chance to create whole new business models: it is considerably simpler to offer product-based services in the presence of a DT. Many benefits are involved in this step: reducing time-to-market, encouraging new business model development, and reducing the cost of production.

4.2.4. Digital-Twin-Based Collaborative Environment

Considering the DT benefits shown in the previous section, the digital part of the CPS—the DT—is connected to the physical part. Hence, as seen in Figure 5, this constitutes a collaborative environment enabling the setup of the whole DT of the system, including product, production, process, and services.
Thus, to expand the DT benefits, Figure 4 enumerates the four services that are involved, as follows:
  • Engineering: Improve the efficiency of your engineering procedures, so that new products may be produced with insights based on the behavior of existing items in the real world, connecting the virtual models with the actual physical objects. By conducting what-if scenarios, the DT helps you to uncover product faults in early product design and development phases, which lowers engineering costs and speeds up design cycles. For instance, client usage is represented in the DT and is integrated into the processes of product development and manufacturing. The DT will strengthen the bond between your organization’s technical and operational teams by establishing a feedback loop throughout the lifecycle of your product.
  • Production: Gain knowledge about the manufacturing processes from the DT through prognostic or diagnostic condition monitoring. For example, plant maintenance might be optimized using this knowledge. The DT’s analytical capabilities will increase production effectiveness by predicting faults so that they may be corrected before having an impact on manufacturing goals. By modifying factors throughout the production process in the DT to enhance utilization, you may simulate different plant plans. DTs assist in detecting quality trend impacts and maintaining quality standards during real manufacturing.
  • Operation and process: By optimizing the information flow throughout all of your operational and servicing activities, the DT enables you to understand how to run your shop floor more effectively and efficiently. Performance may be evaluated by the DT, conducive to reducing production costs, which is one of its most important capabilities. Gain insight into everything that happens to your items, and your supply chain network will be improved overall. Owners and operators, for instance, may track their fleet of cars, fleet assets, or logistical assets using DTs, as well as improve infrastructures.
  • Marketing and go-to-market: Implementing DTs will improve the effectiveness of customer interaction workflows in marketing and sales. Before, when a product was handed over to a client, product development essentially came to a stop. These times are long gone. With product feedback going well beyond product delivery into service, complete data management is made possible. Improve your customer validation strategy and aim for top-notch digital marketing. You may be able to identify new business models for the product as a service depending on how the product is used or consumed.
As demonstrated, a DT has a wide range of applications and advantages. The information flow across all business operations is linked by digitization, which is the foundation that enables you to adapt swiftly to changing business situations. An Industry 4.0 tool called the “DT method” will link and optimize your company, allowing it to operate to its fullest potential.

5. Digital Twin Multi-Layer Reference Architecture Framework for the Mining Industry

The proposed conceptual framework architecture presented in Figure 6 is used to characterize the design of a three-level DT that satisfies all of the integration features shown in Figure 3, considering its traceability back to the requirements, components, and control systems of the physical asset. It includes the two parts of the CPS, the cyber and physical layers, and three sublayers, namely, the data pre-processing sublayer, the edge computing sublayer, and the cloud sublayer. The details of each sublayer are discussed in the following subsections.

5.1. Physical Layer

Before entering the DT architecture, data must first be collected from the open pit mine’s physical assets. Data from the mine originate from a variety of sources, including sensor readings, videos, photographs, the staff clocking system, electric meters, and many others. Most often, they are time series data [30]. Since these data are constantly changing, the supporting infrastructure must adhere to application-specific requirements, such as big data processing or real-time response. These data can be kept in archives, such as specialized time series databases, for diagnosis and model identification. Engineering data are typically static, meaning they do not frequently change over time; examples include information about a plant’s physical structure or its topology. This information must be converted to a digital form because it is typically only available in analog representations such as piping and instrumentation diagram drawings. Even if the data are already in machine-readable form, they still need to be translated and added to the DT as contextual data. The data that are gathered must contain all of the information that is pertinent to the objective of the DT, such as information on the operation of the equipment, scheduling, and the production environment. In general, mine data are information received from people, machines, objects, and the environment.
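As a minimal illustration of the two data categories described above, the sketch below stores streamed sensor readings as time series alongside static engineering metadata; the tag names, units, and topology entries are assumptions drawn loosely from the case study equipment.

```python
# Minimal sketch: run-time (time series) data vs. static engineering data in the physical layer.
# Tag names, units, and topology entries are illustrative assumptions.
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class TimeSeriesStore:
    samples: dict = field(default_factory=dict)   # tag -> list of (timestamp, value)

    def append(self, tag: str, value: float) -> None:
        self.samples.setdefault(tag, []).append((datetime.now(timezone.utc), value))

# Run-time data: continuously changing sensor readings from the open pit mine.
runtime = TimeSeriesStore()
runtime.append("crusher_motor_current_A", 212.4)
runtime.append("belt_conveyor_speed_mps", 3.1)

# Engineering data: static context that rarely changes (e.g., digitized P&ID information).
engineering = {
    "destoning_unit": {"equipment": ["belt conveyor", "jaw crusher", "stacker", "wheel bucket reclaimer"]},
    "screening_unit": {"fed_by": "1 km belt conveyor", "equipment": ["screeners", "transborder", "stacker"]},
}

print(len(runtime.samples), "run-time tags;", len(engineering), "engineering records")
```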

5.2. Cyber Layer

5.2.1. Data Pre-Processing Sublayer

To employ data mining techniques, raw data must be transformed into well-formed data sets through the process of data pre-processing, since raw data are typically unformatted and incomplete. Every data analytics project’s performance is directly correlated with how well the data were prepared. After data collection is complete, the data proceed through this sublayer of the DT architecture, which applies several operations: data cleaning, which comprises adding missing values or removing rows with incomplete data, reducing noise in the data, and resolving data discrepancies, and data integration, which entails reconciling data conflicts by merging data with different representations.
In addition, when the amount of data is high, databases may become slower, more expensive to access, and more challenging to store. Data reduction therefore aims to provide a condensed version of the data in a data warehouse.
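A minimal pre-processing sketch is given below: it removes incomplete rows (cleaning), merges two differently structured sources (integration), and downsamples a time series (reduction). Column names, sources, and the one-minute resampling interval are assumptions.

```python
# Minimal sketch of the pre-processing steps named above: cleaning, integration, reduction.
import pandas as pd

# Raw sensor readings (with gaps) and a second source using a different representation.
readings = pd.DataFrame({
    "timestamp": pd.date_range("2023-01-01 08:00", periods=6, freq="10s"),
    "vibration_rms": [0.42, None, 0.44, 0.47, None, 0.51],
    "machine": ["crusher_01"] * 6,
})
asset_register = pd.DataFrame({"machine_id": ["crusher_01"], "unit": ["destoning"]})

# Cleaning: drop rows with missing values (or interpolate, depending on the use case).
clean = readings.dropna(subset=["vibration_rms"])

# Integration: reconcile the differing key names before merging the two sources.
merged = clean.merge(asset_register, left_on="machine", right_on="machine_id", how="left")

# Reduction: condense the time series to one-minute means for the data warehouse.
reduced = merged.set_index("timestamp")["vibration_rms"].resample("1min").mean()

print(reduced)
```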

5.2.2. Edge Computing Sublayer

Edge computing is processing that operates close to the data source or the end user of a system, depending on where the information is coming from or going to. Edge architecture makes faster processing possible by reducing latency and lag, so edge-based applications and programs can function more rapidly and effectively, improving user experience and overall performance. Edge computing is a distributed information technology (IT) architecture that processes client data at the network’s edge, as near to the source as is practical. This proximity yields the benefits listed below (a minimal sketch of edge-side aggregation follows the list):
  • Quick data processing and analysis in real time: The edge computing approach does not upload data to a cloud computing platform, instead storing and processing it on edge devices. The rapid rise in data volume and the strain on network capacity are drawbacks of cloud computing. Edge computing offers an advantage over regular cloud computing in terms of response time and real time. Since the edge computing node is located closer to the data source, it may perform computing and data storage functions locally, reducing the need for intermediate data transmission. It emphasizes being close to users and offering them superior intelligent services, which enhances data transmission performance and ensures real-time processing while decreasing wait time. In the realm of automatic driving, intelligent manufacturing, video surveillance, and other forms of location awareness, quick feedback is crucial. Edge computing offers users a range of fast response services.
  • Security: Traditional cloud computing is a centralized processing technique that demands that all data be uploaded to the cloud for unified processing. Risks such as data loss and leakage arise during this procedure; thus, security and privacy cannot be guaranteed, and account passwords, past search history, and even trade secrets can all be exposed. The security of data can be better guaranteed since edge computing exclusively handles the tasks that are within its purview, relies on local processing rather than uploading to the cloud, and avoids the dangers associated with network transmission. When data are attacked, just the local data are impacted—not all data.
  • Low cost, low bandwidth cost, and low energy cost: Since edge computing does not require the processing of data to be uploaded to a cloud computing facility, there is a reduction in the demand on the network’s bandwidth as well as a significant reduction in the energy consumption of intelligent devices at the network’s edge. Edge computing is “small-scale”, and, in practice, businesses can lower the price of processing data on local hardware. Edge computing hence reduces the volume of data carried across the network, lowers the cost of transmission and the demand on the network’s capacity, lowers the energy consumption of local equipment, and increases computing efficiency.
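The sketch referenced before the list illustrates the bandwidth and latency argument: raw readings are aggregated locally at the edge, and only a compact summary plus any anomalous values is forwarded to the cloud. The threshold, window size, and payload fields are assumptions.

```python
# Minimal sketch of edge-side processing: aggregate locally, forward only a summary
# and anomalies to the cloud. Threshold, window size, and payload fields are assumptions.
from statistics import mean

def edge_window_summary(readings, anomaly_threshold=80.0):
    """Reduce a window of raw sensor readings to a compact cloud payload."""
    anomalies = [r for r in readings if r > anomaly_threshold]
    return {
        "count": len(readings),
        "mean": round(mean(readings), 2),
        "max": max(readings),
        "anomalies": anomalies,          # only out-of-range values travel to the cloud
    }

raw_window = [71.2, 72.0, 70.8, 84.5, 71.5, 71.9]   # e.g., motor temperatures sampled at the edge
payload = edge_window_summary(raw_window)
print(payload)                                       # a few fields instead of the full raw stream
```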

5.2.3. Cloud Computing Sublayer

IT has advanced, and cloud computing has become a key business model for providing IT resources. With cloud computing, people and businesses can access managed and customizable IT resources such as servers, storage, and services via a shared network on demand. As a result, industrial dynamics are accelerated, established business models are disrupted, and the digital revolution is fueled. Cloud computing also supplies the infrastructure that has supported key digital developments such as the Internet of Things, big data, and artificial intelligence, and it offers a wide range of further advantages and possibilities [30].
Located at the top of the cyber layer of the CPS is the final sublayer of the proposed DT framework, the cloud computing sublayer. It is distinguished by housing the cloud database, the main database that receives data from the edge computing sublayer for storage and for feeding the virtual mine and its optimization and predictive data applications, such as predictive production and maintenance scheduling, process optimization, supply chain management, and predictive shift-work scheduling. These applications also involve interfacing with the virtual models that reside locally on the edge computing sublayer.
In particular, the edge computing layer’s online defect detection and online process quality control for each component and machine station would interact with the cloud layer’s predictive maintenance application to create a general predictive maintenance plan for the mine. To prevent models from being lost if an edge computing application fails, models created in the edge computing layer must be duplicated in the cloud layer. Furthermore, the cloud layer is in charge of developing and refining the algorithms and models at both the edge computing and cloud levels. Regular updates and optimizations of the virtual models’ algorithms take place in the cloud before being sent back to the edge, ensuring that the edge layer always has the most recent versions of both the models and their algorithms.
The main database located in this sublayer feeds the DT by updating it with data coming from the optimization and predictive data applications, thereby exploiting the three major services that constitute the core of the DT.

Co-Simulation Service

To design ever-more-complex systems under increasing market pressure, it is critical to find new techniques that enable specialists from many disciplines to interact more effectively. A heterogeneous-model-based approach is one way to address this problem: different teams can create their own models and conduct mono-disciplinary analyses, but they can also couple their models together for simulation (co-simulation), which makes it possible to study the system’s overall behavior [30].
At the cyber layer, the simulation service located at the edge computing sublayer relies on the synthesized data database, which is fed by the monitoring data coming from the applications performed at this level, such as online defect diagnosis, online energy management, and online process quality control. The virtual model can operate through this simulation service, which allows one to handle what-if simulations of the concerned physical entity with continuous monitoring data updates. In addition, the co-simulation service at the cloud sublayer, located at the heart of the DT, allows one to constitute the virtual mine through the simulation of each model belonging to the physical twin, simultaneously with a virtual interconnection between the models; thus, the co-simulation service supports a multidisciplinary simulation that may be exploited to replicate the entire process flow of the physical side of the CPS. Therefore, a simulation of the real interactions between the models represented by the virtual mine is needed, conducive to realistically simulating the overall system, both the physical assets and the physical interactions. Consequently, the co-simulation is dynamic, which means that it can incorporate additional models and run their simulations at run-time.
Additionally, we distinguish between two types of co-simulation approaches, domain-independent and domain-specific. Domain-independent approaches can be applied regardless of the application domain, whereas domain-specific approaches target the simulation of a specific field; for instance, Mosaik, EPOCHS, and ADEVS are simulation tools used in the electric power grid field. The main approaches are reviewed below, followed by a minimal sketch of the common master-algorithm idea.
  • Co-simulation based on the functional mock-up interface (FMI): The functional mock-up interface (FMI) offers the ability to interchange models between simulation programs in addition to providing a standard for co-simulation. Simulation tools that do not support FMI cannot be incorporated into the co-simulation, since the employed simulation tools are required to support FMI. Likewise, the tools only communicate at specific times while operating independently of one another in between. The data exchange and slave simulations are synchronized by a master algorithm, with FMI allowing for configurable time steps between two synchronization phases. Each simulation tool in a co-simulation must be represented by a functional mock-up unit (FMU) that implements the simulation tool’s interface. FMI is not suited for a dynamic co-simulation with “plug-and-simulate” features, since its master algorithm requires input from each slave simulation before beginning the simulation [55].
  • Co-simulation based on high-level architecture (HLA): The United States Department of Defense created the high-level architecture (HLA) as a distributed and parallel simulation architecture. A co-simulation with HLA, known as a federation, is made up of the many simulators (federates) and the run-time infrastructure (RTI), a hub for federate coordination. An object model template defines the information that may be shared between the federates, and an interface specification defines the interfaces between the federates and the RTI. There are also some HLA rules that a simulator must adhere to in order to be considered HLA-conformant. The RTI may be thought of as the simulation master in charge of synchronizing the federates. Although HLA permits the dynamic entry of federates during run-time, a new federation agreement, which is domain- and use-case-specific, must be prepared for each federate. There are several RTI implementations available, both paid and unpaid [56].
  • Co-simulation based on the open services gateway initiative (OSGi): The open services gateway initiative (OSGi) is a Java-based framework created for building applications made up of dynamically combinable and reusable components. The implementation of each component in OSGi is isolated from other components, which decreases complexity, and the components communicate with one another via services. The framework may load, delete, swap, or update the bundles that serve as representations of the components at run-time. These bundles may be located on a single computer or spread across a number of machines. A dynamic co-simulation is made possible by the dynamic exchange of bundles. Each simulation is represented by a bundle that is linked to the simulation by a simulator coupler. The simulation coupler makes use of OPC and permits data exchange and synchronization between the simulations. Another option is to directly incorporate a model into an OSGi bundle. For switching simulators during run-time, a separate bundle is needed, in which the states of the withdrawn simulators are maintained to allow for their re-entry.
  • Co-simulation based on OPC unified architecture: The OPC Foundation created OPC UA, a machine-to-machine communication standard that is service-oriented and enables the exchange of process data and their machine-readable description. The reference implements a co-simulation with OPC UA, in which each simulator is connected to a generic adapter via an interface that houses both an OPC-UA server and an OPC-UA client. This adapter uses OPC UA to connect to a central server that also has an OPC-UA server and a client. Each simulator must be registered on the central server, with the first simulator to do so being the simulation master and the following simulators being the simulation slaves. The co-simulation is coordinated and synchronized by the master. If the co-simulation master exits, another simulator assumes control of the co-simulation.
  • Agent-based co-simulation: The concept of multi-agent systems was chosen to couple the simulation tools, since software agents can join and leave multi-agent systems at any time during run-time, are domain-independent, have no restrictions during any stage of the lifecycle, and can add intelligence on top of the models [56]. Each IoT component is modeled in its own simulation, possibly using multiple simulation tools, and each simulation is linked to and embodied by an agent, as shown in Figure 7. It is feasible to interchange the models at run-time by simulating each IoT component separately and linking them to agents, because agents can enter and exit the multi-agent system at any moment, just as IoT components can enter and exit the IoT system. Through agent-to-agent communication, the agents advance the interaction between the models. For illustration, if a heating unit model asks for the temperature value of a temperature sensor model, the agent associated with the heating unit model sends this request to every other agent, which then forwards it to its corresponding model. The models then determine for themselves whether the received message is relevant before responding appropriately. An interface concept has to be established in order to facilitate communication between the agents and their respective simulation tools.
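As a minimal, tool-agnostic sketch of the master-algorithm idea common to these approaches, the loop below advances two simple simulators with a fixed macro time step and exchanges their coupling variable at each synchronization point. The models (conveyor load feeding crusher power), step size, and parameters are assumptions; no specific standard (FMI, HLA, OSGi, OPC UA) is implemented.

```python
# Minimal, tool-agnostic co-simulation sketch: a master advances two "slave" models with a
# fixed macro step and exchanges coupling variables at each synchronization point.
# The models and all numerical parameters are illustrative assumptions.

class ConveyorModel:
    def __init__(self):
        self.load_tph = 0.0                            # tonnes per hour delivered to the crusher

    def step(self, dt: float, feed_rate: float) -> float:
        self.load_tph += dt * (feed_rate - 0.1 * self.load_tph)   # simple first-order response
        return self.load_tph

class CrusherModel:
    def step(self, dt: float, load_tph: float) -> float:
        return 150.0 + 0.4 * load_tph                  # stand-in power demand in kW

def master(t_end: float = 10.0, dt: float = 1.0):
    conveyor, crusher = ConveyorModel(), CrusherModel()
    t = 0.0
    while t < t_end:
        load = conveyor.step(dt, feed_rate=300.0)      # slave 1 advances one macro step
        power = crusher.step(dt, load)                 # slave 2 receives slave 1's output
        t += dt
        print(f"t={t:4.1f}s  load={load:7.1f} t/h  power={power:7.1f} kW")

master()
```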

Operation Data Service

A DT must be able to collect, process, and analyze the operating data of a physical asset. These data form the foundation for knowledge extraction, which makes the DT intelligent and enables it to support a variety of activities, including diagnosis and prediction. Operation data collection, pre-processing, and semantic annotation serve as the basis for this. A significant difficulty in industrial implementation is the ability to sample process data precisely and to assign them to the correct digital replica in order to enrich the DT of a CPS with operation data.
The DT must have up-to-date asset operating data in order to appropriately represent the behavior and status of the asset. Examples include sensor data, which are continually streamed and recorded, and control data, which describe the physical component’s present condition and are likewise documented over its entire lifecycle. New orders and other business information may also be stored here. This is carried out through the interface for data acquisition, a structural element of the DT. The database that stores and processes this kind of data is referred to as operation data. These data can be used by both online and offline algorithms to deepen knowledge of the asset and improve the relevant models.
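
As a hedged illustration of such an operation data service, the sketch below shows how streamed sensor readings and control states could be annotated with the identifier of their digital replica and buffered for later use by online or offline algorithms. The record fields and helper names (OperationRecord, ingest, query) are assumptions made for illustration, not the implementation described in this paper.

```python
# Illustrative sketch of an operation data service: streamed sensor and
# control data are annotated with the asset/twin identifier and stored in
# a simple time-series buffer (names are assumptions, not a standard API).
from dataclasses import dataclass, field
from datetime import datetime, timezone
from typing import Any, Dict, List, Optional

@dataclass
class OperationRecord:
    asset_id: str                  # which digital replica the sample belongs to
    source: str                    # e.g., "sensor" or "control"
    signal: str                    # e.g., "bearing_temperature"
    value: Any
    timestamp: datetime
    tags: Dict[str, str] = field(default_factory=dict)   # semantic annotation

class OperationDataService:
    def __init__(self):
        self._store: List[OperationRecord] = []

    def ingest(self, asset_id: str, source: str, signal: str, value: Any,
               tags: Optional[Dict[str, str]] = None) -> None:
        self._store.append(OperationRecord(
            asset_id=asset_id, source=source, signal=signal, value=value,
            timestamp=datetime.now(timezone.utc), tags=tags or {}))

    def query(self, asset_id: str, signal: str) -> List[OperationRecord]:
        # Online or offline algorithms can pull annotated data from here.
        return [r for r in self._store
                if r.asset_id == asset_id and r.signal == signal]

# Usage: annotate a conveyor sample so it reaches the correct digital replica.
ods = OperationDataService()
ods.ingest("conveyor_07", "sensor", "motor_current", 42.3, {"unit": "A"})
print(len(ods.query("conveyor_07", "motor_current")))   # 1
```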

Synchronization Service

The anchor point method can be used to identify changes taking place in the physical production system and to examine the connections between the modified components within it, in order to synchronize the cross-domain models of the DT.
The anchor point method comprises three principal phases: automatic change detection, relational analysis, and change management. The first phase relies on importing the control code at two different points in time in order to perform a relational comparison of the models [57]: two versions of the system’s control code are imported from a repository, the first saved immediately upon system commissioning and the second reflecting the present state of the system.
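
A simplified view of the first phase (automatic change detection) is sketched below: two snapshots of the control code, represented here as dictionaries mapping an element name to a hash of its implementation, are compared to flag added, removed, and modified elements. This is only an illustration of the comparison idea under that representation assumption, not the anchor-point method of [57] itself.

```python
# Sketch of automatic change detection between two control-code snapshots
# (commissioning version vs. current version). Representing code elements
# as name -> content hash is an assumption made for illustration.
import hashlib

def digest(code: str) -> str:
    return hashlib.sha256(code.encode()).hexdigest()

def detect_changes(commissioned: dict, current: dict) -> dict:
    added    = [n for n in current if n not in commissioned]
    removed  = [n for n in commissioned if n not in current]
    modified = [n for n in current
                if n in commissioned and current[n] != commissioned[n]]
    return {"added": added, "removed": removed, "modified": modified}

# Usage: the crusher interlock was modified and a new conveyor block added.
v_commissioning = {"crusher_interlock": digest("IF speed > MAX THEN stop"),
                   "stacker_sequence": digest("step1; step2")}
v_today = {"crusher_interlock": digest("IF speed > MAX_NEW THEN stop"),
           "stacker_sequence": digest("step1; step2"),
           "conveyor_block":   digest("start; monitor")}

print(detect_changes(v_commissioning, v_today))
# {'added': ['conveyor_block'], 'removed': [], 'modified': ['crusher_interlock']}
```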

5.3. Discussion

5.3.1. Features to Consider for Designing Digital Twin Frameworks

Digital twins have numerous benefits including the ability to simulate real-world behavior, monitor performance, and optimize the design. Within the mining industry, the use of DTs can improve the efficiency and safety of operations by enabling engineers to analyze and optimize the mine designs, predict and prevent potential problems, and monitor the performance of mines in real time. There are various frameworks available for creating DTs in the mining industry, each of which may have its own set of tools and methods for modeling and simulating mine behavior, integrating data from various sources, and visualizing results. When comparing the architecture of different DT frameworks for the mining industry, it is important to consider the following factors:
  • The level of detail and complexity of the models: DT frameworks may vary in the level of detail and complexity of their models, which can impact their ability to capture a wide range of physical and operational characteristics of a mine [58,59].
  • The type of data sources and integration methods: Different DT frameworks may utilize different types of data sources and integration methods, such as sensors, simulations, and historical data, and may have varying levels of interoperability between these sources [60].
  • The visualization and analysis tools: The quality and flexibility of the visualization and analysis tools provided by a DT framework can significantly influence its usability and effectiveness [61].
  • The scalability and performance of the system: As the size and complexity of a mine increases, the scalability and performance of the DT framework may become increasingly important considerations [62].
  • The cost and resources required for implementation and maintenance: The cost and resources required for implementing and maintaining a DT framework can vary greatly depending on the system’s complexity and scope [63].
  • Customizability and adaptability: Some DT frameworks may offer a higher degree of customizability and adaptability to meet the specific needs and requirements of a mine, while others may be more rigid in their capabilities [64].
  • Ease of use and user experience: The usability of a DT framework can greatly impact its adoption and effectiveness. Some frameworks may be more intuitive and user-friendly, while others may require specialized training or expertise to be used [65].
  • Integration with other systems and platforms: A DT framework that can easily integrate with other systems and platforms used by a mine, such as enterprise resource planning (ERP) systems and asset management systems, may be more effective [66].
  • Security: DT frameworks may handle sensitive data and intellectual property, so it is important to consider the security and privacy measures in place to protect this information.
  • Support and maintainability: The level of support and maintenance provided by the developer or vendor of a DT framework can be important factors in its long-term viability and reliability [66].

5.3.2. Proposed Digital Twin Framework Advantages

The proposed DT framework architecture provides a roadmap towards the design of DT-enabled product lifecycle management, with the aim of answering the challenges faced in the mining industry. With a focus on the mining industry, it is essential to have a comprehensive understanding of the system and its interconnections. Barricelli et al. [67] surveyed the definitions of DTs and their main characteristics. The process of developing a DT is complex and involves the integration of various technologies. Leng et al. [68] studied how manufacturing systems have evolved from Industry 1.0 to the current Industry 4.0, focusing on the use of DTs in this process, and examined the design steps and the technological advancements that have taken place over time. Many conditions are relevant to the mining industry, such as the collaboration of different systems, the creation of intricate models, the use of various technologies, the long distances that can cause latency and connection delays, and the large volume of heterogeneous data generated. The following paragraphs present our contribution and discuss how the framework addresses these challenges in combination.
Steindl et al. [51] enumerate various architectures and frameworks in order to create a technology-agnostic generic DT architecture that is consistent with the reference architecture model Industry 4.0 (RAMI 4.0), which is the main subject of our work. The proposed DT framework architecture is a multi-layered architecture that links the physical layer and the cyber layer, considering the three sublayers inside the cyber layer of the CPS: the preprocessing sublayer, the edge computing sublayer, and the cloud computing sublayer. To deal with the large volume and heterogeneity of the gathered data, several techniques are applied in the preprocessing sublayer: data cleaning, data reduction, and data integration (a small illustrative sketch of these steps is given below). In addition, the framework provides an architecture based on edge–cloud computing collaboration, realized by the two remaining sublayers, the edge computing sublayer and the cloud computing sublayer.
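
The three preprocessing techniques named above can be illustrated with a small pandas sketch; the column names, sampling rates, and rules are hypothetical, and real mining telemetry would of course require domain-specific preprocessing logic.

```python
# Illustrative preprocessing sublayer: cleaning, reduction, and integration
# of heterogeneous telemetry (column names and rules are hypothetical).
import pandas as pd

sensor = pd.DataFrame({
    "timestamp": pd.to_datetime(["2023-01-01 10:00", "2023-01-01 10:00",
                                 "2023-01-01 10:01", "2023-01-01 10:02"]),
    "asset_id": ["stacker_01"] * 4,
    "motor_temp_C": [61.2, 61.2, None, 63.8],
})
maintenance = pd.DataFrame({
    "asset_id": ["stacker_01"],
    "last_service": pd.to_datetime(["2022-12-15"]),
})

# Data cleaning: drop exact duplicates and fill short sensor dropouts.
clean = sensor.drop_duplicates().sort_values("timestamp")
clean["motor_temp_C"] = clean["motor_temp_C"].interpolate()

# Data reduction: resample to a coarser rate to cut data volume.
reduced = (clean.set_index("timestamp")
                .resample("2min")
                .agg({"asset_id": "first", "motor_temp_C": "mean"})
                .reset_index())

# Data integration: join operational data with maintenance records.
integrated = reduced.merge(maintenance, on="asset_id", how="left")
print(integrated)
```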
Edge computing is well suited to real-time monitoring and online diagnosis because it processes data close to the source, potentially leading to faster and more efficient analysis.
The cloud computing sublayer, in turn, can be utilized to enhance the performance of optimization and prediction algorithms thanks to its scalable and flexible computing resources, which enable these algorithms to operate more efficiently and effectively.
Optimization algorithms are used to determine the optimal solution to a problem, often by minimizing or maximizing an objective function. These algorithms can be resource-intensive and may require large amounts of data to be processed. By utilizing cloud computing resources, it is possible to increase the computational power and storage capacity as needed, allowing the optimization algorithm to operate more efficiently and potentially find better solutions.
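
As a small, hedged example of such an optimization task, the sketch below uses SciPy to minimize a hypothetical energy-cost objective over two conveyor speed set-points subject to a minimum-throughput constraint; the objective function, bounds, and throughput figure are invented for illustration only.

```python
# Hypothetical optimization of two conveyor speed set-points: minimize an
# invented energy-cost objective subject to a minimum-throughput constraint.
import numpy as np
from scipy.optimize import minimize

def energy_cost(x):
    s1, s2 = x
    return 0.8 * s1**2 + 1.1 * s2**2          # toy quadratic cost model

throughput = {"type": "ineq",                  # s1 + s2 >= 3.0 (invented target)
              "fun": lambda x: x[0] + x[1] - 3.0}

result = minimize(energy_cost, x0=np.array([2.0, 2.0]),
                  bounds=[(0.5, 3.0), (0.5, 3.0)],
                  constraints=[throughput])
print(result.x, result.fun)
```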
Prediction algorithms, also known as machine learning models, are utilized to make predictions or forecasts based on data. These algorithms often require a large amount of data for training and may need to make predictions in real time. By utilizing cloud computing resources, it is possible to quickly process and analyze the data, potentially leading to more accurate predictions.
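
A correspondingly small prediction example is sketched below: a scikit-learn regressor is trained on synthetic operating data to forecast a wear indicator. The feature names and the synthetic data are assumptions; a production model hosted in the cloud sublayer would instead be trained on real historian data from the operation data service.

```python
# Hypothetical prediction sketch: forecast a wear indicator from operating
# features using scikit-learn (synthetic data, invented feature names).
import numpy as np
from sklearn.ensemble import RandomForestRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform([0.5, 40.0], [3.0, 90.0], size=(500, 2))       # speed, load
y = 0.3 * X[:, 0] + 0.02 * X[:, 1] + rng.normal(0, 0.05, 500)  # wear proxy

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = RandomForestRegressor(n_estimators=100, random_state=0).fit(X_train, y_train)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))
```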
In summary, the use of cloud computing as a sublayer for hosting the optimization and prediction algorithms can provide the required computational resources and allow the algorithms to operate more efficiently. Furthermore, the cloud computing sublayer also hosts the virtual mine that represents the core of the DT framework architecture, enhanced by the operation data service that provides access to data related to the operation of a system or process. These data could be used to feed the DT, allowing it to accurately represent the current state and behavior of the physical entity.
The co-simulation service allows for the simultaneous simulation of multiple models of systems or processes. This could be useful for analyzing the interactions between different components of a system, or for testing the behavior of a system under different scenarios. The synchronization service ensures that data are kept up to date and are consistent across different systems or processes. This could be useful for ensuring that the DT accurately reflects the current state of a physical system, or for coordinating the operation of multiple systems.
It has been suggested that by utilizing a combination of services, the accuracy and functionality of the DT could be improved, allowing it to represent the behavior and performance of the physical system more closely.
With this DT framework architecture, we aim to suggest a generic development method for designing DT-enabled asset lifecycle management specifically for the mining industry. Hribernik et al. [69] surveyed the necessary properties and requirements for a DT and presented a roadmap towards the development of DTs that are autonomous, context-aware, and adaptive. Nevertheless, the promising benefits of using DTs in industry are still limited by the lack of DT development methods, as pointed out in the review in [70].
Several factors should be taken into account when developing a digital twin framework for the mining industry, including the degree of detail and complexity of the models, the types of data sources and integration techniques, the visualization and analysis tools, scalability and performance, cost and resources, customizability and adaptability, ease of use and user experience, integration with other systems and platforms, security, and support and maintainability. With a focus on the mining sector, the suggested digital twin framework architecture offers a multi-layered framework that connects the physical layer and cyber layer, combining data cleansing, reduction, and integration techniques, as well as edge–cloud computing cooperation. Cloud computing can improve the performance of optimization and prediction, whereas edge computing is useful for online diagnostics and real-time monitoring. The incorporation of DTs has the potential to bring numerous benefits and a range of applications, despite facing certain challenges. As a result, various industries are exploring ways to adapt their operations to align with future trends and to evaluate the long-term viability of their business models concerning the lifecycle of assets.

6. Conclusions and Future Work

Driven by the rising demand for individualized and customized products, production is becoming increasingly complicated in structure. Impacting all phases of the product lifecycle, digitalization is the key to a transformation process that can create many opportunities and enables the introduction of many engineering and management applications, such as the DT. The current research aims to take full advantage of DT capabilities in response to the challenges faced in the mining industry. The goal of this paper is to propose a generic DT architecture framework for the sustainable mining industry by adopting the reference architecture model in Industry 4.0 (RAMI 4.0) and enabling asset lifecycle management, and to discuss the outcomes and potential of using DT technology. This framework establishes a collaborative environment that joins the two sides of the cyber-physical system, integrating the four services impacted: engineering, production, process and operation, and marketing and go-to-market.
Motivated by the benefits and opportunities enabled by the DT approach, future work will design a proof of concept of this framework by developing an industrial DT case study that supports the achievement of a sustainable mine, mirrors asset lifecycle management, and ensures that process consistency can be simulated and tested over the lifecycle phases.

Author Contributions

Conceptualization, N.E.B. and O.L.; methodology, N.E.B., O.L. and M.M.; software, N.E.B.; validation, M.M. and A.C.; formal analysis, A.C. and F.-E.H.; investigation, M.M.; resources, N.E.B. and N.O.; data curation, N.E.B., F.-E.H. and H.E.H.; writing—original draft preparation, N.E.B.; writing—review and editing, N.E.B., N.O. and H.E.H.; visualization, N.E.B.; supervision, O.L. and H.E.H.; project administration, M.M.; funding acquisition, A.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the Green Tech Institute of UM6P.

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

Not applicable.

Acknowledgments

This work is the result of research on the production value chain of the Green Tech Institute and OCP Morocco Experimental Open Pit Mine of Benguerir. We thank the team and their industrial partners.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Macchi, M.; Roda, I.; Negri, E.; Fumagalli, L. Exploring the Role of Digital Twin for Asset Lifecycle Management. IFAC Pap. 2018, 51, 790–795. [Google Scholar] [CrossRef]
  2. Davila Delgado, J.M.; Oyedele, L. Digital Twins for the Built Environment: Learning from Conceptual and Process Models in Manufacturing. Adv. Eng. Inform. 2021, 49, 101332. [Google Scholar] [CrossRef]
  3. Aheleroff, S.; Xu, X.; Zhong, R.Y.; Lu, Y. Digital Twin as a Service (DTaaS) in Industry 4.0: An Architecture Reference Model. Adv. Eng. Inform. 2021, 47, 101225. [Google Scholar] [CrossRef]
  4. Xu, X.; Lu, Y.; Vogel-Heuser, B.; Wang, L. Industry 4.0 and Industry 5.0—Inception, Conception and Perception. J. Manuf. Syst. 2021, 61, 530–535. [Google Scholar] [CrossRef]
  5. Demir, K.A.; Döven, G.; Sezen, B. Industry 5.0 and Human-Robot Co-Working. Procedia Comput. Sci. 2019, 158, 688–695. [Google Scholar] [CrossRef]
  6. Laayati, O.; El Hadraoui, H.; Bouzi, M.; El-Alaoui, A.; Kousta, A.; Chebak, A. Smart Energy Management System: Blockchain-Based Smart Meters in Microgrids. In Proceedings of the 2022 4th Global Power, Energy and Communication Conference (GPECOM), Nevsehir, Turkey, 14–17 June 2022; pp. 580–585. [Google Scholar]
  7. NIBUSINESSINFO.CO.UK. Industry 4.0 Challenges and Risks. Available online: https://www.nibusinessinfo.co.uk/content/industry-40-challenges-and-risks (accessed on 8 November 2022).
  8. Artificial Intelligence and Industry 4.0—1st Edition. Available online: https://www.elsevier.com/books/artificial-intelligence-and-industry-40/hassanien/978-0-323-88468-6 (accessed on 8 November 2022).
  9. Hosamo, H.H.; Imran, A.; Cardenas-Cartagena, J.; Svennevig, P.R.; Svidt, K.; Nielsen, H.K. A Review of the Digital Twin Technology in the AEC-FM Industry. Adv. Civ. Eng. 2022, 2022, e2185170. [Google Scholar] [CrossRef]
  10. Agnusdei, G.P.; Elia, V.; Gnoni, M.G. Is Digital Twin Technology Supporting Safety Management? A Bibliometric and Systematic Review. Appl. Sci. 2021, 11, 2767. [Google Scholar] [CrossRef]
  11. Carvalho, R.; da Silva, A.R. Sustainability Requirements of Digital Twin-Based Systems: A Meta Systematic Literature Review. Appl. Sci. 2021, 11, 5519. [Google Scholar] [CrossRef]
  12. Global Logic a Hitachi Group Company. Digital Twins Technology, Its Benefits & Challenges to Information Security. 2020. Available online: https://www.globallogic.com/insights/blogs/if-you-build-products-you-should-be-using-digital-twins/ (accessed on 22 October 2022).
  13. Kohne, T.; Burkhardt, M.; Theisinger, L.; Weigold, M. Technical and Digital Twin Concept of an Industrial Heat Transfer Station for Low Exergy Waste Heat. Procedia CIRP 2021, 104, 223–228. [Google Scholar] [CrossRef]
  14. Gao, Y.; Chang, D.; Chen, C.-H.; Xu, Z. Design of Digital Twin Applications in Automated Storage Yard Scheduling. Adv. Eng. Inform. 2022, 51, 101477. [Google Scholar] [CrossRef]
  15. Jiang, Z.; Guo, Y.; Wang, Z. Digital Twin to Improve the Virtual-Real Integration of Industrial IoT. J. Ind. Inf. Integr. 2021, 22, 100196. [Google Scholar] [CrossRef]
  16. Brosinsky, M.C.; Song, M.X.; Westermann, D. Digital Twin—Concept of a Continuously Adaptive Power System Mirror. In Proceedings of the International ETG-Congress 2019, Esslingen, Germany, 8–9 May 2019. [Google Scholar]
  17. O’Dwyer, E. Integration of an Energy Management Tool and Digital Twin for Coordination and Control of Multi-Vector Smart Energy Systems. Sustain. Cities Soc. 2020, 62, 102412. [Google Scholar] [CrossRef]
  18. Qiuchen Lu, V.; Parlikad, A.K.; Woodall, P.; Ranasinghe, G.D.; Heaton, J. Developing a Dynamic Digital Twin at a Building Level: Using Cambridge Campus as Case Study. In Proceedings of the International Conference on Smart Infrastructure and Construction 2019 (ICSIC), Cambridge, UK, 8–10 July 2019; pp. 67–75. [Google Scholar]
  19. Agostinelli, S.; Cumo, F.; Guidi, G.; Tomazzoli, C. Cyber-Physical Systems Improving Building Energy Management: Digital Twin and Artificial Intelligence. Energies 2021, 14, 2338. [Google Scholar] [CrossRef]
  20. Agouzoul, A.; Tabaa, M.; Chegari, B.; Simeu, E.; Dandache, A.; Alami, K. Towards a Digital Twin Model for Building Energy Management: Case of Morocco. Procedia Comput. Sci. 2021, 184, 404–410. [Google Scholar] [CrossRef]
  21. Saad, A.; Faddel, S.; Mohammed, O. IoT-Based Digital Twin for Energy Cyber-Physical Systems: Design and Implementation. Energies 2020, 13, 4762. [Google Scholar] [CrossRef]
  22. Židek, K.; Piteľ, J.; Adámek, M.; Lazorík, P.; Hošovský, A. Digital Twin of Experimental Smart Manufacturing Assembly System for Industry 4.0 Concept. Sustainability 2020, 12, 3658. [Google Scholar] [CrossRef]
  23. Urazayev, D.; Bragin, D.; Zykov, D.; Hafizov, R.; Pospelova, I.; Shelupanov, A. Distributed Energy Management System with the Use of Digital Twin. In Proceedings of the 2019 International Multi-Conference on Engineering, Computer and Information Sciences (SIBIRCON), Novosibirsk, Russia, 21–22 October 2019; pp. 685–689. [Google Scholar]
  24. Ouahabi, N.; Chebak, A.; Zegrari, M.; Kamach, O.; Berquedich, M. A Distributed Digital Twin Architecture for Shop Floor Monitoring Based on Edge-Cloud Collaboration. In Proceedings of the 2021 Third International Conference on Transportation and Smart Technologies (TST), Tangier, Morocco, 27–28 May 2021; pp. 72–78. [Google Scholar]
  25. Laayati, O.; El Hadraoui, H.; El Magharaoui, A.; El-Bazi, N.; Bouzi, M.; Chebak, A.; Guerrero, J.M. An AI-Layered with Multi-Agent Systems Architecture for Prognostics Health Management of Smart Transformers: A Novel Approach for Smart Grid-Ready Energy Management Systems. Energies 2022, 15, 7217. [Google Scholar] [CrossRef]
  26. Rihi, A.; Baïna, S.; Mhada, F.; Elbachari, E.; Tagemouati, H.; Guerboub, M.; Benzakour, I. Predictive Maintenance in Mining Industry: Grinding Mill Case Study. Procedia Comput. Sci. 2022, 207, 2483–2492. [Google Scholar] [CrossRef]
  27. Greco, A.; Caterino, M.; Fera, M.; Gerbino, S. Digital Twin for Monitoring Ergonomics during Manufacturing Production. Appl. Sci. 2020, 10, 7758. [Google Scholar] [CrossRef]
  28. Autiosalo, J.; Ala-Laurinaho, R.; Mattila, J.; Valtonen, M.; Peltoranta, V.; Tammi, K. Towards Integrated Digital Twins for Industrial Products: Case Study on an Overhead Crane. Appl. Sci. 2021, 11, 683. [Google Scholar] [CrossRef]
  29. Pang, T.Y.; Pelaez Restrepo, J.D.; Cheng, C.-T.; Yasin, A.; Lim, H.; Miletic, M. Developing a Digital Twin and Digital Thread Framework for an ‘Industry 4.0’ Shipyard. Appl. Sci. 2021, 11, 1097. [Google Scholar] [CrossRef]
  30. Elbazi, N.; Mabrouki, M.; Chebak, A.; Hammouch, F. Digital Twin Architecture for Mining Industry: Case Study of a Stacker Machine in an Experimental Open-Pit Mine. In Proceedings of the 2022 4th Global Power, Energy and Communication Conference (GPECOM), Cappadocia, Turkey, 14–17 June 2022; pp. 232–237. [Google Scholar]
  31. Global Smart Mining Market to Grow 10% CAGR to 2031. Available online: https://miningdigital.com/smart-mining/global-smart-mining-market-to-grow-at-10-cagr (accessed on 8 November 2022).
  32. Acumen Research and Consulting. Mining Equipment Market Size Was Valued at USD 133 Billion in 2021 and Will Achieve USD 185 Billion by 2030 Growing at 4.1% CAGR Owing to the Rapidly Rising Construction Activities in Emerging Nations—Exclusive Report by Acumen Research and Consulting. Available online: https://www.globenewswire.com/news-release/2022/08/05/2492964/0/en/Mining-Equipment-Market-Size-Was-Valued-at-USD-133-Billion-in-2021-and-Will-Achieve-USD-185-Billion-by-2030-growing-at-4-1-CAGR-Owing-to-the-Rapidly-Rising-Construction-Activities-.html (accessed on 8 November 2022).
  33. El Maghraoui, A.; Ledmaoui, Y.; Laayati, O.; El Hadraoui, H.; Chebak, A. Smart Energy Management: A Comparative Study of Energy Consumption Forecasting Algorithms for an Experimental Open-Pit Mine. Energies 2022, 15, 4569. [Google Scholar] [CrossRef]
  34. Laayati, O.; Bouzi, M.; Chebak, A. Smart Energy Management System: Design of a Monitoring and Peak Load Forecasting System for an Experimental Open-Pit Mine. Appl. Syst. Innov. 2022, 5, 18. [Google Scholar] [CrossRef]
  35. Guerroum, M.; Zegrari, M.; Masmoudi, M.; Berquedich, M.; Elmahjoub, A. Machine Learning Technics for Remaining Useful Life Prediction Using Diagnosis Data: A Case Study of a Jaw Crusher. Int. J. Emerg. Technol. Adv. Eng. 2022, 12, 122–135. [Google Scholar] [CrossRef]
  36. Laayati, O.; Bouzi, M.; Chebak, A. Smart Energy Management System: SCIM Diagnosis and Failure Classification and Prediction Using Energy Consumption Data. In Proceedings of the Digital Technologies and Applications; Motahhir, S., Bossoufi, B., Eds.; Springer International Publishing: Cham, Switzerland, 2021; pp. 1377–1386. [Google Scholar]
  37. El Hadraoui, H.; Laayati, O.; Guennouni, N.; Chebak, A.; Zegrari, M. A Data-Driven Model for Fault Diagnosis of Induction Motor for Electric Powertrain. In Proceedings of the 2022 IEEE 21st Mediterranean Electrotechnical Conference (MELECON), Palermo, Italy, 14–16 June 2022; pp. 336–341. [Google Scholar]
  38. Laayati, O.; Bouzi, M.; Chebak, A. Design of an Oil Immersed Power Transformer Monitoring and Self Diagnostic System Integrated in Smart Energy Management System. In Proceedings of the 2021 3rd Global Power, Energy and Communication Conference (GPECOM), Virtual, 5–8 October 2021; pp. 240–245. [Google Scholar]
  39. Laayati, O.; El Hadraoui, H.; Bouzi, M.; Chebak, A. Smart Energy Management System: Oil Immersed Power Transformer Failure Prediction and Classification Techniques Based on DGA Data. In Proceedings of the 2022 2nd International Conference on Innovative Research in Applied Science, Engineering and Technology (IRASET), Meknes, Morocco, 3–4 March 2022; pp. 1–6. [Google Scholar]
  40. Grieves, M.; Vickers, J. Digital Twin: Mitigating Unpredictable, Undesirable Emergent Behavior in Complex Systems. In Transdisciplinary Perspectives on Complex Systems: New Findings and Approaches; Kahlen, F.-J., Flumerfelt, S., Alves, A., Eds.; Springer International Publishing: Cham, Switzerland, 2017; pp. 85–113. ISBN 978-3-319-38756-7. [Google Scholar]
  41. Batty, M. Digital Twins. 2018. Available online: https://journals.sagepub.com/doi/full/10.1177/2399808318796416?casa_token=G_IVGe8ZBPEAAAAA:Mh90cEjcFA3gVv3X7srEWfBAYGrC5MXx-XRjm69m1BVOhH6wuC3mzdyxyVV7yh7LJCgTY2eQJswfScYD (accessed on 24 October 2022).
  42. Shafto, M.; Conroy, M.; Doyle, R.; Glaessgen, E.; Kemp, C.; LeMoigne, J.; Wang, L. Modeling, Simulation, Information Technology & Processing Roadmap. Natl. Aeronaut. Space Adm. 2012, 32, 1–38. [Google Scholar]
  43. Boschert, S.; Rosen, R. Digital Twin—The Simulation Aspect. In Mechatronic Futures: Challenges and Solutions for Mechatronic Systems and Their Designers; Hehenberger, P., Bradley, D., Eds.; Springer International Publishing: Cham, Switzerland, 2016; pp. 59–74. ISBN 978-3-319-32156-1. [Google Scholar]
  44. AMRC. Untangling the Requirements of a Digital Twin. The University of Sheffield AMRC Factory 2050, Europa Avenue, Sheffield, S9 1ZA. 2020. Available online: https://www.amrc.co.uk/files/document/404/1604658922_AMRC_Digital_Twin_AW.pdf (accessed on 23 November 2022).
  45. Aivaliotis, P.; Georgoulias, K.; Chryssolouris, G. The Use of Digital Twin for Predictive Maintenance in Manufacturing. Int. J. Comput. Integr. Manuf. 2019, 32, 1067–1080. [Google Scholar] [CrossRef]
  46. Xu, Y.; Sun, Y.; Liu, X.; Zheng, Y. A Digital-Twin-Assisted Fault Diagnosis Using Deep Transfer Learning. IEEE Access 2019, 7, 19990–19999. [Google Scholar] [CrossRef]
  47. Min, Q.; Lu, Y.; Liu, Z.; Su, C.; Wang, B. Machine Learning Based Digital Twin Framework for Production Optimization in Petrochemical Industry. Int. J. Inf. Manag. 2019, 49, 502–519. [Google Scholar] [CrossRef]
  48. Negri, E.; Berardi, S.; Fumagalli, L.; Macchi, M. MES-Integrated Digital Twin Frameworks. J. Manuf. Syst. 2020, 56, 58–71. [Google Scholar] [CrossRef]
  49. Zhang, M.; Tao, F.; Nee, A.Y.C. Digital Twin Enhanced Dynamic Job-Shop Scheduling. J. Manuf. Syst. 2021, 58, 146–156. [Google Scholar] [CrossRef]
  50. Xia, K.; Sacco, C.; Kirkpatrick, M.; Saidy, C.; Nguyen, L.; Kircaliali, A.; Harik, R. A Digital Twin to Train Deep Reinforcement Learning Agent for Smart Manufacturing Plants: Environment, Interfaces and Intelligence. J. Manuf. Syst. 2021, 58, 210–230. [Google Scholar] [CrossRef]
  51. Steindl, G.; Stagl, M.; Kasper, L.; Kastner, W.; Hofmann, R. Generic Digital Twin Architecture for Industrial Energy Systems. Appl. Sci. 2020, 10, 8903. [Google Scholar] [CrossRef]
  52. Niu, X.; Qin, S. Integrating Crowd-/Service-Sourcing into Digital Twin for Advanced Manufacturing Service Innovation. Adv. Eng. Inform. 2021, 50, 101422. [Google Scholar] [CrossRef]
  53. Ashtari, B.; Jung, T.; Lindemann, B.; Sahlab, N.; Jazdi, N.; Schloegl, W.; Weyrich, M. An Architecture of an Intelligent Digital Twin in a Cyber-Physical Production System. Automatisierungstechnik 2019, 67, 762–782. [Google Scholar] [CrossRef]
  54. A Review of Digital Twin in Product Design and Development. Available online: https://ouci.dntb.gov.ua/en/works/9GBYB0Pl/ (accessed on 8 November 2022).
  55. Abrazeh, S.; Mohseni, S.-R.; Zeitouni, M.J.; Parvaresh, A.; Fathollahi, A.; Gheisarnejad, M.; Khooban, M.-H. Virtual Hardware-in-the-Loop FMU Co-Simulation Based Digital Twins for Heating, Ventilation, and Air-Conditioning (HVAC) Systems. IEEE Trans. Emerg. Top. Comput. Intell. 2022, 1–11. [Google Scholar] [CrossRef]
  56. Jung, T.; Shah, P.; Weyrich, M. Dynamic Co-Simulation of Internet-of-Things-Components Using a Multi-Agent-System. Procedia CIRP 2018, 72, 874–879. [Google Scholar] [CrossRef]
  57. Talkhestani, B.A.; Jazdi, N.; Schlögl, W.; Weyrich, M. A Concept in Synchronization of Virtual Production System with Real Factory Based on Anchor-Point Method. Procedia CIRP 2018, 67, 13–17. [Google Scholar] [CrossRef]
  58. Segovia, M.; Garcia-Alfaro, J. Design, Modeling and Implementation of Digital Twins. Sensors 2022, 22, 5396. [Google Scholar] [CrossRef] [PubMed]
  59. Boyes, H.; Watson, T. Digital Twins: An Analysis Framework and Open Issues. Comput. Ind. 2022, 143, 103763. [Google Scholar] [CrossRef]
  60. Botín-Sanabria, D.M.; Mihaita, A.-S.; Peimbert-García, R.E.; Ramírez-Moreno, M.A.; Ramírez-Mendoza, R.A.; Lozoya-Santos, J.D.J. Digital Twin Technology Challenges and Applications: A Comprehensive Review. Remote Sens. 2022, 14, 1335. [Google Scholar] [CrossRef]
  61. Uhlenkamp, J.-F.; Hauge, J.B.; Broda, E.; Lütjen, M.; Freitag, M.; Thoben, K.-D. Digital Twins: A Maturity Model for Their Classification and Evaluation. IEEE Access 2022, 10, 69605–69635. [Google Scholar] [CrossRef]
  62. Bowman, D.; Dwyer, L.; Levers, A.; Patterson, E.A.; Purdie, S.; Vikhorev, K. A Unified Approach to Digital Twin Architecture—Proof-of-Concept Activity in the Nuclear Sector. IEEE Access 2022, 10, 44691–44709. [Google Scholar] [CrossRef]
  63. Qamsane, Y.; Moyne, J.; Toothman, M.; Kovalenko, I.; Balta, E.C.; Faris, J.; Tilbury, D.M.; Barton, K. A Methodology to Develop and Implement Digital Twin Solutions for Manufacturing Systems. IEEE Access 2021, 9, 44247–44265. [Google Scholar] [CrossRef]
  64. Bécue, A.; Maia, E.; Feeken, L.; Borchers, P.; Praça, I. A New Concept of Digital Twin Supporting Optimization and Resilience of Factories of the Future. Appl. Sci. 2020, 10, 4482. [Google Scholar] [CrossRef]
  65. Han, X.; Yu, H.; You, W.; Huang, C.; Tan, B.; Zhou, X.; Xiong, N.N. Intelligent Campus System Design Based on Digital Twin. Electronics 2022, 11, 3437. [Google Scholar] [CrossRef]
  66. Stojanovic, L.; Usländer, T.; Volz, F.; Weißenbacher, C.; Müller, J.; Jacoby, M.; Bischoff, T. Methodology and Tools for Digital Twin Management—The FA3ST Approach. IoT 2021, 2, 717–740. [Google Scholar] [CrossRef]
  67. Barricelli, B.R.; Casiraghi, E.; Fogli, D. A Survey on Digital Twin: Definitions, Characteristics, Applications, and Design Implications. IEEE Access 2019, 7, 167653–167671. [Google Scholar] [CrossRef]
  68. Leng, J.; Wang, D.; Shen, W.; Li, X.; Liu, Q.; Chen, X. Digital Twins-Based Smart Manufacturing System Design in Industry 4.0: A Review. J. Manuf. Syst. 2021, 60, 119–137. [Google Scholar] [CrossRef]
  69. Hribernik, K.; Cabri, G.; Mandreoli, F.; Mentzas, G. Autonomous, Context-Aware, Adaptive Digital Twins—State of the Art and Roadmap. Comput. Ind. 2021, 133, 103508. [Google Scholar] [CrossRef]
  70. Kantaros, A.; Piromalis, D.; Tsaramirsis, G.; Papageorgas, P.; Tamimi, H. 3D Printing and Implementation of Digital Twins: Current Trends and Limitations. Appl. Syst. Innov. 2022, 5, 7. [Google Scholar] [CrossRef]
Figure 1. Paper structure.
Figure 2. Digital twin reference architecture model in Industry 4.0.
Figure 3. Digital twin integration level.
Figure 4. Digital-twin-enabled value lifecycle management.
Figure 5. Digital-twin-based collaborative environment.
Figure 6. Proposed digital twin multi-layer architecture framework for the mining industry.
Figure 7. Multi-agent-based simulation.
Table 1. Digital twin applications in different industrial sectors.

Team | Year | Application | Features | Digital Twin
Tomas [13] | 2021 | Industrial heat transfer station | Functional mock-up interface; Python; OPC UA | Concept
Yinping [14] | 2022 | Storage yard scheduling | Genetic optimization; neural network prediction; co-simulation | Automation
Zongmin [15] | 2021 | Population health management | Virtual–real integration of industrial IoT | Concept
Christoph [16] | 2019 | Power system mirror | Neural network; energy management | Concept and design
Edward [17] | 2020 | EV charging; microgrid | Smart energy management systems | Concept
Vivi [18] | 2019 | Campus | Building | Case study
Sofia [19] | 2021 | Energy management | Building; AI | Concept
Abdelali [20] | 2021 | Energy management | Building | Case study
Ahmed [21] | 2020 | Cyber-physical systems | Energy management; cyber-physical | Design and implementation
Kamil [22] | 2020 | Manufacturing | Assembly system; Industry 4.0 | Concept
Damir [23] | 2019 | Energy management systems | Distributed EMS; prediction; simulation | Concept simulation
Nada [24] | 2021 | Shop floor monitoring | Monitoring; cloud collaboration; edge cloud | Concept and case study
Oussama [25] | 2022 | Power transformer | Multi-agent hybrid AI fault detection | System and case study
Ayoub [26] | 2022 | Grinding mill case study | Predictive, maintenance-based, data-driven | Case study
Alessandro [27] | 2020 | Manufacturing production | Monitoring ergonomics | Case study
Juuso [28] | 2021 | Overhead crane | Accelerating production pace | Case study
Toh Yen [29] | 2021 | Shipyard | Fully integrated digital thread and digital twin | Framework architecture
Nabil [30] | 2022 | Open pit mine stacker machine | Full digital twin of a stacker machine | Architecture
Table 2. Mining machinery.

Machine | Number
Feeders | 2
Jaw crushers | 1
Stackers | 3
Belt conveyors | 21
Bucket wheel reclaimer | 2
Screening machine | 9
Pumps and compressors | 10
