New Era of BIM Lifecycle Implementation - Issue 4
Over the past decade, the understanding of Building Information Modeling (BIM) has expanded from a 3D update of traditional CAD drafting to a lifecycle concept that encompasses the process of creating and managing digital models of built assets. The AECO industry has responded with increased expectations for the advantages the BIM workflow will provide. Project owners and BIM users are well prepared for its utilization during the delivery phase and now anticipate similar results during the operational phase, which runs through the remaining lifecycle of a project. Demand for collaborative approaches to collecting, creating, managing, and sharing information has intensified.
This article is the fourth part of a series that deciphers the journey of building information and data management from the delivery phase to the operational phase. The first two articles reviewed the impact of the ISO 19650 series on Asset Management, followed by the applications and drawbacks of widely adopted classifications in the Building Information Modeling (BIM) to Facility Management (FM) transition. The third article presented an FM-oriented Data Dictionary Management System (DDMS), which can serve as the backbone for ISO 19650 implementation and a Digital Twin. This article continues by explaining why a Digital Twin matters to facility owners and further explores what a Digital Twin for facility management should entail to fulfill lifecycle implementation.
New Era of BIM Lifecycle Implementation
Part 4: A Cradle to Cradle Digital Twin Ecosystem for Building Asset Management
By Dr. Eve Lin, Microdesk EAM Strategy Consultant, Dr. Xifan (Jeff) Chen, Microdesk EAM Assistant Director, and George Broadbent, Microdesk VP of Asset Management
Introduction
Previously in this series, we reviewed the importance of data management behind the model handoff during BIM lifecycle implementation, in terms of data interoperability, accuracy, and sufficiency. We highlighted the necessity of developing well-defined data requirements from an AM/FM perspective and using them to regulate delivery phase data collection and population. We also described an AM/FM Data Dictionary Management System (DDMS) that helps address commonly seen issues during the PIM to AIM transition as well as during operation and maintenance. This article further discusses the data flow during the entire project lifecycle from delivery to operational phases and introduces the current trend of Digital Twins -- an ideal BIM implementation scenario that needs to be built on top of a solid data foundation.
Lifecycle BIM Data Management
As previously discussed, Figure 1 illustrates the top-down structure of a project lifecycle at different levels. It indicates the fundamental importance of a well-defined and well-managed FM-oriented DDMS to the entire BIM program. While we emphasized the importance of a DDMS during the FM stage owing to its long duration and high operational cost over the entire lifecycle, a well-planned DDMS is also a critical foundation for lifecycle BIM data management. In the real world, even a well-coordinated data management plan can collapse at any moment due to small glitches in the data exchange process, which consequently impact the downstream data flow. Without governance and policies to standardize workflows and processes, it is hard for organizations to maintain data interoperability, sufficiency, and accuracy, because teams tend to work in silos and make decisions based only on the information available to them.
Therefore, we need to elevate the topic from the data or technology application level to the business process level, which includes but is not limited to the BIM Standard, BIM Execution Plan, Contractual Language, and QA/QC Compliance Validation. This is where a DDMS can help bridge all data gaps. In the following sections, we take a close look at data gaps through real-world scenarios and describe how a DDMS can benefit the project delivery and operational loop.
Data Gaps
Data gaps are the most common yet most difficult problem in lifecycle BIM implementation. A data management plan can break on one or several data gaps over the course of project delivery and operation. One example is miscoordination between the data in the submitted BIM model and the data collected onsite. In an ideal scenario, data from these two channels should complement each other in a synchronized way. However, this rarely occurs as planned. Figure 2 illustrates the ideal and real scenarios of asset data collection for facility management purposes. During the project delivery phase, data is collected from all channels (onsite, cut sheets, manuals), yet part of it is not required by facility management, while the data facility management really cares about is missing. Possible root causes are listed below:
- There is no concrete FM-oriented data requirement.
- If there is a solid data requirement, it is not included in the Integrated Project Delivery (IPD).
- If the requirement is included in the IPD, there is no QA/QC session to regulate the data submission.
Closing the Loop
In this case, the DDMS starts by helping the owner or facility management team build up a well-defined data requirement (with all the semantic analysis and AI recommendation functionalities described in the previous article). In this way, the facility owner systematically understands what data they are expecting from upstream. After the data dictionary has been developed, the DDMS further polishes the data requirement, assembles the data dictionary into data requirement (parameter) packages, and shares them with each corresponding stage (design, construction, commissioning, etc.).
Stakeholders in each delivery stage are informed of the types of data they are expected to collect or populate, so as to avoid the expensive task of collecting and coordinating asset data after the building has been occupied. Lastly, the DDMS needs to be capable of analyzing submitted data against the hosted data dictionary to perform a compliance check. This process tests not only data completeness but also data validity and uniqueness based on each attribute/parameter requirement. The DDMS also performs data analytics and reporting on the analyzed data so the data or BIM manager can refine their data collection.
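A compliance check of this kind can be sketched in a few lines of Python. The data-dictionary entry, parameter names, and rules below are hypothetical, for illustration only; a production DDMS would host far richer attribute definitions.

```python
# Hypothetical sketch of a DDMS-style compliance check: submitted asset
# records are validated against a data-dictionary entry for completeness,
# validity (type), and uniqueness. Parameter names/rules are illustrative.

dictionary = {
    "AssetTag":     {"required": True, "unique": True,  "type": str},
    "Manufacturer": {"required": True, "unique": False, "type": str},
    "InstallDate":  {"required": True, "unique": False, "type": str},
}

def check_compliance(records, dictionary):
    """Return a list of issue strings for the submitted records."""
    issues = []
    seen = {name: set() for name, rule in dictionary.items() if rule["unique"]}
    for i, rec in enumerate(records):
        for name, rule in dictionary.items():
            value = rec.get(name)
            if rule["required"] and value in (None, ""):
                issues.append(f"record {i}: missing required '{name}'")
                continue
            if value is not None and not isinstance(value, rule["type"]):
                issues.append(f"record {i}: '{name}' has wrong type")
            if rule["unique"] and value is not None:
                if value in seen[name]:
                    issues.append(f"record {i}: duplicate '{name}' = {value}")
                seen[name].add(value)
    return issues

records = [
    {"AssetTag": "AHU-01", "Manufacturer": "Acme", "InstallDate": "2021-03-01"},
    {"AssetTag": "AHU-01", "Manufacturer": "Acme"},  # duplicate tag, no date
]
print(check_compliance(records, dictionary))
```

The point of the sketch is the shape of the loop: every submitted attribute is judged against the single hosted dictionary, so completeness, validity, and uniqueness findings all trace back to one source of truth.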
In short, the DDMS helps clients with what needs to be collected as well as when and by whom. The entire process is centered around the defined data dictionary to eliminate data miscoordination. Figure 3 illustrates the loop of data collection and validation driven by the DDMS.
Gaps like the one described above vary and occur throughout project delivery. Data glitches here and there add up to the failure of the entire data management plan, regardless of how good it was originally. The combination of a well-defined data dictionary, a solid business process that ensures its successful implementation, and effective data management tools can help the client overcome these gaps and eventually get on the right track to closing the loop.
The Call of Digital Twin
Confusion from the Owner’s Perspective
The Word Speaks for Itself
The concept of a Digital Twin has been around since 2002 and has quickly spread into manufacturing, healthcare, automotive, aerospace, and other industries. The basic definition of a Digital Twin is a digital replica of a physical entity that connects the physical and the virtual world to enable data synchronization between them. In other words, a Digital Twin is a living model that reflects and keeps up with its real physical counterpart, including not only representative geometry but also current conditions and data.
In the field of AECO, a Digital Twin is not tied to a specific level of development, detailed modeling requirement, or technology/process. From the owner's perspective, what they receive by the end of the delivery phase is a digital replica of what has been built onsite, which, most of the time, includes everything the owner wants to know.
Once a project has gone through the delivery phases, onsite commissioning, and field/asset data collection, population, and validation, a Digital Twin can be developed and handed over to the owner. It fully connects and syncs with the owner's Computerized Maintenance Management System (CMMS), updating itself periodically during the entire service life of the facility. Facility asset information, from detailed to high-level, can be synchronized into the Digital Twin, which can then perform analytics and reporting. Meanwhile, the 3D representation of the facility helps operation managers and line workers accelerate work order response time, enhance data interoperability, and increase FM efficiency and efficacy. As a living digital replica, the Digital Twin keeps itself updated during the facility's entire service life and is ready to serve as a refreshed "as-built model" at the next renovation. In this way, a Digital Twin reshapes the BIM lifecycle from "Cradle to Grave" to "Cradle to Cradle."
Digital Twin Ecosystem
The living virtual-reality mapping mechanism enables analyses and reports of real-world data to head off problems before they occur, prevent failure, and possibly develop new plans. The concept of a Digital Twin naturally and logically connects BIM, CMMS, Business Intelligence (BI), Artificial Intelligence (machine learning, deep learning, etc.), data science, GIS, and the Internet of Things (IoT).
"Living" is the keyword in this connected process: it makes possible everything that does not apply to a traditional "static" as-built model. An effective and functioning Digital Twin for facility management requires a well-balanced five-dimension ecosystem, as illustrated in Figure 4, comprising the Physical World, Digital World, People & Organization, Business Intelligence, and Digital Connections. How effective and intelligent a Digital Twin will be relies on how well each dimension is developed and connected.
Physical World: Data Capture Capability
Creating a virtual replica of the physical world depends on how well the Digital Twin can capture real-time data. The advancement of sensing, monitoring, and measuring technology, along with IoT, allows an abundance of real-time data to be captured and utilized. Evolving from traditional monthly metered utility bills, real-time data on outdoor and indoor environments and building systems can now be captured, measured, and monitored for operational and maintenance opportunities. However, the effective response path relies on the business intelligence of the Digital Twin, described in the fourth dimension of the ecosystem. In addition, the quality of the collected data also impacts the response results. To prevent garbage in, garbage out, collecting accurate and validated data is essential for data integrity.
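One minimal way to keep garbage out is to screen incoming readings before they reach the Digital Twin. The sketch below assumes hypothetical sensor names and plausible-range limits; real deployments would add timestamp, rate-of-change, and cross-sensor checks.

```python
# Minimal sketch of validating incoming IoT readings before they enter the
# Digital Twin. Sensor names and plausible-range limits are hypothetical.

PLAUSIBLE_RANGES = {            # illustrative engineering limits
    "zone_temp_c": (-10.0, 50.0),
    "supply_cfm":  (0.0, 20000.0),
}

def validate_reading(sensor, value):
    """Accept a reading only if its sensor is known and the value is plausible."""
    limits = PLAUSIBLE_RANGES.get(sensor)
    if limits is None:
        return False                      # unknown sensor: quarantine it
    low, high = limits
    return low <= value <= high

readings = [("zone_temp_c", 21.5), ("zone_temp_c", 999.0), ("supply_cfm", 1200.0)]
clean = [(s, v) for s, v in readings if validate_reading(s, v)]
print(clean)   # the implausible 999.0 spike is filtered out
```

Even a screen this simple prevents a single faulty sensor from corrupting downstream analytics, which is the "data integrity" point made above.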
Digital World: Digital Model Maturity
The second dimension of the Digital Twin ecosystem relates to the maturity of the digital model. With the adoption of ISO 19650 and an FM-oriented DDMS as a backbone, a streamlined process from PIM to AIM becomes possible. While several BIM performance measurement metrics, maturity models, and tools have been developed to gauge the performance of BIM, the measurement metrics for an FM-oriented Digital Twin are still not clearly defined. However, the maturity of the data model can be roughly gauged based on data quality (i.e., data accuracy, richness, consistency), data capability, and lifecycle support.
People & Organization
How people and organizations interact with and utilize a system is the determining factor in the effectiveness of a Digital Twin. No matter how advanced the technologies are or how rich the data model is, a self-driving car won't start until the driver knows where to push the power button. The people/organization aspect is an often-overlooked area. On the individual level, how well is an individual equipped to interact with their necessary systems? How can you increase an individual's competency? On the organizational level, how efficient are the processes within the organization? How do you move a decentralized organization toward a centralized and agile process? Sometimes matching users with the tools they know how to use is more efficient than giving them a supercomputer they don't know how to run.
Business Intelligence
With the solid foundation of collected data, data models, and clearly defined business logic, this dimension focuses on making the system intelligent, such as applying Artificial Intelligence, Machine Learning, and Artificial Neural Networks to replicate human cognitive processes and learning behavior to perform tasks and improve performance. With the foundation of real-time data and asset information, big data and advanced predictive analysis can be utilized to turn unorganized data into actionable insights that increase the accuracy of prediction and support decision making. The services involved include but are not limited to condition monitoring, function simulation, evolution simulation, dynamic scheduling, predictive maintenance, quality control, etc.
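As a toy illustration of condition monitoring turning raw data into an actionable insight, the sketch below flags a reading that drifts far from its recent trailing average. The sensor values, window size, and tolerance are hypothetical; a production system would use trained models rather than a fixed threshold.

```python
# Toy condition-monitoring sketch: flag readings that deviate more than a
# given tolerance from the trailing mean. Values and threshold are hypothetical.

def drift_alerts(values, window=5, tolerance=0.2):
    """Return indices where a value deviates > tolerance from the trailing mean."""
    alerts = []
    for i in range(window, len(values)):
        mean = sum(values[i - window:i]) / window
        if mean and abs(values[i] - mean) / abs(mean) > tolerance:
            alerts.append(i)
    return alerts

# Simulated vibration readings from a pump; index 6 is an abnormal spike.
vibration = [1.0, 1.1, 0.9, 1.0, 1.05, 1.02, 1.6, 1.0]
print(drift_alerts(vibration))   # flags the spike at index 6
```

An alert like this is what the business-intelligence layer would translate into a work order in the CMMS, closing the loop from sensed condition to maintenance action.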
Digital Connections
The final dimension is the six digital connections that bring together the previous four areas: the relationships between (1) the physical model and digital model; (2) the physical model and business intelligence; (3) the digital model and business intelligence; (4) the physical model and people & organization; (5) the digital model and people & organization; and (6) business intelligence and people & organization. Moving data transfer from a manual, error-prone input process to a streamlined and highly autonomous level is another key to the effectiveness of a Digital Twin. Numerous technologies are currently available to support these connections, such as internet, communication, interaction, and collaboration technologies, as well as human-computer interaction technologies, i.e., Virtual Reality, Augmented Reality, and Mixed Reality.
The connection between the physical and digital worlds ensures that collected real-time data are reflected dynamically. Through the connection between the digital model and business intelligence, the data can be analyzed and corresponding actions generated; through the connection between business intelligence and the physical world, the corresponding controls can be sent back and executed on the physical entities. The connection between people/organization and business intelligence allows business logic, processes, and services to be dynamically reflected in the business intelligence system; the generated work orders and preventive maintenance tasks can then be sent back and deployed to different actors within the organization.
Given many different models and inputs among different dimensions, to ensure smooth data interaction, a standardized data exchange protocol with unified communication interfaces and standards becomes more important than ever before. A standardized data exchange format reemphasizes the importance of an FM-oriented DDMS. With that as the backbone of the data management system and the foundation of the connection protocol, well-connected pathways can then be established to support a healthy Digital Twin ecosystem for managing the entire project lifecycle seamlessly.
Solid Foundation + Well-Balanced Development is the Key
The progression of technologies in the different dimensions of a Digital Twin brings a tremendous amount of potential and possibility, but it cannot advance without a well-defined DDMS as the foundation along with a well-balanced ecosystem. From the data foundation perspective, a well-functioning Digital Twin involves an increasing number of data streams and increasing complexity in maintaining the consistency, integrity, and interoperability of data collected from all sources, including but not limited to the roll-in of new data formats and rollout of old ones, data from the upstream BIM as-built model, and collected IoT data.
Consequently, establishing a solid DDMS as the foundation is a must for the success of a BIM lifecycle implementation or a Digital Twin system. From the technology implementation perspective, well-balanced development across all dimensions is essential. For example, IoT technologies enable the connected operation process and reshape data utilization for facilities and asset management. However, the implementation process is not a plug-and-play scenario free of challenges.
First, a standardized DDMS needs to be established to ensure a seamless data exchange flow. Then, all stakeholders involved in the process must work collaboratively and understand the methodology to utilize the innovative solution effectively. This means organizational and user capabilities need to increase simultaneously to maintain the balance of the ecosystem, which requires coordination among facility operations, IT, and business leaders. The learning curve of each implemented technology determines the degree of overall implementation and further affects system performance. Lastly, from the financial perspective, a well-established Digital Twin requires high capital costs for system implementation, including DDMS deployment, installation and configuration of various sensing and monitoring technologies, as-built model and data collection, business logic programming, and user training.
Although the Lifecycle Benefit/Cost Analysis of a Digital Twin demonstrates a promising Benefit/Cost Ratio (BCR), ROI, and Payback Period, the initial cost is still a determining factor that impedes clients from adopting or understanding the system at the outset. Moreover, associated policy modifications, execution plans, adaptability, and other external influences indirectly impact system implementation, positively or negatively. Nonetheless, a Digital Twin as the final destination of BIM lifecycle implementation has become a widely adopted vision and goal for ultimately leveraging available technologies to their fullest potential for building lifecycle management.