
Denver International Airport – Data Transition for a BIM to FM Process

Asset Management has both a strategic and a tactical component, and both are vital to the long-term success of any program. When we look at the technical component, it boils down to two main areas: first, where the information is coming from, and second, which system will be used for asset and work order management. However, there is a fundamental component between these two areas that is often overlooked; this is what we call the data transition.

Denver International Airport recognized that the data transition was a critical component of its long-term success and began taking steps to fill the gap between collecting the information and populating its asset management system.

The first look at this process pointed to the need to develop a Single Source of Truth: one system that would store the information and feed it to all of the other systems.

However, building and maintaining a Single Source of Truth is a far more complex undertaking; it requires systems and controls that constantly update the data from both the source and the destination systems so that everything stays in sync.

About three years ago, Denver moved away from the central repository model for its information toward shared communication between systems. This became a federated model for information management.

Any information management plan needs to define the source and the destination for the information, as well as the extract, transform, and load (ETL) process for that data. Only by ensuring that the process is consistent and reliable can you ensure that the data is also consistent and reliable.
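
As an illustration only, the sketch below shows what a minimal ETL step between a BIM data export and an asset management system might look like. The field names, the bim_export.json file, and the load stub are hypothetical and are not part of DIA's actual implementation.

```python
import json

# Hypothetical mapping from BIM parameter names to CMMS (e.g., Maximo) field names.
FIELD_MAP = {
    "Mark": "description",
    "Manufacturer": "manufacturer",
    "Model": "modelnum",
    "Room Number": "location",
}

def extract(path):
    """Extract: read an exported BIM data file (assumed to be JSON for this sketch)."""
    with open(path) as f:
        return json.load(f)

def transform(elements):
    """Transform: rename fields per the mapping and drop anything unmapped."""
    for element in elements:
        yield {target: element[source]
               for source, target in FIELD_MAP.items()
               if source in element}

def load(records):
    """Load: hand the records to the destination system (stubbed here)."""
    for record in records:
        print("would send to CMMS:", record)

if __name__ == "__main__":
    load(transform(extract("bim_export.json")))
```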

In a federated model for information management, the individual systems responsible for the data on a day-to-day basis maintain their own data, and the processes for maintaining and keeping up that data live within those systems. Through a trusted connection, the ETL process moves the information from one system to the target system, where processes are put in place to maintain and keep up that data. The data can then be returned to the source system based on information that is updated in the target. It is this bi-directional flow of information that was appealing to Denver, along with not having to maintain a third database for storing information.

<Figure 1>

BIM, ModelStream® and Maximo

Microdesk ModelStream® was the product selected to help them build this federated model of information management.

ModelStream allows the information to be mapped between the source and the destination system; in this case the source was a BIM model and the destination was an IBM Maximo system. It is through this mapping that the data translations are handled: based on the mapping, the system extracts the information, transforms it, and loads it into Maximo.

The Maximo system is constantly being updated with new equipment and new information, and this information can then go back into the BIM model using ModelStream's functions. What this process allows is for each group to continue to update and maintain their data as their jobs require, while ModelStream handles the ETL process of moving information from one system to the other on an as-needed basis. The BIM models can be updated through the normal BIM workflow, and Maximo is updated through normal day-to-day facility management and operations. The two systems are tied together through this mapping, but they are not dependent on each other.
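
As a rough sketch of that on-demand, bi-directional pattern (the function names and data structures below are hypothetical, not ModelStream's API), each direction is simply a separate transfer that can be run on its own schedule:

```python
def push_bim_to_maximo(mapping, bim_records):
    """Push newly modeled assets from the BIM export toward the CMMS (stubbed)."""
    for record in bim_records:
        payload = {mapping[k]: v for k, v in record.items() if k in mapping}
        print("create/update asset in Maximo:", payload)

def pull_maximo_to_bim(mapping, maximo_records):
    """Pull fields maintained in Maximo back toward the model parameters (stubbed)."""
    reverse = {v: k for k, v in mapping.items()}
    for record in maximo_records:
        updates = {reverse[k]: v for k, v in record.items() if k in reverse}
        print("write back to BIM parameters:", updates)

def run_sync(mapping, bim_records, maximo_records):
    """Each direction runs as needed; neither system depends on the other to do its daily work."""
    push_bim_to_maximo(mapping, bim_records)
    pull_maximo_to_bim(mapping, maximo_records)
```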

As the BIM models evolve and new standards are developed for the data captured within those models, the mapping file is updated to show where the information needs to go in the target system. Because there is a profile for each set of models, existing models that use an older data standard do not need to be updated to the new standard; they can continue to use their existing mapping to push and pull information to and from the Maximo system.
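
To illustrate the idea of a profile per set of models (the structure, file names, and parameter names below are a hypothetical sketch, not ModelStream's configuration format), each model simply carries a reference to the mapping version it was published against:

```python
# Hypothetical per-model mapping profiles: older models keep their original
# mapping, while newly delivered models reference the updated data standard.
MAPPING_PROFILES = {
    "dia-standard-v1": {"Mark": "description", "Manufacturer": "manufacturer"},
    "dia-standard-v2": {"Asset Tag": "description", "Manufacturer": "manufacturer",
                        "Install Date": "installdate"},
}

MODEL_REGISTRY = {
    "ConcourseA_Mechanical.rvt": "dia-standard-v1",   # existing model, older standard
    "ConcourseB_Mechanical.rvt": "dia-standard-v2",   # new delivery, new standard
}

def mapping_for(model_name):
    """Look up the mapping profile a given model was published against."""
    return MAPPING_PROFILES[MODEL_REGISTRY[model_name]]
```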

The process developed at DIA and at other infrastructure locations allows the approach and system to be used not only for MEP but for structural engineering as well. The approach has proven useful for fixed structural assets such as bridges and is being extended to tunnel infrastructure.

Impact on Owners

From the owner’s perspective, the construction process contains rich data sets that are needed in operations and maintenance. Unfortunately, traditional methods of data delivery require additional time and money to get the information into the hands of the operations and maintenance staff. This time has been measured at somewhere between 6 and 18 months to process the closeout packages, extract the data, and convert it into something a CMMS can use.

This process, using ModelStream®, greatly reduces the time to data delivery for operations and maintenance.

 

Asset Management Impacts

Day one operational support is the holy grail of Facility Maintenance and Operations: at the end of major construction, the assets, related data, and reference documents are loaded into an asset management system, so that on day one of operations and maintenance all relevant information about the assets is available to facility staff. ModelStream® allows the industry to get one step closer to Day 1 operations by streamlining the data transfer process from a unified as-built model into an operations and maintenance CMMS.

In addition, operations staff can now view the 3D model and all of the related data from inside Maximo through the introduction of a Model Viewer tab. This Model Viewer tab, which displays the models published to the Forge platform, allows the facility technician to see where an asset is located and what equipment is around it.

Future-proofing their process

The overall approach that was taken allows for a flexible and adaptable system where the BIM Models prepared in Revit can be based on an evolving set of data standards and modeling techniques. Since each model is given a separate mapping file, and they are published individually to the Forge Platform, the mapping between Revit and Maximo is based on the Model that existed at the time of publishing. As standards evolve and new models are delivered, the mapping file is updated to reflect the new standard. This means that the old models do not need to be re-mapped as is the case with a single data model.

Challenges that this approach overcomes

The challenge of trying to maintain one unified data model for all BIM models at varying stages of development is overcome by this approach, because each set of BIM models has its own mapping file that can easily be copied from one BIM model to the next. This solves the issue of having to retrofit BIM models with new data sets as the owner’s standards evolve. Each model can still map to one central operations and maintenance system, and the data can flow back to each model without having to retrofit it. The result is a long-term, sustainable data model.

Implementation challenges

As with any data management system, complexities of implementation always come up. In this case, the complexities revolved around ensuring that the Maximo development, staging, and production environments were at similar revision levels to ensure data consistency across each platform. Because we are populating assets, you sometimes do not want asset records updated directly in Maximo; all information updates were therefore handled through the specifications or classifications in Maximo, and an automation process then updated the asset record itself once the values were approved.
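
As a minimal sketch of that pattern, assuming a hosted Maximo instance exposing its standard REST/OSLC interface (the host name, authentication scheme, object structure, and attribute names below are assumptions for illustration, not DIA's configuration), an incoming value could be merged into the asset's specification rather than written to the asset record itself:

```python
import base64
import requests

MAXIMO = "https://maximo.example.com/maximo/oslc"                 # hypothetical host
AUTH = {"maxauth": base64.b64encode(b"user:password").decode()}   # assumed auth scheme

def find_asset(assetnum, siteid):
    """Look up an asset through the MXASSET object structure (assumed to be exposed)."""
    resp = requests.get(
        f"{MAXIMO}/os/mxasset",
        headers={**AUTH, "Accept": "application/json"},
        params={"oslc.where": f'assetnum="{assetnum}" and siteid="{siteid}"',
                "oslc.select": "assetnum,description,assetspec{assetattrid,alnvalue}",
                "lean": 1},
    )
    resp.raise_for_status()
    return resp.json()["member"][0]

def update_spec(asset_uri, attrid, value):
    """Merge an updated value into the asset's specification, not the asset record."""
    resp = requests.post(
        asset_uri,
        headers={**AUTH, "x-method-override": "PATCH", "patchtype": "MERGE",
                 "Content-Type": "application/json"},
        json={"assetspec": [{"assetattrid": attrid, "alnvalue": value}]},
    )
    resp.raise_for_status()
```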

The main technical issues we ran into were Maximo patch-level differences among the environments and network/firewall issues we had to work through. The firewall issues were mainly about getting exceptions opened from the hosted Maximo environment to the Forge platform.

As a part of a separate effort, DIA realized that they had to modify their BIM process to add placeholders for the data they were looking to collect and eventually transfer to Maximo. Even though the data didn’t exist yet, the placeholders allowed for the creation of the mapping files. All the additional parameters were created at the instance level in the models to allow for individual values (as opposed to global values that are placed at the family level). This provided support for using the model as a facility model.
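
As a rough illustration of why the placeholders matter (the parameter names and the exported-data format below are hypothetical), a simple pre-export check can confirm that every element carries the instance-level placeholder parameters the mapping file expects, even while the values are still empty:

```python
# Hypothetical list of instance-level placeholder parameters the mapping expects.
PLACEHOLDER_PARAMS = ["Asset Tag", "Manufacturer", "Model", "Install Date", "Serial Number"]

def missing_placeholders(elements):
    """Report elements in an exported model data set that lack a placeholder parameter.

    `elements` is assumed to be a list of dicts, one per model element, each with a
    'parameters' dict mapping parameter name -> value (the value may be empty)."""
    problems = {}
    for element in elements:
        missing = [p for p in PLACEHOLDER_PARAMS
                   if p not in element.get("parameters", {})]
        if missing:
            problems[element.get("id")] = missing
    return problems
```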

DIA wanted Maximo to auto-number the assets, but also wanted to be able to see what a particular element in Revit was linked to in Maximo. To accomplish this, we set up an inbound (Maximo-to-Revit) mapping that brings the value of the “assetnum” field back into the Revit model after the asset is loaded into Maximo, so they get full visibility.
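
A minimal sketch of that write-back step is shown below; the target parameter name and the data structures are assumptions for illustration, since the actual inbound mapping is configured in ModelStream rather than hand-coded.

```python
def write_back_assetnums(elements, maximo_assets, target_param="Maximo Asset Number"):
    """Copy the Maximo-generated assetnum back onto the matching model element.

    `maximo_assets` is assumed to be keyed by the element identifier that was
    sent to Maximo when the asset record was created."""
    for element in elements:
        asset = maximo_assets.get(element["id"])
        if asset is not None:
            element["parameters"][target_param] = asset["assetnum"]
    return elements
```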

Conclusion

What this and other projects like it have shown us is that BIM does have a role within operations and maintenance. The single source of truth model may not currently exist, but there is a role for a federated data model, with BIM as the primary source of information. What is left for the Facility Maintenance and Operations industry to figure out is whether the 3D model itself has a useful role in operations and maintenance. The 3D model allows facility staff to view and examine the work area prior to going into the field. Future integrations of technology will allow the 3D model to be used in the field for a “behind-the-wall” view. However, for the industry to get there, a robust BIM workflow must be in place to ensure that the data is consistent and reliable.