How to Make Intelligent 3D Models Even More Intelligent

Originally aired on 4/7/20 | 35 Minutes
The promise of intelligent 3D models is right at our doorstep. In terms of construction, the question is clear: Shall we embrace this new technology or slip back into the past? This webinar explores the challenges of embracing new technologies and planning techniques that allow the free flow of ideas that only serve to move the technological needle forward.

Transcript

John Klobucar: Hello, I’m John Klobucar with InEight, and I’d like to welcome you to the latest webcast in our Path of Construction series. Today’s webcast is titled How to Make Intelligent 3D Models Even More Intelligent. Our presenter today is Max Risenhoover, who is an Executive Vice President at InEight. Max is the founder of M-SIX, which was acquired by InEight in November 2018. Following the acquisition, Max became a member of InEight’s leadership team, with responsibility for solutions in the virtual design and construction, quality, commissioning and Advanced Work Packaging categories. Prior to M-SIX, Max was the architect of a financial services software framework used by Shearson Lehman Brothers, Bear Stearns, and the World Bank. Max has also worked in recording studios around the world with a diverse range of artists. Now, if you have any questions as you watch this webcast, please email them to webcasts@ineight.com, and Max will do his best to answer them. Also, this presentation is being recorded, and we’ll be sending you a link to the video in about a week’s time. Once again, we’re glad you’ve joined us, and now, let me introduce Max Risenhoover.

Max Risenhoover: Thanks, John. So today, I’m going to be covering How to Make Intelligent 3D Models Even More Intelligent. I’ll basically be dividing this up into three chapters. Why? Why do we need our models to be more intelligent? How does this relate to Advanced Work Packaging? Even though this webinar is not about AWP, there’s some overlap there. And then, what’s InEight’s approach? First things first, what’s an intelligent model? A simple definition would be the geometry, the shapes, the 3D faces and vertices and all the things that make up the spatial definition of things we’re building, linked to information, information in the broadest sense: tables in a database, documents, photos, linked attributes, checklists, issues, et cetera.

Max Risenhoover: By that definition, we certainly have intelligent models. We have powerful authoring environments that allow us to create design models, design intent models for each of the disciplines, for example, and then over time, perhaps those give way to construction models, fabrication models and as-built models that might be polygonal or, more and more commonly, derived from LIDAR or photogrammetry into point clouds for as-builts. So we have intelligent models. But if we already have intelligent models, why does it seem like their full potential is rarely achieved, and why are the benefits so inconsistent? Some areas derive these benefits routinely; spatial coordination, sometimes referred to as clash detection, provides a big win on many non-trivial projects. But there are many other teams that could derive benefits from this intelligence, and it doesn’t happen that often.

Max Risenhoover: I believe it was Albert Einstein who said, “The reason the construction industry rarely realizes the full potential of 3D models is that each team needs a different classification system to meet their role-specific needs and that manually specifying these classifications is tedious and error-prone, especially if it must be repeated with each model revision.” So what do we mean by a classification system? Really, it’s just tagging model elements with parameters, properties, metadata, but each team, each discipline, tends to need a different kind of classification system.
Max Risenhoover: An estimator might do an early estimate just based on quantities in Uniformat II categories. Later on, they might need more detail. A BIM coordinator might just need attributes around discipline and area to set up their clash rules and do spatial coordination. A scheduler might need links to WBS IDs, links to rows in a schedule. A workface planner might need estimated ship dates from the procurement system to be able to plan installation work packages. A QA/QC team member might need inspection status. There are many, many examples of the kinds of information that we would like to be able to link into this repository, this common data environment that we’re creating here, and it’s not just 3D geometry.

Max Risenhoover: We want to be able to organize all kinds of information, but the classification system is the key to making this information useful to the various teams. And this problem is really compounded when models are replaced from stage to stage in the life cycle, or even replaced with revisions within a single stage as the design proceeds. So a model element that is part of the electrical package at the design intent stage would typically give way to a detailed construction model or a fabrication model or a model that was provided by the manufacturer, and then maybe the as-built model is an audited version of the construction model or, in the future, more typically might even be a LIDAR-derived point cloud. So if at each of these steps we were linking information to the model itself, then presumably we have to do that again when we replace the design model with the construction model, and again when we replace the construction model with the as-built model.

Max Risenhoover: Clearly that’s a huge problem. So maybe what we should define as an intelligent model here isn’t one that links these things directly to model elements; instead, we introduce an intermediate indirection step of an ID or, typically, multiple IDs, which might be asset tags, systems, location IDs, cost codes, WBS codes, basically a primary key into a classification system. So basically, we want this intelligent model to be able to accommodate the very different needs of all these different teams. So really what we’re saying is that model elements are just a new addition to this list of information types that we would link. We want to have structured relationships between all the different information that’s useful, with role-specific and task-specific views of that information.

Max Risenhoover: So that’s our goal. It’s easy to say and it’s very difficult to execute. And that’s a good segue into Advanced Work Packaging. You’ve all seen me by now and know that I am an actual human being and not a robot, so I’m going to hide my face and make a little more room on the screen. And again, this webinar is not specifically targeted at AWP. That’s something that the InEight Model team and all of InEight is making a significant investment in during 2020, and we’ll be speaking about that in much more detail. But there’s a certain amount of overlap between Advanced Work Packaging and what we’re describing here around intelligent models, and since the Advanced Work Packaging concept is not as widely understood in vertical building and infrastructure projects as it is in industrial projects, especially in oil and gas, here’s a really quick rundown.
Max Risenhoover: AWP is a set of methodologies that were really born out of studies by CII and other groups, going back quite some time, more than 10 years, where they were studying productivity in the field and seeing that metrics like time on tools were pretty terrible, sometimes 40% on a project. So 40% of the time the craft in the field was actually building, and the rest of the time they were waiting for obstacles to progress to be removed: work that needed to be done prior to their work, drawings, information, materials, tools, resources of all kinds. And so these methodologies are a formalization of the idea of identifying and managing those constraints to improve productivity in the field. So the standard approach, which is used a great deal in oil and gas and which I think we would all agree should propagate to the other areas of our industry, is this idea of decomposing a big, complicated project into parts that make predictability easier and easier.

Max Risenhoover: So it’s typical to then define a path of construction that’s derived or decomposed into construction work areas. And then within a construction work area there are single-discipline engineering work packages and construction work packages, and then a CWP, a construction work package, would be decomposed into multiple installation work packages, potentially even dividing installation work packages into daily plans. So this hierarchy is an important part of AWP. Another thing that these working groups are defining is the handoff of information that makes identifying those constraints possible. You have to break down a lot of silos to really make that work well. And so this concept of a digital thread, which is borrowed from military aviation and manufacturing and is now starting to become part of Advanced Work Packaging working groups, is about identifying the handoff, or the flow, of information from different teams over the life cycle of a project.

Max Risenhoover: Some AWP projects put the construction model at the very center of all of this process and have it be very, very central to this idea of information moving from the various teams to the various stakeholders, ultimately feeding cost, procurement, schedule, tracking progress and then verifying that the quality and the completions and startup process are ready to hand this off to the owner. So this idea of a digital thread is very, very important, and intelligence in the model is key to all of that. It’s also key to InEight’s strategy for 2020 and beyond: having these integration points between all the different tools work smoothly, being indifferent about whether that hierarchy of CWAs to CWPs down to IWPs is created in Model versus Plan or Schedule or some of the other tools in the platform, having a single shared data structure representing all of that information, and then having these different tools and different solutions in essence be different user interfaces into that same data structure.

Max Risenhoover: So whether or not your team is exploring and curious about the prescriptive methodologies of AWP in particular, the ideas in AWP are undeniably valuable and in some ways very, very simple: breaking down silos, integrating the flow of information and, key to that with regard to models, taking advantage of the intelligence in the models. We’ve outlined some obstacles here.
So what’s InEight’s approach? We start with a CDE, a common data environment, that provides easy access to every revision of every design, construction and as-built model for every discipline, plus linked information of all kinds: documents, rows in a database table, photos from the field, issues, checklists and so on. And then we make it easy to leverage the existing intelligence provided by those model authoring tools and to add or update that information when necessary, manually from the office or from the field, or via automation.

Max Risenhoover: So I’ll illustrate these first two concepts in the live software now, and then we’ll come back to automation in a moment. To start with, here’s the project structure. I’ve expanded all of these, what we call model streams. These represent all the different models from the different disciplines in this project. And if we click on the root node, we see all the revisions to the project itself over time, with revision notes about what changed. And then for individual models, like the grid model here, we see there were only two revisions back in 2014. But here’s how it’s all organized, and we have many, many models here, hundreds, and then revisions of each of them. This is a particular level of this international airport terminal, so a lot of information here. If I quickly go to a different filtered view of the data set, we’ll see there’s even more.

Max Risenhoover: This is the entire airport campus and a transit corridor leading to a portion of downtown to provide context. And if I zoom in, we see that we’ve got lots of detail; we’re not cheating by hiding the detail and the information. So to continue with my example of breaker panels, let’s say my project is providing the owner with a digital twin at the end of the project, and I’m an electrical sub that’s obligated to provide information that’s unlikely to be available at design time, like asset serial numbers and owner’s manuals, things that we can’t pull from the design authoring environment, or that it’s very unlikely we could, anyway. So today I’m auditing the breaker panels. Let’s say I just go ahead and click on one of them and I look at the linked data. I can see a schema, a data structure, that was defined by the facilities and maintenance team for the things that they want to track.

Max Risenhoover: I can see any linked documents if there are any, so if I wanted to see field photos for this asset, I could, or the O&M manual. I have easy access to all these documents. So I see that the document submittals are complete for this. If I go back to the linked data, let’s just assume that I’m providing the asset serial number, although I think it’s a little bit contrived to do that from the modeling environment. It’s more likely that this would be scanned in the field using the iPad app or perhaps entered from the back office through the Excel plugin. But if I change that here and I show you my pending changes, I have this one change that I’m about to share if I publish, and that updates the asset serial number for that asset. Let’s imagine I’m continuing my audit of all the breaker panels, so I’ll turn off this section tool that’s hiding other spatial areas of this model.

Max Risenhoover: And we see that there’s a ton of information here. This is every discipline, every sprinkler head, every piece of mechanical equipment, the people movers, the baggage handling system; everything is in here. So to manage that much information, there might be an occasion where we would use the Excel plugin to help with that.
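Before moving to the Excel view, here is a minimal sketch in SQL of how linked data like this might be laid out. The table and column names are illustrative assumptions rather than InEight’s actual schema; the point is that the information is keyed by an asset tag, a classification ID, rather than by a specific model element, which is why the links can survive model replacement.

-- Illustrative sketch only; table and column names are assumptions.
CREATE TABLE fm_equipment (
    asset_tag         VARCHAR(64) PRIMARY KEY,   -- classification key, not a model element ID
    description       VARCHAR(255),
    asset_serial_no   VARCHAR(64),               -- captured during the field audit
    submittal_status  VARCHAR(32)
);

CREATE TABLE linked_documents (
    doc_id     INTEGER PRIMARY KEY,
    asset_tag  VARCHAR(64) REFERENCES fm_equipment(asset_tag),
    doc_type   VARCHAR(32),                      -- e.g. 'O&M manual', 'field photo'
    file_path  VARCHAR(512)
);

-- Because documents and attributes hang off the asset tag rather than a
-- model element, replacing the design model with a construction or as-built
-- model that carries the same tags does not break these links.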
So if I switch to Excel, I can see the list of all the SQL database tables that make up this project, and this FM data table, the one that I mentioned was defined by the O&M team, the FM team. I can query the latest. So rather than loading a spreadsheet where I wouldn’t be confident that I have the latest information, this is actually looking at the central database and doing a query of this master equipment list. And if I use Excel for what Excel is good for and filter down to these breaker panels that I’m concerned with today, and then I make a little room here, then as I click through the breaker panels, it’ll take me to the objects in the model.

Max Risenhoover: I can see the linked data, and I can look for documents as I mentioned. So I could go through this entire model and make sure that the ultimate digital twin that I’m delivering to the owner is complete and accurate. So that was a really quick demo of those first two foundational items. Let’s talk about automation a bit. In 1555 in The Prophecies, Nostradamus said, “From the West there will come a new way to augment the intelligence of design, construction and as-built models that is versatile and efficient, even in the face of frequent model revisions. At its heart, a mystery that shall be known as the DTO.” By the way, for anyone on this webinar who is maybe not familiar with my sense of humor, I may be paraphrasing or perhaps misquoting throughout this presentation, but in this case the DTO, which stands for data transformation operation, is a crucial part of our approach to making these models more useful to the broader team.

Max Risenhoover: So a DTO you could think of as a generic script that takes model elements, geometry and properties as inputs in order to create new properties, calculate quantities, link information and documents, audit compliance with execution plans and automate the application of standard or custom classification systems. To give you a better understanding of it, first of all, a warning: DTOs are a little bit nerdy. They’re quite technical. Some of them are very simple to make and some of them can be extremely complex. The reason this shouldn’t be a concern is that the idea here is that they’re used to create generic, reusable libraries. So a very, very small proportion of the project team needs to understand the plumbing of how they work, because once they’re defined, they’re easy to use over and over again, and a broad spectrum of the project team can use them without having to get into the weeds of how they work.

Max Risenhoover: But very, very briefly, you can create a library of these things that can have one, ten or hundreds of them, and they can be defined for different kinds of project types or defined to work with particular partners that define their models in certain ways. And ultimately, any one of these DTOs is a selector that provides an expressive set of rules to determine what subset of a model we’re looking at, so basically a query or a filter into the model, and then, based on that, some operations. So we could look at those model elements that the filter identified and then create new properties, or do calculations based on existing properties, or use metadata to create links to things. So there’s a very expressive and powerful set of capabilities that can come from this. So here’s an example of adding intelligence to a model with the purpose of calculating quantities, in this case for an estimate.
Max Risenhoover: But there are many, many applications of this, and this is a very real-world example because the models, while they provided all kinds of intelligence, didn’t provide the intelligence that this team wanted in order to do the estimate the way they wanted to. They wanted to be able to create an estimate that was driven by really two things: the zone, so where, in the case of this high-rise mixed-use building, a zone within the project, combined with a cost code, basically. And then they wanted to be able to compare the quantity changes over time, from initial 2D takeoff to model-driven quantities, and as the models revised and got closer to construction, to be able to see those changes. First I’ll switch to InEight Estimate. The team wanted to calculate quantities based on really two things: the area in the model, so zones one through eight, and then a cost code that mapped to broad categories of materials, so for example, 8,000 PSI concrete versus 6,000 versus 3,000.

Max Risenhoover: They had a challenge in that they wanted to understand how they could calculate the area of rectangular ducting, to calculate the square meters of all the materials needed for the ducting. So when we go to the model itself, there’s a lot going on here. This is 500 gigabytes of models, mostly Revit. And if we go to the model structure here, if I were to, for a moment, turn off all the structural models and other architectural models and zoom in, you start to see all the systems. So there’s an enormous amount of information here. If I go to structural for a moment, and if I look at the tags and properties, we see zone came across from the authoring environment for all the disciplines. So zone one in the case of structural is the pilings and foundations.

Max Risenhoover: Zone two is the rest of the podium, on up the tower to zone eight. So zones are good. We have no cost code whatsoever, though. So let’s say this is the first time we’ve worked with this particular team on a vertical build project like this, and so we need a power user, an expert on the team, to create any missing DTOs so that we can have appropriate cost codes. So from the workshop that I did with this particular team, I’ve saved some of those DTOs. In the case of concrete, just to show you an example: as a reminder, we have the selector that’s choosing what in the model to work on, and an operation that is an output of this DTO that’s going to do something. So we select some nouns and then have a verb, an action, here, right?

Max Risenhoover: So what about the properties that we do have to work with? Typically these authoring environments provide all kinds of stuff; it’s just not necessarily coming across in a way that’s useful to the estimators or to the schedulers or to the workface planners. But what do we have? In looking at this model, material name turned out to have a lot of useful stuff in it that would help us create the kinds of properties we needed that were missing. Revit family type, too: often anything that comes from Revit will have lots and lots of detail in Revit family types. I’m scrolling here through thousands and thousands of these. So in this kind of one-time process of looking through typical models from your partners as we create this generic, reusable library, you can discover things that are generic and reusable. So if we search on structural, we’re now narrowing down to a smaller set.
Max Risenhoover: If I were to double-click on an item in that list and then look at it here in the model, I can start to get a sense of what maps to what. If I shift-click through a bunch of these that are labeled VCD and then isolate them, I can see what I’m working with. So this kind of experimentation and exploring of the model can help us understand what is available to us. In doing that with concrete, searching 8,000 PSI, for example, brought up a set of properties, and then 8,000 space PSI brought up some other ones. So fast-forwarding in the storytelling of this, it turned out that we could create some DTOs, let me clear these out so I don’t get confused later, some prep DTOs that would create the missing properties so that then, when we connect with the estimate, it makes sense.

Max Risenhoover: So let me get back to a view that makes sense here: the orthographic front view of all the structural. Go to the DTOs, and now for concrete 8,000 PSI, we have a selector that says find every model element in this entire model where material name contains 8,000 PSI or material name contains 8,000 space PSI. If I click test on that, it’ll show me what matches that filter. And then for every match, the operation in this case is a simple one: action type, create a tag, with the tag name of cost code and the tag value of concrete hyphen 8,000 PSI. It’s almost exactly the same for 6,000 and for 3,000; it turns out there’s not a lot of 3,000 PSI in this project, just a little bit down here. And then having cost codes means we’re going to have a mapping to the items here, because we already have zone.

Max Risenhoover: Ducting was a little more complicated because, in looking at all the ducting parameters that we had, the team wanted to be able to calculate the square meters of materials for all the rectangular ducting, and it just wasn’t quite as straightforward as that. What we found was that the physical type tag of ducting had all kinds of false positives in it, all kinds of stuff that we don’t want to include. So we had to take a kind of two-step, nerdy approach to this. We did a preparatory step that says, okay, find everything where physical type is ducting and Revit family type contains the word grill, or contains register or fan or flex or round. So basically that was all the ways to find the things in the ducting that were not rectangular ducting. This is a kind of in-case-of-emergency, break-glass approach to getting what you want done here; it would not be a typical thing to have to do. So having created this selector, the output is to add a temp tag called duct other.

Max Risenhoover: So this is something we’re just using as a kind of intermediate step, because as I mentioned, the output of a DTO can feed the input of another one. And then here’s another DTO that’s even nerdier, really: it’s using raw SQL to then say, all right, select every object from this master table of properties and tags where the physical type is ducting and this new property we just created in the previous DTO is null or does not equal duct other. So basically what that’s saying is find everything that’s ducting but not flagged by that prep DTO as something other than the rectangular ducting that we’re looking for. And then the output of that is going to be a cost code of mech dash rectangular ducting.
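For readers who want a more concrete picture of that selector-plus-operation idea, here is a rough sketch of equivalent SQL against a hypothetical master properties table. The table and column names are illustrative assumptions, and this is not the product’s actual DTO syntax, which isn’t shown verbatim in the webinar.

-- Illustrative sketch only; names are assumptions, not actual DTO syntax.

-- Simple concrete DTO: the selector matches on material name, the operation
-- creates a cost code tag.
UPDATE master_properties
SET    cost_code = 'Concrete-8,000 PSI'
WHERE  material_name LIKE '%8,000PSI%'
   OR  material_name LIKE '%8,000 PSI%';   -- both spellings appear in the source models

-- Prep DTO for ducting: flag everything under physical type 'Ducting'
-- that is NOT rectangular duct (grills, registers, fans, flex, round).
UPDATE master_properties
SET    temp_tag = 'duct other'
WHERE  physical_type = 'Ducting'
  AND (revit_family_type LIKE '%grill%'
    OR revit_family_type LIKE '%register%'
    OR revit_family_type LIKE '%fan%'
    OR revit_family_type LIKE '%flex%'
    OR revit_family_type LIKE '%round%');

-- Second ducting DTO: whatever is ducting but was not flagged above gets
-- the rectangular ducting cost code.
UPDATE master_properties
SET    cost_code = 'MECH-Rectangular Ducting'
WHERE  physical_type = 'Ducting'
  AND (temp_tag IS NULL OR temp_tag <> 'duct other');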
So one person who understands all the nerdy details of that can create a reusable DTO that could be part of a library used only for a particular partner or project type, or maybe it’s a one-off for a single project, but it becomes reusable and it’s generic.

Max Risenhoover: So all these prep DTOs really are in service of what happens next. If I go to Estimate and I go to the new plugin that creates a connection between Estimate and Model, I can send basically all of this to Model as a cost breakdown structure. So I’ll click on the concrete, and it’ll take everything below that in the tree, and I’ll control-click on ducting. And keep an eye over here in the DTO library. When I click send selected CBS to Model, we now have a new category in Model called estimate. So the Estimate plugin created a DTO for every line item in the estimate, honoring the CBS structure that maps to the line, just by creating a pair of rules. So we’ve got cost code equals concrete dash 8,000 PSI and tag zone equals zone one, zone two, zone three. It’s the same for the 6,000 PSI concrete and the 3,000, which I believe will only be in this first zone, and likewise with the ducting, cost code equals mech rectangular ducting, zone one.

Max Risenhoover: And in each of these cases, the operation, the output, is to calculate the quantity. So in this case it looked at the units here. Square meter is the unit of measure for all the ducting, and so it created a square meter units output, calculating the actual area for everything meeting this selector. So if I say, all right, let’s run all these, because this is a really, really large model, it’s executing millions and millions of operations: first going through every model element and running those prep steps, creating the missing intelligence, and then the estimate rules, those DTO rules, are leveraging them and calculating cubic volumes and square meter areas. All of that will take a moment. So while it’s happening, I’ll switch to Estimate, just to point out a couple of things. One is that the cost items already have quantities plugged in.

Max Risenhoover: Maybe we could assume they came from an earlier model revision or from a 2D takeoff. The other is that we don’t really have much logic plugged into these yet, but you could certainly have these quantities feeding into labor costs and material costs in a more sophisticated way than I have in this demonstration right now. So I’ve actually shown you the hard way first, right? We’ve had to create this library. In a typical situation where we’ve got a library set up that matches the estimate template, it’s as simple as loading up the template, clicking send selected CBS to Model, running all the DTOs and then syncing those model quantities back.

Max Risenhoover: So now that all those calculations are done, we just hit sync model quantities, and this is showing us the current quantity in the estimate and the calculated quantity that’s coming from Model, with some color coding to show us how big the delta is between them. And as we mentioned earlier, the 3,000 PSI only exists in zone one, so there’s nothing in the rest of these zones. So if I assume these are correct and I just select all of them and accept them, then they’re going to update all of these quantities immediately. Any flow-through to other calculations around costs and labor and all of that will be driven by those quantities.
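As a sketch of the quantity roll-up that each of those generated line-item DTOs performs, the queries below show how a synced quantity might be computed from the same hypothetical master properties table, summing a calculated area or volume for every element matching a cost code and zone pair. Again, the table and column names are illustrative assumptions.

-- Illustrative sketch only; assumes each element row carries calculated
-- area (square meters) and volume (cubic meters) values.
SELECT SUM(area_sq_m)   AS rect_ducting_zone1_qty
FROM   master_properties
WHERE  cost_code = 'MECH-Rectangular Ducting'
  AND  zone = 'Zone 1';

SELECT SUM(volume_cu_m) AS concrete_8000_zone1_qty
FROM   master_properties
WHERE  cost_code = 'Concrete-8,000 PSI'
  AND  zone = 'Zone 1';

-- Each synced value is then compared against the current quantity in the
-- estimate, and the delta between them is color-coded for review.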
If I were to hit sync model quantities again without making any change, it’s all white, indicating that there is no difference between the current quantity in the estimate and the calculated quantity from the model. So the really interesting thing now is to show how easy it is, once we’ve set up these libraries, to handle what happens when models revise.

Max Risenhoover: To demonstrate that, I’ll take advantage of one of the interesting features of the platform, which is that we save every revision of every model. On the structural model for the mezzanine, it’s color-coded here in this way because it’s saying you’re not looking at the latest model. In fact, the green is the one that’s active right now, and there have been two model revisions since then. So it would be unusual to not be on the latest, but there are some cases where that’s useful. And in this case, I made two edits to the mezzanine structural model to make kind of a gross and obvious change. So let me set up a… Actually, I’ll go ahead and change to the latest, and when I accept this, somewhere over on the right here, when we update this model, we’re going to see some new stuff, something kind of obvious.

Max Risenhoover: So I, not being any kind of a structural engineer, just created a couple of things that I added to this model. All we have to do at this point to see the impact on the estimate is to rerun the DTOs and then sync model quantities. So let’s go ahead and select just the estimate DTOs. Actually, I think we might need to do the preps to grab that, so we’re going to have to go ahead and do both. So let me run those DTOs. And it was quite a bit faster that time because I had already churned through all of those in the previous passes. So since my model revision just changed the 6,000 PSI concrete for these columns and girders and the 3,000 PSI concrete for the slabs, when we go back and hit sync model quantities, we would expect to just see changed quantities for zone one 6,000 PSI, where it increased a little bit from the previous value, and for zone one 3,000 PSI.

Max Risenhoover: I appreciate your patience as we got into the weeds there a little bit, but I think that was a really good example of showing a case where there’s information needed by a team that’s just not part of the models as delivered, and we need an automatable way to provide it in a reusable, generic form. So I think that was a good example in a vertical build context. Because those tools are so generic, it’s easy to imagine them helping out with workface planning in an AWP project or on an infrastructure project. So we’ll look forward to finding opportunities to show how we can add intelligence into the model to suit those other use cases as well. So ultimately what we’re saying here is that in an ideal world, all of those powerful authoring tools from Autodesk, Bentley, Tekla, Hexagon and the like would provide everything that every team needs.

Max Risenhoover: In the real world, no matter how good the execution plans are and how well the contract deliverables are written, there will be things that are missing or things that can’t be known at design time and have to be added in an aggregated environment. So tools like this are really necessary. So we’re building our platform on this premise: an intelligent model allows structured relationships between all kinds of information, with role-specific views, to be practical. This requires automation to group and classify and quantify and link elements to other information types. If it’s done manually, perhaps it’s tolerable despite the tedium and the error-prone nature of it.
Max Risenhoover: Perhaps it’s tolerable once, but in the face of changing projects and revising models, et cetera, it’s just not practical. And it’s also crucial in this effort to break down silos between teams and tools, to allow this information to flow as part of a digital thread from the very beginning, from pre-planning all the way through to post-construction operations. So thanks again for your time today. I really enjoyed sharing some of this with you, and I look forward to other opportunities to talk about more of the great tools we have in our platform. Thanks.

John Klobucar: Thank you, Max. Again, if you have any questions about this webcast, please email them to webcasts@ineight.com. To learn more about InEight, as well as our broad portfolio of construction project management solutions, visit ineight.com and click on the request a demo button, and if you’d like to see a schedule of upcoming webcasts, visit ineight.com/webcasts. Thanks for watching. This concludes our presentation.

Related Resources

Data vs. Intelligence: The Power of Integrated Program Controls

Recent advances in construction program controls have transformative potential, but success requires a thoughtful approach that distinguishes between data and intelligence. Join Osama Abdelfatah, Regional Program Controls Director, Americas at AECOM, and Rick Rients, Director of Client Services at InEight, as they explore how to effectively use increasingly large volumes of digital program controls data.

This discussion will dive into how integrated project management information systems (PMIS) go beyond data processing, turning data into intelligence and insights that drive successful project delivery. The key: Harmonizing data to ensure it’s not just abundant but actionable.

Register for the webinar to:

    • Understand the limits of traditional data-centric approaches in program controls.
    • Discover the benefits of integrated, intelligence-driven solutions.
    • Explore the theory and implementation of a controls engine using a real-world case study of a major program in the US.
    • See how the shift from disparate systems to a single source of truth enables more informed, strategic decision-making.
The Year of Data: Why Data Is the New Currency of Construction in 2024

2024 marks a tipping point for construction – it’s the year in which data has become more important than even dollars as the currency of the industry. Data informs decisions, helps us be more productive and efficient, and shapes how we spend our money. Without good data, stakeholders are grasping in the dark — and in an increasingly complex and demanding environment, that’s not just frustrating, but dangerous and commercially irresponsible.

This session addresses the urgent need for leaders to adopt a data strategy to overcome challenges and redefine project outcomes. Learn about:

    • Why data literacy is critical for your business
    • The strategic benefits of well-structured data
    • How a robust data asset sets you up for success with AI

Register now to join us Wednesday, 14 February at noon AEDT for a journey into the future of data excellence in construction.

Navigating the Unpredictable: Mastering Uncertainty in Project Controls

It’s not just you: Uncertainty in construction really is unusually high these days. Whether because of inflationary pressures, adverse weather events, or shifts in materials availability – or all of those factors – successful project control demands an ever more delicate balancing act between scope, cost, and schedule.

But helpful tools and techniques are out there. Join Brad Barth, Chief Product Officer, and John Upton, Vice President of Control at InEight, as they discuss integrating scope, cost and schedule to optimize project controls, even in the face of the most daunting uncertainties.

Tune in to learn about:

  • Aligning risk analysis across scope, cost and schedule.
  • Steering clear of common pitfalls when enhancing project controls. 
  • Overcoming the challenges of integrating risk management.