The Gold Standard for Data: Setting the Stage for AI

Oct 28, 2025 | Webinar, Data, AI

Originally aired on October 28, 2025 | 1 hour watch time

For too long, inconsistent paper records, long-forgotten spreadsheets, and siloed teams have kept construction organizations from taking full advantage of their valuable project data. Teams crave a trusted single source of truth to learn from and refer back to throughout the project lifecycle—which is impossible in a fragmented data environment.

Adding AI to today’s chaotic data won’t solve anything. To achieve meaningful gains from AI, you first need to get your data house in order.

Discover how standardizing operations with the right data approach helps organizations achieve better decision making and stronger results—while setting the stage for AI success.

This deep-dive webinar provides actionable strategies for technical-minded construction leaders interested in improving operations and project outcomes, both today and in the future, by building gold-standard data practices.

You’ll learn:


  • Proven strategies to improve data collection, accuracy, and sharing
  • How to improve current and future project outcomes through effective benchmarking
  • How data-driven decision-making mitigates risks and improves outcomes
A headshot of Brian Mikinski

Brian Mikinski

Software Engineering Manager, InEight Schedule

A headshot of Kimo Pickering

Kimo Pickering, PE

Product Owner, InEight Estimate

Transcript

Lance Stephenson:

Hello everyone and welcome to today’s webinar. My name is Lance Stephenson and I’ll be the host for today. While everyone is entering the room, I’d like to go over a few housekeeping rules for today’s event. The presenters will be taking questions at the end of the presentation. So, if you have a question, please include it in the Q&A box located at the bottom of your screen.

Please note though, attendees are able to upvote each question by clicking on the Thumbs-Up icon next to the open question. This allows us to present these questions first, so the presenters can address the ones most relevant to everybody. If we don’t get to your live question, the presenters will attempt to answer it in a follow-up email.

Today’s presentation is CEU eligible, and it is being recorded. So, attendees will receive a copy of the presentation recording as well as the slides and a certificate of attendance containing CEU credits within the next 48 hours.

So, with that said, today’s webinar, The Gold Standard for Data: Setting the Stage for AI, is sponsored by InEight. We have a couple of speakers from the InEight team today. First, we have Kimo Pickering. Kimo is a product owner at InEight, where he leads product development for the company’s Estimate application, a core solution used by contractors, engineers, and project owners to manage cost estimating and project data. With over 30 years of combined experience in civil engineering, construction management, and software development, Kimo brings a rare blend of field-tested construction experience and digital transformation insight to his role.

Before joining the software industry, he spent nearly three decades working on major infrastructure projects, including highways, bridges, airports, and underground utilities, serving in roles as project manager, senior estimating manager, lead estimator, senior project engineer, project controls manager, general superintendent, and craft superintendent.

Our other presenter, Brian Mikinski, is a software architect and program manager with over 13 years of experience in the construction and energy industries. His career has focused on designing and developing enterprise-grade software solutions that support large capital projects. Brian is a founder of InEight Schedule and currently serves as its program manager, where he combines deep software engineering expertise with firsthand industry knowledge to develop innovative field-ready technology. His work centers on bridging the gap between construction operations and digital transformation with a growing emphasis on AI and data continuity across project life cycles.

So, without further delay, I would like to turn this over to today’s two presenters. Thank you.

Brian Mikinski:

Thank you, Lance. And thank you to all the participants that are joining us today. Like Lance said, my name is Brian Mikinski. I’m a software architect and program manager for InEight Schedule. And I’m joined here with Kimo Pickering, the product owner on the InEight Estimate team. Kimo and I are very excited to share some insights and talk about data and laying the foundation for an AI-first future.

But right off the bat, we wanted to do a little survey and try to gain an understanding for who we’ve got in the audience today. So, if you could, please answer the survey and then we’ll take a look at that here after I walk through a little bit more about InEight.

Okay. So, let’s talk a little bit about InEight here. We are a leading vendor in the construction project control software industry. As you can see by this slide, we cover project information management, costs and scheduling, construction operations, change management, design management across every aspect of project controls and building large scale projects. We’re a modular and an integrated platform. We’ve been around for a little bit over 10 years as a suite. We were founded in 2014.

You can also see we’ve got a lot of money flowing through our systems at any given time. So, right now, we globally manage greater than a trillion dollars in actively executed projects, and then we’re pretty much everywhere. So, you are going to find us in around 60 countries or so all over the globe. So, at our core, really, InEight is about empowering construction professionals with the tools that they need to make informed decisions every step of the way.

All right. So, I’m going to turn it over here to Kimo. But before I do that, I’ve got the results of the poll here. So, about 19% of us are owners. And the question was, which of the following best describes your organization? So, 19% of y’all are owners. We’ve got some owner reps and construction managers. We’ve also got designers, about 8%, general contractors 26%, 5% of y’all are subcontractors or suppliers, and then 36% are consultants. So, that’s the first question. Welcome and thank you all for being here again.

We also had a second poll question which is, which of the following best describes your role within the organization? So, 26% are project management, 24% are cost controllers, 3% of y’all are document controllers, 17% are schedulers, an area that’s close and dear to my heart. Great to have the schedulers here. We’ve also got contract admins at 3%, design engineers at 8%, and then 20% of you operate in a consultant role within your organization. So, once again, thank you all for being here and it’s great to see such a diverse crowd across many different verticals at the companies that y’all work at.

So, at this point I’m going to turn it over to Kimo to talk a little bit about some of the key takeaways that we’d like to try and communicate here today.

Kimo Pickering:

All right. Thank you, Brian. Again, my name is Kimo Pickering, I’m a product owner for InEight. I work on the Estimate application team. Let’s pause here and highlight a few key takeaways that we want you to leave with at the end of the session. First, leveraging data for better decisions. In today’s construction environment, decisions are only as good as the data behind them. So, when we capture and structure data effectively, especially using what we call account codes, which we’re going to talk about a lot today, we empower teams to make faster and more informed choices with greater confidence.

Second, benchmarking for continuous improvement. This is looking at historical data and using it for forecasting purposes on the job, or for the estimate that you’re working on and your individual cost items. So, data isn’t just for tracking, it’s for learning. By comparing performance across projects, scopes and teams, we can identify patterns, spot inefficiencies, and drive improvements over time. This turns experience into a strategic asset.

Finally, building a strong data foundation for AI. That’s the buzzword these days. AI isn’t magic, though. It needs clean, contextualized data to deliver value. By tagging and organizing our data today, we’re laying the groundwork for tomorrow’s AI-driven insights, automation and predictive capabilities. These three principles are the pillars of a modern construction data strategy.

Next slide, Brian. So, let’s start with the reality of data challenges in construction. And there’s a lot of them. Our industry runs on information, but that information lives in disparate systems. So, you have things like ERP systems, estimating tools and applications, scheduling platforms, 3D and BIM models, Digital Twins, spreadsheets, job photos, work plans, submittals, handwritten notes, et cetera. I mean it’s all over the place on any typical construction project. Nothing is perfect and the result is often fragmentation and chaos.

When data is siloed, decisions are made without complete visibility leading to risk and inefficiency. So, what’s the strategy then? Well, we’re going to start small and work our way up, but we need to build a good base and it begins with account codes. Think of them as a common language that ties everything together, because what you measure becomes what you know. Your data is your experience if you measure it correctly. Quantities matter, of course, but they only have meaning in context.

Account codes provide that context, allowing us to connect costs, schedule and scope across systems. So, why does this matter for AI, you might be asking? Well, when we embed account codes thoughtfully, we’re not just organizing data, we’re preserving experience. And that experience becomes fuel for AI-driven decision augmentation, et cetera. Imagine engineers and estimators leveraging decades of hard-won knowledge, but at machine speed.

Alternatives can be evaluated instantly. Routine low-risk tasks can be automated, freeing time for higher-value decisions. The goal is decision-quality data because no one wants to spend $5,000 to support a $5 decision. In other words, don’t step over a $20 bill to pick up a nickel. Next.

Brian Mikinski:

All right. So, Kimo, thanks for laying out some of the challenges there. But I want to dive a little bit deeper into the reality of construction data. So, we’ve got siloed data sources, everybody does. And a common way to view our data is as two major types: there are unstructured data sources and there are structured data sources. And both of these are going to present different, unique and interesting challenges.

So, let’s talk first about the unstructured data sources. These are often very rich in context, but they can be hard to analyze without advanced tools. These are sources that everybody on this call is going to know very well. These are the Word documents, the PDFs, the spreadsheets. They’re going to have specs, contracts, budgets. It’s all scattered all over the place, it’s disconnected, and there’s really no consistent tagging or way to search and analyze this data beyond what you’re going to find in a common file search or maybe a PDF search. You’ve probably also got emails. We’ve probably got text messages. All this data is really classified as unstructured data.

The other category that we’re all going to be pretty familiar with too is our structured data sources. So, in this area, we’re going to find our ERPs. This is going to be the financial information that lets your organization run. SAP, Oracle, JDE, those are all going to be the systems where you’re going to find the structured data. You’re also going to have more of your project management tooling. These are tools that InEight offers, that Procore offers, that Autodesk offers. And oftentimes, it can be difficult for these more structured but siloed and proprietary sources to communicate together.

Some other examples here are going to be estimating software, such as InEight Estimate or HCSS. And then we’ve got rich 3D models, Digital Twins with lots of great metadata, but it’s isolated. That’s really one of the problems here is that it’s not easy to communicate between our structured data sources. And then more commonly, we’re always going to have some GIS or drone survey data. That’s really becoming common on nearly any project.

So, the reality is none of this is ever going to be perfect, none of it’s going to ever be totally lined out and all communicating synchronously. And we know that at some level the net result of these disparate data sources is chaos. And chaos is bad. Chaos leads to uncertainty, it adds unrecognized risk, it can cause delays, cost overruns, et cetera, and nobody wants to deal with that. But with every chaotic situation, we also have to remember that there’s also an opportunity.

And so, Kimo and I here would propose that account codes and starting up high, starting at the top, we can really begin to take this fragmentation and turn it more into an integration. And what we’re aiming for is to try to weave together these unstructured and these structured data sources to better enable cross-domain insights, forecasting and smarter decision-making.

Okay. So, really, we need a strategy here and let’s talk about that a little bit. How are we going to bring some order to this chaos? And as we’ve already said, it’s going to be account codes. And I really like to think of account codes as a common language, a lingua franca between our disparate systems, between our structured data, between our siloed ERP, and our project management. And the goal here is that we are trying to get all of these systems to speak across a common language and get true cross-functional visibility.

And so, when we start to connect the dots between these systems and we allow the ERP and the estimating systems and our project management systems to communicate together, we begin to realize some operational productivity. But one important thing I want to call out here is that we really aren’t just looking for operational productivity in our current and our active projects. We need a way to tie this information to historical projects. And account codes are going to allow us to connect those dots and bring more order to the chaos.

And when we start to do that, we can realize gains from our historical data and we can understand that what we measure is really what we know, and that’s going to allow for our data to become more of an asset and less of a liability. And then really looking forward, this is our true foundation for AI and automation. And coding both historically and with active projects is going to prepare us for the predictive analytics that we know is going to come at some point, and for the automated reporting. And it’s even going to allow us to gain and aggregate insights from our unstructured data sources such as our photos and our documents.

All right. Kimo, why don’t you give us an idea of what an account code actually is here?

Kimo Pickering:

Yeah. So, what exactly are account codes? One way to look at it is to think of them as a substitute for physical work. They give us a way to look across all the different operational domains, separate operations, and start to weave them together. Instead of just tracking tasks, account codes provide the context that connects cost, schedule, and scope.

Now, let’s talk about contextualizing account codes. They act like guardrails. They define where and what you can explore from your historical data and they help us capture experience, because experience isn’t just something you have, it’s something you can ask questions of and get suggestions from. Account codes support estimators, construction engineers, design engineers, schedulers and project management by giving structure to work plans and even highlighting safety and quality hotspots.

They can even be used for augmentation, helping us present data in an actionable way. For example, man-hour curves, commodity charts, leading indicator analyses for safety and quality incidents, and safety and quality risks. The key here is understanding that everyone has used pay item codes and cost codes. And most people are using their own form of cost codes, but they’re typically slightly different from project to project. Whereas standardizing on an organization-wide account code system enables apples-to-apples comparisons versus apples-to-oranges comparisons.
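
To make that apples-to-apples idea concrete, here is a small sketch in Python. Every project name, account code, and dollar figure below is invented for illustration; the point is only that a shared code lets costs from different projects roll up into one blended benchmark.

```python
from collections import defaultdict

# Each record: (project, account_code, quantity, cost). Everything here is
# a made-up illustration, not real project data.
records = [
    ("Hwy-12",  "53.09.02", 1200,  96_000),   # asphalt paving, lay and roll
    ("Apt-Rwy", "53.09.02", 3400, 255_000),   # same work, different project
    ("Hwy-12",  "61.09.01",  800, 640_000),   # concrete deck work
]

# Roll up quantity and cost per account code across all projects
totals = defaultdict(lambda: [0.0, 0.0])
for _project, code, qty, cost in records:
    totals[code][0] += qty
    totals[code][1] += cost

# Blended unit cost per code: the apples-to-apples benchmark
for code, (qty, cost) in sorted(totals.items()):
    print(f"{code}: ${cost / qty:.2f} per unit over {qty:.0f} units")
```

If each project used its own cost codes instead, the two 53.09.02 rows could not be combined, which is exactly the apples-to-oranges problem described above.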

So, what about some use cases? There’s plenty of them. Estimating based on past costs, what we call past costs, some might call historical data, historical man-hour factors, et cetera. Providing alternative analysis. Onboarding new employees faster. And I’ll be showing you some real-world examples here in a minute of how field operations can be tied to account codes.

But one important note: account codes don’t tell you everything. They won’t capture things like supply chain issues or change orders, although you can, to a certain degree, with enough flexibility, apply account codes or some tagging system to some of these items I just mentioned. But those factors still need to be managed separately. And when it comes to creating the connected data strategy, account codes are the backbone.

Next slide, Brian, please. So, I’ve got a couple of examples here that I’m sure those in the audience are very familiar with. For account codes, there are different standards organizations out there around the planet. The US and Canada have their own. The UK has its own. Up on the screen, I’m showing a couple of examples: one is the CSI MasterFormat, and to the right of that is the RICS NRM 2, New Rules of Measurement, out of the UK.

But for the most part, a lot of these classifying or classification numbering systems or coding systems have a relatively similar structure and format where they tend to be more in a hierarchical format with the first couple of digits or numbers being of a general broader category. For instance, on the screen here for the CSI format, we’re showing on the left side 03 30 00 Cast-In-Place Concrete.

And then on the right, with the RICS NRM 2, a similar case: 11 is their main category number, In-Situ Concrete Works, and then they go from level to level. So, when I say levels, I’m referring to the position of the number within the code itself. If you look at the Cast-In-Place Concrete on the left, 03 30 00, I might call that a three-level code, some might call it a two-level code. But you can see that the first 03, we would call Level 1. The next two digits, 30, we call Level 2. And the next two digits, Level 3, and so on.

So, that’s the format a lot of these common systems use; whether they use different characters for separators is up to the systems themselves. But one thing about some of these systems: they’re proprietary and require licensing in order to use them. But there’s nothing that says you can’t create your own in-house account code system.
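
The level structure described above can be sketched in a few lines. This is a hypothetical helper, not part of any published standard; since separators vary by coding system, the separator is passed in as a parameter.

```python
# Hypothetical helper: split a code into its cumulative parent levels,
# broadest category first, so every record can also be rolled up to its
# parents (e.g. all of group 53, not just one terminal code).
def code_levels(code: str, sep: str = " "):
    parts = code.split(sep)
    return [sep.join(parts[: i + 1]) for i in range(len(parts))]

# CSI-style code from the slide
print(code_levels("03 30 00"))            # → ['03', '03 30', '03 30 00']
# In-house style from the later examples
print(code_levels("53.09.02", sep="."))   # → ['53', '53.09', '53.09.02']
```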

So, Brian, next slide, please. Here on the screen I show an example of some account codes. In this case, it’s account code group 61, which is a major category for concrete, and it has a primary unit of measure and a secondary unit of measure, and then an indication of which of the different cost categories would use these account codes. And as you start to drill deeper into the codes, meaning going down into deeper and deeper levels, you can get into more detail, like I’m showing here, as far as the type of work under its parent type of work or parent category.

Another example I have, Brian, on the next slide, a similar one but different types of work. So, in this case, we have an account code group of 53, which, in this case, would be aggregates and paving. And then we have the two-level code under it, 53.09, which would be asphalt paving. And then you can keep drilling deeper and deeper into it for the different operations like lay and roll, miscellaneous flexible paving, seal codes, chip seal. You even get down near the bottom to 53.10, concrete paving, and then all of its account codes underneath it.

And you would use all these different account codes to tag your work and to tag your cost items in your cost systems, and to keep it in a database where it can be used later on. Next slide, please. It’s you, Brian. You’re on mute.

Brian Mikinski:

Thank you, Kimo. Those were some excellent examples. So, let’s talk here a little bit about how we actually go about implementing account codes at our own enterprises. So, first off, like Kimo was mentioning, it’s not just about picking a list, it’s about building a structure that works across your organization. So, I think Kimo showed the CSI MasterFormat and the RICS, those are really great places to start and formats that you can adopt. But you’re probably going to want to customize it to your own needs.

The other thing that he called out is that you really want it to be hierarchical. So, you want to be able to drill down from a high-level, which is where you’re going to start, and you want to be able to supplement that with more details that are going to really bring richness and context. And then the table that you can see below is what we call a crosswalk table. And this is a really useful tool to begin to see how codes can allow for individual customization, but still weave together the data between various siloed data sources.

So, let’s just walk through a couple of these here. In the first column of the first row, we’ve got a 1C-100 account code that is going to correspond to our structured ERP Category called concrete. In column 3, the BIM Element is going to be structural concrete. And then in column 4, we’ve got 1C-100 corresponding to a Schedule Activity. So, all of these different systems have their own specificities and can manage their metadata and call things what they want, but they’re still talking the same common language of a 1C-100.

And playing off of that point a little bit as well, rows 2 and 3, 1C-200 and 1C-300, are related to concrete too, footings and slabs, but they’re each specific to their individual context. So, this strategy and this step of building out a crosswalk or mapping table amongst your data sources is going to be very powerful, and it starts to allow you to realize some of the benefits of establishing a master account code structure.
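
As a rough illustration of what the crosswalk enables, here is a hypothetical sketch in Python. Each system keeps its own naming and metadata, but the shared account code is the join key; all names and dollar figures below are invented.

```python
# Hypothetical crosswalk rows, modeled after the table on the slide.
crosswalk = {
    "1C-100": {"erp": "Concrete", "bim": "Structural Concrete", "activity": "Pour Columns"},
    "1C-200": {"erp": "Footings", "bim": "Footing Concrete",    "activity": "Pour Footings"},
}

erp_costs = {"1C-100": 150_000, "1C-200": 82_000}          # from the ERP
schedule_pct_complete = {"1C-100": 0.40, "1C-200": 1.00}   # from the schedule

# A joined view that neither system could produce on its own:
# ERP budgets combined with schedule progress, per account code.
for code, row in crosswalk.items():
    earned = erp_costs[code] * schedule_pct_complete[code]
    print(f"{code} {row['activity']}: budget ${erp_costs[code]:,}, earned ${earned:,.0f}")
```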

Those are our structured data sources. Let’s look at the 5th and 6th columns in the crosswalk table. And these are categorized as Spreadsheet Line Items and Document References. They’re also related to concrete, but maybe they’ve got some different namings or different titles and different metadata. And you’re probably going to find that in Spreadsheet Line Items and Document References that this data is more challenging to work with.

And so, when you bump into these unstructured sources, you’re probably going to have to start to look at utilizing some other AI tools that already exist and you want to be able to both extract context and extract metadata, and then you want to be able to write and push metadata back into these unstructured sources. So, some tools that come to mind here that you may look at are in Azure, there’s Azure AI services, it can help you read and extract metadata. There’s other tools such as Power Automate.
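
As a minimal stand-in for those extraction tools, here is a hypothetical sketch that tags an unstructured email by the account codes mentioned in its text, using a plain regular expression rather than any vendor API. The code pattern and the email are both assumptions, modeled on the in-house code style from the earlier slides (e.g. 53.09.02).

```python
import re

# Assumed in-house code shape: two digits, then one or more
# dot-separated two-digit levels (e.g. 53.09 or 53.09.02).
ACCOUNT_CODE = re.compile(r"\b\d{2}(?:\.\d{2})+\b")

# Invented example of unstructured project text
email = """Subject: RE: paving delay
The lay-and-roll crew (53.09.02) lost a shift to rain; deck rebar
under 95.61.06 is unaffected. See attached ticket for 53.09 totals."""

# Tag the document with every code it mentions, ready for indexing
codes = sorted(set(ACCOUNT_CODE.findall(email)))
print(codes)  # → ['53.09', '53.09.02', '95.61.06']
```

A real pipeline would write these tags back as document metadata so the email becomes searchable by account code alongside the structured sources.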

And in the structured sources you can start to extract value and begin to tag account codes. So, if we keep going, the fourth step in an implementation of account codes is to begin to integrate with middleware and ETL tools. I’d be willing to bet a lot of us are familiar with middleware and ETL tools, but this is really where we need to bring together the structured and the unstructured data sources. We need to push them into a more centralized repository, likely a data warehouse or maybe a data lake. And we’re going to need to use ETL tools to get our data there.

And ultimately, what we want to do is have that be the foundation, the footing of the AI tools that we’ll layer on in the future. So, that’s another very important step. And once you’ve done that, whether or not the AI tooling exists today, you can still begin to realize a lot of gains and measure both current operational productivity and do comparisons and analysis to your historical data.
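
A minimal sketch of that landing step, using Python’s built-in sqlite3 as a stand-in for a real warehouse or lake; the schema, source names, and figures are all illustrative assumptions.

```python
import sqlite3

# In-memory SQLite stands in for the warehouse target here.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE warehouse (source TEXT, account_code TEXT, amount REAL)")

# "Extract" step: rows pulled from two different systems (made-up values)
erp_rows = [("ERP", "53.09.02", 96_000.0), ("ERP", "61.09.01", 640_000.0)]
est_rows = [("Estimate", "53.09.02", 90_000.0), ("Estimate", "61.09.01", 610_000.0)]

# "Load" step: both land in one table keyed by account code
conn.executemany("INSERT INTO warehouse VALUES (?, ?, ?)", erp_rows + est_rows)

# Actual vs. estimate by code, only possible because both sides share codes
for code, actual, est in conn.execute("""
        SELECT account_code,
               SUM(CASE WHEN source = 'ERP' THEN amount END),
               SUM(CASE WHEN source = 'Estimate' THEN amount END)
        FROM warehouse GROUP BY account_code ORDER BY account_code"""):
    print(f"{code}: actual ${actual:,.0f} vs estimate ${est:,.0f}")
```

Once the data lands like this, the dashboards, reporting, and governance steps mentioned next are simple queries against one table instead of exports from many systems.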

So, there’s some examples in here that I added. We obviously can build dashboards. We can do reporting. Once we’ve gotten our data into a data warehouse, we can begin to look into data governance or project governance and then we can envision other integration scenarios such as cost loading a schedule or trying to weave together data from systems where it just wasn’t possible until we established a master account code structure.

All right. Kimo, would you like to show us some real-world examples?

Kimo Pickering:

Yeah. Let’s have some fun now. Let’s get some of you in the audience involved. Right here, I’m showing a picture from a job site of an asphalt paving operation. And before Brian shows you all the ones that I’ve come up with, I’d like to see if anyone in the chat can start to list out some data sources that you would want to capture and then use for future purposes like forecasting, estimating and, obviously, cost tracking.

But if anyone out there wants to give a guess, just throw something out there you see on the screen that you think would need to be tracked from a project controls perspective. I’ll start you off with, there you go, asphalt quantity. Okay. We got one in there. Crew size, production rate. All right. Here we go. Labor production, mobilization. Well, mobilization that’s tied somewhat to the first one I show here.

So, a first example of an account code would be hot plant fixed costs. This would be the cost that you would incur for setting up and tearing down a portable asphalt hot plant like you see in the background there. And in this case, we used an account code starting with 53. And that 53, if you recall from one of those tables I showed earlier, is for the aggregates and paving group, and then you start to drill down further from there.

And this final one is the batch plant fixed costs, set up and tear down. And of course, with every account code it’s important to have a unit of measure; in addition to a primary unit of measure, you’d want a secondary unit of measure as well. What else have we got out there? Location equipment. Brian.

I think someone mentioned productivity. How about mixing, making the asphalt: trap and mix HMA. We have a separate account code there, as you can see in the box. And this one is going to be tracked by the ton. What else do we have here? Paving the asphalt. So, in the background we’re making the asphalt; in the foreground, we’re paving it, we’re laying it down. And this has a slightly different cost code, or account code, sorry. But you can see how it begins with the same group number, 53, and it just drills down from there until it gets to a terminal level.

And in this case, this is an account code for laying and compacting asphalt on a mainline, meaning a highway mainline, with a primary unit of measure of tons and a secondary unit of measure of square yards. What else do we have? Anybody get dust control? That’s pretty important. We’re going to need permits, especially when it comes to plant operations and dust control. And that’s very useful for future purposes of estimating, because every state, every municipality has different requirements for their dust control permits. So, we want to make sure we’re tracking that.

Keep going, Brian. How about mineral filler, or adding admixtures to the asphalt? So, we’re talking about permanent materials here, or bulk commodities. In this case, it’s lime, and the owner of this project requires the aggregates to be treated with lime. So, that’s an in-place, in-line operation where the lime is delivered by that red truck down there at the bottom right and it’s being blown up into the silo, where it’s then metered out and mixed in a pug mill with the aggregates before it goes into the drum, where it gets heated up and mixed with asphalt.

What else have we got? Craft supplies. Anyone pick that up? Well, you’re going to have to pay for ice and water and things like that, depending on how you want to track it. In this case, we want to track it by the DMH, which is direct man-hours. Yeah. Fuel costs, I see that. Someone jumped the gun, and lighting. We’ll get there.

Okay. Here, we’re showing another example. Now, it’s a different type of work. This is a bridge being built. Oh, it’s one of those ABC bridges, a job I was on here in Utah. In this case, the bridge is actually being built offsite, not in its final location, and it will then be moved with SPMTs into its final location, which in this case was … I think it was about a quarter or a third of a mile away on I-15. But again, we’ve got things in here that we will want to track.

And so, from data sources, anyone care to guess as to what you see here? Brian, let’s help them out. Well, first thing, how about forming a deck? So, now, we’re starting with an account code of 61, which is the group category for concrete. And then I believe .09 would be regarding girders and decks and diaphragms. And in this case, this account code is for erecting and stripping the interior deck form. And we want to quantify that by the square footage.

We got some waterproofing. Yep. Fall protection. Guy nailed it there. Mark, you got it. In this case, it falls under access, other, and by eaches. But there’s many different types of fall protection systems out there. Yeah. I like it. Crane support. Keep going. How about subcontractors? And someone did get that: rebar for the deck. All right, Ravi. In this case, we weren’t going to self-perform the rebar installation, so we subbed it out, and we have account codes for subcontractors which start with 95. .61 would be for concrete, .06 would be for steel and re-steel, and 002 is furnish and install the rebar, by the ton. And that could have a secondary unit of measure of, say, pounds or kilograms.

Falsework. Did anyone get falsework? Well, you can see here the arrow pointing to those pipes, because we’re building this off-site, not in the final location. So, we had to build the falsework, or some might call it a temporary abutment. And because it’s temporary work, it falls under the 54 category: falsework with pipe, by the square foot or cubic foot. But some might not know this: if you’re not going to design it in-house, you’re going to have to hire an outside third-party engineer to design the falsework or formwork, any sorts of those elements. And in this case, that account code starts with an 88, which falls under temporary construction design, and then it goes down from there.

Brian. Someone mentioned earlier about lighting on the asphalt paving spread. Well, definitely, we have lighting here, because we’ll be working around the clock on this bridge to get it ready to go, because schedule is of utmost importance. And so, here we’re talking about renting this lighting system. And for outside rentals, in this case, the account codes begin with 90 and go from there. And these are portable lights that we’re renting by the week.

Lumber supplies, there you go. Sahid got that one. Good job. And this one is what some might call consumables. My old employer called them STS: services, tools and supplies. But these are consumable materials that are not part of the permanent work. They are used to build the permanent work. And in this case, they’re going to form up and support the concrete that’s being poured for the decks and the diaphragms and things like that. And then they’ll be ripped out, removed and either reused or thrown away, and we typically would quantify that by the square foot.

And finally, the big guy here, where most of the cost is, we have these UDOT Bulb Tee Girders. Just FYI, at the time this bridge was built, they were the longest precast concrete bulb tee girders made in Utah. They were 192 feet long, single span. And those girders, if you can tell by reference, are roughly about 8 feet tall. In this case, we're buying the girders, and the supplier cast them, and we use the account code group of 93.61, bulk commodities, for concrete, and then precast concrete. And we either price that or track it by the ton or the linear foot, depending on your system and what the supplier does.

I think someone mentioned that these are only civil construction examples. Yeah. I'm a civil paving guy, so these are most of the examples I have. But next time, I'll bring some piping and stuff from other jobs I was on. Concrete paving. Right here, we're going to use a slipform machine, a Gomaco 4000. It can pave up to 40-some feet wide. In this case, it's 48 feet, for the 3-lane. And again, the account code begins with a 53, because it falls under aggregates and paving, and then .10 would be concrete paving.

And also, because we're slipping it, we want to make sure we get into the details there. You can see Concrete Paving, Slip PCP, greater than 1,000 cubic yards, greater than 14 feet wide, tie bars but no rebar, and we're going to track it by the square yard or cubic yard, of course, and I'll explain. But you don't see the dowel bars or tie bars there on the ground. That's because this particular paver has a DBI, a dowel bar inserter, on the back end that inserts the dowel bars and tie bars automatically. So, that helps speed up the operation. Otherwise, those people in the front who are guiding the trucks and dumping the concrete would be working around all those baskets.

Of course, we have our wet haul trucking. So, if we're batching the concrete ourselves, we're going to truck it ourselves in Super 16s, Super 18s, or even 10-wheelers, haul it out to the job, and dump it in front of the paver. So, if we're doing it with our own trucks, we might have an account code starting with 53.24, and we'll haul it by the cubic yard or the mile. But if we don't, we're going to sub it out. And then those subcontract truckers will be assigned account codes that begin with the subcontractor group, 95.53.06, by the cubic yard.

Keep going, Brian. I know we're pushing time here. Someone mentioned batch concrete. Yeah. That'll be its own account code. And how about quality? I'm pointing behind the slide, you can't see it, but we're going to have to go back there, because all the states have smoothness and roughness specifications. So, we're going to have to go back there with a roughness meter or a profilometer or something of that nature and profile the slab after we're done paving. And then we'll note any bumps or dips that are out of tolerance, and we'll have to go in and grind them with a subcontracted diamond grinder. All of that's going to have account codes too, and that's very important, because that's a major cost.

Typically, when you’re estimating concrete paving, you’re going to put some money in your estimate for grinding, because it’s just inevitable. You’re going to have bumps here and there that you need to grind down. And then of course, the owner’s going to want to test the thickness of the concrete. And I believe in this case that concrete was on I-15, it was 12 and a half inches thick. But we’re going to have to drill core holes and measure the thickness of the concrete and that, of course, has its own account code as well.

Keep going. Last one, Brian. Let's get into some dirt work real quick and some maintenance. So, here, I'm showing a major earthwork operation: a 651 scraper with two D-10 dozers pushing it, because that 651 only has one engine on the front. And when it's in that tough material, it can't excavate the material by itself, in this specific type of soil, that is. So, it needs the Cat dozers behind it. And in this case, it needed two dozers to be able to push.

And so, here, we have an account code of 51.06.02.056. Breaking it down: 51 being grading, 06 being scraper operations, excavation. These are 651 scrapers with a 1,500-foot to 3,000-foot haul distance, one way, going to an embankment, tracked by the cubic yard. Anyone care to guess what you would use for a load capacity for that 651 scraper in this type of sandy, silty material? Go ahead, put it in the chat. I'd be interested to see what we have out there.
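The hierarchical breakdown just described can be sketched in a few lines of code. This is purely illustrative, not InEight's API: the top-level group names come from the examples in this webinar, and the function name is our own.

```python
# Illustrative decoding of a dotted, hierarchical account code like the
# 51.06.02.056 example above. Group meanings are taken from the examples
# in this talk; a real chart of accounts would supply the full mapping.

ACCOUNT_GROUPS = {
    "51": "Grading",
    "53": "Aggregates and Paving",
    "54": "Temporary Work",
    "90": "Outside Rentals / Equipment Maintenance",
    "93": "Bulk Commodities",
    "95": "Subcontractors",
}

def decode_account_code(code: str) -> dict:
    """Split a dotted account code into its hierarchy levels."""
    segments = code.split(".")
    return {
        "group": ACCOUNT_GROUPS.get(segments[0], "Unknown"),
        "levels": segments,
        "depth": len(segments),
    }

info = decode_account_code("51.06.02.056")
print(info["group"], info["depth"])  # → Grading 4
```

The point of the dotted structure is that every prefix is itself a meaningful rollup level, so costs coded at the bottom can always be summarized upward.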

We have account codes for owned equipment, so we can track our equipment rent costs for our own company-owned equipment. So, in this case, we have two codes there, one for the dozers, one for the scraper, that we'll be tracking the cost with. Then on the right side, we move over into equipment maintenance. Now, that's a whole separate group of account codes that this organization chose to use. In this case, it starts with the 90, which is equipment maintenance.

And then it drills down, I don't know the exact order. But basically, what's one of the most expensive costs or consumables on equipment in operations like this? Tires and tracks. And right below it, another one, if you're using scrapers, would be your wear parts, or in this case, cutting edges. Those cutting edges get worn down very quickly, because in this case the soil had a high silica content, which is very abrasive, and that adds up and adds up, and you've got to be able to account for that. And hopefully, you had it in your estimate for this type of material, in excess of what your typical maintenance ratios would be.

Brian? I wanted to show one other example of where account codes can be useful, and that's for benchmarking. So, what you're looking at here is a screenshot of InEight's Estimate interface, and we're looking at a specific cost item. And Brian, if you can click one more time. Yeah. So, it's broken up into three different areas here. On the left side, you have the cost item details. In this case, it's [inaudible 00:49:07] backfill, 18-inch reinforced concrete pipe for storm drainage.

On the left side, cost item details, you'll have your crew and equipment. In the middle, you'll set your production rates. In this case, the estimator set it at 5 linear feet per hour. There's a five-man crew, so that equates to 1 linear foot per man-hour. And then when setting up this estimate, we chose which as-built jobs out there we wanted to compare to. So, on the right side is the benchmarking feature. And Brian, if you can go to the next slide, it'll show a little larger version of this.
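The arithmetic behind that 1 linear foot per man-hour figure is simple, and it's worth making explicit, because the per-man-hour rate is what gets compared across jobs. A minimal sketch, with the function name being our own:

```python
# Crew-level production divided by crew size gives a per-man-hour rate,
# which is the figure used to benchmark against as-built jobs (since the
# crew sizes on those other jobs may be unknown).

def units_per_man_hour(crew_production_rate: float, crew_size: int) -> float:
    """crew_production_rate is in units per hour for the whole crew."""
    return crew_production_rate / crew_size

# 5 linear feet per hour with a five-man crew:
rate = units_per_man_hour(5.0, 5)
print(rate)  # → 1.0 linear foot per man-hour
```

Normalizing to man-hours is what makes estimates from different jobs, with different crew makeups, comparable at all.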

So, here, it's all done automatically, and what I'm showing you is the two panels that were on the right. So, we have our production of 1 linear foot per man-hour, because that's how we're going to track it against other jobs. Because with the other jobs, we may not know how large their crew size was or anything like that. But the other jobs were somewhat similar, except for maybe one outlier you can see up there at the upper left in the pink.

So, the white diamond is our current estimate. The orange square in the middle, where I've got my arrow, that's the average of everybody in here. And then the three black triangles are actual jobs with as-built man-hour factors that we're using to compare our estimate to. And then we have this built-in feature where we can show the different ranges: 0% to 10% in green, 10% to 20% in yellow, and greater than 20% in the red, or the pink. This is a good visual indicator to show your manager when you're reviewing estimates that, "Okay. I'm in the ballpark here of other similar projects."
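The green/yellow/red banding described here can be sketched as a simple classifier. The band boundaries come straight from the webinar; the function name and example numbers are our own, not how the product computes it.

```python
# Bucket the percent difference between the current estimate and an
# as-built benchmark into the ranges described above:
# 0-10% green, 10-20% yellow, greater than 20% red (shown as pink).

def variance_band(estimate: float, benchmark: float) -> str:
    pct = abs(estimate - benchmark) / benchmark * 100
    if pct <= 10:
        return "green"
    if pct <= 20:
        return "yellow"
    return "red"

# Current estimate of 1.0 LF/man-hour vs. a hypothetical as-built
# average of 1.15 LF/man-hour (about a 13% variance):
print(variance_band(1.0, 1.15))  # → yellow
```

A banded visual like this turns a production-rate review into a quick at-a-glance check: in the green, you're in the ballpark of similar as-built work.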

Now, if all the other jobs were up in the red and you were down below, someone might be saying, "You need to speed up." Or if it was vice versa, they might be telling you to slow down. So, this is just another application of using account codes on as-built work. You assign account codes to your cost items in Estimate, and then that's how you can start comparing apples to apples, making sure your production rates are right, and also lowering your risk profile, so you can set yourself up for success on the project.

Brian Mikinski:

Thank you, Kimo. I’m going to go through this real quick, because I want to leave some time for questions. So, good data strategy starts at the top. You always want executive buy-in and you want to start coding at the top. So, try to start coding at the highest level possible and ensure that you’ve got good executive support and buy-in. The next thing is AI is going to be horizontal. It’s going to touch everything. All the data systems and the data sources that we talked about are going to need to play a part in AI and they’re going to have some role to play there.

The third point is, AI is not going to replace humans for critical activities. Humans must be in the loop with AI, and they always will be. And the final point here is, start coding today. There's no better time than today to start coding your data. The real foundation of AI is your data. And if you really want to unlock that potential for the future of construction, you need to start working on it as soon as possible. So, do we have any questions?

Lance Stephenson:

Yeah. First of all, thank you guys for that. It's very interesting, and a lot of people don't recognize what you need to do just to prepare yourself to ingest and curate the data that comes in. Just to let everybody know, AACE has numerous RPs on how to develop a code of accounts. So, go to our virtual library. There are, I think, four right now on different types of code of accounts, accounting considerations, cost control, those kinds of things.

There's also a recommended practice on how to develop a project historical database. So, there are these opportunities, through AACE, to help you bolster your code of accounts and how you work through it. In regards to questions, yes, we did receive a couple. Here's one: how does the code-of-account-centric relationship model you're proposing handle data types that don't naturally fit rows and columns, such as BIM models, images, schedules, and unstructured documents, which are increasingly essential to AI and ML workflows?

Brian Mikinski:

Sure. So, I think I can handle part of that. I would say those data sources certainly are going to present challenges and you’re going to have to do some work to code at that lower level. But the first place to start is start higher up the chain. And then like Kimo and I showed, you really want to have a hierarchical structure. And then you’re probably going to have to build out some processes, some ETLs. You’re going to need to use some of those unstructured data tools that we discussed to really find a way to push that coding structure down into a lower level.

But to that point, too, not everything has to have an account code. You don't want to be too focused in; you can go to a level that's so deep it's not really even useful to what you're trying to accomplish.

Lance Stephenson:

Yeah. Most definitely right. We have another question here. Let me see here. Based on your experience and best practices, do we need to assign … Oops. It jumped on me. My apologies. Do we need to assign the account codes at the work package level of the WBS?

Brian Mikinski:

I think that would be very, very helpful. You might not be able to realize the value today, but I think in the future that is going to pay huge dividends. And one way I've seen this done is through a smart coding system with your activity codes, or maybe a smart coding system with your WBSs. And I would really advise people to begin to develop those types of structures in their schedules.

Lance Stephenson:

Yeah. A lot of people have to recognize that any type of data you can add, whether it's metadata, labeling, whatever it might be, it all adds value, specifically around feature engineering and feature importance and recognizing which ones are the contributing factors. At the end of the day, what you're trying to do is understand correlation, and without that code of accounts, you struggle to recognize that.

Brian Mikinski:

Yeah.

Lance Stephenson:

Yeah. We do have a question here that someone’s asking. It’s not clear to them how AI delivers improvements and how to select and implement the AI cost effectively? Maybe you can dive into that as our last question before we move on.

Brian Mikinski:

Yeah. Absolutely. I think it is really early on in the AI journey. The way I like to equate it is, we're at the first pitch of an at-bat in the first inning of a baseball game. Every single day, this stuff is changing. But the place where people are seeing the most early success is really in automating monotonous back-end tasks that humans would have done first.

So, that's where I would advise everybody to start: try to do really basic, simple things, and then let the technology develop in the coming years. And then, if you've done your coding and you've been preparing and thinking about these things, you're going to be really set up for success. So, start small, start with the back-office work, the automation there; that's probably the best place to begin.

Lance Stephenson:

Yeah. Yeah. I love the baseball analogy. And just remember, your AI can take up to 18 innings, so you have to be careful of that. Well, we're at the top of the hour, and I want to be respectful of everybody's time. On behalf of AACE, I would like to thank InEight and the presenters for today's webinar.

Just as a reminder, this event is CEU eligible, and attendees will receive a certificate of attendance in the post-event email. Again, look to AACE for their content, and join in with InEight or go to InEight's website to see more about their products and what they have to offer. I am also the chair of the Data Science & Advanced Analytics subcommittee for AACE. So, you can see, we're all trying to bring this together to see what we can do to better utilize AI, machine learning, and advanced analytics, and to hopefully make our jobs better.

So, Brian, Kimo, I want to really thank you guys for presenting today. It's very informative, and you're bringing home some of the key elements that allow us to use these advanced and emerging technologies. So, thank you so much for that. And with that, any closing words from you guys?

Brian Mikinski:

Nope. Thank you all for being here. It’s been great to share what we know and we’ve learned over the years. And hopefully, we’ll be back and see you all soon.

Lance Stephenson:

All right. All right. Thanks everyone. Have a great rest of the day.

 
