Originally aired on 12/10/2024 | 56 Minute Watch Time
Successful projects start with accurate estimates. Boost your estimate integrity and learn to make better use of your project data in this engaging AACE webinar.
Hosted by InEight VP of Industry Solutions Rick Deans and Estimate Product Director Aaron Cohen, Benchmarking for Success in Construction Estimating guides construction leaders toward improved project health through the benchmarking, comprehensive auditing features, and intuitive interface of today's leading estimating technology.
Join us on Tuesday, December 10, 2024, at 2:00 pm EST for a conversation about innovation and opportunity as we explore how these tools and approaches help teams win bids and keep those bids profitable. Topics discussed include:
- Effective use of benchmarking and as-built data to refine estimates and improve forecasting
- The importance of visual tools and auditing features in creating detailed and defendable estimates
- Utilizing advanced estimating tools to enhance accuracy and detail in project bids
Transcript
Mike Pytlik:
Welcome to today's webinar. While everyone is entering the room, I'd like to go over a few housekeeping rules for today's event. My name is Mike Pytlik, I'm with MCFA Global, and I'll be today's host. The presenters will be taking questions live, so if you have a question, please include it in the Q&A box located at the bottom of your screen. Please note that attendees will be able to upvote each question by clicking on the thumbs-up icon next to an open question. The questions with the most upvotes will be addressed first. If we don't get to your questions live, don't worry; our presenters will attempt to answer your questions in a follow-up email. Today's webinar, Benchmarking for Success in Construction Estimating, is sponsored by InEight.
So now to our presenters. One of our presenters today is Rick Deans. He's VP of Industry Solutions at InEight. Rick has worked with InEight customers in more than 35 countries to help identify innovative solutions to address their biggest project management pain points. As the executive leading industry engagement, Rick heads InEight's efforts to engage with its most strategic customers through the Industry Advisory Group. Rick works with member companies to evaluate InEight solutions before they are put to work on projects and also to identify industry best practices. Rick is passionate about facilitating strong partnerships across the industry and helping build awareness of InEight solutions. An engaging public speaker, he leads workshops on the value of InEight's product portfolio and is active in many industry associations, including the Associated General Contractors of America and the Construction Industry Institute. Prior to InEight, Rick advised software companies on talent acquisition and retention. Rick holds a bachelor's degree in economics from UCLA.
Next we have Aaron Cohen. Aaron serves as the product director for Estimate at InEight. He's responsible for defining product requirements and overseeing development of a best-in-class estimating software solution for today's market needs. Prior to InEight, Aaron served for 10 years as the president and owner of Apollo Trenchless, Inc., an engineering and construction services provider specializing in the application of trenchless technologies for municipal construction projects. He has over 15 years of experience in the business as a project manager and estimator for various infrastructure and utility construction projects. Aaron regularly consults with various owners and engineers regarding the constructability of underground utility and infrastructure projects. He speaks on various subjects at regional and national trade shows for organizations such as the American Society of Civil Engineers, the American Public Works Association, and the North American Society for Trenchless Technology.
Aaron has also co-authored and contributed to various textbooks on construction, such as Construction Planning, Methods and Materials and Moving the Earth. He also teaches courses at Arizona State University on estimating and project controls. Aaron holds a bachelor of science degree from Arizona State University as well as a master of science degree from DePaul University and is a Certified Professional Constructor. So without further delay, I will turn it over to today's presenters.
Rick Deans:
Mike, thank you for that, and thanks to all who are in attendance today. It looks like we've got folks joining us from all over the world, so we're excited that you're with us and we hope that you find this to be a great use of your time. For those of you not familiar with InEight, we are the leader in project control software for capital construction. The InEight software platform is modular and integrated, and includes software in four broad categories: project information management, costs and schedules, contracts and changes, and construction operations. Our software supports every phase of construction projects to help companies achieve predictable outcomes. Our customers have achieved results as impressive as a 30% increase in staff utilization.
We're glad you joined us for our discussion of benchmarking today. The key takeaways we're looking for: learn to effectively use benchmarking and as-built data to refine estimates and ensure more reliable project forecasting; discover the importance of visual tools and auditing features in supporting detailed and defendable estimates; and learn how to utilize advanced estimating software to enhance accuracy and detail in project bids. So that's why we're here. Now we want to learn a little bit about you. We've put together a few questions, and I realize we've got a very international crowd, so the third question might need a little bit of context for those outside the States, but we'll go ahead and bring that poll up, and if you would, please interact with it.
Let us know which of the following best describes your organization, which of the following best describes your role within the organization, and then give us your best answer on the third question. We'll let your results pile in here and then we'll take a look at them. Aaron and I would really find this helpful because we want to know who's in our audience, who we're speaking with, so really try to choose the one that best describes you. If we end up with everyone in the other category, it's not going to help us out a lot. So we're still collecting that data. Once it comes up on the screen, we'll take a look at the results, and that'll give us a good idea of who we're dealing with here.
Okay, good. Good representation across the board. I see consultants and owners are heavy in the audience today, a lot of folks in that estimator/cost engineer role. And then for our third question, smart money says other. Okay, there we go. Take that down to the bookmakers and see how you can parlay that. Well, good. We appreciate you taking the time to fill that out. So what is benchmarking? Let's give a little bit of background into what benchmarking is, and I'm going to start off with something sort of anecdotal. I've been doing this for... I used to round up to 25 years to make it sound like I had a lot of industry experience; now I round down.
But I was in a meeting early in my career, and there was a gentleman, the director of operations for a very well-known industrial contractor. I think at the time they were the largest industrial contractor, certainly in North America, if not the world, and he was very upset: "We've been building plants in this country for 65 years and there's not one single place I can go to see what it took to do the work." So there was very clear frustration in the fact that they'd been performing work, they'd been doing a good job, but people had side spreadsheets, and certainly there was the email chain. You could send a note to John over there who did a similar project and maybe get some anecdotal information from that individual. But there was really no centralized, professionally managed way of collecting estimated as well as actual data from construction projects, normalizing it, and putting it into a repository where anyone in the organization could go and see what it took to do a particular task in the past.
It sounds simple, but this was a real-world frustration that this particular individual had at the time. So let's go back in time a little bit. Where does the term benchmarking come from? According to Wikipedia, the term originates from the mid-1800s, when ammunition transitioned from muzzle-loaded black powder and a bullet to mass-produced cartridge ammunition. But let's go back even before that. Prior to gunpowder-based firearms, the bow and arrow was really the go-to weapon for soldiers, traveling armies, hunters, et cetera. Whereas an arrow's accuracy was easily visible when shot by the archer, bullets would just leave a mark on the target, and this gave rise to the term marksman, the one who leaves a mark. Now let's fast-forward back to the 1800s, when ammunition is being mass-produced. The manufacturers, the end users, the folks involved in every stage of providing firearms along the way wanted to know the best ammunition to pair with a specific firearm to achieve the most accurate and predictable outcomes.
So rather than rely on the variability of individual marksmen to test the ammunition, the process they used was to secure the weapon to a bench and then study the marks left on the target. You can see where this is going, right? We're strapping the rifle to a bench and we're studying the marks, so that's where we get our term benchmarking. This approach allowed for detailed analysis of the numerous tests performed and provided a means of comparison and improved performance as further tests were conducted. That's pretty scientific, even for the mid-1800s. In the business world today, benchmarking is used to measure a specific process or operation and compare the measurements against history or best-practice standards.
So Aaron, my colleague, who you're going to hear from in a moment, and I had a nice discussion about this, and we thought, what's something our audience might really relate to? And we talked about the manufacture of these roto-molded coolers. There are some very specific brands out there, and they're very popular with the outdoor crowd, and they do an incredible job of keeping things cold for a very, very long time. Now, if they were going to introduce a new line of cooler, they would probably want to rely on some benchmarking results. They'd want to go back and say, well, for this type of material and this thickness under these conditions, we can keep things at a certain temperature for a certain duration. Without that data, they're really just shooting in the dark. Aaron, do you recall that discussion we had? It was a very spirited discussion, if I'm not mistaken.
Aaron Cohen:
Yeah, I do. Thanks, Rick. So transitioning this over to what we do in the construction business: it's very, very different from the manufacturing world, where we'll set about figuring out how much something is going to cost based on a certain design, how much it costs us to mass-produce and put into the market, and we can generate our sales goals and create a business model that is predicated on a very known thing. Most construction projects tend to be one of a kind. They tend to be different from project to project, even from owner to owner with the same type of work. It tends to be very different based on the site, the location, the regional and local requirements. So it can be very hard, even for a contractor that does the same type of work repeatedly, to take a project and come up with a good set of benchmarks that say, for this project it should be X.
So what we wind up doing with benchmarking in construction, from a practical perspective, is we break down the work, using various coding schemes and methods, and we try to get things down to an atomic level where we're able to compare, from past projects and past experience, what those costs and production rates should be, and how those various elements that we're attempting to benchmark compare to work that we've done in the past. One way to think about this: if you're breaking this down to something that's comparable, tell me what it would cost to build a 2,500-square-foot home. How would you answer that question?
Well, it really depends on what the home you're building includes. The location of the home is usually going to be a key driver. What about the amenities? What is the proximity to schools? What condition is the roof in? What are the traffic patterns as I'm getting back and forth to work? All those things will influence the construction, and they'll influence the price that somebody's willing to pay. So where does that leave us? How do we benchmark against, maybe, the hundreds if not thousands of homes we've built in the past? If I'm looking at a particular new project, I need to break it down and identify those variables that are going to impact my costs and my production, so that I can establish a method of benchmarking moving forward.
Rick Deans:
From my own experience, Aaron, I know from working with customers that in some cases they've made different investments in equipment, in training, and in personnel, so they might really excel, for instance, at trenchless utilities, but they might not do so well in other areas, or they might not have a lot of experience in other areas. There are a lot of third-party, commercially available databases out there that people can get their hands on, and a lot of our customers will look at those as maybe a good starting point, but very rarely do they rely on those numbers, for all of the reasons that you and I just mentioned: every organization is a little bit different, every organization has its strengths, its certain areas that it's invested in. So the question would be, how do I take my organizational data and turn that into something that I can then leverage and rely on going forward?
Aaron Cohen:
Yeah, that's a great point. I was giving a presentation about a month ago to a collection of engineers and owners in the municipal area with regard to how you estimate a trenchless project. And we went through the whole thing, and at the very end it was, well, is there a table or something out there that tells me what the prices could be for these things? I wish there was, but the reality of it is, as a contractor, as an engineer, as an owner, you need to start building your database and collecting the information. I think what's really exciting about where we're at right now is, more so than ever, we've got the opportunity to leverage a lot of that data that we've collected. And I think what we're going to show you here is how we can leverage and use some of that information as we're preparing estimates moving forward.
Rick Deans:
Yeah. Let's take a look at the next slide if we can. This is what I call graphical data. And what's nice about graphical data, as opposed to tabular data, which we'll talk about in a moment, is that it really helps us identify trends. So for instance, on the graph on the left, we can see that the cost of doing this particular piece of work has been increasing steadily over time; from 2019 to 2020, we're seeing a steady rise in cost for performing this work. The graph on the right is plotting quantities against, in this case, unit cost. We've all heard the expression, well, how much does it cost to move a yard or cubic meter of material? Well, are we moving one cubic meter or are we moving a million cubic meters? The answer to that question is going to determine what it's going to cost to move that cubic meter.
So maybe we want to look at our costs, both historical estimated as well as actual costs, relative to those two factors. Some of our customers will look at ratios between quantities. Maybe my concrete volume is measured in cubic meters, but the formwork that I'm using is measured in square meters, so maybe I want to look for some sort of a ratio between those. Maybe a slab on grade has a ratio that I want to use for comparison's sake, rather than some very intricate supporting concrete structure that we'd built in the past, where maybe that ratio between formwork and volume of concrete poured was different. So I can pick the data apart that way as well.
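To make the quantity-ratio idea above concrete, here is a minimal Python sketch; the jobs, quantities, and ratios are invented for illustration and are not drawn from any InEight product or real project.

```python
# Illustrative sketch of the formwork-to-concrete ratio check described above.
# All records and figures are hypothetical.
historical_pours = [
    # (description, formwork area in m^2, concrete volume in m^3)
    ("Slab on grade, Job A", 420.0, 310.0),
    ("Slab on grade, Job B", 510.0, 365.0),
    ("Elevated structure, Job C", 980.0, 240.0),  # intricate forms: much higher ratio
]

for description, formwork_m2, concrete_m3 in historical_pours:
    ratio = formwork_m2 / concrete_m3  # m^2 of forms per m^3 of concrete
    print(f"{description}: {ratio:.2f} m2 formwork per m3 concrete")

# A new slab-on-grade estimate can then be sanity-checked against the
# slab-on-grade ratios (about 1.3 to 1.4 here) rather than the elevated
# structure's (about 4.1), since mixing the two would distort the benchmark.
```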
Aaron Cohen:
Yeah. A couple of other things I'd like to point out about the illustration that we're looking at here, which incidentally was produced from a solution that's intended to serve this purpose. Can we go back one slide, please? Thank you. You'll notice the different color bands there are indicating different ranges: the image on the left is going to give you a rather narrow range or band of acceptable values, whereas the one on the right is going to be a little bit wider. Those are going to be different depending upon the scope of work and what you're looking at. But those ranges can give you a sense of the variability, or the risk, that you have in any particular item that you're estimating. If you happen to have a wide range of acceptable values in a particular scope of work, that could give you an indication that there might be a lot of variability in it.
And I think the first example you used, Rick, of moving material, earth embankment: well, it depends on the type of embankment, depends on the haul route, depends on the equipment being used. You could get anywhere from a dollar a cubic yard to $10 a cubic yard, and those would all be acceptable values. That further illustrates the idea that it really depends on the scope of work. And the other thing, as we're looking at these illustrations here, is that we're primarily looking at cost, cost per unit, and then relative quantities, which is a great way of doing it. But you'll notice that we can also look at quantities, or we can look at costs, over time.
One thing that I've found becomes very meaningful to benchmark against, if you're self-performing work, is the production rates you're using, more so than the costs, because cost can be influenced by a number of things. If I look at a job from 10 years ago, the production rate might not have changed a lot, but my labor cost has probably changed significantly over the past 10 years.
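Aaron's point about production rates lends itself to a quick sketch: compute units per man-hour from past jobs and derive a band of typical values, in the spirit of the colored ranges on the slide. The job records and the quartile band below are illustrative assumptions, not real data.

```python
# A minimal sketch, assuming simple past-job records of quantity and man-hours.
# Benchmarking the production rate (units per man-hour) sidesteps labor-rate
# drift over time. All figures are invented.
import statistics

past_jobs = [
    # (quantity installed, man-hours expended)
    (1200, 400), (950, 310), (2100, 640), (800, 290), (1500, 470),
]

rates = sorted(qty / mh for qty, mh in past_jobs)  # units per man-hour

# Quartiles give a "band" of typical values, like the chart's color ranges.
q1, _, q3 = statistics.quantiles(rates, n=4)
print(f"production rates: {[round(r, 2) for r in rates]}")
print(f"middle band (Q1 to Q3): {q1:.2f} to {q3:.2f} units per man-hour")

# An estimate that assumes a rate well outside this band deserves a closer look.
```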
Rick Deans:
Rural area versus urban area where maybe rates are higher, right?
Aaron Cohen:
Yeah. And then come the conditions of the work and the nature of the work. So yeah, it is subjective, but if we go to the next slide, we can see what this scatterplot was generated from. This is the underlying data. And again, as I was mentioning earlier, we're at a point in time right now where we're starting to collect so much data that we've got to figure out how to meaningfully use it. And there's a lot of opportunity with a lot of different solutions out there as far as providing more flexibility with data analytics. So the sooner you can start collecting information on your past projects and on the work that you're performing, the sooner you'll be able to leverage the data and start analyzing it, looking for trends.
In this example here, you can see a number of similar jobs that have been performed. For any particular project that we're doing, we can see the quantities, we can see how those things have been measured, what the production rates have been, among a few different metrics as far as the way that you measure, whether it's by square foot, whether it's by cubic yard, so on and so forth. So again, there's a lot of data that we have at our disposal, and having the ability to make some meaningful analysis out of it can often unlock some of those hidden gems for a business. What are those things that you do well? What are those things that make your company successful and differentiate you from your competitors? What types of projects are you winning more often than others, and why? Being able to find that through this type of a solution is also incredibly beneficial for the business.
Rick Deans:
Go to the next slide, please. So in the real world, let's talk about some of the reasons why we would want to benchmark. It seems like an interesting exercise, but what's the business value? Well, a lot of our customers tell us, we really want to validate this estimate. We want to make sure that we have a quality estimate, something that we feel really good about, not just the numbers, but being able to justify those numbers, right? Again, going back to the poll, we had a lot of cost engineers and estimators on the phone.
I'm going to go out on a limb here: one of the things that probably keeps those folks up at night is, hey, I've got this big project I'm working on, have I considered all aspects of this? Have I looked at this from every angle? Where are my risks on this project? And the more substantive data you have to back up your assumptions, the better you're going to sleep. I don't know, Aaron, you said you owned a company for a while; what were some of your sleepless nights? What were some of the thoughts running through your head?
Aaron Cohen:
Whether I would make payroll or not. But a lot of that came down to, are you able to finish the work in the time that you had estimated for it, right? I mean, we'll go through, we'll do our takeoff on a project, and we'll figure out, to a pretty high degree of accuracy, that we've got 528.3 cubic yards of concrete. And unless the scope of work changes, that quantity is probably not going to change. But the 784.3 man-hours that you estimated to do it? Very likely that that number is going to change in some way. So yeah, when we put an estimate together and we're going to sign that contract and live with that price, we want to feel pretty good that those production estimates and those costs that we're using are ones that we can attain.
Rick Deans:
Going to interrupt our flow here just for a second. We had a really interesting question dropped in the chat about regression analysis. The data is available to perform regression analysis. So if you wanted to get, for instance, an R-squared value and see how close it aligns to one, meaning how predictable is this: an R-squared value of one means there's a direct relationship between the quantity of work and the man-hours per unit, or the units per workforce hour, to perform the work; it means there's a very predictable relationship. Something less than one, and it's going to range from zero to one, tells me, the closer it is to zero, the less predictable it is; I've just got a bunch of points scattered across a plot. So I just wanted to address that. I saw that pop up real quick.
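For anyone who wants to try the R-squared check just described, here is a brief sketch using an ordinary least-squares fit of man-hours per unit against quantity of work; the data points are made up for illustration.

```python
# Sketch of computing R-squared for quantity of work vs. man-hours per unit.
# Values near 1 indicate a predictable relationship; near 0, scattered points.
import numpy as np

quantity = np.array([100, 250, 500, 900, 1500])      # units of work per job
mh_per_unit = np.array([1.9, 1.7, 1.5, 1.35, 1.2])   # man-hours per unit

# Ordinary least-squares line: mh_per_unit ~ slope * quantity + intercept
slope, intercept = np.polyfit(quantity, mh_per_unit, 1)
predicted = slope * quantity + intercept

ss_res = np.sum((mh_per_unit - predicted) ** 2)           # residual sum of squares
ss_tot = np.sum((mh_per_unit - mh_per_unit.mean()) ** 2)  # total sum of squares
r_squared = 1 - ss_res / ss_tot

print(f"R^2 = {r_squared:.3f}")
```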
And by all means, use that chat, folks. If you've done some statistical analysis, if you've used some regression analysis formulas in the past, by all means throw that in the chat. Let's learn from each other on this. So we want to validate the estimate. Okay, next up, what do we want to do? We want to make sure that we're looking for any variances and we understand where those variances may occur. Aaron, you pointed this out earlier: no two jobs are alike. We're not talking about manufacturing coolers in an environment where everything is exactly where it was yesterday and it's going to be in the same place tomorrow for easy access. We're out in the field. We might be, like you say, doing something trenchless on a city street. We might be out in Farmer Brown's field, where maybe we've got two different sets of access, traffic pattern controls, that sort of thing. What do you think of when you think of identifying variances?
Aaron Cohen:
This is another aspect where I think we have a lot of opportunity. As everyone knows, today we've got challenges with staffing projects, with getting labor. We've got an aging workforce. When I started in the business, we had a guy, and that guy just knew everything. He'd walk out to the job, he'd kick the dirt, sniff the air, and he just sort of knew that there was going to be a problem on this job. Couldn't tell you why, couldn't tell you what the problem was going to be, but he knew it was bad news. And as that guy gets older, he retires, and all of a sudden he's replaced with people that don't have that level of experience. They don't have that intuition or that gut feel.
So we use the data to start identifying those trends. We do a regression analysis, we see a pattern, we see things that fall outside of some sort of a variance. It helps us identify those things where, wait a minute, something's different on this job. And if it does nothing more than draw your attention, hey, let's look at this because it's falling outside of a certain threshold, you at least have a little bit more opportunity to catch it than maybe the guy that couldn't tell you why something was going to be bad, but just knew it was, because he was there. As our companies continue to grow and evolve, we're doing more work, in a lot of ways, with fewer people. So it's going to be incumbent upon us to rely on these types of systems to help us supplement that gut feel and that intuition that maybe helped us build the business to the point where it's at. So I'm not sure if I… Yeah, go ahead.
Rick Deans:
Here's a question that we did not discuss, and it wasn't in any of our pregame warmups, so I'll throw a little bit of a curveball, but I'm sure you've got a good response for this: AI. At the end of 2024, it's been a big topic. I've played around with ChatGPT; it's pretty cool. What are some of the AI implications of benchmarking data, and what do you see, as we stand today at the end of 2024, as some of the shortcomings or pitfalls, or some of the areas where we should be using caution, if we're putting this data into some sort of an AI-driven prediction algorithm?
Aaron Cohen:
Yeah, I think that's a great question. Would it be off-color to say that the slides were produced using AI? No, they weren't. But I think it's just understanding that the concept of recognizing patterns can help alert us to things, but it's not the end-all, be-all. It's still going to require a company that's got operations capable of performing the work within the budget and within the schedule. So again, I think AI is going to do a tremendous job of helping us identify those patterns and find those things that matter, but I think it's going to stop short of doing the work for us, like it does when I have to write an essay. It's still going to require the involvement of the people that are actually responsible for executing the work to make sure that they're using the AI technologies responsibly. So that's my take on how I see it. I think we're all waiting to see how AI is going to practically influence us in different ways, and it continues to evolve day by day. But yeah, that's how I'm looking at it for now.
Rick Deans:
But wouldn't you also think that the organizations that are doing a good job today of standardizing their data, grooming their data, and organizing their data are going to be in a better position to take advantage of the AI capabilities once they arrive?
Aaron Cohen:
Absolutely. Because, again, going back to the concept of, let's manufacture some coolers or thermoses and we're going to have a multi-year business plan around this scope of work: our projects, by contrast, continually change. So the more that you can collect, standardize, and normalize that data so that it is something these AI engines can use and infer conclusions from, the more populated your knowledge base is, the better information you're going to be able to mine out of it, and the more likely it's going to be to predict those patterns, to find those things that matter. To take the project engineer with three years of experience and give him a tool that allows him to say, hey, I think there's going to be an issue on this, because over the past 10 years we've had this many projects where this type of thing has taken a turn south and nobody was looking for it, but we found it because these systems were looking for those patterns for us.
Rick Deans:
Yeah, let's talk about that. So if we can go to the next bullet point: detect and analyze trends. We've talked about this a little bit, right? Our price is increasing over time. Does my productivity shift when I'm working in certain areas? I know just from experience that if I were going to do some oilfield services work in southeast Texas, I'd have access to a very well-trained, seasoned workforce. If I were doing the same thing in some small town in the Midwest, maybe that wouldn't be the case. So productivity rates at various quantities, we talked about that. Price changing as quantities increase or decrease, and then the comparison with similar projects. You talked about this earlier. I think the answer for AI is probably going to be a hybrid. We're going to rely, like we do today, on the computer for certain things.
Let me put information in here that I can retrieve at a later time. Let me perform a bunch of number crunching that's mind-numbing and that I don't really want to do on my Big Chief legal pad. Let the computer do that stuff. But you're absolutely right. Recognizing, hey, that project had a whole lot of rework on it because it was hit by a hurricane, and we had to restage all of our materials, and we lost some equipment, and yada, yada, yada, which is why maybe we didn't have the greatest production rates. So at the core of all this is a coding structure. How do we develop a coding structure that takes into consideration those anomalies? You mentioned earlier, hey, if I'm doing some embankment, what's the soil type? What does the access restriction environment look like? How do we incorporate that kind of stuff into a coding structure that we can then use to do a true comparison of "like projects" in the past?
Aaron Cohen:
Yeah, I always get a kick out of watching the process, because a lot of companies, without having a sophisticated solution to track a lot of this stuff and benchmark and create scatterplots, are doing it anyway, right? We're going to put an estimate together and we're going to say, "Hey, what are the past three jobs that are the most similar to the one that we're doing right here?" And invariably, when we do that, we always say, "Oh, well yeah, we lost our rear end on that one. Don't include that one because it's going to throw the numbers off." Well, that's the one we have the most opportunity to learn from. So if we advance one more bullet here, you'll see we talk about how this also enables continual improvement… Sorry, there were two more bullets. We'll see that benchmarking allows us to enable continual improvement.
And what that means is we're creating a learning organization, where we go out, we perform the work, we learn from the work, maybe it went well, maybe it didn't go as well as planned, but we can use that information to feed back into the way that we're estimating new work. And for some organizations, especially some of your smaller organizations that are starting to grow and expand, that can be a challenge, because there's a lot of work and a lot of discipline that goes into making sure that you do a post-mortem of your job and learn from what went well and what didn't, so that as you're estimating and doing new work, you can estimate it more accurately.
The comparison aspect, though, is one where we have the opportunity to not let it be such a manual process anymore. With a lot of the tools that are out there, this can surface a lot more of the information for us without necessarily requiring us to go out and identify and answer, what is the question I'm trying to ask? Just through the course of putting our estimate together, it can surface this information for us and tell us, hey, you're outside of a variance that you should be within for any particular scope of work.
Rick Deans:
Let's go on to the next slide. I want to talk about maybe some creative uses of benchmarking. When we think about benchmarking in construction, a lot of times we're drawn to all of the examples we've just talked about: direct-labor work activities, pouring concrete, doing embankment, putting in utilities. But one of the things that I've played around with a little bit, with the help of some of our customers, is risk. I want to be able to see what my risk or contingency allowance was in the estimate as a percentage of my direct costs, for instance. And this can be really helpful for owners and engineers as well. As we go through our various stage gates, as we're getting our various levels of funding approval for a project, we typically want that contingency bucket to be a smaller and smaller percentage of the job total as we're getting, presumably, more and more detailed design engineering and we're progressing from 30 to 60 to 90% plans.
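A hypothetical sketch of the stage-gate contingency tracking described here: compute contingency as a percentage of directs at each gate and watch the drawdown. The gates and dollar figures are invented to show the pattern, not benchmarks from any real program.

```python
# Contingency as a percentage of direct cost across stage gates (illustrative).
stage_gates = [
    # (design completion, direct cost, contingency carried)
    ("30% design", 10_000_000, 2_000_000),
    ("60% design", 11_500_000, 1_400_000),
    ("90% design", 12_200_000, 700_000),
]

for gate, directs, contingency in stage_gates:
    pct = 100 * contingency / directs
    print(f"{gate}: contingency is {pct:.1f}% of directs")

# Output shows the drawdown curve: 20.0% -> 12.2% -> 5.7%. A future project
# with the same known unknowns can be benchmarked against a curve like this.
```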
That's something that I've helped some customers track, and it's really interesting to see, as they're looking at doing a similar project in the future, what should we really be carrying in this project for contingency? And if they're good about how they've been doing that, they can actually track not just what they had in the estimate for contingency, but what the drawdown of contingency looked like, even a time-phased distribution of that contingency for certain projects, so that going into a particular project, if they've got the same known unknowns, they might want to have a similar contingency bucket that they could draw down upon. Have you had experience in that realm, Aaron, sort of getting away from the direct work activities and maybe looking at things a little bit more creatively?
Aaron Cohen:
Yeah. I would say, if we can click on the next bullet, maybe a little bit more focused on this one: your design budget for any particular project. Because I know for a lot of agency, publicly funded work, procurement laws, at least here in the States, are many times going to limit what you can spend in the design phase as a percentage of your overall program budget. So I think those things all influence the risk that is inherent in any project. And as we continue to engineer out that risk, we can make our dollars go further.
Rick Deans:
And to that point, if I'm an owner, if we go to the next bullet, maybe I've got a candidate project in mind and I don't necessarily want to go out and get quotes from contractors at this point, because that's going to mean a lot of back and forth. They're going to want all the details; I don't have all those details. They're going to get excited about the opportunity, and I don't really want that. What I really want to do is put together a conceptual budget. I think, based on the output of this asset, this factory or whatever we're going to build, based on some high-level attributes of that asset, it should cost around X.
Now, is that a definitive estimate that I'm going to go and sign up and put my name on? No, but it starts the process. It starts the budgeting process, starts to get us thinking about this project in terms of where it fits in the overall strategy of the organization. And having that good data does give me the ability to come up with some good, valid early-stage estimates. Then, as I begin to go to the market and get quotes from the contracting community, I can use my own benchmarks as a means to either validate their numbers or have a discussion with them about them. Why is your number so much higher here?
I really felt that, based on past performance, we would be able to be a little closer to this. And maybe it's a learning opportunity for me as the owner: well, this is going to be different because of this and this. So here we're looking at it from an owner's point of view, but if I'm a general contractor and I hire a lot of subcontractors, maybe I want to use the history that I've got as a means of managing their performance as well.
Aaron Cohen:
Yeah, this goes back, again, to having the data at your disposal. As a general contractor hiring a subcontractor, you hire the sub because they're specialized in their work and they know how to do it effectively. That doesn't preclude you from still needing to have an understanding of what the scope is and estimating the work that they're going to do. And the more detailed an estimate you can have, the better off you'll be. If you're able to benchmark your own assumptions based on the scope of work relative to the quotes that you're taking from your subcontractors, and based on the subcontractor that you signed up and your ability to manage their performance throughout the course of the project, that data can help you more easily identify issues.
I can't tell you how many times I've learned personally from a job where I'm estimating it, and all of a sudden I think this scope of work is going to cost something, and then all the sub quotes I get cost significantly more. What part of this job did I just miss, right? So having access to that data and being able to benchmark and see those trends pop up helps you better manage not just the selection of your subs, but also their performance. When it's your own forces performing the work, you know how much you're paying them, because you have to put those payroll dollars down on the check, and you have a very fine level of control over performance. But with a subcontractor, it becomes a lot more challenging to watch their progress and to measure performance, because you don't have firsthand information coming back at you as much as you would if you were self-performing that work. So again, being able to track and manage those trends is critical for success.
Rick Deans:
I just had an opportunity to look at the chat. Looks like our attendees are going crazy with a lot of side discussion here. There are some really good ideas discussed, including maybe developing a community where people can share their data with peers. In some industries here in North America, there are groups that have a little cottage industry of collecting that data and then selling it back to the folks they collected it from. But it does let them see how they compare on certain tasks to their peers in the industry. And there's this sort of magic quartile that everyone wants to be a part of, right? We want to be the best in the business at what we do. But the really good idea of maybe sharing some data across groups certainly does bring up confidentiality concerns and things like that.
There was a great suggestion in here about city plan check and building permit costs. And not just the costs of getting those permits, which overall might not be a lot, but the duration of getting those permits. Maybe we're thinking it's going to be a 90-day process, but our benchmarking information tells us it could be at least 180 days with specific entities. We want to account for that, because if it's going to hold up construction for six months, there's going to be a whole lot more cost than just the actual cost of the permit. So really, really good suggestions there.
And then, yes, Pablo has a good point there too: owners do not value the initial stages of the procurement process; they just want to award the contract and start moving money. Yes, I see that a lot, but I think they tend not to have as much appreciation for all that front-end work as the folks that bear the majority of the risk for doing it. So really, really good suggestions there. Didn't mean to stray from the script, but I think it was important to share some of those discussions. Another thing, too, is design growth. So, shamelessly, we have a set of tools that can help track that quantity growth. As we're putting an estimate together, in the early stages of the design process, maybe we want to rely on some historic data that says small-bore piping in this type of project, based on these attributes, you'd better expect is going to have a growth factor of 50%, based on this, this, and this, and what we've seen in the past.
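As a quick, hypothetical illustration of applying a historic growth factor of the kind just mentioned (the takeoff quantity and the 50% factor are assumptions for the example):

```python
# Apply a historic quantity-growth factor to an early-stage takeoff (illustrative).
takeoff_quantity_lf = 4_000        # linear feet of small-bore pipe at early design
historic_growth_factor = 0.50      # past similar projects grew ~50% from this stage

expected_quantity_lf = takeoff_quantity_lf * (1 + historic_growth_factor)
print(f"carry {expected_quantity_lf:,.0f} LF in the estimate")  # 6,000 LF
```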
Aaron Cohen:
And a trend that you'd certainly see now, which is the last bullet point up there, is having a platform that's going to support the collection and the use of all this data. You see it a lot in the technology industry now; there are a lot of different offerings out there that are intended to provide that platform experience. And this is why, right? Once you have the ability to collect all this data, if it's sitting in different solutions that don't talk to each other, it becomes really, really hard to start making these connections and seeing these patterns and trends. As technology continues to improve in our industry, you're going to see more and more capabilities relative to that. So as you're making decisions about what direction you want to take your company in as far as technology and growth, it's just one of those things to keep in mind, and for that reason.
Rick Deans:
There are even some good reading suggestions in the chat there. And there's sort of a theme running through the chat, which is what we talked about earlier, Aaron: okay, great, we've got a lot of community data, but maybe that data doesn't really mean much to my organization because of the idiosyncratic way we go about performing and executing our work. So I think the takeaway here should be, hey, while it would be very aspirational, while it would be great to share data across the industry, maybe the best place to start is getting our own house in order, starting with our own organization's data, and, as someone commented, normalizing that data, which is really key, especially when we want to do historic comparisons and actualizations.
Aaron Cohen:
Another thing that popped up there that I'm not sure we touched on: I see Michael discussing the level at which you're benchmarking and the management approach. And that's another key aspect, too. Do you get all the way down to the bolt-and-gasket level of tracking your productivity, versus just looking at a very high level overall on this plant: the piping should be this, the mechanical systems are that? Getting into the right level of detail, based upon where you're at in that estimate, is important.
Again, if you're very early on, you're more the owner or owner's rep, and you're trying to come up with a budget that's going to be reasonable so that you can secure funding for the project and move forward, then you're benchmarking at more of that system-level approach. Whereas if you're the contractor that's signing the contract and is responsible for meeting that man-hour budget, otherwise you're not going to make money, then you're going to be getting down to that deeper level. So understanding that will influence the level of information you're going to want to track, the way you're going to set up your budgets, and the level of detail you get to in your estimates; it will all be driven by how you want to be able to use that data moving forward.
Rick Deans:
And don't forget your organization's maturity index either. If I'm just starting out doing this stuff, give me six buckets I can put my time and my costs into; I can expand that out to 60 buckets later, and then maybe I'll have a few dozen more of these things. But don't try to boil the ocean. I've seen too many people say, we have to have this coding structure that goes down 15 levels of detail, but they're not equipped to manage that. And then the other piece that I see is, okay, well, we've done a good job of collecting data; in fact, the three-ring notebooks behind me on the shelf are full of really, really good cost reports, but it would take me six years to digitize all of that information.
My response to that would be: go back and select a few historic projects where you want those benchmarks, but put your energy into capturing what you're doing today. While you're sitting there trying to get all your 20 years of experience shoved into a system, you've got stuff going on out in the field today that you're not capturing. So that's always been my recommendation. Pick a handful of historic projects, make it an ongoing continuous-improvement effort to go back and pick up some of that historic data, but put your energy and your focus on what's happening today and tomorrow. Then you'll wake up 18 months from now and you'll have a bevy of really, really good data that you can rely on, combined with some pretty good data from the past too.
Aaron Cohen:
Yeah, that's a really great point. Glad you brought that up, Rick. Because I have seen more than a few initiatives that were well intended but fell short, because all of a sudden it's, hey, here's this structure with 50,000 cost codes that we're going to implement tomorrow so we can track things at this level of detail, and it falls over when it gets out to the field, where somebody's got to actually track all that information. And now I've got to pick from 50,000 account codes. Guess what? It ain't going to the right one, and the information you're getting back is not going to be meaningful. Again, well intended, but to your point, start small and expand as you start seeing value from the implementation of something like this. So really good point.
Rick Deans:
The 80/20 rule is the latest comment in there. So 80% of your accuracy will come from 20% of the data anyway. When you're just getting started, what are the 20% of all of the tasks that we do 80% of the time? Let's get some good data on those, and then we can branch out from there. You can always look at this as a very phased approach to getting better and better over time.
Aaron Cohen:
Yeah. I see that we've got a lot of things in the chat. I also see there's a Q&A window where there are a lot of things queued up. I don't know, Mike, are you able to, or Joanna, help us with triaging some of these? Are there any questions that we want to cover in the remaining time that we have?
Mike Pytlik:
Sure. I could certainly go through and fire a few questions off to you guys if you’re okay with that.
Rick Deans:
Sure. We may want to answer some of those offline.
Mike Pytlik:
Obviously, for our participants, we'll make sure that we get an email out, hopefully following up with any other questions. So the first one here is: will you address rules of credit for different commodities for different facilities, i.e., nuclear, gas, office construction?
Rick Deans:
Yeah, absolutely. And that's typically during execution. So when I've achieved so much of this work, I've achieved X percentage of the broader package, and our software does allow for that during execution. So if that's the question: yeah, while we're executing against a project, if we want to say that completing this particular stage of the work is going to earn us 15% of completion for this package, we can certainly do that and allow that progress to be recorded as we check off those individual tasks.
Mike Pytlik:
Okay. Another question here about KPIs. So what KPIs should be monitored to measure optimal efficiency?
Rick Deans:
Just like a lot of our discussion has been centered around: that's going to vary company by company. The KPIs within your organization might be a lot different from the KPIs in the organization across the street. But typically what people are going to look for is efficiency. How close was our man-hour factor in the field, i.e., how many man-hours did it take us to install a unit of work versus what we had estimated? Is there an opportunity to drive that variance down across multiple projects?
Other efficiency factors that I look at are, okay, well, we did the work just the way we thought we were going to, but we paid the people twice as much as we thought. So we want to make sure that our cost per workforce hour during execution for that set of tasks aligns with what we had put in the estimate. Are we using the best resources in the field, the ones that are going to be cost-effective? Obviously, schedule durations are things we want to look at as well from a KPI perspective. What percentage of the original duration did we actually consume? Did we consume 89% of it or did we consume 130% of it, right?
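A minimal sketch of the efficiency KPIs just listed: man-hour factor, cost per workforce hour, and percentage of planned duration consumed. All figures are invented for illustration.

```python
# Three simple execution KPIs compared against the estimate (illustrative data).
estimated_mh, actual_mh = 5_000, 5_600        # man-hours for a scope of work
estimated_rate, actual_rate = 58.00, 61.50    # cost per workforce hour
planned_days, actual_days = 120, 156          # scheduled vs. actual duration

mh_factor = actual_mh / estimated_mh          # >1.0 means more hours than estimated
rate_variance = actual_rate / estimated_rate  # labor cost per hour vs. estimate
duration_consumed = 100 * actual_days / planned_days

print(f"man-hour factor:   {mh_factor:.2f}")                   # 1.12
print(f"rate variance:     {rate_variance:.2f}")               # 1.06
print(f"duration consumed: {duration_consumed:.0f}% of plan")  # 130%
```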
Aaron Cohen:
Yeah. And I think a lot of times those KPIs will be driven by what it is that drives the risk and drives the success for your company. A general contractor may have a lot of its risk covered in the assignment of subcontracts, and the biggest thing is making sure that the schedule is met, because that's going to allow you to get your indirects and all that type of stuff capped. So schedule-driven KPIs are going to be more in tune with you. Whereas for a self-performing contractor, it might be more: you've got to look at your man-hours, you've got to make sure that labor's not exceeding what you've got budgeted, and all that. So yeah, a really long way of saying it depends, but it depends.
Mike Pytlik:
Okay. I think we probably have time for a couple of questions here. So a question came in from a team that uses a bunch of estimates in InEight Estimate: does Estimate have a way to easily combine and search that information to establish their own benchmark database?
Rick Deans:
So depending on what version of the software you're using, you would either combine that in what we call our data warehouse or you would use our connected analytics tool. But it's basically a way of taking that data out of those individual estimate files and, to your question, putting it into a common database that you can then filter based on a variety of factors. One of the things we didn't discuss is filtering down to the projects that I want to use: which projects best represent the project that I'm estimating? Maybe I want to look at projects we did in a certain region, or for a specific customer, or during a specific timeframe. So all of that data can be grouped together. And Gaurav, I did see that question, and we will get access to this list of questions, so we'll probably reach out to you after the session and we can explore that in more detail.
Mike Pytlik:
I guess there's probably time for one more question here. There are a lot of really good questions about the different types of benchmarking, and I apologize that we can't get to them all. The age of data has come up in a number of different questions. Do you guys have any final thoughts about that as we wrap up here, and then we'll go ahead and wrap up the webinar?
Aaron Cohen:
Yeah. Again, not to use a cop-out here, but it depends on what it is that you're trying to compare. If I had 100 years' worth of data, I'd probably be challenged to go back much more than 10 years; the previous 90 aren't going to be very useful. Production-type information tends to hold up pretty well: there are only so many ways to build a wall, only so many ways to dig a trench. We make incremental improvements over the years, but there aren't the same kinds of volatile swings you see in cost information. What are materials costing today versus 10 years ago, versus two years ago? What do labor costs look like? How much have they escalated?
Going back and looking at those trends, sometimes we can look at those commodity curves over time and predict what we think a project is going to cost, especially if we've got a project that's going to run a number of years in duration and we have to forecast what we think the escalation risk is going to be. So all of those things are things to look at. If you've got the good problem of having so much data, you might want to time-box it and make sure that you're using data that's relevant. Other factors are geography and type of work: making sure that all of those aspects of the job that are representative are accounted for, so it's not just a blended average of everything, but something that's specific to the work you're trying to estimate.
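A simple sketch of time-boxing benchmark data and escalating older unit costs to current dollars, per the comments above; the 10-year cutoff and 4% annual escalation rate are assumptions for the example, not recommendations.

```python
# Filter benchmark records by age and escalate surviving costs (illustrative).
from datetime import date

records = [
    # (job year, unit cost at the time)
    (2008, 41.00), (2016, 52.00), (2020, 57.50), (2023, 66.00),
]

cutoff_years = 10              # time-box: ignore data older than this
annual_escalation = 0.04       # assumed average cost escalation per year
this_year = date.today().year

for year, unit_cost in records:
    age = this_year - year
    if age > cutoff_years:
        continue               # too old to be representative
    escalated = unit_cost * (1 + annual_escalation) ** age
    print(f"{year}: {unit_cost:.2f} then -> {escalated:.2f} in today's dollars")
```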
Mike Pytlik:
Okay. Well, it looks like we're getting toward the end of our session here, so I'd like to thank everybody for joining. I'd like to thank InEight and their presenters for today's webinar. Just a reminder, this event is CEU eligible, and attendees will receive a certificate of attendance in the post-event email from AACE in a few days. Thank you again for tuning in, and I hope everyone enjoys the rest of their day. Thank you, everybody.
Aaron Cohen:
Thanks all.