KP Unpacked

No One Trains AI and That's a Problem

KP Reddy

 In this episode of KP Unpacked (AI in AEC edition), Jeff Echols and Frank Lazaro—KP Reddy Co.'s in-house AI expert—tackle one of the biggest blind spots in AI adoption: training your team. 

Everyone’s talking about AI. Few are doing anything useful with it. From underused tools to forgotten policies, they break down why most firms are missing the mark—and how to fix it. 

As always, the number one podcast in AEC brings you practical, no-fluff insights designed to help you move fast and build smarter. 

🔹 Why tossing your team into ChatGPT is not a strategy
🔹 The “12-minute rule” that unlocks 10,000+ hours of productivity
🔹 What your AI policy is probably missing (hint: actual training)
🔹 Train-the-trainer vs. send-everyone: Which model works best?
🔹 The playbook strategy every AEC firm should adopt
🔹 How lunch & learns can quietly drive AI transformation

💡 Key takeaway: AI adoption without training is like CAD without a mouse. Build the system. Create the culture. Track the ROI. 

🎧 Plus: How to handle prompt hoarding, office power outages, and becoming your firm’s AI guru (cape optional). 

🎉 Special Offer for KP Unpacked Listeners: Get 55% off your ticket to the 9th Annual AEC Summit on October 29th at the Diverge Innovation Center in Phoenix! Click the link below and use promo code UNPACKED55 at checkout.

🔗 9th Annual AEC Summit

Don't miss this opportunity to connect with top minds in AEC and beyond. Tickets are limited—act fast!

Speaker 1:

Hey, welcome back to KP Unpacked. This is where the biggest ideas in AEC, AI and innovation collide. It's powered by KP Reddy Co., and this podcast breaks down the trends, the technology, the discussions and the strategies that are shaping the built environment and beyond. This is the AI in AEC version of the KP Unpacked podcast. My name is Jeff Echols. I head up our mastermind program and our startup incubator here at KP Reddy Co., and I am joined, as always for this version of the podcast, by my friend and colleague, Frank Lazaro. Frank is one of the heads of our advisory program here at KP Reddy Co. and he is our in-house AI expert. So it's always fun to be joined by Frank and to dig into these topics. I get to learn and you get to learn, and we focus on giving you actionable takeaways from what it is that we talk about. So, Frank, welcome, I'm glad you're here, and I'm getting ready to dig into another topic with you.

Speaker 2:

Another exciting topic, yeah. I know we were talking back in the green room earlier, but it's interesting being out in the wild and having listeners come up to me and start talking about our first couple of episodes, and it's amazing, just the positive feedback that we're getting. So thank you to all those that are listening. But Jeff is probably going to say this a dozen more times before we get off: if you have topics or ideas that you want us to dive into, leave a comment, let us know, shoot us a note. We're happy to talk about it. So, yeah, looking forward to this episode.

Speaker 1:

Yeah, absolutely, because, like you said earlier, this is the purpose of this. We talked about it as we were iterating on what this podcast could and should be. There's plenty of discussion out there, plenty of podcasts out there, and this is not a knock on any of them, but plenty of them talk about AI and emerging technology, different tools, et cetera, from the 30,000-foot view.

Speaker 1:

You know, here's this, here's that, here's the other. Very few of them focus specifically on the AEC world, the architecture, engineering, construction, built environment world, and even fewer, if any, focus on something that's actionable: hey, here's what we're going to talk about today, and at the end of this we're going to give you the one or two or three things that you need to go do in your firm this week.

Speaker 2:

I think that's a great point, an excellent point, because the feedback that I get from the listeners and the people I interact with is that they love the practical nature: you guys dive deep on one topic, you're giving examples, and I think that's the purpose and the value in this. There's no reason for me to tell you that AI is important. I think we all know that, right, it's going to impact us all in some way. The practical nature of what we do here in 20 minutes, I think, is more valuable than what most people get when they sit in a workshop all day.

Speaker 1:

Right, yeah, absolutely. And with all of that in mind, it really doesn't matter whether we spend 20 minutes or half an hour on this episode if you're not able to do something with it. So back to Frank's point a minute ago: if you have something that you want to know about, something you want us to dig into, let us know. If you tell us, hey, this is what we want, then we can talk about that and make sure it's valuable content, a valuable podcast for you. Probably the easiest way to do that is to jump over to our YouTube channel. It's KP Reddy, the letter K, the letter P, R-E-D-D-Y, and just drop a comment below the video there on YouTube. Of course, you can consume this in all kinds of different formats, the YouTube shorts, the long form, the podcast version on Apple or Spotify, whatever, which is also the beauty of technology. But to leave us a comment and let us know what you want to hear us talk about, just go over to YouTube and drop the comment there, and we'll be sure to pick it up.

Speaker 1:

Our production team will be keeping a lookout, and anything we talk about that needs a link goes in the show notes. Our production team is dropping links to the things we're discussing into the show notes, so you don't have to worry about taking notes or anything like that. Just go to the show notes, find the link and go check it out. All right, Frank, let's get into it. Our topic today is AI training: getting your team up to speed. In one sense, I do absolutely believe that everybody on your team has a responsibility to learn some of this stuff on their own, right? Here we are, talking about this emerging technology, about AI and these futuristic things, the same way we talked about using the internet and web browsers, I don't know, 15 or 20 years ago. I'm just going to do the math real quickly in my head.

Speaker 2:

Stop, you're going to make me feel old.

Speaker 1:

It's too late. I got up this morning, I know how it goes. So I think there's some responsibility, right? This isn't even the future, it's here. So we have some responsibility to learn some of these things on our own. However, we're all about building teams, supporting teams and improving teams, so we may need to train our teams on some things. So where do we start?

Speaker 2:

So it's interesting. There are really two things that I always bring up when I'm giving a presentation or talking with a client, and one is just to make a point: we have no problem sending a project manager to a PM bootcamp, spending thousands of dollars and having them away from the office for two or three days. Then my immediate next question is, how much money have you identified for AI training? And inevitably it's zero. They haven't really thought about that. And my point is, just giving somebody the tool is not going to make them more efficient. Honestly, I think what you're going to find is that they'll waste more time trying to figure out how to do it than they would if they'd gotten training. So for me it's a cornerstone.

Speaker 2:

I kind of started my consulting career giving AI training, so I have a lot of passion around that. But I think the real issue we have here is that we're giving people these tools with the expectation that they're going to figure it out, and then we worry about data security, we worry about client information. Okay, great. Maybe if you trained them on those things they would actually use the tool more effectively.

Speaker 1:

Yeah, well, I think that's a really great point. Right, there's the tool itself: can they learn how to use ChatGPT? Can they learn how to use Claude?

Speaker 1:

We know a lot of people are using Copilot and other tools. Sure, they can learn, they can improve their skills, they can get better at writing prompts, et cetera. But those things you just mentioned, the data security and privacy and things like that, may not be the things that pop up on their radar. So there's that aspect of it, which is critically important, and we had a session for our mastermind members probably a month and a half ago now, all about AI policies for your firm, and we actually talked about that on the podcast as well, an episode or two ago. It's really important, right, because clients are asking, what about my data? These types of things are coming up, so we need to make sure people are using the tools properly, just like anybody else that uses a tool. And I think your point about efficiency, right.

Speaker 1:

Can they learn it? Yes. Can they learn it quickly? We don't know. And it reminds me that you have your book, Finding 12 Minutes. So, if you want, go over to Amazon and look up Frank Lazaro, or look up Finding 12 Minutes: Unlocking Efficiency with Generative AI. That's one of the things we talked a lot about when we first launched this podcast. So why don't you recap that really quickly?

Speaker 1:

Because I think it ties to this idea of training pretty powerfully as well.

Speaker 2:

So the whole concept is about breaking down the use of generative AI from a productivity and efficiency perspective. If you go with the small goal of saying, I want to save 12 minutes per day using generative AI, and when I say generative AI, that's ChatGPT, Copilot, Gemini, pick your tool of choice, you'll end up saving about an hour per week. And then if you build that out across the whole year, what you'll find is that you're saving more than 40 hours per year per person in your firm. The one big metric, and we all know this, is that every firm focuses on utilization. So you're looking for found hours, you're looking to improve efficiency and productivity to get people doing more billable work. Using an AI note-taker to save five minutes per meeting, using Copilot or Gemini to help respond to an email or capture meeting notes, you can save those small pockets of time, and collectively over the year you gain those efficiencies. For example, a 200-person firm can generate an additional almost 10,500 hours a year.

Speaker 2:

That's the equivalent of almost five full-time employees' worth of hours, and that's just getting everyone to do 12 minutes per day. So I think that's the baseline, the bare minimum. You know, if you save five minutes per meeting and you're in three meetings, you're already at 15 minutes. I use it consistently throughout the day and I probably save about an hour or more per day. So I'm able to do more, I'm able to time-shift. We all have things to do during the day, sometimes a doctor's appointment or whatever, but I can make that up using generative AI.
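
For listeners who want to sanity-check Frank's arithmetic, here is a rough back-of-the-envelope sketch of the 12-minute rule. The work-week, weeks-per-year and 2,080-hour FTE figures below are our own illustrative assumptions, not numbers from the book, so your results will shift a bit depending on holidays and PTO.

```python
# Back-of-the-envelope take on the "12-minute rule" (illustrative assumptions, not Frank's exact model)
MINUTES_SAVED_PER_DAY = 12
WORKDAYS_PER_WEEK = 5
WEEKS_PER_YEAR = 52           # swap in ~48 if you net out holidays and PTO
FTE_HOURS_PER_YEAR = 2080     # a common full-time-equivalent year

hours_per_person_per_week = MINUTES_SAVED_PER_DAY * WORKDAYS_PER_WEEK / 60   # ~1 hour per week
hours_per_person_per_year = hours_per_person_per_week * WEEKS_PER_YEAR       # ~52 hours, i.e. 40+

firm_size = 200
firm_hours_per_year = hours_per_person_per_year * firm_size                  # ~10,400 hours firm-wide
equivalent_ftes = firm_hours_per_year / FTE_HOURS_PER_YEAR                   # ~5 full-time employees

print(f"{hours_per_person_per_year:.0f} hrs/person/yr, "
      f"{firm_hours_per_year:,.0f} hrs firm-wide, ~{equivalent_ftes:.1f} FTEs")
```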

Speaker 1:

Yeah, and so we know that many AEC professionals, and of course this is not unique to the AEC world, struggle with the adoption of AI and different AI tools. So what are some of the common issues that you see as you're going around doing these trainings and workshops? What are some of the common issues out there?

Speaker 2:

So, right off the bat, every firm immediately ran out and created an AI policy. Okay, great, but did you train people on that policy? It's funny, when you think about AI training, everyone thinks about training on how to use the tool. And yes, absolutely, you need to train people on how to use the tool, but you also have to train them on the things that you care about, like how do we use client data? How do we do this? What is our data security? So training goes beyond just saying, I need to learn to prompt. That's probably one pillar of several that your AI training should cover.

Speaker 2:

So I think one of the problems you see out there is that firms rushed out, created these AI policies and never really trained people on what the policy is. The second part, and I think it's probably equally problematic, is that not only have you not trained them on the policy, you're also just giving them a tool they don't know how to use, and so what you see is a lack of adoption: oh yeah, we have all these AI tools and no one's using them. Okay, but what did you do from a training perspective? What was the cadence around that? Again, think about all the other aspects of our business. We have training regimens already in place, right? We don't make somebody a project manager without actually giving them the PM training. We need to be thinking the same way when it comes to AI.

Speaker 1:

Yeah, absolutely. We're getting to a point in time where those coming up into leadership, maybe not top-level leadership yet, although they're close, are digital natives, different than you or I. But even though they're digital natives...

Speaker 2:

We're digital ancestors.

Speaker 1:

We are. But it's not a guarantee, right? There are different aspects to training. So when you're thinking about training, how do you go through it? What are the steps you use when you're taking a firm through AI training and adoption?

Speaker 2:

So it's interesting. I want to touch on one thing, because you just triggered a thought there. Go back to this concept of how you would train someone to use Autodesk or Revit. Very rarely do they have Autodesk or Revit for personal use; they're not using Autodesk at home. It's really different with AI. Some of their learned behavior and some of their training comes from personally using it, doing things and trying things, and that doesn't necessarily translate into how you want them to use it from a business perspective. They're planning trips, they're making funny videos, they're making funny images, all from a personal perspective. So with this particular tool, there's that gray area between, I use it for my personal stuff and I kind of know how to use it, and they take that same behavior and try to apply it on the business side, but it may not fit the AI policy, and it may not get the results that they want.

Speaker 2:

So some of the areas to think about, at least when I think about this: yes, you need to send your people out to get AI training. Does that mean you send everyone? No. There's the concept of train-the-trainer: send one or two people to get trained up and then have them come back to your organization to train everyone else. There are a ton of free resources out there as well, not necessarily geared towards AEC specifically, but there are a bunch of trainers out there doing stuff for our industry. And again, pick your association, pick your three- or four-letter acronym, they're all offering something along those lines. Heck, we even offer it. So there are plenty of opportunities to do that. It doesn't need to be daunting, it doesn't mean, oh my God, I have 150 people in my organization, I have to send them all to this training.

Speaker 2:

You can do a train-the-trainer model; that's very effective for very big organizations. But you need to have those people, and they need to be champions. They need to document the training, right? So use Loom or record a Zoom call, do those things so you have it, because people aren't going to learn it the first time you tell them. They're going to have to go back and say, oh, what was Jeff saying in that training about that prompt? Let me go back. And then a key around documentation: I love telling people, if you find a prompt that works, save it, go back to it. I guess the other thing I should be telling them, too, is to share it. Share it with a colleague.

Speaker 2:

Hey, this is how I'm using it, this is the way that I've gotten these results. That would be pretty effective to really start kickstarting that within your organization.

Speaker 1:

Yeah, that's great advice, certainly. You know, it's funny, you were mentioning the alphabet soup, that's what I call the associations, right? The AIA, the ACEC, the PSMJ, the SMPS, the list goes on and on. Maybe KP Reddy Co. is part of the alphabet soup, I'm not sure.

Speaker 2:

But we're getting close.

Speaker 1:

We are. We're fully embedded, certainly. But you also made a point a minute ago that some of this training is not necessarily AEC-specific, and I would say, and I doubt you'll disagree with this, that for the vast majority of what we're doing and the topics we're talking about in these sessions, I mean we haven't talked about TestFit or Swapp or any of those specialty tools.

Speaker 1:

So many of these tools we're talking about, Google NotebookLM and ChatGPT and Gemini, the things you mentioned earlier, Claude and Copilot and those, the training doesn't need to be AEC-specific, right?

Speaker 2:

No, I don't think it does, right. There was a phrase that I used in one of my trainings a long time ago, and I say a long time ago, it's like last year, you know, in AI years.

Speaker 2:

That's like a century. But the phrase was, the prompt is the prompt. The one thing that you have to know about these tools, and I love showing this side by side, is that I'll open up Copilot and open up ChatGPT, put the same exact prompt in both, and they'll both operate on it. So once you learn how to use one, switching to ChatGPT or Copilot or Claude or Gemini, they all operate and act the same way. Prompting is prompting; that doesn't really change anything.

Speaker 2:

The specialized tools are a little bit different, right? The Swapps, the Joyces, the Workorbs of the world, those are generative AI tools, but they're very specialty tools doing something very specific, creating construction documents, doing proposals. That's a different learning model, a different learning curve. But generally speaking, from a prompting perspective, if you learn to use one of the general tools, you can use any of the other ones, which is fabulous in a lot of ways. Just know that as you start adding in some of these unique tools, your training is going to have to be a little bit broader. It's not just about learning ChatGPT; we're onboarding X tool, so you also have to have training around that tool.

Speaker 1:

Yeah, a hundred percent. And I think we know you're not going to send your accounting team to get trained on AutoCAD or Revit, nor are you going to send your design team or your construction team to some nondescript, non-industry association for training on AutoCAD or Revit, right? These are things that we know; it's sort of common sense. So, if I were to sum up the steps you're talking about in terms of training and adoption: you talked about sending a certain number of people, the train-the-trainer approach, but sending them to something hands-on, or maybe even bringing somebody in-house to do the hands-on training, the live demos and the workshops. And then you talked about documentation, so maybe creating playbooks. Okay, this is what we talked about.

Speaker 1:

These are the prompts, these are the best practices. And then you talked about those AI champions, I think is what you call them. They're the people who are going to carry it in-house: hey, I've got a question, I need to go talk to Frank, or I need to go talk to Jeff, because they're our in-house gurus or whatever title we're going to give them. They probably have to wear a sash or something like that around the office. They get little pins of honor.

Speaker 2:

You know, it's interesting. I've also been telling organizations to do basically what we try to do here at KP Reddy Co.: having those live demos and lunch-and-learn type opportunities for people to share the different things they do. So I would encourage organizations, even from an ongoing learning perspective, maybe once a quarter, to have a one-hour lunch and learn where people can come and say, hey, this is how I use this tool. What you find, particularly with AEC firms, is that they like to see what other people are doing. They may not do it exactly the way you're doing it, but it helps them visualize: oh, I can see how this can be integrated into my workflow, I'm just going to have to do these couple of steps differently. So the live demos and that lunch-and-learn environment, where people can, one, toot their own horn, like, hey, check out what I've been doing when it comes to generative AI, could spur the thought and help someone else within the organization learn.

Speaker 2:

KP and I were actually at a client meeting, a workshop, a couple of weeks ago, and it was interesting. You had one department head saying, oh, we're doing all these things from an innovation and AI perspective, and their counterpart was like, oh, we didn't know you were doing that. When you talk about training, just being able to get her on a quarterly call to say, hey, these are the things I'm doing, probably would have gotten the other department head to go, okay, wait a minute, maybe we should be doing that. So I think there's a lot of opportunity, once you get the baseline training done, to continue that training. Honestly, I think it's a very continuous training model that most firms are going to have to do.

Speaker 1:

Yeah, absolutely, it's got to be. The tools are changing so quickly, new tools being developed and everything else. It's got to be continuous, right? So I hear what you're saying. If we go through this training and do all the things we've been talking about, it's going to help us adopt the AI tools faster. I should say the AI tools, not the AI; I'm not talking about the eventual Arnold Schwarzenegger, Skynet, Terminator kind of event. It's going to help us adopt AI tools faster.

Speaker 1:

And I like what you said, right: the other department needs to know what we're doing, and spreading that throughout the organization can reduce some of the resistance to change, because there are people, and we hear about this sometimes in our mastermind groups, saying, hey, the thing that's holding us back is people's resistance to change. As it turns out, human beings often resist change. So there's that. And then the other thing we touched on, with the policies and everything else: we need consistent use, consistency in the time, consistency in the way the data is handled and everything else, because if you talk to your innovation team, or better yet, talk to the people dealing with the client-facing side, ask them what questions and what requirements the clients are starting to push down in terms of AI.

Speaker 2:

So, yeah, and I think there's another pro in there too: if you have a consistent training program in place, it's easier for leaders to see the ROI on the investment. If I invested this much into this technology, this innovation, what am I really getting out of it? The training helps fortify that, because now that people are trained and using it more effectively, you can really start understanding and diagnosing where the return is on those investments. And, by the way, some of this investment is not cheap, right? If you're building your own AI solution, or even if you're buying ChatGPT subscriptions, it's just another subscription cost added on to all the other subscriptions you have from Autodesk and Revit, et cetera. So, again, another pro here is that it helps with defining and helping leadership understand the return on the investment.

Speaker 1:

Yeah, and I think that also brings it back full circle to what you were talking about before, the title of your book, Finding 12 Minutes, right? If we can implement this training system and get that consistency and all these things we've been talking about, then we start to leverage that 12 minutes per person per day. That starts creating, or generating, I guess, that exponential math that you were talking about in terms of utilization rates and billable hours, and eventually somebody will pull out their Casio solar calculator and calculate the ROI.
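
Since Jeff jokes about pulling out the calculator, here is a purely illustrative ROI sketch. The per-seat subscription cost and blended billable rate below are hypothetical placeholders we chose for the example, not figures quoted in the episode or the book.

```python
# Illustrative ROI sketch (all inputs are hypothetical placeholders, not figures from the episode)
seats = 200
subscription_per_seat_per_month = 30      # hypothetical per-seat cost for a generative-AI tool
hours_found_per_person_per_year = 52      # from the 12-minute-a-day math above
blended_billable_rate = 150               # hypothetical $/hour for recovered billable time

annual_cost = seats * subscription_per_seat_per_month * 12
annual_value = seats * hours_found_per_person_per_year * blended_billable_rate
roi_multiple = annual_value / annual_cost

print(f"cost ${annual_cost:,}, recovered value ${annual_value:,}, ~{roi_multiple:.0f}x return")
```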

Speaker 2:

Yeah, no, I agree. But again, think about it from the perspective of why we send people to a PM boot camp. We want them to be more effective project managers. We don't want them to make mistakes, right, because we know that if we just let them off on their own, without the training, there are going to be mistakes in the project plan and in how the project's being managed, and we're going to lose money. AI is the same exact way. If they know how to do it and they know what they're doing, they'll be more effective, and that's where you get the ROI.

Speaker 1:

Yeah, a hundred percent. So on the con side of this, we know it's going to take an investment of resources, time, effort, money, budget, et cetera, and we also know that the technology continues to change, so, as you said, it's going to be ongoing training. We're going to have to keep updating these playbooks we mentioned before, we're going to have to keep on top of the training, and we're going to have to have people. You know, I picture it as a snowball or a rock at the top of the hill, right? At some point we've got these champions and they're latching on to the training, the trainers have been trained, they're starting to teach people, they're starting to document the prompts and all those things. It's going to start rolling downhill, but it still has to be continually updated. We have to stay on top of it.

Speaker 1:

So there are pros and cons, but what are the key takeaways? Say I'm listening to this right now. Maybe I'm the head of HR and I'm in charge of training for my organization. Maybe I'm the CEO thinking, hmm, how do I get my people trained? How do I get them on board? Whoever it is that's saying, okay, I understand now that I need to train my people, how do I do this? What's the key takeaway? What do I need to know going forward?

Speaker 2:

You know, the one thing that we tell most organizations, like when we do our KP Reddy Co. workshops, is that having a dedicated leader who's focused on being able to see things across the organization is super helpful. It doesn't mean you don't have inputs from other parts of the organization, but having someone who's able to see broadly across it is key. The other thing, too, is you have to stop with the surface-level activities that don't actually do anything. Just because you created the AI policy doesn't mean people are following it. Just because you gave them the tools doesn't mean they know how to use them.

Speaker 2:

So do all of those things, but have a cohesive, coherent plan that says, yep, when new employees come on, they get trained on the policies, they get trained on AI, they get trained on how to use it. You have to think about it from a more comprehensive perspective and not just focus on the surface-level things. Again, having an AI policy does not necessarily mean people are following it; giving people the tool doesn't mean they know how to use it. So make sure you have a more comprehensive plan, and don't just do the small things and then think that's being effective.

Speaker 1:

Yeah, 100%. As we've been going through this episode today and listening to what you're saying, it strikes me, and I said this at the very beginning, that I head up our mastermind program and I facilitate all of our mastermind meetings, so I sit in on a lot of meetings with people who have titles like innovation leader, chief innovation officer, director of innovation, head of construction technology, things like that.

Speaker 1:

So I sit in on a lot of these conversations, and as you're saying this, I'm going, yes, you're exactly right, because a lot of these people are that person you're talking about. They're in charge of the knowledge management or something like that, or someone on their team, depending on their level, may be dedicated to it. And it may be that somebody listening to this already has a robust training program, I know many firms out there do, and then this is just a component they need to build into that training program. But it's important, and probably growing in importance quicker than many of the other aspects of training that matter to AEC firms, would be my guess.

Speaker 2:

Yeah, I think the industry was caught off guard by AI a little bit, and so now we've got to play catch-up, right? Now that they see how pervasive it is and where it is, and all these tools, I suspect you're going to see more robust AI training at most organizations, particularly as the big software firms start introducing their AI tools, and obviously ChatGPT is out there. So I think what you're going to find is that most firms that get ahead of this now are going to be very well suited for when some of these newer tools come out later.

Speaker 1:

Yeah, and I've said this a few times, and it dawned on me as I was saying it. You know, when I talk... we just lost Frank. When I talk about your clients asking, or having something you have to fill out, about how you're using AI, I know not all clients are asking that, and not everybody's clients are paying attention to those things. But it's coming, right, and we are seeing with certain types of clients and certain types of projects that the client is really pushing hard on understanding how you're using AI and how their data is being used. It comes all the way back to the data privacy and protection we talked about, I think it was just the previous episode. So these things are getting more and more important as we go along.

Speaker 2:

All right. Hey, you know what?

Speaker 1:

Lost power again. I noticed I was talking and it's like, wait, he's gone. I was like, what?

Speaker 2:

So apparently here in Atlanta we have some thunderstorms rolling through today. So good luck to the production team figuring out the edit on this one.

Speaker 1:

Yeah, that's all right. They're professionals, they've got it.

Speaker 2:

They are. Yeah, maybe I need to figure out a better battery backup setup, or is it just springtime in Atlanta?

Speaker 1:

We'll see. Yeah, springtime here too. We've had a break the last couple of days in the Midwest; we had some really strong stuff almost a week ago now, and it'll certainly be back, it's tornado season. But you know, I think we covered what we needed to cover. I think we gave great pros and cons and great takeaways, and gave three steps to AI training and adoption. So, for those of you out there listening, thank you for joining us again.

Speaker 1:

My name is Jeff Echols. I head up our mastermind and incubator programs, and I also host all of our podcasts and live and virtual events. I'm joined today by Frank Lazaro, who is one of the heads of our advisory team and our in-house AI expert. Of all the podcasts that I get to host and record, this one is one of my favorites, because I love the fact that we're breaking these tools down, we're staying ahead of the curve, and we're making it actionable and very practical. And you know, like Frank said early in the episode, we hear feedback from people all the time about how much they get to take away and implement from what we talk about. So, Frank, thanks for doing this with me again today.

Speaker 1:

And we'll be back again next week.

Speaker 2:

Yeah, looking forward to it.

Speaker 1:

Yeah. All right, thanks everybody. Thanks for listening, and thanks for making this the number one AI in AEC podcast. We'll see you again next week.