
Transcript

Episode 226: Getting Consulting Clients from Design to Results—with Katie Butler

Deb Zahn: Hi. I want to welcome you to this week's episode of The Craft of Consulting. So, on this podcast, we're going to talk about how to get results for your clients, and in particular, starting from how you help them design what they want, figure out the strategies that are going to get them what they want, and how you help them figure out when it's a good time to pivot and adapt to ensure that they ultimately get the results they want. And then ultimately, how to evaluate those results so that they know that they truly got what they said they wanted to get out of it.


And I brought on someone who does this in an area that is so very important, and that's basically saving the planet and trying to do it faster. Katie Butler is a consultant in the environmental world, and she is going to talk about how she does these things with her clients. And in particular, how she handles it when clients or funders just like shiny new things, and how she helps them understand that it is very useful to go look at what's already working and apply it to the work you're doing. So, so much is relevant no matter what type of consulting you're doing. Let's get started.


Hi, I want to welcome Katie Butler to the show today. Katie, I'm so excited to have you on. Welcome.


Katie Butler: Thank you. I'm so excited to be here. Thanks for inviting me.


Deb Zahn: So, let's start off. Tell my listeners what you do.


Katie Butler: Yeah, I am a program evaluator and an environmental scientist. I help environmental leaders get better results from their programs. So, we work with the private sector, the public sector, and the nonprofit sector, basically anyone who's trying to save the planet in some way, we're interested in helping them do that a little bit faster.


Deb Zahn: Because we need faster.


Katie Butler: The clock is ticking.


Deb Zahn: The clock truly is ticking. And I know you use the term, and it's part of the name of your company, GeoLiteracy, which I just love. What does that term mean?


Katie Butler: So, geo-literacy is a term that was coined a couple of decades ago by some geographers, and I do have a background in geography, so I'm partial to that. If you take the word and break it apart, the geo part is the earth, and the literacy part is about understanding, about speaking the language of something.


So, geo-literacy is about trying to understand how the earth works and trying to understand our place in it as humans, how the actions that we take or don't take can have an impact on the planet. And that's a really important concept to me, I think, because that understanding really helps us do a better job. So, that's why I wanted to adopt that word as part of the name of my company.


Deb Zahn: And I love it also because it makes people lean in, right? Because even though it was coined a good long time ago, not everybody knows what it is or what it means. And so they know National Geographic and they know literacy. And it makes them, I would imagine, curious about, "OK, hey, what is this all about?" Which from a consulting business perspective is always a wise thing to do, is to make people want to lean in and know more. So, cheers to that.


Deb Zahn: Well, let's dive into some of the work that you do, and we're going to talk about things that are relevant to all consultants. But the first is actually designing the program. So, obviously, there's a lot of things that need to happen quickly in order to save our planet. How do you go about working with clients to actually say, "OK, what is it we're going to do and how are we going to use data to help us figure out what to do?"


Katie Butler: Yeah, I love this question. I like to start out by saying that I am tool agnostic. So, there are a number of different ways that we can get into designing programs. And what I'm most interested in is figuring out the best way for that particular organization, for that particular problem. Every problem is complex. It has got different people, different levers, different things that hold people back, different barriers. So, the way to start that process is really mapping out all of those things I just talked about.


So, one of the things we tend to do is we go into an organization, and we start talking with people about what problem they want to solve. What's the end goal? And then we start working back from that and we start thinking about, what are the different levers involved in solving that problem?


None of us can... No one organization... I worked for the federal government for years, and not even the federal government itself can solve these problems. It takes everyone.


So, once we start understanding all of those levers, if there's a policy lever, an incentive lever, maybe even a conservation or ecosystem restoration lever, what are all of those factors and which ones does this organization have control over?


And once we start understanding that, that sphere of control, then we can start looking for tactics that have worked well in the past, for example, and on other problems or in other organizations, or tactics that are likely to have a good chance of success. Tactics where we can at least get a foot in the door, figure out whether it's working, and then start iterating to improve that tactic along the way.


So, we start out with that kind of mapping process. And in my world, frequently people refer to those as logic models. Logic models are the much loved and much maligned basic linear tool for mapping out how a program is going to work.


So, frequently, we'll end up working with folks on that to start out. There are other ways to do that. Like I mentioned, I have a background in geography, so I frequently call it mapping. But that program mapping process can be done in a logic model. It can be done using a theory of change kind of process. One of my favorite ways to do it is using a systems modeling approach. Again, you're mapping out those barriers, so you really understand, if we put a bunch of resources in this one area, we're likely to catalyze our ability to get results moving forward.


Deb Zahn: And I want to pause because you're saying a whole bunch of Jedi things. I just want to point some of those things out. So, the first in terms of, I know that a lot of consultants, and I've certainly experienced this, when they go into a client, they're like, "We want to do this thing and we want this project." And it isn't necessarily a, "We want this result. Now how do we get that result?"


So, one of the benefits of some of the mapping you do, or creating a logic model, or whatever version of that, is it forces you to think through what's actually going to get you the result. And I think it's important for other consultants to hear that clients don't always think about that. Sometimes they think about the thing that they want to do, and then they're surprised at the end of it where they're like, "Well, what did that do for us?" And that shouldn't be an afterthought. That should be at the beginning of the process.


Deb Zahn: So, I love that. I just had to point that out when I heard it. And I love that you also use a toolbox approach, which is what I call it, which is what's the right tool for the thing that we're actually trying to make happen here? And not here's what we do, and it's a singular package, and we know it works everywhere.


Katie Butler: That's right. And that ability to map out the logic and identify the best tools... I think usually when you go in and someone says, "OK, here's our program, here's how we're going to do it," like you mentioned, once you start that process and you start showing how the dominoes are intended to fall, it becomes pretty clear, "I don't think that domino is actually going to hit this one. Maybe we need to make an adjustment here."


And again, back to the logic of the logic model, once you can understand that logical process, it helps people to make those pivots or adjustments to think about new strategies if that's the right thing.


Deb Zahn: Love it, love it, love it. Now, I also know, and I'm sure you experience that clients or the folks that are funding this tend to like shiny new things. So, they like new approaches, they like new projects. They're always looking for the cutting or the bleeding edge, whatever the heck they're calling it these days. And you come at it from a different direction. How do you approach it when you start to see the shiny object syndrome at play?


Katie Butler: Yeah. Like we said at the top, these are what we call wicked problems in the world. Climate change is a problem that is extraordinarily complex, with an extraordinary number of factors. Even the best climate scientists are still learning about how the climate works. This week's sea surface temperature data is one of those examples.


So, certainly understandable that foundations, and funders, and governments out there are looking for new solutions. We have to move fast. We don't know exactly how to move. What's the new exciting way to do this? So, it's certainly understandable. However, there's been a lot of new solutions over time. Everybody's been trying for a few decades now to come up with the best new mousetrap for this.


So, our approach is to come in and say, "Let's start out by looking at what has actually worked in the past." One of the great examples of this that people bring up, which is on a huge scale and an international scale, is the Montreal Protocol, which was about reducing the size of the hole in the ozone layer.


So, that's one example of international cooperation, mandates. There were all these tools that were in place that made that example a success, a relative success. Reducing acid rain is another one of those. So, there are certain examples out there.


Climate resilience, building climate resilience in communities. There are so many communities who've been trying to do this work, and some of them have had excellent results. So, being able to find those examples, and bring them in and say, "Hey, yeah, I'm sure that we've got some excellent new ideas in here, but let's think about whether it makes sense. It might not, but whether it makes sense to apply this proven strategy here." Not using the extra brain power that it takes to reinvent that, and instead moving forward with more speed.


Deb Zahn: Because we need it. I'm going to keep repeating that because we need it. So, now I know, especially with funders... And I worked at a foundation, so I understand how things work, and I understand the dynamics between leadership, and staff, and the board. And sometimes, they're really resistant to something that's not new and shiny because they want to say, "We did this. This is ours." And for any funders listening, if you're doing that, please stop it.


Deb Zahn: But how do you work with that resistance to help them get past what they're used to, which is always wanting something new, and instead saying, "Hey, why don't we take a look at places where this is actually working and see what we can take from them?" How do you get them past any resistance that they might have?


Katie Butler: I think that's going to be a question for a board, a conversation that has to happen with a board. A lot of times, it is the funding requirements, what's in the request for proposals and how it's worded. My opinion is that a strong demonstration of results should be able to get around that. For example, an organization that I'm working with now has an extraordinarily strong ability to demonstrate results.


And the way that that looks is they have a theory of change that comes in from this side. They have this black box in the middle where they try a bunch of different things, iterate a bunch, refine, and then out of the other side come those results. So, there's still room for that innovation and iteration in what I just described, and it's not invisible.


Deb Zahn: So, an approach where you're able to say, "Look, you know generally what our theory of change is, you know how we believe we're going to create change in the world. We're going to be innovating, we're going to be iterating within that theory of change. We're going to leave some room in there. But what you can know or be certain of is that we will get that result coming out the other side of that box."


Katie Butler: But yeah, I agree with you. That's something that boards have to discuss. Do we want to win? Do we want to innovate? You know?


Deb Zahn: Yeah. But I love how you do it because you're basically saying, "Don't worry, I'm giving you your dopamine hits," which is what you get through the new, and shiny, and the innovation. There are probably other neurotransmitters at play. But you don't necessarily say, "Forget innovation, let's just take this, and we're going to replicate it here, and you're going to get the exact same results," which is often not true because everything's multifactorial. But you're saying you get to play, and here's the context in which we're going to play. I love that approach, and I can imagine smart boards would actually be able to understand that.


Katie Butler: And I think this is a really important role that communications folks play in an organization: figuring out how to do that storytelling. If you are the program manager, if you're the person on the ground doing the work, generally you care about evidence-based success. You care about efficiency. You want to optimize your work so you're getting the most out of your limited resources. You're really thinking on that kind of pragmatic level. But when you're up there talking with the leadership, talking with boards, talking with foundations, it becomes about storytelling.


And I think that's the power that the other pieces of the program evaluation picture can fill in, which is, we're not just talking about data. We're mostly talking about data, but we're also talking about how to tell a story using those data that speaks to people. Speaks to the public, but also speaks to leaders, who want their money to go somewhere interesting and beneficial as well.


Deb Zahn: And they want to be able to go to cocktail parties and tell a story about what they're doing. So, we used to take the data points and turn them into a people point, because it's facts and feelings. And so it would be the story of this person, which would illustrate the data, illustrate what we're trying to do. But they understood it's a mom with two kids in this circumstance, and here's what she's facing. And therefore, this is why this approach makes sense. So, I love that. And it's about knowing your audience so that you're responding to them. Wonderful.


Deb Zahn: So, you map all this fabulous stuff out. You get the people who want the new and shiny to understand why evidence and looking for proven strategies is helpful. And then you've got to start to get down to the nitty-gritty of, "OK, what's the strategy? What are the tactics we're going to use to get the result?" How do you do that? What does that look like when you're working with a client?


Katie Butler: That brings us into a normal strategic or annual planning type of system, where you start really thinking about... You have to measure what you're doing. We talk about SMART goals a lot. People still talk about SMART goals. So, you need to make sure that you're setting goals that are reasonable, that are achievable, that are measurable, that are time limited, and all those things.


And then you really start mapping that out over time. We do a lot of dashboard stuff. So, we'll come up with a dashboard, even if it's just in Excel. I love Excel.


Deb Zahn: All consultants do. What consultant doesn't like Excel, right?


Katie Butler: There are other tools out there, but then people don't know how to operate them. So, I'm happy with Excel generally. So, you've got your dashboard and you start looking at, "OK, year one, here's what we want to achieve." And you start mapping that. It's like balancing your checkbook or whatever. It's like any other process in life. It's not hard work. It's an easy lift if you do it often, and you're just monitoring that.


Are the things that we expected to happen happening? If yes, awesome. Let's make sure we understand why. If no, why not? Do we need to allow for more time? Do we need to make a pivot?


So, that regular progress, that regular tracking enables you to make smart decisions about when to abandon a strategy, when to stick with a strategy, or when to pivot to another idea or another audience, or another tactic that you might have within that.


Deb Zahn: I'm going to harken back again to the funders and some of the clients on this, because you're also talking about a much more dynamic way of developing strategy and implementing and monitoring. Actually, my husband and I talk about this because it drives both of us crazy, because we both have been implementers. Which is, often you are measured by the funder, the client, against progress against the original work plan.


Whereas there is research evidence that thoughtful deviation from a work plan, and pivoting, and adapting actually gets you better results, because you've got to be in it, you've got to be paying attention. And so have you experienced that, where it's, "But we did this work plan, but now you're saying we've got to do something different"? And if you have, how do you handle that?


Katie Butler: Absolutely. I love that. Yeah, there's actually an organization I'm working with right now that put out a bunch of grants. And I believe they went out the summer that the pandemic began. So, in July 2020. And a lot of these were designed for education programs. So, they were supposed to be these in-person... The people who received these grants had all these beautiful plans for doing this work in person. And as we are learning so many lessons because of the pandemic, good ones and bad ones for sure, this is another example of where the funder... And this included some federal dollars. So, this is not a scenario where you would ordinarily think flexibility would be the name of the game. But in this case, the funder allowed that kind of room for change, room for improvement.


Katie Butler: They had conversations with the grantees. They weren't just out there buying drinks for everyone at the bar or something. They had to actually be using the money appropriately and spending it on the project.


But what we're doing now is an evaluation of the results of those grants, and the results are remarkable. Just that ability to flex, this is all to the credit of this organization. They made those decisions about being flexible with the grantees, but they got better results partly because of things like this, being able to do things online and attract bigger audiences. Their reach was much bigger than it would've been with those initial plans. So, that's a great example of that.


Flexibility. You need to know some things, though. I worked as an auditor for a while. You need to have some controls around things.


Deb Zahn: Yeah.


Katie Butler: But within that, having that flexibility to iterate, and make better decisions, and then describe it to people, say, "Here's why we're going to make this pivot. Do you agree? Great. Let's go for it."


Deb Zahn: Yeah. And for any funders listening to this, that's a great idea, to have that flexibility. I had a funder once for a big project. It was national, and there were all these different sites, and I ran one of them. I swear, I think the trustees used to watch Dateline and stuff, and then call her and say, "OK, now we want a youth component. Now we want this, now we want"... So, it was flexible in the bad way, where suddenly you're doing this and suddenly they want you to do that. And we basically said... We were working on something really important, which was immigrant access to healthcare, and we're going to stay the course. And we can satisfy the funder in the way that we need to, while not getting whiplash every single time they want us to deviate.


So, it's that balance between what you're saying, which is flexibility and adapting and pivoting when it makes sense to do it for the purpose of results. But also, staying the course when you know that there's a logic to this. This matches our theory of change. We're seeing some early results, etc. And often, consultants are the ones that have to facilitate those conversations.


Katie Butler: Yeah. And that brings us back to those classic pieces of strategic planning where you set up the vision, and the mission, and the values. And it's so valuable to be able to come back to those elements. A lot of times when people are going through strategic planning, they're just like, "All right, check, we got that one. Check, we got that one." But it's when decisions like the one you just described come up that those things become such important north stars. Just come back to that. Are we still on target to achieve this thing? It's great to have new goals and to expand, but it needs to be a decision that's made intentionally within the constraints that are available.


Deb Zahn: I love it. So, let's go to evaluation. So, I love that you go from the idea generation and designing, to doing the strategy, to the pivoting and adapting, and then the evaluating. So, you've got the soup to nuts of "are we doing stuff that's meaningful and works," right? So, how does evaluation work when you're working with folks? What does that often look like?


Katie Butler: Yeah, so evaluation... I have a wonderful group of folks, evaluators who I talk with on a regular basis. We are all evaluators, in the government and outside, and we frequently talk about how people come to evaluation accidentally. And because of what we were just discussing, they have this bias towards results-based programming. And so then, we learn that there's this thing out there called program evaluation.


So, program evaluation, or the evaluation concept, is really about collecting data from multiple sources and then trying to synthesize it to figure out what the story is about program success. For a long time, I worked for the government, as I mentioned, and what I did there was try to figure out which programs at the EPA were working well.


So, in that case, there's a lot of data. There's a lot of quantitative data that's coming in that the federal government can use to determine whether those programs are working.


And what we would do there is we would go out and we would collect qualitative information. So, we would use surveys, we would use interviews, focus groups, those kinds of things, to get that information and pull it together with the quantitative data.


Katie Butler: And that's a great example to me of the power of program evaluation. When you're merging quantitative data with qualitative, you kind of ground truth what you're learning. You get the ability to hear from people, "Yup, that's right." Or, "Hey, that doesn't sound exactly right. Maybe you should look a little more into those data."


Katie Butler: I'm really big into data mining and geographic analysis. So, what we try to do is first go out and seek out data sources, free data sources out there. And again, there's tons of them. I would love to talk to anyone who wants to know about free data. I will help you find it. There are tons of free data sources out there, especially in the environment sector.


So, we try to pull in those quantitative data sources. Of course, understand them, how were they collected, when were they collected, understand their reliability. And then, what we do is we go out and talk to people. So, we collect those qualitative pieces of information. "How did you feel about this? Did your behavior change?"


As you know from the health sector, a lot of what we have to do is change behavior to make those changes in the world, and in the environment sector even more so. I mean, we've been talking about saving the planet, and there's this great Mae Jemison quote that I love, that I'm not going to get exactly right, but it's about how it's not really about saving the planet. The planet's going to be fine. It's about our ability to stick around as a species.


Deb Zahn: So, not let the cockroaches take over, basically?


Katie Butler: Exactly. Yeah. Yeah. So, that is the main point of that qualitative piece is, are we having that impact on behavior change that we hoped to have? Are we able to see that that behavior change is leading to a change in the world? It's only through that behavior change that we're going to get to the change in the real world.


Deb Zahn: Love it, love it, love it. And I really like how you combine the quantitative and qualitative, because I usually see quantitative treated as the only real data. And it's not, because quantitative alone is not going to tell you if it's right or not. It's not going to tell you the nuance behind why. And don't we want to know why? Because if we're going to replicate this somewhere else or borrow from it, then what are the key elements that actually made it happen? So, that's wonderful.


So, if you had another consultant standing in front of you, regardless of what type of work they do, and you're thinking about everything that you've learned from the initial idea all the way through finding out if something actually works or not, what advice would you give them when they're working with their client to make sure that results are actually front and center? What would you tell them?


Katie Butler: That's a great question. I'm going to return to understanding how the program is designed to work and then being able to compare that to the information that's been collected.

So, whether you use a logic model, or a theory of change, or one of those tools, or just map it out on a big whiteboard, really getting on the same page with the client on, "OK, how does this work?"


One of my favorite questions when I'm going in to talk to someone is, "OK, tell me how you do this. Just start from the beginning. When you come into the office, how do you do this process?" Really understanding how people go through the process, how they intend for that to succeed.


And then, of course there's great data questions. My background is in the physical sciences, so I think about it as a research type data reliability question. There are really valuable data questions we can ask about those performance data that people are collecting to understand whether they're relevant, whether they're timely, whether they're valid, whether they're repeatable, and bring those two things together.


So, I think that's starting with the front part and then the back part, and then just ground-truthing it. I think there's a lot of value to just understanding that logic. Does this make sense? Are we getting the information that we think we're getting here?


I mean, another opportunity to explore that would be what they call in the government red teaming, where you take those data, and you start asking what the problems are. You start thinking about it from the other side. Not to poke holes in the organization or the strategy, but just to really completely understand the potential risks and limitations. And then once you understand those, you can come back and have that conversation about, "All right, great, looks like we're doing a great job. Maybe we should get some triangulation data here or there to validate that what we're seeing is really what we're seeing." So, those are a couple of thoughts that I have on that question.


Deb Zahn: Those are great. First, ground truth, I've never heard that term before. And I'm now loving it so much, that I'm going to have to steal it from you. But not to go too deep here, but what this reminds me of and why curiosity and things like that are so important to being a good consultant is the Temple of Apollo in Delphi had three maxims on three doors. And you've probably heard the main one, which is, "Know thyself." The other one was, "Everything in moderation." And the third one is my favorite one, which is, "Certainty brings ruin."


And what I see happen a lot with clients, or funders, or consultants is they lock into "this is reality; this is what we know." And what you're saying, again, I just love to point out the Jedi things that you're saying, is don't be so certain. Poke at it, dig, ask yourself the hard questions. Collect other information to tell you whether or not something is true.


And I think that that is an extremely powerful thing that consultants can bring to the mix that organizations often don't, because groupthink is too compelling. It's too easy. It's the default setting in a lot of organizations, and so you can be the one that mixes it up.


Katie Butler: Yeah, no, I mean, groupthink is powerful. And understanding the reasons that it's powerful is great. People have a sense of purpose in their organization, and they have a sense of purpose for what they do. It's not that groupthink is powerful for only bad reasons. But that is the power, I think, that we have as consultants, or that program evaluators in general have as outside parties coming in to take a look: we can come at it with a beginner's mind, which is another way to say what you were saying about that third pillar. To come in and just kind of say, "OK, if I had never seen this room before, what would I think about it? What would be the questions that I would ask, the beginner's questions that I would ask? And can we use those to help us improve what we've already got going?" Even if it's going great.


Deb Zahn: Yeah. And I love that it's not always for a bad reason, so we don't go in with judgment. I know with groupthink, what it also does is reduce cognitive load. Unless you're a well-seasoned overthinker like myself, and you're used to overthinking everything. Hello! Overthinkers can make really good consultants.


But it's too much to question everything all the time, every day. So, having some defaults, actually, there's a utility to it. That's why there's utility to bringing a consultant in who can say, "I don't have to do that, and I'm not burdened by it, and it doesn't need to soothe me and make my day easier. So, let me bring something new to the mix."


Katie Butler: Yeah. And even more powerful if that person comes in and says, "Yeah, gosh, y'all are right. This is going great." Then you have this valuable independent objective validation of your processes and your strategies.


Deb Zahn: Dig it. Oh my goodness. This is yummy stuff. I could talk about this all day long. So, you've got some new stuff coming up. Tell me what you've got cooking in your business recently.


Katie Butler: Yeah, I do have a new thing that I'm going to be offering for free for folks out there. One of the things I've noticed as I've been consulting is frequently, we'll come in to do a program evaluation. We'll come in to do some kind of strategy work. And what we find is that we need to step back first and start working on that program map. So, whether that's a logic model, or a theory of change, or a systems diagram, or some other thing that's unnamed that just makes sense to people in the organization, we frequently find that we need to take a step back and do that work. A lot of folks, especially people in the nonprofit sector, may have to do logic models specifically just to apply for grants.


So, this is a pretty common tool. It's something that people frequently need help with. And so what we decided was let's just put some help out there on logic models. Let's just put something out there that can help people to figure out, "Do I need to hire someone to help me with this?" Or, "This looks pretty simple, maybe I can just do it myself." And I'm here to say, you can do it yourself. It's OK. You can figure it out.


So, what we're putting together is a little quiz that's going to ask you some questions about your understanding about the logic model if you have one, where you are in the process, how good is it, can you do some small things to improve it? Do you want to scrap it and start over again? So, these will be some free tools that'll help people to do that. And that'll be on geoliteracy.com for people to access.


We already have some other free tools up there related to logic models, related to assessing your performance metrics. We have one that's called "What to Do When the Evaluators Come." Sometimes it can be a little bit scary when those outside evaluators show up to poke into your business. So, we do already have some other free tools out there, but I'm excited about pulling together this... Who doesn't love a quiz, right? I'm excited about pulling together this-


Deb Zahn: I love a quiz.


Katie Butler: Yeah.


Deb Zahn: Yeah. And having lived through logic model processes where people are like, "Wait a minute, wait, wait, wait. What's an input and an output?" So, I have been through the torturous version of them where we end up going in circles trying to define terms because, in essence, we don't really understand what it is and how it operates. So, I can tell you I understand the value of that tool. Because anything that shortcuts that torturous process is a good thing.


Katie Butler: Yeah, I love that. Yeah, we do end up talking about semantics a lot when we're doing those exercises. But the point is to, like you said, get past those semantics. I don't care what you call it. As long as you can get through it and map out that logic, then you're going to be in a good spot.


Deb Zahn: Love it. Love it. And where can folks find you if they want to make the world a better place? Save the planet faster.


Katie Butler: Save the planet faster. So, you can find GeoLiteracy at geoliteracy.com. You're welcome to connect with me on LinkedIn. The GeoLiteracy Project has a LinkedIn presence, as do I. And I love connecting with people. I love just having a quick chat about what your program is, what your problem is, and just kind of helping people think through what their next step might be, whether it's with me, or on their own, or any other way.


Deb Zahn: Wonderful. And so when you're not saving the planet faster, how is it that you bring balance to your life, however it is you think about that?


Katie Butler: Yeah, it's a challenge when you're a consultant. I describe to people how it feels kind of like being in grad school where every spare moment you're like, "Maybe I should be studying right now."


Deb Zahn: Oh yeah.


Katie Butler: But I'm getting over it. And one of the ways that I get over it is that I'm a hobby musician. So, I play in a few different bands in Atlanta, and I had band practice last night. And there is nothing like closing the computer for the day, going to band practice, and just singing my head off and playing loud music. So, that's one of my-

Deb Zahn: How cool is that?


Katie Butler: Yeah.


Deb Zahn: So, are there YouTube videos? That's my other question.

Katie Butler: I regret to inform you that there are plenty of YouTube videos.


Deb Zahn: I know what I'm doing after this interview. That's wonderful. And what a great way to just have your brain do something else.


Katie Butler: Yes. Yeah. It's kind of like meditation. You can't worry about stuff while you're trying to remember lyrics and trying to play chords. It's too much for one brain. So, yeah, it helps.


Deb Zahn: Love it. Yeah, it's the same way in my garden. I'm like, "I'm figuring this weed out," and figuring out its adaptive strategies that are driving me crazy. I don't have time to think about other stuff.


Katie Butler: That's right.


Deb Zahn: Very cool. Well, Katie, I've known you for a while, and this is the most in-depth that I've heard about the fabulous things that you do. And so I just got to applaud you, and I appreciate the work that you're doing truly to save the planet, because goodness knows we need it, or to save the living beings on the planet, because we really need it.


Katie Butler: Well, Deb, thank you. And right back at you. I have learned so much from you, from your podcast, and your membership group. It's been invaluable to me as I've learned how to set up this business in a way that's ethical and aligns with my values. And I applaud you for doing what you're doing.


Deb Zahn: Thank you.


Thanks so much for listening to this episode of the Craft of Consulting podcast. I want to ask you to do actually three things. If you enjoyed this episode or you've enjoyed any of my other ones, hit subscribe. I've got a lot of other great guests coming up and a lot of other great content, and I don't want you to miss anything.


But the other two things that I'm going to ask you to do are, one, if you have any comments, so if you have any suggestions or any kind of feedback that will help make this podcast more helpful to more listeners, please include those. And then the last thing is, again, if you've gotten something out of this, share it. Share it with somebody you know who's a consultant or thinking about being a consultant, and make sure that they also have access to all this great content and all the other great content that's going to be coming up.


So, as always, you can go and get more wonderful information and tools at Craftofconsulting.com. Thanks so much. I will talk to you on the next episode.