
Episode 8: Design Thinking: Empowering Innovation Through Human-Centered Approaches with David Phillips of Faster Glass



About the Episode:

In this podcast episode, David Phillips, Founder of Faster Glass, discusses the importance of human-centered design and its application in solving complex problems. Phillips emphasizes the need to put people at the center of the work and work backwards from there, understanding their needs and perspectives. The episode explores how design thinking can drive innovation and create meaningful experiences for customers, employees, and other stakeholders.


About the Guest: David Phillips

Equal parts educator, facilitator, and instigator, David has devoted his professional life to helping people make things better. He founded Faster Glass in 2010 as a way to combine his passion for sharing the principles of innovation and design, his impatience with blindly accepting the status quo, and his love of learning from others.

David spent six years at Bank of America where he developed and led programs focused on integrating the principles of innovation into a Six Sigma culture. He also served as a subject matter expert “on loan” to China Construction Bank as they built their new product innovation lab. Prior to his time in financial services, David served in a wide range of roles in public accounting, public education, and the U.S. Air Force.

David draws on his diversity of experience to approach situations from different perspectives and to enable others to do the same. Based in Charlotte, David also enjoys being a husband, father, photographer, musician, and mountain biker, even as he is often reminded of his amateur status in all five endeavors.

David earned a B.S. in Education (cum laude) from Louisiana Tech University.

Little Known Fact:
David once danced with football legend Johnny Unitas. True story!


About the Host: Kevin Carney

Kevin Carney is a Managing Partner at Kingsmen Software. As Client Partner, Kevin assists clients in their transition from Sales to Delivery and then maintains a relationship to ensure successful completion. A Finance major by training, Kevin bridges the gap between business and technology, especially for Kingsmen’s banking and capital markets clients. Kevin has 30 years of experience in consulting to financial services institutions. 


About Kingsmen Software:

We are dedicated, experienced practitioners of software development. We grow software iteratively and adapt quickly to changing business objectives, so we deliver the right software at the right time. Our proven approach combines processes, tooling, automation, and frameworks to enable scalability, efficiency, and business agility. Meanwhile, our advisory & coaching services enable technology leaders, business partners and their teams to learn and adapt their way to sustainable, collaborative, and data-driven operating models.


Production Credits:

Produced in partnership with Mistry Projects: https://mistryprojects.com/

David Phillips 00:08

If we're not careful, we get locked into solving the problem that's presented to us. And we go a long way toward solving that problem, and then realize, oh crap, not the right problem. One of the fundamental principles of human centered design is, wherever possible, design with, not for. Involve people, and the key here is not just with people, but with people from diverse perspectives and mindsets and backgrounds and experiences.


Kevin Carney 00:39

Welcome to Kingsmen Software's Beyond the Build Podcast. At Kingsmen we pride ourselves on building enterprise quality software, and we have the privilege of meeting some pretty interesting people along the way. Come join us to meet the visionaries, the disruptors, the entrepreneurs, and the innovators who envision new technology solutions that we help bring to life. Come join us to hear what happens Beyond the Build. Today from the Kingsmen podcast studio, we'll be talking about design thinking. Too often we get caught up in the trap of knowing what needs to be done and executing on it without pausing to ask if we are building the right thing. It's the proverbial building of a great bridge over the wrong river. Our guest today is David Phillips of Faster Glass. Faster Glass helps people see, think, and work differently in order to spark positive disruption. As David will explain to us, design thinking is a mindset and a toolset that embraces a set of principles, such as empathy for stakeholders and a tolerance for failure, to help guide discovery and the collaborative crafting of innovative solutions. David, welcome, and thanks for being here.


David Phillips 01:45

Thanks, Kevin. Happy to be here.


Kevin Carney 01:47

So I'm Kevin Carney, one of the managing partners here at Kingsmen Software. We've changed things up a little bit, and I'll be hosting today's podcast. My style is a little more informal, so hopefully we'll have a little fun tonight. And our sound engineer is the one and only Bill Clerici. You may know Bill as our intrepid CEO here at Kingsmen Software, but you may not know that Bill is a past president of the Lackawanna Middle School AV Club. So we should be in good hands today. Say hi, Bill.


Bill Clerici 02:16

Hello, Bill.


Kevin Carney 02:18

All right, let's get into it. So David, you're the CEO and Founder of Faster Glass, and we recently met at Charlotte Innovation Week. I learned you're very engaging to speak with, and you're a bit of a celebrity here in Charlotte. A lot of people have said, oh, you're talking about David? I love all his sessions. You also hosted an event here at Kingsmen Software at our office called Data Delusions, and there was a screening of Moneyball that tied into your data analysis theme. That's a great movie. So I could sit here and try to explain what Faster Glass does and how you use design thinking to help your clients, but I'm pretty sure you'd do a better job of it than I would. So I'll hand you the mic. And don't forget to explain why you're a self-proclaimed innovation provocateur, because that's a great title.


David Phillips 03:11

Okay, so let me work backwards. One of the cool things about running your own gig is you can call yourself whatever you want, so innovation provocateur seemed to fit. And you may know this, your audience may know this: provocateur is actually French for knucklehead. So that seems to fit me perfectly.


Kevin Carney 03:30

Yeah, we have a couple of names for Bill that include knucklehead.


Bill Clerici 03:34

Yeah, but you can't say 'em.


Kevin Carney 03:35

Yeah, right.


David Phillips 03:37

So Faster Glass, we are an innovation consulting and training firm. What we do is we help organizations apply the principles of human centered design to solve big problems, to co-create solutions to those big problems. Our training work is to help organizations build internal innovation capabilities for their own people. We also help organizations, again using those principles and practices, intentionally design experiences for key stakeholders, whether those are customers, employees, volunteers, donors, whomever.


Kevin Carney 04:08

And where did the name Faster Glass come from?


David Phillips 04:11

The short version of the story is that it's actually a nod to the world of photography. About 10 months before I started the company in 2010, I dove into the world of photography, learning all these new terms, and came across this term, fast glass. Professional photographers call their really expensive high-end lenses fast glass, because the quality of the glass affects the quality of the resulting image. One of the ways I describe what we do is we help people see differently; we help them frame their challenges and opportunities differently. So when I was trying to come up with a name for the company, that term just sort of bubbled back up to the top of my brain, like, let's try that.


Kevin Carney 04:48

I like that. Yeah, taking in lots of information, gathering lots of light, and focusing it on the problem at hand.



Kevin Carney 04:55

Nice. Okay. So you mentioned human centered design. Explain that, and what your typical client and a typical engagement look like, that kind of thing.


David Phillips 05:03

Sure. So this is one of those terms where there are a lot of different definitions; we define it pretty broadly. Think of human centered design as a discipline with lots of tools and techniques and principles and practices, but at the core of those is putting humans at the center, at the focus, of whatever it is you're trying to do. So whether you're solving a problem, whether you're designing an experience, whatever it may be, put people at the center of your work, and then work backwards from there. What do people actually want or need, whether those people are your customers, your employees, volunteers, donors, again, whomever, it doesn't really matter. We use a six step model. There are about as many different models or frameworks for design thinking or human centered design as there are consultants and academics. Some are eight steps, some are three, some are four. To me, it's all the same stuff; it's just about how you slice and dice it. But it starts with discovery: really deeply understanding what people want or need. Then framing and reframing, to make sure you're solving the right problem. Ideation is about generating ideas in a structured way. Prototyping is about making those ideas visible and visual, and then testing: let's find out what works, let's find out what doesn't work, as quickly as we can. And then eventually getting to launch. At some point you have to ship, you've got to get stuff out the door, and how you do that matters. Design thinking is basically a way to walk through that process. And it may sound linear, but it's anything but linear. It's very much iterative. We go back and forth as we learn new things, as our hypotheses are busted, as assumptions are busted. We're like, oh, that's not what we thought; now we know more, so we can go in a different direction.
In terms of our engagements, we are really hyper, hyper narrowly focused on large companies, and midsize companies, and small companies. Oh, and nonprofits. Hyper focused on all of them, as well as government and academia. So maybe hyper focused is not the right word.


Kevin Carney 07:10

So you do more than just software, too.


David Phillips 07:14

We are not focused on any particular industry. We are industry agnostic, we are problem agnostic. For us, it's about helping people and organizations do what they do better by taking a human centered approach to what they do.


Kevin Carney 07:30

So can you give an example, just put some meat on that skeleton: a client who has a problem, and they're struggling to get to a good solution, and they call you up and say, help us think through this problem, help us decompose it, help us think through some out-of-the-box ideas. Just give a scenario or example from your work that can bring this to light.


David Phillips 08:02

I'll do you one better, I'll give you two. I'll try to do it quickly. It's a BOGO. So one was Speedway Motorsports, right? They own eight of the NASCAR tracks around the country. They had a challenge, this is six, seven years ago now. The business problem was that revenue was going down year over year over year. They had launched a program previously about how do we improve the fan experience, but they didn't see as much of a lift as they expected. So then they decided to take a slightly different approach, and that's where they engaged us to help them better understand the staff experience. And I didn't know this before this project, but a significant percentage of the staff working their big races are volunteers, and those volunteers don't go through Speedway Motorsports' training program. So the thinking was, we need to equip these volunteers better to do the job, whatever the work is they're going to do that weekend. They had hired a training company and said, hey, develop training for our volunteers, and the training company engaged us, saying, well, let's make sure we understand what it's like to be a volunteer at the track on these big weekends. Big weekends, you know, 100,000 people there for three or four days, right? And so what we did, instead of just saying, well, here's what we know, because we didn't know anything, is we went out to their eight tracks on the big weekend for each one of those tracks to observe. That's the discovery piece; it's called ethnographic research. You go and see for yourself what people are actually doing, where they do it, compared to, like, a focus group or survey. Let's go see, let's go observe, let's go learn what it's like, and then draw insights from that. And then share those insights with the client to say, well, here's what we saw. What do you think this means? And what do you think this means in terms of what you can do differently?


Kevin Carney 09:51

Do you find you have an advantage in that you have less information, because you're less predisposed to certain answers?


David Phillips 09:57

100%. And that can diminish over time, because once we know too much, it's hard to be that neutral observer, right? It's hard to keep that beginner's mind approach.


Kevin Carney 10:09

Yeah. We see a lot of our clients in, you know, big office towers; we have a lot of big organizations that we work with. And some folks have never spoken with a consumer or user of their software or whatever they're building. I can't help but think that getting out of that office tower, literally an ivory tower, maybe not literally, but a big brick tower, and talking with those customers and understanding how they operate and what they really want, is transformational. Because they think they know what their customers want, and they've probably learned that from the last person they trained under: that's what the customer wants. And so now there's an internal corporate mindset, this is what our customers want. And maybe it was born out of some customer focus groups that were done in the past and has lived on, but maybe that has changed by now. Or maybe that whisper down the lane changed what people interpret it to be, right?


David Phillips 11:15

So this is not to diminish the value and the power of focus groups or surveys. I mean, that's good information, that's a good way to collect information, but it will necessarily be incomplete. A lot of times in market research, or when you're doing customer discovery, we use this term voice of customer. Voice of customer is important, but I would argue that many times, maybe not all times, but many times, you need to go beyond voice of customer to understand reality of customer. So from VOC to ROC: go see what they're doing, where they do it, because they're the experts. Even if what they do is wrong, they're the experts at doing it that way. Go learn from them, and then take those insights and let that inform whatever it is you're building.


Kevin Carney 12:01

Okay, let's start at the beginning of the process, where we've got a problem statement. And I would assume, and correct me if I'm wrong, but I'm assuming a lot of people walk in and think, all right, we're gonna have a lot of kumbaya here today, a lot of touchy-feely, get-our-feelings-out stuff, right? I have to believe a lot of people would just think this is hokum, or artsy, getting in touch with their feelings. But you've got some science behind this, right? You have discrete workshops, and while it might look chaotic to people in the beginning, there's a rationale behind all this.


David Phillips 12:37

Absolutely, yeah. Once we get through that opening session with the tie-dyed robes and incense... You actually touched on it, right? You mentioned the scientific method, and there is a strong parallel between the scientific method and design thinking or human centered design. You form a hypothesis, you test that hypothesis, and based on the results of your test, you make adjustments to your hypothesis and to the path you're gonna take. This is no different than that. So starting off with, here's a problem, here's a problem we want to solve. Great. Before we go do something: what data do you have? What data do you not have? How old is that data? How did you collect it? What's missing? I think it's important to understand, or at least my point of view is, remember, data is not reality. Data represents reality, and it may very well be incomplete. It may be accurate, but it might be incomplete. There's an author named Scott Anthony who has a great line: if you don't go, you can't know. To your point of get out of the tower and go see. So this discovery is important. But it's not always just that sort of customer research. Sometimes it's secondary research: who else has solved a similar problem, maybe in a completely different environment, a completely different domain? But once we can do some discovery and really say, this is what reality is, this is what the needs are, this is the context today, then we come back to this notion of framing and reframing. Okay, we've learned some more. Is this still the right problem to solve, or is there a better problem to solve? Don't get too anchored to that initial framing. Because for those folks, including folks like myself, who consider themselves problem solvers, if we're not careful, we get locked into solving the problem that's presented to us. And we go a long way toward solving that problem, and then realize, oh, crap, not the right problem. Right, bridges and rivers again.

So then, great, this is the problem we're gonna solve. One of the fundamental principles of human centered design is, wherever possible, design with, not for. So design with your stakeholders and with your constituents, not for them. In that next phase of ideation, let's generate ideas: how might we solve this particular X? Involve people, and the key here is not just with people, but with people from diverse perspectives and mindsets and backgrounds and experiences, because when you do that, you benefit from what they see from their point of view, from their perspectives, from their lived experiences. So who you design with matters a lot. Who you ideate with matters a lot, right? If you get a bunch of accountants in the room, great, but it's almost like having a team of nine right fielders. They may be the best right fielders, but it's gonna be hard to, you know, play well. So that diversity of perspective is critical: generating ideas and deciding which ideas to explore further, which ideas are worth pursuing. Prototyping: if there's any magic in this process, I would say it's in the prototyping phase, at least how we define prototyping. It's simply making visual representations of your concept so that others can understand it and react to it. And it's not magical, but the fact is, most times, at least in my experience, we don't do that. We tend to communicate with words and words alone, and then we wonder why we miscommunicate. You know, why didn't they get it? What's wrong with those folks? And so prototyping is about, hey, I've got an idea? Great, make it visual. I don't care if you sketch it, you roleplay it, you build it with Play-Doh and pipe cleaners, it doesn't really matter how. Make a visual representation of it. For software, it might be clickable models, it might be just sketches. But show it, don't just tell it.

And then let people get a shared understanding. Like, oh, I thought you meant this. No, no, I mean that. Okay, and then we can poke holes in the same thing.


Kevin Carney 16:38

The roleplaying is an interesting idea. I can see what that would look like. I'm thinking of a few scenarios in my head where I'm a buyer and I want to make a payment to a supplier. I'd have two people up there, one being the buyer and one being the supplier, and the buyer saying, I want this, and the supplier responding to that. And it's a whole lot easier than drawing a bunch of stick figures and dollar signs on a whiteboard.


David Phillips 17:01

Yeah, absolutely. And you can even roleplay processes, right? The actors don't have to be human, necessarily. You know, hey, this is the input, and then that data is gonna be processed, there's gonna be some sort of output, and then something else happens. You walk through it. We did this recently with a client, and it was through that prototyping, that let's-walk-through-this: the new hire comes here, then what happens? And then what happens? And then what happens? Like, literally, this person was walking from station to station while we were sitting around a conference table. Okay, what happens here? No, no, over here. Then they go here, what happens now? And there was this realization that, oh yeah, we're using the same word to mean two different things. Thus the confusion, and thus the problem. Like, oh, okay, well, now you know. Great, let's move forward.


Kevin Carney 17:50

So when someone comes in and engages you, they probably think they already know the problem. They probably think they already know the solution. They're coming in with a default hypothesis, right? And then you ideate, you get some information, some data, you work with people, not for them. And then the data comes out that it's not what they thought. How hard is it for someone to give up that hypothesis they walked in with?


David Phillips 18:26

Yeah, it can range from not hard at all to incredibly difficult. Which is another place where that design with, not for, can come into play. I remember a project when I worked for a large financial services institution. They had just rolled out a new training program for tellers: this is how we're going to train tellers across the country, same process, which makes a lot of sense, right? Let's standardize that. And I forget the exact numbers, but I think the goal for speed to competency was that by day 20, that teller should be able to run their window all by themselves with no extra supervision, you know, fully productive.


Kevin Carney 19:09

Good key result.


David Phillips 19:10

Good key result, right? Measurable things, we like that. So they rolled out the training program and then just waited for the results to roll in. And there was a pretty wide disparity. In some branches it was day 15, and in other branches it was day 30. And the initial reaction from the executives on both the training side and the teller recruiting side was, well, clearly this training program sucks. We need to fire those people and hire better trainers and create a better training program.


Kevin Carney 19:43

It's definitely the trainers.


David Phillips 19:46

Our innovation team got pulled into this project kind of midstream, like, well, wait a minute. Do we know why the results in Tulsa are different than the results in Miami? No. I mean, we know they are different, we can see the data, but we don't know why. Well, let's go see if we can understand why. So instead of us doing the going, we recruited some of those executives, gave them literally like a 30-minute training on how to do observational research, and said, now let's go, so you can see for yourselves and not just have to rely on what we see with our untrained eyes. And so when we came back to do our sensemaking of this qualitative research, what did you see? What did you see? What did you see? I knew we had helped them get over the hump when I heard multiple executives start a sentence with, I had no idea that, dot dot dot. So back to your point of how easy it was for them to realize they needed to change what they were focused on: it was pretty easy, because they saw it for themselves. In this case, in Tulsa, the computer-based training that the new teller needed to do, they did in a back room, quiet, away from customers, where they could focus. In Miami, that new teller was sitting at a desk out in the banking center, and they were constantly getting interrupted by customers: hey, can you help me with this? Sure. And we know that's not how you learn well, so therefore they were learning more slowly. It wasn't rocket science. But the data the executives were receiving didn't show that context.


Kevin Carney 21:19

So you talked to me earlier about a TED talk on the scout mentality and the soldier mentality. I think this example relates to that, right? Where if you went out into the field and saw the different data, then you would come back and provide that information to the soldier. So explain that story.


David Phillips 21:38

Sure. So there's a brilliant TED talk by a woman named Julia Galef about what she calls scout mindset versus soldier mindset. Soldier mindset is her term for the cognitive bias known as motivated reasoning. And it's basically this: we as humans are wired so that when we come in contact with information that conflicts with something we already believe or know, the soldier mindset is to either attack that new information or defend our position. Find reasons why that information is wrong, or hype up why I'm right. Scout mindset: a scout in the military, their job is to go and see what is true, what is real. You know, if the map says over this hill there's a road into the next town, great. The scout walks to the top of the hill, sees it's not a road, it's a river, and it's flooded, and reports back: we need to change our plans. That's the right thing to do. You're probably not going to argue with the scout and call them an idiot and hire a new scout who will go out and tell you, nope, it's a road. So this notion of getting sort of anchored to a point of view, that this is the problem, can be very much that. And that's where we found, again, you can't know if you don't go. Get out and really understand what the context is. And it will help people go, oh, right. You know, just because you were wrong doesn't make you an idiot. It means there was information you didn't have. Now you have more information, and you can react, hopefully, differently.


Kevin Carney 23:14

I can certainly see how, if you went to the tellers in Tulsa and Miami and observed, you would be the scout coming back with information to the execs. And the execs could say, well, that's such a minor thing, and yes, people can learn, but it shouldn't take you twice as long to come up to speed and be competent as a teller just because people interrupt you. There must be something else going on. The trainers in Miami are really bad. So you could just attack what you saw. Or, yeah, exactly. Cool. So you mentioned the nine right fielders, and you don't want nine right fielders, you want a diversity of opinions, a diversity of inputs, perspectives. I can see that having a potential opposite effect. The phrase I always use is, a horse designed by committee is a camel. So, you know, if you try to placate everyone a little bit, the solution you come up with is just, you know, a Rube Goldberg of a solution. How do you reconcile all that information? Because at some point, someone has to be the adult in the room and say, I hear what you're saying, I hear what this particular stakeholder is saying and what they want, but I don't want to, either through money or through time, provide that to them at the expense of other people or other stakeholders. How do you resolve all that?


David Phillips 24:41

Yeah, you make a great point, and I love the horse-versus-camel analogy. Early in my journey of learning human centered design and design thinking, I heard this phrase that I absolutely love, and it's this: consensus is the sire of mediocrity. If you unpack that, it's really that consensus too early, or trying to placate everyone, can breed mediocrity. And that's not what this is about. This is not trying to make everybody happy. The diversity of input, of perspective, is to say: what are we not seeing? What are you seeing? What are you seeing? What are you seeing? What are the non-obvious risks, or non-obvious opportunities, that I, the executive, can't see because of where I am? Or because I'm in marketing, I don't see what they're doing in manufacturing, or in the lab. So you get these different perspectives together and say, all right, here's what we're trying to solve for. What do you think? What do you think? What do you think? And those perspectives actually inform the others as well. But eventually, you do have to get to somebody making the decision. So it sounds like we could do A, B, or C. Given our constraints, given our objectives, given the key results we want to achieve, it seems like C is the right approach. We're going with C. Now, the people who were advocates of A or B may not be happy with that. But because they were involved in meaningful ways, back to design with, not for: if you involve people in meaningful ways, they're more likely to support C, as opposed to A or B, because they understand why C was chosen.


Kevin Carney 26:18

So a lot of people walk into these sessions, and everyone talks about collaboration: we're all going to collaborate, we're all going to talk through the problem. And I understand that, right? You want to hear people's input, their perspective. But at some point, you need a leader there. I see a lot of meetings where we have everyone at the same level, and everyone trying to come to consensus, because that's what they think collaboration is. But there's got to be that one point in time where someone says, thank you for all the information, you guys really thought this through, this is the approach we're taking. And I don't want to say that someone's got to lose, but not everyone can get everything they want.


David Phillips 26:59

Right, yeah, absolutely. Here's the approach we're taking, and here's why. And again, they still may not agree with it, but they understand it wasn't just some sort of capricious decision, or that's what you wanted all along and this was a waste of our time, right? When you involve people in meaningful ways, and you are thoughtful about taking their input, and, back to your point earlier, there is a science to this, right? This is not just hokum. It's, okay, we have a process for how we're going to make decisions about whether we should do A, B, C, D, or E. Here are the trade-offs, and here's why C is a better option than A or B.


Kevin Carney 27:36

Gotcha. So you mentioned testing and iterating over a solution, right? You have a hypothesis, you've done some data gathering, everyone thinks this is the right approach. How do you test it? And what's the importance of testing, getting more information, and iterating back to improve the design?


David Phillips 27:58

So the unhelpful answer is: well, you've got to test the right things.


Kevin Carney 28:03

Oh, sure. Well, thanks.


David Phillips 28:05

You're welcome. Follow me for more tips. Maybe a slightly more helpful answer is: understand what parts of the solution need to be tested. We think this is right, but what's better than a good idea? A testable idea. You don't necessarily need to test the entire solution or the entire product. If it's "let's test the sign-on feature," we don't have to test the whole thing. We think if we do this, people will sign on? Okay, go test it. Or, to think of another example, say part of our solution is some interaction between our customer service people and the prospective customer. Okay, then test that. You don't have to test the whole thing. What are the pieces of this program, or product, or service, or process, or whatever it is, where we're not 100% sure this is the right approach? There are lots of different ways to test different things, depending on what you're trying to test, but figure out what it is you need to test and what's the best way to test that piece.


Kevin Carney 29:11

So it doesn't have to be a complete solution. I'm thinking of a scenario with one of our clients. Keeping it generic: they're loading data to the platform, and then they would do some analysis and reporting off of it. There's no value to the customer until you can load the data and do the reporting, right? Loading the data alone doesn't really get you anywhere. But that piece was done. And loading data is still an experience, right? Because your data is different than my data, and you want to load relatively unstructured data into something that's more structured, and you have to map some data and figure all that out. But it's still an experience, and our customer fought showing it to a client until there was value to that customer. Of course, that takes more time and you're going to have a longer path, and how you present the data depends on how you loaded it. So now a year goes by until you actually present anything to the customer, and the first time they look at it, they say, this data ingestion process is really kludgy. Right? So you don't need the end solution in order to go test it and try it out. Right? Absolutely. Dear customer: we are still in the initial stages of this. We want to try this piece out. We know it's not functional for you yet, but your input is going to help make a better solution for all of us. Yep.


David Phillips 30:29

And you get to better faster. That beats, "hey, we're gonna spend a year and we're gonna get it just right," and then- Absolutely.


Kevin Carney 30:38

What's that phrase? Don't let great be the enemy of good. Yeah, yeah. Let's get something out there and see how it works.


David Phillips 30:43

Your point is an understandable one, right? We've run into the same dynamic: some customers are very reluctant to show their boss or their customers something that isn't finished, this sort of partially baked thing. And that's understandable, right? You don't want to look like an idiot. Like, what is that? That's not a cake, that's flour. You have flour and eggs and oil; you don't have a cake. I don't know why you're showing me that. You're wasting my time.


Kevin Carney 31:15

You can show me the frosting, though. I'll try that.


David Phillips 31:17

Yeah, exactly. But it's helping people understand, so in this case, helping your internal client understand: we need to find out if this is going to work or not, and it's going to save us time in the long run. We did a project for a large appliance manufacturer, and the problem they were trying to solve was that they needed to accelerate their new product development process. They were best in class compared to their traditional competitors. But all of a sudden, to them seemingly out of nowhere, here come these other companies, who traditionally had been in electronics, now making refrigerators and stoves and dishwashers. And they're smart products, right? Boom, boom, boom, and they're good, and those companies are iterating faster, rolling out new product faster. So this company said, we've got to accelerate product development. How can we do that? They were a big company, so they had research, they had marketing, they had R&D, all these pieces. They had everything they needed; they just needed to do it faster. So we helped them using this design sprint model, right? You get a bunch of people in a room for a day or two and walk through that process. We used some of the discovery they had already done in terms of that research. But what was really new for them was that at the end of that day and a half or two days, whatever it was, we brought in a customer panel. And we showed them these prototypes. They were literally made out of shoeboxes and Play-Doh, right? And drawings and sketches and role-play stuff. We said, hey, what if the oven could do this? What if it had these features? What do you think? And of course, we told the customer panel up front: you're about to see some low-fidelity prototypes, these are ideas, and we'd love to get your input on which of these ideas resonate with you and which don't.
And that simply allowed that company to see what people liked. There were six concepts presented during the show-and-tell at the end of the design sprint, and it turned out two were big hits, two were maybe/maybe-not, and two were just dead on arrival. Great: drop those two, be done with it. Then you can focus on the two at the high end, and take kind of a portfolio approach, if you will, with the rest. Sure, that was only six or eight people, and maybe they weren't totally representative, but the company was getting input sooner, which allowed them to make decisions sooner about where to deploy resources and where not to.


Kevin Carney 33:48

So you had mentioned to me before that you have to look at what the customer wants, you have to look at what's possible, and you have to look at what you can afford, or what you're willing to invest in. So take that triumvirate and walk us through it.


David Phillips 34:03

Sure. And this really goes back to the very beginning, right? What is human-centered design? Think of three lenses through which to look at any particular problem or challenge or opportunity. Desirability: what do people actually want or need? Feasibility: what can we do, with our know-how and our resources, to address those wants or needs? And then viability, which is really financial viability, right? To your point, what makes sense financially? Can we do this in a way that's profitable, or at least sustainable? All three of those are really important. One example I tend to use: I think most of us have a remote control at home that's got like 83 buttons, written in six-point font with abbreviations that don't make a lot of sense. I don't know exactly how we got to that point, but I'm pretty sure it was an engineer somewhere going, oh wait, we've got another feature we can add. Great, make some more room on the remote for that.


Kevin Carney 35:02

And how many buttons do you actually use?


David Phillips 35:04

Six, yeah. All right. So it was feasible; they could do it. Did anybody really want it? No. Did it cost them something? Probably, in development time, but, you know, big deal. It's just not what people actually wanted. So all three of these lenses are important: what do people want or need, what can we do, and what can we do in a way that makes sense financially? But if you truly want human-centered products, or services, or programs, or processes, or policies, or whatever it is you're developing, start with desirability. Start by deeply understanding what people actually want or need, because sometimes they can't tell you. Sometimes they can't articulate it. So you've got to dig deeper.


Kevin Carney 35:41

Let's go down that path a little bit. Because with AI coming out, and we are pivoting Kingsmen to move into AI, jumping in with both feet, we are seeing that a lot of our customers, including us, don't know what's possible yet. I don't think anyone knows what's possible yet. We have a whole bunch of capabilities; we can do a whole lot of things with AI. But you end up having a hammer and looking for nails, right? The conventional wisdom is, I'll take AI and automate all my manual processes. Like, yeah, that's kind of obvious. We knew that in the past. But then you get into a conference room with a whiteboard, and you start brainstorming what you could do with some of these capabilities, and people start to learn what's possible, and all of a sudden that disruption kicks in: how we can change things. And then there's the third bucket: we don't know what anyone could do. We don't know what our customers would want, or how they would use it. We're not sure whether it's feasible for them, meaning: how would the customer build it, how would they support it, how would they maintain it, that kind of thing, plus security and privacy concerns and whatnot. But there's a huge industrial revolution coming, and the capabilities are changing every day. I have to believe that people need to get into your office and start rethinking their current paradigm, because it's all going to be different. Right? You mentioned that the customer might not know what they want, that they can't articulate what they want. As Henry Ford said, if you ask customers what they want, they'll say a faster horse. A lot of people are starting to build faster boats across the Atlantic, and a faster boat across the Atlantic won't get them very far, even though it sounds like a cool idea. You need to start building planes.
So it sounds like design thinking is more important than ever. There's an inflection point with AI where people have to rethink what the problem statement is and what the solutions are.


David Phillips 37:53

100% agree. And I have no idea what all these capabilities are. All I know is they seem to be growing exponentially.


Kevin Carney 38:03

Well, you can come to the Kingsmen office and learn those capabilities: www.kingsmen.ai


David Phillips 38:10

Perfect. But I think the human-centered approach applies here in a couple of different ways. One: we use the term failsafe differently than how it was initially developed, or at least initially applied, to things like nuclear arms. Right? Cannot fail. In this case, failsafe means: how do you create an environment where it's safe to fail? Sure, right? How do you create a sandbox that allows your people, and maybe your customers, to say: what if we did this? What if we did that? And if you blow it up, it's not a big deal; it's roped off. I think the other piece is working at this opportunity from both ends: what do people need, and what's possible? Because if people don't know what's possible, then they're going to stay in this "if only the horse were faster, or taller, or could run longer without drinking water" mindset. It's still a horse.


Kevin Carney 39:05

So there's your camel again.


David Phillips 39:06

Yeah, back to your camel. It's working out: what are the capabilities? And then, how might we use these capabilities either to solve problems worth solving, or to say, we don't need to do that anymore, because we can do this.


Kevin Carney 39:24

Right. Let me give some examples of that, because we shut our shop down for two weeks just to retrain everyone on AI. It's like retooling the factory, right? We asked our teams to go do something with AI: on Monday, come up with an idea; work on it Monday, Tuesday, Wednesday, Thursday; and then Thursday afternoon at the town hall, show us what you did. Show and tell. Our teams have analysts and engineers; that's our normal team. And the analysts had all these great ideas, and the engineers said, there's no way we're going to do that in four days. But the analysts pressed on, and sure enough, not only did they do it, they did more. So all of a sudden, these engineers had to change their paradigm of what is possible. And then the other example: I was talking with one of my uncles, who's an economics professor, and we were talking about AI. He was telling me how AI is wonderful because it allows his students to use AI to create R code, or Python, or whatever language they might be using, to then do their analysis on their economic statistical model. I said, okay, but imagine that you could just use AI to actually do the analysis. Skip going to code; just leapfrog that. And it just couldn't click in his mind. He's like, but we're going to use AI to write the R code better so we can do the analysis. No, just use the AI. No, but we need to do the analysis. I'm like, yeah, just skip the middle part. That mental shift is going to be hard in the next year or two.


David Phillips 41:05

Yes. So what comes to mind is a phrase, "drop your tools." There's a story behind it that I think I can tell briefly, but it's the notion that we have to really change our thinking, maybe in dramatic ways. The source of this drop-your-tools idea was a wildfire in Montana in the late '40s. Smokejumpers are dropped in, and smokejumpers have to take all their tools with them, right? They've got these heavy backpacks and shovels and saws, whatever they need. In this particular wildfire, the fire was on the side of a ravine. They dropped in, and they were going to walk down across the ravine and then start fighting the fire. But unfortunately, the fire jumped the ravine, and all of a sudden it's coming at them, and they've got to run, to try to make it to the top of the hill before the fire gets them. And I forget the exact numbers, but I want to say there were 15 firefighters dropped in, and 12 or 13 of them were consumed by that fire; two or three managed to escape. They started investigating what happened. And this happened again in the '90s, even two or three times: the fire changes, wildland firefighters try to escape, and some of them don't make it. In those situations, they found these wildland firefighters still holding their tools, still holding the saw. And the human aspect of this is: if you're a firefighter, your tools are part of who you are. This notion of "drop your tools so you can run faster" runs up against "but that's not who I am." Right? These are what I use to do what I do, to be who I am. And so in that moment, they just didn't think to drop their tools. Yeah.
And so it's now shorthand for rethinking, for being willing to drop some things that had been core to who I am. Like: but I'm a coder, I need to code.


Kevin Carney 43:11

We see that all day long with Excel, right? You get accountants or analysts who are really good at making models in Excel, and you build a platform for them, and they say, oh, that's great. Can I export that to Excel? Like, no: do your work inside the platform, not out in Excel. Because the problem you have there is that the next analyst comes in and says, I don't know how to work this spreadsheet. The data is not managed.


David Phillips 43:33

Yeah. I'll give you an example from the Wayback Machine: a Y2K project. I was working at an accounting firm as an internal IT training guy, but we needed lots of hands on deck, so I was part of this team during a Y2K project at a small, sort of old-school manufacturing operation down in Florence, South Carolina. And as part of the Y2K work, they said, you've got to drop all this old green-screen technology and move to Windows-based PC stuff, right? That's how you're going to address a lot of these-


Kevin Carney 44:06

I think I see where this is going.


David Phillips 44:08

And old school: we had a lot of old-school people working there, including in the accounting department. So we implement, you know, Windows 95 or whatever the prime Windows thing was in 1999. And later, one of our auditors tells the story: the person she worked with in internal accounting gives her a spreadsheet, and she's auditing the spreadsheet, and it doesn't fit. Like, this doesn't add up. Let me see what the formulas are. There were no formulas in the spreadsheet anywhere. She goes back, like, how did you do this? She goes, oh, I typed in the numbers, and then I ten-keyed it over here, and then I typed the sum in. To her, this spreadsheet tool was nothing more than a grid. Yep. She was doing what she always did, which is, you know-


Kevin Carney 44:58

Drop your tools. Drop your ten-key, drop your Excel, drop your code.


David Phillips 45:01

Yeah, and it's easy to say and hard to do.


Kevin Carney 45:04

Yeah, drop it like it's hot.


David Phillips 45:08

Might be very hot.


Kevin Carney 45:09

I thought you were gonna say that they used Windows to actually launch the mainframe inside Windows. Remember Rumba? Was that the platform to go do green screen? I think so, yeah. Run all that mainframe stuff.


David Phillips 45:26

To us, who knew how Windows worked, it was obvious: this is what Excel does. But if you've never seen this world, you're going to do what you know how to do, which is ten-key on your calculator.


Kevin Carney 45:41

All right, let's switch topics a little bit to Project TBD. Explain what TBD stands for and what the speaker series is all about.


David Phillips 45:52

Sure. So, Project TBD: TBD stands for Transformation by Design.


Kevin Carney 45:57

I thought maybe the name was coming soon.


David Phillips 45:59

Yeah, there's a little bit of a play on the acronym there. So, pre-COVID, we had this monthly event series we called Forward Faster. It was about sharing, with whomever was interested, how human-centered design can fuel business and civic and social innovation. If you're interested in the topic, come join us, meet some other people, have these serendipitous collisions with folks you might not otherwise have met. COVID, of course, put the kibosh on all the in-person stuff. So we just recently relaunched here at Kingsmen Software, in this gorgeous office space at Camp North End, but we renamed it. This notion of Project TBD: it's still about design and innovation, about how we can use these principles and practices of design to transform what we do and how we do it, regardless of what sort of organization or context you're in. Each month there's a different topic related to design or innovation, or just this notion of humans: how we're wired, how we operate, how we can do better and do differently. So give a little pitch. What's the next one? What's the cadence? Where are they? Sure.


Kevin Carney 47:11

Soft serving pitch there.


David Phillips 47:12

They're set up to be monthly. Our next one is Friday, June 16, here at the Kingsmen Software office at Camp North End, Charlotte, North Carolina. The next topic, I think the title is Curiosity, Cats and Decision Making. The first two we've done this year have been loosely related to curiosity, to how curiosity is critical for innovation. The topic we did last week here was called Data Delusions, where we were exploring how data and analytics and metrics can mislead us if we're not careful about how we make use of them.


Kevin Carney 48:01

And that's where the Moneyball comes in.


David Phillips 48:02

Yeah, because it's a great story about the value of analytics. But what's interesting about Moneyball is that there were some parts of the story, both in the book and the movie, that weren't touched on at all. Specifically, neither the book nor the movie talks about how the Oakland A's, in that particular season when they seemed to transform themselves through analytics and getting these undervalued players, and yes, they did that, also had three fantastic starting pitchers. That kind of matters. It makes a difference. It wasn't made part of the story, but it was an important part of the results. Anyway, the topic for June 16 is curiosity and decision making: again, how we need to make sure we're not making decisions without them being really well informed, and that we don't fall victim to certain biases in how we think that can lead to maybe-not-great decisions.


Kevin Carney 49:04

Yeah, there's a point where I like the number and I'm moving on. Like, oh, I like that number, I'm not gonna look at it any further, 'cause I like it.


David Phillips 49:09

Yeah. And the scout-and-soldier mindset can come into play there as well. Like, this number makes sense, this metric makes sense, I'm good. Or, it doesn't make sense, so I'm gonna ignore it because it doesn't fit with what I think.


Kevin Carney 49:22

Right, exactly. I'll ignore that piece and not put it in the book.


David Phillips 49:24

Yeah, why didn't it go in the book? I don't know. But another part of that Moneyball story-


Kevin Carney 49:30

Was that Michael Lewis?


David Phillips 49:31

Michael Lewis is a great writer; I love his stuff. On his podcast, Against the Rules, he went and re-interviewed Bill James, who kind of started this whole sabermetrics thing back in the '70s as a baseball outsider, thinking maybe the baseball people are missing some things. Baseball people ignored him for decades. They finally said, actually, this is a pretty smart guy, we should do what he's saying: run analytics and evaluate players, what have you. So Michael Lewis re-interviewed Bill James a year or two ago, and Bill James was saying, you know, it's disappointing how much weight is now being given to analytics. It's replacing thinking, as opposed to supporting it.


Kevin Carney 50:12

Yeah. Zombie analytics.


David Phillips 50:13

Yeah. Here's the number, I'm good, I can decide. Now, you need both: the metrics help inform decision making, but human judgment should also inform decision making.


Kevin Carney 50:23

Right. Great. So, Friday mornings?


David Phillips 50:26

Friday mornings, yep, 8:30 to 9:30.


Kevin Carney 50:28

Is it, like, the third Friday, or-


David Phillips 50:31

Typically it's the third Friday.


Kevin Carney 50:32

Okay. Okay,


David Phillips 50:33

We're gonna take July off, and we might take August off; we have some things up in the air. But definitely starting in September, for the rest of the year, it's the third Friday of the month. We're going to do a series on customer discovery by design, on how you use these design tools to do that. We may have another topic as well; we're still figuring that out.


Kevin Carney 50:55

Where does one go to learn more or sign up?


David Phillips 50:57

Great question. Go to fasterglass.com/projectTBD. Awesome.


Kevin Carney 51:04

Great. Well, David, it has been a pleasure. Thank you for smartening us up today about design thinking. Listeners can go to fasterglass.com for Project TBD, and also if they want to do some innovation and have you facilitate their problem solving. Thank you again, and keep doing amazing things. Bill, thank you for making the sound... sound intelligent. You'd better do some post-production editing to make sure.


Bill Clerici 51:34

We'll clean it up.


Kevin Carney 51:35

Okay. All right.


David Phillips 51:36

We're asking a lot of Bill.


Kevin Carney 51:38

Are we working on video next?


Bill Clerici 51:41

We will shortly have video, so you'll be able to watch all the silliness going on that you can't see now. You don't know what's going on.


Kevin Carney 51:48

Okay, great. You can see me drinking all my Diet Coke. You're like a kid in a candy store in this place?


Bill Clerici 51:53

Yes. Oh, yeah. It's awesome.


Kevin Carney 51:55

Yeah. Okay, great. All right. Well, thanks for joining Kingsmen's Beyond the Build podcast, where we showcase interesting tech innovators in Charlotte. Until the next podcast, go build something awesome.

Learn about Kingsmen
Contact Us