Jim Morris | How I Test My Teaching Process

Jim Morris - Product Management Coach & Silicon Valley Entrepreneur

In this episode, Jim and I discuss his wake-up call that pushed him from building first to testing first and how he tests his teaching process with his students at Berkeley.

Have a Listen

Summary

In this episode I’m joined by Jim Morris. We chat about the wake-up call that pushed him from building first to testing first. Jim and I discuss loyalty programs no one wanted, roadmaps filled with sequenced risk, AI prototypes that hallucinate and the uncomfortable reality that confidence often replaces evidence.

We also dig into something deeper: why smart teams ignore data, why leaders fall in love once an idea hits the roadmap, and why testing isn’t about better UX, it’s about real value.

Jim shares how he even tests his own teaching process for students at Berkeley.

Because as he puts it:

“We can build stuff. But if people don’t use it, we’re just creating product debt.”

Enjoy my conversation with Jim Morris.

Takeaways

  • Testing is crucial to ensure product effectiveness and user engagement.

  • Data analysis can reveal the true usage of product features.

  • Mindset plays a significant role in how product ideas are perceived and developed.

  • Not all ideas will succeed; testing helps identify the viable ones.

  • User motivation is key to the success of features and programs.

  • Prototyping tools can enhance the testing process but require careful implementation.

  • Learning from failures in testing is essential for growth and improvement.

  • Roadmaps should be flexible to adapt to changing priorities and evidence.

  • It's important to focus on the core value proposition of a product.

  • Continuous experimentation and adaptation are vital in product management.

Guest Links

Website: https://productdiscoverygroup.com/

LinkedIn Profile: https://www.linkedin.com/in/jimmorrisstanford/

Transcript

David J Bland (0:01.197)

Welcome to the podcast, Jim.

Jim Morris (0:03.447)

Hey, thanks for having me David.

David J Bland (0:05.027)

I'm so glad. I've been wanting to have you on for a while. It's been interesting to see how our paths cross over the years. And you have this deep expertise in product and discovery. We've never actually co-taught a class, but I feel like we keep weaving in and out of the same places, especially in Silicon Valley. And I'm really happy to get you on here, because I know our listeners are going to learn a lot from your perspective on testing.

You live this, you do this day in and day out, advising companies on this. And so I'm just so glad for you to be here. Can we start off maybe with just a little bit about why testing is important to you? What draws you to that world? Because it's not necessarily a fun world to live in, putting your ideas out there and seeing whether or not they're getting traction. So can you give us maybe a little bit of backstory on what pulled you into this world?

Jim Morris (0:59.416)

Sure. The beginning of my career was about 20 years of building stuff. And about 15 years into that career, there was somebody who joined our company. And he came over to me, he's a product marketing person, and he said, hey, so how's this product doing? How's this particular feature where people connect this thing up to Facebook doing?

And I was like, I don't know, it's selling well. We've sold it to our business customers and people seem to like it. Well, let's go look at the data. So it turns out, out of the tens of thousands of people using this particular system, those that adopted this feature, it was something like five, five people, five times in a month. And we were selling this for about 50% of our main product. And I was like, wow, there's a problem here.

And sure enough, over the subsequent years, people would de-book this product, because of course they were asking for a report on how effective the product was, and five out of 10,000 is not effective in any situation. And so that was the beginning of the reflection of, wow, we can build stuff, we can put stuff out there, but if people don't use it, that might actually result in us losing money.

And as I started to reflect more on the work my team was doing and what they were putting out, and applying all the analytics tools out there, I got a really good sense of what was being used and what wasn't being used. And it was an awakening. And then I went to a Marty Cagan seminar and realized that I could actually test things without building them.

And I sat there, and the revelation, I still remember, it was just midway through the first or second day. And I was like, hey, wow, I've been doing it the hard way.

Jim Morris (2:56.238)

And so I spent the last part of that startup career redoing my product organization, design organization, engineering organization to test things in advance, talk to customers. And is it a fifth of the time? Is it a tenth of the time to work through an idea as opposed to building it and launching it? And this is back when things didn't have to be as privacy compliant, as security compliant, as compliant to healthcare and financial regulation. So I think those were the easy days.

And now it's even harder to put out software in that sense. Of course, it's getting easier. But to me, that never left me, the concept of needing to test things. And my confidence also left me a little bit, right? Because you think all of your ideas are great, especially if you're in a startup, especially in Silicon Valley, you've been funded. And I realized, you know, there are lots of great ideas out there, but they're not all going to work.

And I know that sounds simple, but if you're the person who has an idea, who thinks it's working, and you haven't tested it, guess what? You're on risky territory. And so I changed my team and how they operate. And then I decided, yeah, it was important to make a job out of this and switch out of engineering, switch into product fully and, yeah, commit.

David J Bland (4:19.362)

I love that story. I feel as if, you know, I've been on both sides of that. Also working on a startup nights and weekends, building things and then realizing nobody wanted what I built, no matter how polished it looked. I think that's one of the reasons I'm drawn to this kind of work and I'm so passionate about it. I'm wondering, what was it about the conversation? So the person came in, asked you, hey, do you know how many people are using this or how it's doing? I've asked that question.

And I have not always seen people respond well to that question. I remember a specific client I was working with, and it was almost like they didn't want to know. They had Mixpanel integrated back in the day. They had it implemented, they had all these events tracked, and they could pull it up any time they wanted. And yet they didn't. And I remember in that session saying, no, we're going to pull this up. We're going to look, because we're adding all these features, and are we

creating value or are we destroying value? And I feel like, okay, this was a while ago. I probably could have handled that better in the way I approached it. But it was pretty clear they did not want to know the answer to that. And I'm wondering, have you seen that, or are you just inherently curious and that's why you said, yes, let's go find out? I mean, what have you seen in your experience?

Jim Morris (5:38.424)

For me, I've been reflecting on why I care, this exact question. And I go all the way back to the first product I launched as a software engineer, back when there wasn't product management. And I went into the logs and I wanted to see if anybody was using it. So I think from day one, I've always loved the instant gratification of the internet. And I've looked in the logs, which then became analytics packages.

I think when I got to business software, it was a bit murkier as to, could we sell it versus did anybody use it? And we wanted to be on top of the buzz. At that time, Facebook was the buzz. Now it could be AI, it could be mobile, right? Did anybody use these things? Well, we had to have it as part of checking the box as a business software provider. Now, when I see clients like you're talking about, I think it's that confidence. I think it's that, hey, this idea works. It's going to make us some money. You build it. And when it launches,

it's going to be great. And then they kind of move on to the next idea. And some of these companies have enough revenue where the signals are mixed. So some things are going well, some things are going poorly. And there isn't that person who's really being analytical and saying, well, this is going well, this is going poorly. And I think even in some public companies, where you would imagine they would have tied off all these loose ends, there can be some sort of shooting from the hip.

And so I get blown away knowing what I know and seeing what I see. Actually, in 2026, I'm just getting a little bit more mean and a little bit more pessimistic. Every year I'm a consultant, David, they're turning my optimism into pessimism. I don't know, how do you feel? You sound like an optimist. Do you have this journey of going into pessimism with client work for so many years?

David J Bland (7:33.143)

I do sometimes. I think overall, I believe that people want to work on things that are valuable. And the trick is defining what's valuable to them, you know. For some people, the code being very elegant, that's what they value. For other people, it's like, yeah, the code has to work, but I want to know people are using it. And I think it's very personal in a way when you think about it. I try to be optimistic. You know, I've been pitched so many ideas.

David J Bland (8:01.432)

One of my favorites was, well, we both do work at Draper University, and I had a kitty litter idea pitched to me there, and I thought it was one of the dumbest things I'd ever heard in my head, and it ended up being acquired for a billion dollars, you know? So I do have guardrails on what I think's acceptable and not as a business. There are businesses I don't actively advise because, you know, I don't personally think I could work on them. But

for most of the time, who am I to judge your idea? It's what's the desirable, viable, feasible risk in that idea, what would have to be true for it to work. And we kind of just work through it, through the process, and find out. So I try to be optimistic. One of the things, I mean, you know, we do assumptions mapping a lot, and people are like, what about the null hypothesis, and what about writing things in a negative way? And I'm always like, no, try to write it in a positive way. Try to say,

we believe this would have to happen for this to work, instead of we don't believe this is true. Because I find if you slip into we don't believe, we don't believe, and they're writing down all the things they don't believe, then at the end they're like, well, why do we even test this? Because we don't believe this is even possible, or that there's any customer out there. And I think when you're that conservative in your thinking and you're pre-filtering everything, you miss the big opportunities. You're never gonna find that big opportunity

by being really, really conservative and writing down all the things you don't believe. I try to flip it on its head, but I agree the mindset stuff is probably the hardest. It's not the frameworks, it's not the tools, it's not putting things on sticky notes virtually or in person. It's the mindset of...

Jim Morris (9:42.678)

Hey, wait a minute, wait a minute. Watch out, you can't be dark on the sticky notes, David. Keep going.

David J Bland (9:48.247)

Yeah, we used to make this joke, like, if you put enough sticky notes on a board, an innovation lab appears. And it's kind of like that, you know. So I have to say, as a coach, you know, and an advisor, I do have to come into it with a good energy, feeling as if, yeah, I can be skeptical, but I don't want to be cynical. I don't want to pre-surrender going into a conversation, and

Jim Morris (9:54.764)

Yeah, we call it agile wallpaper.

Super.

David J Bland (10:17.836)

That's something that I'm always working on. I'm always trying to work on and be better at.

Jim Morris (10:23.916)

Yeah, I think people have seen enough ideas that don't pass the sniff test do very well to judge an idea at face value, which is honestly why I love testing, because your great idea may be bad and one of your questionable ideas may win. And I'm a big fan of the multiple-options style of prototyping. With digital prototypes, it's much easier, of course, than physical real-world items. I think it's

a little harder to get good feedback on digital prototypes, because they're so easy to kind of move through, as opposed to physical prototypes, where you're interacting and there's more of an emotional involvement. But when I've got the data scientist who makes the prototype that wins with the customers, it's very humbling for the PM and the designer. So yeah, I think there's a lot of reasons I love testing now, because I've seen the results over time.

But yeah, the pessimism, optimism thing, you know, as consultants, we gave up choosing what we work on. And I agree, there's a bit of a restricted list that I'll work from, but it's a freedom where I'm not there to judge the idea, and I will join in. If I wanted to pick the idea, I would have to go back into industry and get a job, right? We have to be flexible, and our services are needed in a variety of places, so

I go where the market takes me, and that's been a lot of fun. Yeah.

David J Bland (11:52.921)

Yeah, and I think hardware has changed and evolved. I had a client in the past where we were doing, it was like a paint can, it was like a new type of paint. And I literally just walked over to the maker hub nearby, or drove over, it was like a couple of miles away, and 3D printed all these different types of shapes of cans and of lids and everything. And lined up people to hold each one. So you can do it now. I mean, obviously the scope matters. If you're doing giant B2B hardware, it's a little different than

paint cans. But I do think some of this thinking is becoming more common in non-software and non-digital. And I think these are folks also who maybe have never heard of Agile either. Agile is a software development thing; it wasn't necessarily a hardware development thing. And so it's really interesting to interact with those folks. And I'm just wondering, you work with so many different types of companies. Have you ever had an experience where

you went in and you tried to help them test, and it just went sideways? Like what we thought would happen didn't happen. If you're going to test, it's going to fail sometimes; that's part of testing. Not every test succeeds. So I was curious if you have any stories of things you've tried with companies where it did not go as planned for you or for the person and team you were working with.

Jim Morris (13:15.660)

There was a moment, I was working with a large e-commerce company, well known, and there was a loyalty program that they wanted to test. They wanted to increase signups to the loyalty program. So we analyzed their system, we figured out what some problems were, and we started creating some prototypes to show to users. And we started showing them to users, and

there was really no uptake at all. There was maybe a dropdown message on the site. And I saw a user look straight at the screen and, without even reading the dropdown, click the X on that pop-up without even looking at it. It was amazing. And you sort of see how users can tune out what looks like ads, even though they're from the website they're on. So the best intentions, the best copywriting, the best design, but...

but the mechanism we were trying to use was a little bit ad-like. And then we had some inline messages that weren't ad-like, totally all ignored. And we reflected on this process where we got no traction on increasing signups or desire to sign up. And...

So it turns out the motivation to sign up was just not there, because the offering that they had was $2 for every $100 you spent. And the idea that you would get up to that $100 and then save two bucks basically just jumped out as, this is just not compelling. It doesn't matter what we test. You needed to actually change the value of this program, not the software.

Or the signup process. If something's valuable, as you know, people will actually use really bad websites, they'll use really bad products, and they will suffer through, they'll trudge through the mud, if it's valuable. My wife loves a deal. She will do almost anything to get a deal. And we're very honest about how much time it takes. And I think people forget that we can't just decorate things; we can't just make good experiences on top of things that aren't valuable. And so we asked, well,

Jim Morris (15:31.402)

Why the $2? And the person said, well, we anticipated this many signups, and we had this budget, and we did the math, and that was how we came up with two bucks, and that's our offering. And so it wasn't even value-based. It was just off of the arbitrary budget line item the CFO had given this particular loyalty group. So I think that was partly a mistake on my part. And now I don't let anybody test things that are centered around something that's not valuable. Of course, the test is about value anyway.

So yeah, it was a little bit of a mistake on my coaching part, and just a massive miss by this company, who just underestimated what loyalty actually meant to people. These were just employees in a company given the charter of this program. And they assumed since they had the charter of the program that the program was valuable, right? And honestly, it wasn't.

David J Bland (16:19.789)

I think we have to own up to the fact that we make mistakes too

as advisors and consultants. I think earlier on in my career, I fell into this trap more, where I thought, this worked for this company, so it's going to work over here at this company. And I was so sure it was going to work, even in the same industry. So I won't name the companies, but I was working with a giant travel company. We did this thing where we routed people through a different system and back in as a test, because we didn't want to rebuild everything. Let's take a slice of the customer base, route them into this other thing.

It'll be a different experience, and we'll drop them in. And we literally used the APIs and everything, and it worked. Exact same situation, another travel company, and I thought, oh, that's what we're gonna do. We're gonna route them into this experience, test to have all the metrics and everything, route them back in. And it failed miserably. We couldn't even wire it up properly. Literally, the system would break when we tried to do that. And it was a big learning experience for me, because I thought, oh,

Wow, that didn't go. We couldn't even learn what we wanted to learn, because we couldn't drop them back into the experience where we wanted them to. And so I think it was very humbling for me to experience that early on in my career and understand that just because things work at one company, it doesn't mean it's going to work for another. And you work with so many different companies, you've seen so many different permutations. You're like Dr. Strange looking through all the different scenarios, you know?

It's like, no, we messed this up too. We ask leading, biased questions sometimes. We don't mean to, of course, and we tell people not to, but it's hard not to. I feel like...

Jim Morris (18:03.381)

That's why I keep an open chat during interviews, and I always give the first interview as a model, and then they do the rest of them. And then I'll say something like, you should not say that, I just broke the rule I told you not to break. So yeah, I do try to own up almost in real time. But yeah, a hundred percent. I mean, yeah.

David J Bland (18:26.251)

It just happens to us too. It's not like we're immune to mistakes in experiment design and how you go about this. This is really hard to do well. And even though that's our focus, I think talking about where things go sideways means a lot to listeners as well. I think coming back to your loyalty program, though, I had a similar experience with a different type of company, and it was more like a streaming company.

And they were convinced the loyalty program was the way to go, and all the teams were working on the loyalty program. And we talked about our interview scripts, how we were going to interview people about it. So we did catch it before we built. But the way I designed the experience, by lunchtime we had real customers come in so they could test with them. And sure enough, nobody cared. It was so interesting. They were in love with the idea by lunch.

And we had to have that feedback cycle, because they had already been convinced, you know, this was a huge revenue driver, and let's work on our projections and everything. And then we brought real customers in, and they said, yeah, this is not something I'd be interested in at all. Like, why would you even offer this to me? And so I feel as if we need to be able to broker that, you know, have that conversation with customers, or help them find a way to get feedback quickly. Because I don't know about you, but

I see people fall in love very quickly with the things they're working on, even at the idea stage.

Jim Morris (19:55.659)

I mean, once it's on the roadmap, they're in love, right? And they don't stop with one idea. They basically create a whole year's worth of ideas. Now, some of these things are informed by evidence from the previous year, but a lot of things are not. So yeah, backing people off of an idea. If I can get them to, instead of drawing the line from idea to engineering, draw the line from idea to discovery

to engineering, that is a big win for me. If I can route things through and introduce the concept of skepticism, even with the CEO, and say, look, we will test your idea. I won't let the product team bury your idea, but they're going to test it alongside a couple other ideas, in this whole optionality concept. And so I was able to deal with this company where they had the CEO-to-product transition, you know, where

the CEO really needs to focus on sales and fundraising, and they have to hire that first product person. And how does the first product person really get any air in the room when the CEO and founder knows everything about the product, but they realize they've got to really pass this off? And so I was able to kind of create this bargain between the two of them. But yeah, there's definitely situations where we come in, we do a lot of work, and then

the people up above, they really don't want this testing process. They want to have just a very regimented, launch this thing, launch this thing kind of process. So working with leaders to help them understand a little bit of skepticism has been an interesting change too.

David J Bland (21:36.161)

I love how you said it's a bunch of ideas over the year in a roadmap. You know, I'm paraphrasing here, but I find that to be a pattern of risk. And maybe I'm just getting older and I'm questioning more how we're working. I think when you're growing up inside of a company, it's almost like, okay, this is just the way it's done, and I'm gonna follow this and do all this. But I look at roadmaps now and I'm terrified.

I see these, it's like sequenced risk. We're going to do this, and then that leads to this and to this and to this. And really, there's not a lot of space for evidence going from that one thing to the next thing. Even going from Q1 to Q2, from phase one to phase two: this would have to be true for phase two to even work at all, or for this part of the roadmap to even be valuable. And I'm wondering, is there appetite to question that?

Because I'm starting to question it more, and I'm trying to put a name to it. I just did a workshop a couple of weeks ago where we went through a sample roadmap, and we started calling out, there is a lot of risk that's sequenced here. What evidence would we need to see between here and here? What would have to be true for the rest of this roadmap to work? Do you see that happening, or do you feel as if we're still locked into this feature by this date, or this thing by this date? What are you seeing in your day-to-day?

Jim Morris (23:04.490)

Basically, people tend to say they've got more ideas than they can execute on. And their natural inclination is to just sequence the ideas, and then they basically feel like they're done. And they might sequence it far out; I've seen roadmaps two years out. Of course, the world's changing. The technique I've used and thought about is, when I'm working with teams on a long-term basis, when do I see their executives make a change in the strategy? My sense is it's about every two or three months.

There's a team that's getting moved over. There's a product that's getting deprioritized while another one's getting emphasized. And so I ask, why make these rigid plans and spend all this time when your horizon for changing the plan is about three months, right? So why plan the three to 12 months out? And of course you have the right to change the business priorities, because the world's moving,

just like you should be expecting the product priorities to be changing as you get evidence. And we talk about testing evidence, but just think about the economic environment, the political environment, the personnel environment, the AI environment. You know, I think nothing has been certain but change in the time that we've been doing product. I look at these long-term roadmaps like I look at a pitch deck from a startup. It's interesting, but it's pretty fictional.

You know, the pitch deck is great. You've thought about your business, but honestly, there's one thing in this business that's really important, and you should basically focus on that. And if that works, people are going to give you money, whether it's investors or people subscribing to your service. And the idea that you have a business model doesn't make your idea better. And so you and I are in this phase where we see things people treat as really important as not that important, where we have to go in and be like, actually, let's move this aside, let's move this aside. None of this matters.

When I'm on stage talking to the folks at Draper, there are 50 startups, and they all have their dreams. Yeah, your business plan is nice, but does one person like it? Do two people like it? Did you get five people to like it? And they're kind of shaking their heads like, I don't know. I think thousands will like it.

Jim Morris (25:18.025)

And so they take the grand vision as a substitute for everything. And it masks their actual desire for evidence, and it masks their ability to get evidence. And of course, when the negative evidence comes in, they're still thinking about the thousand mythical people that love it.

David J Bland (25:36.491)

Yeah, I think it's the core. When we do things like assumptions mapping, there's this axis where it's like, you know what, if anything is below that line on importance, it's not that important right now, at least not for the next three months. So let's just not waste time on that. And I think that was a forcing mechanism to say, let's just defer. It doesn't mean it won't come back. We're going to revisit this thing. It could become important later.

We have to make some trade-offs here. Not everything's the riskiest assumption. Not everything's at the top of that map. And so anything below, let's just give ourselves permission not to work on it and not to waste our time. It might be fun working down there, like there's lots of cool technology and maybe there's some stuff we could do, but it doesn't help us de-risk what we're working on. And I think without these focusing mechanisms, or just having a structured conversation around risk,

I think we get drawn into stuff and we go down this rabbit hole and before we know it, we've ignored these really important things that could kill it.

Jim Morris (26:43.337)

Well, it's convenient to ignore them. I mean, it feels good. First of all, it feels good to launch things, so people are a little bit addicted to that. I can identify with that. But I no longer feel good if I haven't talked to users in about a two-week period. I start to feel really nervous if it's my product; I feel very nervous if it's my clients' products, because they start to grow and grow and grow, and I know that we can't possibly test all of this stuff. And so I know they're going to launch stuff that no one's going to use at that point.

And people want to be completist. It's sort of a curse of our industry that the people who get into it are very detail-oriented, whether it's engineers or product managers. And we tend to build out things, whether it's single sign-on and login. We want to build everything from start to finish and then release it to the world as if it's our gift. And I think for you and I, we come in and we say, look, we don't need this, we don't need that.

This actually looks like the core value proposition, because you've done the assumptions mapping, and this is the part that is win or lose, or these three or four things. Let's focus on this, and we can actually do a bunch of testing on this thing. So backing people out of their grand ideas, trying to focus them on what is the pivot point of success. I mean, I'm getting better at it, but honestly, whether I'm trying your techniques or other techniques, how do I get people off of

the completionist vibe and onto what really matters? And for me, it's just much easier to see after this time in the business. I've had two successful startups, one unsuccessful startup. I've coached hundreds of companies, small, medium and large. And that's what I'm looking for. It's like there's this Matrix screen of text flying through; I can spot the thing that matters. Now, my ability to communicate that? Very mixed.

David J Bland (28:40.558)

It's one of the reasons I like using the orange, green and blue for the stickies in assumptions mapping: I don't even have to read the stickies. I can say, this team, they just came up with this idea this morning, they're mapping out their risk, if I squint and just look at the colors.

If it's all blue top right, like, here's all our feasibility risk, it's all about building, I know where that team is mindset-wise. They just want to build. And what I look for early on is I want to see orange, I want to see some green. I mean, yeah, blue can be up there, but all blue top right is an anti-pattern to me. It means this team has moved everything over to the left on customer discovery, anything around talking to customers.

Jim Morris (29:08.201)

Mmm.

David J Bland (29:26.030)

And the way they do it is they say something like, well, I talked to a customer and I validated that, or I ran a survey and I validated that. As if that's all true and people are going to behave exactly like they typed in that survey. And now we need to build, and all our risk is around building. So.

I like the colors because it's almost like another information layer over it, where I can just squint and go, hmm, yeah, my work's cut out for me with this team, you know? But I don't know. That's just one tool. There are all these different tools, but I think we're working towards the same goal. We don't want people wasting time building things that nobody wants, and wasting a lot of time doing that is even worse. So I know we talk about money, but we also talk about time. There's an opportunity cost, from a time perspective, to just working on things.

I come back to this: I do think people wanna work on things that really matter and are of value, and coming back to what value means to them is really key. And I agree, there's not just one tool to do that. It's almost like we're principles-led here. We have these principles that we stick to, and we're trying to find tools that just help people along the way.

So I wanna change the conversation just a little bit, because one of the reasons I wanted you on here is that you apply this to your own work. And I always appreciate that. I really have a hard time with folks that espouse all these theories and then don't apply them to their own work, which makes me believe that maybe they don't fully believe in what they're saying. But you do, you apply it to your work. So can you share something recently you've done

where you're like, you know what, this is a big test, but here's what I think I can learn from this? And maybe it went sideways, but it's something I learned from professionally in my own work. Is there something you can share that you've tried that was like, yeah, this is a test and this may not work out, but this is something I should do because I'm gonna be more informed as a result?

Jim Morris (31:22.472)

Yes, I recently taught a class to 35 Berkeley students. It's a very intensive class where they work an idea through from beginning to prototyping to interviewing to analysis and presentation. And to me, again, if you strip a lot of product management away, this is the core of any company, any technology company: that the technology is wanted and valued by the customer and has some utility for the business.

So I'm a best-of-breed type coach, in the sense that I'm always pulling ideas and concepts and techniques from different folks and trying to help my clients build a tool belt. The tool belt honestly has just changed over the years. I'll take an idea, get really excited about it, use it with a couple of clients and be like, oh, that didn't quite work, or it's good, but I'll modify it this way. So yeah, I don't think I've given the same slide deck ever.

So yes, I'm constantly reinventing it, and constantly listening to folks and adopting new ideas. You know, we're a very immature profession. Product is by far the most immature profession in a company. You've got engineering, you've got design, you've got finance, human resources; even executives have coaching and ways to do things and strategy books. So I think we are just very immature, and none of us can agree how to do it anyway.

So I have these students, and they're gonna work through three concepts. One is finding a study spot. Another is tracking job applications; these are master's students, and they're all trying to find jobs. And the other is learning California slang, because there are a lot of international students. So I give them these three ideas and I disperse them amongst teams of three. I give them the idea because I'm not there for them to work through the entrepreneurship of their own idea. They're here to learn techniques. And I know that these ideas will work

as good example material. Some ideas are actually just kind of hard to test, and that's just going to slow them down. And for some ideas, the test is just not that interesting. So I decided that I wanted to use these AI prototyping tools. The students aren't designers, and so when we get to the part about making and manifesting an idea into a clickable prototype, in the past they've used paper prototypes,

Jim Morris (33:50.524)

they've used things like Figma. Each year they learn the new technology, and it's just pretty amazing to see what they can do. So I said, okay, let's use the AI prototyping tools. But it takes a little while to teach, and it's gonna eat up some class time, so I've got to drop some concepts. And I was like, okay, well, let's just drop some things like sketching. Let's drop hypothesizing. I mean, I'm sure I'm breaking your heart, David. Let's just go from

problem, customer, success metric straight to prototyping. And it was bad in two very bad ways, without a focusing event. Normally it's sketching once, sketching twice, and then, what's the hypothesis? And then they choose a sketch and they actually have to break this thing down, screenshot it or cut it up, and make a storyboard of discrete screens, and they have to put their finger on it, or their mouse, and they have to tap,

because people leave out screens; people have no idea how to make experiences. So I teach them how to make an experience through a storyboard, and then to make hypotheses about this button and that image and this content, so that they actually understand what they're testing. One person could easily try to test like 40 things, let alone a team of three wanting to test like 60 things. And when they enumerate all the things that I am coaching them to enumerate, they're like, oh my gosh, of course, we've been testing way too much.

So all of these steps that I've really grown to love over the years as focusing events, I just threw them out. Let's just go for it, come on. And let's put them with these AI prototypers. So you go in, they chat, and for finding a study spot, there are nine students making this particular prototype. They're all working independently, but I'm grading them, so I'm reviewing them. They all come up with this feature of real-time busyness and real-time

loudness of a study spot, whether it's a library or a lobby. And I'm like, okay, does Berkeley have APIs to access real-time busyness and loudness in all the rooms on campus? And the students are just staring at me, and I'm giving this wide-mouthed look. So they had no concept that, yeah, we're not gonna test things that aren't feasible.

Jim Morris (36:15.493)

Yeah, I'm sure Berkeley at some point might invent this in the future, but it doesn't exist now; you can't make that app. And we're certainly not gonna test with students, because... well, one of them actually squeaked it through. And of course people were like, I would love to have this, I would definitely use this if it had these two features. I said, well, guess what you've tested: nothing. We can't act on that. You can't build that. It doesn't exist. And I've had the AI make stuff up about APIs that didn't exist.

So with the lack of focus, they used these things that were expansive. That was one example. And when you use these prototypers, there's also a lack of ownership, so people don't actually start clicking through them. And of course, what do I do? I just click a couple of times, and if it's broken and the first click doesn't work, I know they didn't click it either. This is like when I was managing engineers and product people and they were releasing software, and the bugs were so obvious I could tell that they didn't actually click it

before it went live. So it has a lack-of-ownership problem. And then for tracking job applications, the AI started to make these job tracking applications that were incredibly comprehensive. It did everything, but nothing. And it looked like a job tracking application. And I trained these students, and what do they keep saying? Okay, look at this screen,

play around with it, and tell me what you think. And there couldn't be a worse phrase to say at the beginning of an interview about a prototype. It breaks my heart, because I'm a bad teacher. But that's the kind of thing: when there's so much there, they're not focused, they're not asking questions around utility or providing a scenario that's going to make the user pass through a certain bit of functionality,

so that you're gonna see the reactions as they sort of pass through this area without knowing what you're testing. Once they know what you're testing, the jig is up to some degree, because Americans are really nice to you. So yeah, it was an experiment. Now, the good part about the experiment was they talked to users. They talked to users beforehand, because I make them do non-prototype interviews. They talked to users doing prototype interviews. And so just having

Jim Morris (38:40.507)

technical, smart people talk to users now, early in their career, is to me a meaningful event, because they will know that they should be doing this throughout their career. I didn't figure this out till 15 years into my career. And they realized that whatever their dream was, even with all of the prototyping issues, users didn't see it that way. So let me pause there, but it was a very interesting experiment to throw all of this out and have it go sideways, with prototypers making stuff up and students

not understanding what they wanted.

David J Bland (39:13.002)

I think it's great you tried that. I mean, especially bringing in prototyping tools; they're going to get exposure to them one way or another. And doing it in a classroom setting, I think, is helpful, showing how it flows into a way of working versus just something I'm doing on the side. I'm curious, because not everything I do works out either, in how I train people and how I teach and coach. I'm wondering, what would you do differently

based on what you learned from this, if you were to do something like that again? Have you given thought to how you might try that again, or maybe not try it at all? What would you do differently next time?

Jim Morris (39:56.295)

I really like the prototypers, because these students lack the mechanical ability to design. I think they're better than paper prototypes. A friend gave me a file that I had all the students upload, and that helped get them out of the generic design. There are a couple of things that prototypers like to do, which is make a wizard out of everything, which is dumb. And they have this weird blue color with white

and rounded edges they put on everything, which is just dumb. So Josh Herzog-Marz has this document. He let me use it for the class, and I'm gonna post it up in a blog post soon. I said, just upload this. And I'm sending Josh a gift, because when I graded 35 prototypes, they were all different designs, because of the way he wrote this document. So my sense is, I have a concept I'm working on,

actually borrowed from Danielson and others: the vertical slice. It's like an MVP, and I've written an instruction set to take an idea and slice it up. So what I might do is create that one document that people upload with the AI prototype that really does focus on a slice, because the people making these AI prototypers are going the opposite direction than I am. They talk about the one shot. They talk about getting it all done right;

you'll see Replit go away for 10 minutes while it's coding. And what I want is actually an experiment, not a solution. So I'm not going to pretend that I can coach them 100% of the way, to be super clear. I'm going to be playing with how I can make the machine do what I want it to do, while I'm also playing with my instructional techniques. I will probably bring back sketching.

I'm a big fan of human beings using motor skills to extract information from their brain, basically sketching. I think that every idea in your head is amazing; when you write it down, you realize how bad it is. And that's the first moment of reflection in an ideation process. It's also a focusing moment. So I think I will try and bring back only sketching, and if that doesn't work, I'll bring back more. I don't want to bring it all back. I could bring it all back, but

Jim Morris (42:20.366)

nobody wants a long process. They all want to take shortcuts. So I want to keep experimenting to see what I can cut to give people what they want, which is instant gratification, because I get it. And if I see products going bad or experiments going bad, I'll bring that thing back. Because again, it's always faster than coding it, putting it out, launching it. And then what we call technical debt is usually product debt: features that people don't use.

So it's really us product people that are creating a lot of debt inside our code bases, cruft that we have to maintain.

David J Bland (42:54.485)

I agree. I love that story. And I love that you're thinking through how to bring a piece back in, see what happens, measure it, and get to that nice balance of making the process shorter while still driving to the outcome you're trying to teach. So I just love that you're applying it. I want to hear more stories like that from everyone I speak to: how are you applying this to your own stuff? So I just want to thank you so much for hanging out. We went through

so many things today, about what even drew you to testing, which was similar for me: building things and wondering if people were using them. And you have such great experience across startups and bigger companies, and I think you strike a great balance between those two worlds, working on similar types of work. If people listen to this and want to reach out to you, what's the best way for them to get in touch?

Jim Morris (43:46.640)

Yeah, you can go to productdiscoverygroup.com and you can find me there, or LinkedIn, just look for Jim Morris. There's a lot of me out there. So productdiscoverygroup.com is the best way to find me.

David J Bland (43:59.167)

Thank you. And we'll put that link in the description and on the page. I want to thank you so much just for hanging out with me. I think we could have easily gone for another hour. I just love hearing the stories that you see, what you learn from them, and how you navigate them. I really appreciate you sharing your perspective on everything with us today.

Jim Morris (44:18.992)

Super fun. Thank you for having me.
