Chris Guest | How I Tested Custom Fitted Eyeglasses
Chris Guest - Traction Designer
In this episode, Chris Guest discusses the importance of understanding problem perception and the nuances of desirability testing.
Have a Listen
Summary
In this conversation, Chris Guest discusses the importance of understanding problem perception and the nuances of desirability testing.
He shares insights from his experience with Topology Eyewear, where they tackled the problem of ill-fitting glasses. Chris explains the Problem Perception Spectrum and how it helped them position and communicate their solution. He also emphasizes the need to find early adopters who resonate with the problem and shares his favorite experiments, including a concierge MVP for Topology Eyewear.
Chris Guest shares his experience setting up a pop-up store to test the desirability of a product. He explains how they controlled the experiment and measured interest and feedback from customers. The pop-up store was a success, leading to further scaling and growth.
Chris also discusses the transition from desirability to feasibility and viability, and the importance of trust in the customer experience. He concludes by introducing his current work on traction design and the need to validate market opportunity early on in the startup process.
Guest Links
LinkedIn: https://www.linkedin.com/in/chrisguest/
Website: https://www.tractiondesigner.com/
The Problem Perception Spectrum: https://www.tractiondesigner.com/problem
Traction Design Substack: https://tractiondesign.substack.com/
Traction Native: https://tractiondesign.substack.com/p/the-traction-native-startup-way
Mentioned Links
Topology: https://www.topologyeyewear.com/
Bryte: https://www.bryte.com/
Narrative: https://www.narrative.tech/
Transcript
[00:00:00] Hello, and welcome to the How I Tested That podcast. I'm your host, David J. Bland. I'm an author and advisor, and I'm fascinated with testing early-stage ideas. I've helped companies all around the world. And in this podcast, I connect with entrepreneurs and innovators who had the courage to test their ideas with real people in the market, with sometimes surprising results. Please join us as we explore the ups and downs of experimentation.
[00:00:27] Together. In today's episode, we have Chris Guest. He's a traction designer and a serial startup CMO turned advisor with a track record in guiding tech startups and corporate innovations from inception to product market fit and scalable traction. He has an expertise in category design, positioning, and growth.
[00:00:46] Forged from more than 20 years' comprehensive marketing and product innovation experience at some of the world's most prestigious consumer brands, including Audi, Mercedes-Benz, Ferrari, McLaren, Visa, Verizon, and at leading consultancies such as AKQA in both the USA and Europe. Let's dig in. Welcome to the podcast, Chris.
[00:01:07] Thanks, David. Good to see you again. I was thinking about when we first met, I think, if I'm remembering correctly, it was probably somewhere between 2012 to 14, maybe at Lean Startup Circle in San Francisco, where we would just kind of hang out as entrepreneurs and try to help each other test things. And I just remember meeting you back then, I was like, Oh wow, this guy has some really cool ideas.
[00:01:30] Thanks, Chris. We should stay in touch. And I have to say, pre-pandemic, going into those meetings and just really bouncing ideas off each other, it was a lot of fun. Yeah, that's right. Those were the days; that was the Lean Startup Circle Unconference. Once I sort of discovered the book, I became quite a nerd about Lean Startup for a while.
[00:01:49] And so I was quite happy to find my people and go and jump in, and I got right into volunteering at the events and everything. So yeah, it was great to meet you there. And I think the last time we met was actually at your book launch, because I was thinking about this earlier. And that would have been one of the last events before COVID, I think. Was that the end of 2019, something like that?
[00:02:09] Yeah, launching a book right before COVID was an interesting experience. Lots of tests I ran during that, and I have to say, the airport bookstore strategies sound amazing on paper until everything shuts down and nobody flies. It was a very interesting ride. Yeah, there's always things you really can't predict or mitigate for, I guess.
[00:02:29] But I remember when I was writing that book, you were at Topology at the time. I was like, wow, we need some stories about what you're testing there back in the day. And that's how we really started talking about getting you as a case study in the book. And I know in the book, we can't necessarily go into as much detail as we'd like and everything.
[00:02:47] But you all were testing some really cool things back there with this idea of, oh, we're going to measure your face and put glasses on your face that are perfectly measured for you. And I remember the MVP and you guys doing intercepts. It was such a cool story. Yeah, well, it was a really cool technology and really different and unconventional as a startup, which is why I loved it.
[00:03:07] So it created an interesting testing case study because it was an interesting, but also challenging topic to test. So we had a lot to learn. And so I really couldn't have found myself a better case study to test all of the theory that we'd read about and spoken about over all those meetings. You said something that always stuck with me, and we both wear glasses.
[00:03:29] Those of you listening, we're both wearing glasses. And I remember you saying you were intercepting people on the street. And there was this idea of, well, maybe they're problem aware, or maybe they're symptom aware. I remember that difference that you were explaining to me, and I was like, oh, I never really thought about it.
[00:03:44] Because I always thought about, alright, do they have the problem? Are they actually seeking a solution? That's what I'm sticking with. But you had this nuance to that, that I thought was really interesting, because sometimes people's glasses don't fit properly, and they don't know why. Yeah, that's it. And that was a learning that I got from just talking face to face with lots of different people that were wearing glasses.
[00:04:03] I guess that was the genesis of something I've developed over the five years since, which I call the problem perception spectrum. Maybe a little bit of a mouthful there, I'll take your feedback on that as a name. But it's the idea that in problem validation, the problem isn't binary. It's not just a case of do they have a problem or don't they.
[00:04:21] Or does this person have the problem or not? There's actually a whole bunch of steps in between. And those nuances are actually really interesting and really powerful. And so I guess as it relates to topology, and I should probably just give a quick bit of context so people know what the topic is that we're talking about.
[00:04:36] Topology Eyewear is a concept of custom-fit eyewear made from scratch based on a scan of your face. The problem it was addressing was the vision of the entrepreneur, Eric Varady: his idea that most glasses don't actually fit most people. The reason for this is that everybody's face is genuinely different and unique, but all glasses are made thousands at a time for a mythical average person that doesn't really exist.
[00:05:03] And so as a result, most glasses don't fit most people, and it's incumbent on glasses wearers to go into a store and try on a thousand different pairs. And so what Topology was, was a system whereby you'd actually scan your face from a video on an iPhone. So you just take a video selfie. This was before even the iPhone X's structured-light sensors.
[00:05:24] There was no 3D model. We actually used computer vision. We'd take a video, analyze it in computer vision, and make a 3D model of your face. And then present an augmented reality virtual try on as your face in a 3D model on the screen. And you can then actually see the glasses on your face, change the size of them if you wanted to make them a bit taller, a bit wider, change the color, and then actually order it from your phone and we'd make it from scratch one pair at a time in downtown San Francisco.
[00:05:50] Radically different concept. Coming back to your question about the problem perception, the very first thing that I did, because when I met them I didn't wear glasses, and they're telling me a story about, this is what people that wear glasses think. I'm like, do they? I don't know. I don't wear glasses. So the very first thing that I did was go out and spend a whole day walking around San Francisco, just stopping everybody that wore glasses and saying, like, hey, tell me about wearing glasses.
[00:06:13] What do you hate about it? And see what people would say. And over the course of that, what I found was that people were aware of the symptoms, but not the problem. So you could say to someone, Hey, do you have a problem finding glasses that fit? And they'd say no. What's wrong? Ever have a problem with glasses sliding down your nose?
[00:06:28] Oh, yeah. Do they ever pinch on your nose and make red marks? Oh, God, yeah. Do they ever pinch on your head and give you a headache? And then people are like, Man, let me tell you how much wearing glasses sucks. But they didn't realize that it was because the glasses don't fit. And so there was an important difference that I realized straight away between knowing that they have the symptom of a problem and actually knowing what that problem is, which is obviously pivotal for how we position and talk about the problem.
[00:06:54] I love that. I love that you just walked around and chatted with people and tried to understand and that nuance between are they problem aware versus symptom aware? Because again, Ideally, I keep giving advice to people to say go after people that are seeking a solution to a problem. Those should be early adopters, but it's not always that straightforward.
[00:07:12] And so I love that. And I'm wondering, so you mentioned this new framework you were using. I'm just curious, how has that fed into what's happened beyond Topology and what you're doing now as sort of a traction designer? How is that feeding into your thinking today? Yeah, I found it absolutely foundational, because it was in that project with Topology that I realized that there's a difference between knowing the problem and, I guess it's the idea of, perception is reality, right?
[00:07:40] So it's not whether or not they do or don't have the problem; it's how they feel about it. And so the perception spectrum is saying, again, it's not just a binary, do you have the problem or not. It's actually a spectrum. And it ranges, and I won't sort of read out every step, but it starts with this:
[00:07:55] Okay, no, I don't have a problem. And they're right. They genuinely don't. And then you have that problem unaware where they think they don't have a problem. But we can see as the testers that they actually do. They're just not aware of it. But then there's a bunch of other nuances, which are really interesting.
[00:08:10] So it could be: I have this problem, but I'm not bothered enough to do anything about it. Or: I have this problem, but I just don't believe that it can be solved. And then another step could be something like: well, I want to do something about it, but I'm not sure what that is or when. And then another step could be: I want to solve this, but I'm going to do it with a manual workaround, which might be a spreadsheet, or a Post-it note, or an intern, or something which isn't a direct competitor, right?
[00:08:35] So there are some similarities to this idea of the buyer funnel, right? People go from sort of, you know, being brand aware to problem aware and so forth. But it's critically about steps within their understanding of the problem. When you understand that, it's absolutely transformational for how you position and communicate the product.
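The spectrum of steps Chris walks through can be thought of as an ordered scale. Here is a minimal sketch in Python; note that the level names and the positioning hints are a paraphrase of what is said in this conversation, not the official naming from the Problem Perception Spectrum on tractiondesigner.com:

```python
from enum import IntEnum

# A sketch of the problem perception levels as an ordered scale.
# Level names paraphrase the steps described in the conversation;
# the official framework may name or order them differently.
class ProblemPerception(IntEnum):
    NO_PROBLEM = 0           # "No, I don't have a problem" (and they're right)
    PROBLEM_UNAWARE = 1      # has the problem, but only sees the symptoms
    NOT_BOTHERED = 2         # "I have it, but I'm not bothered enough to act"
    BELIEVES_UNSOLVABLE = 3  # "I have it, but I don't believe it can be solved"
    WANTS_TO_ACT = 4         # "I want to do something, not sure what or when"
    MANUAL_WORKAROUND = 5    # solving it with a spreadsheet, an intern, etc.

def positioning_hint(level: ProblemPerception) -> str:
    """Very rough mapping from perception level to the marketing job."""
    if level <= ProblemPerception.PROBLEM_UNAWARE:
        return "market the problem before the solution"
    if level == ProblemPerception.BELIEVES_UNSOLVABLE:
        return "prove the claim; demonstration over assertion"
    return "reposition against their current approach"
```

The ordering is the useful part: because the levels compare like integers, you can segment interviewees by where they land and pick the messaging job accordingly, which is exactly the move described in the Topology and Bryte examples that follow.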
[00:08:53] So, for example, in Topology, what we found, as I mentioned, is that most people felt like they didn't have the problem of glasses that fit. In actual fact, they did. They just didn't understand that that was why they were having a problem. So what that tells us is that our job is to market the problem in order to market the solution.
[00:09:11] If we just come out and say, finally, glasses that fit, no one's going to get it. We had to do the work before that that would say, if you're seeing this, this and this, Yeah. Actually, it's not you, it's your glasses, right? That's when the penny drops to people. It also, on the topology example, helped us actually understand who the early adopters were.
[00:09:30] Because it turned out that there is one group of people that do already know that their glasses don't fit them, and that's Asian Americans. Because, as it turns out, almost all eyewear in the U.S. is made for an average Caucasian face. Asian Americans have a somewhat different bone structure, which means that they don't fit.
[00:09:48] And so, many people have learned to go and look for this category of products called Asian fit glasses, which you can find; there's usually one shelf of them in Warby Parker or in JINS or something like this. But to us that says, okay, that audience is in a different section of the problem perception spectrum. They're in the category of: I know I have the problem and I want to solve it, but I'm going to solve it with a different solution, which is, in this case, Asian fit glasses. Then our job, to that audience,
[00:10:19] is not to market the fact that their glasses don't fit; they already know that. It's to say, you think you want Asian fit glasses, but you actually want custom fit glasses, and here's why. Very, very different positioning, very different communication. That's one example from Topology I can give you. So I'm going to share some more recent ones if you're interested.
[00:10:38] Oh, that's amazing. Because we're always talking about, okay, is there observable evidence that people are seeking a solution to the problem? And usually where our heads go, well, where are they going online and offline? And we'd like to think they're just typing in online, how do I find glasses that fit?
[00:11:13] Which I imagine you've all done some of that too. I think this nuance you're bringing up is that, well, they could be hiring a solution that isn't quite what was intended when that solution was designed. So it's almost like they're hiring a solution in a different way, or they're behaving in a different way, trying to find a solution.
 
[00:11:36] And I don't know if founders think about it that way. I think we're saying, oh, we're just going to Google. I'm going to do search term analysis and try to find out, is anyone searching on these terms, and look at the related terms and all that. But from what you're describing to me, it sounds like it's much more nuanced than that, because people could be hiring solutions that maybe aren't necessarily even marketed to them, but they solve the problem in a way that maybe even the people that created that solution don't realize.
[00:11:55] Yeah, that's right. And it's all about understanding how they contextualize and think about the problem in their mind. So I'll give you a different example. The last startup I was at as CMO was called Bryte, and they made basically a very premium smart mattress. The problem that this mattress solved is the problem of waking up during the night.
[00:12:14] Not so much about falling asleep; the main problem was waking up during the night and how that makes you feel in the morning, like you haven't slept at all. So you go out and you ask people, again, to try and find out: do people have this problem, are people aware, and do they care about waking up during the night, which is different to, say, falling asleep?
[00:12:32] We did structured research interviews to try and get an understanding of what's most important to them. But then, as we went through the spectrum with these customers, well, we found people who were at: I know that I have a problem waking up during the night, I do believe it can be solved, but I never imagined that a mattress could be the solution.
[00:12:52] So people think, I'm waking up during the night; the problem is I'm drinking too much water and I need to wake up and pee, or I need a solution that is a sleeping pill, or I need a new pillow, or something like this. No one imagined that a mattress could do this. So we actually went a level deeper, and when we presented it to people: here's a mattress that can actually stop you waking up during the night.
[00:13:25] And it does it because we use AI to monitor your sleep in real time and adjust pressure by the inch. So if we see that there's a spot of pressure that might wake you up, we actually intervene and change pressure by the inch in order to stop you waking up. That particular unlock took people from, I don't believe this solution can work, or, I believe that I need to solve the problem of waking up but I don't think a mattress can do it, to, okay, now that you've told me that, I can believe you. And so it's understanding those nuances. I'll give you one more example.
[00:13:25] I'm working with Sam Hatoum, who I think you might have met back in the day. He's working on a concept for a software delivery platform that's potentially a replacement for Agile as a whole methodology. Particularly managing tickets and backlogs and things like that. And it's called Narrative. Just for context, instead of having a backlog of lots of different myopic stories, we have one narrative presented as a flow that unites the entire team.
[00:13:53] The business can see it, the technical team can see it, and they find the tests and specs all in one place. So this solved a number of different problems, we thought. At one point, we had two different hypotheses about which was the problem. One of them is about creating a single source of truth for the whole team.
[00:14:09] The other one was about guiding people from vague requirements to detailed specifications. And we wanted to find out which one was most compelling to the audience. So we went through this process, and what we found was actually two very different points in the spectrum. With regard to the single source of truth, almost everybody was in the column of: I have this problem, but I just don't believe it can be solved.
[00:14:32] And the general refrain was like, yeah, yeah, yeah, bullshit. Yes, that's a problem, but no one can solve that. People have been talking about it for decades. I don't believe it. Until we showed them, and then they said, oh, okay. No, I'm with you now. But that tells us our job is not to just market single source of truth, because it's just going to get discarded and people are going to disregard the claim.
[00:14:55] So that tells us our job is to actually prove it. In what ways can we actually prove that this does work? With the requirements-to-specs problem, people knew it was a problem, but they were solving it via manual workarounds. They'd say, yeah, that sucks, but we've learned to get everyone on the team together in a room and we do an event storming session, or we do a brainstorm and we put stickies on a whiteboard, or we do backlog grooming: a very manual workaround for the same problem. And so our job there is to say, okay, you thought that you wanted to do
[00:15:27] that process, but actually we've got this product, which is going to make that much, much easier. And so you can see that understanding the nuances of how they perceive the problem makes a radical difference to how you position and how you message it. I love that. I feel as if, in the past, a lot of the advice we've heard has been, when you test with customers, we frame it as headache versus migraine problems.
[00:15:52] That's been a really common one over the years, right? A headache is a nuisance. They might try to find a solution to it, they might not; they might just wait it out until it goes away. And then a migraine being: okay, it's much more painful. They're going to find a solution, and it's going to be urgent.
[00:16:06] To hear you speak to this and I would frame this under kind of desirability risk, right? It's a lot of desirability risk around your value prop and the customer and their jobs, pains and gains and all that. It feels as if that's almost not giving it enough nuance. There's so many different levels of desirability testing here.
[00:16:22] That you're probably not necessarily just running into two situations when you're trying to test your ideas. There are so many different layers there that you need to test through, and maybe us framing them as those two extremes isn't necessarily helping us in our testing. You're right. It's just that that level, the extremity of the problem, actually comes a step before understanding the perception.
[00:16:45] It comes at the point you choose the audience. And what I mean by that is, when I work with clients now, I start off by trying to really understand what's the problem you're trying to solve, and then ask, okay, who's the customer that most experiences that problem? And we do go through a process of saying, okay, is it going to be,
[00:17:02] in that last example, more about the technical product manager, or is it the delivery manager in the team that has the problem? Is it a problem that enterprise IT has more than startups or scale-ups? How do we triangulate on some niche that we think most acutely, most painfully feels this problem? Because I do agree, very strongly actually, that that is the best place to find your early adopters. But then, having
[00:17:28] dialed in on someone, to say, hey, this is our assumption of who the customer is: yes, you want to validate that they do experience the problem to the extent that you think they do, but also go a step further and say, what does it feel like to them? How do they perceive it through their eyes and not through the entrepreneur's eyes?
[00:17:45] Not just do they agree with our statement, but what do they say unprompted? And so the way to get to this is a qual interview, basically, but going through a sequence of steps where we start with very broad, open questions like, tell me about your sleep. Or my favorite question: if you could wave a magic wand and change any one thing about your sleep, no matter how improbable it might seem, what would it be?
[00:18:12] And then we're looking for what's the first thing out of their mouth. Okay, and if you had a second wish, what would the next thing be? And you ask a few open questions like that to see what comes out unprompted. And then I go into, hey, have you ever had a problem with this? And then it's a prompted question.
[00:18:26] And then you use those techniques about, tell me about a time when you did this. What did you do to solve that problem? When you start to go through that sequence, the real trick of it is, and this is the bit where it gets really hard: how do you ask one question that doesn't pollute the answer to the next question?
[00:18:43] Right? So you're not accidentally leading them to your natural conclusion. Yeah. But yeah, it is still: what's the problem first, who's the customer that experiences it, and now let's take that specific customer and look at it through their eyes. And if you're doing that, and you find out that actually a lot of the people were in the camp of, I don't have the problem, or I don't care about the problem,
[00:19:05] well, maybe your customer hypothesis is wrong. And then you do go back and say, okay, well, who is our next guess? Now let's take them: does the problem resonate with them? Yes. And what's their perception? And then you match those two together. I feel as if the trend that I was hearing from you was past experiences.
[00:19:23] It's not a lot of future hypothetical, would-you-buy-this-solution questions. I think people get very excited about their solution, and that's where they lead, almost polluting the answers, as you framed it, because they're just excited and they want to know about a future hypothetical situation where you're going to buy this thing for this price because of this problem you're experiencing.
[00:19:44] And what I'm hearing from you is very much having them tell stories about their past. Yeah, you know, I'm not really an expert in having done a lot of study of research techniques, but someone gave me that advice at some point in the past: you know, don't ask someone what they would do, ask what they have done.
[00:20:01] The classic example not being how often do you go to the gym, to which we all want to say three or four times a week, you say, how many times did you go to the gym last week? And then you get a better indication of the truth, you know, try and apply some of that mindset as best as I can remember. I like it.
[00:20:15] So I'm curious, what are some of your favorite experiments you've run over the last few years? Things that maybe people aren't aware of or aren't very common, just really these amazing experiments that have generated evidence for you. What are some of those? I think the Topology example in the book is probably one of my favorites, because there were just so many unknowns and it was such a wildly different product.
[00:20:40] It required quite a lot of thoughts of how we were going to get evidence for it. So, as I said, the first thing that I did was just go out and talk to a lot of people and ask them about wearing glasses and give me some idea of like, yeah, okay, people don't like this actually. This is something that we might want help with.
[00:20:56] We had a number of big questions and big assumptions. The first thing was, does anyone understand that they have this problem of finding glasses that fit? Would anybody believe that custom-tailored eyewear, glasses made just for them, would actually solve that in any way? But then there was a real key shift.
[00:21:15] This is a textbook discontinuous innovation in terms of requiring the customer to change their behavior in order to adopt the product. You thought that you buy glasses by going to a store and trying on a thousand pairs. Actually, no, you're going to do a virtual try on instead and we're going to make one for you and it's going to fit.
[00:21:34] Huge leap of faith for the customer to take to believe that that's going to happen. Would anyone do that? And I should also say as well that I can't remember what our starting retail price was, but frame and lenses together were in the region of about 550 bucks, which for someone that doesn't wear glasses, or only shops at Warby Parker, sounds like a lot, but that's actually a well-established price point for what you might be paying for stock glasses with the words Tom Ford written on the side.
[00:22:00] So it's an existing price point with a very different proposition. And so when I first met Eric and the team at Topology, he'd been working for years in stealth creating this unbelievable technology, software and also hardware to actually make it, but had never actually gone out and shown it to anyone and got their feedback. So there was a lot to learn.
[00:22:20] So we had this idea that we could start getting feedback on this. We didn't have a product that anyone could use in terms of the software. The virtual try-on was very difficult to use, but we did have this test rig, we called it. So the actual engineers, when they were making test pairs, they'd have a version of the app with all these ugly sliders on it.
[00:22:39] Like, you couldn't decode how to use it; it just had loads of symbols all over it. But we thought, well, no one else could use this, but we could. What if we went in front of customers, scanned them, and operated the app for them? Then we could at least talk them through it. And so we thought, okay, this is our chance to do, and I think we'd call this, a concierge MVP.
[00:23:00] You could tell me better whether this is concierge or Wizard of Oz; I think it was maybe a little bit of both. But what we did is we found a partner, a lovely chap called Steven, at an agency called Partners in Crime, who had actually opened his design studio in a storefront on Union Street in San Francisco.
[00:23:16] And so for anyone that doesn't know the area, this is a lovely street full of boutiques with a sort of trendy, affluent crowd that would walk up and down, and the idea was we'll set up a pop-up shop in the front of the store. We created a fake name; at the time we weren't called Topology, so we just made up a name and called ourselves Alchemy Eyewear.
[00:23:36] Steven made us some posters for the window to say, you know, new concept pop-up today. And what we did was set up in the store with Eric, the CEO, and Rob, the COO, and Jason was there as one of the engineers. And I said, okay, you guys are operating the demos and I'll go out and grab people and bring them in to come and have a look. Because it turns out, if you want to do that in San Francisco, there are two things that are helpful.
[00:24:00] One is you say, I'm a startup founder working on a new concept, because then other San Francisco types want to stop and talk to you and find out what you're working on. And then the other benefit is, if you have an English accent, it does just grab you an extra couple of seconds of intrigue while you blurt your pitch out there and get your question out and hook someone.
[00:24:19] I went out into the street, would grab people that were wearing glasses, ask them a couple of questions to see if they seemed like they were good to talk to and they were interested. I'd give them a one- or two-line explanation of what we were doing and say, could you come into our pop-up and check it out, or give us some feedback?
[00:24:35] And by the way, at that time, I'm trying different sentences all the time. Every single pitch was a bit different, and I'm learning and iterating on what people were resonating with. And then I'd walk them into the store and introduce them to one of the guys ready to do the demo. And then the guy in the store would ask them a few more questions.
[00:24:53] They'd actually do the scan for them and then ask them more questions while the scan was processing, because this was before we'd built out all of the AI processing on the cloud or anything. This was 2016, I think; we did this before there was the AI infrastructure there is now. So it took 12 minutes for each scan to process.
[00:25:13] And so when we scanned a customer, the guys had to keep them entertained for 12 minutes to stop them leaving the store. And then we'd show them the virtual try on, help them style their glasses. What we wanted to do is to try and control this experiment in a way that we can actually get useful data for it.
[00:25:29] And I told the guys, like, let's not expect that anyone's gonna buy any glasses from us today. We're not really looking for that as an objective, but we want to see how interested people are and where their interest drops off. Right. So the first thing is, can I come up with a sentence that gets people interested enough to come into the store?
[00:25:46] The next one is then, when they're in the store, what do we need to convince them to be bothered to take a scan? Like, a few strange, nerdy guys like us are going to scan their face with an iPhone, and they have no idea who we are. You know, this requires a little bit of trust. And then, having seen their face in the store on the virtual try-on, do they actually believe that something real is happening here?
[00:26:11] Do they believe that we've measured their face? How do we help them understand that? And we did things like flip to a developer mode, where you could see a wire mesh and a load of measurements and stuff. It turned out that was a big unlock for people understanding this thing was real. And then we had to control for a lot as well.
[00:26:28] So in the real world, we understood, okay, if we have an app and people can download it from the app store and buy online, well, we'd need to have loads of great photographs there. That was an assumption: people would want to see high-quality product photography. You know, this is a quality product, but we didn't have that yet.
[00:26:42] We couldn't show them that, but we thought, well, this isn't really a fair test if we don't show them what the glasses are. So we said, okay, we'll bring along some samples. But then we can't do that because that would pollute the experiment because in the real world, when they're in the app, they wouldn't be able to see and touch the glasses.
[00:27:01] And so then we had the idea of, okay, let's put them behind some glass. I don't remember where we got a little display case from, but we actually put the glasses, the real ones, on a shelf behind some glass so people could see it and not touch it. And then we said, okay, this is our proxy for seeing a product photograph.
[00:27:19] And then what we did as people were going through is, okay, let's just take a tally. How many people that I stopped on the street would agree to go into the store? How many people that came into the store would let us scan them? How many people that we scanned gave us good feedback? One of the things we predicted was that people would say they loved it when they didn't, because people are polite for the most part.
[00:27:39] And so we say, okay, well, we need to find out if they really like it. And so what we did was say, okay, at the end of the scan, let me email that to you. Can you give me your email address and I'll email it to you? We said, okay, we think that if somebody says they like it but they don't really, they won't give us their email address.
[00:27:55] So we counted that as kind of like an expression of interest. So this was roughly how we set it up and I went out and kind of spent three hours acting like a carnival barker in the street, roll up, roll up, come in and see our demo. And after a few hours, I went back in to see Eric and say, how's it going?
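The funnel Chris describes can be sketched as a simple tally. The stage names below follow the transcript, but the counts are made-up illustrative numbers, since the episode doesn't report the actual figures:

```python
# Hypothetical tallies for illustration only -- the episode names the
# funnel stages but never gives the real numbers.
funnel = [
    ("stopped on the street", 200),
    ("agreed to enter the store", 60),
    ("agreed to a face scan", 35),
    ("gave an email address", 20),  # the "expression of interest" proxy
]

def funnel_report(stages):
    """Return (stage, count, conversion-from-previous-stage) tuples."""
    report = []
    prev = None
    for name, count in stages:
        rate = None if prev is None else count / prev
        report.append((name, count, rate))
        prev = count
    return report

for name, count, rate in funnel_report(funnel):
    pct = "-" if rate is None else f"{rate:.0%}"
    print(f"{name:30s} {count:4d}  {pct}")
```

The point of tracking conversion per stage, rather than only the end result, is that it shows exactly where interest drops off: a weak street pitch and a distrusted scan fail at different rows of the table.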
[00:28:11] And he said, we've sold four pairs. Amazing. I think it was four, from memory, but yeah, they'd just made a couple of thousand dollars in the last few hours. I guess the other ironic thing is that this storefront was only two doors away from Veo Optics, which is a very high-end designer eyewear store on Union Street.
[00:28:31] We easily sold more than they did that day. That one was a happy outcome. I actually decided to join the company that day. I was freelance at that point, but I went home and said to my wife, I think they're onto something here. I think I'm going to join. So that's when I joined as CMO. But yeah, that was just one step in a very interesting journey.
[00:28:50] I love how methodical you are, though. You're tracking how many people you're intercepting, how many people are agreeing to come in, how many people are agreeing to get scanned, and asking for some kind of currency to see if there's real interest, trying to close the say-do gap. Because people, yes, will kind of nod and politely say they're interested, but then are they willing to give up anything, even if it's just an email?
[00:29:07] I love that progression. That's such an amazing story, and it gives so much more color than what we were able to include in the book. So thank you so much for sharing that. It feels as if it was a lot of desirability testing there, maybe a little bit of viability. The viability wasn't the focus. It was more about: do people understand our value proposition, do they experience this problem enough, and will they trust it?
[00:29:28] That trust, having a solution that works but people don't trust it, I imagine would be a very frustrating place to be in. Absolutely. And you're right. It was about desirability at that time. That was the biggest question and we had actually, you know, I should look, we've probably got a photo of it somewhere, but we did a David Bland assumptions mapping exercise.
[00:29:47] With a piece of flip chart paper stuck to the side of a 5 axis CNC machine in the workshop in Bayview. We did it old school, we were legit. And that was the thing that rose to the top, was about do people have the problem, would they care about the solution? So it was pure desirability. After that point, it did flip into more of a mixture of feasibility and viability in parallel.
[00:30:11] The biggest risk for the product and the engineering side was, can we actually make this a great fit? Because it was hard. People say hardware is hard. That was really hard, because the founder had created this incredible technology, using laser cutting and CNC machines, that most of the eyewear industry didn't think was possible.
[00:30:33] And so he just needed to make loads and loads of pairs to make sure he had it dialed in. And so we were very quickly into, okay, we need orders, we need orders from real paying customers, not just friends, because again, you make a pair of glasses for a friend and they're going to tell you that they love it because they haven't got the heart to tell you that they don't.
[00:30:50] But if you take 500 bucks off someone, they're going to tell you if they don't like it. And so we wanted to get out, and I used Ash Maurya's framework from one of his books, where you go through different orders of magnitude. So what do you need to go from 0 to 1? And then what do you need to go from 1 to 10, 10 to 100, 100 to 1,000, and so forth.
[00:31:10] And so we said, okay, well, that was our experiment to see if we could go from 0 to 1. And so then it was like, okay, what are we going to do in order to go from 1 to 10? Exactly the same thing. There's no reason to change anything else. So we just did more pop ups. So we make it to 10. Are we going to go from 10 to 100 on pop ups?
[00:31:28] Yeah, I think we can, but I think we need to build out a little bit more of this and that. So we did that, and we were just always getting orders, getting orders. I mentioned some failures along the way, and they're always surprising. But I think our best event was at the Maker Faire, just south of San Francisco, I think in Redwood City.
[00:31:45] And so this is a meetup of nerds and makers, inventors and tinkerers, coming to buy parts to make robots with and stuff like that. And so we did a pop-up there and took something like 32 orders over the space of two days, right? So very, very successful. But I think we got a false positive for two reasons.
[00:32:05] One was that the audience were very nerdy, and it was very much like a, I'm a nerd, you're a nerd, we're all together, one of us, one of us, let's buy them. But also, at the time, it wasn't possible to buy online. So people would try it on and they'd say, this is really interesting, you know, I love what you're doing, so can I just go to your website and buy this?
[00:32:26] And we'd say, no, no, we're not live yet, we're just doing a pop-up for two days. So for them it was, okay, well, it's now or never. The next event we did, we had launched, and I think we did a pop-up on Valencia Street, and they were like, can I buy this online? It's like, yeah, yeah, no problem. No orders. We went from like 32 in two days down to zero, because we lost that immediacy.
[00:32:48] People say, okay, yeah, I'll do this later. And when someone says I'll do this later, they never do. That was one thing that we realized that actually, yeah, that immediacy was giving us a false positive. The other thing we found which was counterintuitive is that the better we got at the pitch, the worse the conversion got.
[00:33:05] How is that possible? And I think that in the early days, you know, we were all awkward. We were making it up as we went along and I think there was a certain charm to that. I think that people could see that we were genuine and we were trying to figure it out. Maybe they felt a bit sorry for us, but I don't know, but people gave us a go.
[00:33:25] And then, by the time I'd literally spoken to thousands of people face to face, I estimate one to two thousand people, I'd gotten pretty good at the pitch. But the slicker I got, the less effective it was. It just lost some of that beginner's charm. I sounded more like a salesman and less like a founder who was just trying to figure it out.
[00:33:46] That's interesting. I've had that experience sometimes with, I'll help a team and they'll say, we're going to do a really quick and dirty landing page with our value prop and everything. And we'll start getting some interest. And then they'll say, Oh, we got interest. You have to make it look all better now.
[00:33:58] And they'll polish it and spend days moving pixels around. And then the polished version actually generates less demand somehow. So yeah, I do think there's something to that. But I just love that story of how you all tried pop-ups and then said, this is as far as we're going to get with pop-ups. What else do we do?
[00:34:16] How do we scale beyond that? I just love hearing you talk about it because it's such an interesting example: it's hardware and software, and it's consumer. They're somewhat symptom-aware, but not problem-aware. There's so many different dynamics in that story. And I'm just curious, like, what are you testing now?
[00:34:30] This was a while ago. So what are you doing today? Like, what are you digging into now? I was a startup CMO for eight years, over three different startups. The two I mentioned, Topology and then Bryte, were seven of those years. And now I'm in advisor mode. I specialize in positioning for AI products and unconventional startups.
[00:34:51] And by unconventional, I mean startups offering something that is radically different: it requires customers to change their behavior in order to adopt it, or it doesn't fit in any existing category of products, or it's just exceptionally difficult to explain. I think when you're in that position, on one hand, that's where the really interesting stuff is, and at some point that's maybe where the greatest value is created.
[00:35:16] But it just is really hard to get traction. That's kind of a common thread over a lot of what I've done. What I really enjoy is figuring out those hard challenges. And I've packaged this into a practice, which I call traction design. Traction design is really approaching the problem that, nowadays, traction is the number one existential threat facing startups. I believe that despite brilliant products and great innovators, too many startups die on the battlefield of traction.
[00:35:46] I was thinking that there are almost three different acts of the trilogy, mapping to your three circles of the Venn diagram. There was the age of feasibility, where the biggest challenge was, could you build it? Because we didn't have the democratized technology infrastructure that we have now, and so it was worth someone tinkering for a year in the garage to make sure they could build the thing they wanted to. And then, circa Steve Blank and Eric Ries, the 2010s, those sort of times, what they taught everyone was, okay, it's not the feasibility, it's the desirability that kills you.
[00:36:19] You might disagree with this, and it might just be because I'm kind of stuck within the local maxima of San Francisco, but I feel like a lot of people have got that message now. And I feel like there's a lot of startup literature about how do you create something that people want. As a result, we've never had more great products hitting the market every single day.
[00:36:39] It's like a fire hose, especially in the age of AI. And so now I think we're going beyond the desirability phase into the viability phase, and particularly the marketability, or traction, phase. What I'm working on is how you can actually validate from the outset that you've got the potential to gain traction later on down the line. Because there's this weird thing that happens where every startup investor is telling founders, come back when you've got more traction. It's probably the most uttered sentence in all of Silicon Valley at the moment: come back when you've got more traction. And yet the process that everyone's following is still product first.
[00:37:17] Everyone still goes: prototype, then can we get to problem-solution fit, then can we get to product-market fit, and what's even the difference between those? You'll find different points of view on what the difference is. And then you worry about traction. Well, now 90 percent of your runway's gone. And if our job is to validate the riskiest assumption first, traction should come first.
[00:37:38] And so what I'm working on is kind of how do you turn that around? Can you, from the outset, from the earliest days, validate a market opportunity, validate positioning, and validate your hypothesis of how you're going to get to traction with your early adopters, before you spend all of your runway just trying to get to problem-solution fit?
[00:37:59] And so that's the process of traction design, which is kind of a work in progress. So you'll find it on Substack rather than as a finished text. It's thinking in public, if you like. And then I'm developing that with various clients in one-to-one consulting as well. Yeah, it's almost like your risk moves around.
[00:38:16] You're right. Back when we met, I think I was still at Neo, which was almost like a startup studio, and we were pulling together all these designers and developers to work cross-functionally and literally build code, you know. And I would say, fast forward to today, that's probably a no-code, low-code studio where you can just spin stuff up. The feasibility isn't such a risk anymore with a lot of that stuff, or it's certainly not as expensive on the cost side.
[00:38:39] And yes, we have a lot of material around desirability and how to test with customers really quickly. But even if somebody wants something, will they pay enough, and how do you get enough traction? So I do like that framing. I do think you're onto something with that. I think it's offer traction, product traction, market traction.
[00:38:54] Let's just take out the phrase product-market fit, which most people are confused by, to the extent that it's not really that helpful. This is the most important thing in your business, but I can't tell you what it is or how to measure it; you'll know it if you get it. What if you were to actually formalize it?
[00:39:09] The first thing to do is to construct an offer that you can validate will get traction with a valuable market. You can almost think of it as the top five boxes on a lean canvas. So you've got your customer hypothesis, problem, value proposition, solution, and Ash puts the fifth down as unfair advantage, but I think of it more as just positioning.
[00:39:31] How are you going to carve out your niche and defend it? So validate those five things, plus your price. You should be able to take someone from a point of "I don't know if I need this" to "shut up and take my money" without even needing to touch the product. I met a company, and this is not one of my clients, but I met a company in the EdTech space.
[00:39:53] They had a proposition of: your kid gets an A or an A* in their exams, or your money back. This is in England. Now, as a parent, you almost don't need to know how they're going to do that to know that you want that outcome, right? The only missing piece of information is how much. You say to a parent, give me 50 bucks a month and I'm going to guarantee that your child gets an A or you get your money back.
[00:40:16] You already want to buy. You don't need to know what the solution is. I do believe that you can create an offer and validate that a valuable market wants to buy it before you've even built the solution to fulfill it. And if you can, that really is doing what you teach, which is to identify your riskiest assumption and test it first, which is what most people don't do.
[00:40:40] That's why I've kind of got a bee in my bonnet about it, as we say in England, and I'm working on the American version. I think you're onto something with that. Again, feasibility, time and time again with teams I'm coaching, is the least risky part of things. It's not whether they can build it; it's always, well, do people want it?
[00:40:55] And then if they want it, will they pay enough for it? And how do you start figuring that out before you over-invest in something you're then stuck with? I always have a saying: you can't pivot if you're broke. It does cost money to pivot, right? So if you spend all your money on feasibility,
[00:41:08] and you don't have any money left, you can't pivot. And with this idea of just pivoting your way to success, when you think about it, in reality, as a startup, how many pivots do you really get? Maybe one, two at the most before you run out of money. And so spending your time on feasibility just seems like the most expensive way to learn.
[00:41:23] And you're only learning in this customer-free zone. So I do think you're onto something with: can we get traction, and specifically going not just into desirability but also into viability on the front side, and using that to help inform our solution. I think that's a better mental model than "I'm going to have a solution and then I'm just going to see how much people will pay for it."
[00:41:44] I do like the way you're viewing it. Awesome. Thank you. I really appreciate you spending the time. I love the stories you shared about testing, especially back in the Topology days. I love that you added more color to that. Where would people go if they want to find out more about you? I'm on LinkedIn at /chrisguest, guest as in hotel guest.
[00:42:02] I also have a website at tractiondesigner.com, and for the Problem Perception Spectrum I'll create a worksheet and a resource for your listeners at tractiondesigner.com/problem. So you put that in and it'll take you straight to that resource. And then Traction Design is on Substack. If you're on Substack, just search for traction design.
[00:42:22] You'll find it there. And I'll be writing more about the perception spectrum there and plenty more testing stuff to come. Awesome. Thank you. Looking forward to what you're learning in public on Traction Design. So thanks for joining us today, Chris. Oh, it's a pleasure. Thanks so much for having me. Good to see you again.
[00:42:39] Thank you for listening to another episode of How I Tested That, where we share stories about experimentation, tips and tricks. If you learned something new, please recommend us to a friend or colleague. And remember, keep testing your ideas.