Büşra Coşkuner | How I Test B2B vs B2C
Büşra Coşkuner - Product Management Trainer & Coach
In this episode, Büşra and I explore how she approaches product discovery and testing, from building zero-to-one products at Doodle to coaching teams across B2B and B2C environments.
Have a Listen
Summary
In this episode, I’m joined by Büşra Coşkuner, a product management coach and trainer who helps teams move from project thinking to product thinking.
We explore how she approaches product discovery and testing. From building zero-to-one products at Doodle to coaching teams across B2B and B2C environments, Büşra shares how to actually operationalize experimentation beyond just A/B tests.
We also dig into how to test when you don’t have much data, how to combine qualitative and quantitative insights, and why many teams get stuck thinking they’re doing product work when they’re really just managing tickets.
If you’re trying to build a stronger testing culture, or just want to make better decisions, this episode will challenge how you think about product metrics and experimentation.
Takeaways
Product transformation is a leadership decision - If leadership isn’t backing the shift from projects to products, it won’t happen; bottom-up enthusiasm isn’t enough.
Most “product orgs” aren’t actually product orgs - Adopting Scrum and calling someone a product owner doesn’t mean you’re doing product; many teams are still just managing tickets.
You can’t test what you can’t measure - Without proper data instrumentation, teams fall into a “build, build, build” loop instead of build–measure–learn.
Metrics frameworks are a starting point, not the system - Pirate metrics (AARRR) or customer factory models help, but real insight comes from adapting them to your actual business model.
Qualitative data is not optional - Quant tells you what is happening; qual tells you why. In low-data environments, qual becomes your primary signal.
“No data” is usually an excuse - Even in B2B, you can extract directional insights from sales teams, customer conversations, and patterns across feedback.
A/B testing is over-indexed and often misused - Experimentation goes beyond A/B testing. Many teams default to it even when it’s impractical or irrelevant.
Sometimes building is the test - For low-risk features, the fastest way to learn is to ship and observe behavior: treat the release itself as the experiment.
B2B testing requires creativity, not scale - From sales-assisted experiments to prototype validation and even WhatsApp groups, testing in small markets is possible if you rethink the approach.
AI changes the cost of being wrong - When building becomes cheap, you don’t always need heavy upfront validation; you can test the problem through the solution, as long as you’re willing to kill what doesn’t work.
Guest Links
LinkedIn: https://www.linkedin.com/in/busra-coskuner/
Website: https://busra.co/
Transcript
David J Bland (00:01.081)
Welcome to the podcast, Büşra.
Büşra Coşkuner (00:03.234)
Hey, great to be here. Thanks.
David J Bland (00:05.561)
I've been looking forward to this conversation all week. I remember, I don't even know, maybe a couple of years ago, I think I was running a workshop and I saw you there in a breakout room and I just saw your face for a second. I was like, I want to talk to her about metrics. And I knew I would have train wrecked the entire workshop. So this episode is really an elaborate excuse for me to talk to you and have you explain your thinking on things.
Büşra Coşkuner (00:29.55)
Cool, looking forward to that. And I'm very curious, like, what are we going to talk about today?
David J Bland (00:34.275)
Hahaha.
Yeah, well, so I would love people to better understand more about you before we jump into all these like nitty gritty metrics and testing stories. So maybe just give our listeners a little bit of background on yourself and how you got into this world of product.
Büşra Coşkuner (00:53.518)
Sure. I'm Büşra. Hi everyone. I am a product management coach and trainer for teams, individuals, and companies who are moving from a project organization to a product organization, adopting the product operating model. I help teams to basically work in this new scenario and this new setup, and especially go deeper into the world of
product discovery, testing, and how do we make decisions with data, with evidence, with insights, like how do we make informed decisions? So this is one big arm of what I do, and then the other big arm is training. So either I come in as a coach or I come in as a trainer, and give trainings on how to find the right and relevant success metrics for your product, so that you can make
informed decisions, whether it's for iterating on your existing product or for building zero-to-one products. And, oh yes, I also help product leaders to navigate this transition, and lead product managers, or as I call them the sandwich roles, to feel confident in their new role when they become new sandwich leaders. So, middle managers.
And how I got into product work: I've always been there. It's a very boring career, to be honest. I started in university with some student work already in product management, and I got my first job in product management, but in telecommunications. So classic product management, not in the digital space, not in software, but in classic product work. So I know that world as well, at a big German
telecommunications enterprise; I think I don't have to say the name. And then I moved into the software world, into e-commerce, into marketplaces. Some are very well known in the German-speaking area, like Home24. And then I moved into software as a service with Doodle, which was my last employer. And there I was the lead product manager for building up the business scheduling suite,
Büşra Coşkuner (03:20.558)
mainly working on zero-to-one stuff. One part of my time at Doodle was also transforming the whole subscription experience with subscription tooling, and of course the whole new checkout and all that jazz, right? But then mainly building zero-to-one stuff like Doodle 1:1, or Bookable Calendar, now called Booking Pages, things like that.
And then I became independent. That's now six years ago, when I started my independence.
David J Bland (03:53.41)
Wow. It's amazing how fast that goes. You're like, I just started this, you know, and then you're like, no, that was six years ago. Or for me, it's 10 years ago.
Büşra Coşkuner (04:01.846)
Yeah, exactly. And at the same time, five years ago, I also started my matcha business. So I'm selling matcha in Switzerland. The best in Switzerland, of course. And yeah, I also run a couple of other things on the side.
David J Bland (04:19.225)
Quite busy. I think there's a lot to unpack there. I would love to learn a little bit more about this switch from projects to products inside orgs, because I've gone into some orgs beginning that process. They've usually hired a CPO or a VP of product who came from another org that had some product experience, and they're trying to make this shift.
What is your take on that? What do you see as some of the obstacles for a company trying to shift from projects to products? What are maybe some of the big hurdles you see?
Büşra Coşkuner (04:58.85)
Well, first of all, who backs this whole movement? That's the main thing, actually. If the management doesn't even push for the change, the change is not going to happen. It's like with any other change. So there might be a bottom-up spark in there, right? Like some people who are very enthusiastic about it and who maybe even know how to work in this world. But that's it. If the management...
like the upper management, the CEO level, right? Like starting with the CEO actually, doesn't say, yes, we want this to change. We want to change how we work. There will be no change.
So that's reality. So now in a multi thousand person business, it might not need to be the CEO. So it might need to be, you know, if there is a branch of maybe 300 people, then it might be the leader of this branch, for example. Right. So then you can transform this branch or so, but that's the main thing. So it needs backing from the upper management. The other big hurdle that I see is
Some of them know, or some of them think they know, what it should look like when you work in a product organization, but they actually don't. So they have a wrong understanding of what it means. And I see this with one, let's say, potential client, which I'm not gonna take on, to be honest.
They introduce Scrum and say, we will create product teams now, and we have already introduced one product team, and this is going to work within Scrum. They have a product owner, and guess what the product owner does, David? It's like a ticket monkey.
David J Bland (07:02.968)
I'm thinking it's more the backlog management and the ceremonies. Yeah, I've seen that story.
Büşra Coşkuner (07:09.672)
Exactly, exactly. It's ticket monkey stuff. And they believe that this is a product organization, and it's not, right? And that's what I mean with the second big hurdle: you think you know what it means, but you actually don't know what it means.
David J Bland (07:26.112)
Yeah, I can see that. I think we still mix up projects and products quite a bit. You know, I see product managers, but they're more project managers, and they don't know it because they've never really experienced it. So I do see that a bit in my work as well. Although I do commend them for trying to make the change. I do think long term, you're probably better off as a product org for the most part, depending on your industry. But when you think of
Büşra Coşkuner (07:36.91)
Yes. Yes.
David J Bland (07:55.459)
So when you think of Doodle though, did that transition, was that transition at all happening or did it feel like a product company when you joined?
Büşra Coşkuner (08:04.782)
That was actually a fun story. When I joined, we were like 30, 35 people. And it was cool. It was, you know, a small family, and communication lines were short. And we were just, you know, rocking it just like that, right? But...
we were in maintenance mode. When I joined, it was a mature product, and it was a cash cow for the parent company. Shortly after I joined, we decided we would do startup 2.0, basically, and kick off another growth phase by opening it up and doing some other things, like going upmarket. And that meant we needed a whole
David J Bland (08:43.768)
Peace.
Büşra Coşkuner (08:55.148)
different experience for the customers and then for the users, which suddenly are not the same anymore, right? We were prosumers, basically. So B2C, yes, consumers, but then also prosumers, the professional consumers. We had subscription tiers, but yeah, nobody knew it. Exactly. It was hard to find and it...
David J Bland (09:19.192)
I was going to say, I don't remember the tiers.
Büşra Coşkuner (09:24.526)
Like, there was no value in the tiers of the subscription, or little value, like turn off the ads. But the point was, if we really wanted to deliver on this promise, then we had to start working in a different way. I joined Doodle in 2017 and I left in 2020, and in the almost three years that I was there,
we were the team that basically, I keep saying, transformed Doodle into a junior product organization. I keep saying junior product-led, or product-centric, however you want to call it. The product operating model, whatever the word is that we use today. So we took it there, and we changed the way we worked. And you can imagine...
You introduced me as the person to talk to about metrics and data. And imagine this: I join Doodle, and we have no data.
Like, how do you want to make informed decisions if you don't have any data? Yeah, that was a fun story.
David J Bland (10:38.73)
Yeah, I think we overlook that sometimes, where it's like, we do this build-measure-learn loop, but we can't measure. And it's like, well, what did we learn? It's really easy to turn that into a build, build, build loop, you know, where basically what you measure is that you've built things. But I was always a fan of the pirate metrics thing, which is AARRR: acquisition, activation, retention, and then referral and revenue, or revenue and referral, whatever
Büşra Coşkuner (10:51.135)
Hell yeah.
David J Bland (11:08.438)
sequence you like. And I would always give talks on that. I would go into sessions and we'd talk about that. And then people are like, but we can't measure any of that. We don't measure it. So the idea sounds great, but then if you don't have a baseline metric to use... Say we're trying to focus on retention. We want to increase retention. And you do something, but then you're not measuring retention properly.
And you don't know if that had an impact, but it looks better and you think it was better. And therefore you kind of say, yep, that was a win and move on. I think we constantly underestimate the value of instrumentation there and getting that baseline because it's hard. You have to prioritize it. You have to spend time doing it, but there's always more things to build.
Büşra Coşkuner (11:53.548)
Yep, absolutely. 100%. And funny that you mention pirate metrics. At Doodle, we used pirate metrics not as Dave McClure's funnel, but as Ash Maurya's Customer Factory. And that was a game changer for us as well, to basically formulate our success metrics. And it helped a lot. I mentioned I worked on zero-to-one products like
Bookable Calendar and Doodle 1:1. And we basically mapped the whole experience of these new products on the customer factory. Like, what is our assumption about the page where the user will be activated? What is our assumption about the page where we see the biggest value to get the person retained?
Where does the monetization part, the revenue piece, happen? Obviously on the pricing page and checkout, of course, right? And how does our referral engine work, and so on and so forth? So it helped a lot. And when you build new products, you don't have a lot of data, right? I mean, we didn't have so much data anyway, even for the polling part, which we fixed afterwards. So we did a whole
big initiative on setting up our data infrastructure. But then for the new products, you don't have so much data anyway. And then you try to quantify the pieces with data that is never going to reach significance, right? So how do you do that? The customer factory helped us a lot with that. That was good.
David J Bland (13:49.175)
I like the customer factory model. I think I've drawn that out a few times for people and credited Ash on that. A couple funny stories. One, I gave a talk in San Francisco and I didn't know the rest of the agenda of the people inviting me in. So I was one of the first people on, and I came in like, funnels are useless. Don't even think about funnels. Think of it as a factory, as systems thinking. Because Ash, I think he gave me one of the early copies of that book to preview, and I was like, this is really fascinating, you're applying systems thinking to...
you know, pirate metrics. And I give this big spiel, like, hey, funnels, completely useless, don't even think about them. And they're like, that was great. And they're like, next up, here's so-and-so to talk about funnels. I was like, oh no. Sorry, man, this is gonna be a tough, tough talk. But I think with that model, and I want to dig into this a little bit, the model works pretty well. I still like the model. I think it helps anchor your testing.
Büşra Coşkuner (14:29.262)
Do do!
Büşra Coşkuner (14:34.958)
That's cool.
David J Bland (14:47.648)
So for example, are we testing to increase acquisition? Are we testing to increase activation? Are we testing to increase retention? It doesn't happen often, but if I work with a company that is all in on metrics and how they measure, I feel as if it gives you a place to anchor your testing. It's like, you know what? Acquisition is looking great, but people aren't activating. So let's not spend all our efforts testing more acquisition channels; let's test how to get them to activate.
Büşra Coşkuner (14:51.864)
Mm-hmm. Yep.
David J Bland (15:17.09)
The deeper question is, what does activation mean for us? And so the way I try to lead people into that, and you probably do this similarly, is: what does acquisition mean for you? What does activation mean for you? And it can get pretty complex, because you can say, well, we have different classes of users, sub-segments, that activate in different ways or retain in different ways. And I like the quantitative side; I think where I've tried to add to it is the qualitative.
Büşra Coşkuner (15:19.949)
Yup.
David J Bland (15:45.581)
So I like the aware, hopeful, satisfied, passionate framing, which sort of maps to it. You could say acquisition, okay, yeah, I can measure cost per acquisition, but also, am I measuring awareness? Like, are we generating awareness? Are they becoming hopeful? Is that why they're activating? Are they staying around because retention means they're satisfied? And then are they referring because they're passionate? It doesn't line up exactly. I had this chat with Brant Cooper and then
Patrick Vlaskovits about this back in the day. And they said, yeah, I kind of get what you're getting at, but it doesn't really line up that well. But I think the reason I was doing it is I was looking for the qualitative side. Because I think what you're saying here is, yeah, we can implement that metric and we will have quantified information to look at. But the qualitative side, I think, is equally important in that system.
Büşra Coşkuner (16:35.106)
Yeah, absolutely. And let me say two things about that. The first thing is, I have seen how well it works in the majority of cases, but then there are many cases where the customer factory doesn't work. So what I do with my clients is I've taken it a step further, and
I'm basically using the R pieces a bit more loosely, and I model the way the company's business model really looks. And that can look very different from the customer factory. I have some examples of how I would map LinkedIn, how I would map
Miro, and so on and so forth. And I call it the success engine, based on Eric Ries' engines of growth. And why I mention that is because
you just mentioned that maybe we have to look at awareness. Maybe there's something in between that is more about engagement. There is maybe something in between that's more about the habit loop. And you can visualize very well how the business model really looks and how these pieces really feed into each other in the real world,
instead of sticking to the customer factory template. When you look at the pieces independently from the template, right, you model them and visualize them in a way that really reflects the business model. So that piece that you mentioned sparked this thought in me. And the second thing is the qualitative part. I absolutely agree. I mean, we keep saying, right:
Büşra Coşkuner (18:44.91)
quant can tell you what is happening, but only qual can tell you why it's happening. So when I say data, wherever I say data, I mean qual and quant. Like, it's both. Qual is data too. And especially in businesses where you have only a little data, it's so important to get qualitative
pieces in, right? So that's one of the main things that I keep telling my clients. As I mentioned, I ended up working a lot with B2B clients. Even though I have a B2C background, I work a lot with B2B now. And I hear this every day: we don't have enough data to quantify stuff. Well, you do have some data. Quantify
the qualitative part, right? Even if you don't have direct access to your customers, or you don't have any contacts at your customers, you have salespeople. And yes, they are super annoyed about you talking to them and asking them questions. But if you talk to five of them and
four out of five mention this one specific topic as the most annoying topic for their clients, well, you have a signal then. It's a directional thing, and that is good enough for you to understand that yes, there is something to look at. That is enough. If you don't have a lot of data and can't be super detailed about it, then quantify the qualitative part, and that's okay.
David J Bland (20:36.407)
Yeah, like the ring that up. That happens a lot in B2B. I'm wondering, what are the differences? You when you think about testing in a B2C, which it seems like that was your sort of like foray in the product management and that's where you kind of grew up in. And also B2B, since you work in both worlds now, what do you see the difference between testing in both? And you hinted at it there, but I'm curious what you're seeing with the companies you work with.
Büşra Coşkuner (21:06.22)
Yeah, so in B2C... and I learned it's actually not even B2B versus B2C, but environments with lots of data and environments with little data. It can be a B2C or a B2B company. For example, Miro is B2B, but it has lots of data, right? So you can analyze a lot, for example.
But let's keep it as a shortcut: let's keep B2B as a shortcut for not a lot of data, and B2C for a lot of data. So in B2C, when you say testing or experiments or experimentation, the default thinking is A/B testing. And I'm like, ugh.
David J Bland (21:55.488)
I knew you were going to say that. I knew you were going to say that. Why is this? Okay, I don't want to rant about this, but the A/B testing people... It's like with cereal, there's a certain type, or it's like the people that market cranberry juice, and now you have cran-everything. It's like, how did the A/B testing crowd co-opt the entire
Büşra Coşkuner (22:00.11)
Thank you. Do it, please do it.
David J Bland (22:22.719)
realm of experimentation? It's fascinating to me to watch.
Büşra Coşkuner (22:27.342)
I would love to know that. I don't know how it happened or why it happened, but I think it's a bit about this problem of, you know... I live at the intersection of the lean startup, product management, and agile bubbles. And we use many, many words where the same word stands for different things.
And then when you move from one bubble to the other and throw that word in, you think everybody knows what you're talking about, but it turns out they pick it up from their perspective. And then, you know, because these bubbles are mixing, which is actually good. I love that. They have to be mixing, because we're all doing the same thing, actually, and we're all trying to achieve the same. But the problem is then that, I don't know, the one group's
David J Bland (23:11.894)
It is.
Büşra Coşkuner (23:23.254)
definition of that word suddenly dominates, I don't know.
David J Bland (23:28.695)
It drives me a bit insane at the moment. So I was doing outreach, you know, trying to find trends in the industry, and I was like, okay, I'm going to interview people about experimentation. And I would reach out to people on LinkedIn, we'd get on a call, and I'd start mapping out what they're talking about. And it always led to A/B testing. It was always, you know, yeah, we're A/B testing across the company, or... it just...
Büşra Coşkuner (23:51.394)
I think it comes a bit from this trend in B2C. I mean, in the early digital world, let's say, what was the main thing that happened to all of us? E-commerce. And in e-commerce, there is something called conversion rate optimization. And what do they do there? A/B test the hell out of it.
And so when it starts from there and spreads into the software world, what happens when someone who was in e-commerce for quite a long time suddenly works on software as a service? Which was not the beginning of the internet, right? The beginning of the internet was informational websites like Wikipedia, and e-commerce. So when this person goes into software as a service,
what do they take with them?
The wording: experimentation, which in their world is A/B testing. So I don't know if that is true, but this is how I try to explain to myself how this whole thing happened.
David J Bland (25:04.827)
I wonder... and I'm not anti-A/B test. I actually think it's a valid method. I think where I get pushback is, I made the mistake of putting dots in my book, which still comes back to haunt me to this day. So I really revisit how I think about the visualizations and what I write now. But, you know, people treat the A/B test as the end-all be-all of experimentation.
It's the best kind of test you could ever run. And if I say, well, not really, it's context-specific, there's just so much more... I think I probably irritated a lot of people by taking a bunch of methods and calling them experiments, and they're like, well, that's just doing research. I was like, yeah, but you know, you're looking at your riskiest assumption and we're tying it back to that. So that's why I call it an experiment, and I use it very broadly. So I get that. But I think there are two extremes here. There's using it to
Büşra Coşkuner (25:35.553)
Exactly.
David J Bland (25:57.496)
describe everything you do, like everything's an experiment. Or the other extreme is, it just means A/B testing. And I just see the latter more now. It just feels like it's just A/B testing.
Büşra Coşkuner (26:03.65)
Yeah, yeah. And then coming back to the B2C thing, the main thing they are doing there is A/B testing. And there are companies that have an A/B testing culture, right? So any little change has to go through an A/B test. Even if it doesn't have enough traffic, it still has to go through an A/B test.
And if you dare not run an A/B test and just launch the change, or try a different type of test, then the data and experimentation people in that company stand by your desk asking, why didn't you do an A/B test? I-will-talk-to-your-line-manager kind of thing, right? So there are these companies out there. That's the one extreme:
B2C equals A/B test, and there is no other type of experiment. Then there's the nuance that B2C does not necessarily mean A/B test. Doodle, as I mentioned, was mainly B2C, but also very much B2-prosumer, right? So we had professional consumers, and the early adopters of the new tool set that I was building with my team were professionals who needed a tool that would work for them.
And at the same time, we had a big user base, which we used for testing our new solutions. Yes, but still there would be only a couple of thousand that would join the closed beta, for example. How would you get enough data for A/B testing in a pool of 2,000 people? You can't.
And even later on, when we opened up and had the open beta phase, you would still never get to significance there, or not in a short amount of time, right? So basically, the other nuance of B2C is you do run tests like fake door tests. For example, at Doodle, when our...
Büşra Coşkuner (28:22.262)
early users requested a preview feature that would be technically very difficult to build, we ran a fake door test with a preview button showing a pop-up saying, hey, thank you, we will count this as a vote for it. Is that okay, or were you just curious to see what happens? And then we would have two buttons: just curious, or yeah, count me in, kind of thing, right?
And then we would count how many people saw the page, how many people clicked the preview button, and how many people said, yes, I'm interested. Just to understand if that was a real request. Like, is this really a thing, should we really build it, or, you know, whatever, let's throw it away. And it turns out, yeah, people wanted it. So we built it. But then at the same time,
sometimes we would just build it. We had a ranked list of feedback coming in, and at the bottom, one person, or two I think, wrote: on the page where I invite someone to book on my calendar, I would like to have a text field where I could write a message.
It was not our priority, but we knew we had a problem with the ratio of people that would copy the link to the calendar and send it out to people elsewhere, like via email or whatever, versus inviting them into Doodle so that we would also have that person in the system, basically, for the network effects, right?
David J Bland (30:15.967)
Yeah.
Büşra Coşkuner (30:20.92)
We were like, hmm, okay, I mean, you know, we could just build it and see what happens, because what is it? It's not risky. It's just a text field. Worst case, we remove it, and it will not destroy anything in the system. Let's just build it and see what happens. And boom, we improved the ratio from one-to-four to one-to-two, right? So sometimes building is the test, but then you have to treat the
release as the experiment. And it's not as clean as A/B testing, obviously; there is no variant to compare it to. But anyway. And then sometimes... we have tested the name of the tool.
David J Bland (30:57.439)
Yeah.
David J Bland (31:02.965)
Right.
Büşra Coşkuner (31:14.444)
And that was a very qualitative test, right? So for Bookable Calendar, we collected some ideas: how should we call this tool? And then we opened up one of those remote usability testing tools that we were using, and we asked questions like, what do you think this tool does when you hear this name? And:
here's a description of what the tool does; which of these names matches best with what the tool does? And so on and so forth. We had a couple of questions, and then there was a very much data-informed decision to call the tool Bookable Calendar. Now in hindsight, when you look at the initials, BC, there was a rumor that I chose the name Bookable Calendar because of the initials BC.
It was just coincidence.
David J Bland (32:14.994)
Okay. But I think you're onto something here. Sometimes you just build. And I love that people want to test more than, let's say, in 2010. I love that I'm not having to convince people of it. But at the same time, you don't treat everything as a test. I mean, that would be mentally exhausting, and some things just don't warrant it. I mean, with some of the things, you just weigh the
Büşra Coşkuner (32:23.384)
Sometimes just.
Büşra Coşkuner (32:40.824)
Yeah.
David J Bland (32:44.032)
cost, especially today with AI and everything, which we can get into as well. But getting things out there sometimes doesn't have to be this giant process. Sometimes you just build it.
Büşra Coşkuner (32:57.442)
Yeah, absolutely. And then there's the B2B side of this. And yeah, let's talk about it, because I don't fully agree with the product world that says, now that we can build with AI, we have to think a lot more about the problem to solve. I partially agree and partially don't. But anyway, hot take. Okay, do you want to hear this first, or the B2B version of testing?
David J Bland (33:14.533)
Okay.
I want the hot take, so what is the disagreement here?
David J Bland (33:25.878)
this is very conflicting. Let's come back to this after the B2B version of testing.
Büşra Coşkuner (33:32.394)
Okay. Okay. So, B2B testing. Many times it starts with only interviews. Yes — that's the biggest thing you can do, because you don't usually have contact with your customers. But if you are in the product team and you need something, then we start with the thing that, A, is easiest to do and,
B, might open up some doors, which is: talk to your salespeople. Hopefully they do win-loss analysis, so they have some data on why things get rejected. They have relationships as account managers, so they will be able to tell you how the relationship with the customer is.
They can do this kind of thing. So start with this type of conversation — and yes, they want you to solve some problems for those customers that keep complaining about stuff, right? So get that out of their brains. That's one thing. The other thing is, one client of mine, for example, is building a zero-to-one thing, but he's testing the features of the zero-to-one thing separately. And he has a dashboard
spreadsheet where he basically says: the experiment is — for example, the hypothesis is — this feature that we're building is going to help create the painkiller feeling for the tool that we're building. Because so far, the tool itself has been considered a vitamin and not a painkiller. So what he does is he builds a prototype with Lovable, for example.
In B2B, yes, you can do that.
David J Bland (35:29.692)
I'm a fan of Lovable. I don't hear as many B2B stories, so I like this.
Büşra Coşkuner (35:34.412)
Yeah, so he built this feature in Lovable and embedded it in the tool that was built with real code, let's say, quote unquote. Yeah, for everyone who's listening: real code, quote unquote. He embedded it in their new tool. And then you have to be best friends with your sales.
So he's best friends with his sales buddy, and basically they tracked the funnel. A B2B funnel means you have a first call, maybe you end up having a second call — maybe only with that same person, ideally with a second person. Maybe you end up with a third call, and maybe in the third call there's someone who either owns the budget or has some say in the whole decision-making process.
And then maybe there's a fourth call, and then maybe there's a sale. So there is a funnel, and he tracks the funnel: how much did this new feature help move through this funnel, from step to step, up to which step, and how fast? So he's tracking how fast he or the salesperson goes through the funnel with the customer — or how fast the customer goes through the funnel, basically. And this is super cool.
David J Bland (36:57.418)
Yeah, okay.
Büşra Coşkuner (37:00.884)
And basically the salesperson has to score how helpful it was. Like: did I show this feature to the customer, yes or no? How helpful was it to convince the customer that this tool is super cool, and how helpful was it to make sure the customer goes to the next step in the funnel?
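The funnel-scoring spreadsheet she describes could be sketched as a small script. This is an illustrative reconstruction, not the client's actual setup: the stage names, fields, scoring scale, and deals are all made-up assumptions.

```python
from dataclasses import dataclass

# Illustrative B2B funnel stages; the real funnel may differ.
STAGES = ["first_call", "second_call", "third_call", "fourth_call", "sale"]

@dataclass
class Deal:
    account: str
    stage_reached: int   # index into STAGES: how far this deal got
    feature_shown: bool  # did the salesperson demo the prototype?
    helpfulness: int     # salesperson's 0-5 score; 0 if not shown

def conversion(deals, from_stage, to_stage, shown):
    """Share of deals (with feature_shown == shown) that reached
    from_stage and went on to reach to_stage."""
    pool = [d for d in deals
            if d.feature_shown == shown and d.stage_reached >= from_stage]
    if not pool:
        return 0.0
    return sum(d.stage_reached >= to_stage for d in pool) / len(pool)

# Made-up deals for illustration.
deals = [
    Deal("Acme", 4, True, 5),
    Deal("Globex", 2, True, 3),
    Deal("Umbrella", 1, True, 2),
    Deal("Initech", 1, False, 0),
    Deal("Hooli", 0, False, 0),
]

# Did demoing the feature move deals from the first to the second call?
with_feature = conversion(deals, 0, 1, shown=True)      # 3 of 3 -> 1.0
without_feature = conversion(deals, 0, 1, shown=False)  # 1 of 2 -> 0.5
```

Adding a date per stage to each row would also answer her "how fast" question; here only funnel depth is tracked.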
And he took notes as well. And this is amazing, this is fantastic — this is how you can test in B2B. As an example. Another team, for example, did clickable prototypes and reached out to the customers they were in contact with — the employees of their customers — and basically showed the whole vision thing: this is what we think of.
David J Bland (37:40.842)
Yeah, I like that thought.
Büşra Coşkuner (37:59.608)
Give us feedback. They had only one chance to talk to them, and that's what they made of it. So they did a mix: a problem interview at the beginning, then they showed the prototype, and then they collected the feedback. And that's it — that was the one-shot thing they did. But they did it. And then they counted how many times one topic or another was mentioned.
A third example, on an AI feature: another team is building a new AI tool that is basically about collective — I need to be careful not to say too much about it, but let's say a collective space to discuss, well, the purchase, let's say, but like a self-serve thing.
And the main thing was: how do you know that they need something like this? Two things. One — they see that today, and that is some sort of research, but that can be your test, right — they see that today these people are connecting on other channels, like they do group
conversations on LinkedIn or group conversations on WhatsApp, which they shouldn't be doing, because that's the business world, right? So you shouldn't be doing that. And that is a strong signal that they are looking for some way to have a space to discuss something collectively. And second: well, we will run an experiment with a link to a custom WhatsApp group,
so that they don't have to set up the custom WhatsApp group themselves. And we will see if they say: yeah, finally there's someone who actually supports us in connecting, and we don't have to do it ourselves. So that is the experiment: we will send them a custom WhatsApp link and not build a custom tool for that in the first step. It can be so easy. And people often think it...
David J Bland (40:19.125)
I think it's cultural. Yeah, I think it's cultural. I think you can be really creative in B2B testing. You know, if you tap into people's creativity, there's a lot you can do.
Büşra Coşkuner (40:24.216)
Yeah. And just to mention, all of these examples are in very niche B2B markets, where the market is, what, 200 potential clients in total, maybe 300, and that's it. Super niche. And you can do that.
David J Bland (40:48.339)
I love that. I feel as if lately it's been switching costs that I've been trying to help sales with. I think sometimes in B2B sales, everything sounds great, but then we forget — or maybe we just don't make it a priority to understand — even though they love the solution, whether it costs them like five to ten million dollars to switch to the new one. That's something we need to uncover sooner versus later. And I saw Alex Osterwalder — he posted the...
Forces diagram a few weeks ago, from Bob Moesta. And I still use that to this day. I still use it because we're trying to think of what's going to help make the switch and what's going to prevent it. And this is a test — you can think of it as testing, you know, especially in B2B. Where in B2C, yeah, switching costs matter maybe not so much, but in B2B, it can be millions of dollars to switch. And that is something you want to test for sooner versus later. So I think...
Büşra Coşkuner (41:43.266)
slowly.
David J Bland (41:44.918)
I'm loving that you're sharing stories about B2B, because I do think we need to get more of these stories out there: that yes, it's possible; yes, you can be creative; yes, maybe you don't have as much data. And I like your framing of having data versus not as well. Maybe that's even more useful than coming back to the old B2C versus B2B debate, I'm wondering. Yeah, it is, it is. So, OK, so at the...
Büşra Coşkuner (42:04.184)
But it's a mouthful, so it's very difficult to coin that.
David J Bland (42:13.457)
at the risk of making this an hour long podcast. What is your hot take on AI and testing? If you will indulge me on this.
Büşra Coşkuner (42:23.348)
Yeah, so.
I work a lot.
in — or let's say my brain works a lot in — first principles thinking. Basically going down the rabbit hole backwards and trying to understand: why are we doing X, Y, Z? What is the initial reason that made us do X, Y, Z? What is the initial reason that made us test?
What is the initial reason that made us understand the problem space very, very well before we built? What is the initial reason that we keep saying you should validate — or evaluate, whatever — the problem space and the solution space before you write any line of code? Because writing code was expensive. It's not anymore.
So, two thoughts on this. A, or one, whatever — the alphabet, let's go down the alphabet. No: one. And maybe there will be a third or fourth while I'm talking, let's see. So, the main point is, one: coding was expensive. Pivoting was expensive.
David J Bland (43:36.211)
Ha ha ha.
Büşra Coşkuner (43:59.692)
Because coding was expensive. Changing your mind all the time destroys trust, and so on and so forth. Okay, that one — yeah, okay, it depends on the context, and in niche B2B, yeah, maybe. But now building is not that expensive anymore. Why shouldn't we just build and see what happens? I mentioned this:
why didn't we just build it? Why didn't we A/B test the effect of it? Because it was risky. What is risk? Risk is: well, we built something and it blows up the whole product. Nope. Or: it's so complex that it's going to take us a long time to build it, which means
a lot of money to throw in. Nope. If we build it wrong, then the customers will be upset? Maybe — it depends. And then, if the customers are upset, it will take us a long time to fix it? Nope.
So what is risk now? First principles thinking. Now the risk depends on your very own situation. If you are a company that... well, in B2C there is no risk, sorry. There is no risk at all anymore. Because consumers are okay when you mess up one small thing and then you fix it an hour later, or a day later, whatever.
I mean, when we took Doodle 1:1 out of beta, it was down for a month. Yes, it was down for a month, because there was some update that we didn't see. We were at the same time refactoring the whole old system, and there was one API call that was basically destroying everything, and we couldn't find it. It took us almost a month — three weeks or so — to find it.
Büşra Coşkuner (46:14.882)
Who was upset? Yeah — not many people. And even once it's up again, well, yeah. In B2B, what is risk? Well, reputation risk. Yes, maybe. But at the same time, that means your position, your image, wasn't solid enough.
And what is the reputation risk? When does the reputation risk happen? Well, if you work in a very conservative industry, the reputation risk is real. When you work in a regulated environment, there is not only reputation risk but also legal risk, yes.
And now you see, we're talking about risk: viability, feasibility, and desirability. It's the feasibility stuff that is maybe risky.
But other than that, there's not much risk anymore. Why would you have to test so many things to make sure that you've worked on the right problem, if you can build a solution and, with the feedback on the solution, see whether you were working on the right problem or not?
Büşra Coşkuner (47:48.8)
That's my point about — yes, I agree, we need to make sure that we work on the right problem, because if it's not the right problem, nobody will give us money to use the product. Or if it's not about money but usage, then nobody will use that product or feature. It's not only about products — we're not only talking about zero-to-one products, we're also talking about features, right? And nobody will use it. Okay, yes.
But you can figure out if you were working on the right problem
with the solution. Build the solution and see if it was the right problem. Again: make sure that the release is the experiment. Make sure that you know what your assumption was, what your hypothesis was, what your success metric is — there we are again — that will tell you that you are onto something. Don't just throw it out and see what happens, right? The spaghetti-at-the-wall thing.
Define it upfront, and then see if it was the right problem to work on or not. You can iterate very fast now. I mentioned before we started recording that I'm running interviews with product leaders whose companies went through AI adoption. And one of them said something that still sits with me. She said:
We've never been as agile as we are right now because we can now switch, change our minds so quickly with AI coding.
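Her "the release is the experiment" discipline — assumption, hypothesis, and success metric written down before you ship — could be sketched as a small record. The fields, threshold, and example values here are hypothetical, not from the episode's actual client work:

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class ReleaseExperiment:
    assumption: str        # what we believe about the problem
    hypothesis: str        # if we ship X, we expect Y
    success_metric: str    # the number that tells us we're onto something
    target: float          # threshold, defined BEFORE the release
    observed: Optional[float] = None  # measured after the release

    def verdict(self) -> str:
        """Kill / pivot / persevere call, only once the metric is measured."""
        if self.observed is None:
            return "not yet measured"
        return "persevere" if self.observed >= self.target else "kill or pivot"

# Hypothetical example in the spirit of the WhatsApp-group experiment above.
exp = ReleaseExperiment(
    assumption="Customers want a shared space to coordinate purchases",
    hypothesis="If we send a ready-made group link, groups will self-organize",
    success_metric="share of invited groups active after one week",
    target=0.3,
)
exp.observed = 0.4  # filled in once the release has run
```

With `observed` at 0.4 against a `target` of 0.3, `verdict()` returns "persevere"; below the target, it would return "kill or pivot" — the release itself is the test, but the pass/fail line was drawn before shipping.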
Büşra Coşkuner (49:33.932)
And it doesn't matter if you're in B2B or B2C — that's my hot take on that. You don't have to think so much about the problem space anymore, or make sure that you are really working on the right problem. You don't have to do that.
David J Bland (49:49.429)
All right, so I just want to do a time check before I respond to that. We are a little bit over — are you OK to chat a few more minutes? OK. I'll edit this part out, by the way. OK, so, I'm going to throw a quote at you in response, one that I heard last week as a...
Büşra Coşkuner (49:52.866)
Yeah. I'm okay. Yeah.
David J Bland (50:08.98)
as a pushback on this, and then we'll wrap up on what you're excited about and how people can reach out to you, because I know I said this was only an hour. But I want to make sure — we could debate this for a while; maybe there's another forum where we can debate this as well. OK, so we'll pick at it now.
Büşra Coşkuner (50:25.698)
Okay.
David J Bland (50:29.734)
So the pushback I heard as recently as last week on this, and I think it was, I was talking to Jeff Patton about this, who I admire, he's influenced my career quite a bit. And he said, with the speed that we're doing this, imagine you go into a restaurant and you look at the menu and you pick something and what you eat is horrible, or it's not like amazing. And then the next day you come back to that restaurant and the entire menu's changed.
And you pick something else, and it's also not very good. Your customers aren't going to keep coming back to that restaurant. So I think the point he was trying to make was: yes, we can change things faster, but the quality still matters, and the problem we're solving still matters. And with throwing spaghetti at the wall — it can't be infinite spaghetti at all the walls all the time. So I do think there's a balance to this as well. I hear what you're saying, but I do think there's...
We have to be careful of not taking it to the extremes.
Büşra Coşkuner (51:33.026)
That I agree with, 100%. And so, as I mentioned, take the release as the experiment and make sure you know what you are testing there. I see every release as a test. And if you see every release as a test, then it's definitely about quality. So it's not about just randomly building stuff and throwing it out there. It's about,
again, informed decisions. There is a reason why you initially thought you might want to build the solution. And instead of spending one month researching whether that's a good problem to solve or not, dig into your data again. And this is very difficult — this is senior-level stuff, right? To make sure that you don't fall victim to your own bias. That, for sure. But...
You have an initial assumption on why you went down that path, and you don't need to extend your solution testing by another four weeks to figure out if you are working on the right problems. I fully agree that the problems we're working on matter — even more than before, I would even say.
I said this when No Code came out six years ago, and I was fully involved in the No Code movement — I learned to no-code, basically. And I even have an article on my blog that says: yes, there is a big risk that we build even more shit out there.
And then it's clutter, internet clutter, and that stuff is out there and nobody cares, kind of thing. So if you don't have the guts to also take it down, then don't put it up at all. And I think AI accelerated that. There's a lot of bad stuff out there, and it's going to take some time until people learn to not just ship every bad idea that they have in their mind.
Büşra Coşkuner (53:56.972)
You really have to have this testing culture. Let's put it this way: the testing culture of admitting, this was bad, I will take it down. If you don't have that testing culture to also take stuff down, then I am still on the side that says: well, then please validate the problem first before you build anything. But if you have that testing culture and the guts to say, we were wrong, we will take it down,
then you can test your problem with the solution. That's what I'm saying. I'm not saying the problem space doesn't matter anymore, or that it's not important to make sure we are working on the right problem. I'm saying that there is a new path to understanding whether we are working on the right problem or not, and that is by getting to the solution faster.
David J Bland (54:45.748)
I like that framing. I think something you said there is really important, in regards to throwing it away — being willing to throw it away or take it down. Some of the most successful products I've ever advised on — we're talking tens of millions of customers — we threw away the MVP before we ever scaled it, because it was never meant to scale. You know, we were testing it to learn: is there a fit, do we have the right segment? OK, let's understand how they're using it and generate all that evidence.
Büşra Coşkuner (55:08.387)
Yep.
David J Bland (55:15.528)
And we were like, you know what? Let's throw that away, because now we can rebuild this in a way where we have enough evidence to be confident in informing the design of the rebuild. I don't think that's common yet. I still think we fall in love with things really quickly. And I used to think it was always the time, the effort we put into it — that's why we're in love with it. No: I watch people vibe-code stuff in 30 seconds and they love it immediately. So I don't think it's the time anymore.
There's something deeper there we have to address, about our investment in manifesting this thing from our heads into the real world. It's almost like we're in love with it before it even gets to be a product. So I do think we have to be very, very careful of that. And I like that framing: you have to be willing to take it down, or you have to be willing to throw it away. I think that's a good threshold to call out. And I like that.
Büşra Coşkuner (56:08.075)
Yeah. Yeah. And that brings us back to balance, right? Because I agree with you totally — you have to have some balance. And that's maybe also one of the things our product community has to learn: to not think in extremes. When you go on LinkedIn, everybody's on one extreme, right? And even my hot take is not to be taken to the extreme. I mean...
It's more about saying, hey, there is a new tool in your toolbox. And of course you have to find the balance. It's also about strategic fit, or strategic thinking, right? The more strategic the topic is, the more risky it is. And there you cannot just throw out a solution and say, let's see how that works. You cannot do that. So you have to
make sure that the problem space, or the opportunity you're looking at, is solid. You have to do that before you get to your solution. But if it's about daily stuff, like a small feature — well, don't exaggerate. Try things out, treat it as an experiment, treat it as a test, and then see if it works. And if it doesn't, then take it down or iterate, right? That's the other option. Kill, pivot, persevere,
in the end. So it's very much about balance. Product people, don't think too much in extremes, please.
David J Bland (57:43.381)
Yeah, I hear you. I like ending on that note of balance. You know, in the olden days it was more like: well, we can't test anything, so we're just building this giant thing and finding out if it works. Then we shifted to: let's test everything. And not everything needs to be tested. So I do think it's about balance. I want to thank you so much for hanging out with us. We covered so much.
Büşra Coşkuner (58:03.138)
Yeah, absolutely.
David J Bland (58:10.258)
I always learn so much just by hearing you speak and reading what you write. We learned about how you came into product management, how you kind of grew up in B2C, how you tested stuff at Doodle. I love the stories you shared: how B2B works compared to B2C, how we think about experimentation and the words we use, and how we find balance. I really, really appreciate your point of view. If people are listening to this and they're like, hey, I need
her help, I'm in this company and I feel very alone — what's the best way for them to reach out to you?
Büşra Coşkuner (58:41.208)
Yeah.
Büşra Coşkuner (58:48.47)
Yeah, so there are two ways. One is LinkedIn. Obviously I'm very, very active on LinkedIn — I don't know for how long, to be very honest. I'm like, yeah, LinkedIn isn't fun anymore. Today I had to remove my sparkle — they forced me to remove my sparkle, the emoji in my name. So I put it in my headline; I'm the one with the stars. Anyway.
So LinkedIn is one way. The better way is probably email — you can write me an email, you can leave people my email — or through my website; my email is in the general terms, so you can find it there. You can read my blog and subscribe to my newsletter on my website. So it's B-U-S...
Yeah, I'd love to hear from people.
David J Bland (59:52.594)
Yeah, definitely. So we'll include all those links in the description and also on the detail page. Thanks so much for hanging out with me — I thoroughly enjoyed our conversation. I was looking forward to it all week, because I learn so much just by chatting with you. So thanks.
Büşra Coşkuner (01:00:02.166)
Me too. Yeah.
Büşra Coşkuner (01:00:06.85)
Thanks.