It's All a Judgment Call
Chances are, you’ve heard Jason Fried and David Heinemeier Hansson stress the vital role of trusting your instincts in business.
This week, host Kimberly Rhodes takes a deeper dive into the heart of human decision-making with the co-founders of 37signals—exploring how the interplay of intuition and accumulated experiences becomes your compass for navigating the complex landscape of business decisions.
Join Jason and David as they unravel the intricate dance between data-driven decisions and the art of making judgment calls, guiding listeners through trusting your gut and breaking free from external pressures when making product and pricing decisions.
Listen in to discover how the power of experimentation and the ‘try it and see’ approach often unveil valuable insights, with reality as the ultimate judge.
Check out the full video episode on YouTube
[00:28] - The limitations of data-driven decisions
[03:00] - The magic of intuition and the black-box nature of gut decisions
[04:53] - The role of market feedback in evaluating the effectiveness of decisions
[06:39] - Cutting through the noise in decision making
[09:00] - Experience and judgment in decision making
[11:39] - Testing pricing and the importance of experimentation
[15:45] - The crucial role of gut feelings
[17:51] - Accepting uncertainty and avoiding perfectionism
[21:10] - Utilizing momentum and adopting a ‘try it and see’ approach
Rework is a production of 37signals. You can find show notes and transcripts on our website. Full video episodes are available on YouTube and X.
If you have a question for Jason or David about a better way to work and run your business, leave us a voicemail at 708-628-7850 or email, and we might answer it on a future episode.
Links and Resources:
From Jason’s HEY World: It’s all a judgment call
Books by 37signals
Sign up for a 30-day free trial at Basecamp.com
HEY World | HEY
The REWORK podcast
The Rework Podcast on YouTube
The 37signals Dev Blog
37signals on YouTube
@37signals on X
37signals on LinkedIn
Kimberly (00:00): Welcome to Rework, a podcast by 37signals about a better way to work and run your business. I’m your host, Kimberly Rhodes, and I’m joined by the co-founders of 37signals, Jason Fried and David Heinemeier Hansson. If you’ve listened to the podcast at all, you’ve heard Jason and David talk about trusting your gut in business. This week we’re going to talk about that. Jason, you recently wrote about this on your HEY World in a post called It’s All a Judgment Call. Kind of tell us about that.
Jason (00:28): Yeah, I was having a discussion with someone about data-driven decisions and whatever, and it just like, it kind of dawned on me that, okay, so you got some data. Is that the only thing you’re making this decision based on? Really? Truly? Just the data? That’s it? Well then why not just have a computer do it? And then the person’s like, whoa. Because it’s like… exactly, because you have experience, you have judgment, you’ve seen other things. There’s a million points of view that you have. You don’t even know where they came from. You don’t know why you were influenced to make these decisions, but you’re making a human decision. A human decision is a judgment call every single time. It’s a judgment call. You’re weighing all sorts of things. You don’t even know what you’re weighing, but you’re weighing millions of different data points and not just the data that’s in front of you.
(01:12): And I started thinking about, well, why do companies hire when they hire executives? Who do they hire? Do they hire people who are really good at Excel? No, they hire people who have good judgment and good experience. They’re looking for experience. What is that? Experience is a very fuzzy, blurry thing. It’s not cut and dry. It’s like, I’ve seen this and I’ve seen that and I’ve been around this and that was around this when that happened and I saw how this came together. These are all things that color your point of view, and then you bring that point of view to the next thing and you paint with that. That’s what people actually want. So I just feel like it’s time to be honest about that. Everyone’s trying to find certainty and scientific decision making and whatnot, and there’s some data that’s obviously more correct than others, but that’s not the only thing that makes a decision. And if it does, might as well be a machine.
Kimberly (02:01): Okay, so a lot of gut decisions. To me, that’s intuition and harnessing what you know to be true. How do you guys take that step of trusting your intuition and trusting your gut all the time?
David (02:15): I think part of it is that it’s a privilege to be able to do so. And a lot of the objectification, a lot of the rationalization comes when people are trying to impress other people with their thought process, if you will. It’s not even the thought process, because if it was just that, it would be more benign. It’s trying to rationalize, trying to bring proof: oh, I made this decision because A, B, C and D. Yeah, you can only present the known factors of that decision to other people. What about all the other stuff that just lives in your gut? I mean, the reason we call it gut is that it’s literally not your head, that there’s a computer inside of your stomach that you don’t even really know how it operates. It’s a little bit of a black box. You feed it some problems and it’ll spit out an intuition.
(03:08): It’ll spit out a direction that goes like, you know what? I think we should go over here. I have a hunch, I have a sniff that this is the path forward. I can’t explain what that is. It’s funny, my oldest kid was talking about how they do math, and a lot of how you do math is you have to show your work. So you’re showing each step and you go through it, and that makes sense when it’s a math problem. The variables are known, they’re fixed, they’re bounded. Almost nothing in the realm of business has that privilege, where you get to hold all the variables constant and just tinker with a few. No, you have to deal with the fluidity of the situation, as people say. That fluidity just means that there’s no fixed point. You can’t just stand in one position and go like, all right, here’s my five inputs and here’s my two outputs, and that’s what it is.
(03:59): You don’t even know. But the magic to me is that you don’t have to know. If that gut computer you have has been trained on a, now I’m really pushing the metaphor here, steady diet of experiences and different tastes, and you’ve been in a lot of different places, you can learn to trust it. And I think that’s what we’ve essentially allowed ourselves to do: to trust that that black box is not going to reveal all its magic to us in terms of how it reaches its decisions, but it’s going to reach good decisions. And this is where I see the clearest contrast to people who are very good at articulating what they know and their inputs. There’s a million MBAs who can tell you exactly how to run through this formula, who can do the Boston Consulting grid: oh, you’ve got to do this and then you’ve got to do that.
(04:53): And they suck. They fucking suck. They cannot make a good decision that leads to a great outcome to save their damn life or the life of their company, which is why so many companies who end up with professional managers go down the damn tubes. They go down the drain. They can explain all the things you’re supposed to do, and none of it fucking works, because they don’t have a tuned black box in their gut to guide it in the right direction. And at the same time, you can see entrepreneurs and founders and so forth being unable to articulate: why do I do the things that I do? I don’t know. I just have all this input, but I make good decisions. And that’s ultimately what it is. This is why business is such a satisfying domain to be in, because you get the verdict back from the market.
(05:39): Are you making good decisions? I can’t articulate it, I can’t even predict it, but you can trend it sort of backwards. You can have a moving average. There are entrepreneurs, not just entrepreneurs, but business people in general, who are phenomenal business people, who if you have a conversation with them, they’re not necessarily the 140 IQ super whizzes, right? In fact, I’ve seen some studies show it’s almost a detriment to you if you are too highly attuned to the intellectual rigors of whatever. You lose sight of all those intangibles and you’re actually a worse business person. So I find that learning to trust your gut is first of all the realistic thing in business, because we have no alternative. We don’t get to fix the variables. We don’t get to live in the Harvard Business case study domain where we can just look backwards. No, we have to jump into the unknown with the only real compass that we have with us, and we have to trust that.
(06:39): And the sooner you get to the reality of that, the more bullshit you can cut out, because that’s really what it is when it comes to the bottom line here. There’s so much bullshit in rationalizing and projections and Excel spreadsheets and all this stuff that goes into this decision-making process. And to get concrete with this, we had an example at 37signals for a while when we were trying to pick new prices for the product, and we were like, it’s just time to experiment. We’ve had the same prices for a long time. Let’s try something. We tried the heavy rationalization approach, and I’m partial to this stuff. I love statistical significance. I love the rigor of it. And I can also look back upon that decision-making process and go like, you know what? We should just have gone with Jason’s gut. We would’ve been done in fucking 15 minutes. Instead, it took three months to arrive at the same goddamn outcome.
Jason (07:34): Something else I want to add by the way, is because I’ve gotten into debates with people about this often, in fact. They’re like, well, we did all the market research. We asked, we surveyed 500 customers. We did these surveys, these extensive surveys with 80 questions. We have all this data. Are you telling me we shouldn’t follow it? And all I would say is, where’d the questions come from? So you asked 80 questions. Why’d you decide those questions? And then they go, well, because those are the ones… It’s like, well, what if I asked 80 different questions? Why those 80? It’s all a judgment call. You thought that those 80 questions were the important questions to ask a customer. I might think there are different 80 questions to ask a customer or only 30 or only six that mattered. These are judgment calls. So if you just strip back a few layers, it all comes down to, I don’t really know.
(08:27): We’re making our best guess here. We are bringing our judgment to bear, and we think that these are the questions we should be asking. The data you get, you might feel, is certain, but it’s still based on something you thought to ask. Where’d that thought come from? Experience? What’s that? So you just have to go back a few layers here and you realize it’s all very, very soft. It’s not concrete. It’s very soft, it’s very blurry, it’s very muddy, it’s very full of unknowns, and these things are not based on solid foundations, even if you have the answers that you think are. The questions themselves came from a really muddy, blurry, obscure place. So anyway, that’s how I’ve approached this, and that’s kind of what the writeup was about.
David (09:18): And I think the fundamental misunderstanding here is that there are people who think all this rigor is valid or applicable because what they’re doing is actually hard science. We’re being very scientific about this stuff. It’s almost like we’re doing physics or we’re doing chemistry. Are we doing, god forbid, math? No, you’re not. You’re doing fucking sociology, psychology. Do you know what’s been going on in the social sciences for the past 50 years? A lot of fucking bullshit. Half of it, just off the top, half of it. All the famous studies you’ve heard about? They don’t replicate. No one knows anything. No one can repeat any of the fundamental insights that we are supposed to have learned from the social studies for so long. That’s what business is. It’s social studies, and no one can agree on what the terms are. Everyone dresses this stuff up in statistical rigor, and it’s P-hacking out the wazoo.
(10:07): It’s picking your conclusions after the fact. It’s all this post-rationalization. I want this certain outcome and I’m going to backtrace my way into the steps I need to get it. Just skip all the steps. Just go straight to your preferred outcome. Test it against reality. Conclude that one experience or experiment, within this context, had a verdict. Do you know what? I can extract 10% rough guidance from that. I can’t extract life lessons that I can apply universally, even in my own damn business. This is one of the reasons I enjoy testing our priors so much. This pricing experiment I referred to was one of those examples. We’ve tested pricing many times in the past. You know what? A pricing test from 2013 is not worth a whole damn lot. It’ll tell you something in some abstract ways, or at best prompt you to put some good questions to the market, but you can absolutely not use data that’s 10 years out of date, because it’s just out of date. All the variables changed, all the circumstances changed, all the expectations of customers changed. So accept the fact that you are in the social sciences, and that the social sciences are mostly bullshit most of the time.
Kimberly (11:19): Okay, so David, you mentioned pricing when it comes to Basecamp, which makes me think of this new once umbrella and those kind of pricing decisions, which there’s no history to go back to like there is with Basecamp. How do you make that kind of judgment call when it comes to something like that? For either of you?
Jason (11:37): Pricing you mean?
Kimberly (11:38): Yeah.
Jason (11:39): You pick a price that feels reasonable, it takes three seconds, and you’re done. You put it out in the market. You see what happens. Hey, we’re selling enough? Great. We’re not selling as many as we thought? Could it be pricing? Could it be something else? I don’t know. We could try this then. The whole point is, if you want an answer, you’ve got to do the thing, put it out in the market, and that’s where you get your answers. You don’t get your answers from research in this case. You don’t get your answers from surveying. You don’t get your answers from asking people, what would you be willing to pay? Because it doesn’t cost anyone anything to give you an answer. The only answers that matter are the ones that actually cost something. You put it out in the market for $299. If people buy it for $299, the answer is yes. If you say, would you pay $299? and they say yes, well, it costs them zero. So it’s not an answer. That’s how you do it.
David (12:28): I was just on a call with a couple of entrepreneurs running a business where we were talking about exactly this pricing stuff, and they were like, oh, we’re trying to find the right advisors, the right advisors who can tell us what to do. And I’m like, no, no, no, no, no. Absolutely not. Do not take anyone’s advice. Where did they pull that advice from? Their own circumstances in their own business, which has a million variables that do not transfer to your context. The only way to get it is to just test, and start tomorrow. They’d been like, oh, we’ve been talking about pricing for a few months. What do you mean? No, no. Start testing tomorrow. You get new customers, especially if you’re dealing with web-based stuff. I mean, we have new customers coming to basecamp.com every single day. Thousands of them who are fresh. They’ve heard of Basecamp, they’ve gotten a recommendation, whatever.
(13:16): They’re going to see our offer more or less blind for the first time. We can change that offer from day to day. Now, there are some secondary effects and dah, dah, dah, dah. But fundamentally, you should look at these pricing tests as isolated experiments. They are actually relatively replicable within your own context, within a narrow timeframe. So you can test one thing one day and another thing the next day, and you can be reasonably confident that that’s going to give you some answers to go on. But that’s about it. That’s what transfers from one context to the other. And this is why advisors and other inputs into this are just mostly useless. Start with your own intuition. That’s going to give you the foundation for the experiment, and then run the damn experiment.
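For listeners who want to make David’s “isolated experiments” framing concrete, here is one minimal way to sketch it: compare sign-up conversion across two different price days with a standard two-proportion z-test. All the numbers, function names, and prices here are made up for illustration; this is not 37signals data or code, just one plausible way to read a day-over-day test.

```python
import math

def conversion_z_test(signups_a, visitors_a, signups_b, visitors_b):
    """Two-proportion z-test: did price B convert differently than price A?"""
    p_a = signups_a / visitors_a
    p_b = signups_b / visitors_b
    # Pooled conversion rate under the null hypothesis of no difference
    pooled = (signups_a + signups_b) / (visitors_a + visitors_b)
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (p_b - p_a) / se if se > 0 else 0.0
    # Two-sided p-value from the normal CDF (via math.erf)
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return p_a, p_b, p_value

# Hypothetical numbers: day one at $299, day two at $349
rate_a, rate_b, p = conversion_z_test(90, 1500, 70, 1500)
print(f"$299 day: {rate_a:.1%}, $349 day: {rate_b:.1%}, p = {p:.3f}")
```

Note that this only quantifies one narrow-timeframe comparison, which matches David’s point: the result is rough guidance inside your own context, not a universal lesson, and a large p-value just sends you back to your gut for the next experiment.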
Jason (14:01): The other thing is that people really are surprised when I tell ’em this. Literally, pricing decisions, we just make up. There was not a meeting about this. There was not a two-hour conversation about this. We didn’t put 17 different options on the table and debate each one. We’re like, I don’t know, $299? Or, I dunno, maybe it was even $499 at first. How about $299? Yeah, that sounds pretty good. Let’s try that. I mean, literally, honestly, it’s like a minute to three minutes. That’s it. And most decisions should probably be like that, frankly. I mean, the idea that you have to sit there and debate something that you’ll never know the answer to until you do the thing anyway. Just do the thing. Use your best judgment, whatever that is. And people ask, what is it? I don’t know what it is. It’s your best judgment. Make a decision, have a feel, feel it out, go for it, test it on the market, see what happens. That’s your answer. There’s no other way to get there.
David (14:57): And this comes to the point of leaning on that gut computer. You’ve been training your gut computer your entire life, but even within a given domain: Jason and I have been training our product computers for 25 years. Whether we talk about this for five minutes or 10 minutes or two days or two weeks, it’s such a vanishingly small amount of time compared to the 25 years we’ve been training the gut computer. Now it’s going to give us an answer, and sometimes the answer is 42 and you don’t know what it means, but you’ve got to go with it anyway, and you just go like, all right, we’ll start there. That’s the jumping-off point for the experimentation, which, as Jason says, is shocking when people hear about it in the context of pricing, but it’s actually shocking in the context of everything.
(15:45): Jason and I very rarely deliberate for long periods of time, in real time or even offline time. If you compress most of the major decisions we’ve ever made, you could compress ’em to probably 10 minutes, maybe 30 minutes. If it’s a truly one-way door, as Jeff Bezos would say, there’s no coming back from it, maybe we’ve taken a combined two hours. That two hours might’ve been spread out over two weeks because we wanted to see how it feels tomorrow. This is actually a good input to this whole thing about the black box computer. What we often do is we will reach a decision or a preliminary decision in about five minutes. And then we’ll go like, let’s sleep on it. Do you know what that means? Let’s sleep on it means feed it to the gut computer. Let the gut computer run its calculations.
Jason (16:37): Digest it. Digest it. Literally.
David (16:38): Yes, digest it literally. And in the morning it’ll deliver an answer. It’ll deliver an answer that you can feel like you have some confidence about, because I think this is the other part. When you make a decision quickly, a lot of people feel like they don’t have confidence in that decision because it’s very monumental, perhaps. Most aren’t really. People just pretend, but they believe it’s monumental, and therefore they believe it requires more input. Again, all the input is already in the computer. So if you need to, and sometimes even we do, it is a big decision, we just go like, sleep on it. Sleep on it is letting the computer work, and it will deliver the answer, and then don’t fuck around with the answer anymore. Once you know, you know. This is the other thing that’s so curious, and I’ve found this again many times. When Jason and I talk, we’ll show up to a discussion, we already know what the fucking answer is. The discussion is about coming to terms with the answer that is already known. This is relevant in pricing, it’s relevant with employees, it’s relevant in a thousand evaluations. We show up already knowing what the answer is because the computer has computed what we have to do. We have to do these little dances to get comfortable with what we know to be true
Jason (17:51): Already. One other thing I want to add, Kimberly, really quick about pricing. Someone might say who’s listening to this, they might say, well, if you don’t test and keep testing, you might be leaving some money on the table. Maybe you could have charged $349 instead of $299. And the answer is, you’re right. Maybe we could have. And maybe we’ll eventually find that better price, or maybe we won’t, but we’re comfortable not trying to be perfectly optimized. We’re not after certainty. We’re not trying to find that perfect, perfect point. We’re trying to find a big enough circle that’s good enough for us. It’s not about the point. It’s not about exactness. It’s about, this is pretty good. Yeah, maybe we’re leaving some money on the table, whatever. It’s fine. It’s fine. Especially if it would’ve taken nine months to find that point. Don’t really care. Not that important. So anyway, that’s another way to look at this too.
David (18:43): This is one of the pure privileges of running a company where you don’t answer to anyone, because the problem with having investors and a board or executives, whatever other people that you have to convince that your gut computer was correct, is that that requires a level of exactness. And you can’t say things like, eh, we’re just going to leave some money on the table. Like, hey, what do you mean, leave some money on the table? That’s my money. I’m an investor here. I don’t give a shit about your black box computer. I want everything now, everything I can get out of this thing. You should be getting all of it. This is the principal-agent problem. We don’t have a lot of principal-agent problems at 37signals because we are principals and agents at the same time. So we are allowed by ourselves to play it cool. And you know what? Things come easy when you take it easy.
Kimberly (19:34): Okay, last question before we wrap it up.
David (19:35): That sounds like a T-shirt. We should print that.
Kimberly (19:38): We’re going to embroider that on a pillow. How often do you guys sleep on something and come back to the original thought?
David (19:46): Most of the time.
Kimberly (19:48): Yeah, you’re typically right the first time and you just need that extra time. You’re not typically changing the position.
Jason (19:54): The sleeping on it basically either confirms or radically denies. So it’s either like, man, that was actually… you know what? I slept on that. I don’t know. It’s not like I slept on it and we’re 6% off the original guess. It’s like yes or no. That’s what the sleep’s for. And again, it comes back to, it’s usually probably a yes. Sometimes it’s a no. If it’s a yes, it’s like, just try it. Even if it’s a no, it’s kind of like, well, I don’t know. The no could be, we don’t know, but is it easy to try? I think there’s another thing: if it’s very easy, this is the one-way, two-way thing, the Bezos thing, not really quite the same, but it’s similar. If it’s easy to try, let’s just try it. This pricing thing, since we’re getting back to this. $299. I don’t know.
(20:39): I’m kind of feeling a little weird about it. I think we could charge more. Well, maybe we could, but why don’t we start at $299? Okay. It’s an easy okay, so we can do it. We can move. If you’re sitting in deliberation, you’re not moving, you’re treading water. And that’s okay here and there, whatever. We don’t do it much, but you just can’t sit there all the time. So I think you’re better off just saying yes or no versus like, yes times 1.3 or something like that. No, it’s just a yes or no.
David (21:10): I think this is why we have such forward momentum, because neither Jason nor I need our gut computer to win every deliberation, because so much of it just gets put to reality, and reality will tell us whether it’s correct or not. So sometimes Jason will have an intuition that goes like, oh, I think we should do this. And I go like, my gut computer actually comes up with the opposite answer, and it says, I don’t think that’s right. Sometimes I’ll raise the concern. Other times I’ll be like, who cares? So we try it. Either it works and I was wrong, hooray, or it doesn’t work and I was right, and also hooray, because we just tried the next thing. There’s such a premium that we try to place on momentum: move forward, try things, let reality be the judge. Why are we wasting our time arguing about something that reality could tell us, if we just put it to the test?
Kimberly (22:02): I also hear you guys say, what’s the worst that can happen? I feel like you guys use that sometimes in your decision making too.
Jason (22:09): Yeah. Exactly.
Kimberly (22:10): Okay, well with that we’re going to wrap it up. Rework is a production of 37signals. You can find show notes and transcripts on our website at 37signals.com/podcasts. Full video episodes are on YouTube and Twitter, and if you have a specific question for Jason or David about a better way to work and run your business, leave us a voicemail at 708-628-7850. You can also text that number or send us an email to email@example.com.