BiblioTech
WATCH: ‘How To Move Up when the Only Way is Down’ by Judah Taub
Hetz Ventures Co-founder and Managing Partner Judah Taub spoke to CTech for a wide-ranging interview discussing the inspiration for his new book, his IDF experience, Jewish philosophy, and the modern-day business lessons founders can learn.
Judah Taub is co-founder and managing partner at Hetz Ventures. Previously, he served as Head of Data for Lansdowne Partners as well as an advisor to multiple young startups. In the Israel Defense Forces, Taub served as an officer in a classified intelligence unit where he engineered a large-scale project that won the IDF 2014 Creativity Award.
His new book, ‘How To Move Up when the Only Way is Down’, is designed to transform readers’ decision-making by teaching them to recognize local maximums and to build skills based on lessons from AI.
Taub spoke to CTech as part of BiblioTech for a wide-ranging interview discussing the inspiration for the book, his IDF experience, Jewish philosophy, and the business lessons modern-day founders can apply. The conversation has been lightly edited for brevity and clarity.
I'd like to understand a bit more about you and the inspiration behind writing a book like this. Why don't you tell the viewers at home, who may not be so familiar with it, what the book is about and what inspired it?
As a venture capitalist who sits on boards, especially of early-stage founders and startups, a big part of your job is speaking to the founders, typically late at night when the day-to-day stuff is calming down and they're not on other calls. That's when they have the mental space to deal with the big questions: Are we heading in the right direction? Is there a huge thing that we're missing? Should we be pivoting? It's a real privilege as a venture capitalist to sit on these boards at such an early stage, where the pivots are not the small adjustments they would be at a big company. These are huge changes in the direction of a company.
I'm having very different conversations with the different founders, but there are a number of recurring themes across them. One of them is really at the crux of this book: are we climbing the right mountain? Is there a different mountain that is potentially better?
One of the realizations I had a few years ago was that some of the techniques engineers use to optimize algorithms are no less applicable to humans making decisions. On the algorithmic side, we as a society are spending billions of dollars, literally billions: Google, Amazon, and Facebook spend billions of dollars every year trying to optimize the way algorithms learn to make decisions. It just makes sense that we should leverage some of those insights for how we make our own decisions.
It doesn't always mean you have to do what an algorithm would do. Chess grandmasters today all use software, not to tell them what they have to do, but so that they're aware of moves they wouldn't otherwise have considered. In decision-making, there are certain types of decisions, especially the kind at the crux of this book, decisions around a local maximum, where AI is quite a bit better than humans at avoiding the trap.
What are some of the ways that humans can learn from AI in these instances where humans might metaphorically ‘climb the wrong mountain’? How can humans adopt some of those AI lessons?
The key challenge I'm discussing starts from the premise that everybody's goal is to reach a higher point, a higher altitude. A higher point in the field could represent a higher valuation if you're a startup, or making more money if you're an individual. Those are the very capitalist examples - it could also be more happiness in your life or doing more good. You can define success however you want. The point is that higher up in the field is better, lower down is not as good.
We as individuals are walking around, we see a path, and from that point we start climbing a mountain. The challenge of a local maximum is that the mountain you're climbing turns out to only reach a certain height. Sometimes during the climb, and sometimes only at the very end, you realize there's a much taller mountain elsewhere. The challenge for the individual is knowing which mountain to climb.
At what point do you start going down one mountain to go up another? Are there certain characteristics of individuals that make it easier for them or their companies to move from one mountain to another? Those are the different elements I try to provide techniques for.
AI is interesting because for software, the climbing is usually very quick. Once I tell a piece of software which solution to optimize, it can do that very quickly. The part that takes the software a lot of time is figuring out all the mountains and which one it should climb.
For example, Google Search trying to figure out what type of mountain, or solution, a query wants it to respond with, or Amazon Prime routing deciding the quickest way for drivers to deliver all the items. For a software algorithm, the key challenge is picking the mountain, and that's why that is the part I think we can learn the most from. A lot of the techniques I go through in the book can be translated to human decision-making.
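To make the metaphor concrete for technically minded readers, here is a minimal Python sketch, not taken from the book, of the dynamic Taub describes: greedy hill climbing gets stuck on whichever ‘mountain’ it starts on, while spending some effort exploring the field first (here via simple random restarts) reaches the taller peak. The altitude function and all numbers are invented for illustration.

```python
import random

# A toy "field" with two peaks: a shorter mountain near x=2 and a taller one near x=8.
# The function and all numbers below are illustrative, not from the book.
def altitude(x):
    return max(0.0, 3 - (x - 2) ** 2) + max(0.0, 5 - 0.5 * (x - 8) ** 2)

def hill_climb(x, step=0.1, max_steps=1000):
    """Greedy climbing: keep taking the neighboring step that gains altitude."""
    for _ in range(max_steps):
        best_neighbor = max([x - step, x + step], key=altitude)
        if altitude(best_neighbor) <= altitude(x):
            return x  # no neighbor is higher: a local maximum (or a plateau)
        x = best_neighbor
    return x

# Climbing alone gets stuck on whichever mountain you happen to start on.
stuck = hill_climb(2.0)

# Exploring the field first (random restarts), then climbing, reaches the taller mountain.
random.seed(0)
best = max((hill_climb(random.uniform(0, 10)) for _ in range(20)), key=altitude)

print(f"greedy climb: altitude {altitude(stuck):.1f}")      # ~3, the shorter mountain
print(f"with exploration: altitude {altitude(best):.1f}")   # ~5, the taller mountain
```

Run as-is, the greedy climb stops at an altitude of about 3, while the exploratory version finds the taller mountain at about 5 - the software equivalent of first asking which mountain to climb.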
You mention modern examples like the Netflix/Blockbuster business saga and Theranos with Elizabeth Holmes. Why did you include those kinds of examples alongside the IDF experience, which you highlight a lot with some of your fellow soldiers? Talk me through how you chose those examples to articulate your points.
I tried to find examples from two types of worlds. The first is how we see this in a business environment, and there you have examples from big companies like Blockbuster versus Netflix or Theranos. The idea in each story was hopefully to read it in a relatively new way.
So like the part about Blockbuster versus Netflix. I think it was the CEO of Blockbuster only months before they went bust who said: “I don't understand why everyone's fascinated with Netflix. We have everything Netflix has and more.” I think the part that he missed is that they had more and sometimes more can be problematic. Blockbuster was high up a mountain, but it happened to be the DVD rental mountain. So they're further up the mountain because they've bought stores and millions of DVDs and they have huge amounts invested in that. They were all the way up this mountain and they weren’t agile. For me, the interesting take was how to build an agile company versus a muscular one.
I tried to take a lot of business stories that each show a point. What is unique about the army examples is that very often what I try to show is training. So when you're about to start your career, when you're choosing who to hire, when you're thinking about your organization chart, when you're thinking about how to build a team that stays agile, to continue that chapter's theme, what are some of the techniques you can use to build that? I tried to have in every chapter a business story showing what happens and how you could train for a different scenario.
One of the areas of the IDF that you highlighted was The Devil's Advocate Unit [Ipcha Mistabra]. Talk to me about what that is and how that ties into what you discussed in the book.
The term Ipcha Mistabra is Aramaic and comes from the [Talmudic] Gemara: where the Gemara makes one argument, sometimes it will say that the opposite is in fact true.
The basic notion of Ipcha Mistabra is to try and allow people both in lower ranks and above to provide an opposing opinion and not just go ahead with one notion. The idea I gave in that chapter is techniques to try and make sure that along the way you are spreading your knowledge not across just the one mountain you're climbing, but as much as you can across the field to try and learn if there are other pieces of information that could potentially change the mountain you’re on or the way you're climbing it.
One of the techniques I mention is the idea of having an inbuilt Ipcha Mistabra, which, if I'm an individual, could be an hour a week with somebody who argues the other case, or it could be devoting some resources to this kind of exploration rather than just running up the current mountain.
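Setting aside a fixed slice of resources for exploration has a direct analogue in how engineers build learning systems, often called an epsilon-greedy strategy. The sketch below is only an illustration of that idea: the three ‘mountains’, their payoffs, and the 10% exploration budget are made-up values, not figures from the book.

```python
import random

# Hypothetical payoffs of different "mountains"; unknown to the decision-maker up front.
true_value = {"current mountain": 3.0, "adjacent field": 2.0, "new market": 5.0}

estimates = {name: 0.0 for name in true_value}
counts = {name: 0 for name in true_value}
epsilon = 0.1  # devote ~10% of effort to exploring other mountains

random.seed(1)
for week in range(500):
    if random.random() < epsilon:
        choice = random.choice(list(true_value))       # explore: try another mountain
    else:
        choice = max(estimates, key=estimates.get)     # exploit: keep climbing the best-known one
    reward = true_value[choice] + random.gauss(0, 1)   # noisy feedback from the world
    counts[choice] += 1
    estimates[choice] += (reward - estimates[choice]) / counts[choice]  # running average

print(max(estimates, key=estimates.get))  # with enough exploration, the taller mountain is found
```

Without the exploration budget, the loop would keep refining its estimate of the first mountain it happened to climb; with it, the better option eventually surfaces.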
It's actually the part of the book I found hardest to write. I wrote the whole book before October 7th and it was obviously published a long time after. This is one of the units that has been, and will probably continue to be, highlighted as needing a variety of changes.
It's just such a fundamentally Jewish idea, always questioning and thinking about what else can be done.
The idea of questioning everything constantly is a very Jewish philosophical approach.
Finally, I want to talk about the end of the book, where you highlight different areas in business and society where we could reach local maximums. Can you talk us through some of those?
What I really wanted to do is give practical examples of how a local maximum approach could be different: how one could look at these problems and come to a different type of conclusion. I wanted to put on the local maximum lens and say, “Ok, now let me look at these challenges.”
Healthcare systems globally are all under severe stress… So how would a local maximum approach potentially look at this? I don't think there is a solution to everything. What there is is a toolbox, and in that toolbox you have roughly 8 to 10 different tools. Then, depending on the problem you're trying to solve, you pick the right ones - this is how engineers work with AI. So for me, it was about taking the tools I described in the previous chapters and seeing how they could be applied to these problems.
By definition, a local maximum approach is going to be optimistic. The simple reason is that most folks are thinking about incremental improvements: a small step up, because we're on a mountain, constantly trying to optimize and get a little bit higher. Every step we take is just a little bit better, or hopefully a little bit better, if we can even manage that.
There may be a mountain out there - solutions, systems, and ways we can live - that is not marginally better than what we have today but dramatically better for all of us. So by definition, I think people who read the book, or look at problems through this type of lens, should be more optimistic.
Readers interested in practical frameworks for decision-making can find Taub’s ‘How To Move Up When the Only Way is Down’ on Amazon.