Feb. 19, 2024

S1E7 - Building Open & Closed-Loop ML Products - Nadim Hossain, VP Product Management, Databricks

Nadim Hossain is VP of Product Management at Databricks. We often hear that careers don't happen in a straight line. So how can we be thoughtful about the zig and deliberate about the zag? Nadim has done exactly that. His journey has taken him to McAfee, Amazon, Salesforce, his own startup BrightFunnel (as CEO), Uber, and now Databricks.

Key Highlights:

In this episode Nadim covers:

- Applying "product thinking" to career and job search
- Taking on a role that you think you are not qualified for
- Product-led vs engineering-led
- Truth-seeking for PMs
- Building open-loop vs closed-loop ML products
- PMs working effectively with ML and AI engineers
- Customer-facing features vs tech-debt
- When not to be dogmatic about OKRs

Connect with Nadim Hossain:

https://www.linkedin.com/in/nadimhossain/

Transcript

00:03 - Rahul Abhyankar (Host)

Nadim, it's great to have you here, and I've been looking forward to this conversation ever since we reconnected.

Nadim Hossain (Guest)

Yeah, quite likewise. Good to see you.

Rahul Abhyankar (Host)

So you are VP of Product Management at Databricks. You've been there almost three years and you wrote an investment thesis about why you joined Databricks. So is this something that you've done for every company that you've worked for, writing an investment thesis?

Applying "product thinking" to career and job search

00:27 - Nadim Hossain (Guest)

No, not for every company. You can find the post on my LinkedIn or my blog, nadimhossain.com. So I wrote it three years ago when I joined and I realized rereading it that it was like wait, this is actually an investment thesis. The same thing would be the reasons to invest in the company. It just struck me that way. So at the time I was thinking of it as where do I want to work? Just like anyone. But really it is the same decision when you're joining a private company. It's almost like a leveraged investment. It's even more than an investment. You're not making 10 bets or 35 seed bets, you're investing all of your time. Of course, it has to be a bi-directional investment. Actually the same thing with startup investments right? The startups have to pick investors as well. So I think there's a lot of parallels in thinking about that and no, I haven't done it for every company.

There were parts of my career, earlier on, where it was more obvious where I wanted to work. Other times it was very opportunistic. Later in my career, this transition especially was one where I really wanted to be thoughtful. For the previous transition, I had followed my curiosity to go to Uber and work on self-driving. It wasn't to open up any new doors. It had no goal other than that I was fascinated, I wanted to work with the best engineers, and I wanted to work on the hardest problems. That's why I went to Uber. When I was leaving Uber, ready to get back to my roots in B2B, I wanted to be very thoughtful, because my background was, at that point, non-traditional, even though I'd done SaaS for so long, because of that kind of departure. That's what prompted my thinking in more depth.

01:54 - Rahul Abhyankar (Host)

Yeah, I do want to come to Uber a little bit later, but I want to stay on this topic of this investment thesis. How did you come up with this way of thinking about your career as an investment?

02:05 - Nadim Hossain (Guest)

So really it was trying to prioritize and evaluate different opportunities. Sometimes they're apples and oranges, and it was really thinking about what motivated me.

What do I really want to get out of my career as a PM? I think you should ask that question over and over, even later in your career.

I think it's always Day 1, to paraphrase Bezos. For me, I realized that I had been a founder, I have a lot of respect for founders, and I wanted to be at a founder-led company. Look at the greatest companies: they've been founder-led for a long time, and I write about that. You can analyze the numbers or whatever.

There are lots of great companies that are not founder-led, but for me, I wanted to be at a founder-led company, and that was both a gut and an intellectual decision. That's one thing I talked about. Another thing, informed by my experience both at Uber and as a founder: products are hard, innovation is hard, AI is hard. I wanted to be somewhere with lots of tailwinds, where there's a big market and lots of momentum towards that market, whether for the company or the individual products, wherever they were in the cycle. So those are the things that prompted thinking that way about what is important to me personally.

I wrote it down because I was excited and I knew I was going to be building a team, and one way to communicate my enthusiasm and attract folks, in my network but also outside it, was to write it down.

03:22 - Rahul Abhyankar (Host)

So I think this way of thinking about career as an investment, applying some criteria to companies that you're talking to, is a great way to approach that process.

03:33 - Nadim Hossain (Guest)

You know, everyone has a lot to offer, right? Everyone's got unique skills and backgrounds, and it's really about fitting what you have to offer to the person who has that problem. One person I gave advice to was telling me what they wanted, and I was trying to reframe it for them, saying, look, "what do I want?" is the wrong question to ask. That can come, but the first question for this person was: you have this unique set of skills, what problem are you solving? Someone has a job to be done in the form of a product role, and you're going to solve that problem for them.

That's kind of how I thought about coming to Databricks: my skill set really fit what they needed at the time. So I think, as PMs, that's a good way to think about it. In a world where you might have some urgency, maybe you're out of a job or you're excited about a new space, you want to move quickly and there are endless options. The worst thing you can do is go after endless options, right? So being really targeted is important, even if only in generality, in terms of the industries, spaces, and people you want to work with. Those are the reasons to let someone know that you want to talk to them.

04:32 - Rahul Abhyankar (Host)

Right. You know, many years ago I had read a blog or an article written by Marc Andreessen about evaluating companies on the basis of market, product and team. I don't know if you've read that, but that's a fascinating piece where he talks about which do you choose? You know, a company with a great team, company with a great product or company with a great market?

04:51 - Nadim Hossain (Guest)

That's a great question, and I think you hear a lot of investors talk about that. Don Valentine at Sequoia, his point of view was, you know, market, market, market. The very first thing is the market, even if they have a B team. That's the most important thing, if I'm paraphrasing his point of view. Ultimately, if you want to build a really big company, you're going to need different ingredients. And when people say they don't care about a big market, what they mean is that the market hasn't revealed itself, or that in the early stage it might change. So why focus on it? The thing that won't change is the founders, right? You know you've got some founders. So I think it's a stage-specific thing as well. If you're doing seed investing, then yeah, the market might be a stupid thing to focus on, because they might be so early that they're going to be bouncing around different ideas. There's nuance in that. With that said, I think the later-stage the company is, the clearer the market should be, and you can see signs of that.

If a company's struggling, you know. I've met companies that are at billions in valuation on paper and tens of millions in revenue, and it's very obvious they're struggling with market size. It's very obvious because they're doing things that they shouldn't be doing until they're a billion-dollar company, things that Databricks, for example, didn't do. We didn't use the word partner internally. I didn't hear the word partner in the first year I was here. Maybe six months in, we started really focusing on upping our game on partners, for example, and the company was already at 400-plus million in revenue, because the market was so big. So if you're a 20-million-dollar company and you're spending all your time figuring out partners, there's probably something wrong with the market size. If you're a product manager, you should look at that very carefully. Or if you're working so hard to make your sales efficiency perfect and you're sub-50-million, what's going on there? To me, that's a real loss. It's a bad sign.

06:30

Basically, it says some of the market is tapped out and you're trying to reinvent yourself. So I guess I do agree with Don. If you were to pick one as an employee, I would pick the market first, because this is assuming you're joining when the company's already got traction.

06:44

Team is very, very important. Again, I don't know the source, but this quote goes around the Valley:

A good team's reputation will suffer if they enter a bad market, and not vice versa.

So the team is really important, it's necessary, but for me the market comes prior to looking at the team. If you're joining as an employee, the density of the talent is really important. If there are 10 people, you look at the founders, and the first three engineers are really important, for example; it really sets the tone. If there are 100 people and you're joining as a PM, I actually wouldn't look at the PM profiles at all. I don't care about that; that's what I'm bringing to the table. So if they suck, they suck, but if your engineers suck, then you're screwed, right? So I would really look at the quality of the engineering team.

07:27 - Rahul Abhyankar (Host)

But you also talked about culture, and so when you are in that early stage of the interviewing process talking to you know leaders in the company, how do you get a sense of culture?

07:37 - Nadim Hossain (Guest)

That's a really good question. To look at a culture, I would look at, you know, the budget, to understand a company's strategy and culture. Oftentimes the budget tells you. They're not going to give you their budget, but you can look on LinkedIn. You can see where the employees are based, the percentage of employees in each function, and you can also see how they operate: what are they good at, what are they bad at? You can assess that from the outside. As a PM, you should be able to assess their marketing. Maybe it's bad, and maybe you're okay with that because you don't care about that as much. Or maybe it's really important to you, because you want to join a company that has competence in that. So those are things you can assess. But I would say, for both the market and the culture, you should definitely talk to people firsthand.

08:13

Just as a PM, if you're evaluating a new product idea, you're going to go to the source and talk to the customers and users for a new product.

If you're entering a company without talking to any customers, you're really dropping the ball, right? Be first-principled: figure out a way to talk to a customer and see what they think.

You know n equals one or five.

08:32

But even entering as a VP of Product, I talked to multiple customers, just to see what they say. And oftentimes, as a new product manager or a new executive, that's always going to be the currency that has value. Every good company cares about its customers, and any senior executive wishes they spent more time with customers. I've found this to be true generally. So if you're coming in as a PM and you already start doing your job, saying, hey, here's something I learned about your customers, or your non-customers (both are interesting), I think that's an important part of the diligence.

09:03 - Rahul Abhyankar (Host)

Let's go to Uber. Looking across your career journey, you've had experience in the B2B enterprise market, and then you had your own startup, BrightFunnel, that you were the CEO of; that was in the marketing analytics space. And Databricks, which is a data platform. So many interesting domains, so many interesting types of markets. But the reason I'm curious about Uber: you were driving product management for hardcore research at Uber. How did you get into that?

09:33 - Nadim Hossain (Guest)

Yeah, it's a good question. After BrightFunnel, I did go through an exercise of thinking about what my priorities were. As CEO I was leading 50 people, and all that kind of stuff. It's often a lot of fun being a startup CEO, and very hard. But I realized what I wanted next was learning; even though I was mid-to-late career, I wanted to keep learning. I also wanted to have impact in a way that, in a startup, you can't quite have, in terms of the number of users and customers you touch. I'd done only B2B in my career until that point. And I realized that

10:06

to go to an environment with large-scale engineering problems, I'd mostly be looking at consumer companies. Obviously the three big hyperscalers, in their cloud divisions, also have those kinds of large-scale engineering problems. But that's something I wanted to do that I hadn't really done in my career until that point, and so the usual suspects were, you know, Amazon and Uber and Google and whatnot. I also wanted to be an IC; look, I just wanted to be a senior IC and learn and ship stuff. So initially I went to Uber. They had a team that was building something similar to BrightFunnel; it had a lot of similarities. Uber was spending, in 2017, a billion dollars a year on various rider, driver, and eater acquisitions, promotions, all these things to build the marketplace. The problem definition was: how do you make this efficient? One way was to have internal data science teams, internal analytics teams, and internal software that would do things like calculating lifetime value, or bidding on certain ads and not on others.

Taking on a role that you think you are not qualified for

10:59

Originally, I joined that team and built things that, again, were new to me. We had tried to use ML at BrightFunnel, but in B2B you have less data and it's a little harder to do interesting things. So, for example, we built this multi-armed bandit model to go bid on job ads, to acquire drivers, even though they're not employees, they're contractors. So you're bidding against UPS for someone looking for a job in Cleveland: you could be a UPS driver, or you could respond to this ad from Uber and go sign up as a driver. That driver acquisition channel is an example of some of the new things I was helping with, so that was super interesting. It was a great way to get a foot in the door at Uber.

11:36

I would say about six months in, I realized that, okay, I had been exhausted after doing my startup for four-plus years, and I'd said, okay, look, I want to take a break from managing people. Within six months, I was refreshed, my energy was back to 100%, and I was a little bit bored. I missed the challenges of leading people and all the things that come with that. So I was chatting with the self-driving team, which was something I didn't think I was qualified for. But it turns out no one is qualified for that. No one had built commercial self-driving cars, certainly, and in 2019 there were a lot of big problems to figure out.
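
The multi-armed bandit approach to ad bidding described above can be sketched in miniature. This is a generic epsilon-greedy illustration with made-up channel names and sign-up rates, not Uber's actual system:

```python
import random

class EpsilonGreedyBandit:
    """Minimal epsilon-greedy bandit. Each arm is an ad placement;
    reward is, say, a driver sign-up per impression bought."""
    def __init__(self, arms, epsilon=0.1):
        self.epsilon = epsilon
        self.counts = {a: 0 for a in arms}    # pulls per arm
        self.values = {a: 0.0 for a in arms}  # running mean reward per arm

    def select_arm(self):
        # Explore with probability epsilon, otherwise exploit the current best arm.
        if random.random() < self.epsilon:
            return random.choice(list(self.counts))
        return max(self.values, key=self.values.get)

    def update(self, arm, reward):
        # Incremental update of the arm's mean reward.
        self.counts[arm] += 1
        self.values[arm] += (reward - self.values[arm]) / self.counts[arm]

# Hypothetical usage: three job-ad channels with unknown true sign-up rates.
random.seed(0)
true_rates = {"job_board_a": 0.02, "job_board_b": 0.05, "search_ads": 0.03}
bandit = EpsilonGreedyBandit(list(true_rates))
for _ in range(5000):
    arm = bandit.select_arm()
    reward = 1.0 if random.random() < true_rates[arm] else 0.0
    bandit.update(arm, reward)
print(bandit.counts)
```

Each arm is an acquisition channel; the bandit spends most of its budget on whichever channel currently looks best while still exploring the others, which is exactly the spend-efficiency problem described here.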

12:05

So I looked at that and I thought, look, ultimately BrightFunnel was a software product built on a data platform, and a lot of that we had to create ourselves, and Uber obviously was trying to do this bidding on top of massive data. Self-driving is kind of a similar thing. Ultimately you've got the autonomy software that's sitting on the robot, and all the other pieces: it drives around either in a virtual world or in the real world, it can test itself, and then you make the models better and you ship the new models. So it's this big dev loop that's unique in some ways, but in other ways it's just like any other dev loop. And one self-driving car, per some data I was looking at at the time, produces as much data in a year as all of Google Photos. Annually: one car versus all of Google Photos. This is an interesting challenge, right?

12:47

It makes sense that everything from the core data platform and the way the data centers operate, to the way all the models are trained, all that stuff, has to be rethought. It just seemed very novel and interesting. Looking back, I'd characterize it as: I had never worked on frontier tech or research in my career, and it was an amazing opportunity to do that. Now, if you're at a really big company like Microsoft, I'm sure you can move around and seek those opportunities. But in Silicon Valley, sometimes those changes come from job changes.

13:17

It was fascinating too. Specifically, my team was leading product for simulation in self-driving, and that's really the whole ballgame. Sure, there's the hardware, there are the sensors, and those change every few years, and then the models have to operate with that sensor kit. So there's a lot of complexity in the system; it's a system-integration, systems-engineering problem. What I really enjoyed was that in a research environment you have 30 or 40 engineers to 1 PM. It's much more engineering-led at that point in the journey, and it's a different kind of challenge. I was looking for that challenge at the time and really enjoyed it.

Product-led vs Engineering-led

13:50 - Rahul Abhyankar (Host)

You know, just to contrast product management in a research or a heavy engineering led area, the problems are really coming from a lot of super smart engineers. They are the ones identifying the problem. So how did you reorient your mental models about product management to that situation?

14:07 - Nadim Hossain (Guest)

Yeah, I think it is a different kind of challenge. A software company like a Salesforce or a Workday might have even less than 10 to 1; it might be 6 or 7 to 1, and that kind of organization will also have room for more junior PMs who own, you know, a feature, or a specific UI or API. In a research environment, it's not coming from bottoms-up customer research. The user wants to get in the car and be driven somewhere; that's very clear. The challenge is: how are the maps going to work? How's the navigation going to work? How are we going to detect and classify the different objects, whether it's a stationary object or a vulnerable road user, like a pedestrian or a cyclist? Those are more engineering problems. So, how to reorient? First of all, it's expectations: what is the role of the product manager?

14:52

I think you have to have clarity that you might ultimately wear multiple hats, while still trying to be truth-seeking and get to a better product. In this case, it might not be a user-oriented truth-seeking journey. It might be more about the problems and timelines and different systems that are coming together. That's almost a systems-engineering mindset, or an engineering mindset, not just a PM mindset. As a leader, and I've done this at Databricks as well, where there are some commonalities, the thing was to reorient on what kinds of people would succeed.

15:22

And I think one quality that's really important is humility, because it's not product-led, it's engineering-led. So what does that mean? Well, you have to be okay with that and take a backseat on some decisions, or enable existing decisions. I think that's really important because, when the domain is moving as quickly as it is in research, you can't be as bottoms-up in terms of customer problem, then solution, and all that kind of stuff we all go through. That loop has to be a bit faster.

Truth-seeking for PMs

15:49 - Rahul Abhyankar (Host)

So you mentioned the word truth seeking a couple of times. What do you mean by that?

15:54 - Nadim Hossain (Guest)

I think it's a personal value, but it also aligns well with the product management journey, and with the investment metaphor we're using. It's really about getting to the bottom of something. Obviously you care about people, you care about people's feelings, you care about how you get it done; the how is important. You don't want to be a bull in a china shop. Those are all must-haves.

16:14

But ultimately, if you're not getting to the crux of the issue, what are the priorities, why are those the priorities, you're not doing your job as a product manager. So if you're shipping product, it could be to solve a customer problem, maybe you're looking at a new market, maybe you're improving reliability, whatever the goal is. Just be really clear: this is the goal, this is how we're going to achieve that goal, and then, have we achieved that goal? To me, that's also the definition of how you build trust and how you show integrity: you say you're going to do something and you do it. You also have to tell people you've done it. If you go through that cycle over and over, people are going to trust you, and you're going to be perceived to have integrity, which I think is accurate.

Building open-loop and closed-loop ML products

16:51 - Rahul Abhyankar (Host)

Excellent. So let's deconstruct machine learning and AI. You've worked with these technologies in your startup BrightFunnel that you founded, at Uber at a much bigger scale and also at Databricks.

17:02 - Nadim Hossain (Guest)

So at BrightFunnel, for example, the company I founded, it was marketing and sales analytics, trying to give you insights into attribution: of all these dozens of marketing and sales touches, which are the ones that are most predictive and most valuable, and where should you put your dollars, whether into a sales development team or a marketing channel? We were working with B2B customers. And one thing I remember is that we thought these were data sets that were large enough; we had trouble keeping up with the data pipeline, doing all the complex joins to come up with an attribution logic. But looking back, given the infrastructure that exists today, it's not very large, even though it was all the data. It was basically all the CRM data from 100 enterprise companies, like New Relic, Cloudera, Hortonworks, SAP. These were all our customers, and we'd say, hey, give us the keys to your Salesforce so we can give you this insight. They said, sure, here are the keys. So there was a lot of product-market fit, where they'd give us all the data. But pulling it into our system and analyzing it was non-trivial. Initially it was just getting the data in and doing basic analytics, and then doing this attribution logic, which was an algorithm; we filed a patent on it, but it was fairly straightforward. Initially it wasn't powered by ML; it was more heuristics. Then, towards the end of the journey, before we sold the company, we were doing some experiments with ML, and I remember the first prototype we had. I was so excited about it.
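
As a concrete illustration of the heuristic, pre-ML style of attribution logic described here, one common generic approach is U-shaped (position-based) attribution. This is a textbook sketch with hypothetical touchpoints, not BrightFunnel's actual patented algorithm:

```python
from collections import defaultdict

def u_shaped_attribution(touches, revenue, endpoint_weight=0.4):
    """U-shaped (position-based) attribution: the first and last touches
    each get `endpoint_weight` of the revenue; the remainder is split
    evenly across the middle touches."""
    credit = defaultdict(float)
    n = len(touches)
    if n == 0:
        return {}
    if n == 1:
        credit[touches[0]] += revenue
    elif n == 2:
        credit[touches[0]] += revenue / 2
        credit[touches[1]] += revenue / 2
    else:
        credit[touches[0]] += endpoint_weight * revenue
        credit[touches[-1]] += endpoint_weight * revenue
        middle_share = (1 - 2 * endpoint_weight) * revenue / (n - 2)
        for t in touches[1:-1]:
            credit[t] += middle_share
    return dict(credit)

# Hypothetical B2B journey leading to a $100k deal: first and last
# touches get ~40k each, the two middle touches ~10k each.
result = u_shaped_attribution(
    ["webinar", "email", "field_event", "sales_call"], 100_000)
print(result)
```

The "complex joins" mentioned above come in upstream: assembling each deal's ordered touch list out of CRM activity data before a heuristic like this can even be applied.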

18:11

But sitting down with a customer, we showed them the weights, and we realized, okay, the user experience really matters. This is an analytics product; you're telling them something about their data. And a couple of things. One is, when we showed them an insight, they said, okay, what does this mean? Well, this means that your field marketing team has zero value: it has no predictive value in terms of influence on revenue. And they said, well, what do I do with this? We don't believe it. We believe field marketing is important. Our sales team will kill me if we get rid of field marketing. And anyway, I wouldn't shut down a team; these are people, I can't shut down the team. So it was really humbling to realize that, okay, yes, he's completely right. That insight doesn't carry any action or weight, because it's telling you to do something that's impossible in the near term, without the product having earned enough trust.

18:55

And if you work backwards from this, for any product that's using ML to make a recommendation: if it's a closed loop, it's a different problem, but if you're recommending a human action, you have to build enough trust that they trust the recommendation. We realized that, look, we have to bring them along. First they have to believe that we have all their data, because we're ingesting it from different sources. Then they have to believe that we did the right things with their data. Humans are distrustful by nature. They're going to say, okay, wait, I don't believe this insight; it must be because you forgot to update the data, that sort of thing. So you have to build things into the product, like last refresh, or data source, things like that. The UX really becomes important.
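
The point about building provenance into the product (last refresh, data sources, a feasible suggested action) can be sketched in code. The field names and rendering here are illustrative assumptions, not any real product's schema:

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class Insight:
    """An ML-driven insight plus the provenance users need to trust it."""
    message: str
    sources: list           # which systems the data was ingested from
    last_refreshed: datetime
    rows_analyzed: int
    suggested_action: str   # a near-term feasible step, not "shut down the team"

    def render(self):
        # Surface freshness and lineage alongside the insight itself.
        age_h = (datetime.now(timezone.utc) - self.last_refreshed).total_seconds() / 3600
        return (f"{self.message}\n"
                f"  Based on {self.rows_analyzed:,} records from {', '.join(self.sources)} "
                f"(refreshed {age_h:.0f}h ago)\n"
                f"  Suggested action: {self.suggested_action}")

# Hypothetical example in the spirit of the field-marketing story above.
insight = Insight(
    message="Field events show low incremental influence on closed revenue this quarter.",
    sources=["Salesforce", "Marketo"],
    last_refreshed=datetime.now(timezone.utc),
    rows_analyzed=48_210,
    suggested_action="Compare against last quarter before shifting any budget.",
)
print(insight.render())
```

The design point is that trust metadata travels with every recommendation, so a skeptical user can check "did you forget to update the data?" without filing a ticket.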

19:34

And then, of course, the speed of the whole thing. Data freshness is really important. The data pipeline has to be robust and well architected; it has to be fast and all that kind of stuff. And then, if you look at something like Uber, it's a very different problem. One of the things that drew me there originally was that there's tons of data, and it's very unique, this geospatial data, for example, that Uber had. It's combining the real world with the virtual world, which is super interesting. But it's got this unique thing about marketplaces, and then local cities. The market balance of supply and demand is very important. So it was a good use case for advertising, where you're trying to take the foot off the gas or press it harder depending on how important a market was to Uber, which is information that only we had. That was super interesting to me. But the difference was, it wasn't a recommendation.

20:22

It was oftentimes an action, like a bid. There were also recommendations we were making to marketers internally, but they were internal employees, so it was a little bit easier to influence them; there's more trust. But when we were doing bidding, that was a closed loop. So that sort of influences how you think about the product. And then Databricks, obviously, is a data platform, so it can power things like BrightFunnel-style analytics applications (we have lots of customers building analytics apps on top of Databricks), or data warehousing applications, or it can power advanced AI, things like self-driving; we have examples of that as well. So it's more horizontal, across all the different use cases.

20:59

And here there's both: it's a tool to enable AI and ML, but it's also a product that's powered by AI and ML. Both are true and both are priorities. So there are some parallels with things I've done before. There are things that are a bit more closed loop, like where we're using ML to provision compute intelligently for our customers. We see the margins, and the customer sees the customer experience, how fast compute spins up and all that kind of stuff, but it's a bit more closed loop. Then there are other things: we have AI assistants in the product now, we have recommendations. Those are going to be more the other use case, where you're telling someone to take an action, and where you have to have a certain level of fidelity and trust.

PMs working effectively with ML and AI engineers

21:41 - Rahul Abhyankar (Host)

Yeah. So, looking across your experience with ML and AI at BrightFunnel, Uber, and Databricks: one thing that has happened with product managers is that the number of different functions they have to interact with to bring the product together has continuously kept expanding, from UI/UX designers to data scientists, and now, with ML and AI, we're talking about researchers, ML/AI engineers, and ML/AI ops. So how does product management work effectively with ML/AI engineers and ML/AI ops?

22:16 - Nadim Hossain (Guest)

I think a lot of companies are in the basic phases, which means, you know, it's the data, stupid. Do you have the data? If you don't have your data figured out, what is the right data, who has access to it, where does it live, what does it cost, what are the pipelines, you're going to have trouble doing anything interesting with AI and ML. So that part doesn't change, and probably anyone listening to this has experience with data and data platforms. I think those are the same skills you were applying before, just on platform creation and management.

22:47

How do you work with these different functions? I think it starts with understanding. Data teams understand data, data science teams understand the data science, but oftentimes there are insights that the PM has to bring in, like: why is there a blip here? The data science team can try new models or predict things, but they might not know that every winter there's a blip, because it's an e-commerce company or something, right?

23:10

Those are insights that, sure, data science teams can have, but they're not domain experts; that's not their job. The PM's job is to be the domain expert, typically, or the engineering leader's job. So make sure you bring insights into each other's processes: not just analytics changing your product priorities, but maybe even PMs helping shape the models, shaping the process of what data is collected. Because there are judgment calls throughout; it's not all black and white. For example, how do you define a metric? How do you collect it? Those are things that PMs can have influence on as well.

Customer-facing features vs Tech-debt

23:39 - Rahul Abhyankar (Host)

Yeah, you started BrightFunnel and you were the founder CEO. You know, when you are building your own company, that's where you are continuously looking for product market fit. But as you go through those stages of growth, you find that your platform needs to be re-architected for a different level, a different scale of growth. There is that inherent tension between building customer-specific features versus investing in the platform. And it's not just that you face these tensions when you are in a startup, but even large organizations have to go through these aspects of what are you gonna prioritize - customer facing features or investment into the platform and retiring the technical debt? So love to hear your thoughts on these tensions.

24:23 - Nadim Hossain (Guest)

For sure. I think, if you look at the crux of the issue, it's about time horizon: what are you optimizing for? And the faster the company is growing, and the more dynamic the landscape is, like the ground you're standing on, the shorter your horizon has to be, because there's no point trying to predict where things are going. So, for example, I'd say here the horizon in the last few years has really been up to two years. It's certainly been immediate-term, executing on what's happening. But there were no three-year plans on the product team. Sure, you had things in your "later" bucket: okay, someday we'll build this. But for things we had a real desire to build, like a specific thing with a PRD, it was really more or less max 24 months.

You know, when I was interviewing people coming from, let's pick on Google because they're obviously one of the most successful companies and easy to pick on, sometimes the PMs, even the people that joined, were like, hey, look, we've got to look three years out, we've got to build three years out. They were trained very well at Google to think that way: first of all, everything's solid, the dev platforms are very mature and very robust. And then you've got a monopoly spewing out cash, so you can think long term, and you're also growing slower as a company. So it's a different framing. What's right for Google isn't right for Databricks. You can't think three to five years out; that's a recipe for not having a job. You're going to fail. On the other hand, as we get bigger, as any company gets bigger, yeah, you should probably lengthen your time horizon.

25:43

So your question about tech debt and building platforms versus user-facing features: there's not an easy answer. It's always a trade-off of priorities. But for a startup it does make sense to take on debt. You value the present a lot more than the future; you've got a high discount rate, so to speak, back to that investment metaphor. If you have a high discount rate, you are optimizing for the immediate a bit more than the future, and so it's just balancing that, compared to a much bigger company. As you get bigger, you have to think more and more about how you make that trade-off.

26:16

So one framework I think a lot of people find useful is the idea of one-way doors and two-way doors, right? If a decision is impossible or very hard to unwind, then you should make it very carefully; you should be very judicious. But if it's, look, we can do this, and the consequence of making the wrong decision is that we throw away the work, we take on tech debt, it's going to result in throwaway work, that's probably fine for a startup. Take a very simple example: do this work to get revenue, or do this other work to build a long-term, stable platform? Well, if you're a startup, it comes down to whether you're default dead or default alive, the terms that Paul Graham uses. You're worried about survival.

26:56

The CEO's first job is to not run out of money. Your second job might be to build a good product, but the first job is to not run out of money. As a PM, you're also thinking about that. You're not operating in a vacuum at a startup. You're supporting your CEO, who first and foremost should not kill the company, and revenue is the thing that keeps you alive. So that's kind of one framework.

27:15

Now, obviously, if the mission of the company is to serve mid-market fintech companies, financial companies, and a sales rep is bringing a million-dollar deal from a healthcare company in Europe that wants a custom feature, obviously you've got to say no to that. If it's not part of your mission, that development is going to set you back, not take you forward. So it's not an easy answer how to think about that. But I think a little bit of that one-way-door, two-way-door mindset is important, and also asking the question: okay, what's going to happen when you do succeed?

27:45

So at BrightFunnel, for example, I think we did a good job of trading those off, but they were still painful. We made choices knowing that if we succeeded we would have to re-architect. And in the case of an analytics platform, it's not just the platform, it's the user experience. If it takes too long to get your data to give you an insight, then your product is bad. So those were sometimes things we had to revisit and rebuild.

28:09

I think you should always be asking, "Are we doing something stupid here?" And sometimes you have to up-level it; it does have to go to an executive level to see the pattern. If five different engineering teams are doing five different hacks to solve the exact same problem, then the right answer is, okay, we should create a small central team to solve the problem for those five teams. Sometimes those are not patterns that the individual PM or engineering manager would even know about. Only a planning process will reveal it.

28:38

So a planning process is important for that, both at a team level and an aggregate level. That's when you reveal, okay, what are we spending our resources on? Are these the right things? You can look at things like what the KTLO budget is and what the day-to-day is like for engineering teams. If they're constantly on call and fighting fires, well, maybe that's a sign of insufficient investment in some of those foundational things, things that don't even require PMs, like reliability. And maybe the PMs pushed the other way; they were saying, hey, I want to build new features, and the engineers said, fine, here are some more feature person-weeks, and then, as a result, you've had a couple of quarters of unhappy engineers. So you've got to look at that level of data. Those are all things you can measure, in terms of bugs, on-call load, reliability, all those things.

When to not be dogmatic about OKRs

29:25 - Rahul Abhyankar (Host)

Digging into that planning process a little bit. You worked at Salesforce, Amazon, Uber, Databricks. Can you compare and contrast the different planning processes in these companies?

29:36 - Nadim Hossain (Guest)

Yeah, very, very different. There is some uniqueness to each. Salesforce had what they called V2MOM, their own internal OKR process, and it was very religious. In hindsight I really appreciate that Benioff and Salesforce in general were so dogmatic about it: individuals having their own V2MOMs, teams having them, and everything cascading and rolling up. I think they did a really good job of that process. It's one way to drive clarity on the company-level vision and how you roll up to it. Now, it does get kind of silly when you have a five-person team having a vision; look, the company vision really is the thing you're going for. But it's worth the effort, and I think it worked out well for them. Most companies don't have that kind of rigorous OKR process. The most important part of the process, which OKRs capture, is to be clear on your goals, how you measure those goals, and how you're progressing against them. So I think OKRs are really important. That said, back to the speed and landscape of the company: the higher the level of growth, the harder it might be to keep up with that. I'll give you an example that I think will resonate with people.

30:37

We had an executive meeting, nothing super sensitive, that our CEO was leading here at Databricks, and we had just finished our OKRs for the year. And Ali, our CEO, was saying, look, we've got to make LLMs the number one priority. This was before we bought Mosaic, which was a big acquisition, before we shipped product features around LLMs, but it really was a top-down thing, which I think is really inspiring. And even among the executive leaders there's a little bit of eye-rolling or grumbling, like, really? We just locked our OKRs a month ago, maybe weeks ago, and you're really going to change our number one priority? And he took some of the tension out of the room by making a joke: hey, look, you guys, if a meteor hit the planet, you'd be complaining about focusing on the meteor and wanting to go back to your OKRs. But he made it clear that, look, this is important for me as a CEO, for us as a company. We should focus on it.

31:27

As a CEO, he had a view into trends that maybe we didn't all have, or we were a bit more in the weeds even as executives, but it was clear it should be a priority. Clearly it was the right call. So that's an example where being dogmatic about OKRs would have been silly. And if we had a process that was too heavyweight, given how fast this data and AI landscape changes, we would just be shooting ourselves in the foot. That said, when it's dynamic, there are consequences. You don't have clarity. You have to deal with more ambiguity. It might feel like someone's doing something that opposes your goals, because there's not as much clarity on how everything cascades as there might be at somewhere like Salesforce, even in the earlier days. So I just think it really varies by company.

32:07 - Rahul Abhyankar (Host)

Yeah, great, excellent. So, Nadim, we'll come to the rapid-fire round of this discussion. Are you more of an audiobook person or a smell-the-pages-as-you-read type of person?

32:19 - Nadim Hossain (Guest)

I do both. I definitely love audiobooks and podcasts, but I always have a physical book that I'm reading as well, maybe slowly. I think audiobooks and podcasts are a very interesting medium because they're very intimate; it's straight to your brain. It's only one mode of communication; you're not looking at something and hearing something. I think it can be very powerful. I think both are great.

32:42 - Rahul Abhyankar (Host)

What's a book that you are reading or listening to right now?

32:45 - Nadim Hossain (Guest)

The physical book that I'm reading... I'm blanking on the title. It's by the producer... okay, clearly, it has a great cover with a dot on it. It talks about creativity, right? What's the name of the book? I'm forgetting.

33:01 - Rahul Abhyankar (Host)

Oh, the Art of ... by Rick Rubin.

33:03 - Nadim Hossain (Guest)

Rick Rubin, yeah, exactly. It's a very minimalist cover, and it's all about creativity.

33:08

So, yeah, it's interesting. He doesn't imbue himself in the book at all; that's how much he takes himself out of the picture. He's talking about his ideas. It's one that you can read in snippets, and parts of it are really interesting. This is kind of silly, but I have found nuggets to be very applicable to a problem I'm dealing with that week. So for me it's more of a slow read, because it's sort of reflective. That's been an interesting one.

33:34 - Rahul Abhyankar (Host)

Yeah, Rick Rubin's book seems like one you could pick up, skip to any page, and have something interesting to read. That type of a book.

33:43 - Nadim Hossain (Guest)

Exactly, because clearly he's someone who is an expert, who is very steeped in creativity. As product leaders, yeah, we're creative, but sometimes that's not the number one thing we're good at, right? So I think it's a good reminder of some of the softer aspects of creating products, like inspiration. It is hard to pin down, but he talks about it in a really relatable way.

34:10 - Rahul Abhyankar (Host)

Yeah, excellent. Well, Nadim, thank you so much for being so gracious with your time, and for the experience, wisdom, and knowledge that you've shared. I truly appreciate it and look forward to talking again soon.

34:22 - Nadim Hossain (Guest)

Likewise. Thank you.