
“Don’t Be Elizabeth Holmes: When To Make That Difficult Decision To Cut Scope”: Natasha Harpalani with AWS (Video + Transcript)

April 10, 2024

Sometimes, it’s hard to know when to push for excellent products and when to cut scope. Natasha Harpalani (Amazon Web Services Senior Technical Product Manager) will share her framework for knowing when to cut scope on a project. Attendees will learn how to measure the tradeoffs of delivering a project on time and as planned versus cutting scope, how to execute a decrease in scope through communication and planning, and how to cut scope without compromising on an excellent customer experience.


WATCH ON YOUTUBE

In this ELEVATE session, Natasha Harpalani, a senior technical product manager at Amazon Web Services, discusses the importance of knowing when to cut scope in product development. She emphasizes the need to monitor and track progress, set milestones, and constantly evaluate against success criteria while analyzing customer needs and understanding what is truly necessary to achieve product goals.

Like what you see here? Our mission-aligned Girl Geek X partners are hiring!

Natasha Harpalani speaking at ELEVATE: always monitor and track progress for any product.

Transcript of ELEVATE Session:

Natasha Harpalani:

Thank you so much and hello, everyone. I wish I could see every individual here, but really, just can’t emphasize how excited I am to be speaking and to be a part of this conference. As you kind of all noted through my very kind introduction, my name is Natasha Harpalani and I’m a senior technical PM at AWS. As you also may have seen through the title of my presentation, I’ll be speaking about how not to be Elizabeth Holmes and really talking through when to make that difficult decision to cut scope.

I will explain the Elizabeth Holmes reference a little bit later in my presentation, but this is a topic that’s really near and dear to my heart and something that I’ve learned, I think, a lot about in every product experience I’ve had.

Before I dive into the details of this presentation, just to expand a little bit more on my background, I started my product management career at a startup called AppNexus. AppNexus was an adtech company that was later acquired by AT&T. My experience there was working on machine-learning products. After spending time at AppNexus and then, subsequently, at AT&T, I pivoted into the cleantech space. I went back to the startup world and worked at Mainspring Energy as an IoT product manager. Mainspring develops a clean generator that can run on hydrogen and biogas. From Mainspring, I came to AWS, where I continued to follow my interest and my passion working on solutions to climate change, and this is where I’m at today.

Today, in my day-to-day job, I focus on working with utility companies and clean energy companies, helping them use the cloud to scale past some of the data issues they have and to work with the increasing complexity of data that a lot of utilities and clean energy companies are challenged with as the electric grid changes very rapidly, with more and more renewables coming online.

Now that you know a little bit about me, just to set the stage for this presentation, I want to start by explaining and talking through an experience that I had very early on in my product management career. One of the first products that I worked on was a product I developed very closely with a group of machine-learning engineers and data scientists, and what we were aiming to do was to predict the likelihood of someone online to purchase a product. It was a really interesting product to develop, I was personally really fascinated by it, and as we worked on this product, we hit this point where we’re all sitting in this room and I remember sharing with the team, “What if we use this additional data set that we’ve all talked about that we all wanted to explore to make our prediction algorithm better?”

The intent was good, but it was a situation where I was completely wrong. I was wrong, because we were working on releasing a product that had pretty tight timelines, that was already fairly complex to begin with, and I really needed to take a step back and reevaluate what the team could and could not accomplish in a meaningful amount of time.

After this day, I remember having a pretty hard conversation with the engineering manager that I worked with, and I will never forget it. He sat me down and literally said to me, “Natasha, you are asking the team to build Theranos. It’s like you want this perfect product that does everything and is beautiful and has all this functionality and delivers a perfect product to customers, but we work at a startup and we have some really aggressive timelines.” While I do love a turtleneck, the analogy to Elizabeth Holmes was a little frightening and one that I will never forget.

However, even though I didn’t necessarily appreciate the reference to Theranos and to Elizabeth Holmes, I will never forget this lesson.

What I’m hoping to do today is to share how that lesson has helped me in some of my product management experiences and, hopefully, be able to share some of that with you to help you as you’re either a product manager, an engineer, a designer, really anyone that’s working on developing products that customers love.

To set the stage for what you can expect over the next 15 or so minutes, I’m going to talk about how to really monitor and track progress when you’re working on a product, so you can identify when you might be getting to a point where you need to cut scope. I’ll talk about evaluating your product and your goals against the success criteria that you may have started with, and also discuss how to really analyze what customers truly need and separate that from what you may want to build.

Finally, I’ll talk about when not to cut scope and end things off by hitting on a few key takeaways that you all can hopefully use and leverage in your experiences. With that, let me start with one of the, in my opinion, most underappreciated and undervalued parts of product management, which is monitoring and tracking progress.

If you work on developing software, it’s possible that you’ve seen Gantt charts and milestone tables, something like what I have on the screen. I’ve certainly seen versions of these that are far longer, far more complex, and have 50 boxes on a page. Regardless of how a team organizes and tracks milestones, I think it’s so important to set milestones and constantly track against them.

This is the number one thing that a team has to do in order to be able to execute well, and it’s the number one thing that I think will set up a team and a software development group to make sure that they’re always evaluating how they’re doing against their milestones and can get to the point where they recognize that they need to make a decision and cut scope on a product.

When you’re looking at milestones, it becomes very clear when something is off. The very common causes that lead folks and teams to have to cut scope tend to be these: work takes longer than expected, and that happens all the time; there are unexpected bugs and unforeseen issues; and there can also just be delays outside of your team. For example, if you work with another engineering team that is developing a component of the product that you’re putting out to your customer, you have a real dependency there.

Despite some of those very common causes, once you look at your program and roadmap, look at your milestones, and realize, “Okay, I’m not going to hit a certain date,” or, “I’m not going to hit a certain milestone,” there are generally three options you can evaluate. I’ll caveat that these are not the only options, but I’d say that these are almost always three of the main options: you can extend the timeline on something, or you can add additional resources.

That might mean recruiting and including more engineers, designers, or data scientists on the team to help the velocity of that team. Or, third, you can cut scope.

This third one is a very common one, and while it’s sometimes very hard to let go of the P0s, P1s, and P2s that you want to deliver to a customer, I think it’s so important to become really comfortable saying, “You know what? It’s okay, we can cut scope and we can evaluate this and still deliver a really great product.” One thing I’ll note, coming back to the example that I gave you very early on: in the case of the conversion optimization product that I was developing with a team, we had a really strict timeline. We wanted to release this product by the end of Q3, before hitting the holiday season in Q4.

That was really important, because in the ad tech industry, the holiday season is one of the biggest buying seasons and that leaves users sometimes less willing or interested in testing a new product, because there’s just so much spend happening during that time.

Once you’ve made that hard decision and realized, “Okay, I need to cut scope,” one of the most important things I think you have to do is come back to the success criteria that you and your team have talked about and aligned on for your product. Let’s say that, for some reason, you don’t have those success criteria, those can always be developed later on. Really, all that means is, what will make this product release, product launch successful?

The reason it’s so important to take a look at your success criteria is, especially as you start product development, I think that sometimes you can really reevaluate and take another look at, “What is truly necessary for me to achieve the product goals that are set out?” Sometimes you can let go of a few P2s, sometimes you can let go of a few P1s.

Heck, sometimes you can let go of P0s, as long as the work that you’re doing helps you achieve the goals that you’ve set. And that also really involves looking at every single feature. If a feature is a nice to have, or maybe it’s a great feature, but the amount of value it drives for the customer is small, that could be a candidate for something that maybe isn’t released and launched with the first release of a product, but comes shortly thereafter.

Coming back to the example that I’ve been discussing during this presentation, I think back to this example all the time, actually. It’s amazing how early lessons really stick with you. For the product I was developing, the goal that we had set at the very beginning of product development, and at the beginning of our beta testing, was that if 75% of users testing this new product could achieve their performance goals, then we would release this product to GA.

As I was talking to my team, as we were thinking about using additional, richer data sets to make our prediction algorithms even better, one thing that was really important to look at was, “Hey, will making our prediction algorithm better actually help us achieve our goal, or will it just make us surpass that goal?”

At the end of the day, if we see that 75% of our users are achieving their goals, then making an algorithm even better is great, and we always want to deliver value to our customers and give them the best experience that we can, but we’re potentially already achieving our goal for this period of time when we’re testing with beta users.

Another really important evaluation step is to analyze the customer need, and this maybe sounds simple, right? Duh, of course I should be always looking at customer needs. But I think in situations like this, when you’re looking to cut scope, you have to come back to the customer and come back to, “What are truly the customer requirements and what are the customer needs?” Looking at things and really deeply understanding what are the customer’s goals, what are they gaining from this experience? Will the customer gain value from a partial test?

Is there something that you can do to supplement the customer’s usage of your product, such as providing additional customer support or additional documentation? Is your customer willing to use a workaround? These are all really important questions to ask your customer, potentially more than one customer, and to understand.

I am constantly surprised, and I have to remind myself, that especially early customers that are testing a new product with you tend to be pretty agile and really open to the fact that they’re testing a new product. It’s not the end of the world sometimes for certain customers if there are one or two bugs, and it’s not the end of the world if maybe they have to use a workaround, so long as you’re very aware of what it is that the customer is trying to get out of this. Because if I can still deliver that value and I’m releasing one bug with it, but have documented that bug, it might be okay.

Once again, just to set the stage with an example: when I was developing the conversion optimization product, the crux of that product was that our end user had a performance target. They either hit it or they didn’t. Of course, they would love to surpass their performance goal, that’s always great, but the definition of success for my customer was, “Did they or did they not achieve their goal?”

Therefore, coming back to that understanding, and remembering that giving our customer even more value is great but that what matters is how they’re judged and how they’re evaluated, once again helped us determine, “Okay, maybe we don’t need to look at everything that we had originally discussed and outlined in our project plan if we can still help our customer achieve that outcome.”

With that, I would be remiss if I didn’t at least touch on when not to cut scope. I don’t think any of these will be particularly surprising, because in many ways, I touched on this when describing the rubric you can use for when and how to cut scope, but I do think it’s worth really calling out.

If cutting scope will lead to your product no longer solving a problem, it’s not worth it. If it leads to delivering such a terrible customer experience that you might lose trust with your customer or your customer will have such a bad experience that they won’t come back to testing or using your product, it might not be worth it.

And then most importantly, if cutting scope will keep you and your team from really learning, getting the insights, and achieving those product goals that you need to launch and really develop a great, cohesive product, it’s also not worth it. With that, I want to say there are always ways to cut scope and to think about it, but you do have to be careful and make sure you don’t cross that line where you end up delivering something that isn’t actually achieving the goals that you or the customer set out to achieve. Let me tie back to some of the things I hope you take away here.

Remember to always monitor and track progress for any product. This is so important to help you get to a point where you even know you have to make a decision to potentially cut scope. I also think it’s just a great habit and great culture to create, where the team is constantly evaluating, “Hey, how are we doing against the goals and the timelines that we had set out?” Those can obviously change a lot as you get into the thick of a project and you’re developing a new product, and so constantly keeping an eye on that is so important, because the worst thing that could happen is you get to a certain point in time when you’re supposed to release something to a customer and you realize, “Shoot, I can’t do this. I wish I had cut scope earlier on.”

Next, make sure to always evaluate what you’re delivering to the customer against your success criteria and really deeply understand the needs of your customer. As a product manager, it can sometimes feel really difficult to cut scope. I love developing great products that look beautiful, that feel beautiful, that have all the functionality that my customers love and want.

Amanda Beaty:

Thanks, Natasha. I’m sorry, we’ve got to…

Natasha Harpalani:

No worries, no worries.

Amanda Beaty:

Thank you so much.

Natasha Harpalani:

Thank you so much.

Amanda Beaty:

Everybody, I hope everybody can join us in the next session. Thank you, Natasha.

Natasha Harpalani:

Thank you.
