How to Scope an MVP (Without Cutting the Wrong Things)
Every founder I've worked with has made one of the same two mistakes at least once. They either built too much — spending six months on features that turned out not to matter — or they cut too aggressively and launched something so stripped back that users couldn't understand what it was for.
Scoping an MVP is harder than it sounds. Not because the concept is complicated, but because it requires you to make hard decisions with incomplete information, under time pressure, while people around you have strong opinions about what the product needs to be.
After doing this across fintech, healthtech, blockchain, and AI — and helping founders at various stages do it too — I've developed a framework that keeps me honest. Here it is.
Start With the Riskiest Assumption, Not the Longest Feature List
Most MVP scoping starts in the wrong place. People open Notion, start listing features, and then try to cut things. The problem with this approach is that it anchors you to your own ideas about what the product should be, not what you actually need to learn.
The right starting point is your riskiest assumption.
Every product is built on a stack of assumptions. Some of them, if wrong, would kill the product entirely. Others, if wrong, are just annoying to fix. The MVP's job is to test the ones that would kill you.
When I was working on BantuPay, one of our riskiest assumptions was that users would adopt a username-based transfer system over wallet addresses. If people didn't trust usernames — if they felt less secure, or if they preferred to verify the full address — the whole UX direction was wrong. We needed to test that assumption before we built anything else around it.
Ask yourself: If this assumption is wrong, does the rest of the product still make sense? If no, that's what your MVP needs to answer.
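One way to make this concrete is to write your assumptions down with two honest judgments attached: would the product survive if this is wrong, and how confident are you that it holds? A rough sketch of that ranking follows; the entries are illustrative (the first echoes the BantuPay example above, the others are made up):

```python
# Each assumption gets two honest judgments: would the product survive if
# it's wrong, and how confident are we that it's right?
assumptions = [
    # (assumption, kills_product_if_wrong, confidence_it_holds: 0..1)
    ("Users will trust username-based transfers", True, 0.4),
    ("Users want a dark mode", False, 0.9),
    ("Users will pay after a 14-day trial", True, 0.6),
]

# The MVP should test the fatal assumptions you're least sure about.
riskiest = sorted(
    (a for a in assumptions if a[1]),   # keep only the fatal ones
    key=lambda a: a[2],                 # least confident first
)
print(riskiest[0][0])  # → Users will trust username-based transfers
```

The dark-mode assumption drops out immediately: even if it's wrong, the product survives, so it isn't MVP material no matter how uncertain you are about it.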
The Three-Column Test
For every feature you're considering, run it through three columns:
Column 1: Does it test a core assumption? If yes, it's a candidate for the MVP. If no, it doesn't automatically get cut — but it needs a stronger justification to stay in.
Column 2: Can users complete the core journey without it? This is where a lot of founders cut the wrong things. I've seen people remove onboarding copy to save time, then launch and watch users churn because they didn't understand what to do. The core journey isn't just the main feature — it's everything a user needs to go from landing on your product to experiencing its value.
Column 3: How long would it take to add after launch? Some things are genuinely hard to retrofit. Database schema decisions, authentication architecture, core UX flows — these have compounding costs if you get them wrong early. Other things — a dashboard widget, a notification preference, an export function — can be added cleanly later. Be honest about which is which.
Features that pass Column 1 and fail Column 2 are in: they test a core assumption, and the core journey doesn't work without them. Features that fail Column 1 but would be slow to retrofit (Column 3) need a very good reason to make it in. Everything else is Phase 2.
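The decision rule above can be sketched as a small triage function. The field names and return labels are my own shorthand, not anything formal:

```python
from dataclasses import dataclass


@dataclass
class Feature:
    name: str
    tests_core_assumption: bool  # Column 1
    core_journey_needs_it: bool  # Column 2 (users can't complete without it)
    hard_to_retrofit: bool       # Column 3 (honest estimate, not hope)


def triage(feature: Feature) -> str:
    """Apply the three-column test to one candidate feature."""
    if feature.tests_core_assumption and feature.core_journey_needs_it:
        return "in"
    if not feature.tests_core_assumption and feature.hard_to_retrofit:
        return "needs a very good reason"
    return "phase 2"


triage(Feature("username transfers", True, True, False))   # → "in"
triage(Feature("auth architecture", False, False, True))   # → "needs a very good reason"
triage(Feature("CSV export", False, False, False))         # → "phase 2"
```

The point of writing it this way is that the default is Phase 2: a feature has to earn its way in, rather than earn its way out.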
Define "Launch" Before You Define the MVP
This sounds obvious, but it trips people up constantly. "Launch" means different things:
- A private beta with 10 users you know
- A public beta with a waitlist
- A press launch with a marketing campaign
- Full public availability
Your MVP looks completely different depending on which one you're targeting. For a private beta, you can get away with rough edges, manual processes, and incomplete flows — as long as the core value is demonstrable. For a press launch, you need polish, an empty-state strategy, and error handling that doesn't embarrass you.
When I built the UAT system for SwissPay, we had 170+ test cases across 18 features. Not because I love writing test cases, but because we were going into a controlled beta with users who needed to trust that the product worked. The launch context defined the bar.
The "What Would Break Trust" Filter
Here's a filter I use that doesn't come from any framework: for each feature you're cutting, ask whether the absence of it would break trust with your first users.
Trust-breaking cuts are often invisible until it's too late. Things like:
- Removing email confirmations ("we can add that later") — then users don't know if their action worked
- Cutting error states ("it'll be fine") — then users think the product is broken when they make a mistake
- Skipping empty states — users land on a blank screen with no guidance and leave
- Removing password reset ("people can email us") — immediately signals amateur product
None of these are exciting features. None of them are part of your core value proposition. But their absence signals to users that the product isn't ready, or that you don't care about their experience.
Keep the things that preserve trust. Cut the things that are genuinely just nice to have.
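The trust filter is easy to run mechanically once you've written your own list down. A minimal sketch, seeded with the examples above (the set is illustrative, not exhaustive — build your own for your product):

```python
# Features whose absence tends to break trust with first users.
TRUST_CRITICAL = {
    "email confirmations",
    "error states",
    "empty states",
    "password reset",
}


def check_cuts(proposed_cuts: list[str]) -> list[str]:
    """Return the proposed cuts that would break trust and should stay in."""
    return [cut for cut in proposed_cuts if cut in TRUST_CRITICAL]


check_cuts(["password reset", "export to CSV"])  # → ["password reset"]
```

Anything the check flags goes back into scope; anything it passes is a genuine nice-to-have and safe to defer.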
Accept That the MVP Will Be Wrong
The final thing I'd say is this: the MVP is not supposed to be right. It's supposed to be informative.
The goal isn't to launch something that's ready for everyone. It's to launch something that teaches you what you need to know so you can build what's actually needed.
When I launched Timbuktu — the decentralised exchange we built at Bantu Blockchain — we had 7,400 users within months. We also had a list of things we got wrong that became obvious the moment real people started using it. That list was worth more than any discovery session or user interview we could have run beforehand.
Scope for learning. Build the thing that gets you the information. Then iterate.
The founders who get this right aren't the ones who scoped perfectly — they're the ones who launched something real, stayed close to their users, and moved fast when the feedback came in.
A Quick Checklist
Before you lock your MVP scope, run through these:
- Have you identified your top three riskiest assumptions?
- Does the MVP test at least one of them directly?
- Can a new user complete the core journey end-to-end without hitting a dead end?
- Have you defined what "launch" means in concrete terms?
- Have you checked each cut feature against the "does it break trust?" filter?
- Do you know how you'll collect feedback from your first users?
If you can answer yes to all of these, you're in a much better position than most teams at this stage.
Scope clearly. Ship something real. Learn fast.