Today's gems are mined ⛏️ from another Sub Club podcast episode, this time with Leon Sasson (Co-founder & CTO at Rise Science - Sleep & Energy Tracker).
A really interesting discussion, especially the second part on onboarding and testing.
From B2B to a $500k ARR Subscription App
Even when they were in B2B, they would drive cohorts of users to the app for product-development purposes (initially through TestFlight). This allowed Rise to gather some analytics data as well as talk with actual users about their sleep challenges.
If you’re looking to improve your B2B sales practices, go to a Salesforce website and sign up for a demo to see how they follow up.
When in B2B, Rise would “coach” players one-on-one and had to convince them that sleep matters and that Rise solves that problem. They did this in 10-minute presentations that they kept refining, and they’ve tried to emulate this in the app’s onboarding.
The job of onboarding is never to show people how to use the app; if you need a tutorial, the app is too complicated. People want to know how your product affects their lives and why they should care about it. Onboarding is education and convincing.
If your content is good and you’re giving people information that’s relevant to them, fewer screens is not automatically better. Even if more screens add friction, if they create more value for the user then you should keep them.
Rise nurtures intent throughout onboarding so that even though there is friction, people keep going through. (David, RevenueCat)
Testing is key to figuring out the product side, but testing by itself won’t make a great product: you need to talk to users. Long before testing, Rise got on hundreds of Zoom calls (offering gift cards as thanks), both to get feedback on designs and the app and to learn what people had already tried to improve their sleep.
Even if they have a strong opinion on why something might work better than what they have now, they’ll still A/B test it (to statistical significance), whether it’s a small change like the paywall messaging or a redesign of the home screen.
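To make “statistical significance” concrete, here is a minimal sketch of the standard two-proportion z-test you could run on a paywall experiment. The function name and the numbers are illustrative assumptions, not from the episode; real experimentation platforms do this (and more) for you.

```python
import math

def two_proportion_z_test(conversions_a, visitors_a, conversions_b, visitors_b):
    """Compare conversion rates of variants A and B.

    Returns (z, p_value) for a two-sided test of the null hypothesis
    that both variants convert at the same rate.
    """
    rate_a = conversions_a / visitors_a
    rate_b = conversions_b / visitors_b
    # Pooled conversion rate under the null hypothesis
    pooled = (conversions_a + conversions_b) / (visitors_a + visitors_b)
    # Standard error of the difference in rates
    se = math.sqrt(pooled * (1 - pooled) * (1 / visitors_a + 1 / visitors_b))
    z = (rate_b - rate_a) / se
    # Two-sided p-value from the standard normal CDF
    p_value = 2 * (1 - 0.5 * (1 + math.erf(abs(z) / math.sqrt(2))))
    return z, p_value

# Hypothetical paywall test: 10.0% vs 13.0% trial-start rate
z, p = two_proportion_z_test(100, 1000, 130, 1000)
print(f"z = {z:.2f}, p = {p:.4f}")  # p < 0.05 here, so the lift is significant
```

A p-value below your chosen threshold (commonly 0.05) is what lets you say the winning variant’s lift is unlikely to be noise rather than a real improvement.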
You need to test the most extreme things, and the things that are fastest to learn from. Example: getting users into the free trial the same day they onboard (otherwise it can take you two months to learn something).
Having counter-metrics is helpful so you don’t move one metric in a direction that ends up hurting somewhere else. Unfortunately, a lot of funnel optimizations end up like that. Example: increasing trial starts but hurting NPS and long-term retention.
Rise talked to people in the space (Calm, Headspace, etc.) to understand their year-1 retention and get ranges of what the best apps achieve, in order to estimate LTV.
If your A/B test doesn’t give you an obvious answer, the variant probably isn’t better and shouldn’t be rolled out. The exception is when you know it’s something users want.