Assumption Testing: Don’t Throw Darts in the Dark
When software products fail, it is usually because they never deliver the intended impact. One common cause is that teams don’t tackle risks upfront. Will users choose to use the feature? Will users know how to use it? Will it deliver the expected business results?
Assumption testing helps to answer these questions before the actual development starts. Running these tests might slow things down a bit, but ultimately it saves time — since you only ship validated ideas, there is less throwaway work.
Our team increased trailer playbacks by 44x on the Showmax website app within a few releases. This did not happen by chance. It was based on a deliberate, careful approach of running a series of mini-tests to verify underlying assumptions.
Always start with the customer
From user feedback, we knew that content discovery — finding a film or series to watch — was a pain point for many of our subscribers. Here’s one example:
Can Showmax show previews or a written summary? Sometimes you would like to know what the movie is all about before watching it. That would be nice.
The information about films and series was too hidden, too buried in the lower levels of the app — users had to do a bit of work to find it. At the same time, we learned from user interviews that trailers played a major role in deciding what to watch. Moreover, data analysis revealed a possible link between trailer playbacks and engaged viewing.
Would more accessible trailers drive engagement?
Our hypothesis was that making trailers easier to watch would drive engagement. We ran a simple A/B test by adding a trailer play button on the homepage that would show on hover. Daily trailer playbacks increased more than 3x.
More importantly, our engagement metrics also showed significant improvement. We were on the right track. The assumption that making trailers more accessible would make users watch more films and series was verified.
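For the curious, an A/B test like this hinges on giving each user a stable variant. Here is a minimal sketch of hash-based bucketing; the hash, experiment name, and variant names are all illustrative, not Showmax’s actual implementation.

```typescript
// Deterministic A/B bucketing sketch (hypothetical). Hashing the user ID
// with the experiment name gives every user a stable bucket, so they see
// the same variant on every visit.

function hashString(s: string): number {
  // FNV-1a 32-bit hash: stable and cheap, good enough for bucketing.
  let h = 0x811c9dc5;
  for (let i = 0; i < s.length; i++) {
    h ^= s.charCodeAt(i);
    h = Math.imul(h, 0x01000193);
  }
  return h >>> 0;
}

type Variant = "control" | "trailerButton";

function assignVariant(userId: string, experiment: string): Variant {
  // Salt with the experiment name so buckets don't correlate across tests.
  const bucket = hashString(`${experiment}:${userId}`) % 100;
  return bucket < 50 ? "trailerButton" : "control"; // 50/50 split
}

// The same user always lands in the same variant:
console.log(
  assignVariant("user-123", "homepage-trailer-button") ===
    assignVariant("user-123", "homepage-trailer-button")
); // true
```

Salting the hash with the experiment name matters: without it, the same users would land in the same half of every test, and experiments would contaminate each other.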
Stay with the customer
When we saw that the new feature delivered the expected positive results, it energised us to come up with a new, upgraded feature — trailer autoplay on hover. The implementation of the trailer button had been easy; if it hadn’t worked, we would have written off only a few hours of development. The more complex autoplay feature, however, carried many user-related risks.
Would users care about increased data usage?
We assumed that some users might be worried about data usage. In Africa, not everyone has unlimited fibre at home. We ran a questionnaire with a simple question:
Would you have any concerns about a preview playing automatically?
As it turned out, only a small fraction of website users worried about data. Even asking directly about data in the follow-up question raised only minor concerns. To minimise data consumption and reduce this usability risk further, we decided to use low resolution for the trailers.
Would users know how to use the feature?
In the older version of the Showmax app, clicking on the poster would take the user to the film detail. Now that the poster turns into the trailer playback window, we had to offer another way to get to the asset detail information. Our designer proposed two new routes: a dedicated button, and clicking on the poster background. Both of them would show more information about the film.
Not everyone on the team was convinced that users would understand the new navigation. To test the hypothesis, we used Maze, a tool that allows you to ask interactive questions and get answers quickly. The heatmap showing user clicks convinced us that most users would find their way without any difficulties.
Furthermore, the old design made it unclear whether a poster represented a film or series. The obvious solution would be to explicitly indicate the asset type. However, following the classical principle that good design involves as little design as possible, we assumed that showing the asset length would be enough.
“8 seasons” shows implicitly that you’re about to watch a series, whereas “1h 53m” points to a film.
Again, we tested the understanding in the Maze app. With 84% of respondents spotting the difference immediately, we gained confidence that the proposed solution would prove sufficient.
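The labelling rule above can be sketched as a small formatting function. This is a hypothetical illustration — the `Asset` shape and field names are assumptions, not Showmax’s actual data model.

```typescript
// Sketch of the implicit-type rule: instead of an explicit "Film"/"Series"
// badge, the length label itself signals the asset type.

interface Asset {
  title: string;
  seasons?: number;         // present only for series
  durationMinutes?: number; // present only for films
}

function lengthLabel(asset: Asset): string {
  if (asset.seasons !== undefined) {
    // "8 seasons" implicitly says "series"
    return `${asset.seasons} season${asset.seasons === 1 ? "" : "s"}`;
  }
  // "1h 53m" implicitly says "film"
  const mins = asset.durationMinutes ?? 0;
  const h = Math.floor(mins / 60);
  const m = mins % 60;
  return h > 0 ? `${h}h ${m}m` : `${m}m`;
}

console.log(lengthLabel({ title: "Some Series", seasons: 8 }));         // "8 seasons"
console.log(lengthLabel({ title: "Some Film", durationMinutes: 113 })); // "1h 53m"
```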
Would the UX be good for most users?
Finally, one usability risk remained unaddressed: local internet speeds might not be fast enough for autoplay, delaying playback. To test this assumption, we asked engineers for help.
At Showmax, we believe that involving developers in product discovery, before the actual development, is vital. This proved invaluable when tweaking the trailer startup time. Using internet speed throttling tools, we tested numerous resolutions. On top of that, we tinkered with the backend a bit. Every time the app initiates a playback, it also calls a few APIs. Thanks to the engineers on our team, we realized that not all of those API calls were necessary to play trailers. Trimming the calls to the bare minimum further reduced the time required to launch trailers.
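The “trim the calls” idea boils down to awaiting only what playback strictly needs and letting everything else run in the background. The sketch below is purely illustrative — all function names, URLs, and the split between essential and deferrable calls are assumptions, not the real Showmax client.

```typescript
// Hypothetical sketch: only the manifest request blocks trailer start;
// non-essential calls (telemetry, history sync, etc.) fire after the
// video is already rolling.

async function fetchManifestUrl(assetId: string): Promise<string> {
  // Stub standing in for the one call playback actually depends on.
  return `https://cdn.example.com/trailers/${assetId}/low-res.m3u8`;
}

async function reportAnalytics(assetId: string): Promise<void> {
  // Stub for deferrable calls that used to delay playback.
}

async function startTrailer(assetId: string): Promise<string> {
  // Await only what playback depends on...
  const manifest = await fetchManifestUrl(assetId);
  // ...and let the non-essential calls run without blocking.
  void reportAnalytics(assetId);
  return manifest; // hand the manifest to the player immediately
}
```

The design choice is simply ordering: moving a call off the critical path is often cheaper than making it faster.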
Don’t throw darts in the dark
Upon the launch of the trailer autoplay feature, we observed a massive adoption. A few more iterative releases later, we achieved a 44x growth of trailer playbacks. More importantly, engagement metrics grew significantly as well.
Albert Einstein is often quoted as saying, “Assumptions are made, and most assumptions are wrong.” That’s what often happens in software. Don’t throw darts in the dark by developing expensive MVPs and hoping that the A/B test will turn out well — that is often a costly exercise. Test assumptions first, develop later.