External Testing with Tesena
Like it or not, long-term manual testing with in-house resources usually leads to a point where you become blind to your own bugs. Because of this, we looked for - and found - an opportunity to have our applications tested from the outside. Tesena and their Klikaton event were a perfect fit. Specifically, Showmax wanted to test our two native mobile applications - Android, written in Kotlin, and iOS, written in Swift.
What’s a Clickathon?
A clickathon is a public event in which anyone interested in bug hunting in other applications can participate. Usually, these are experienced testers who genuinely enjoy the testing process, but the person who finds the most bugs wins, so there’s a competitive element as well.
Tesena holds events like these often, and they are usually free of charge - the only investments are time and some decent prizes for the winners. In our case, we had gift certificates for Alza.cz (a large electronics e-shop) for the top three testers.
The bug hunts typically have about 15 people testing over the course of two hours. During the event, the development team sits in a different room and evaluates the severity of the discovered bugs in real time. Developers assign points to individual bugs, and the tester with the most points at the end wins.
In total, 13 testers attended the event, so we had to prepare 13 devices each for iOS and Android. Due to the limited number of iOS devices, some participants got an iPhone and some an iPad.
Tesena helped us prepare test accounts that gave the testers full access to the application so they could fully utilize every tool at their disposal. We decided that the participants would test against our testing environment, mainly because it’s easier to set everything up there than in production.
We prepared accounts, devices, and wifi access, but that was still not enough. The next step was to figure out how to get the builds onto devices that did not belong to us. For iOS, we generated a public link from iTunes Connect, shared it via a note, and sent it to the devices via AirDrop. From there, we opened the link, installed TestFlight, and then installed our application.
For Android, we used Crashlytics Beta to distribute the build. All we had to do was forward the email invitation to the build, install Crashlytics Beta, and then download the test build from it.
Because I also planned to take pictures at the event (for this post, in fact), we had our lawyer draw up a consent document for the protection of personal data under GDPR.
After two hours of testing, it was time to evaluate. The bugs had been reported in a shared Google Sheet, which our designers and developers then analyzed. Testers did not just report bugs - they also suggested improvements.
For both applications, the issues were with things like the search bar, where people were expecting different behavior and/or results. The second most common problem was with filters in the Sports section. We discussed both things with our designers and are working on plans to improve them.
In the end, testers found 173 issues in our applications over the course of two hours - 88 on Android and 85 on iOS.
Because our new features were not 100% ready for testing, we didn’t get the results we expected. Instead, the testers surfaced more general shortcomings in the application, and reminded us that seeing an app for the first time does not mean one understands how it works - things that are clear to us are not necessarily clear to new users. The lesson is applicable to anyone building anything: even in the clearest of cases, a normal user may not grasp the functionality of what you’ve built. If the testers, who are generally experienced users of tools like this, were unclear on things, that effect will only be magnified across a population of regular consumers.
The testers’ findings with regard to search were amplified by a mismatch in search behavior between Android and iOS. On iOS, search triggers only after three characters have been entered, but on Android it starts immediately after the first character. The two applications also display search results differently, which further confused the testers, who had both platforms side by side and could compare them directly. For example, when they searched for “Intouchables” using the string “Int”, it did not appear in the top position. Instead, films with “int” elsewhere in the name, like “Sintel” or “Match Point”, came first.
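The ranking problem the testers hit - substring matches outranking prefix matches - can be sketched in a few lines of Kotlin. This is a minimal, illustrative fix, not Showmax’s actual search code; the function and field names are assumptions:

```kotlin
// Minimal sketch: rank titles so that prefix matches come before
// plain substring matches. Illustrative only - not the real search backend.
fun rankSearchResults(titles: List<String>, query: String): List<String> {
    val q = query.lowercase()
    return titles
        .filter { it.lowercase().contains(q) }
        .sortedWith(
            compareBy(
                // false sorts before true, so prefix matches
                // (e.g. "Intouchables" for "int") come first
                { !it.lowercase().startsWith(q) },
                // then prefer shorter titles, then alphabetical for stability
                { it.length },
                { it }
            )
        )
}
```

With this ordering, `rankSearchResults(listOf("Sintel", "Match Point", "Intouchables"), "Int")` puts "Intouchables" first, which matches what the testers expected to see.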
Another commonly reported issue was a filter in the Sports section where filtering for events “today” also displayed upcoming events.
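The fix for that class of bug is usually to compare calendar dates rather than timestamps, so “today” excludes anything starting tomorrow or later. A hedged Kotlin sketch, with an invented `SportsEvent` type standing in for the real model:

```kotlin
import java.time.LocalDate
import java.time.LocalDateTime

// Illustrative event type; the field names are assumptions, not the real model.
data class SportsEvent(val title: String, val startsAt: LocalDateTime)

// "Today" should mean events whose start date equals the current date,
// not everything from now onwards - comparing LocalDate values avoids
// accidentally including tomorrow's events.
fun eventsToday(
    events: List<SportsEvent>,
    today: LocalDate = LocalDate.now()
): List<SportsEvent> =
    events.filter { it.startsAt.toLocalDate() == today }
```

Passing `today` in as a parameter (defaulting to `LocalDate.now()`) also makes the filter easy to unit-test with a fixed date.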
No one likes to see issues with something they built, but we were pleased that our external testers did not find any serious bugs that were previously unknown to us. This told us that we have a good team, and that our processes, test plans, and test scenarios are well-designed.
Of course, a huge thanks goes out to our testers. We’re now more aware of how new users can feel about our app and why some may be dissatisfied. Congratulations to the winners, and thank you all for your participation and help. A special thanks also goes to Tesena - without them, this never would have happened.