WeTest QA Case Study: A casual mobile game

This is a popular global mobile game maker which encountered issues in March 2020.

Due to COVID-19 quarantine policies, its original external compatibility-testing supplier failed to provide effective testing services, falling short in three areas: a lack of internal QA expertise, slow server response, and an insufficient pool of test device models.

Each of their tests covered only 20 device models, which means that potential compatibility issues on hundreds of other phone models went unexamined and unresolved.


To solve these problems, we created a roadmap of six steps:

First, collect the materials required for testing.

Second, technical experts communicate requirements, customize the test plan, and complete the test-case design.

Third, select the test models. In this case, we used the TOP 1,000 Models set, which covers 1,000 popular phone models worldwide, selected through big data and user analysis.

Fourth, after selecting the test models, we run automated testing and AI testing to identify issues in specific game scenarios.

Fifth, issues are monitored on real mobile devices in the cloud lab, which captures screenshots and real-time logs for further debugging.

Lastly, we classify issues, debug them, conduct regression testing, and generate the test report.
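The issue-classification step above can be sketched in miniature. The snippet below is a hypothetical illustration, not WeTest's actual tooling: it buckets raw device log lines into crash, ANR, and other categories so they can be triaged before debugging and regression testing. The patterns and labels are assumptions chosen for the example.

```python
import re

# Illustrative patterns for common Android failure signatures.
# Real triage rules would be far richer; these are placeholders.
CRASH = re.compile(r"FATAL EXCEPTION|SIGSEGV")
ANR = re.compile(r"ANR in")

def classify(log_line: str) -> str:
    """Assign a raw log line to a coarse issue bucket."""
    if CRASH.search(log_line):
        return "crash"
    if ANR.search(log_line):
        return "anr"
    return "other"

logs = [
    "FATAL EXCEPTION: main  java.lang.NullPointerException",
    "ANR in com.example.game (com.example.game/.MainActivity)",
    "I/ActivityManager: Displayed com.example.game/.MainActivity",
]
print([classify(line) for line in logs])  # ['crash', 'anr', 'other']
```

In practice, a device farm would feed thousands of such lines per model into a classifier like this, and only the crash and ANR buckets would be routed to engineers with the matching screenshots.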

This process is supported by millions of registered players, a management distribution platform, and a professional game assessment team.

Comprehensive Solutions

This is an overview of the comprehensive solutions that WeTest uses to test mobile games.

During the demo period, WeTest researched word-of-mouth trends, identified the causes of changes in sentiment, tracked recent hot events and key communication nodes, and analyzed industry trends in order to understand how the game was received in the market.

Across the full chain from R&D to operations, a variety of testing categories are activated, including localization quality assurance for the target market, functional testing, security testing, performance testing, network testing, compatibility testing, and payment testing.

Before the game's release, app store compliance and regulatory issues were addressed.

In the operation & growth phase, the performance of the game or app was monitored continuously through social listening, game crowd testing, crash reporting, and Client Application Performance Monitoring (APM).


The test models we selected cover 80% of global mobile users.

After many rounds of trial and comparison, more than 50% of the client's testing costs were saved through improved expertise and an updated device-model pool.

As a result, both the quality of the game and its profits improved significantly.
