
Enhancing Business Value with Automation: Practical Team Practices

Learn how the QJIAYI Tech Quality Team enhances the business value of automation through practical practices. With 10k+ test cases in continuous regression and 80+ bugs detected per month, the team has turned automation into a business-driven capability.

Introduction

Automation has become a core part of modern quality assurance and R&D efficiency. However, many teams struggle with low defect detection, high maintenance costs, misaligned technologies, and difficulty proving real business value.

After years of practice and iteration, the QJIAYI Quality Team has built a mature automation system that now runs 10,000+ UI and API test cases in continuous regression and detects more than 80 bugs per month. This article shares how we transformed automation from a costly experiment into a stable, business-driven capability.

Common Challenges in Test Automation

Many teams face similar pain points when building automation systems:

  • High volumes of automated test cases, but few real defects found.

  • Frequent business changes lead to heavy script maintenance.

  • New frameworks and tools often fail to adapt to real business scenarios.

  • Stakeholders doubt the value of automation and are reluctant to invest.

  • Long-term investment fails to deliver the expected return.

Teams often jump between new tools and frameworks without solving fundamental problems, creating a cycle of inefficiency. At QJIAYI, we broke this cycle by aligning automation closely with business goals and team workflows.

Align Automation Goals with Business Stages

Rather than chasing universal metrics such as code coverage, pass rate, or CI compliance, we believe automation goals must match the business lifecycle.

Lessons from Real-World Practice

  • In early-stage products, over-emphasizing high test coverage can slow iteration and hurt customer experience.

  • Neglecting automation during rapid growth leads to online failures, regressions, and damaged brand reputation.

We structured our automation roadmap into three progressive stages:

1. Tool-Driven Stage

We built foundational frameworks including:

  • Apollo API Automation Framework

  • Hades UI Automation Framework

  • Data Regression Platform

The goal was to replace repetitive manual tests with stable script execution and improve case writing efficiency.
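
The article doesn't describe Apollo's internals, so here is a minimal, hypothetical sketch of the pattern such a framework enables: data-driven API cases where payloads and expected fields live in data rather than in scripts. The `fake_call` client, `run_regression` helper, and `/v1/price` endpoint are all illustrative stand-ins, not the real framework API.

```python
# Illustrative data-driven API regression sketch (not the real Apollo API).
# Each case: endpoint, request payload, and expected response fields.
CASES = [
    {"endpoint": "/v1/price", "payload": {"sku": "A100", "qty": 2},
     "expect": {"currency": "CNY"}},
    {"endpoint": "/v1/price", "payload": {"sku": "A100", "qty": 0},
     "expect": {"error": "INVALID_QTY"}},
]

def fake_call(endpoint, payload):
    """Stand-in for the real HTTP client so the sketch is self-contained."""
    if payload.get("qty", 0) <= 0:
        return {"error": "INVALID_QTY"}
    return {"currency": "CNY", "total": 100 * payload["qty"]}

def run_regression(cases, call=fake_call):
    """Run every case and collect (case, field, actual) tuples for mismatches."""
    failures = []
    for case in cases:
        resp = call(case["endpoint"], case["payload"])
        for key, want in case["expect"].items():
            if resp.get(key) != want:
                failures.append((case, key, resp.get(key)))
    return failures
```

Keeping cases as data is what makes "improve case writing efficiency" possible: adding coverage means adding a dictionary, not writing a new script.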

2. Platform-Driven Stage

We developed a unified automation platform for:

  • Centralized test case management

  • Standardized execution scheduling

  • Automated reporting and metric analysis

This platform allowed more team members to participate and improved overall efficiency.
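
As a rough illustration of the reporting layer such a platform provides (the data shapes and `summarize` helper are assumptions, not the real platform API), aggregating raw run results into per-suite metrics might look like:

```python
from collections import Counter

# Illustrative raw execution records as a scheduler might emit them.
RUNS = [
    {"suite": "api", "case": "price_ok", "status": "pass"},
    {"suite": "api", "case": "price_zero", "status": "fail"},
    {"suite": "ui", "case": "login", "status": "pass"},
]

def summarize(runs):
    """Roll raw run records up into per-suite totals and pass rates."""
    by_suite = {}
    for r in runs:
        by_suite.setdefault(r["suite"], Counter())[r["status"]] += 1
    report = {}
    for suite, counts in by_suite.items():
        total = sum(counts.values())
        report[suite] = {"total": total, "pass_rate": counts["pass"] / total}
    return report
```

Centralizing this rollup is what lets non-specialists read one dashboard instead of many per-framework logs.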

3. Engineer-Driven Stage

Test engineers took ownership to optimize systems, expand scenarios, and innovate solutions:

  • Scenario-based and data-driven testing

  • Image comparison, JSON validation, and data consistency checks

  • Platform-based testing for internal packages

This stage turned automation from a tool into a team-driven capability.
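
A minimal sketch of the JSON-validation idea above, assuming validation means checking that every expected field appears in the actual response with the right value (the `validate` helper and its error format are illustrative):

```python
def validate(expected, actual, path="$"):
    """Check that every field in `expected` appears in `actual` with the
    same value; nested dicts are checked recursively. Extra fields in
    `actual` are ignored, so responses can grow without breaking cases."""
    errors = []
    if isinstance(expected, dict) and isinstance(actual, dict):
        for key, want in expected.items():
            if key not in actual:
                errors.append(f"{path}.{key}: missing")
            else:
                errors.extend(validate(want, actual[key], f"{path}.{key}"))
    elif expected != actual:
        errors.append(f"{path}: expected {expected!r}, got {actual!r}")
    return errors
```

Tolerating extra fields is a deliberate choice for regression suites: it keeps cases from failing on every backward-compatible API addition.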

Strategic Automation for Complex Business

One-size-fits-all automation rarely works in complex systems. At QJIAYI, our business spans design tools, merchant backends, open platforms, mini-programs, and international services. We use different automation strategies for different technical architectures, including backend services, open APIs, front-end components, and plugins.

Lessons from Long-Chain Automation Attempts

We once built an end-to-end automation system covering design, generation, and data production. While it detected real issues, we faced:

  • Unstable data and inconsistent IDs

  • High comparison noise and maintenance costs

  • Difficulty generalizing front-end interactions

  • High cross-team collaboration costs

This experience taught us to:

  • Strengthen front-end data validation

  • Conduct in-depth feasibility research before implementation

  • Consider both technical and organizational challenges
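
The comparison-noise problem above is often tamed by stripping volatile fields before diffing. A minimal sketch, assuming IDs and timestamps are the noisy fields (the `VOLATILE` set and `normalize` helper are illustrative):

```python
# Fields that legitimately differ between runs and should not count as diffs.
# This set is an illustrative assumption, not a list from the real system.
VOLATILE = {"id", "trace_id", "created_at", "updated_at"}

def normalize(obj):
    """Drop volatile keys recursively so a diff reflects real changes."""
    if isinstance(obj, dict):
        return {k: normalize(v) for k, v in obj.items() if k not in VOLATILE}
    if isinstance(obj, list):
        return [normalize(v) for v in obj]
    return obj
```

Comparing `normalize(baseline) == normalize(candidate)` removes the inconsistent-ID noise at the cost of deciding, per field, what counts as volatile.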

How We Implemented Automation Effectively

In the early stages, our metrics looked good: high coverage and a high pass rate. Yet stakeholders still questioned the value of automation. We took targeted actions:

1. Analyze Online Failures Backwards

We reviewed every bug that escaped to production to identify gaps in assertions and scenario design, and to eliminate false positives. This made automation more targeted.

2. Review Automation Code Regularly

Regular reviews of automation code improved stability, reduced redundancy, and standardized development practices.

3. Integrate Automation into Stability Projects

For high-risk businesses, automation became part of project goals from the start. We worked with developers to simplify data construction and improve testability.

4. Optimize CI Stability

We tracked frequent failures, fixed environmental issues, and reduced non-bug CI blockages. This made automation trusted and efficient.
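
Tracking frequent failures can start with something as simple as flagging cases whose outcomes flip between consecutive runs. A hedged sketch (the thresholds, data shapes, and `find_flaky` helper are assumptions, not the team's actual tooling):

```python
from collections import defaultdict

def find_flaky(history, min_runs=5, flip_ratio=0.2):
    """Flag cases whose pass/fail outcome flips often across runs.
    `history` is a list of {"case": name, "status": "pass"|"fail"} records
    in execution order; thresholds here are illustrative defaults."""
    outcomes = defaultdict(list)
    for run in history:
        outcomes[run["case"]].append(run["status"])
    flaky = []
    for case, results in outcomes.items():
        if len(results) < min_runs:
            continue  # too little data to judge
        flips = sum(1 for a, b in zip(results, results[1:]) if a != b)
        if flips / (len(results) - 1) >= flip_ratio:
            flaky.append(case)
    return flaky
```

Cases flagged this way can be quarantined or fixed first, which is what makes a CI gate trustworthy: a red build means a real bug, not an environment hiccup.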

Deepening the Business Value of Automation

Once automation was stable, we amplified its impact:

  • Recognized and rewarded teams for automation innovation

  • Used major tech projects to test and improve automation capabilities

  • Encouraged internal sharing of practical methods

  • Connected automation failures directly to bug tickets with logs and quick re-run functions

These steps turned automation into a trustworthy, indispensable part of the development pipeline.
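
The last bullet above, wiring automation failures directly to bug tickets with logs and a quick re-run, can be sketched as a small adapter (the ticket fields and the `ci run` command are hypothetical, shown only to illustrate the shape of the integration):

```python
def failure_to_ticket(result):
    """Turn a failed automation result into a bug-ticket payload carrying
    the log tail and a one-click re-run command (all fields illustrative)."""
    return {
        "title": f"[auto] {result['suite']}/{result['case']} failed",
        "log_excerpt": result["log"][-500:],  # last 500 chars of the log
        "rerun_cmd": f"ci run --suite {result['suite']} --case {result['case']}",
    }
```

Attaching the log tail and a re-run command to the ticket is what removes the triage friction: the engineer can confirm or dismiss a failure without hunting through CI history.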

Conclusion

Building automation is easy. Building a sustainable, business-aligned automation system is difficult. Our key takeaways:

  • Match automation goals to business stages, not just KPIs.

  • Collaborate closely with product and R&D teams.

  • Build your automation platform like a real product—with tools, training, and operation mechanisms.

  • Learn from failures and iterate continuously.

When you focus on making automation stable, effective, and business-oriented, your team and partners will follow.
