About the Client:
Client is a leading broadband connectivity company and cable operator serving more than 32 million customers across 41 US states under its brand. Over an advanced communications network, the company offers a full range of state-of-the-art residential and business services, including Internet, TV, Mobile, and Voice.
Background:
The client was struggling with release cycle times for its new APIs, which were business-critical. The underlying development process was neither well defined nor consistently executed, and this directly affected delivery. The client needed a technology expert who could bring the process inconsistencies under control and help them navigate the entire development process.
Challenges:
- Fortnightly deployments across multiple services and applications
- Keeping up with fast-paced development cycles
- A slow and cumbersome validation process
- Delivery schedule and quality degrading as the number of deployments rose
- Insufficient time for testing applications and services
- Increasing difficulty pushing new updates and features into production
Approach:
- The selected projects were staffed by test automation specialists, with a test analyst/Subject Matter Expert (SME) available as needed to explain how the application operated.
- Models of the system's operation were developed iteratively as the team deepened its understanding of the application with the SME's guidance.
- A RAS specialist provided best-practice modelling recommendations.
- The team worked with the SME on complex areas of the application that required detailed knowledge of its behaviour.
- Once the models were complete, test cases were generated automatically, together with the test steps and the expected validations at each step.
- Test generation for data coverage with pairwise optimization (see the sketch after this list) was matched against optimized requirements-coverage generation, and both were compared with manually designed tests for completeness and quality.
- In a full end-to-end test process flow, the next step would be to link the Creator-generated executable test scripts to the selected test execution tool or tools for automated execution and collection of validation results.
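To make the pairwise optimization step concrete, below is a minimal sketch of 2-way (pairwise) test data selection using a greedy cover. The parameter names and values are illustrative assumptions, not the client's actual API inputs, and the project's real generation was performed by the modelling tool rather than hand-written code.

```python
# Minimal pairwise (2-way) test data selection via a greedy cover.
# Parameters and values below are illustrative assumptions only.
from itertools import combinations, product

parameters = {
    "account_type": ["residential", "business"],
    "service": ["internet", "tv", "mobile", "voice"],
    "payload_format": ["json", "xml"],
}
names = list(parameters)

def pairs_covered(row):
    """All parameter-value pairs exercised by one candidate test row."""
    return {
        ((names[i], row[i]), (names[j], row[j]))
        for i, j in combinations(range(len(names)), 2)
    }

# Every 2-way combination a pairwise suite must cover at least once.
required = set()
for i, j in combinations(range(len(names)), 2):
    for a, b in product(parameters[names[i]], parameters[names[j]]):
        required.add(((names[i], a), (names[j], b)))

# Greedy selection: keep picking the candidate row that covers the most
# still-uncovered pairs until every pair is covered.
candidates = list(product(*parameters.values()))
suite = []
while required:
    best = max(candidates, key=lambda row: len(pairs_covered(row) & required))
    suite.append(dict(zip(names, best)))
    required -= pairs_covered(best)

print(f"{len(suite)} pairwise tests vs {len(candidates)} exhaustive combinations")
for test in suite:
    print(test)
```

The point of the comparison in the approach above is visible in the printout: the pairwise suite exercises every two-parameter interaction with far fewer rows than the exhaustive product, which is what made generated data coverage practical to measure against requirements coverage and manual design.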
Tools Used:
SOAPUI, JMeter
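As a sketch of how generated API regression plans can be executed headlessly and gated in a pipeline, the snippet below drives JMeter in non-GUI mode and checks its results log. It assumes JMeter's default CSV sample log format; the file names are placeholders, not the client's actual artifacts.

```python
# Run a JMeter test plan headlessly and fail the build if any sample failed.
# File names are placeholders (assumptions), not the client's real artifacts.
import csv
import subprocess

PLAN = "regression_api.jmx"   # generated/maintained API test plan (assumed name)
RESULTS = "results.jtl"       # JMeter sample log, CSV by default

# Non-GUI mode: -n (no GUI), -t (test plan), -l (results file).
subprocess.run(["jmeter", "-n", "-t", PLAN, "-l", RESULTS], check=True)

# Gate on the 'success' column written for each sampled request.
with open(RESULTS, newline="") as fh:
    failures = [row for row in csv.DictReader(fh) if row.get("success") != "true"]

if failures:
    for row in failures:
        print(f"FAILED: {row.get('label')} -> "
              f"{row.get('responseCode')} {row.get('responseMessage')}")
    raise SystemExit(f"{len(failures)} API samples failed")
print("All API samples passed")
```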
Benefits:
- 100% automation coverage achieved across end-to-end regression workflows.
- 29% decrease in production bugs.
- The ratio of high/critical defects dropped from 18% to 4%.
- Automated API testing reduced TTD by 86% during the development/QA phase.
- Performance bottlenecks were identified and fixed, improving system performance by 18%.