Below are synopses of actual Beta Breakers automation engagements. In each synopsis, however, customer and/or product identities have been scrubbed in order to honor non-disclosure agreements. If you are interested in similar results for your QA needs, complete the form or call us to get started.


Case Study #1 – Android Application Acceptance Test Suite

The Android Acceptance Test Suite covers all essential regression features and provides timely feedback and reporting to the Android application developers and QA. The suite is easily run within the CI process or in QA by pulling the most recent HockeyApp build to test against. By providing constant feedback to QA and reporting regression bugs, the suite saves hours of QA testing per run and allows manual testers to focus on tighter feedback cycles for brand-new features. Test results are provided via Cucumber Reports with attached debug information such as custom exception messages, screenshots captured at the time of the error, and a snapshot of the page XML. In addition to results, the reports feature useful metrics such as the percentage of features passed, failed, skipped, and pending implementation. Utilizing Selenium/Appium, we leverage a single codebase against both the Google Android and Amazon Fire OS mobile operating systems. No third-party vendors were involved. The Beta Breakers team consisted of one full-time QE resource with occasional augmented and collaborative support from other Beta Breakers QE staff. The completed automation suite was delivered on time and was very well received.
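
To illustrate how such debug attachments can be produced, below is a minimal sketch of a Cucumber-JVM after-hook that captures a screenshot and the page XML whenever a scenario fails. The DriverFactory accessor is a hypothetical placeholder for however the suite shares its Appium session, not the actual framework code.

    import java.nio.charset.StandardCharsets;

    import io.cucumber.java.After;
    import io.cucumber.java.Scenario;
    import org.openqa.selenium.OutputType;
    import org.openqa.selenium.TakesScreenshot;
    import org.openqa.selenium.WebDriver;

    public class FailureHooks {

        // Hypothetical accessor for the shared Appium/Selenium session;
        // the real suite would use its own driver-management pattern here.
        private final WebDriver driver = DriverFactory.currentDriver();

        // Runs after every scenario; on failure, attaches a screenshot and the
        // current page XML to the Cucumber report for debugging.
        @After
        public void attachDebugInfoOnFailure(Scenario scenario) {
            if (scenario.isFailed()) {
                byte[] screenshot = ((TakesScreenshot) driver).getScreenshotAs(OutputType.BYTES);
                scenario.attach(screenshot, "image/png", "screenshot-at-failure");

                String pageXml = driver.getPageSource();
                scenario.attach(pageXml.getBytes(StandardCharsets.UTF_8), "text/xml", "page-source-at-failure");
            }
        }
    }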

Test Cases Written (75)
Implementation Time (6 weeks)
Execution Time (30 minutes)
Customer Satisfaction

Case Study #2 – API Acceptance Test Suite

The API Acceptance Test Suite covers all of the exposed endpoints in the customer’s backend API. The test cases are written from the perspective of user stories rather than merely unit testing the endpoints. In addition to the backend API, the test suite exercises other dependent APIs. To facilitate the testing requirements of the Mosaic development team, an HTTP client was designed and developed for testing REST and REST-like APIs; it currently resides in a Nexus repository and is used by other teams. The test suite is fully integrated into the API’s Continuous Integration pipeline on Jenkins. In addition to testing new builds at every deployment, the test suite runs twice a day on a timer. The test suite is configured to accept environment variables as testing arguments, allowing multiple Jenkins jobs to leverage the same test suite while testing multiple environments and features. The Jenkins jobs are configured to conditionally notify the development team of test failures based on the urgency of the affected features. Developers are notified via email and Slack notifications. Test results are stored and displayed on the Jenkins job via the Jenkins Cucumber-JVM plugin. In addition to displaying feature coverage and test results, the reports contain step-by-step user flows with detailed drop-downs. These step drop-downs convert the HTTP requests made by the test suite into curl commands that the reader can copy and paste directly into a terminal. The reports also contain the REST responses (in this case, JSON responses) returned by each API call. This approach offers a high-level look at behavior-driven reports while still allowing readers to zoom in to a lower-level view of the flows and what may have gone wrong. It also allows many hours of manual QA to be replaced, or supplemented, by automation. The look and feel of the reports is well suited to giving management a high-level view of quality engineering product velocity while remaining useful at a low level for developers and manual QA alike. Baked into the API test suite is a series of tests specifically for monitoring the status of various user-facing processes and service health checks. These test results are sent to an InfluxDB instance, where they are read and displayed on another customer product, the “Quality Monitoring Dashboard”. Again, no third-party vendors were involved. The Beta Breakers team consisted of two full-time QE resources with occasional augmented and collaborative support from other Beta Breakers QE staff. The completed automation suite was delivered on time and was very well received.
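
The curl-command feature of the reports could be implemented along the following lines. This is a minimal, hypothetical sketch rather than the actual in-house HTTP client; all names are illustrative.

    import java.util.Map;

    // Illustrative helper that renders an HTTP request made by the test suite as a
    // copy-and-paste curl command before it is attached to the Cucumber report.
    public final class CurlFormatter {

        private CurlFormatter() { }

        public static String toCurl(String method, String url,
                                    Map<String, String> headers, String body) {
            StringBuilder curl = new StringBuilder("curl -X ").append(method)
                    .append(" '").append(url).append("'");
            headers.forEach((name, value) ->
                    curl.append(" -H '").append(name).append(": ").append(value).append("'"));
            if (body != null && !body.isEmpty()) {
                // A production version would escape quotes inside the body.
                curl.append(" -d '").append(body).append("'");
            }
            return curl.toString();
        }
    }

A step definition could then attach the rendered command to the scenario (for example as a "text/plain" attachment named "request-as-curl") so it appears in the report's step drop-down.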

Test Cases Written (48)
Implementation Time (4 weeks)
Execution Time (20 minutes)
Customer Satisfaction

Case Study #3 – Web Acceptance Test Suite

The Web Acceptance Test Suite covers all user-facing web features available on the customer’s web application. In addition to covering user features, compatibility testing with various browsers is baked in; currently, the suite measures compatibility with Firefox, Chrome, Safari, Edge, and Internet Explorer. Within this compatibility testing, it also tests video playback and video player functionality. The core engine that drives the Web Acceptance Test Suite is Selenium WebDriver together with Cucumber. Cucumber allows all test plans to be written and expressed as user stories compatible with Behavior Driven Development. Given the many stages of the development process and the numerous environments, the test suite is parameterized so that the environments, browsers, and features under test can be specified by the user running it. The suite itself manifests as many Jenkins jobs that run the specific tasks required for each environment and browser. The core purpose of the test suite is complete integration with the development CI pipeline: when developers push new builds to their respective environments, that action triggers a test suite run against those specifications. The results are displayed in HTML Cucumber Reports, and the development teams are notified via email and Slack. The manual QA team also has access to a number of Jenkins jobs to supplement their regression testing, saving hours of manual QA time in the process. The team’s Selenium Grid instance allows multiple jobs to run simultaneously against different environments and browsers, which means the manual QA team can run their own regression tests without fear of interfering with the development CI process. Embedded in the test reports are screenshots of test failures as well as links to video playback of the tests in question.
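
As a rough illustration of that parameterization, the sketch below shows how a browser name and Selenium Grid URL passed in by the user (or a Jenkins job) might select the driver at run time; the property names and grid address are hypothetical.

    import java.net.MalformedURLException;
    import java.net.URL;

    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeOptions;
    import org.openqa.selenium.firefox.FirefoxOptions;
    import org.openqa.selenium.remote.RemoteWebDriver;

    // Sketch of a driver factory that lets each Jenkins job (or manual QA user)
    // choose the browser and Grid endpoint, e.g. -Dbrowser=firefox -Dgrid.url=...
    public final class GridDriverFactory {

        private GridDriverFactory() { }

        public static WebDriver create() throws MalformedURLException {
            String browser = System.getProperty("browser", "chrome");
            URL gridUrl = new URL(System.getProperty("grid.url", "http://selenium-grid:4444/wd/hub"));

            switch (browser.toLowerCase()) {
                case "firefox":
                    return new RemoteWebDriver(gridUrl, new FirefoxOptions());
                case "chrome":
                default:
                    return new RemoteWebDriver(gridUrl, new ChromeOptions());
            }
        }
    }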

Test Cases Written (112)
Implementation Time (8 weeks)
Execution Time (60 minutes)
Customer Satisfaction

Case Study #4 – iOS Acceptance Test Suite

The iOS Mobile Acceptance Test Suite, similar to the Android Acceptance Test Suite, utilizes our in-house Selenium/Appium framework and covers all essential regression features as well as all other automatable feature tests. The test suite, built as a Maven project, is integrated into the development CI pipeline. The test suite is configured to test against real devices or iOS simulators; however, due to limitations imposed by the application’s requirements, key functionality is not possible on simulators, so the test suite triggers test runs on physical devices located on campus. Test results are provided in the form of Cucumber Reports via the Jenkins Cucumber-JVM plugin. Embedded in the reports are screenshots captured at the point of failure, along with exception messages, stack traces, and other key debug information for developers. The reports themselves provide a good high-level view of the user stories that are tested. The test suite is also configured to test against environments throughout the entire development process, including development, staging, QA, and production.
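
The device-versus-simulator switch might look roughly like the sketch below (Appium java-client 8 style); the capability values, property names, and server URL are hypothetical stand-ins, not the engagement’s actual configuration.

    import java.net.MalformedURLException;
    import java.net.URL;

    import io.appium.java_client.ios.IOSDriver;
    import org.openqa.selenium.remote.DesiredCapabilities;

    // Sketch: a system property decides whether the run targets an on-campus
    // physical device (by UDID) or a local iOS simulator.
    public final class IosDriverFactory {

        private IosDriverFactory() { }

        public static IOSDriver create() throws MalformedURLException {
            boolean realDevice = Boolean.getBoolean("use.real.device");

            DesiredCapabilities caps = new DesiredCapabilities();
            caps.setCapability("platformName", "iOS");
            caps.setCapability("appium:automationName", "XCUITest");
            caps.setCapability("appium:app", System.getProperty("app.path"));

            if (realDevice) {
                // Physical device, identified by its UDID (supplied by the CI job).
                caps.setCapability("appium:udid", System.getProperty("device.udid"));
            } else {
                // Simulator, identified by device name and iOS version.
                caps.setCapability("appium:deviceName", "iPhone 13");
                caps.setCapability("appium:platformVersion", "16.4");
            }

            // Appium server address is illustrative only.
            return new IOSDriver(new URL("http://127.0.0.1:4723"), caps);
        }
    }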

Test Cases Written (60)
Implementation Time (6 weeks)
Execution Time (30 minutes)
Customer Satisfaction

Case Study #5 – Product Acceptance Test Suite

This product (herein referred to as Product A) is available as Web and iOS applications. Our team integrated into the development process, attending sprint planning meetings, daily standups, and weekly scrums. Our team developed automated test suites for both the Web and iOS applications using our in-house Selenium wrapper. The test suites cover all essential regression features as well as all other automatable feature tests. The Product A iOS Mobile Acceptance Test Suite is configured to test against real devices or iOS simulators; however, due to limitations imposed by the application’s requirements, key functionality is not possible on simulators, so the test suite triggers test runs on physical devices located on campus. Both the Web and iOS test suites are integrated into the development CI/CD pipeline. Test results are provided in the form of Cucumber Reports via the Jenkins Cucumber-JVM plugin. Embedded in the reports are screenshots captured at the point of failure, along with exception messages, stack traces, and other key debug information for developers. The reports themselves provide a good high-level view of the user stories that are tested. The test suites are also configured to test against environments throughout the entire development process, including development, staging, QA, and production. In addition to the standard automation of the product, a detailed cost analysis and recommendation were delivered regarding the product’s Google App Engine configuration.
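
A single entry point behind such an in-house wrapper might look roughly like the sketch below, where the same step definitions request a driver and receive either a desktop browser session or an iOS session; every name here is a hypothetical stand-in.

    import java.net.MalformedURLException;
    import java.net.URL;

    import io.appium.java_client.ios.IOSDriver;
    import org.openqa.selenium.WebDriver;
    import org.openqa.selenium.chrome.ChromeDriver;
    import org.openqa.selenium.remote.DesiredCapabilities;

    // Sketch of one platform switch serving both Product A suites.
    public final class ProductADrivers {

        private ProductADrivers() { }

        public static WebDriver forPlatform(String platform) throws MalformedURLException {
            if ("ios".equalsIgnoreCase(platform)) {
                DesiredCapabilities caps = new DesiredCapabilities();
                caps.setCapability("platformName", "iOS");
                caps.setCapability("appium:automationName", "XCUITest");
                caps.setCapability("appium:app", System.getProperty("app.path"));
                return new IOSDriver(new URL("http://127.0.0.1:4723"), caps);
            }
            // Web flavor; browser and Grid selection elided for brevity.
            return new ChromeDriver();
        }
    }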

Test Cases Written (160)
Implementation Time (9 weeks)
Execution Time (90 minutes)
Customer Satisfaction

Case Study #6 – Product A Performance/Cost Test Suite

In order to properly cost-test Product A on the Google App Engine, a parameterized performance test suite had to be built first to simulate realistic user behavior at the expected traffic levels. The performance test suite leverages JMeter, using the Maven Blazemeter plugin. The JMeter test plan was written to include the developers’ unit test suite, written in JUnit, in order to simulate user interaction with the site. The JMeter test plan also included hand-written HTTP requests, expressed as user stories, against features not covered by the developers’ unit test suite. The test suite allows thread count to be passed as an argument by the test runner. A custom sandbox GAE instance was created for the front end and back end of the application. The front-end and back-end instances were then tested using the performance test suite, and GAE metrics were measured as the tests were run multiple times against various configurations. For the backend instance, the Cron jobs also had to be taken into consideration: the Cron buckets were filled with user activity and then emptied in one swoop to allow isolated analysis of the GAE configuration’s behavior. Once each test had been run against each instance type, the configurations were narrowed down further by fine-tuning other test parameters. Eventually, an expected cost for each configuration was attained, and a recommendation was made based on the best user experience at the lowest cost.
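
One common way to fold an existing JUnit suite into a JMeter plan is the JUnit Request sampler, which runs a JUnit test class once per simulated user. The class below is a minimal, hypothetical example of the kind of test such a sampler might invoke; the endpoint and property names are illustrative only.

    import static org.junit.Assert.assertEquals;

    import java.net.HttpURLConnection;
    import java.net.URL;

    import org.junit.Test;

    // A minimal JUnit 4 test of the kind a JMeter JUnit Request sampler can run
    // under load: each JMeter thread instantiates the class and calls the test
    // method, so the request below is issued once per simulated user.
    public class HomePageSmokeTest {

        // Target URL is illustrative; a real plan would point at the sandbox GAE instance.
        private static final String BASE_URL =
                System.getProperty("target.base.url", "https://staging.example.com");

        @Test
        public void homePageRespondsWithOk() throws Exception {
            HttpURLConnection conn =
                    (HttpURLConnection) new URL(BASE_URL + "/").openConnection();
            conn.setRequestMethod("GET");
            conn.setConnectTimeout(5_000);
            conn.setReadTimeout(10_000);
            try {
                assertEquals("Home page should return HTTP 200", 200, conn.getResponseCode());
            } finally {
                conn.disconnect();
            }
        }
    }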

Test Cases Written (12 endpoints)
Implementation Time (2.5 weeks)
Customer Satisfaction

Case Study #7 – API Acceptance Test Suite

A full regression test suite was developed for Product A, a collection of REST API endpoints for financial services. The test cases were developed by the automation engineer and finalized collaboratively with the client. A list of bugs and flaws in the behavior of the APIs was delivered. The client did not utilize a CI/CD process and wished to run the test suite with a manual trigger, so a solution was provided to run the suite from the Maven command line, with an integrated Cucumber HTML report providing a convenient log of the results for each test run. Technical assistance was provided to configure the Maven project to run securely on a VM. A JMeter load-testing script was also developed and provided to the client, along with a demonstration of how to use the script on their own and recommendations on load limits based on a round of in-house testing.
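
The manual-trigger setup described above typically amounts to a standard Cucumber-JVM runner that "mvn test" can invoke from the command line; the sketch below uses illustrative paths and package names rather than the client’s actual project layout.

    import io.cucumber.junit.Cucumber;
    import io.cucumber.junit.CucumberOptions;
    import org.junit.runner.RunWith;

    // Sketch of a JUnit runner class: running "mvn test" executes every feature
    // file and leaves an HTML report behind as the log for that run.
    @RunWith(Cucumber.class)
    @CucumberOptions(
            features = "src/test/resources/features",
            glue = "com.example.api.steps",
            plugin = {"pretty", "html:target/cucumber-report.html"}
    )
    public class RunApiRegressionTest {
    }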

Test Cases Written (26)
Implementation Time (3 weeks)
Execution Time (15 minutes)
Customer Satisfaction

Case Study #8 – New Product Integration Into A Software Ecosystem

Our customer’s project was to replace a crucial but outdated in-house software application with a customized third-party SaaS solution. As a whole, the project included a web front-end SaaS, legacy desktop software, REST APIs, Amazon Web Services, and PostgreSQL. Test cases were developed from ‘User Stories’ which documented the end-to-end flows of typical product usage. Since a User Story could branch into different types of usage or into error conditions, multiple test cases were derived from a single User Story. Generally, a test case would begin by creating assets via front-end Selenium automation through the third-party SaaS. Next, automated back-end validation of the downstream components was performed, testing the handling, storage, and distribution of the information in the assets. Integration tests were also created to test individual components using mocked data. Our customer used an in-house Selenium solution for front-end automation, SmartBear ReadyAPI, and NodeJS with strict linting rules for AWS interactions. Test cases were tied together using Atlassian Bamboo, and results were pushed to PractiTest. Our automation engineer worked closely with the customer’s QA resources and integrated into their Agile workflow, participating in daily scrums, sprint planning, and retrospectives. Our engineer was also responsible for providing live or recorded demonstrations of completed tasks, as well as extensive documentation of the testing behavior.
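
The end-to-end shape of such a test case, creating an asset through the SaaS front end and then verifying it downstream, is sketched below in Java for illustration only; the engagement itself used the customer’s in-house Selenium solution, SmartBear ReadyAPI, and NodeJS, and every URL, locator, and class name here is a hypothetical stand-in.

    import static org.junit.Assert.assertEquals;

    import java.net.HttpURLConnection;
    import java.net.URL;

    import org.junit.Test;
    import org.openqa.selenium.By;
    import org.openqa.selenium.WebDriver;

    public class AssetPropagationTest {

        // Hypothetical accessor for the shared browser session.
        private final WebDriver driver = BrowserSession.current();

        @Test
        public void assetCreatedInSaasReachesDownstreamApi() throws Exception {
            // Step 1: create the asset through the third-party SaaS front end.
            driver.get("https://saas.example.com/assets/new");
            driver.findElement(By.id("asset-name")).sendKeys("automation-asset-001");
            driver.findElement(By.id("save")).click();

            // Step 2: confirm the downstream REST API received and stored the asset.
            URL url = new URL("https://api.example.com/v1/assets/automation-asset-001");
            HttpURLConnection conn = (HttpURLConnection) url.openConnection();
            conn.setRequestMethod("GET");
            assertEquals("Asset should be retrievable downstream", 200, conn.getResponseCode());
        }
    }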

Test Cases Written (64)
Implementation Time (3 weeks)
Execution Time (30 minutes)
Customer Satisfaction