The How-To’s of Test Automation

When it comes to automated testing, company and team culture play just as big a role as anything else.

Written by Michael Hines
Published on Aug. 04, 2021

With so many tools on the market, automated testing can feel like a turn-key solution for teams looking to decrease their time to release.

Shockingly, it’s not that simple.

In addition to picking the right tools for the job, QA teams also need to consider which tests can actually be automated, how tests will be structured and what role humans will play in the process. As a result, the approach a team takes to automated testing is shaped as much by the products they build and their culture as by the tools they choose, as evidenced by the engineering leaders we spoke to at PHYTEC America and Xpansiv.

We asked both leaders the same questions and received two different sets of answers, revealing that when it comes to automated testing, company and team culture play just as big a role as anything else.

 

6 Test Automation Best Practices

  • Communicate
  • Collaborate
  • Choose Your Tools Wisely
  • Focus on Reliability
  • Make Sure Your (Hardware) Tests Are Portable
  • Don’t Overlook Test Documentation

 

Lanny Schwarz
QA Manager • Xpansiv

When it comes to automated testing, a lot of the focus is on picking the right tools for the job, and with good reason. While Lanny Schwarz, QA Manager at Xpansiv, agrees that picking the right tools is incredibly important, he also noted that collaboration and communication between team members, something that can't yet be automated, are essential for success in test automation.

 

Briefly describe your top three test automation best practices.

Communication: Discussing and deciding as a team which test cases to automate and why. Choosing tools wisely: Choosing the right tools is important in order to accomplish successful end-to-end testing and meet requirements. Pair testing: Collaborating within our team and with software engineers to find solutions that have a positive impact on our client’s workflows.

We try to automate as much as possible so that we can deliver quick results to our stakeholders and determine if we have met the release criteria.


What kind of tests does your team automate, and why?

We automate functional tests that can be verified in the API and against our database. We also create UI tests based on everyday use cases that serve as regression tests, which allows us to focus on the more difficult and edge cases that require a more manual type of testing. We try to automate as much as possible so that we can deliver quick results to our stakeholders and determine if we have met the release criteria.
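
To illustrate the pattern Schwarz describes, here is a minimal Python sketch of a functional test that exercises an API endpoint and then verifies the result against a database. The endpoint, table and field names are hypothetical placeholders, and Xpansiv’s own tests are written in Java with Rest Assured, as described below; this is only a sketch of the idea.

```python
# Hypothetical example: create an order via the API, then confirm it was persisted.
# Endpoint, table and column names are made up for illustration.
import sqlite3

import requests


def test_create_order_is_persisted():
    # Exercise the API (functional check).
    payload = {"symbol": "ABC", "quantity": 10}
    response = requests.post("https://api.example.com/orders", json=payload, timeout=10)
    assert response.status_code == 201
    order_id = response.json()["id"]

    # Verify the same record against the database (data check).
    with sqlite3.connect("orders.db") as conn:
        row = conn.execute(
            "SELECT symbol, quantity FROM orders WHERE id = ?", (order_id,)
        ).fetchone()
    assert row == ("ABC", 10)
```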

 

What are your team’s favorite test automation tools, and how do they match up with your existing tech stack?

We have multiple business lines that use different tech stacks, so we have multiple tools at our disposal. Our platform is written in Scala, and we use Java and the Rest Assured framework to test API calls and Selenium WebDriver to test the UI. We also have some proprietary software and recently started using Cypress to create end-to-end tests for that application, with plans to roll it out to others in the future. We also use Cucumber to create BDD test cases and test RESTful APIs within our registry.
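
For the UI side, an everyday-use-case regression check of the kind Schwarz mentions might look like the following. This is a minimal sketch using Selenium’s Python bindings rather than the team’s Java setup, and the URL and element locators are assumptions, not Xpansiv’s actual application.

```python
# Minimal Selenium sketch of an everyday-use-case UI regression test.
# The URL and element IDs are hypothetical placeholders.
from selenium import webdriver
from selenium.webdriver.common.by import By


def test_login_shows_dashboard():
    driver = webdriver.Chrome()
    try:
        driver.get("https://app.example.com/login")
        driver.find_element(By.ID, "username").send_keys("qa_user")
        driver.find_element(By.ID, "password").send_keys("secret")
        driver.find_element(By.ID, "submit").click()
        # A stable landmark on the post-login page serves as the regression check.
        assert driver.find_element(By.ID, "dashboard").is_displayed()
    finally:
        driver.quit()
```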

 

Joshua Nejedly
Sr. Test Manager • PHYTEC America

The tech industry is full of software companies, which means that many of the test automation best practices floating around the internet aren’t targeted at hardware companies like PHYTEC America. Joshua Nejedly, senior test manager at PHYTEC America, walked Built In through the unique considerations that come with automating hardware testing.

 

Briefly describe your top three test automation best practices.

First and most important is the need for reliability in test automation. Repeated testing over a range of hardware, under different environmental conditions and with different test operators may reveal issues that would not otherwise be found. It’s not always feasible to predict and find every possible issue, but it is best to attempt to cover a broad range of scenarios and look at the test setup from different points of view.

Second, portability is essential to many test setups and is generally good practice. Tests should be flexible enough to be moved and run in a new location with minimal changes, besides setting up any required local resources and infrastructure to support the test process.

Third, but still crucial, is the need for good test documentation. This ties into the second point, because test documentation is always required to move tests to a new location. However, good test documentation also allows new test operators to set up and start working on an automated test with minimal training and can be used to notify operators of common user errors to avoid.
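
On the portability point in particular, one common approach is to keep anything station-specific, such as ports, addresses and output paths, in a local configuration file, so that moving a test to a new location only means editing that file. The sketch below shows the idea in Python; the settings and file name are hypothetical, not PHYTEC’s actual setup.

```python
# Hypothetical station configuration kept in a local file (station.json), e.g.:
#   {"serial_port": "/dev/ttyUSB0", "results_dir": "/var/test-results"}
# The test code only reads these values, so relocating the test means
# editing the config file rather than the test itself.
import json
from pathlib import Path


def load_station_config(path="station.json"):
    """Load site-specific settings for this test station."""
    return json.loads(Path(path).read_text())


config = load_station_config()
print(f"Using serial port {config['serial_port']}, "
      f"saving results to {config['results_dir']}")
```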

Portability is essential to many test setups and is generally good practice. Tests should be flexible enough to be moved and run in a new location with minimal changes.


What kind of tests does your team automate, and why?

Primarily end-of-line manufacturing tests. These tests are the last and most thorough check for full hardware functionality, so it is important that they have as much coverage as possible. Tests need to leverage any and all supported software features as well as pin-by-pin connectivity tests to fully validate chip functionality and manufacturing quality.

 

What are your team’s favorite test automation tools, and how do they match up with your existing tech stack?

GÖPEL CASCON software and hardware are essential for the design of pin-by-pin connectivity testing, or boundary scan. These tests handle the bulk of ordinary continuity checks on processor pins, component population checks and memory device checks. Our old internal testing system is based on Visual Basic for Applications. This system has been used for the majority of our testing for many years, and it is reliable, portable and has extensive support for other test utilities. However, it is limited in many other respects due to the platform and language it was built on, and we are moving to phase out its use.

The Python scripting language is the primary platform for our new internal testing system, which is still in development but will eventually replace our current test systems. The new system brings improvements in test reliability, ease of use, ease of implementing new tests, version control and the quality of results data.
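
As a rough illustration of what a Python-based end-of-line harness can look like, a small runner might execute a series of checks per unit, record pass/fail results and write them out for later analysis. The step names, checks and result format below are hypothetical, not PHYTEC’s actual implementation.

```python
# Hypothetical sketch of a small end-of-line test runner: each step returns
# True/False, and results are written as JSON so they can be versioned and
# analyzed later. Step names and checks are placeholders.
import json
import time


def check_power_rails():
    return True  # placeholder: would measure supply voltages


def check_memory():
    return True  # placeholder: would run a memory read/write pattern


def run_tests(serial_number, steps):
    results = {"serial_number": serial_number, "timestamp": time.time(), "steps": {}}
    for name, step in steps.items():
        results["steps"][name] = "PASS" if step() else "FAIL"
    results["overall"] = (
        "PASS" if all(v == "PASS" for v in results["steps"].values()) else "FAIL"
    )
    return results


if __name__ == "__main__":
    report = run_tests("SN-0001", {"power_rails": check_power_rails, "memory": check_memory})
    with open(f"{report['serial_number']}.json", "w") as f:
        json.dump(report, f, indent=2)
    print(report["overall"])
```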

All responses have been edited for length and clarity.
