System QA Team in the Agile Environment
Most of the testing effort, and in some cases all of it, is strictly the responsibility of the Agile teams. They should perform all activities related to delivering high-quality solutions to their customers.

Another important aspect is the whole-team approach, which states that all team members share accountability, responsibility, and ownership for the commitments of the development cycle. However, while this is good in theory, in practice there are challenges that may prevent the team from handling every testing aspect on its own.
The whole-team approach is limited in specific scenarios, such as scaling, or when a particular testing process demands specialized knowledge and experience that do not exist in the team (e.g., performance, scale, and security testing).
Why Do We Need Independent Test Teams?
To handle these situations, there is a need to establish (or at least consider) independent test teams (yes, like in the traditional environment) that provide a second quality layer by handling some of the more challenging test activities that the regular Agile teams cannot.
A necessary clarification: independent test teams do not replace any of the regular test activities of the Agile teams. In fact, their work is based on a working build that the Agile teams deliver at a specific point (in the middle of the sprint or sometimes even at its end).
Moreover, independent test teams do not (and should never) replace the testing process conducted by the development teams in the regular development cycles; they should focus on the testing gaps that the Agile teams cannot cover.
These are some aspects that independent test teams should focus on:
Approve the overall quality of the build – Independent test teams usually focus on non-functional testing, which provides quick feedback about the overall quality of a build that the Agile teams have tested mainly on the functional side.
Validate the implementation – Independent test teams have the power to reveal whether the team has successfully implemented the requirements. This is a significant advantage, as it increases overall confidence in the product and ensures that the customer gets exactly what they asked for.
Reveal critical bugs missed by the teams – As part of their testing activities, independent test teams catch many of the bugs missed during the regular development cycles.
Conduct the testing that is too difficult to manage – This is one of the main reasons to create these test teams in the first place. Independent test activities should focus on the testing aspects that regular development teams cannot handle.
Test readiness for pre-production upgrades – In many organizations, the independent test team holds the responsibility for approving the system before it is used in the internal production environment.
Example of System QA in the Agile Environment
A few years ago, one of my customers asked me to define the characteristics of a System QA team they wanted to implement in their environment. Below are a few items from the PPT I shared with the customer, which you can modify according to your needs:
Team Structure
The team is structured as a group of 3-5 engineers based on the following roles:
Team leader – Manages the team and ensures progress and continuous improvement.
Team members – Test engineers who will perform the work (high priority given to internal employees), with the following capabilities:
Coding skills
Knowledge of <company name> cross-platforms/architectures
Fast learners
Pushers (drive things forward)
Motivated
Team Interfaces
The team interfaces (internal/external) should be as follows:
The team will work according to the Quality Standards defined by the Quality Director.
The team will report to the Quality Director directly.
The team will collaborate closely with the Product team to receive any new feature alerts, particularly those that have a system-wide impact.
The team will coordinate with Eng teams in the following scenarios:
Team education on current/new architectures, features, and technologies.
Team investigations and bug fixes.
The team will coordinate with the Tier4 team in the following scenarios:
To gain information on “Hot” issues from the field that the team should add to their testing efforts.
To gather any information about defect “Trends” that can be used to guide the team's efforts.
To set up OMS infrastructure in their labs so that potential issues can be tracked easily.
To gain feedback on their testing efforts and how they were actually reflected in the field.
Team Boundaries
It's essential that we establish clear boundaries from the beginning in order to give the team the foundation and time to grow:
The team will NOT take ownership of any tests that can (and should) be part of the ordinary efforts of our Eng teams.
The team's efforts will not and should not replace ANY of the current teams' efforts; therefore, they will NOT receive any tasks/requests from external teams to cover their testing efforts.
Only the team has control over the testing flows and how they are carried out.
Team Focus and Expectations
Like a special unit, the system team should focus on the things that really matter. To be able to do so, we must determine and clarify our expectations; here is how I see them:
The team will dedicate their efforts to “E2E” tests that are NOT covered as part of "Feature" testing.
The team will work as part of the entire release, meaning they start testing from day one of the version testing.
The team will work on “Ad-Hoc” projects with high risk.
Each team member will work in a different environment to increase the coverage of the team's efforts (topologies, operating systems, filers, products, etc.); a minimal sketch of such a coverage matrix follows this list.
The team should have their own monitoring dashboard with indicators for different activities such as testing progress, bugs found, and more.
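To make the environment-coverage point concrete, here is a minimal sketch (in Python) of how such a coverage matrix could be generated. The environment dimensions, values, and team member names are hypothetical placeholders I chose for illustration; they are not part of the original presentation.

```python
from itertools import cycle

# Hypothetical environment dimensions; replace with your real topologies,
# operating systems, filers, and products.
ENVIRONMENTS = [
    {"topology": "single-node", "os": "Windows Server 2022", "filer": "NetApp"},
    {"topology": "cluster",     "os": "RHEL 9",              "filer": "Isilon"},
    {"topology": "cluster",     "os": "Windows Server 2019", "filer": "NetApp"},
    {"topology": "single-node", "os": "Ubuntu 22.04",        "filer": "Isilon"},
]

TEAM_MEMBERS = ["Engineer A", "Engineer B", "Engineer C"]

def assign_environments(members, environments):
    """Round-robin the environments across team members so that the team's
    combined runs cover as many configurations as possible."""
    assignments = {member: [] for member in members}
    for member, env in zip(cycle(members), environments):
        assignments[member].append(env)
    return assignments

if __name__ == "__main__":
    for member, envs in assign_environments(TEAM_MEMBERS, ENVIRONMENTS).items():
        for env in envs:
            print(f"{member}: {env['topology']} / {env['os']} / {env['filer']}")
```

A matrix like this could also feed the team's monitoring dashboard, so that coverage per environment is visible alongside testing progress and bugs found.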
Testing Guidelines
To ensure that the system team works at the highest efficiency:
All test flows will be documented in MTM as the centralized test system.
Exploratory and risk-based testing will be limited to narrow timeframes (half a day to two days) with the approval of the team manager (all tests will be documented and shared in the team report).
The team will direct its efforts to adding a testing layer that covers the integration points between our teams.
The team should create, manage, and monitor their own testing projects.
The team should design a dedicated “Regression” test suite that will be updated weekly based on new features/product changes (see the sketch after this list).
The team will base their tests on the various test guidelines I created for all teams, including performance, localization, endurance, and negative testing.
The team will try to work as closely as possible with real customer environments and testing data.
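The guidelines above name MTM as the system of record for test flows. Purely as an illustration of how a dedicated, weekly-updated regression suite could be kept selectable at the code level, here is a pytest-style sketch; the test names and markers are my own hypothetical examples and are not taken from the customer presentation.

```python
import pytest

# Register the markers in pytest.ini ("regression", "performance")
# to avoid unknown-marker warnings.

@pytest.mark.regression
def test_end_to_end_backup_and_restore():
    # Placeholder standing in for a real E2E flow promoted into the
    # weekly regression suite.
    assert True

@pytest.mark.regression
@pytest.mark.performance
def test_restore_completes_within_sla():
    # Placeholder for a non-functional check kept in the regression suite.
    assert True

def test_new_feature_exploratory_flow():
    # Not yet promoted to the regression suite; runs only in feature testing.
    assert True
```

Running "pytest -m regression" would then execute only the tests promoted into the weekly regression suite, which makes the weekly update a matter of adding or removing the marker as features and product changes land.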