Are you making the most of Continuous Delivery?

In the last few years, I have witnessed many great DevOps teams (and some not-so-great ones). The great ones fulfilled their potential and, as a result, increased the velocity of the entire release pipeline. That gave me a good opportunity to analyze what makes them so great. The first thing that stood out was their ability to work together as one unit, without barriers such as poor communication or technical gaps between them.


Another thing that was clear right from the start is that they know how to take the core pillars of DevOps, such as continuous integration (CI), continuous delivery (CD), and continuous testing (CT), and put them into practice with specific infrastructure, frameworks, and tools.


When most people think about these three pillars, they usually assume that the focus of DevOps teams should be entirely on improving the pace of the build-test-deploy-operate loop.


They do not consider how to expedite and enhance the intake process. The key to obtaining the desired acceleration through CD is to ensure that quality is built into the application rather than tested in after the fact. Testing and quality assurance organizations are undoubtedly working to transform themselves so they can help the CD pipeline move faster.


Organizations attempting to achieve CD quickly discover that they are excellent at building things efficiently and correctly. They are, however, uncertain whether they are building the right things. There is a distinction between the two, and this, in my experience, is a common gap in CD projects.


Earlier this week, a large multinational company invited me to an all-day DevOps transformation workshop that gathered the business, testing, development, and operations teams. The goal of the first session in their journey was for the various teams to settle on a common language and to identify the top areas where they wanted to begin focusing their efforts. They wanted some assistance from me to make sure they were heading in the right direction.


That day, it was obvious that all the discussions converged on how to speed up development efforts, run tests, and roll out to various environments. No one, however, questioned whether they were using the proper inputs (i.e., concise, extremely clear requirements) before they began coding, testing, and deploying. That is when I stepped in to steer the discussion. How, then, can you make sure that you are building the right things fast? Regardless of whether you work with traditional or agile methodologies, you concentrate on improving and optimizing the requirements-gathering process.


The way requirements are communicated across different teams is a good example of this. Even though requirements are the cornerstone of every aspect of the software development lifecycle (SDLC), after 15 years they are still conveyed the same way across different teams: through written language. They are written in word processors, spreadsheets, or requirements management software. That is an entirely manual, inefficient, and tedious process.


A manual process, by definition, becomes a major constraint in a fully automated CD pipeline, where the ultimate objective is speed and performance. Furthermore, requirements written in free text are frequently unclear and open to interpretation. Approximately 60% (and sometimes more) of the defects discovered in software code are caused by misunderstood requirements. One of the most widely known illustrations, in my opinion, is the cartoon "Project Management: A Tree Swing Story," in which each team that participates in the SDLC has a different understanding and set of preconceptions about the requirements.


Agile software development methodologies and, more recently, CD practices all attempt to prevent this. They shift the software development model to small cycles and continuous feedback across teams to guarantee that any communication discrepancies or incorrect expectations, all of which ultimately lead to requirements ambiguity, are discovered and addressed early in the lifecycle, truly "shifting left" all quality-related practices to prevent defects rather than test for them. However, the majority of these activities are manual, slowing down the CD pipeline.


Aside from requirements velocity being overlooked in CD initiatives, many of the organizations I collaborate with still perform a significant number of tasks manually that could be automated. For instance:

  • Manual test writing: Test scenarios are still created the same way they always have been: by reading requirement specifications or user stories and deriving test scenarios and test cases from them. The procedure is manual and time-consuming, and the resulting test coverage is determined by the test writer's ability to comprehend the requirements (see the first sketch after this list).


  • The test data bottleneck: Since it takes up roughly 40–50% of a tester's time, test data is a constant bottleneck and a source of pain for most organizations. At the most fundamental level, they normally deal with it by creating a duplicate of the production data, masking it, and making it accessible to the preproduction environments (see the second sketch after this list). This process typically involves a combination of manual and automated steps and takes days to weeks. Unfortunately, in my experience working with various organizations, the data variety in production typically covers only 10–20% of the test scenarios, so even more time is spent manually preparing, creating, and manipulating data.


  • Test automation with significant manual labor: Teams with higher levels of automation continue to struggle because developing automated tests remains a largely manual process. Developers practicing test-driven development (TDD) and test automation engineers must still write the test code by hand. Depending on the software engineer or tester, automated test scripts will achieve varying degrees of test coverage and will need to be maintained over time, requiring yet more manual intervention.
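
To make the first point concrete, here is a minimal sketch of what hand-written test derivation looks like in practice. The requirement, the apply_discount function, and every value in the table are hypothetical; the point is that a human must read the requirement sentence and enumerate the cases by hand.

```python
import pytest

# Hypothetical requirement: "Orders of $100 or more receive a 10% discount;
# smaller orders receive none." Each row below is one test case a person
# derived by reading that sentence -- coverage depends on their judgment.

def apply_discount(total: float) -> float:
    """Toy implementation under test."""
    return total * 0.9 if total >= 100 else total

@pytest.mark.parametrize(
    "total, expected",
    [
        (99.99, 99.99),    # just below the threshold: no discount
        (100.00, 90.00),   # boundary value: discount applies
        (250.00, 225.00),  # well above the threshold
    ],
)
def test_discount_rule(total, expected):
    assert apply_discount(total) == pytest.approx(expected)
```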
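For the test data point, below is a minimal sketch of the duplicate-and-mask approach, assuming a hypothetical CSV export of production data with name, email, and ssn columns; real masking tools handle many formats, referential integrity, and compliance rules far beyond this.

```python
import csv
import hashlib

# Hypothetical PII columns in this toy schema.
PII_COLUMNS = {"name", "email", "ssn"}

def mask(value: str) -> str:
    """Replace a sensitive value with a stable, irreversible token.

    Hashing keeps referential integrity: the same input always maps to
    the same token, so joins between masked tables still line up.
    """
    return hashlib.sha256(value.encode()).hexdigest()[:12]

def mask_file(src: str, dst: str) -> None:
    """Copy a CSV, masking PII columns and passing everything else through."""
    with open(src, newline="") as fin, open(dst, "w", newline="") as fout:
        reader = csv.DictReader(fin)
        writer = csv.DictWriter(fout, fieldnames=reader.fieldnames)
        writer.writeheader()
        for row in reader:
            writer.writerow(
                {k: mask(v) if k in PII_COLUMNS else v for k, v in row.items()}
            )

if __name__ == "__main__":
    # Hypothetical file names for the production copy and its masked twin.
    mask_file("customers_prod.csv", "customers_test.csv")
```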


There is, then, a lot of talk about automation throughout the CD pipeline, and we are told that to maximize acceleration we must eliminate as many manual tasks as we can. Yet it is clear that there is still much work to be done.


The Foundation of Continuous Delivery

When designing a house, an engineer begins with a basic outline or master plan. The plan is then iteratively reviewed with the client. Once the client is completely satisfied, the engineer creates a baseline blueprint and distributes it to the other design and construction professionals, such as the interior designer and the manufacturing engineers. They then add their design ideas as additional layers on top of the engineer's foundational plan.


Undoubtedly, this is a relatively high-level view of the design process, but bear with me; the analogy will soon become clear. In design projects, the designers and other professionals in the process described above will most likely employ specialized software, such as computer-aided design (CAD) software. The software enables them to maintain full traceability across all design layers and link them to the fundamental blueprint. Using the CAD tool, each individual can view the entire project with all layers enabled, or turn on only their particular layer (for example, wiring or electronics) to see just what relates to them. The primary benefit is that everyone involved is operating under the same fundamental guidelines and is aware of how everything relates at the project level.


The most exciting part is when one of the layers changes. The software automatically detects the impact of the change across all layers and asks the affected users to review their specific layers and deal with the impact. There are numerous automated recommendations for each stakeholder of an impacted layer on how to rework it. Consider what would happen if these building design concepts, strategies, and tools existed in the world of software development: clearly define a requirement, then maintain full traceability through all the code, automated tests, test data, defects, and so on. There is no reason to keep imagining. I know many companies that have already moved to true CD pipeline acceleration, beginning with requirements and progressing through coding, testing, and releasing.
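
To ground the analogy, here is a minimal sketch of what such requirement-level traceability could look like, assuming a hypothetical graph that maps requirement IDs to the artifacts built on top of them; a real tool would mine these links from version control, test management, and defect trackers rather than hard-code them.

```python
from collections import defaultdict

# Hypothetical traceability graph: each requirement is the "baseline
# blueprint," and the code, tests, and test data built on it are the layers.
trace: dict[str, list[str]] = defaultdict(list)
trace["REQ-101"] = ["checkout.py", "test_checkout.py", "orders_test.csv"]
trace["REQ-102"] = ["pricing.py", "test_pricing.py"]

def impact_of_change(requirement: str) -> list[str]:
    """Return every artifact a changed requirement touches, so each
    'layer' owner can be asked to review and rework their piece."""
    return trace.get(requirement, [])

# A change to REQ-102 flags the pricing code and its tests for review.
print(impact_of_change("REQ-102"))  # ['pricing.py', 'test_pricing.py']
```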
