JPMorgan recently announced it was hiring 2,000 engineers, despite the gloom in the global economy. Is this not an odd risk for an organization to take, given the demand for (and cost of) software developers today? What’s happened?
JPMorgan’s hiring drive is no mystery. Virtually every company in the world is leveraging software, with many transforming themselves into software organizations to drive new efficiencies, widen profit margins and create new markets. Software has proven so disruptive that the current era has been dubbed the “third industrial revolution.” Consumers can now switch providers with a few clicks, and most consume services through software. A bad software experience today means rapid customer churn and competitive disadvantage.
Quality Should Be the Heartbeat of Any Organization
In spite of quality’s importance, testing is still perceived as a secondary discipline and a check-the-box exercise. This mindset reflects practices developed for the assembly lines of earlier industrial revolutions: a product gets assembled, and testers verify its quality as it leaves the production line. Testing’s goal in this context was to identify problems before they reached end-users. The same approach has been applied widely to software development and is still used by “waterfall” organizations.
Yet, software development has been revolutionized by new methodologies and practices. Software today is constructed at incredible speed. Notice I’m using the word “constructed” and not “built”: the best developers leverage libraries and APIs to assemble software from existing components, then add a few pieces of custom business logic that define the intellectual property of an application. Many organizations don’t even code software, instead customizing off-the-shelf products with extensive configuration to modify functionality.
The assembly-line approach to software development is no longer appropriate. When testing is left until the end, defects are found just before a release, or even in production. Release cycles grow longer and product quality suffers. Yet many organizations still stick religiously to this approach, with testing teams operating in distinct silos.
The Role of the Tester is Evolving
The best organizations in the world are changing how software is built. Testing is no longer a check-the-box exercise at the end of the assembly line. It is transforming into a discipline that revolves around evangelizing and driving quality throughout an organization.
Quality should be baked in as early as possible, touching every artifact produced. Ideally, this should occur from the start, as a requirement is formulated, all the way through to the software being shipped. The idea of siloed teams working on different pieces is outdated. For quality to thrive, teams must collaborate.
Modeling offers one approach to driving quality and collaboration throughout an organization. The journey starts with a requirement, typically formulated by business analysts or product owners. Having been a product owner, I found that this usually involves lots of scribbles on paper before communicating the idea to stakeholders at a whiteboard. The preliminary requirement then gets reviewed and refined.
At some point, the requirement will be converted into walls of text in a project management tool. Modeling intervenes at this stage, adding visualizations that make requirements more consumable, less ambiguous and easier to understand.
The word “modeling” may sound intimidating, but a model need be no more than a flowchart that encapsulates a requirement. The benefit of modeling is not only that it provides a digital collaboration asset; models can also generate coverage-focused tests before a line of code is written. Modeling creates requirements that are both quality-driven and quality-driving.
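To make that concrete, here is a minimal sketch in Python. The flowchart, its step names and the path-based coverage are illustrative assumptions rather than any particular tool’s implementation; the point is simply that a requirement expressed as a directed graph can yield tests mechanically:

```python
# A minimal sketch of model-based test generation, assuming a requirement
# modeled as a simple flowchart (directed graph). The funds-transfer steps
# below are hypothetical; real tools use far richer coverage algorithms.

# Flowchart for an assumed funds-transfer requirement: each key is a step,
# each value lists the possible next steps.
FLOWCHART = {
    "start": ["check_balance"],
    "check_balance": ["sufficient_funds", "insufficient_funds"],
    "sufficient_funds": ["transfer"],
    "insufficient_funds": ["reject"],
    "transfer": ["end"],
    "reject": ["end"],
}

def all_paths(graph, node="start", path=None):
    """Enumerate every path from start to end; each is one test case."""
    path = (path or []) + [node]
    if node == "end":
        yield path
        return
    for nxt in graph.get(node, []):
        yield from all_paths(graph, nxt, path)

for i, test_case in enumerate(all_paths(FLOWCHART), start=1):
    print(f"Test {i}: " + " -> ".join(test_case))
```

Running this prints one test per path through the flowchart (a successful transfer and a rejection), before any application code exists.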
The Journey of Quality-Driven Requirements
- Models Provide Data Upfront
Developers today require rich data to build against, and testing is only as good as the available test data. Yet, many organizations still feed slices of production data to developers and test cases.
This approach is problematic for several reasons. First, using production data often risks regulatory non-compliance. Second, production data contains large volumes of low-variance records, so data may not be available for unusual-but-important scenarios. Imagine an overdrawn account with several beneficiaries: What should happen when a transfer is attempted?
Modeling instead defines data explicitly in requirements, allowing engineers to understand not only the logic but also the data it depends on. From the modeled definitions, the right data can then be masked and subsetted, or generated synthetically. That data can further be delivered on-demand and at scale using database virtualization.
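As a hedged illustration of model-defined data, the sketch below generates synthetic rows from constraints attached to scenarios. The scenario names, fields and value ranges are assumptions made for the example:

```python
# A sketch of generating synthetic test data from model-defined constraints,
# rather than sampling production data. All field names and ranges here are
# illustrative assumptions.
import random

# Data definitions attached to the model: each scenario names the
# constraints its data must satisfy.
SCENARIOS = {
    "overdrawn_with_beneficiaries": {
        "balance": lambda: round(random.uniform(-500.0, -0.01), 2),
        "beneficiaries": lambda: random.randint(2, 5),
        "transfer_amount": lambda: round(random.uniform(10.0, 250.0), 2),
    },
    "healthy_account": {
        "balance": lambda: round(random.uniform(0.0, 10_000.0), 2),
        "beneficiaries": lambda: random.randint(0, 1),
        "transfer_amount": lambda: round(random.uniform(10.0, 250.0), 2),
    },
}

def generate(scenario, rows=3):
    """Generate synthetic rows satisfying a scenario's constraints."""
    spec = SCENARIOS[scenario]
    return [{field: make() for field, make in spec.items()} for _ in range(rows)]

print(generate("overdrawn_with_beneficiaries"))
```

Because the data derives from the requirement rather than from production, the overdrawn-account scenario is guaranteed to have matching, compliant data when developers and testers need it.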
- Models Provide Understanding
Once the requirement and the necessary data are available, code can be developed. With text-based requirements, misinterpretation often leads to incorrect functionality. Visual models instead align closely with the business logic, and we often see developers enriching models with additional context so that the models match the actual software implementation.
- Models Generate In-Sprint Tests
Once the code is developed, the same tests that were created earlier from the requirements can be executed against the new functionality. This reuse means the test cases are ready to run immediately, in-sprint.
Some tests might further be selected for automation. In this instance, automation can be overlaid onto the requirements model, reusing code to generate executable scripts. These scripts then join the standard regression pack as new features are added to the product.
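As a sketch of that overlay, imagine each model step mapping to a reusable snippet of automation code, so that generated paths (paired with matching data) become executable pytest cases. The step names and toy transfer logic below are assumptions:

```python
# A sketch of overlaying automation onto a requirements model: each model
# step maps to a reusable snippet, so generated paths become executable
# pytest cases. The steps and toy banking logic are assumptions.
import pytest

# Each generated test pairs a model path with matching synthetic data.
GENERATED_TESTS = [
    (["check_balance", "sufficient_funds", "transfer"],
     {"balance": 100.0, "amount": 50.0, "expected": "transferred"}),
    (["check_balance", "insufficient_funds", "reject"],
     {"balance": -25.0, "amount": 50.0, "expected": "rejected"}),
]

# Reusable automation snippets, one per model step.
def check_balance(ctx):
    ctx["ok"] = ctx["balance"] >= ctx["amount"]

def sufficient_funds(ctx):
    assert ctx["ok"], "data does not match this path"

def insufficient_funds(ctx):
    assert not ctx["ok"], "data does not match this path"

def transfer(ctx):
    ctx["result"] = "transferred"

def reject(ctx):
    ctx["result"] = "rejected"

STEPS = {fn.__name__: fn for fn in
         (check_balance, sufficient_funds, insufficient_funds, transfer, reject)}

@pytest.mark.parametrize("path, data", GENERATED_TESTS)
def test_model_path(path, data):
    ctx = dict(data)
    for step in path:
        STEPS[step](ctx)  # run the reusable snippet for this model step
    assert ctx["result"] == ctx["expected"]
```

Because automation code attaches to model steps rather than to individual scripts, a new path generated from the model becomes a new regression test without new scripting.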
A Change Request Comes In
With iterative development, change requests are common. They typically wreak havoc on quality, since it is difficult to understand the impact of a change and time-consuming to update artifacts like tests and data.
Traceability is key to overcoming this challenge. Models can connect to external DevOps tools to provide end-to-end traceability, from a requirement all the way to code and tests. The impact of a change can then be identified directly, and updating the model automatically synchronizes the test data, test cases and automation to bring everything up to date.
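A minimal sketch of that impact analysis, assuming traceability links that record which model steps each generated test traverses (the test IDs and step names are hypothetical):

```python
# A sketch of model-based change impact analysis. In practice these
# traceability links would live in the modeling and DevOps tools; the
# IDs below are hypothetical.

# Each generated test records which model steps it traverses.
TRACEABILITY = {
    "TEST-001": ["check_balance", "sufficient_funds", "transfer"],
    "TEST-002": ["check_balance", "insufficient_funds", "reject"],
    "TEST-003": ["check_balance", "insufficient_funds", "notify_owner"],
}

def impacted_tests(changed_steps):
    """Return every test that touches a changed model step."""
    changed = set(changed_steps)
    return sorted(test for test, steps in TRACEABILITY.items()
                  if changed & set(steps))

# A change request alters the rejection logic: only the affected tests
# need regenerating, not the whole regression pack.
print(impacted_tests(["insufficient_funds"]))  # ['TEST-002', 'TEST-003']
```

Only the tests that traverse a changed step need regenerating; everything else in the pack stays untouched.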
Testing for the Fourth Industrial Revolution
Software plays a vital role in virtually every organization, a fact the businesses thriving today already recognize. Baking quality in early is the only way to ensure quality software, and that requires the role of testing to change on a large scale. Models offer a way to achieve a “shift left” and a collaborative approach to quality, testing earlier and more rigorously by automating and optimizing test and data creation.