State of AI applied to Quality Engineering 2021-22
Section 2: Design

Chapter 1 by Curiosity Software

Moving to an AI-driven Digital Twin

Business ●●●○○
Technical ●●○○○


Artificial intelligence helps us express business logic as a digital twin of our critical transactions. Using a digital twin for software testing establishes a single version of the truth for test logic, provides a collaboration hub for prioritizing testing tasks, and standardizes the definition of risk across teams.

Currently, test logic (and thus business logic) is contained within scripts that are inextricably linked to specific tools. As a result, we can estimate that testing efforts are duplicated roughly sixfold across teams. This duplication of testing effort does not end with the development of test cases. Duplication has a compounding effect on downstream efforts such as test maintenance, defect triage, and remediation.

To avoid duplication of testing efforts, we require a mechanism for establishing a shared understanding of business risk and priority. We require a shared reference point for determining what should be tested across teams. If we could establish a shared framework for risk and priority, we can also determine what does not need to be tested in a given release cycle.

The next significant opportunity to accelerate the software testing lifecycle will necessitate a paradigm shift in how we currently operate. Using artificial intelligence, we can now describe business logic in terms of a digital twin of our critical transactions. The digital twin is a dynamic, graphical representation of an organization's canonical processes that enables development teams to think locally while acting globally. This provides a platform for collaboration between humans and machines, as well as a target for machines to interact with and update automatically.

This chapter discusses how a digital twin provides a single source of truth for your critical business transactions. We will also describe how the digital twin acts as a collaboration hub connecting disparate tools and allowing teams to keep pace with change.

Taking our first steps toward the Digital Twin

The digital twin is not a brand-new concept. Simulations in the automotive, energy, aerospace, and mining industries have all been conducted using this concept. In these instances, a digital twin is a software representation of a discrete device that enables engineers and testers to interact with the device's virtual representation throughout its lifecycle.

The digital twin's value is that it is a dynamic representation of reality. That is, the digital twin consumes data and reacts to changing parameters, providing insights that enable the discovery of defects, optimization of use cases, and planning for future enhancements. A digital twin's critical capabilities include:

  • The ability to generate a graphical representation of a complex object or entity;
  • The ability to receive and process dynamic inputs;
  • The ability to react to changing inputs; and
  • The ability for humans to comprehend the impact of change.
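The capabilities above can be illustrated with a minimal sketch: a digital twin modeled as a directed graph of business-process steps whose per-step risk reacts to incoming observations. All class names, flows, and scores here are illustrative assumptions, not part of any product.

```python
from dataclasses import dataclass, field

# Hypothetical sketch: a digital twin as a graph of business-flow steps.
# Risk scores accumulate as dynamic inputs (e.g. code changes) arrive.
@dataclass
class Step:
    name: str
    risk: float = 0.0
    next_steps: list = field(default_factory=list)

class DigitalTwin:
    def __init__(self):
        self.steps = {}

    def add_flow(self, *names):
        """Register a linear business flow, e.g. login -> checkout."""
        previous = None
        for name in names:
            step = self.steps.setdefault(name, Step(name))
            if previous is not None and step not in previous.next_steps:
                previous.next_steps.append(step)
            previous = step

    def observe(self, step_name, risk_delta):
        """React to a dynamic input (e.g. a commit touching this step)."""
        self.steps[step_name].risk += risk_delta

    def hotspots(self, threshold=0.5):
        """Surface steps whose accumulated risk exceeds the threshold."""
        return sorted(s.name for s in self.steps.values() if s.risk > threshold)

twin = DigitalTwin()
twin.add_flow("login", "search", "checkout", "payment")
twin.observe("payment", 0.9)  # e.g. a commit landed in payment code
twin.observe("search", 0.2)
print(twin.hotspots())        # -> ['payment']
```

Even this toy version shows the pattern: the graph is the shared representation, and observations change what it reports.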


The Digital Twin, a Business Enabler

Through a visual representation of critical flows, the digital twin enables technical and non-technical resources to comprehend risk and business priorities. Is this model-based testing? Numerous concepts similar to those promoted by model-based testing are incorporated into the concept of the digital twin. However, the digital twin has two significant distinctions. First, the digital twin is designed for human and machine collaboration. Second, the digital twin is not meant to be an exhaustive description of an application. The digital twin is a continuous representation of the organization's most critical business flows – regardless of application or dependency boundaries.

Human and machine collaboration

Human Collaboration

Despite the widespread availability of testing frameworks, software testing components are rarely reused. Why? While many test automation frameworks support reuse, the reality is that it is generally faster to generate raw automation than it is to build a test from a repository of components. Due to this lack of discipline in test creation, business logic is trapped within scripts (and is thus inextricably linked to a tool), compounding the effort required for test maintenance. Low test asset reuse is a direct result of poor test design, and it is entirely preventable.

This is where the concept of the digital twin comes into play. The digital twin is the focal point of collaboration for both the business and technical teams. For the business, it serves as a clear delineation of the current state of a software change as well as a canvas on which to express expectations about the future state. For the technical team, it is a single version of the truth that aids in the coordination of dependencies necessary for advanced automation.

Machine Interaction

The digital twin distinguishes itself by ingesting data from the development, test, and production ecosystems as inputs to optimize software testing. This illustrates the distinction between a static, no-code model and a dynamic digital twin.

Numerous data points necessary for testing optimization exist, but they are dispersed across a variety of monitoring and infrastructure management tools, including application performance monitoring (APM), version control, agile requirements management, test management, web analytics, defect management, and API management. To require a human to interpret these data points and make optimization decisions is an impossible task. This is where the digital twin's true value is revealed.
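A minimal sketch of that correlation step might look like the following, where events from different tools are aggregated onto the business flows they touch. The tool names, weights, and flow mapping are assumptions made for illustration only.

```python
from collections import defaultdict

# Illustrative weights for how strongly each kind of observation
# contributes to a flow's risk (integer points to keep arithmetic exact).
SOURCE_WEIGHTS = {"apm_alert": 5, "commit": 3, "defect": 4}

def correlate(observations):
    """Aggregate per-flow risk from heterogeneous tool events."""
    risk = defaultdict(int)
    for source, flow in observations:
        risk[flow] += SOURCE_WEIGHTS.get(source, 1)
    return dict(risk)

events = [
    ("commit", "checkout"),     # version control: a change landed
    ("apm_alert", "checkout"),  # APM: latency spike in production
    ("defect", "login"),        # defect management: open bug
]
print(correlate(events))  # -> {'checkout': 8, 'login': 4}
```

The point is not the scoring scheme but the shape of the problem: many low-signal events from many tools become one ranked view of risk that a machine can act on.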

To optimize the tests conducted at each stage of the CI/CD cycle, the digital twin can dynamically flag or assess the impact of the following:

CI/CD cycle

Change:
  • Code
  • Dependencies
  • Feature Flags
  • Tests
  • Assertions
  • Release Scope
  • User Stories

Risk:
  • Environments
  • Configurations
  • External APIs
  • Test Data
  • Schedule
  • Capacity

Execution:
  • Timeline
  • Persona
  • Devices
  • Component
  • Code
  • Dependencies

Test Quality:
  • Scope
  • Defects
  • Duplicates
  • Coverage
  • False Positives
  • False Negative Warnings

 

Continuous testing must be more intelligent. Today, we are intensely focused on the creation and maintenance of scripts. However, the real difficulty with automated testing arises from the script's dependencies and externalities. Optimizing the software testing lifecycle must take into account the dynamic environment in which applications operate and the end-to-end user experience.

 

 

Value of the Digital Twin to Quality Engineering

The digital twin enables a simple interaction pattern: Inform, Act, Automate. This sequence of interactions will enable quality engineering to evolve into a learning system with a “human in the loop.” More importantly, by centralizing critical assumptions about business risk and priority, the “Inform, Act, Automate” pattern helps to level-set the organization. A digital twin is a central point of collaboration that enables testers to keep up with change. It converts observations into action, enabling the organization to:

  • Inform the Team as Change Happens
What causes software testing to be delayed? Change, specifically late changes that were not communicated to the testing team in a timely manner. One of the key differentiators of a digital twin is its ability to monitor and correlate a diverse set of data points and to alert the team to critical changes as they occur. A digital twin analyzes data in real time and notifies the team of any changes that affect the current release cycle. This is not a control panel! This is a proactive system that provides timely information about the current scope of work.
  • Act on observations to focus on precisely what needs to be tested
While identifying and communicating change is critical, the most impactful use of a digital twin occurs when testers are prompted to act. Observed changes can in some cases automatically update the test suite, the priority of test execution, or the surrounding sub-tasks associated with software testing. Common optimizations for test execution, such as risk-based prioritization or change-based prioritization, can be triggered automatically by the CI/CD pipeline. Additional triggers to act are presented as recommendations within the digital twin interface, based on well-known software testing algorithms.
  • Automate sub-processes to support effective risk mitigation strategies
When we refer to "automation," we typically mean the task of automating test logic against a UI or API. Naturally, the scope of automated tests extends beyond the UI or API, but it is also critical to understand that the scope of what can be automated throughout the software testing lifecycle extends well beyond the test itself. Automation patterns can be used to automate the following:
    • Requirements analysis
    • Test planning
    • Test data
    • Environment provisioning
    • Test prioritization
    • Test execution
    • Test execution analysis
    • Test process optimization
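The "Act" step above, prioritizing execution from pipeline observations, can be sketched in a few lines. This is a hedged illustration of the general risk-based and change-based prioritization idea; the test data, coverage mapping, and scoring are invented for the example.

```python
# Hypothetical sketch: rank tests by (1) whether they cover a component
# changed in this release and (2) their historical failure rate.
def prioritize(tests, changed_components, failure_history):
    """Return tests sorted so the highest-risk tests run first."""
    def score(test):
        change_hit = len(set(test["covers"]) & set(changed_components))
        return (change_hit, failure_history.get(test["name"], 0.0))
    return sorted(tests, key=score, reverse=True)

tests = [
    {"name": "test_login", "covers": ["auth"]},
    {"name": "test_checkout", "covers": ["cart", "payment"]},
    {"name": "test_search", "covers": ["catalog"]},
]
ranked = prioritize(tests, changed_components=["payment"],
                    failure_history={"test_search": 0.2})
print([t["name"] for t in ranked])
# -> ['test_checkout', 'test_search', 'test_login']
```

A CI/CD pipeline could call something like this after each commit, so the most change-relevant tests execute before the time budget runs out.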

Value of the AI-driven Digital Twin to Quality Engineering

Numerous inputs can help optimize the outcome of automated testing. Not all optimization patterns require artificial intelligence. There are fifty distinct patterns supported by algorithms and event-based rules. These are extremely valuable patterns that provide proactive early warnings of changes occurring within the scope of a time-based or sprint-based change. Among the areas being trained today are the following:

  • Semantic user story assessment
  • Team contribution versus quality
  • Scope creep versus value-add
  • Story point cheating
  • Sprint innovation penalties
  • Test data solving
Examples

Example: Use any testing tool
How: Test logic is abstracted and optimized by the central platform
Value:
  • An advanced learning system optimizes ‘what’ to test
  • Teams can use any testing tool
  • Teams can switch tools as architectures evolve
  • Avoid vendor lock-in

Example: Cross-team collaboration
How: The central platform allows multiple teams to access test logic
Value:
  • Testers can generate disposable tests
  • Testers significantly reduce test maintenance
  • Teams reuse test logic
  • Testers eliminate building redundant tests

Example: Automate testing sub-tasks
How: Manual sub-tasks can be automated via the platform
Value:
  • The platform enables workflows specifically for software testing
  • Automating sub-tasks increases process effectiveness and repeatability

Example: Access critical data
How: Leverage the data in the organization’s existing infrastructure tools
Value:
  • Curate data for the purpose of software testing
  • Proactively keep testers informed of critical changes
  • Avoid delays in the release cycle

Example: Turn observations into action
How: Known software testing lifecycle patterns allow observations to be converted into action
Value:
  • Data is no longer passive – it is translated into specific actions
  • An organization’s unique patterns or tasks can be added
  • Provides the baseline for continuous process improvement


Value of the Digital Twin to Test Design

Not all releases or iterations are equal in terms of risk. The key now is to have an automated infrastructure that is aligned with business objectives and the unique risks associated with the current scope of change. When confronted with changes under time constraints, our team frequently asks:

  • Are we testing the right thing?
  • Is it necessary to test everything?
  • How can we prioritize our efforts in light of business imperatives?
  • How do we optimize what to maintain?
  • How do we limit false positives as the test suite grows?
  • Can we determine whether something does not require testing?

The value of the digital twin is that it can provide real-time answers to these questions. It enables the creation of disposable tests (and data) and the ability to focus on the most critical aspects of testing.

Critical aspects of testing

Disposable Tests

Although each organization is structurally and culturally distinct, one thing that agile teams have in common is that the practice of software testing has become siloed. Typically, the silo is limited to the team or to a single application that may be built by multiple teams. Because risk must be quantified across distributed system architectures, these silos create barriers.

The widespread availability of best-of-breed open-source and proprietary tools also contributed to the formation of these silos. Point tools have developed a reputation for being extremely adept at driving automated tests. However, test logic has become trapped in a variety of tools as scripts. Allowing teams to use a diverse set of tools comes at a price: significant redundancy, a lack of understanding of coverage across silos, and a high level of test maintenance.

The digital twin abstracts test logic away from scripts and testing tools. It enables the dynamic generation of test scripts that more accurately assess the risk of release. As mentioned previously, the digital twin consumes a large number of data points from the surrounding infrastructure. Observability into surrounding infrastructure provides us with unprecedented detail about risk, allowing us to dynamically prioritize the test paths and data required to de-risk the release. The digital twin can now generate test scripts for any tool automatically.

It takes some time to adjust to the concept of a disposable test. Historically, we have hoarded automated scripts as if they were Bitcoin. The primary benefit of abstracting logic from a tool is that it enables teams to automatically tune tests to meet the precise requirements of a particular release.

Consider the task of manually adjusting each script in response to changes in the broader environment - it would be an impossible task. Now consider automatically generating a set of test scripts that reflect the risks and priorities identified in the surrounding infrastructure – this seems reasonable. We can incorporate these nuances at any point in the process by simply discarding old tests and auto-generating new ones using the digital twin.
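The auto-generation idea can be made concrete with a small sketch: an abstract test path (steps through a business flow) is rendered into a script for whichever tool a team happens to use. The templates, selectors, and step names below are assumptions for illustration, not real product output.

```python
# Hypothetical per-tool templates; a real platform would use far richer
# step definitions, but the abstraction is the same: logic lives in the
# path, and the tool-specific script is disposable.
TEMPLATES = {
    "selenium": 'driver.find_element(By.ID, "{step}").click()',
    "playwright": 'page.click("#{step}")',
}

def generate_script(tool, path):
    """Render one abstract test path into tool-specific script lines."""
    template = TEMPLATES[tool]
    return [template.format(step=step) for step in path]

path = ["login", "add_to_cart", "checkout"]
for line in generate_script("playwright", path):
    print(line)
# page.click("#login")
# page.click("#add_to_cart")
# page.click("#checkout")
```

When the flow or its priorities change, nothing is patched by hand: the old script is discarded and a fresh one is generated from the updated model.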

Auto generate test scripts

Test What’s Important

Most organizations categorize their testing as either "testing what we can" or "testing everything." Due to a lack of data to confidently prioritize tests, the primary strategy for de-risking a release is to "test everything." Regrettably, this strategy is both costly and ineffective. These teams are burdened by bloated test suites and a high rate of false positives, which erodes their confidence in the test results. While the "test what we can" approach is more efficient from a business standpoint, these teams have a significantly higher rate of production defect incidents.

The majority of teams would prefer to "test what matters." The digital twin enables this approach by providing the infrastructure necessary to view priorities and risks from multiple perspectives. A digital twin enables a team to view priorities from the following perspectives:

  • Bottom-up, via the scope of in-sprint user stories
  • Top-down, via different risk lenses (business process, end-user persona, defect density…)
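Combining those two lenses can be sketched as a simple set union: flows touched by in-sprint stories (bottom-up) merged with flows that exceed a risk threshold (top-down). The story mapping, risk scores, and cutoff are illustrative assumptions.

```python
# Hypothetical sketch: select which business flows to test this cycle
# from both the sprint's user-story scope and a top-down risk lens.
def flows_to_test(story_to_flows, sprint_stories, flow_risk, risk_cutoff=0.7):
    """Union of story-scoped flows and high-risk flows."""
    bottom_up = {f for s in sprint_stories for f in story_to_flows.get(s, [])}
    top_down = {f for f, r in flow_risk.items() if r >= risk_cutoff}
    return sorted(bottom_up | top_down)

story_to_flows = {"US-101": ["checkout"], "US-102": ["search"]}
flow_risk = {"payment": 0.9, "search": 0.3, "login": 0.2}
print(flows_to_test(story_to_flows, ["US-101"], flow_risk))
# -> ['checkout', 'payment']
```

Anything outside that union is, for this cycle, a candidate for not being tested at all, which is precisely the "test what matters" outcome described above.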

Case Study: Security Group – Multinational Financial Services

In a multinational financial services organization, security is the main priority. Over 40 agile teams were challenged to increase the level of test automation across a complex set of customer-facing applications without compromising security or quality. With multiple teams changing customer-facing applications on an aggressive release cadence, the central security team recommended leveraging a digital twin to provide a common definition of critical business logic.

Teams making changes to critical business flows collaborated around the graphical representation of the process in the digital twin. With this approach, business leaders, technical team members, and members of the security team were able to immediately understand the impact of a change and its effect on application dependencies. With observations from the source code repository, test management system, and pre-production monitoring system connected to the digital twin, the team had real-time, bi-directional impact analysis from source code to requirements to tests reflected in a shared interface. Leveraging the digital twin, the organization realized the following benefits:

  • Confidence that critical business logic is understood across teams
  • A common understanding of the critical business logic gave the team the ability to drive contextual coverage to 100%
  • Clear understanding of the impact of change – derived automatically by observing changes in source code
  • Teams reduced the number of tests in the regression test suite by 45%
  • Teams have confidence in testing outcomes as priority is dynamically established by the digital twin
  • Teams have increased visibility of risk as risk is dynamically defined by the digital twin

 

About the author

Curiosity Software Ireland Ltd


Curiosity was the first company to launch an Open Testing Platform. Our goal is simple – don’t test when you don’t have to. With data-driven insights into the impact of change, we expose the fastest path to validate your changes, complete with the required test data. Our Open Testing Platform is test-tool agnostic: we can optimize what your team should be testing across any tool or framework.

Visit us at curiositysoftware.ie