Navigating Infinite Change

Digital Asset #2 - Agile solution delivery

It’s difficult to imagine a world where Volatility, Uncertainty, Complexity and Ambiguity (VUCA) are more commonplace than today. Business predictability appears to have disappeared overnight.


When predictions fail, it’s time for an alternative approach: to experiment, measure, pivot and repeat. This explains why so many organizations have shifted from conventional ‘waterfall’ methods to Agile approaches for their digital projects.

Solution delivery requires effective methods, strong skills and disciplined execution. Previous methods have been geared to well-defined requirements, mature technologies and predictable goals. These methods are less effective when prediction becomes impossible and goals are difficult to determine upfront. The Agile approach fills the void, and can be summarized in four key principles:

  • Individuals and interactions over processes and tools
  • Working software over comprehensive documentation
  • Customer collaboration over contract negotiation
  • Responding to change over following a plan

Implementing Agile requires, foremost, a change in enterprise culture, especially around organization management practices. Too often, the Agile approach is limited to IT departments. Many case studies have demonstrated, however, that an organization-wide transformation is required, usually ending in the fusion of the IT function with the wider business. The disruption this can cause is worth it, because the ability to deliver working solutions quickly and iteratively to problems is what will set successful organizations apart today and in the future.

Target Technologies

  • Well organized and well-equipped teams are the starting point for Agile solution delivery. DevOps product teams take full ownership of the end-to-end cycle of a product or service, bringing IT and business teams together. Many organizations combine and maintain teams around specific technologies to ensure more reliable, efficient and accelerated delivery of business value.
  • Security is an integral part of development and starts from the beginning of any project, following the 'shift left' principle (hence the term DevSecOps).
  • The same goes for project quality, which is no longer a focus only at the end of a project cycle. Quality Engineering (QE) starts when development starts and is an integral part of the entire process. AI can be used to ensure QE keeps pace with accelerated development: legacy tools rely on manual processes, data complexity is growing, and the digitization of life means there is more to test (not only 'does it work?' but 'how does it work?').
  • In future, Autonomous Quality Enablement Bots will automatically and autonomously identify application changes and select, generate, adapt and run selected test scenarios to validate new applications and features. These self-running and self-healing test bots are the consequence of applying hyper-automation to QE activities (more on this later.)
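The change-aware test selection described above can be illustrated with a minimal sketch. This is not a real QE bot: the mapping of code areas to test scenarios, and the file and test names, are invented for illustration. A production system would derive such a mapping automatically from code analysis and test history.

```python
# Hypothetical sketch: selecting regression tests based on changed files,
# a simplified form of change-aware test selection.
# The mapping, paths and test names below are illustrative assumptions.

CHANGE_TO_TESTS = {
    "checkout/": ["test_checkout_flow", "test_payment_gateway"],
    "search/": ["test_search_ranking"],
    "shared/": ["test_checkout_flow", "test_search_ranking", "test_login"],
}

def select_tests(changed_files):
    """Return the sorted set of test scenarios impacted by a change set."""
    selected = set()
    for path in changed_files:
        for prefix, tests in CHANGE_TO_TESTS.items():
            if path.startswith(prefix):
                selected.update(tests)
    return sorted(selected)

print(select_tests(["checkout/cart.py", "search/index.py"]))
# prints ['test_checkout_flow', 'test_payment_gateway', 'test_search_ranking']
```

Instead of re-running an entire regression suite on every commit, only the scenarios touched by the change are executed, which is how automated selection keeps QE in step with accelerated delivery.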


  • Resource management is key. Technical professionals are scarce yet in high demand, and must therefore be supported to be productive – doing only what they are needed for, while everything else is automated. In other words, developer velocity should be as high as possible.
  • Low code systems like Microsoft Power Platform are one way of 'saving' developer resources. Until now, low code has not become a full member of the business toolbox because it is often too complex and too difficult for Ops teams to maintain. A solution to this problem has emerged in the combination of low code with professional code, so that the former becomes easier for businesses to implement and maintain.
  • To further increase the efficiency of writing code, developers can use GitHub Copilot, an 'AI pair programmer'. This tool analyzes comments and code in real time and suggests individual lines and whole functions instantly.
  • To improve developer conditions and wellbeing further, systems like Microsoft’s Smart Workspace with Viva can be used to provide insights and support to manage work-life balance and improve staff productivity.
  • DataOps combines best practices in data management, communication and automation to reduce cycle times in delivering business value from data analytics, including for AI projects. Data-handling traditionally involves a lot of moving parts. DataOps activities and principles operationalize this process, by creating efficient, cohesive and consistent data pipelines.
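The DataOps idea of a cohesive, consistent pipeline can be sketched as a chain of small, composable stages with a quality check between them. The stage names and the validation rule below are illustrative assumptions, not any specific product's API.

```python
# Minimal DataOps-style pipeline sketch: composable stages with a
# data-quality gate, so problems surface early rather than at the end.
# Field names and the quality rule are illustrative assumptions.

def collect(raw_rows):
    """Identify & collect: keep only rows with the fields we need."""
    return [r for r in raw_rows if "customer_id" in r and "amount" in r]

def process(rows):
    """Process: normalize amounts to a numeric type."""
    return [{**r, "amount": float(r["amount"])} for r in rows]

def validate(rows):
    """Fail fast if data quality degrades, halting the pipeline."""
    if any(r["amount"] < 0 for r in rows):
        raise ValueError("negative amounts found - pipeline halted")
    return rows

def pipeline(raw_rows):
    return validate(process(collect(raw_rows)))

result = pipeline([
    {"customer_id": 1, "amount": "19.99"},
    {"note": "missing fields, dropped"},
])
print(result)  # [{'customer_id': 1, 'amount': 19.99}]
```

Because each stage is a plain function, the same pipeline can be tested, versioned and automated like any other code – which is the core of operationalizing data handling.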

[Figure: Business opportunity workflow – a cyclical process. Business opportunities or unmet goals in continuous delivery trigger: DataOps (identify & collect, process, store); problem definition (requirements, exploration, analytics framework selection); modelling (extract & transform, train, optimize); continuous integration (plan, develop & package, test); continuous delivery (staging, approval if model goals are met, release & configure); monitor & validate. If validation fails, the model is retrained and the cycle returns to DataOps.]


  • Finally, MLOps. AI and Machine Learning (ML) can be applied to increase the efficiency and/or speed of decision-making in nearly every scenario. Most AI solutions suffer from long lead times, however, with many never even making it to production. This negatively impacts businesses and undermines confidence in AI.

MLOps is designed to target the causes of this slow progress – namely, the complexity of modelling, an excess of available frameworks, computational challenges, team composition and more. It does this by standardizing and streamlining ML lifecycle management, applying – amongst other things – traditional DevOps processes within the context of ML.
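One concrete MLOps practice is an automated promotion gate: a candidate model is released to production only if it measurably beats the current baseline. The following is a hedged sketch; the metrics, thresholds and function name are invented for illustration, not part of any particular MLOps platform.

```python
# Hypothetical MLOps promotion gate: release a candidate model only if it
# improves accuracy without regressing latency. Metric names and the
# minimum-gain threshold are illustrative assumptions.

def should_promote(candidate_metrics, production_metrics, min_gain=0.01):
    """Promote only on a measurable accuracy gain with no latency regression."""
    accuracy_gain = candidate_metrics["accuracy"] - production_metrics["accuracy"]
    latency_ok = candidate_metrics["latency_ms"] <= production_metrics["latency_ms"]
    return accuracy_gain >= min_gain and latency_ok

candidate = {"accuracy": 0.93, "latency_ms": 45}
production = {"accuracy": 0.91, "latency_ms": 50}
print(should_promote(candidate, production))  # prints True
```

Encoding the release decision as code makes it repeatable and auditable, which is exactly the kind of lifecycle standardization that shortens the lead time from model to production.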

Testing & Validation

Over the last decade, the advent of Agile development methodologies has fundamentally altered how organizations conduct quality and test activities. Organizations have transitioned from siloed independent test centers and test teams to autonomous, multidisciplinary high-performance teams, where quality is ensured by all.

The basic objective of these teams is to achieve results quickly. Quality and test operations are therefore often less stringent in Agile operations, and teams are given considerable latitude to develop their test methodology and test automation technology. Many organizations have encountered disadvantages in this engineering-focused approach – not least, unintended consequences in production, insufficient testing coverage prior to go-live, inefficient test operations, difficulties with successful test automation and an over-abundance of tools and test techniques.

The causes of these difficulties are a lack of strategy, insufficient rules, a lack of Agile-compatible assessment methods and a lack of professional testing skills within teams. As a result, many organizations are returning to a more balanced Agile quality approach, with a combination of centralized quality support functions and individual quality engineering support integrated into feature teams.


We have captured the current best practices in an Agile Quality Orchestration model named ‘Quality for DevOps Teams.’ This contains four key elements:

  • The injection of Agile quality engineers into high-performance teams
  • An Agile quality enablement Center of Excellence (CoE) to provide teams with a standardized base test automation platform and agile test guidelines
  • A quality squad to ensure end-to-end business scenario and customer journey testing across multiple feature teams
  • An Agile quality architect at the business domain level to oversee quality strategy and quality assurance

Effective implementation of our model has enabled businesses to achieve a >20% improvement in test velocity, a >30% increase in test cost efficiency and less than 1% defect leakage to production.