March 20, 2023
How secure is the test data you use in your testing and development cycles? It’s an important question, because test data is integral to the software testing lifecycle. This data is governed by regulatory requirements around privacy (e.g. GDPR) and by the need to perform quality engineering activities securely as the rapid pace of software releases continues unabated.
Provisioning, whereby test data is made available in the target development and test environments, appears to be something of a standalone, project-based activity rather than an enterprise-wide strategy, if the World Quality Report (WQR) is anything to go by. Our WQR survey found that 41% of respondents had a provisioning strategy that was highly project-specific and based on project requirements. A further 31% had an enterprise test data provisioning strategy, but it had not been implemented.
While strategically there appears to be a lack of data provisioning maturity, there have been more positive moves in the use of de-sensitized data, where data is masked or synthetically generated to avoid non-compliance with regulations and security stipulations. In fact, 92% of the WQR respondents on this topic said their organizations had deployed de-sensitized test data to a cloud repository, meaning it is available on demand for testing and development purposes.
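As an illustration of what de-sensitization can look like in practice, here is a minimal Python sketch that masks and synthesizes a few customer fields before a record is shared with test environments. The field names and masking rules are hypothetical choices for the example, not taken from the WQR or any specific tool.

```python
import hashlib
import random
import string

# Minimal sketch of de-sensitizing a customer record before it is pushed
# to a shared test-data repository. Field names and masking rules are
# illustrative only.

def mask_email(email: str) -> str:
    """Replace the local part with a deterministic hash, keep the domain."""
    local, _, domain = email.partition("@")
    hashed = hashlib.sha256(local.encode()).hexdigest()[:10]
    return f"{hashed}@{domain}"

def synthesize_account_number(length: int = 8) -> str:
    """Generate a random, clearly synthetic account number."""
    return "TEST" + "".join(random.choices(string.digits, k=length))

def desensitize(record: dict) -> dict:
    """Return a copy of the record that is safe for test and development use."""
    return {
        **record,
        "name": "Test User",                    # static replacement
        "email": mask_email(record["email"]),   # format-preserving mask
        "account_number": synthesize_account_number(),
    }

if __name__ == "__main__":
    production_like = {
        "name": "Jane Smith",
        "email": "jane.smith@example.com",
        "account_number": "12345678",
    }
    print(desensitize(production_like))
```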
Currently, 78% of companies only store their test data on a private cloud for a limited time, such as the duration of a project. Again, this shows a trend towards on-demand access to the data needed for testing and development. An enterprise-wide test data management strategy with cloud as a key driver will ensure the accessibility of test data on demand, and we are increasingly seeing cloud testing as an integral part of the overall software development lifecycle.
This chapter of the WQR also considers the level of automation in the provisioning of data. Manual provisioning remains one of the top barriers to integrating test data provision into continuous integration and delivery pipelines. We learned that some 42% of test data provisioning was still undertaken manually, while 31% of respondents believed that the strategy for integrating test data into the CI/CD pipelines was often overlooked.
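To make the automation point concrete, the sketch below shows the kind of provisioning step a CI/CD pipeline could run before its test stage, loading a de-sensitized CSV extract into a local test database instead of waiting for a manual refresh. The file, table, and column names are assumptions for illustration; a real pipeline would typically pull the dataset from a governed cloud repository.

```python
import csv
import sqlite3
from pathlib import Path

# Illustrative automated provisioning step for a CI/CD pipeline.
# Paths and schema are hypothetical.

def provision_test_data(csv_path: Path, db_path: Path) -> int:
    """Load a de-sensitized CSV extract into a local test database."""
    conn = sqlite3.connect(db_path)
    conn.execute(
        "CREATE TABLE IF NOT EXISTS customers (name TEXT, email TEXT, account_number TEXT)"
    )
    conn.execute("DELETE FROM customers")  # start each pipeline run from a known state
    with csv_path.open(newline="") as f:
        rows = [(r["name"], r["email"], r["account_number"]) for r in csv.DictReader(f)]
    conn.executemany("INSERT INTO customers VALUES (?, ?, ?)", rows)
    conn.commit()
    count = conn.execute("SELECT COUNT(*) FROM customers").fetchone()[0]
    conn.close()
    return count

if __name__ == "__main__":
    # A pipeline step might invoke this script and fail fast if no rows load.
    loaded = provision_test_data(Path("desensitized_customers.csv"), Path("test.db"))
    print(f"Provisioned {loaded} test records")
```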
Skills challenges have emerged in the use of data provisioning tools. More positively, however, 41% of the survey respondents felt that effective partnership strategies with tool vendors and system integrators would resolve this, while 34% preferred to focus on internal training and upskilling in the different tools.
Of course, if you’re analysing data as input to various enterprise strategies, you need to validate its quality and accuracy. This validation has a crucial role to play in the ultimate customer experience, which is one of the main business drivers of quality engineering. Our survey found that 48% of respondents used commercial off-the-shelf validation tools, with open-source tools and custom-built scripts each used by 26% of respondents.
While around 90% of the WQR participants agreed that a robust data validation capability would improve efficiencies and have an impact on the bottom line, some 42% saw implementing data validation as a time-consuming exercise. Further, multiple databases (47%) and outdated data (45%) were cited as barriers to effective validation. There is clearly a need to address these dependencies in order to remove the barriers to effective validation implementation.
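For teams in the custom-built scripts camp, the following sketch illustrates the sort of lightweight validation that can flag malformed or outdated test data before it reaches a test environment. The rules and the freshness threshold are assumptions chosen for the example.

```python
from datetime import date, datetime, timedelta

# Minimal sketch of a custom-built validation script for de-sensitized test data.
# Rules and thresholds are illustrative.

MAX_AGE = timedelta(days=180)  # assumed freshness threshold for "outdated" data

def validate_record(record: dict, today: date) -> list[str]:
    """Return a list of human-readable issues found in one record."""
    issues = []
    if not record.get("email") or "@" not in record["email"]:
        issues.append("missing or malformed email")
    if not record.get("account_number"):
        issues.append("missing account number")
    extracted = datetime.strptime(record["extracted_on"], "%Y-%m-%d").date()
    if today - extracted > MAX_AGE:
        issues.append(f"data extracted {extracted} is older than {MAX_AGE.days} days")
    return issues

if __name__ == "__main__":
    sample = [
        {"email": "abc123@example.com", "account_number": "TEST0001", "extracted_on": "2023-02-01"},
        {"email": "", "account_number": "TEST0002", "extracted_on": "2022-06-15"},
    ]
    for i, rec in enumerate(sample):
        for issue in validate_record(rec, date(2023, 3, 20)):
            print(f"record {i}: {issue}")
```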
If you’d like to hear more about the importance of data provisioning and validation across the software testing cycle, please get in touch.
Senior Director, Capgemini Financial Services UK