Quality engineers spend a lot of time identifying the right set of objects and writing large numbers of test scripts and test cases. Our case concerns a leading bank in the region that was developing a loan origination and disbursal process. The business wanted an application that generates a score for leads on a daily basis: customer information was being enriched every day, so the flag on a lead as Pursue or Not-Pursue could change with the additional details, changing the probability of the loan being sanctioned. Changing data points and features meant changing test cases and test scripts, which was a nightmare for the team. AI-based techniques helped us generate test scripts autonomously, for the baseline as well as for newly added features. How does it work?
- Identify key topics from use cases and requirement documents
- Analyze application code using advanced deep-learning-based techniques
- Generate test cases for data pipelines and modules
- Generate test scripts based upon acceptance criteria and KPIs
- Predict foreseeable bugs and their severity, and prescribe a suitable self-heal framework
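The first two steps of the flow above can be sketched as follows. This is a minimal illustration, not the bank's actual system: the topic extraction is a naive regex heuristic, and the function names, sample requirement, and acceptance criteria are all hypothetical.

```python
import re

def extract_topics(requirement_text):
    """Naively pull candidate topics (capitalised phrases) from a requirement.
    A real pipeline would use an NLP/LLM model instead of a regex."""
    return sorted(set(re.findall(r"[A-Z][a-zA-Z]+(?: [A-Z][a-zA-Z]+)*", requirement_text)))

def generate_test_cases(topics, acceptance_criteria):
    """Template one test case per (topic, acceptance criterion) pair."""
    cases = []
    for topic in topics:
        for criterion in acceptance_criteria:
            cases.append(f"Verify that {topic} satisfies: {criterion}")
    return cases

# Hypothetical requirement for the lead-scoring application
requirement = "The Lead Scoring module must refresh scores daily from the Customer Profile store."
topics = extract_topics(requirement)
cases = generate_test_cases(topics, ["scores update within 24 hours", "missing fields are flagged"])
```

When a new feature (a new topic or criterion) is added, regenerating the suite is one function call rather than a manual rewrite, which is the property the team was after.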
GPT-3, with 175 billion parameters, was the backbone of all these experiments. The generated test cases matched the requirements and the application code quite well.
Every day, new data brings the challenge of ingestion and of integrating it with the other pieces of the puzzle to get business insights in time. To handle this challenge, we used CI/CD pipelines for data ingestion, which cover:
- Validation and Compliance
A CI/CD pipeline is a group of steps that must be performed to deliver a new version of software.
With CI/CD pipelines in place, we no longer need to wait for the final ingestion module before integrating with the other parts of the project, so the latest data and the latest code are integrated at all times.
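A validation-and-compliance gate inside such an ingestion pipeline can be sketched as below. Everything here is an assumption for illustration: the required fields, the 50% reject-rate threshold, and the function names are hypothetical, and a real pipeline would publish the clean batch downstream rather than just return it.

```python
def validate_batch(records, required_fields=("lead_id", "customer_id", "score_date")):
    """Validation-and-compliance check: split records into valid and rejected."""
    valid, rejected = [], []
    for rec in records:
        if all(rec.get(f) is not None for f in required_fields):
            valid.append(rec)
        else:
            rejected.append(rec)
    return valid, rejected

def run_ingestion_pipeline(batch, max_reject_rate=0.5):
    """Minimal ingestion stage: validate, fail the run if compliance is breached,
    otherwise hand the clean slice to the next pipeline stage."""
    valid, rejected = validate_batch(batch)
    reject_rate = len(rejected) / max(len(batch), 1)
    # In a CI/CD setup this raise would fail the pipeline run (threshold is an assumption).
    if reject_rate > max_reject_rate:
        raise ValueError(f"reject rate {reject_rate:.0%} exceeds compliance threshold")
    return valid

batch = [
    {"lead_id": 1, "customer_id": "C9", "score_date": "2023-01-05"},
    {"lead_id": 2, "customer_id": None, "score_date": "2023-01-05"},  # fails compliance
]
clean = run_ingestion_pipeline(batch)
```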
Figure: Diagram 2
Now, using AI algorithms, we already have test scripts and test cases that are scheduled to be executed as soon as new data arrives.
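The execute-on-arrival behaviour can be modelled as a small driver that runs the pre-generated suite against each landed batch. The checks, field names, and sample batches below are hypothetical; in production the trigger would come from a scheduler or file-arrival event rather than an in-memory list.

```python
def run_scheduled_tests(data_batches, test_suite):
    """Execute the pre-generated test suite each time a new data batch lands."""
    results = []
    for batch in data_batches:
        outcome = {name: check(batch) for name, check in test_suite.items()}
        results.append(outcome)
    return results

# Hypothetical data-quality checks for a daily lead-score batch
test_suite = {
    "scores_in_range": lambda b: all(0.0 <= r["score"] <= 1.0 for r in b),
    "no_null_leads": lambda b: all(r.get("lead_id") is not None for r in b),
}

batches = [
    [{"lead_id": 1, "score": 0.42}],
    [{"lead_id": 2, "score": 1.7}],  # out-of-range score should fail a check
]
results = run_scheduled_tests(batches, test_suite)
```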
Test cases and test logs were analyzed by an AI algorithm to identify the functions most likely to be defective, along with a probability score, so that proper action could be taken and self-healing techniques automated going forward. The AI-based program enabled the lead-scoring application to run around the clock, and testers could execute tests as and when required, with zero dependency on application execution. Above all, this happens in real time, which enhances accuracy and precision.
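The log analysis reduces, at its simplest, to estimating a per-function failure probability and routing the worst offenders to the self-heal framework. The sketch below shows that idea with a plain frequency estimate; the log schema, function names, and 50% threshold are assumptions, and the production system presumably used a richer model than raw failure counts.

```python
from collections import Counter

def failure_probabilities(test_logs):
    """Estimate, per function, the fraction of test runs that failed."""
    runs, fails = Counter(), Counter()
    for entry in test_logs:
        runs[entry["function"]] += 1
        if entry["status"] == "FAIL":
            fails[entry["function"]] += 1
    return {fn: fails[fn] / runs[fn] for fn in runs}

def flag_for_self_heal(probs, threshold=0.5):
    """Functions at or above the threshold are routed to the self-heal framework."""
    return sorted(fn for fn, p in probs.items() if p >= threshold)

# Hypothetical test-log entries
logs = [
    {"function": "score_lead", "status": "FAIL"},
    {"function": "score_lead", "status": "PASS"},
    {"function": "ingest_batch", "status": "PASS"},
    {"function": "score_lead", "status": "FAIL"},
]
probs = failure_probabilities(logs)
flagged = flag_for_self_heal(probs)
```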
AI algorithms were integrated into the different stages of the software lifecycle for this use case.
A self-learning and self-healing pipeline built with DevOps ensured that the models became more intelligent with every consumption and decision.
At the lead-generation stage, the large number of data sources, structured and unstructured, makes it impossible to ingest and feed everything just to check whether intelligent decision making is happening; moreover, doing so would be a lengthy, costly, and time-consuming exercise. Here, AI systems predicted the cost of quality across the software lifecycle and the priority of each feature by taking historical project data and QA costs into account.
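One simple reading of that prioritisation is to rank features by expected cost-at-risk, historical defect rate multiplied by QA cost. This is my own illustrative formulation, not the bank's actual model, and the feature names and figures below are invented.

```python
def feature_priority(features):
    """Rank features by expected QA cost-at-risk = historical defect rate x QA cost.
    (A stand-in for the AI system's priority score; the real model is not disclosed.)"""
    scored = [(f["name"], f["defect_rate"] * f["qa_cost"]) for f in features]
    return sorted(scored, key=lambda t: t[1], reverse=True)

# Hypothetical historical project data
history = [
    {"name": "lead_ingestion", "defect_rate": 0.30, "qa_cost": 100},
    {"name": "score_model", "defect_rate": 0.10, "qa_cost": 400},
    {"name": "ui_dashboard", "defect_rate": 0.05, "qa_cost": 50},
]
ranked = feature_priority(history)
```

QA effort is then spent on the top of the ranking first, instead of exhaustively ingesting and testing every source.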