A large retailer operates several native mobile Ecommerce apps and competes in a highly competitive category around the world. How fast an app responds correlates directly with the likelihood that a user completes a purchase. In such a hyper-competitive space, measuring the timing of critical Ecommerce user flows, point to point, in both its own mobile apps and those of competitors is critical, and fast, accurate data can drive improvements with each build.
A new feature was developed that applies the image recognition algorithms described above, running on cloud GPUs. The system recognizes screen elements on iOS and Android as a human would and reacts much as a human would, so it can operate any application and accurately time actions against the items available on the next screen, whether cached or requested from the server. The result is rapid, accurate data that improves builds and increases sales.
Specifically, we mirror the phone screen on a Windows computer and measure the delays introduced by the screen duplication as well as by the execution of steps. This data is then used to adjust the measured timing.
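The adjustment described above amounts to subtracting the known overheads from each raw measurement. A minimal sketch of that correction, with hypothetical function and parameter names (the source does not specify them):

```python
def adjust_timing(measured_ms, mirror_delay_ms, step_exec_delay_ms):
    """Remove screen-duplication and step-execution overhead
    (both measured separately) from a raw point-to-point timing."""
    return measured_ms - mirror_delay_ms - step_exec_delay_ms

# Example: a 1500 ms raw measurement, 40 ms of mirroring delay,
# 25 ms of step-execution overhead (illustrative numbers only).
print(adjust_timing(1500, 40, 25))
```

The overhead values themselves would come from calibrating the screen-duplication pipeline on the Windows machine before a run.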
The system records the mobile screen at 30 frames per second (one snapshot every 33.33 ms). Each frame contains encoded timing data as well as the screen replication itself. The capture is transcoded to black and white, since color information is not required for measuring performance timing.
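At a fixed 30 fps capture rate, a frame index maps directly to a timestamp, and the gap between two frames gives the measured duration with one-frame resolution. A small sketch of that arithmetic (the function names are assumptions, not from the source):

```python
FPS = 30
FRAME_INTERVAL_MS = 1000 / FPS  # ~33.33 ms per frame

def frame_timestamp_ms(frame_index):
    """Timestamp of a frame (0-based) relative to capture start."""
    return frame_index * FRAME_INTERVAL_MS

def elapsed_ms(start_frame, end_frame):
    """Elapsed time between two captured frames; resolution is
    one frame interval (~33.33 ms)."""
    return (end_frame - start_frame) * FRAME_INTERVAL_MS

# An element that appears 45 frames after a tap is attributed
# roughly 1.5 seconds of response time.
print(round(elapsed_ms(0, 45)))
```

This also makes the measurement floor explicit: no timing derived from the recording can be more precise than one 33.33 ms frame.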
A specific user flow in one app on a single device is run 1,000 times, from launching the app (which is itself measured) to nearly completing the purchase. Executing the same user flow 1,000 times automatically generates an asymmetrical bell curve of results that frequently resembles the following:
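The asymmetry of that curve is typical of response-time data: a hard lower bound with a long right tail. A hedged simulation of 1,000 runs, using a log-normal distribution as a stand-in (an assumption; the source does not name the distribution), shows the summary statistics such a run produces:

```python
import random

random.seed(42)
# Simulate 1,000 timed runs of one user flow. Response times are
# right-skewed, so a log-normal distribution is a plausible model.
runs_ms = [random.lognormvariate(7.3, 0.25) for _ in range(1000)]

runs_ms.sort()
median = runs_ms[len(runs_ms) // 2]
mean = sum(runs_ms) / len(runs_ms)
p95 = runs_ms[int(0.95 * len(runs_ms))]

# In a right-skewed curve the mean sits above the median, and the
# long tail pushes the 95th percentile well past both.
print(median < mean < p95)
```

Reporting median and p95 alongside the mean captures both the typical experience and the tail that the asymmetry of the curve represents.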