Performance Testing


Case Study – NSW Government Agency

Background:

A NSW Government agency needed to move a number of its critical application infrastructure components from GovDC (on-premises) to the cloud (AWS) to mitigate infrastructure end-of-life risks. Performance engineering and testing was required to define and baseline the workloads to be migrated from on-premises to cloud, based on agreed NFRs. It was carried out on both the on-premises non-production workloads and the to-be cloud environments, with comparative test cycles run to detect any performance anomalies. The to-be production environments were also performance tested against the agreed NFRs before they went live. Testing covered both batch and real-time workloads.

The client's main application environments included several internal shared components, such as a document/image repository, batch components (a batch server and a secured fileshare landing zone), and CA AutoSys, which added complexity, along with external agency feeds, third-party services, and payment gateways.

Problem Statements:

- The client's resources were allocated to other high-priority projects and were not available to carry out the performance testing.

- The client's existing documentation lacked the detail and level of analysis needed to define and scope the NFRs for each migration bundle.

- No performance baselines or performance test scripts existed.

Client outcomes:

- Gave the client's business confidence that its main objective would be met: cloud performance was at least on par with, and not worse than, GovDC. Overall it was substantially better, by around 30% across the different application stack layers.

- Improved confidence that, during the migration cutover to production, batch processing would catch up on critical batch feeds faster than expected, allowing more contingency and time for other implementation tasks.

- Simulated the actual database workloads in non-production, before and after migration and across the environment lanes, to measure performance ahead of production.

- Found bottlenecks while some migration components were in a hybrid state (i.e., split between GovDC and the cloud) during testing in the lower environments, which allowed the migration to be re-planned to mitigate risks.

Activities:

- TL Consultants managed the entire performance delivery from start to finish: discovery, test planning, environment configuration, test preparation and scripting, execution, defect triage, and reporting.

- The client's DBA resources were shared across multiple projects, so TL took on DB tasks and activities such as backups, restores, and capturing production workloads with database capture tools, to be replayed in the non-production on-premises environments and their respective to-be cloud environments for comparison.

- Held multiple workshops with architecture, development, DBA, and business teams to agree on the expected NFRs.

- Worked with the analytics teams to analyse production workload statistics over the previous 6–9 months to help define NFRs.

- Produced updated documentation and scripting for both real-time and batch performance interfaces to support re-usability.

- Developed performance scripts for real-time workloads in JMeter, with scripts maintained in Git following best practices. Database workload capture tools such as Oracle Real Application Testing (RAT), pgreplay, and Microsoft's Database Experimentation Assistant were used to capture agreed production database transactions and replay them into the non-production database instances.

- Grafana was used as the reporting tool.

- Critical production batch files were run via AutoSys in the non-production environments, both on-premises and in the cloud, to compare workloads against the NFRs.

- Tuning recommendations were provided for both batch and real-time workloads.
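On the batch side, AutoSys jobs are defined in JIL (Job Information Language). A minimal sketch of the kind of non-production replay job described above; the job name, script path, machine, owner, and schedule are hypothetical, not taken from the client's environment:

```
insert_job: np_feed_load_test   job_type: CMD
command: /opt/batch/bin/load_critical_feed.sh
machine: npbatch01
owner: batchsvc
start_times: "02:00"
description: "Replay a critical production feed in non-production for baseline comparison"
std_out_file: /var/log/autosys/np_feed_load_test.out
std_err_file: /var/log/autosys/np_feed_load_test.err
```

Running the same JIL-defined job against the on-premises and cloud lanes keeps the batch workload identical, so any timing difference is attributable to the environment.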
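The comparative real-time test cycles above come down to running the same transactions against two environments and comparing percentiles. A minimal Python sketch, assuming JMeter's default CSV (JTL) results format with an `elapsed` column in milliseconds; the function names are illustrative, not the client's tooling:

```python
import csv

def p95(values):
    """Return the 95th-percentile value (nearest-rank method)."""
    ordered = sorted(values)
    rank = max(0, int(round(0.95 * len(ordered))) - 1)
    return ordered[rank]

def load_elapsed(jtl_path):
    """Read the 'elapsed' (ms) column from a JMeter JTL/CSV results file."""
    with open(jtl_path, newline="") as f:
        return [int(row["elapsed"]) for row in csv.DictReader(f)]

def compare(baseline_ms, cloud_ms):
    """Percentage change of the cloud p95 versus the GovDC baseline p95.

    A negative result means the cloud run was faster.
    """
    b, c = p95(baseline_ms), p95(cloud_ms)
    return (c - b) / b * 100.0
```

A comparative cycle would then flag any transaction whose p95 regressed beyond an agreed NFR tolerance, rather than relying on averages alone.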

Measurable Benefits:

- API response times were on average 15% faster than the GovDC baseline.

- Third-party service consumption was performance tested and showed on average 20–25% better performance than on GovDC.

- Database-layer-only timings were on average 45% faster on cloud compared with the same timings baselined on GovDC.

- Combined application-server and database timings were on average 25–30% faster on cloud.
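For transparency on how figures like these are derived, the improvement percentages follow the usual (baseline - cloud) / baseline calculation against the GovDC baseline; the numbers below are illustrative, not the client's raw timings:

```python
def improvement_pct(baseline_ms: float, cloud_ms: float) -> float:
    """Percentage improvement of a cloud timing over its GovDC baseline.

    Positive means the cloud run was faster.
    """
    return (baseline_ms - cloud_ms) / baseline_ms * 100.0

# Illustrative: a 400 ms baseline transaction served in 300 ms on cloud
print(improvement_pct(400, 300))  # 25.0, i.e. "25% faster than baseline"
```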

Uplift your Quality Engineering capability today


Contact Us

TALK TO AN EXPERT