Migration Testing of ESOP Trading Platform
Case Study
Summary
Post-merger, a leading bank accelerated the process of consolidating accounts and retiring its legacy Employee Stock Option Trading Platform with iceDQ.
Companies often provide stock options to their employees as compensation or bonuses. The administration and management of Employee Stock Option Plans can be complex and difficult. To address these challenges, various banks offer ‘Stock Plan Trading Platforms’ for companies and their employees.
Some of the features of the ESOP trading platforms are:
- Employee Stock Purchase Plans (ESPP).
- Equity and ESPP financial reporting solutions.
- Manage ASC 718 compliance.
- Mutual fund and equity / ETF trading.
- Cash transactions like debit, direct deposit, etc.
In 2020, a leading global bank acquired a discount brokerage firm, each having their own Employee Stock Plan Trading platform.
Post-merger, it became clear that managing two separate Employee Stock Plan platforms was inefficient. To streamline operations, the decision was made to sunset one platform and systematically transition companies and their employees to the acquired platform.
The migration involved comprehensive application testing, with data migration being a significant focus to ensure seamless integration and accuracy across systems.
iceDQ, an automated data testing platform, facilitated data migration testing with its out-of-the-box connectivity to files, DB2, and data lakes. It ensured data accuracy at scale through automated complex checks and reconciliation across different environments, with impressive results:
- Data Testing Automation: 100%
- Test Coverage: 100%
- Headcount Reduction: from 6 to 4
- Direct Cost Savings: over $1M
The ESOP Platform Migration Project
The consolidation and retirement of systems required a major data migration initiative spanning 320 employer groups, 1.3 million individual accounts, and $12 billion in assets under management (AUM).
It required thorough testing and certification with key success criteria:
- Seamless customer experience with guaranteed business continuity.
- Proof of thorough testing and certification for compliance such as BCBS 239 and FINRA.
- Minimization of costs, time, and resource usage.
- Technical capabilities to reconcile data across two separate bank environments, comparing fixed-width files, DB2 databases, and a Hive data lake.
Project Phases: Due to the large volume of data, the migration and consolidation was executed in three sequential phases:
- Phase 1: Small employer groups and their employees.
- Phase 2: Mid-size employer groups and their employees.
- Phase 3: Individual large employer groups and their employees.
Each phase was further divided into two subphases:
- 1. Account master data migration: Moving only account information, such as account numbers, names, and other customer profile details, from the bank to the discount brokerage.
- 2. Transaction migration: Moving all transactions and investment details such as stock and cash holdings from the bank to the discount brokerage’s platform.
| P1: Small Employer Migration | P2: Mid-size Employer Migration | P3: Large Employer Migration |
|---|---|---|
| P1.1 Account master data migration | P2.1 Account master data migration | P3.1 Account master data migration |
| P1.2 Transaction migration | P2.2 Transaction migration | P3.2 Transaction migration |
Data Migration Process
The account & transaction history migration process involved the following key steps:
- A. Data Extraction: Client information and transaction history were extracted from the Bank’s DB2 database and transferred to a network storage location via fixed-width files.
- B. Data Transfer: The fixed-width files were automatically retrieved from the network storage and stored in the discount brokerage firm’s network storage.
- C. New Account Creation and Adding Transaction History: New account numbers were assigned to the client data and stored in the discount brokerage firm’s DB2 database. Subsequently, transaction history was imported and linked to the newly created accounts.
- D. Cross-Referencing: The newly assigned account numbers and transaction history were transferred back to network-attached storage (NAS) as fixed-width files.
- E. Backup Cross-References: The fixed-width files were automatically retrieved from the network storage and stored in the bank’s network storage.
- F. Account Cross-Referencing: The files containing the new account numbers were moved to the Bank’s DB2 database.
- G. Transaction Cross-Referencing: The transaction history with the new account numbers was moved to the bank’s Hive data lake.
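The fixed-width files exchanged in the steps above carry no delimiters; each field occupies a fixed byte range. A minimal Python sketch of writing and slicing such records, using an illustrative layout (the actual field names and widths are not specified in the case study):

```python
# Hypothetical fixed-width layout for an account extract (step A).
# Field widths are illustrative, not the bank's actual record spec.
LAYOUT = [("account_no", 10), ("last_name", 20), ("first_name", 15), ("balance", 12)]

def to_fixed_width(row: dict) -> str:
    """Render one record as a fixed-width line."""
    parts = []
    for name, width in LAYOUT:
        value = str(row[name])[:width]
        # Right-align the numeric field, left-align text fields.
        parts.append(value.rjust(width) if name == "balance" else value.ljust(width))
    return "".join(parts)

def parse_fixed_width(line: str) -> dict:
    """Slice a fixed-width line back into named fields."""
    row, pos = {}, 0
    for name, width in LAYOUT:
        row[name] = line[pos:pos + width].strip()
        pos += width
    return row

record = {"account_no": "1000234", "last_name": "Smith",
          "first_name": "Jane", "balance": "2500.00"}
line = to_fixed_width(record)
assert parse_fixed_width(line)["account_no"] == "1000234"  # round-trips cleanly
```

Because the layout is positional, both sides of a transfer must agree on it exactly; a single wrong width silently corrupts every field to its right, which is one reason file-level reconciliation testing matters.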
Data Migration Testing with iceDQ
Testing was required to reconcile the migrated data across the two environments, ensuring data consistency and integrity. The migration testing spanned files, DB2 databases, and the Hive data lake. Figure 4 above shows iceDQ’s data reconciliation testing scope at each stage:
- A. DB2 vs. Fixed-width file Reconciliation
- B. Fixed-width file vs. Fixed-width file Reconciliation
- C. Fixed-width file vs. DB2 Reconciliation
- D. DB2 vs. Fixed-width file Reconciliation
- E. Fixed-width file vs. Fixed-width file Reconciliation
- F. Fixed-width file vs. DB2 Reconciliation
- G. Fixed-width file vs. Hive data lake Reconciliation
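Each of these reconciliations ultimately compares the two stores on measures such as record counts and numeric control totals. A minimal Python sketch of such a check, using illustrative in-memory records rather than actual DB2 or file connectors:

```python
# Sketch of a control-total reconciliation between a source store and a
# target store (e.g. DB2 rows vs. parsed fixed-width records).
# Field names and data are illustrative.

def control_totals(rows, amount_field="balance"):
    """Compute a record count and a numeric control total for one side."""
    return {
        "count": len(rows),
        "total": round(sum(float(r[amount_field]) for r in rows), 2),
    }

source = [{"account_no": "A1", "balance": "100.00"},
          {"account_no": "A2", "balance": "250.50"}]
target = [{"account_no": "A1", "balance": "100.00"},
          {"account_no": "A2", "balance": "250.50"}]

src, tgt = control_totals(source), control_totals(target)
assert src == tgt  # counts and totals match: the transfer preserved the data
```

Count-and-total checks catch truncated or duplicated loads cheaply; record-level comparison (shown next) is still needed to certify that the right rows, not just the right aggregates, arrived.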
Figure 5 above illustrates a reconciliation rule from the iceDQ platform, along with the out-of-the-box checks that were created:
- The Bank’s DB2 database is selected as source.
- The fixed-width file is selected as the target.
- The column “id” is chosen as the join column, with the join type set to ‘diffJoin’.
- The result type check is set to “A-B”, which verifies that all records from DB2 (the source) are present in the fixed-width file (the target).
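The “A-B” result type amounts to a set difference on the join key: source records with no matching target record. A plain-Python sketch of the idea (not iceDQ’s implementation; field names are illustrative):

```python
# "A-B" check: records present in the source (A) but missing from the
# target (B), matched on the join column "id".

def a_minus_b(source_rows, target_rows, join_key="id"):
    """Return source records whose join key does not appear in the target."""
    target_keys = {row[join_key] for row in target_rows}
    return [row for row in source_rows if row[join_key] not in target_keys]

db2_rows = [{"id": 1, "name": "Jane"},
            {"id": 2, "name": "Ravi"},
            {"id": 3, "name": "Mei"}]
file_rows = [{"id": 1, "name": "Jane"},
             {"id": 3, "name": "Mei"}]

missing = a_minus_b(db2_rows, file_rows)
# id 2 never made it into the fixed-width file; an empty result would
# certify that every source record was migrated.
```

An "A-B" of zero records, paired with a "B-A" of zero, is the evidence of completeness typically attached to migration certification.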
Conclusion
iceDQ’s robust data reconciliation and testing capabilities were crucial to the project’s success. By automating full-volume reconciliation testing, iceDQ accelerated the migration timeline, reduced the risk of errors, and ensured a seamless customer experience with continuous business operations. Key capabilities included:
- Native connectivity to read and compare fixed-width files.
- The ability to connect and reconcile data across different databases and data lakes.
- The capability to generate proof of testing required for regulatory compliance.
- Scalability to handle increasing data volumes without compromising performance and stability.
- Support for low-code / no-code test automation, enabling automated test creation, execution, and integration with test management tools such as JIRA.
The project met stringent regulatory requirements, such as BCBS 239 and FINRA, and provided proof of thorough testing and certification. Additionally, the migration was completed with minimal cost, time, and resources, demonstrating iceDQ’s ability to streamline complex, resource-intensive data migrations.
About iceDQ
iceDQ empowers organizations to ensure data trust and reliability throughout the data life cycle.
Our comprehensive platform combines data testing, data monitoring, and data observability into a single solution, enabling data engineers to proactively manage data quality and eliminate data issues before they impact business decisions.
Leading companies across industries, including prominent players in banking, insurance, and healthcare, rely on iceDQ to continuously test, monitor, and observe their data-driven systems. This ensures trustworthy data that fuels informed decision-making and drives business success.
iceDQ Use Cases
- Data Testing
- ETL & Data Warehouse Testing
- Cloud Data Migration Testing
- BI Report Testing
- Big Data Lake Testing
- System Migration Testing
- Data Monitoring
- Data Observability