A data migration?! What now?

Preparing and executing a data migration presents many challenges, and in many cases these are underestimated. According to Gartner (link to report), 84 percent of migration projects fail or go well over budget.

All the more reason to be aware of the challenges you face and how best to deal with them! In this article, we discuss the four biggest challenges of a data migration.

 

The four biggest challenges

Data migration comes into play when a new ICT system is implemented or when existing systems are merged. The underlying drivers are often digital transformation, the adoption of valuable new technology (think of cloud technology, AI/ML, etc.), support for changing business processes, rationalization of the ICT system landscape, or realizing synergy benefits after a merger or acquisition.

In recent years, data migration has become an even greater challenge. Online data volumes are constantly growing, the way information systems are implemented and managed (cloud solutions, for example) generally does not make migration any easier, and the diversity of technology stacks in use keeps increasing.

The four main challenges in executing a data migration are:

1. Assessing and improving data quality

2. 100% testing with migrated data

3. Business continuity

4. Reliable reporting

Below, we describe each of these challenges and how to deal with them.

 

1. Assessing and improving data quality

The target system sets specific technical and functional requirements for the data to be migrated. Other relevant systems in the chain – business intelligence and reporting systems, for example – also set requirements that the data must meet when you migrate.

That is why assessing data quality is an important part of every data migration. The quality of the source or legacy data is assessed for completeness, accuracy, consistency, relevance and integrity against the requirements that apply in the new situation. Incorrect data leads to errors and inconsistencies, incomplete data causes gaps in the information, and unreliable data jeopardizes the integrity of the new system.

Extensive data quality assessment and improvement helps to identify and fix potential problems early. This process consists of profiling, auditing, cleansing, validation, verification and reporting of both source and target data.
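To make this concrete, here is a minimal sketch of such a quality check in Python (using pandas; the file name, columns and rules are hypothetical examples, not DX tooling):

```python
import pandas as pd

# Load a hypothetical extract of a legacy customer table.
source = pd.read_csv("legacy_customers.csv")

# Profiling: get a feel for the data before defining rules.
profile = {
    "rows": len(source),
    "null_counts": source.isna().sum().to_dict(),
    "duplicate_keys": int(source["customer_id"].duplicated().sum()),
}
print(profile)

# Validation: check the source data against example requirements that
# the target system might impose (the real rules come from its data model).
issues = pd.DataFrame({
    "missing_email": source["email"].isna(),
    "invalid_birth_date": pd.to_datetime(source["birth_date"], errors="coerce").isna(),
    "negative_balance": source["balance"] < 0,
})

# Reporting: every record violating at least one rule goes on the cleansing list.
to_clean = source[issues.any(axis=1)]
print(f"{len(to_clean)} of {len(source)} records need attention before migration")
```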

Data eXcellence (DX) has developed its own tools to perform this part of the data migration process. The tools support the entire process from profiling to reporting.

 

2. 100% testing with migrated data

The challenge of 100% testing consists of three parts, each essential for the success of the data migration and the implementation of the new system:

– full testing of transformations

– full testing of migrated data

– full testing of functionalities and the data chain of the target system with the migrated data

Full testing of transformations

In every data migration, data is transformed. The migration system is set up based on transformation specifications (also called mappings).

We test the transformations by implementing two independent versions of the mappings. We test both versions iteratively and in successive steps in an ISO 27001- and SOC 2 Type II-certified environment, using fully automated regression tests, and compare the results.
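As an illustration of the principle (a simplified sketch, not DX's actual tooling), two independently built implementations of the same mapping can be compared record by record; any divergence points to an error in at least one of them:

```python
def mapping_v1(record: dict) -> dict:
    """First implementation of the transformation specification."""
    return {
        "full_name": f"{record['first_name']} {record['last_name']}".strip(),
        "balance_cents": int(round(record["balance"] * 100)),
    }

def mapping_v2(record: dict) -> dict:
    """Second implementation, built independently from the same specification."""
    parts = [record["first_name"], record["last_name"]]
    return {
        "full_name": " ".join(p for p in parts if p),
        "balance_cents": int(round(record["balance"] * 100)),
    }

def regression_test(records: list[dict]) -> list[tuple]:
    """Run both implementations over all records and collect every divergence."""
    mismatches = []
    for record in records:
        a, b = mapping_v1(record), mapping_v2(record)
        if a != b:
            mismatches.append((record, a, b))
    return mismatches

sample = [{"first_name": "Ada", "last_name": "Lovelace", "balance": 12.34}]
assert regression_test(sample) == []  # both versions must agree on every record
```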

Full testing of migrated data

Connecting target data back to source data is not easy. Checks such as record counts and control totals are important, but only a real one-to-one connection gives a 100% guarantee!

However, this test is very challenging! Think of cases where data is merged, calculations are performed, or new identifiers are created. In such cases, matching a data element in the target environment to the data in the source environment is not easy.

The connection between target data and source data is called reconciliation. Read more about 100% reconciliation here.

In addition to reconciliation, DX uses a comprehensive control framework that checks the completeness and integrity of the data throughout the migration process and generates reports.
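The core of such a reconciliation can be sketched as follows (a simplified example in Python with pandas; the file names, keys and transformation rule are hypothetical):

```python
import pandas as pd

# Hypothetical extracts: one row per account in the source and target systems.
source = pd.read_csv("source_accounts.csv")  # columns: account_no, balance
target = pd.read_csv("target_accounts.csv")  # columns: legacy_account_no, balance_cents

# Re-derive the expected target values from the source, applying the same
# transformation rules as the migration itself.
expected = pd.DataFrame({
    "legacy_account_no": source["account_no"],
    "balance_cents": (source["balance"] * 100).round().astype("int64"),
})

# One-to-one reconciliation: every source record must match exactly one
# target record, and no target record may be left unexplained.
merged = expected.merge(
    target, on="legacy_account_no", how="outer",
    suffixes=("_expected", "_actual"), indicator=True,
)

missing_in_target = merged[merged["_merge"] == "left_only"]
unexplained_in_target = merged[merged["_merge"] == "right_only"]
value_mismatches = merged[
    (merged["_merge"] == "both")
    & (merged["balance_cents_expected"] != merged["balance_cents_actual"])
]
print(len(missing_in_target), len(unexplained_in_target), len(value_mismatches))
```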

Full testing of functionalities of the target system with the migrated data

The functionality of the target system is tested with the migrated data in a separate environment. This way, processes can be tested with realistic data, without any consequences for business processes or the old system.

The availability of a test environment and automated tests (including system tests and integration tests) fortunately makes this test less challenging to perform, but certainly no less important!

 

3. Business continuity

Downtime during or after the migration process is a major concern for many companies, because it affects business operations, productivity and revenue. Limited downtime does not always have direct consequences, but it is always important to ensure that any downtime is well planned.

In many cases, a big-bang migration strategy is chosen; it is the preferred strategy in approximately 75% of projects. In a big-bang migration, all data is moved from the source system to the target system at once (usually over a weekend). Interestingly, at least half of these projects initially planned to perform a multi-segment data migration. There are good reasons why the big bang usually wins:

1. The alternative – migration in stages – means that you have to keep two (or more) systems operational, with all the disadvantages that entails.

2. Migration moments are complex, expensive and a heavy burden on the personnel involved. So: the fewer migration moments, the better.

3. A frequently used argument for performing a migration in multiple stages is that it reduces risk and builds trust. With the iterative DX approach, however, testing and verification along the way are so complete that the confidence is there to switch over in one go – which makes the advantages of the first two points all the more important.

The DX tooling is designed to process large amounts of data in a completely predictable way in a relatively short time. For example, DX makes it possible to perform a big-bang migration in a single weekend. And on Monday morning, the business starts from the new system!

DX also has a lot of experience with migrations without any downtime. This can be necessary in situations where the source and target systems are part of a 24×7 operation, for example a bank payment system. Here DX uses so-called delta migrations.

The idea behind a delta migration is simple: take all the time you need to migrate a first set of data. Then perform a catch-up where you only process the changes, the deltas, that occurred between the start of the first migration and the start of the catch-up.

Repeat this process until there is no delta left. The source system remains the leading system after the initial migration; only after all deltas have been successfully processed in the target system does the target system become the leading system. With this technique, high-performance, high-volume systems can be migrated without downtime!
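The catch-up loop can be sketched as follows (a toy example in Python; the `Source` class, its change log and the account data are hypothetical stand-ins for a real system):

```python
class Source:
    """Toy stand-in for a live source system with an append-only change log."""
    def __init__(self, rows):
        self.rows = dict(rows)         # current state: key -> value
        self.log = list(rows.items())  # change log

    def update(self, key, value):
        self.rows[key] = value
        self.log.append((key, value))

    def current_position(self):
        return len(self.log)  # e.g. a change-log sequence number

    def full_extract(self):
        return dict(self.rows)

    def changes_since(self, position):
        return self.log[position:]


def catch_up(source, target, checkpoint):
    """Apply deltas until none remain; then the target can become leading."""
    while True:
        next_checkpoint = source.current_position()
        delta = source.changes_since(checkpoint)
        if not delta:
            return  # in sync: cut over, the target becomes the leading system
        target.update(delta)          # applying deltas must be idempotent,
        checkpoint = next_checkpoint  # as consecutive windows may overlap


source = Source({"acct-1": 100, "acct-2": 250})
target = {}

# Initial migration: take all the time you need.
checkpoint = source.current_position()
target.update(source.full_extract())

# Meanwhile the source stays leading and keeps changing.
source.update("acct-1", 175)

# Catch-up: process only the deltas, repeat until none remain.
catch_up(source, target, checkpoint)
assert target == source.rows  # source and target are in sync at cut-over
```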

 

4. Reliable reporting

A new system, new processes and often new or slightly different functionality also mean new reports. Of course, the data remains the same, but the reports can still differ from previous versions. For this reason, it is very important that the new reports are verified and compared with the old reports.

Reconciling the old reports with the new ones is challenging. Any differences must be explained, resolved where necessary, and ultimately accepted by the client.

This component is often picked up as part of testing. Resolving findings can mean adjusting the source system or the selection and transformation of the data. Each adjustment of course requires complete retesting, which is why report verification is usually carried out at the same time as the test phase.
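A minimal sketch of such a report comparison, assuming both systems can export the same report to CSV (file names and layout are hypothetical):

```python
import pandas as pd

# Hypothetical exports of the same monthly report from the old and new systems.
old_report = pd.read_csv("report_old_system.csv", index_col="product")
new_report = pd.read_csv("report_new_system.csv", index_col="product")

# Align both reports on their row labels and compute the difference per figure.
diff = new_report.subtract(old_report, fill_value=0)

# Every non-zero difference is a finding: it must be explained, resolved
# where necessary, and ultimately accepted by the client.
findings = diff[(diff != 0).any(axis=1)]
print(findings)
```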

 

DX likes to take on the challenges!

The biggest challenges of a data migration are our daily work. Our approach makes data migration completely predictable. Together we ensure 100% data migration!