When a client asks me “what’s the most difficult part of an implementation project?”, what they really mean is, “what aspect is going to take the longest time and the most resources to complete?”.
Many will assume it’s the bespoke development and configuration of the new data model, the design of the hundreds of shiny new reports they want to create, or perhaps the days needed to roll out the system and train users, admins and management across departments, offices and countries.
Often, in fact, it’s none of the above.
Time and time again, for us project managers and developers, the answer is Data Migration. Perceived to be the simplest of tasks, data migration is overlooked, underrated and undervalued in many implementation projects.
Years of data, at varying levels of granularity, from disparate data sources (often unmanaged, grotty spreadsheets), across geographies and of different natures (financial and non-financial), all need to be cleansed, remapped, reconciled and transferred to the new system. After more than 15 years working across multiple industries, I’ve seen this all too often.
Often, the new system has a new chart of accounts, reporting structure and protocols into which old data needs to miraculously “fit”. More often than not, the data migration process itself flushes out data anomalies, accounting errors and erroneous entries built up over years of sweeping untidy accounting loose ends under the carpet.
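To make that “fit” concrete, here is a minimal sketch of what remapping a legacy chart of accounts can look like. The account codes, field names and structure are all invented for illustration; the useful point is that unmapped codes are surfaced rather than silently dropped, which is exactly how the anomalies described above come to light.

```python
# Hypothetical remapping of legacy account codes onto a new chart of
# accounts. All codes here are invented for illustration.
LEGACY_TO_NEW = {
    "4000": "REV-100",   # legacy sales code -> new revenue account
    "4010": "REV-110",
    "5000": "COS-200",
}

def remap_entries(entries):
    """Split journal entries into (remapped, unmapped) lists.

    entries: list of dicts with 'account' and 'amount' keys.
    Unmapped entries are returned untouched so a human can decide
    what to do with them, preserving the audit trail.
    """
    remapped, unmapped = [], []
    for entry in entries:
        new_code = LEGACY_TO_NEW.get(entry["account"])
        if new_code is None:
            unmapped.append(entry)  # anomaly: needs a decision, not a guess
        else:
            remapped.append({**entry, "account": new_code})
    return remapped, unmapped
```

In practice the mapping table itself is usually the hard-won artefact, agreed with finance and signed off before any data moves.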
The painstaking job of data migration becomes not merely a project “task”, but a Gordian knot from which no one can accurately predict the end of the piece of string. This is particularly true when it needs to be unravelled and reformed back into something we recognise, like a jumper or, more likely, last year’s published accounts!
And therein lies the final test of a successful data migration: does the information coming from our new system replicate the published financial figures that the board, auditors and the market have already seen? If it doesn’t, rinse and repeat until it does.
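That final test can be automated. Below is a simple sketch, under illustrative assumptions (per-account totals keyed by account code, amounts held as `Decimal`, and a rounding tolerance that you would agree with your auditors rather than take from here), of comparing migrated totals against the published figures.

```python
# Sketch of the final reconciliation check: migrated totals vs.
# published figures. The tolerance value is illustrative only.
from decimal import Decimal

def reconcile(migrated, published, tolerance=Decimal("0.01")):
    """Return a dict of account -> (migrated, published) for mismatches.

    An empty result means the new system replicates the published
    figures within the agreed tolerance.
    """
    differences = {}
    for account in set(migrated) | set(published):
        got = migrated.get(account, Decimal("0"))
        expected = published.get(account, Decimal("0"))
        if abs(got - expected) > tolerance:
            differences[account] = (got, expected)
    return differences
```

Running this after every migration pass turns “rinse and repeat” into a repeatable, auditable check rather than a manual eyeballing exercise.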
Here are six proven dos and don’ts to consider when planning the data migration component of a project:
- Do not underestimate the importance of data migration. It is not merely the process of transferring data from old systems to new. The process sheds light on current operating processes and reveals their nuances. Too often I have witnessed changing requirements, and a need for fundamental data model redesign, because of facts unearthed through the data migration process.
- Do not underestimate the size of the task. There is always more activity than you expect: more systems and more history; data that needs to be cleansed, rationalised and accounted for. More data means more potential for erroneous and anomalous entries that need to be reconciled. Build contingency into your project plan; data migration is where the unknowns tend to lurk. The plan needs to account for the reconciliation effort and the production of a clear audit trail.
- Do not curtail the migration activity to claw back time lost to other project activities. At the same time, do not cannibalise the migration workstream to resource other parts of the project. Project success is greatly affected by the quality and availability of data in the new system.
- Do make use of the best tools. I am a particularly big fan of Microsoft Power Query. Its built-in capability to extract and transform data adds an automation layer to the task, and it is easily modified and refined as the activity matures. Furthermore, Power Query can be used to prototype your future data model and ETL processes.
- Do involve all workstreams of the project at every step of the activity. Ensure appropriate signoffs are in place prior to moving on to the next set of data. Ensure data audit requirements are understood by all involved, so that they stay at the forefront of the data reconciliation exercise.
- Do work cohesively with the rest of the project and the programme. Make sure that you are aware of looming changes to data models, and involve the rest of the team in your discovery and data analysis. Deal with the nuances as and when they arise (proactively), rather than having the data model team respond reactively.
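To illustrate the kind of extract-and-transform step that a tool like Power Query automates, here is a small Python sketch of cleansing amounts exported from untidy spreadsheets. The formats handled (thousands separators, bracketed negatives, stray whitespace) are illustrative assumptions, not an exhaustive list; this is the sort of throwaway prototype that can later inform the production ETL.

```python
# Sketch of a cleanse-and-transform step over raw spreadsheet rows.
# The input formats handled here are illustrative assumptions.
from decimal import Decimal

def clean_amount(raw):
    """Normalise a spreadsheet amount string to a Decimal."""
    text = raw.strip().replace(",", "")       # drop thousands separators
    if text.startswith("(") and text.endswith(")"):
        return -Decimal(text[1:-1])           # accountants' bracketed negative
    return Decimal(text)

def transform(rows):
    """Cleanse raw (account, amount) pairs into typed records."""
    return [
        {"account": account.strip(), "amount": clean_amount(amount)}
        for account, amount in rows
    ]
```

The value of a prototype like this is less the code itself than the list of exceptions it flushes out: every row it fails on is a data anomaly that would otherwise have surfaced much later, during reconciliation.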