Data Quality Paranoia 

Data quality should be a key criterion in any go/no-go decision for going live.

If your functionality is in good shape but your data is not, you cannot go live. You really want to avoid a situation where you are cleaning data on a live system; it is like trying to repair your car while driving it at full speed.

Bad data in a live environment ends up generating even more bad data as it flows through live functionality.

When it comes to your data migration framework, split it into logical steps such as Load, Transform, Validate, Insert and Post-Insert. Take a backup between every step so that you can roll back to the previous working step if you encounter a problem, such as a bug in one of the scripts.
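To make the idea concrete, here is a minimal sketch of a staged migration runner that snapshots the working dataset after every step, so a failing step can be re-run from the last good backup. The stage logic, record structure and backup location are all illustrative assumptions, not a prescribed framework.

```python
"""Staged migration runner: back up the working dataset after every step
so a failing step can be re-run from the last good snapshot."""
import json
from pathlib import Path

BACKUP_DIR = Path("migration_backups")  # illustrative location
BACKUP_DIR.mkdir(exist_ok=True)

def snapshot(step_name, state):
    # Persist the state after each step; if the next step blows up,
    # restart from this file instead of from the original extract.
    (BACKUP_DIR / f"{step_name}.json").write_text(json.dumps(state, indent=2))

def load():
    # Stand-in for reading the extract supplied by the business.
    return [{"id": 1, "name": " Alice ", "email": "alice@example.com"},
            {"id": 2, "name": "Bob", "email": ""}]

def transform(records):
    # Normalise fields into the shape the target system expects.
    return [{**r, "name": r["name"].strip()} for r in records]

def validate(records):
    # Split records into those that pass insert validation and those that fail.
    ok = [r for r in records if r["email"]]
    failed = [r for r in records if not r["email"]]
    return {"ok": ok, "failed": failed}

def run_migration():
    state = load()
    snapshot("01_load", state)

    state = transform(state)
    snapshot("02_transform", state)

    state = validate(state)
    snapshot("03_validate", state)
    # ... Insert and Post-Insert steps would follow the same pattern.
    return state

if __name__ == "__main__":
    result = run_migration()
    print(f"{len(result['ok'])} records ready to insert, "
          f"{len(result['failed'])} failed validation")
```

The point of the pattern is simply that each step reads the previous snapshot and writes a new one, so a bug discovered in, say, the Transform script never forces you back to square one.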

Scope in some effort to build in a reporting element in your migration framework. This might sound like unnecessary extra work, but the first thing every business will ask for after a migration run (even a test one) is reconciliation figures between data extracted and data inserted, along with reasons and examples of records failing insert validation. Having these reports also helps identify data quality issues, so that the extract team on the business side can turn around fixes more quickly.
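The sketch below shows the kind of reconciliation summary that tends to be asked for: extracted vs inserted counts, the most common rejection reasons, and a few example record IDs per reason. The function name, the failure format and the figures are all made up for illustration.

```python
from collections import Counter

def reconciliation_report(extracted, inserted, failures):
    """Summarise extracted vs inserted counts and the top validation failures.

    `failures` is a list of (record_id, reason) tuples collected during
    the Validate step.
    """
    reasons = Counter(reason for _, reason in failures)
    lines = [
        f"Records extracted : {len(extracted)}",
        f"Records inserted  : {len(inserted)}",
        f"Records rejected  : {len(failures)}",
        "Top rejection reasons:",
    ]
    lines += [f"  {reason}: {count}" for reason, count in reasons.most_common(5)]
    # Include a few example IDs per reason so the extract team can trace fixes.
    examples = {}
    for record_id, reason in failures:
        examples.setdefault(reason, []).append(record_id)
    lines += [f"  e.g. {reason}: ids {ids[:3]}" for reason, ids in examples.items()]
    return "\n".join(lines)

# Example usage with made-up figures:
print(reconciliation_report(
    extracted=range(1000),
    inserted=range(970),
    failures=[(101, "missing email"), (205, "missing email"), (330, "invalid date")],
))
```

Even a plain-text report like this, produced automatically at the end of every run, saves hours of manual counting and gives the business something concrete to act on between test migrations.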
