Businesses and organizations around the world are collecting and analyzing growing amounts of data, leading to better decision making, greater transparency, and improved efficiency. The downside to all that data, though, is that "bad" data gets in with the rest, potentially skewing reporting and analysis results. There are few things worse than spending months, even years, collecting data only to realize that any insights you hoped to glean from it are shot – all because of a small percentage of flawed, incorrect, or just plain "bad" data.
So what can you do about it?
The most obvious thing you can do about bad data is to avoid getting any in the first place. Not always possible, but here are a few starter suggestions:
- Many data errors stem from the manual nature of some data collection. Human error, inconsistent fields, and typos can lead to major issues come report time. The simple solution? Set up a pre-check system for any manually collected data, and keep a documented procedure for those collection types so field names, date formats, and the like stay consistent.
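As a concrete illustration, a pre-check like this can run before any manually entered record is accepted. This is a minimal sketch: the field names (`customer_id`, `order_date`, `amount`) and the date format are hypothetical stand-ins for whatever your documented procedure specifies.

```python
from datetime import datetime

# Hypothetical schema rules; substitute your own documented standards.
REQUIRED_FIELDS = {"customer_id", "order_date", "amount"}
DATE_FORMAT = "%Y-%m-%d"

def precheck(record):
    """Return a list of problems found in one manually entered record."""
    problems = []
    missing = REQUIRED_FIELDS - record.keys()
    if missing:
        problems.append(f"missing fields: {sorted(missing)}")
    if "order_date" in record:
        try:
            datetime.strptime(record["order_date"], DATE_FORMAT)
        except ValueError:
            problems.append(f"bad date format: {record['order_date']!r}")
    if "amount" in record:
        try:
            float(record["amount"])
        except (TypeError, ValueError):
            problems.append(f"non-numeric amount: {record['amount']!r}")
    return problems
```

Records that come back with an empty problem list are safe to accept; anything else goes back to the person who entered it.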
- Bad data causes a ripple effect that, if left unaddressed too long, may not be fixable. The same can happen when good data is neglected and becomes outdated. Stay up to date with your records – keep a scheduled report (daily, preferably) of system data and address issues as they arise. Data quality requires ongoing attention; catching and correcting bad data before it causes a problem saves you expensive, time-consuming reconciliations.
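A scheduled report doesn't have to be elaborate to be useful. The sketch below assumes records are plain dicts with a `last_updated` date; the specific checks (empty fields, records untouched for a year) are illustrative, not prescriptive.

```python
from datetime import date, timedelta

def quality_report(records, today, stale_after_days=365):
    """Count records with empty fields or not updated recently."""
    report = {"total": len(records), "has_empty_field": 0, "stale": 0}
    cutoff = today - timedelta(days=stale_after_days)
    for rec in records:
        # Empty strings and Nones are the most common "silent" gaps.
        if any(v in (None, "") for v in rec.values()):
            report["has_empty_field"] += 1
        updated = rec.get("last_updated")
        if updated is not None and updated < cutoff:
            report["stale"] += 1
    return report
```

Run it from a daily cron job and alert when the `stale` or `has_empty_field` counts start climbing.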
- Data is often entered incorrectly or inconsistently on the front end. Avoid this by collecting manual data through pre-populated online forms where possible. Run front-end tests on data collection sources to make sure results are recorded the way you want, and find out how easy or hard it is to get bad information in.
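Pre-populated forms work because they replace free text with a fixed choice list. A server-side validator can enforce the same constraint; this sketch assumes a hypothetical `state` dropdown and a basic email field.

```python
# Assumed choice list, mimicking a pre-populated dropdown on the form.
ALLOWED_STATES = {"open", "pending", "closed"}

def validate_form(form):
    """Return a dict of field -> error message; empty means valid."""
    errors = {}
    state = form.get("state", "").strip().lower()
    if state not in ALLOWED_STATES:
        errors["state"] = f"must be one of {sorted(ALLOWED_STATES)}"
    email = form.get("email", "")
    # Deliberately loose check: just enough to catch obvious typos.
    if "@" not in email or email.startswith("@") or email.endswith("@"):
        errors["email"] = "does not look like an email address"
    return errors
```

Feeding this validator deliberately malformed input is exactly the kind of front-end test the tip describes: you learn how easy it is to get bad information in before your users do.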
- Restrict incoming data – if possible, keep separate places for ready-to-use data and unreviewed data. Train employees on entering, rechecking, and cleaning bad data through clear, documented processes.
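The "separate places" idea is often called a staging or quarantine area. A minimal sketch of the workflow, with illustrative names: everything lands in staging, and records only reach production after passing a review check.

```python
def ingest(record, staging):
    """All incoming records land in staging first, never in production."""
    staging.append(record)

def promote_reviewed(staging, production, is_clean):
    """Move records that pass review to production; keep the rest staged."""
    still_staged = []
    for rec in staging:
        (production if is_clean(rec) else still_staged).append(rec)
    staging[:] = still_staged  # mutate in place so callers see the change
```

The review check (`is_clean`) is whatever your documented cleaning process says it is – a pre-check function like the one above, a manual sign-off flag, or both.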
- Always be skeptical of spikes or data outliers – check out anything out of the ordinary. Get acquainted with your data's natural patterns so you know exactly when something looks out of place.
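"Out of the ordinary" can be made concrete with a simple statistical rule. One common sketch, assuming a roughly stable numeric series: flag any value more than `k` standard deviations from the mean. The threshold `k` is a tuning knob, not a universal constant.

```python
from statistics import mean, stdev

def flag_spikes(values, k=3.0):
    """Return indices of values more than k std devs from the mean."""
    if len(values) < 2:
        return []  # not enough data to estimate spread
    m, s = mean(values), stdev(values)
    if s == 0:
        return []  # a perfectly flat series has no outliers
    return [i for i, v in enumerate(values) if abs(v - m) / s > k]
```

Flagged points aren't automatically bad data – a real sales spike looks the same as a typo – which is why the tip says to check them out, not delete them.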
Be diligent! With a little effort every day, you can ensure that bad data rarely gets past you and your system, meaning accurate reporting and better-informed decisions.