“Dirty Data” adds costs and hampers decision making

Erin McCune

March 16, 2007

Over at the EyeOnBI blog, Mike Leano summarizes a recent Gartner presentation on the cost of "dirty data" – data that is inaccurate, incomplete, or duplicated. Gartner estimates that more than 25% of the data within Fortune 1000 companies is flawed, and that it will remain flawed for the foreseeable future at most companies. Gartner observes that bad data isn't just an IT problem, and suggests that companies appoint "data stewards" within business groups who are responsible for the quality of information.

You've got to have clean data if you are going to successfully compete on analytics. Speaking of competing on analytics – Tom Davenport's book by that name was recently published (expanding on his HBR article mentioned in my previous post) and arrived from Amazon this week. I'll be reading it soon and will share my thoughts in an upcoming post.

Competing on Analytics: The New Science of Winning

by Thomas H. Davenport and Jeanne G. Harris
Harvard Business School Press
March 2007
