Complexity of banking system ‘makes data quality issues hard to manage’


[Image: Mapping the compliance model to a risk matrix (via Wikipedia)]
The case for complexity analysis and management within the financial sector keeps being made. The latest comes from Experian QAS:

Financial institutions find it difficult to address their data quality issues due to the complex procedures they undertake, it has been suggested. Arnt-Erik Hansen, an International Association for Information and Data Quality (IAIDQ) member with experience in financial services, commented: “Organisational complexity is a big issue in these organisations, from systems, process and responsibility perspectives.”

Daragh O’Brien, former Director of Publicity at the IAIDQ, said this issue will need to be addressed before organisations are able to harvest a sustainable return on their investment in information quality and prevent failures in financial risk management.

“In [Arnt-Erik Hansen’s] experience, most banks have a devastating disparity in operations and culture that is preventing them from implementing or improving their information supply chain,” he explained. A survey carried out earlier this month by business analytics software and services provider SAS (originally Statistical Analysis System) revealed that dissatisfaction with information quality is a common denominator as far as enterprise risk management is concerned.

Just 39 per cent of respondents believe they are effectively collecting, storing and aggregating data within their organisation.
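That dissatisfaction is easy to appreciate once data from disparate systems has to be aggregated. A minimal sketch of the kind of cross-system check a data quality team might automate is below; the record layout, field names and figures are hypothetical, purely for illustration:

```python
# Hypothetical customer records pulled from two internal banking systems.
records_system_a = [
    {"customer_id": "C001", "balance": 1500.00},
    {"customer_id": "C002", "balance": None},     # missing value
    {"customer_id": "C003", "balance": 320.50},
]
records_system_b = [
    {"customer_id": "C001", "balance": 1500.00},
    {"customer_id": "C003", "balance": 310.00},   # disagrees with system A
]

def completeness(records, field):
    """Fraction of records where `field` is present and non-null."""
    if not records:
        return 0.0
    filled = sum(1 for r in records if r.get(field) is not None)
    return filled / len(records)

def mismatches(source_a, source_b, key, field):
    """Keys present in both sources whose `field` values disagree."""
    index_b = {r[key]: r.get(field) for r in source_b}
    return [r[key] for r in source_a
            if r[key] in index_b and r.get(field) != index_b[r[key]]]

print(completeness(records_system_a, "balance"))   # one of three balances missing
print(mismatches(records_system_a, records_system_b,
                 "customer_id", "balance"))        # customers whose balances disagree
```

Even this toy reconciliation surfaces the two failure modes the survey respondents report: incomplete capture within one system, and disagreement between systems once data is aggregated. At real-bank scale, with hundreds of systems and differing ownership, the same checks become an organisational problem rather than a technical one.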

