Complexity of banking system ‘makes data quality issues hard to manage’
Wednesday, 12 May, 2010
The case for complexity analysis and management within the financial sector keeps being made. The latest comes from Experian QAS:
Financial institutions find it difficult to address their data quality issues due to the complex procedures they undertake, it has been suggested. Arnt-Erik Hansen, an International Association for Information and Data Quality (IAIDQ) member with experience in financial services, commented: “Organisational complexity is a big issue in these organisations, from systems, process and responsibility perspectives.”
Daragh O’Brien, former Director of Publicity at the IAIDQ, said this issue will need to be addressed before organisations are able to harvest a sustainable return on their investment in information quality and prevent failures in financial risk management.
“In [Arnt-Erik Hansen’s] experience, most banks have a devastating disparity in operations and culture that is preventing them from implementing or improving their information supply chain,” he explained.

A survey carried out earlier this month by business analytics software and services provider SAS revealed that dissatisfaction with information quality is a common denominator as far as enterprise risk management is concerned.
Just 39 per cent of respondents believe they are effectively collecting, storing and aggregating data within their organisation.