Abstract

The growth of big data is presenting banks and financial services firms with increasing volumes and varieties of information. Firms that successfully mine the data stand to gain valuable insights they can use to inform their investment strategies and risk analyses. As they pursue these benefits, the need to ensure the data meets the necessary standards of quality is pushing firms to adopt new technological and strategic approaches. Dennis Smith, Pittsburgh-based managing director of BNY Mellon's advanced engineering group, says that as data volumes grow, it is becoming difficult for firms to validate the information at the level of individual records. Data quality has become a particularly sensitive issue because of the more rigorous demands regulators are placing on banks and financial institutions. As data volumes and varieties have increased, Apache Hadoop, the open-source software that enables distributed processing of large data sets, has become one of the most popular tools for managing big data.

Details

Title
The Big Data Quality Question
Author
Hamilton, Nicholas
Pages
12-13
Section
Data Management
Publication year
2012
Publication date
Sep 2012
Publisher
Incisive Media Limited
ISSN
17508517
e-ISSN
17508525
Source type
Trade Journal
Language of publication
English
ProQuest document ID
1040721210
Copyright
Copyright Incisive Media Plc Sep 2012