Data Governance in FIs: Root Cause Analysis

This series focuses on Data Governance in Financial Institutions. Our first post introduced the fundamentals of Data Governance. This discussion centers on how to find root causes of problems in organizations and recommend actions to solve them.

When analyzing deep issues and their causes, it is important to take a comprehensive and holistic approach. Root Cause Analysis (RCA) is a systematic problem-solving approach intended to identify the root causes of problems or events. It is based on the principle that problems are most effectively solved by correcting or eliminating the primary causes, rather than only addressing the symptoms. Properly done, RCA can help a Financial Institution (FI) implement an effective Data Governance program by investigating and addressing data quality issues. It can also help the FI design appropriate data governance policies and data standards.

What is a Root Cause?

A root cause is an initiating event or condition in a cause-and-effect chain. It must be subject to change; that is, there is a definable factor that can be adjusted to create a positive outcome or to prevent a negative one.

A root cause must also meet four criteria:

  1. It is an underlying event that initiates a sequence of subsequent events
  2. It is logically and economically practical to identify
  3. It can be affected by management actions
  4. It is a practical basis to formulate and recommend corrective actions

The Process of RCA

There is no prescriptive process for RCA, but there are five steps that can help guide organizations:

[Figure: The five steps of the Root Cause Analysis process]

These steps are best completed via an iterative approach rather than a sequential one, to encourage regular participant feedback and continuous improvement based on that feedback.

Describe the Problem

When describing a problem, start with a factual statement of what is happening and why it is a problem. The following questions can also help when describing a problem:

  • When did the problem first occur?
  • Is it continuous or occasional?
  • Has the frequency of occurrence increased or decreased over time?
  • Who are the stakeholders and what processes are involved?

Gather the Data

Once the problem is defined, you can begin gathering data, which usually entails collecting and reviewing examples of problem instances. You can also use these techniques to seek possible causes:

  • A review with subject matter experts (SMEs): does this root cause make sense?
  • A brainstorming session with SMEs and stakeholders: what do you think the problem could be?
  • Change analysis: what changed when the problem started?
  • Identification of archetypes: are there common patterns of behavior in systems?
  • Compare and contrast: when does the problem happen and when does it not?

Model Causal Changes

A causal model is a conceptual model that describes the causal mechanisms of a system. There are various techniques that can be used for modeling causal changes, but this blog will focus on three of the most common and widely useful: Five Whys, Fishbone Diagramming, and Causal Loops.

Five Whys

Five Whys is a good tool for identifying the single most prominent cause of a problem.

How to complete the Five Whys:

  1. Write down the specific problem to formalize it and describe it completely.
  2. Ask “why” the problem happens and write the answer down below the problem.
  3. If the answer does not identify the root cause of the problem you wrote down in step 1, ask “why” again and write that answer down.
  4. Repeat step 3 until the team agrees that the problem’s root cause has been identified. This may take fewer or more than five iterations.
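The steps above can be captured in a simple structure. The sketch below records a hypothetical Five Whys chain in Python; the problem statement and the answers are invented for illustration, not taken from a real analysis:

```python
# A minimal sketch of a Five Whys chain (all content is hypothetical).
problem = "The monthly risk report contains stale counterparty data"
whys = [
    "The feed from the trading system arrived after the reporting deadline",
    "The overnight batch job exceeded its processing window",
    "Input volumes doubled after a recent acquisition",
    "Capacity planning was never revisited after the merger",
    "No process exists to review batch capacity after structural changes",
]

print("Problem:", problem)
for depth, answer in enumerate(whys, start=1):
    print(f"Why #{depth}: {answer}")

# The last agreed-upon answer is treated as the root cause.
root_cause = whys[-1]
print("Root cause:", root_cause)
```

Note that the chain here happens to be five answers long, but as the steps above say, a real session may need fewer or more iterations before the team agrees on the root cause.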

Five Whys Example

[Figure: Five Whys diagram example]

Fishbone Diagramming

Fishbone Diagramming is effective for causal hierarchies and linear chains with multiple causes.

How to complete the Fishbone Diagram:

  1. Identify the problem statement and write it at the mouth of the fish.
  2. Identify the major categories of causes of the problem and write them as branches off the main arrow. Common categories include equipment or supply factors, environmental factors, rules/policy/procedure factors, and people/staff factors.
  3. Ask “why” a major category cause happens and write the answer as a branch from the appropriate category.
  4. Repeat for the other categories, asking “why” about each cause.
  5. Write sub-causes branching off the cause branches.
  6. Ask “why” and generate deeper levels of causes and continue organizing them under related causes or categories until the root cause is identified.
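The category/cause/sub-cause hierarchy the steps above produce is naturally a tree. A minimal Python sketch, using invented categories and causes purely for illustration, might record a fishbone diagram as nested dictionaries:

```python
# A hypothetical fishbone diagram as nested dicts:
# category -> cause -> list of sub-causes. All names are illustrative.
fishbone = {
    "problem": "Loan applications contain incomplete customer data",
    "categories": {
        "People": {"Insufficient staff training": ["No onboarding checklist"]},
        "Process": {"Manual data entry": ["No validation step before submission"]},
        "Technology": {"Legacy intake system": ["Required fields not enforced"]},
    },
}

def walk(diagram):
    """Yield (category, cause, sub_cause) triples from the diagram."""
    for category, causes in diagram["categories"].items():
        for cause, sub_causes in causes.items():
            for sub in sub_causes:
                yield category, cause, sub

for category, cause, sub in walk(fishbone):
    print(f"{category} -> {cause} -> {sub}")
```

Deeper levels of “why” simply add further nesting under the relevant cause until the root cause is reached.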

Fishbone Diagram Example

[Figure: Fishbone diagram example]


Causal Loops

Causal Loops work well for complex situations that involve circles of influence.

How to complete Causal Loops:

  1. Identify the nouns or variables that are important to the issue.
  2. Fill in the “verbs” by linking the variables together and determining how one variable affects the other. Generally, if two variables move in the same direction or have a positive relationship, the link is denoted with an “s”. If the two variables move in opposite directions or have a negative relationship, the link is labeled with an “o”.
  3. Determine if the links in the loop will produce a reinforcing or balancing causal loop and label them accordingly. To determine the type of the loop, count the number of “o’s”. If there are an even number of “o’s” or none are present, it is a reinforcing loop. If there are an odd number of “o’s”, it is a balancing loop.
  4. Walk through the loops and “tell the story” to be sure the loops capture the behavior being described.
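The polarity-counting rule in step 3 is mechanical enough to express in code. Here is a minimal Python sketch of that rule; the example link sequences are hypothetical:

```python
def classify_loop(links):
    """Classify a causal loop from its link polarities.

    links: sequence of "s" (variables move in the same direction) and
    "o" (variables move in opposite directions).
    An even number of "o" links, including zero, gives a reinforcing loop;
    an odd number gives a balancing loop.
    """
    o_count = sum(1 for link in links if link == "o")
    return "reinforcing" if o_count % 2 == 0 else "balancing"

# Hypothetical loops:
print(classify_loop(["s", "s"]))       # no "o" links -> reinforcing
print(classify_loop(["s", "o"]))       # one "o" link -> balancing
print(classify_loop(["o", "o", "s"]))  # two "o" links -> reinforcing
```

After classifying a loop this way, step 4 still applies: walk through it and “tell the story” to confirm the label matches the behavior being described.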

Causal Loop Example

[Figure: Causal Loop example]


These causal modeling techniques can be used alone or in combination with one another to gather as much information about the problem as possible.

Identify Root Causes

When identifying the root cause(s), it is a good idea to ask questions like “Where can I remove or correct the issue?” or “Where can I minimize the effect?”. It is worth noting that good analysis is actionable analysis, so if there is not enough information to answer these questions, it may be a good idea to circle back to the previous steps in the process.

Recommend Actions

Finally, when recommending actions, we want to eliminate interference and errors, improve processes, and consider the side effects of actions. It is beneficial to plan ahead and predict the effects of your solution so you can spot potential failures before they happen. It is important to learn from the underlying issues behind the root cause so that you can apply what you learned to systematically prevent future issues. RCA may require multiple corrective actions, but if a root cause is identified correctly, it is unlikely that problems will recur.


RCA is an essential way to perform a comprehensive, system-wide review of significant problems and the factors that led to them. By following the RCA process above, you will be able to describe a problem, gather data, model causal changes, identify root causes, and ultimately recommend actions that lead to a long-term solution.




Data Governance in FIs: Intro to Data Governance

This blog series will focus on Data Governance in Financial Institutions. Our first post introduces data governance fundamentals. It will be followed by a discussion of root cause analysis, metadata, the five stages of data governance deployment, and a final blog that crafts a business case for data governance.

Today’s industry leaders recognize data as one of their top enterprise assets. According to Gartner, a leading global research firm, 20-25% of enterprise value is directly attributable to the quality of a firm’s data. However, Financial Institutions (FIs) often underutilize this key business driver by not establishing a formal data strategy.

Let’s look at some of the challenges to building a data strategy, opportunities for implementing a data strategy, critical components of a successful DG program, and the aspects of data that can be governed. We also want to discuss some potential consequences of poor DG implementation and the most important step to mitigate the risk of it occurring in a Financial Institution.

What is Data Governance?

Data Governance (DG) serves as the framework for defining the who, what, when, how, where, and why of your formal data strategy. Through a collection of policies, roles, and processes, DG ensures the proper definition, management, and use of data toward achieving enterprise goals.

Challenges of Building a Data Strategy

Too often, the largest hindrance to building a data- and analytics-driven enterprise is the enterprise itself. For historical reasons, data tends to be siloed within internal business units, resulting in disparate collections of overlapping yet inconsistent data. Given that data is built and accumulated over time in various places in the organization, often via mergers and acquisitions, it can be difficult and time-consuming to gather and use the data.

Without a transparent view of enterprise-wide data, credible decision making becomes nearly impossible. More time is spent gathering and consolidating the data than analyzing it. The goal of DG, then, is to break down the silos in which data becomes segregated and foster a holistic approach toward managing common data. Common data creates a shared understanding of information and is of paramount importance when sharing data between different systems and/or groups of people.

With the proper implementation of DG standards (data naming, quality, security, architecture, etc.), a firm can realize a variety of optimization-based benefits.

Data Strategy Opportunities

An enterprise that properly implements and executes DG creates opportunities for enhanced productivity.

For example, if an enterprise works with large data sets, having defined naming standards allows for data consistency across all commonly used domains (e.g., Customer, Transactions, Employee) within the enterprise. This results in increased productivity and a competitive advantage relative to other firms.

As DG improves operational efficiencies, FIs can expect increased customer satisfaction rates, attracting both a loyal following from current customers and new prospects.

Critical Components of a Successful Data Governance Program

FIs accumulate a great deal of information as part of their normal business processes, so it may be difficult to identify which data needs to be governed.

It is important to note that not all data needs to be governed. There are two types of data that do not need DG: department-specific data and application data not needed for regulatory reporting or cross-department communication.

However, there are three key types of data that should be governed to provide reliable information that can be leveraged across all departments of the FI:

  • Strategic data is unique and usually created within the company, providing a competitive advantage to the firm. A few examples include data about customer insight, market insight, and risk models.
  • Critical data ‘materially affects’ external reporting and risk management, and/or supports critical business functions. This includes financial data, supply chain data, and counterparty data.
  • Shared data is used in multiple business processes in which the definition, quality, and format needs to be synchronized. For example, customer data for marketing, customer service and sales, and counterparty data for risk management and pricing.

Critical Data Aspects

Beyond the data itself, there are multiple aspects of data that are critical to govern. A successful program will consider the following:


Data Ownership: The possession of and responsibility for information

Data Handling: Ensuring that data is stored, archived, or disposed of in a safe and secure manner

Data Allowable Values: The defined set of values to which a data element may be restricted

Metadata: A set of data that describes and gives information about other data

Data Storage: The recording of information in a storage medium

Data Architecture: The structure of an organization’s logical and physical data assets and data management resources

Data Quality: The degree to which data is accurate, complete, consistent, and fit for its intended use

Data Definitions: The formal description of a field, including the data domain it references, its data type, and the format of data entry

Data Reporting: Collecting and formatting raw information and translating it into a digestible format to assess business performance
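As a concrete illustration of the allowable-values aspect, a field can be restricted to a fixed set of values with an enumeration. The status names below are hypothetical, chosen only to make the idea tangible:

```python
from enum import Enum

# Hypothetical allowable values for an account-status field.
class AccountStatus(Enum):
    ACTIVE = "active"
    DORMANT = "dormant"
    CLOSED = "closed"

def validate_status(value):
    """Return the matching AccountStatus, or raise for a disallowed value."""
    try:
        return AccountStatus(value)
    except ValueError:
        raise ValueError(f"{value!r} is not an allowable account status")

print(validate_status("active"))
```

Governing allowable values in this way means a disallowed entry is rejected at the point of capture rather than discovered later in reporting.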

Poor DG Consequences

A word of caution: there is such a thing as poor DG implementation. If your program is poorly built, the enterprise will suffer.

Building inefficient processes, for example, can delay timelines for tasks like data retrieval and data analysis.

An inferior DG implementation may also create compliance issues. If the program is difficult to understand, enterprise employees may disregard your guidelines.

Overall, if DG is applied within internal silos, it cannot be optimized across the organization. The segregation of data that internal silos create needs to be broken down to achieve the goal of managing common data.

How to Mitigate Poor DG Risk

The entire FI must “buy in” to a DG program for it to be most effective. Without assistance from both data practices and business functions in the rollout of DG program initiatives, the program will likely fail. It is the responsibility of the business, IT, and internal operations functions to be fully engaged and coordinated throughout the implementation of DG program initiatives.

What’s Next?

Now that we have outlined what a successful Data Governance program includes, it is time to discuss Root Cause Analysis. Our next post in this series will discuss how to find root causes in FIs and recommend actions to solve problems that you may face when implementing a DG program.





Data Management – Leveraging Data for a Competitive Advantage

This is the first in a series of blogs that explore how data can be an asset or a risk to organizations in an uncertain economic climate.

Humanity has always valued novelty. Since the advent of the Digital Age, this preference has driven change at an astronomical pace. For example, more data was generated in the last two years than in all of prior human history, a fact made more staggering by Machine Learning and Artificial Intelligence tools that allow users to access and analyze data as never before. The question now is: how can business leaders and investors best make sense of this information and use it to their competitive advantage?

Traditionally, access to good data has been a limiting factor. Revolutionary business strategies were reserved for those who knew how to obtain, prepare, and analyze it. While top-tier decision making is still data- and insight-driven, today’s data challenges are characterized more by glut than scarcity, both in terms of overall volume of information and the tools available to make sense of it. As of today, only 0.5% of data that is produced is even analyzed.

This overabundance of information and tech tools has ironically led to greater uncertainty for business leaders. Massive data sets and powerful, user-friendly tools often mask underlying issues, resulting in many firms maintaining and processing duplicates of their data and creating silos of critical but unconnected data that must be sorted and reconciled. Analysts still spend 80% of their time collecting and preparing their data and only 20% analyzing it.

Global interconnectivity is making the world smaller and more competitive. Regulators, who understand the power of data, are increasing controls over it. Now, more than ever, it is critical for firms to take action. To remain competitive, organizations must understand the critical data that drives their business so they can use it, along with alternative data sets, for future decision making; otherwise, they face obsolescence. These are not just internal concerns. Clients are also requesting more customized services and demanding to understand how firms are using their information. Firms must identify critical data and understand that all data is not, and should not be, treated the same, so they can extract the full power of the information and meet client and regulatory requirements.

Let’s picture data as an onion. As the core of the onion supports its outer layers, the ‘core’ or critical enterprise data supports all the functions of a business. When the core is strong, so is the rest of the structure. When the core is contaminated or rotten, that is a problem, both for the onion and for your company.

[Figure: An onion with healthy core data vs. an onion with a contaminated core]

Data that is core to a business – information like client IDs, addresses, products and positions, to name a few examples – must be solid and healthy enough to support the outer layers of data use and reporting in the organization. This enterprise data must be defined, clean and unique, or the firm will waste time cleaning and reconciling it, and the client, business and regulatory reports that it supports will be inaccurate.

How do you source, define, and store your data so you can cleanly extract the pieces you need? Look at the onion again. You could take the chainsaw approach to slice the onion, which would give you instant access to everything inside, good and contaminated, and would probably spoil your dish. Likewise, if you use bad data at the core, any calculations you perform on it, or reports aggregating it, will not be correct. If a specific recipe requires a clean slice of onion (or a particular report requires calculated data), the precision and cleanliness of the slice (good core data and a unique contextual definition) are key.

Once your core data is unique, supported, and available, clients, business, and corporate users can combine it with alternative and non-traditional data sets to extract information, enhance it, and add value. As demand for new “recipes” of data (for management, client, or regulatory reporting) is ever increasing, firms that do not clean up and leverage their core data effectively will become obsolete. These demands can be anything from data needed for instant access and client reporting across different form factors (e.g., web, iOS, and Android apps), to data visualization and manipulation tools for employees analyzing new and enhanced information to determine trends. Demand also stems from the numerous requirements needed to comply with the complex patchwork of regional financial regulations across the globe. Many different users, many different recipes, all reliant on the health of their core data (the onion core).

What is the actionable advice when you read a headline like: “A recent study in the Harvard Business Review found that over 92% of surveyed firms agreed that data analytics for decision making will be more important 2 years from now”? We have some ideas. In this blog series, FRG Data Advisory & Analytics will take you through several use cases to outline what data is foundational or core to business operations and how to achieve the contextual precision demanded from the market and regulators within our current environment of uncertainty, highlighting both how data can be an asset, or a potential risk, if not treated appropriately.

Dessa Glasser, Ph.D., is an FRG Principal Consultant with more than 30 years of experience designing and implementing innovative solutions and organizations in data, risk, and analytics. She leads the Data Advisory & Analytics Team for FRG and focuses on data, analytics, and regulatory solutions for clients.

Edward Hanlon is a Senior Consultant and Engagement Manager on FRG’s Data Advisory & Analytics Team. He focuses on development and implementation of data strategy solutions for FRG, leveraging previous experience launching new Digital products and reengineering operational models as a Digital Technology platform owner and program lead in financial services.


Avoiding Discrimination in Unstructured Data

An article published by the Wall Street Journal on Jan. 30, 2019 got me thinking about the challenges of using unstructured data in modeling. The article discusses how New York’s Department of Financial Services is allowing life insurers to use social media, as well as other nontraditional sources, to set premium rates. The crux: the data cannot unfairly discriminate.

I finished the article with three questions on my mind. The first: How does a company convert unstructured data into something useful? The article mentions that insurers are leveraging public information – like motor vehicle records and bankruptcy documents – in addition to social media. Surely, though, this information is not in a structured format to facilitate querying and model builds.

Second: How does a company ensure the data is good quality? Quality here doesn’t only mean the data is clean and useful, it also means the data is complete and unbiased. A lot of effort will be required to take this information and make it model ready. Otherwise, the models will at best provide spurious output and at worst provide biased output.

The third: With all this data available what “new” modeling techniques can be leveraged? I suspect many people read that last sentence and thought AI. That is one option. However, the key is to make sure the model does not unfairly discriminate. Using a powerful machine learning algorithm right from the start might not be the best option. Just ask Amazon about its AI recruiting tool.[1]

The answers to these questions are not simple, and they do require a blend of technological aptitude and machine learning sophistication. Stay tuned for future blog posts as we provide answers to these questions.


[1] Amazon scraps secret AI recruiting tool that showed bias against women


Jonathan Leonardelli, FRM, Director of Business Analytics for the Financial Risk Group, leads the group responsible for model development, data science, documentation, testing, and training. He has over 15 years’ experience in the area of financial risk.

Top 6 Things To Consider When Creating a Data Services Checklist

“Data! Data! Data! I can’t make bricks without clay.”
— Sherlock Holmes, in Arthur Conan Doyle’s The Adventure of the Copper Beeches

You should by now have a solid understanding of the growth and history of data, data challenges and how to effectively manage them, what data as a service (DaaS) is, how to optimize data using both internal and external data sources, and the benefits of using DaaS. In our final post of the series, we will discuss the top six things to consider when creating a Data Services strategy.

Let’s break this down into two sections: 1) pre-requisites and 2) the checklist.


We’ve identified five crucial points below to consider prior to starting your data services strategy. These will help frame and pull together the sections of information needed to build a comprehensive strategy that moves your business toward success.

1: View data as a strategic business asset

In the age of data regulation, including the BCBS 239 principles for effective risk data aggregation and risk reporting, GDPR, and others, data, especially data relating to an individual, is considered an asset that must be managed and protected. It can also be aggregated, purchased, traded, and legally shared to create bespoke user experiences and to support more targeted business decisions. Data must be classified and managed with the appropriate level of governance, in the same vein as other assets such as people, processes, and technology. Adopting this mindset, appreciating the value of data, and recognizing that not all data is alike and must be managed appropriately will ultimately help ensure business success in a data-driven world.

2: Ensure executive buy-in, senior sponsorship and support

As with any project, executive buy-in is required to ensure top-down adoption. Partnering with business-line executives who create data and are power users of it can help champion its proper management and reuse across the organization. This assists in achieving goals and ensuring project and business success. The numbers don’t lie: business decisions should be driven by data.

3: Have a defined data strategy and target state that supports the business strategy

Having data for the sake of it won’t provide any value; rather, a clearly defined data strategy and target state that outlines how data will support the business will allow for increased user buy-in and support. This strategy must include and outline:

  • A Governance Model
  • An Organization chart with ownership, roles and responsibility, and operations; and
  • Goals for data accessibility and operations (or data maturity goals)

If these sections are not agreed upon from the start, uncertainty, overlapping responsibilities, duplication of data and effort, and regulatory or even legal issues may arise.

4: Have a Reference Data Architecture to Demonstrate where Data Services Fit

Understanding the architecture that supports data and data maturity goals, including the components required to manage data from acquisition through distribution and retirement, is critical. It is also important to understand how these components fit into the overall architecture and infrastructure of the firm’s technology. Defining a clear data architecture and its components, including:

  • Data model(s)
  • Acquisition
  • Access
  • Distribution
  • Storage
  • Taxonomy

is required prior to integrating the data.

5: Data Operating Model – Understanding How the Data Traverses the Organization

It is crucial to understand the data operations and operating model – including who does what to the data and how the data ownership changes over time or transfers among owners. Data lineage is key – where your data came from, its intended use, who has/is allowed to access it and where it goes inside or outside the organization – to keep it clean and optimize its use. Data quality tracking, metrics and remediation will be required.

Existing recognized standards such as the Global Legal Entity Identifier (LEI) that can be acquired and distributed via data services can help in the sharing and reuse of data that is ‘core’ to the firm. They can also assist in tying together data sets used across the firm.

Checklist/Things to Consider

Once you’ve finished the requirements gathering and understand the data landscape, including roles and responsibilities described above, you’re now ready to begin putting together your data services strategy. To build an all-encompassing strategy, the experts suggest inclusion of the following.

1:  Defined Data Services Required

  •  Classification: core vs. business shared data and ownership
    • Is everyone speaking a common language?
    • What data is ‘core’ to the business, meaning it will need to be commonly defined and used across the organization?
    • What data will be used by a specific business that may not need to be uniformly defined?
    • What business-specific data will be shared across the organization, which may need to be uniformly defined and might need more governance?
  • Internal vs external sourcing
    • Has the business collected or created the data themselves or has it been purchased from a 3rd party? Are definitions, metadata and business rules defined?
    • Has data been gathered or sourced appropriately and with the correct uniform definitions, rules, metadata and classification, such as LEI?
  • Authoritative Data Sources for the Data Services
    • Have you documented where, from whom, when etc. the data was gathered (from Sources of Record or Sources of Origin)? For example, the Source of Origin might be a trading system, an accounting system or a payments system. The general ledger might be the Source of Record for positions.
    • Who is the definitive source (internal/external)? Which system?
  • Data governance requirements
    • Have you adhered to the proper definitions, rules, and standards set in order to handle data?
    • Who should be allowed to access the data?
    • Which (critical, usually externally facing) applications must access the data directly?
  • Data operations and maintenance
    • Have you kept your data clean and up to date?
    • Are you up to speed with regulations, such as GDPR, and successfully obtained explicit consent for the information?
    • Following your organization chart and the rules and requirements detailed above, are the data owners known and informed, and do they understand they are responsible for maintaining their data’s integrity?
    • Are data quality metrics monitored with a process to correct data issues?
    • Do all users with access to the data know who to speak to if there is a data quality issue and know how to fix it?
  • Data access, distribution and quality control requirements
    • Has the data been classified properly? Is it public information? If not, is it restricted to those who need it?
    • Have you defined how you share data between internal/external parties?
    • Have the appropriate rules and standards been applied to keep it clean?
    • Is there a clearly defined process for this?
  • Data integration requirements
    • If the data will be merged with other data sets/software, have the data quality requirements been met to ensure validity?
    • Have you prioritized the adoption of which applications must access the authoritative data distributed via data services directly?
    • Have you made adoption easy – allowing flexible forms of access to the same data (e.g., via spreadsheets, file transfers, direct APIs, etc.)?
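Several of the checklist questions above, such as monitoring data quality metrics, can be made concrete with a small completeness check. The sketch below is illustrative only; the records, field names, and identifier values are invented:

```python
# Hypothetical customer records; None marks a missing value.
records = [
    {"customer_id": "C001", "lei": "LEI0000000000000001", "email": "a@example.com"},
    {"customer_id": "C002", "lei": None, "email": "b@example.com"},
    {"customer_id": "C003", "lei": "LEI0000000000000002", "email": None},
]

def completeness(rows, field):
    """Share of rows in which the given field is populated."""
    filled = sum(1 for row in rows if row.get(field) is not None)
    return filled / len(rows)

# Report a simple data-quality metric per field.
for field in ("customer_id", "lei", "email"):
    print(f"{field}: {completeness(records, field):.0%} complete")
```

A metric like this, tracked over time and tied to a remediation process and a known data owner, is what turns the checklist questions from aspirations into operations.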

2: Build or Acquire Data Services

To recap, are you building or acquiring your own Data Services? Keep in mind that the following must be met and must adhere to compliance requirements:

  • Data sourcing and classification, assigning ownership
  • Data Access and Integration
  • Proper Data Services Implementation, access to authoritative data
  • Proper data testing, and data remediation, keeping the data clean
  • Appropriate access control and distribution of the data, flexible access
  • Quality control monitoring
  • Data issue resolution process

The use and regulations around data will be constantly evolving as will the number of users data can support in business ventures. We hope that this checklist will provide a foundation towards building and supporting your organization’s data strategies. If there are any areas you’re unclear on, don’t forget to take a look back through our first five blogs which provide more in-depth overviews on the use of data services to support the business.

Thank you for tuning into our first blog series on data management. We hope that you found it informative and, most importantly, useful toward your business goals.

If you enjoyed our blog series or have questions on the topics discussed, write to us on Twitter @FRGRisk.

Dessa Glasser is a Principal with the Financial Risk Group, and an independent board member of Oppenheimer & Company, who assists Virtual Clarity, Ltd. on data solutions as an Associate. 



Data as a Service (DaaS) – The Benefits

Let’s start with a succinct summary of the benefits of DaaS.

Data as a Service (DaaS) is one way to consistently deliver and effectively manage data from multiple sources across the firm, both internal and external. It can be used as one “logical” and centralized, authoritative (golden) source for critical data used across the organization.

It is an efficient way to deliver data that can also improve speed to market on requests for new and additional data, whether from internal parties, regulators, or substitute sources.

DaaS can thus be used effectively to achieve the following:

  • Reduce the cost of supplying non-proprietary external data needed across the firm
  • Quickly deliver internal, proprietary data to groups that need to share data
  • Deliver a single view of the data across Finance, Risk and the Business to meet business and regulatory demands
  • Provide a 360-degree view of clients for firms with complex client relationships and service organizations
  • Deliver a definitive record of a firm’s products across the organization

At the same time, the quality of the data can be monitored and reported centrally, alongside federated (decentralized) data ownership. ‘Data owners’ remain responsible for defining and maintaining the data that they generate and know best, while others share it. Examples include definitions of a firm’s products by the marketing groups, or analytic calculations, such as Risk-Weighted Assets or capital calculations from Finance and Risk groups. Transparency of the data is increased and reuse of the data is facilitated.
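A federated-ownership model like this can start as a central registry recording which group owns and defines each data domain: ownership stays decentralized, while the lookup is central. The Python sketch below is illustrative (the domain names, owners, and registry shape are assumptions, not from the source):

```python
from dataclasses import dataclass

@dataclass
class DataDomain:
    """One federated domain: the owning group defines and maintains it."""
    name: str
    owner: str       # accountable group, e.g. "Marketing" or "Risk"
    definition: str  # business definition published by the owner

# Hypothetical central registry over decentralized ownership.
REGISTRY = {
    "product_definitions": DataDomain(
        "product_definitions", "Marketing",
        "Definitions of the firm's products"),
    "risk_weighted_assets": DataDomain(
        "risk_weighted_assets", "Risk",
        "Risk-Weighted Asset calculations"),
}

def owner_of(domain: str) -> str:
    """Central lookup lets any consumer find who maintains a data set."""
    return REGISTRY[domain].owner
```

Any team that wants to reuse a data set can find its accountable owner and published definition in one place, which is what makes sharing practical.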

Critically, the quality of data can be significantly improved when DaaS is implemented within a firm. Central data monitoring, access, and updating by the Sources of Record ensure the data is sourced from the owners on a timely basis. Sharing and reuse of data, with multiple eyes on the same data, allow for quick resolution of errors and can save companies potential embarrassment.

All of this leads to three key benefits for firms:

  • Agility: Firms become more agile as they can quickly implement changes and roll out new data because of the unified data models, transparency, and simplicity of the process.
  • Flexibility and Cost Efficiency: New applications and the necessary regression testing – which verifies that previously developed software still performs the same way after it has been interfaced with other software – are easier, as definitions, structures, and data models are already known and often enhanced and extended.
  • Transparency: Firms utilize unified data models, definitions, metadata, tools, and support. They can leverage the specific experience of data owners and providers to access data closer to the source and increase transparency and benefit from enhancements.

All of the above seems so logical, so sensible. And it is. As we’ve seen, however, it’s not the logic behind the DaaS process which trips people up; it’s mastering the practical implementation of the process.

In the next blog, we offer a checklist of things to consider when you’re developing a DaaS solution for your firm.




Questions? Comments? Talk to us on Twitter @FRGRISK.


Data as a Service (DaaS) Data Sources – Internal or External?

As mentioned previously, Data as a Service (DaaS) can be used to provide a single source of authoritative (or golden) data for use in a firm’s critical applications, particularly when data is needed from multiple sources or it is ‘siloed’ in the organization.

DaaS provides the ability to use a single mechanism to deliver data from both internal or external data sources in a consistent format and to monitor its quality and use. The choice of whether to utilize internal or external providers depends on the proprietary nature of your data or unique requirements.

External providers are appropriate when data is ‘commoditized’, or readily available and accessible, where there is no competitive advantage to generate the data. Securities Master and other reference data, like currency codes and exchange tickers, are examples of external data that can be provided via DaaS. Here you can rely upon a third party to provide the information in a usable form that can be plugged into the DaaS framework and delivered to users.

It may also make sense to turn to an external source when a third-party data provider has a particular expertise in data, such as a specialized data feed of prepayment or economic data, vendor indexes, or universal identifier information, such as the Legal Entity Identifier (LEI). Independence requirements for items like securities pricing and valuations will often dictate use of an external service, too.
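As a concrete illustration of universal identifiers, an LEI can be checked mechanically before it enters the DaaS framework: ISO 17442 LEIs are 20 alphanumeric characters whose last two digits are check digits under the ISO 7064 MOD 97-10 scheme (the same one used for IBANs). A minimal Python sketch:

```python
def lei_checksum_ok(lei: str) -> bool:
    """Return True if a 20-character LEI passes the ISO 7064 MOD 97-10
    check used by ISO 17442 (letters map to numbers: A=10 .. Z=35)."""
    if len(lei) != 20 or not lei.isascii() or not lei.isalnum():
        return False
    # int(ch, 36) maps '0'-'9' to 0-9 and 'A'-'Z' to 10-35.
    as_digits = "".join(str(int(ch, 36)) for ch in lei.upper())
    return int(as_digits) % 97 == 1
```

A gate like this catches transcription errors at the point of ingestion, rather than after the bad identifier has spread to downstream consumers.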

The benefit of using external providers for data delivered via DaaS is that it frees your internal teams to focus on derived data and value-added activities, such as analyzing the data and developing products. It can also be used to ensure independence for items like third-party pricing.

Communication and transparency with your external provider are essential, though. External providers have to be made aware of exactly what a company needs so the data can be kept accurate. Data subject to regulatory scrutiny, whether internal or external, usually must be 100% correct, while data for internal use only – such as a client’s dining preferences, or unstructured data used to identify trends for marketing and product development – might not need the same level of scrutiny (although any personal data still requires security and privacy protection).

If using an external data provider, don’t become so dependent on that supplier that you can’t substitute one for another – you may need to switch suppliers down the road because your needs change or to improve data quality. Instead, adapt the feed to your own interface and format so that another provider can be inserted easily if need be. That is the power of DaaS. A standardized framework makes additions and substitutions easier. It can also assist in tracking the use of data as required to meet regulatory requirements, such as the General Data Protection Regulation (GDPR).
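Adapting each supplier to your own interface is essentially the adapter pattern. In the Python sketch below (the vendor classes, tickers, and prices are hypothetical), consumers code against one firm-owned interface, so swapping suppliers is a configuration change rather than a rewrite:

```python
from typing import Protocol

class PriceFeed(Protocol):
    """The firm's own interface: every supplier is adapted to this shape."""
    def price(self, ticker: str) -> float: ...

class VendorAFeed:
    """Hypothetical adapter around one vendor's API."""
    def price(self, ticker: str) -> float:
        return {"IBM": 145.0}.get(ticker, 0.0)  # stand-in for a real call

class VendorBFeed:
    """A substitute supplier behind the same interface."""
    def price(self, ticker: str) -> float:
        return {"IBM": 144.9}.get(ticker, 0.0)

def latest_price(feed: PriceFeed, ticker: str) -> float:
    # Consumers depend only on the interface, never on a vendor class.
    return feed.price(ticker)
```

Switching from VendorAFeed to VendorBFeed touches only the line where the feed object is constructed; everything downstream of the DaaS interface is unchanged.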

Regardless of whether the DaaS service is internal or external, companies have to be prepared to give a clear definition of the data and metadata to ensure the correct use and interpretation of the data and to assist in its use and management. This will assist in answering queries from regulators or third parties, including data providers.




Data as a Service (DaaS) Solution – Described

Data as a Service (DaaS) can be used to provide a single source of authoritative (or golden) data for use in a firm’s critical applications. Here, a logical layer of the data (often held in-memory for quick access) can serve up data that has been verified, defined, and described with metadata from source systems. This provides data that is readily understood and has a unique and unambiguous meaning within the context in which it is known and used.

Source systems can be tapped in real time to ensure that all changes are accurately and immediately represented in the data service. These source systems can be internal or external to the firm, depending on the needs of the receiving party.

The authoritative data can then be served up to multiple users at the same time, delivered in a format that they prefer (e.g., file transfer, online access, download into other systems or spreadsheets), giving them quicker access to information in a format that they can readily use.

Cleaning the data, describing it, and distributing it from a central logical location to users and applications allows data quality checks to be performed and efficiencies to be gained. Given that ‘all eyes’ are on the same data, any data quality issues are quickly identified and resolved.
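A toy Python sketch of such a logical layer (the record shape and the quality rule are illustrative assumptions, not the authors' design) shows the idea: records are checked on publication, stored centrally, and served together with their metadata:

```python
from dataclasses import dataclass

@dataclass
class GoldenRecord:
    value: dict     # the verified data itself
    metadata: dict  # e.g. definition, owner, source system, as-of time

class DataService:
    """Minimal in-memory 'logical layer': records are checked on the
    way in and served with their metadata on the way out."""

    def __init__(self):
        self._store = {}
        self.issues = []  # central log of data quality problems

    def publish(self, key, value, metadata):
        # Toy quality gate: every record must name its owner.
        if "owner" not in metadata:
            self.issues.append(f"{key}: missing owner metadata")
            return False
        self._store[key] = GoldenRecord(value, metadata)
        return True

    def get(self, key):
        return self._store[key]
```

Because every consumer reads through `get`, a rejected record and its logged issue are visible centrally instead of surfacing later as divergent copies.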

DaaS offers the flexibility to provide access to both internal and external data in an easily consumable form. Access to a multitude of authoritative data in a consistent format can be extremely useful in timely delivery of new applications or reporting, including regulatory reports, and will be quicker than waiting for a single physical source for this data to be built.

This is particularly useful when data is needed by multiple parties or is ‘siloed’ in an organization. How many versions are there? How many platforms? Don’t forget, data generation has vast potential.

The more complex your data needs, the more likely that a DaaS solution will benefit you.



Data Management – The Challenges

Does your company suffer from the challenges of data silos? Dessa Glasser, Principal with the Financial Risk Group, who assists Virtual Clarity on data solutions as an Associate, discusses the challenges of data management in the second post of our blog series.

In our previous blog, we talked about the need for companies to get a handle on their data management. This is tough, but necessary. As companies develop – as they merge and grow, and as more data becomes available to them in multiple forms – data silos occur, making it difficult for a ‘single truth’ of the data to emerge. Systems and data are available to all, but behavior often differs among teams, including the ‘context’ in which the data is used. Groups have gathered and enhanced their own data to support their business, making it difficult to reconcile and converge on a single source for business-critical data.

This complication is magnified because:

  • New technology brings in large amounts of data – both structured and unstructured
  • Each source has its own glossary of terms, definitions, metadata, and business rules
  • Unstructured data often needs tagging to structured data to assist firms in analytics
  • Structured and unstructured data require metadata to interpret the data and its context
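Tagging unstructured data to structured data, as the third bullet describes, can start very simply: scan free text for known structured identifiers. The Python sketch below (the client names, IDs, and note are made up) attaches client IDs and dates to an unstructured note so it can be joined with structured data in analytics:

```python
import re

# Hypothetical lookup linking known client names to structured IDs.
CLIENT_IDS = {"Acme Corp": "C-1001", "Globex": "C-1002"}

def tag_document(text: str) -> dict:
    """Attach structured tags to an unstructured note so it can be
    joined with structured data downstream."""
    tags = [cid for name, cid in CLIENT_IDS.items() if name in text]
    dates = re.findall(r"\d{4}-\d{2}-\d{2}", text)  # ISO-style dates
    return {"text": text, "client_ids": tags, "dates": dates}
```

Real implementations would use entity resolution rather than exact string matching, but the output is the same in kind: metadata that gives the unstructured text a structured context.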

The problem, then, is not getting the data. The problem is processing, interpreting and understanding the data.

Companies can also be hindered by the ‘do it yourself’ mentality of their teams, whereby individuals who want systems implemented immediately will often construct a process and data themselves rather than waiting for IT to deliver it, which takes time or may not happen on a timely basis.

These cross-over efforts undermine a firm’s ability to use the data effectively and often lead to:

  • Data sources being available in multiple forms – both internal and external
  • The costly and manual reconciliation of incorrect data and difficulty aggregating data
  • The inability to generate business insights from the data – more time is spent processing and reconciling the data than analyzing it

Meanwhile, clients are demanding a holistic view of the services they’re buying into. Management and regulators, when they ask for data, want to see the full relationship with clients across the firm and a holistic view of all aggregated risk positions – which is hard to pull together from numerous teams who work with, and may interpret, the data differently. Companies must present a cohesive front, regardless of each team’s procedures or the context in which the data is used.

All of the above are prime examples of why the governance and management of data is essential. The end goal is one central, logical, authoritative source for all critical data for a company. It is important to treat data as a business asset and ensure the timely delivery of both well-defined data and metadata to the firm’s applications and business users. This can be done by developing a typical data warehouse to serve up the data, which often can take years to build. However, this can also be facilitated more quickly by leveraging advances in technologies, such as the cloud, data access and management tools, and designing a Data as a Service (DaaS) solution within a firm.

So, how to go about it?

Tune in next month for blog 3, where we’ll discuss just that.



Data Is Big, Did You Know?

Data is big. Big news. Big importance.

How big, you ask? Consider that all the information we have as the human race has been growing since the beginning of time. At the same time, we enact more processes every day that add to this growing data – whether at a company or personal level; local or global; text or numerical; in a native or foreign language; and now in digital formats, with pictures or video.

Let’s put the sheer volume of data gathering into some perspective. Back in 2003, Turek of IBM estimated that between the beginning of time and 2003, 5 exabytes (5 billion gigabytes) of data had been created. By 2011, that same 5 exabytes of data was being generated every two days. In 2015, Forbes published “20 Mind-Boggling Facts” indicating that more data had been created between 2013 and 2015 than in the entire history of the human race, and that “by the year 2020, about 1.7 megabytes of information will be created every second for every human being on the planet”, “growing from 4.4 zettabytes” (in 2015) “to 44 zettabytes, or 44 trillion gigabytes”.

What are the reasons for this exponential growth in data?

A key factor is the evolution of computing. In the space of just one hundred years, we’ve evolved from the very first, basic tabulating systems to the cognitive, sensory systems we see in today’s world. Progress has been fierce and rapid. We would argue there has never been a more significant advancement in the development of humans as that of computing. It’s changed the ways in which we interact with one another, the ways we process information, and, crucially, the ways, and the speed, at which we do business.

The explosion of data occurs across all platforms. We no longer communicate just in binary or text – why would we, when there are so many more stimulating options out there, such as multimedia, visuals, sensors, and hand-held devices? The ways in which we generate, and consume, data have grown and grown, whilst people’s attention spans have shrunk to that of a goldfish. This has led to the introduction of even more mediums of communication and data generation, and to the need for tools such as machine learning and artificial intelligence to process the large amounts of data.

The consequences of Big Data

Companies, consequently, find themselves having to deal with significant amounts of disparate data available from multiple sources, internal and external – whether from clients, employees, or vendors; from internal operations or growth; or from mergers, acquisitions, and the like.

All of this requires significant time spent reconciling and processing data, and the use of tools (such as business intelligence, knowledge graphs, machine learning, etc.) to analyze it. How do you make sense of all of it? How do you interpret it in a way that allows you to use it to build a more efficient business model? Who can help you with this? Forbes in 2015 estimated that “less than 0.5% of all data is ever analysed and used”, creating a significant business opportunity.

In this six-blog series, we’re going to talk about the challenges that companies face in managing data and the type of tools available for managing the data. We’re going to tell you why it’s important; and we’re going to explain the benefits of getting a proper handle on your data management.

For this, we’re going to draw on Dessa Glasser, Principal at the Financial Risk Group, working with Virtual Clarity on data strategies, for her knowledge of data management strategies and tools, including the use of Data as a Service (DaaS) to manage and provision data. Dessa, the former CDO of JPMorgan Chase Asset Management and Deputy Director of the Office of Financial Research (US Treasury), has a wealth of experience implementing solutions in risk, data, and analytics across financial and non-financial firms in both the private and public sectors, including driving operational efficiencies and change management through tools such as DaaS.


