Data as a Service (DaaS) Solution – Described

Data as a Service (DaaS) can be used to provide a single source of authoritative (or golden) data for use in a firm’s critical applications. Here, a logical layer of the data (often in-memory for quick access) serves up data that has been verified, defined, and described with metadata from source systems. This provides data that is readily understood and carries a unique, unambiguous meaning within the context in which it is known and used.
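To make the idea of a logical, metadata-aware layer concrete, here is a minimal sketch in Python. It assumes a simple in-memory store; the names (`GoldenRecord`, `GoldenRecordService`) and fields are illustrative only, not part of any particular DaaS product.

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Any, Dict

@dataclass
class GoldenRecord:
    """A verified data item plus the metadata that gives it context."""
    value: Any
    definition: str          # business-glossary definition
    source_system: str       # authoritative system of record
    last_verified: datetime  # when the value was last reconciled

class GoldenRecordService:
    """In-memory logical layer that serves authoritative data with metadata."""

    def __init__(self) -> None:
        self._store: Dict[str, GoldenRecord] = {}

    def publish(self, key: str, record: GoldenRecord) -> None:
        # Source systems (internal or external) push verified changes here
        # so every consumer sees the same golden copy.
        self._store[key] = record

    def get(self, key: str) -> GoldenRecord:
        # Consumers receive the value together with its definition and lineage,
        # so its meaning is unambiguous in their own context.
        return self._store[key]

# Usage: a master system publishes a client's legal name once; every
# downstream report reads the same verified value.
service = GoldenRecordService()
service.publish(
    "client:12345:legal_name",
    GoldenRecord(
        value="Acme Industrial Holdings plc",
        definition="Registered legal entity name per KYC onboarding",
        source_system="CRM-Master",
        last_verified=datetime(2019, 6, 1),
    ),
)
print(service.get("client:12345:legal_name").value)
```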

Source systems can be tapped in real time to ensure that all changes are accurately and immediately represented in the data service. These source systems can be internal or external to the firm, depending on the needs of the receiving party.

The authoritative data can then be served up to multiple users at the same time, delivered in the format each prefers (e.g., file transfer, online access, download into other systems or spreadsheets), giving them quicker access to information they can readily use.

Cleaning the data, describing it, and distributing it from a central logical location to users and applications allows data quality checks to be performed and efficiencies to be gained. Given that ‘all eyes’ are on the same data, any data quality issues are quickly identified and resolved.

DaaS offers the flexibility to provide access to both internal and external data in an easily consumable form. Access to a multitude of authoritative data in a consistent format can be extremely useful in the timely delivery of new applications or reporting, including regulatory reports, and is quicker than waiting for a single physical source of this data to be built.

This is particularly useful when data are needed by multiple parties and when data are ‘siloed’ within an organization. How many versions are there? How many platforms? And don’t forget that the volume of data being generated keeps growing.

The more complex your data needs, the more likely that a DaaS solution will benefit you.

Dessa Glasser is a Principal with the Financial Risk Group, who assists Virtual Clarity, Ltd. on data solutions as an Associate.

Questions? Comments? Talk to us on Twitter @FRGRisk

Related:

Data Management – The Challenges

Does your company suffer from the challenges of data silos? Dessa Glasser, Principal with the Financial Risk Group, who assists Virtual Clarity on data solutions as an Associate, discusses the challenges of data management in the second post of our blog series.

In our previous blog, we talked about the need for companies to get a handle on their data management. This is tough, but necessary. As companies develop – as they merge and grow and as more data becomes available to them in multiple forms – data silos emerge, making it difficult for a ‘single truth’ of the data to surface. Systems and data may be available to all, but behavior often differs among teams, including the ‘context’ in which the data is used. Groups have gathered and enhanced their own data to support their business, making it difficult to reconcile and converge on a single source for business-critical data.

This complication is magnified because:

  • New technology brings in large amounts of data – both structured and unstructured
  • Each source has its own glossary of terms, definitions, metadata, and business rules
  • Unstructured data often needs to be tagged to structured data to support analytics (a small illustration follows this list)
  • Structured and unstructured data require metadata to interpret the data and its context
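As a toy illustration of that tagging step (the glossary and terms below are hypothetical, not drawn from any particular firm), unstructured text can be mapped to structured glossary terms so that downstream analytics see consistent labels:

```python
# Hypothetical glossary: each source system defines its own terms,
# so in practice tags would be recorded with the glossary they came from.
GLOSSARY = {
    "counterparty": ["counterparty", "client", "customer"],
    "exposure": ["exposure", "notional", "outstanding balance"],
}

def tag_document(text: str) -> dict:
    """Attach structured glossary tags to an unstructured document."""
    lowered = text.lower()
    tags = [term for term, synonyms in GLOSSARY.items()
            if any(s in lowered for s in synonyms)]
    return {"text": text, "tags": tags}

print(tag_document("Customer ABC increased its outstanding balance in Q3."))
# -> tags: ['counterparty', 'exposure']
```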

As Dessa Glasser notes, “The problem is not getting the data, the problem is processing, interpreting and understanding the data.”

Companies can also be hindered by the ‘do it yourself’ mentality of their teams, whereby individuals who want systems implemented immediately will often construct a process and the data themselves rather than waiting for IT to deliver it, which either takes time or may not happen on a timely basis.

These cross-over efforts undermine a firm’s ability to use the data effectively and often lead to:

  • Data sources being available in multiple forms – both internal and external
  • The costly and manual reconciliation of incorrect data and difficulty aggregating data
  • The inability to generate business insights from the data – more time is spent processing and reconciling the data than analyzing it

Meanwhile, clients are demanding a holistic view of the services they’re buying into. Management and regulators, when they ask for data, want to see the full relationship with clients across the firm and a holistic view of all aggregated risk positions, which is hard to pull together from numerous teams that work with, and may interpret, the data differently. Companies must present a cohesive front, regardless of each team’s procedures or the context in which the data is used.

All of the above are prime examples of why the governance and management of data is essential. The end goal is one central, logical, authoritative source for all critical data for a company. It is important to treat data as a business asset and ensure the timely delivery of both well-defined data and metadata to the firm’s applications and business users. This can be done by developing a typical data warehouse to serve up the data, which often can take years to build. However, this can also be facilitated more quickly by leveraging advances in technologies, such as the cloud, data access and management tools, and designing a Data as a Service (DaaS) solution within a firm.

So, how to go about it?

Tune in next month for blog 3, where we’ll discuss just that.

Dessa Glasser is a Principal with the Financial Risk Group, who assists Virtual Clarity, Ltd. on data solutions as an Associate.

Questions? Comments? Talk to us on Twitter @FRGRisk

Current Expected Credit Loss (CECL): A New Paradigm for Captives, Too

Discussion of the ramifications of CECL for financial institutions has largely focused on banks, but as we addressed in a recent paper, “Current Expected Credit Loss: Why the Expectations Are Different,” this new accounting treatment extends to a much larger universe. One example is the captives that finance Americans’ love affair with cars; their portfolios of leases and loans have become much larger, and the implications of CECL more significant.

As with other institutions, data, platforms, and modeling make up the challenges that captives will have to address. But unlike other types of institutions, captives have more concentrated portfolios, which may aid in “pooling” exercises but may be inadvertently affected by scenario modeling. A basic tenet for all institutions is the life-of-loan estimate and the use of reasonable and supportable forecasts. While some institutions may have had “challenger” models that moved in this direction, captives have not tended to use this type of approach in the past.

The growth of captives’ portfolios and their correlation to a number of macro-economic factors (e.g., interest rates, commodity prices, tariffs) call for data and scenarios that require a different level of modeling and forecasting. Because FASB does not provide template methodologies or calculations, it will be necessary to develop these scenarios with the “reasonable and supportable” requirement in mind. While different approaches will likely be adopted, those that use transaction-level data can provide a higher level of accuracy over time, supporting the goals laid out in the new guidelines. As might be imagined, the ability to leverage experience in the development and deployment of these types of models can’t be overemphasized.
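FASB prescribes no formula, so the sketch below shows only one illustrative shape a life-of-loan estimate might take: a transaction-level exposure, marginal PD curves conditioned on macro scenarios, and probability weights across those scenarios. All numbers, names, and distributions here are hypothetical.

```python
import numpy as np

def lifetime_expected_loss(ead, lgd, marginal_pd_by_scenario, scenario_weights):
    """Probability-weighted life-of-loan expected loss for one exposure.

    marginal_pd_by_scenario: scenario name -> array of marginal PDs, one per
    remaining period (the 'reasonable and supportable' forecast horizon).
    """
    losses = {
        name: ead * lgd * np.sum(pds)
        for name, pds in marginal_pd_by_scenario.items()
    }
    return sum(scenario_weights[name] * loss for name, loss in losses.items())

# Hypothetical 48-month auto lease: PDs rise under a downturn scenario.
baseline = np.full(48, 0.0008)   # 0.08% marginal PD per month
downturn = np.full(48, 0.0015)   # 0.15% marginal PD per month
ecl = lifetime_expected_loss(
    ead=22_000, lgd=0.45,
    marginal_pd_by_scenario={"baseline": baseline, "downturn": downturn},
    scenario_weights={"baseline": 0.7, "downturn": 0.3},
)
print(round(ecl, 2))  # roughly 480 on these made-up inputs
```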

We have found that the ability to manage the following functional components of the platform is critical to building a flexible platform that can adapt to the changing needs of its users (a configuration sketch follows the list):

  • Scenario Management
  • Input Data Mapping and Registry
  • Configuration Management
  • Model Management

Experience has taught that there are significant considerations in implementing CECL, but there are also some improvements that can be realized for institutions that develop a well-structured plan. Captives are advised to use this as an opportunity to realize efficiencies, primarily in technology and existing models. Considerations around data, platforms, and the models themselves should leverage available resources to ensure that investments made to address this change provide as much benefit as possible, both now and into the future.

Data Is Big, Did You Know?

Data is big. Big news. Big importance.

How big, you ask? Consider that all the information we have as the human race has been growing since the beginning of time. At the same time, we enact more processes every day that add to this growing pool of data, whether at a company or personal level, local or global, text or numerical, in a native or foreign language, and now in digital format, with pictures and video.

Let’s put the sheer volume of data gathering into some perspective. Back in 2003, Turek of IBM estimated that 5 exabytes (5 billion gigabytes) of data had been created from the beginning of time up to 2003. By 2011, that same 5 exabytes of data was being generated every two days. Forbes in 2015 published “20 Mind-Boggling Facts” indicating that more data was created between 2013 and 2015 than in the entire history of the human race and that “by the year 2020, about 1.7 megabytes of information will be created every second for every human being on the planet”, “growing from 4.4 zettabytes” (in 2015) “to 44 zettabytes, or 44 trillion gigabytes”.

What are the reasons for this exponential growth in data?

A key factor is the evolution of computing. In the space of just one hundred years, we’ve evolved from the very first, basic tabulating systems to the cognitive, sensory systems we see in today’s world. Progress has been fierce and rapid. We would argue there has never been a more significant advancement in human development than that of computing. It has changed the ways in which we interact with one another, the ways we process information, and, crucially, the ways, and the speed, at which we do business.

The explosion of data occurs across all platforms. We no longer communicate just in binary or text – why would we, when there are so many more stimulating options out there, such as multimedia, visuals, sensors and hand-held devices? The ways in which we generate and consume data have grown and grown, while at the same time people’s attention spans have shrunk to that of a goldfish. This has led to the introduction of even more mediums of communication and data generation, and to the need for tools such as machine learning and artificial intelligence to process the large amounts of data.

The consequences of Big Data 

Companies, consequently, find themselves having to deal with significant amounts of disparate data from multiple sources, internal and external, whether from clients, employees, or vendors, from internal operations and growth, or from mergers and acquisitions.

All of this requires significant time spent reconciling and processing data and the use of tools (such as business intelligence, knowledge graphs, and machine learning) to analyze it. How do you make sense of all of it? How do you interpret it in a way that allows you to use it to build a more efficient business model? Who can help you with this? Forbes in 2015 estimated that “less than 0.5% of all data is ever analysed and used”, creating a significant business opportunity.

In this six-blog series, we’re going to talk about the challenges that companies face in managing data and the type of tools available for managing the data. We’re going to tell you why it’s important; and we’re going to explain the benefits of getting a proper handle on your data management.

For this, we’re going to draw on Dessa Glasser, Principal at the Financial Risk Group, who works with Virtual Clarity on data strategies, for her knowledge of data management strategies and tools, including the use of Data as a Service (DaaS) to manage and provision data. Dessa, the former CDO of JPMorgan Chase Asset Management and Deputy Director of the Office of Financial Research (US Treasury), has a wealth of experience implementing solutions in risk, data, and analytics across financial and non-financial firms in both the private and public sectors, including delivering operational efficiencies and change management through tools such as DaaS.

Dessa Glasser is a Principal with the Financial Risk Group, who assists Virtual Clarity, Ltd. on data solutions as an Associate. 

Real Time Learning: A Better Approach to Trader Surveillance

An often-heard question in any discussion of Machine Learning (ML) tools is maybe the most obvious one: “So, how can we use them?”

The answer depends on the industry, but we think there are especially useful (and interesting) applications for the financial services sector. Firms in this sector have historically been open to the ML concept but haven’t been quick to jump on some potential solutions to common problems.

Let’s look at risk management at the trading desk, for example. If you want to mitigate risk, you need to be able to identify it in advance—say, to ensure your traders aren’t conducting out-of-market transactions or placing fictitious orders. The latest issue of the New Machinist Journal by Dr. Jimmie Lenz (available by clicking here) explains how. Trade desk surveillance is just one way that Machine Learning tools can help monitor a variety of activities that can cause grief for those tasked with risk management.
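The journal issue describes the approach in detail; as a rough flavor of the general idea (not a description of Dr. Lenz’s method), an unsupervised outlier detector can flag trades whose features look unusual relative to normal desk activity. The features, parameters, and data below are purely hypothetical.

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)

# Hypothetical per-trade features: notional (scaled), % deviation from the
# mid price, and the trader's recent order-to-cancel ratio.
normal_trades = rng.normal(loc=[1.0, 0.0, 0.1],
                           scale=[0.3, 0.002, 0.05],
                           size=(5000, 3))
suspect_trades = np.array([[1.1, 0.04, 0.1],    # far off-market price
                           [1.0, 0.00, 0.9]])   # heavy order cancellation

model = IsolationForest(contamination=0.001, random_state=0).fit(normal_trades)
print(model.predict(suspect_trades))   # -1 marks trades worth a closer look
```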

Would you like to read more about the possibilities ML can bring to financial services process settings? Download “Real Time Learning: A Better Approach to Trader Surveillance,” along with other issues of the New Machinist Journal, by visiting www.frgrisk.com/resources.

Introducing the New Machinist Journal

Who are the new machinists, and what are their tools?

The machinists of the 21st century are working with Artificial Intelligence (AI) and Machine Learning (ML), turning what has been science fiction into science fact. From learning algorithms that nudge us to buy more stuff to self-driving vehicles that “learn” the highways and byways to deliver us to our destinations safely, AI and ML are attracting considerable attention from a variety of industries.

FRG is currently researching and building machine learning proof-of-concepts to fully understand their practical applications. A new series, the New Machinist Journal, will explore in detail some of these applications in different environments and use cases. It will be published regularly on the FRG website. Volume 1, “What Artificial Intelligence and Machine Learning Solutions Offer,” is an overview of the subject, and is now available for download (click here to read it).

Interested? Visit the website or contact the FRG Research Institute, Research@frgrisk.com

Quantifying the Value of Electricity Storage

ABSTRACT: This research discusses a methodology developed to hedge volatility or identify opportunities, a discussion normally confined to the capital markets. However, the demand for electricity in the United States, and the associated volatility, has never been more pronounced. The upcoming paper, “Quantifying the Value of Electricity Storage,” will examine the factors that have led to the growth of volatility, both realized and potential.

There is widespread recognition of the value of energy storage, and new technologies promise to expand this capability for those who understand the opportunities being presented to firms involved in different areas of electricity generation. Objective tools to value these options, though, have been limited, as has insight into when mitigation efforts make economic sense.

In order to answer these questions for electricity generators of all types, we have created an economics-based model that addresses the initial acquisition of storage capacity as well as the optimization of its deployment, based on the unique attributes of the population served.
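The model itself is described in the paper; as a deliberately simple illustration of the kind of question it answers (this is not FRG’s model), a day’s arbitrage value for a storage asset can be approximated by charging in the cheapest hours and discharging in the most expensive ones, subject to capacity, power, and round-trip efficiency. The prices and parameters below are hypothetical.

```python
def daily_arbitrage_value(hourly_prices, capacity_mwh, power_mw, efficiency=0.85):
    """Toy estimate: charge in the cheapest hours, discharge in the dearest."""
    hours_to_cycle = int(capacity_mwh / power_mw)
    prices = sorted(hourly_prices)
    buy_cost = sum(prices[:hours_to_cycle]) * power_mw
    sell_revenue = sum(prices[-hours_to_cycle:]) * power_mw * efficiency
    return sell_revenue - buy_cost

# Hypothetical day-ahead prices ($/MWh) with an evening peak.
prices = [22, 20, 19, 18, 18, 21, 30, 45, 50, 48, 46, 44,
          43, 42, 44, 55, 80, 120, 95, 70, 55, 40, 30, 25]
print(daily_arbitrage_value(prices, capacity_mwh=4, power_mw=1))
```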

Links to the paper will be posted on FRG’s social media channels.

Forecasting Capital Calls and Distributions

Early in his career, one of us was responsible for cash flow forecasting and liquidity management at a large multiline insurance company. We gathered extensive historical data on daily concentration bank deposits, withdrawals, and balances and developed an elementary but fairly effective model. Because insurance companies receive premium payments from and pay claims to many thousands of individuals and small companies, we found we could base reasonably accurate forecasts on the quarter of the year, month of the quarter, week of the month, and day of the week, taking holidays into account. This rough-and-ready approach enabled the money market traders to minimize overnight balances, make investment decisions early in the morning, and substantially extend the average maturity of their portfolios. It was an object lesson in the value of proactive cash management.
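That rough-and-ready model translates almost directly into a few calendar features. The sketch below (an illustration in pandas, not the original insurer’s model) forecasts a day’s net cash flow as the historical average for its quarter, month of quarter, week of month, and weekday; a holiday indicator could be added the same way. It assumes the history is a pandas Series indexed by date.

```python
import pandas as pd

def calendar_features(dates: pd.DatetimeIndex) -> pd.DataFrame:
    """Quarter of year, month of quarter, week of month, day of week."""
    return pd.DataFrame({
        "quarter": dates.quarter,
        "month_of_quarter": (dates.month - 1) % 3 + 1,
        "week_of_month": (dates.day - 1) // 7 + 1,
        "weekday": dates.dayofweek,   # a holiday flag could be added here
    }, index=dates)

def forecast_net_flow(history: pd.Series, future_dates: pd.DatetimeIndex) -> pd.Series:
    """Forecast = historical average net flow for each calendar bucket."""
    keys = ["quarter", "month_of_quarter", "week_of_month", "weekday"]
    hist = calendar_features(history.index).assign(net_flow=history.values)
    bucket_means = hist.groupby(keys)["net_flow"].mean().reset_index()
    future = calendar_features(future_dates).reset_index(drop=True)
    merged = future.merge(bucket_means, on=keys, how="left")
    return pd.Series(merged["net_flow"].values, index=future_dates)
```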

It is not such a trivial matter for investors in private capital funds to forecast the timing and amount of capital calls and distributions. Yet maintaining adequate liquidity to meet obligations as they arise means accepting either a market risk or an opportunity cost that might be avoided. The market risk comes from holding domestic large-cap stocks that will have to be sold quickly, whatever the prevailing price, when a capital commitment is unexpectedly drawn down; the opportunity cost comes from adopting a defensive posture and holding cash or cash equivalents in excess of the amount needed for ongoing operations, especially when short-term interest rates are very low.

FRG is undertaking a financial modeling project aimed at forecasting capital calls and distributions. Our overall objective is to help investors with outstanding commitments escape the unattractive alternatives of holding excess cash or scrambling to liquidate assets to meet contractual obligations whose timing and amount are uncertain. To that end, we seek to assist in quantifying the risks associated with allocation weights and to understand the probability of future commitments so as to keep the total portfolio invested in line with those weights.

In other words, we want to make proactive cash management possible for private fund investors.

As a first step, we have formulated some questions (a toy simulation sketch related to the first one follows the list).

  1. How do we model the timing and amount of capital calls and disbursements? Are there exogenous variables with predictive power?
  2. How does the timing of capital calls and disbursements correlate between funds of different vintages and underlying types (e.g., private equity from venture capital to leveraged buyouts, private credit, and real estate, among others)?
  3. Do private funds’ capital calls and distributions correlate with public companies’ capital issuance and dividend payout decisions?
  4. How do we model the growth of invested capital? What best explains the returns achieved before money is returned to LPs?
  5. What triggers distributions? 
  6. How do we allocate money to private funds while keeping an eye on total invested capital vs. asset allocation weights?
    1. The timing of capital calls and distributions is probabilistic (from #1). 
    2. Diversification among funds can produce a smooth invested capital profile.  But we need to know how these funds co-move to create distributions around that profile (from #2).
    3. A confounding problem is the growth of invested capital (from #4). This growth affects total portfolio value and the asset allocation weights. If total exposure is constrained, what is the probability of breaching those constraints?
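As noted above, here is a deliberately naive placeholder for question 1 (it is not FRG’s model): treat each quarter’s call as a random fraction of the remaining unfunded commitment and simulate many paths to see the distribution of cumulative cash needs. The call-rate assumption and distribution are for illustration only.

```python
import numpy as np

def simulate_calls(commitment, quarters=20, mean_call_rate=0.15,
                   n_paths=10_000, seed=0):
    """Simulate cumulative capital calls as random fractions of the remaining
    unfunded commitment each quarter (a naive placeholder model)."""
    rng = np.random.default_rng(seed)
    unfunded = np.full(n_paths, float(commitment))
    called = np.zeros((n_paths, quarters))
    for q in range(quarters):
        # Beta-distributed call rate with mean equal to the assumed 15%/quarter.
        rate = rng.beta(2, 2 / mean_call_rate - 2, size=n_paths)
        calls = rate * unfunded
        unfunded -= calls
        called[:, q] = calls
    return called.cumsum(axis=1)

paths = simulate_calls(commitment=10_000_000)
# 5th/95th percentile of cumulative calls after two years (8 quarters):
print(np.percentile(paths[:, 7], [5, 95]).round(0))
```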

We invite front-line investors in limited partnerships and similar vehicles to join the discussion. We would welcome and appreciate your input on the conceptual questions. Please contact Dominic Pazzula at info@frgrisk.com if you have an interest in this topic.

IFRS 9: Evaluating Changes in Credit Risk

Determining whether an unimpaired asset’s credit risk has meaningfully increased since the asset was initially recognized is one of the most consequential issues banks encounter in complying with IFRS 9. Recall the stakes:

  • The expected credit loss for Stage 1 assets is calculated using the 12-month PD
  • The ECL for Stage 2 assets (defined as assets whose credit risk has significantly increased since they were first recognized on the bank’s books) is calculated using the lifetime PD, just as it is for Stage 3 assets (which are in default).

To make the difference more concrete, consider the following:

  • A bank extends an interest-bearing five-year loan of $1 million to Richmond Tool, a hypothetical Virginia-based tool, die, and mold maker serving the defense industry.
  • At origination, the lender estimates the PD for the next 12 months at 1.5%, the PD for the rest of the loan term at 4%, and the loss that would result from default at $750,000.
  • In a subsequent reporting period, the bank updates those figures to 2.5%, 7.3%, and $675,000, respectively.

If the loan were still considered a Stage 1 asset at the later reporting date, the ECL would be $16,875. But if it is deemed a Stage 2 or Stage 3 asset, then the ECL is $66,150, nearly four times as great.
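For readers who want to trace the arithmetic, the figures follow from ECL = PD × expected loss on default, using the 12-month PD for Stage 1 and the full lifetime PD (here taken as the 12-month PD plus the PD for the remaining term) for Stages 2 and 3:

```python
loss_given_default = 675_000   # updated loss estimate if Richmond Tool defaults
pd_12_month = 0.025            # updated 12-month probability of default
pd_remaining_term = 0.073      # updated PD for the rest of the loan term

ecl_stage_1 = pd_12_month * loss_given_default
ecl_stage_2 = (pd_12_month + pd_remaining_term) * loss_given_default

print(ecl_stage_1)   # ≈ 16,875
print(ecl_stage_2)   # ≈ 66,150, nearly four times the Stage 1 figure
```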

Judging whether the credit risk underlying those PDs has materially increased is obviously important. But it is also difficult. There is a “rebuttable presumption” that an asset’s credit risk has increased materially when contractual payments are more than 30 days past due. In general, however, the bank cannot rely solely upon past-due information if forward-looking information is to be had, either on a loan-specific or a more general basis, without unwarranted trouble or expense.

The bank need not undertake an exhaustive search for information, but it should certainly take into account pertinent intelligence that is routinely gathered in the ordinary course of business.

For instance, Richmond Tool’s financial statements are readily available. Balance sheets are prepared as of a point in time; income and cash flow statements reflect periods that have already ended. Nonetheless, traditional ratio analysis serves to evaluate the company’s prospects as well as its current capital structure and historical operating results. With sufficient data, models can be built to forecast these ratios over the remaining life of the loan. Richmond Tool’s projected financial position and earning power can then be used to predict stage transitions.
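As one simple flavor of how a forecast ratio might feed a staging decision (an illustrative heuristic only, not FRG’s models and not an IFRS 9 requirement), a declining interest-coverage trend can be extrapolated and flagged when it is projected to breach a floor:

```python
import numpy as np

def flag_significant_increase(interest_coverage_history, periods_ahead=3,
                              threshold=1.5):
    """Illustrative heuristic: extrapolate the interest-coverage ratio with a
    linear trend and flag the loan if it is projected to fall below a floor."""
    x = np.arange(len(interest_coverage_history))
    slope, intercept = np.polyfit(x, interest_coverage_history, 1)
    projected = intercept + slope * (len(x) - 1 + periods_ahead)
    return projected < threshold

# Richmond Tool's (hypothetical) interest coverage over the last five periods.
print(flag_significant_increase([4.1, 3.6, 3.0, 2.4, 1.9]))   # True
```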

Pertinent external information can also be gathered without undue cost or effort. For example, actual and expected changes in the general level of interest rates, mid-Atlantic unemployment, and defense spending are likely to affect Richmond Tool’s business prospects, and, therefore, the credit risk of the outstanding loan. The same holds true for regulatory and technological developments that affect the company’s operating environment or competitive position.

Finally, the combination of qualitative information and non-statistical quantitative information such as actual financial ratios may be enough to reach a conclusion. Often, however, it is appropriate to apply statistical models and internal credit rating processes, or to base the evaluation on both kinds of information. In addition to designing, populating, and testing mathematical models, FRG can help you integrate the statistical and non-statistical approaches into your IFRS 9 platform.

For more information about FRG’s modeling expertise, please click here.

Turning a Blind Eye to the Risky Business of Incentive-based Sales Practices 

Should you be monitoring your sales activities to detect anomalous behaviors?

The use of sales incentives (commissions, bonuses, etc.) to motivate the behavior of salespeople has a long history in the United States. We would all like to assume that the initial structuring of incentive-based pay is not intended to have nefarious or abusive impacts on customers, but a number of recent and well-publicized stories of mistreatment of both customers and customer information show that these negative consequences do exist. Likely, the business practice of turning an administrative blind eye to the damage done to consumers by these sales incentive programs has played an even greater role in the scale of abuse uncovered over the last decade. In the most recent cases of unchecked and large-scale customer abuse, with particular attention focused on the financial services industry, this paradigm of tying employee benefits (broadly, employment and/or income potential) to sales produced cases that were resolved through arbitration and frequently typecast as “a cost of doing business”.

Today, are you putting your business, and all those associated with its success, at risk by turning a blind eye to the effects of your business practices, including your employee incentive programs? New consequences are being laid on corporate leaders and board members for all business practices used by the company, and the defense of not knowing the intricacies and results of these practices does not protect you from these risks.

We have developed a methodology to detect both customer sales and individual product behaviors that indicate problematic situations requiring additional examination. Our methodology goes beyond the aggregate sales figures primarily discussed in the literature to highlight individuals and/or groups that are often overlooked when analyzing such data.
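To give a sense of why individual-level views matter (this is a generic illustration, not the methodology in the forthcoming paper), a simple peer-group comparison can surface salespeople whose product-level volumes sit far from their peers even when aggregate totals look healthy. The column names below are hypothetical.

```python
import pandas as pd

def flag_outliers(sales: pd.DataFrame, z_threshold: float = 3.0) -> pd.DataFrame:
    """Flag salespeople whose product-level volumes sit far from their peer
    group's mean, the kind of signal that aggregate totals tend to hide.

    Expects columns: region, product, salesperson, accounts_opened.
    """
    stats = (sales.groupby(["region", "product"])["accounts_opened"]
                  .agg(mu="mean", sigma="std")
                  .reset_index())
    merged = sales.merge(stats, on=["region", "product"])
    merged["z_score"] = (merged["accounts_opened"] - merged["mu"]) / merged["sigma"]
    return merged[merged["z_score"].abs() > z_threshold]
```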

A forthcoming paper, “Sales Practices: Monitoring Sales Activity for Anomalous Behaviors,” will explore these issues, and a resolution, in depth. Visit any of our social media channels for the link.
