Documenting CECL

The CECL standard requires more than just another update to the calculation of a financial institution’s (FI’s) allowance for credit losses; the new standard also pushes institutions to be more involved in the entire allowance process, especially at the management/executive level. From explanations, justifications, and rationales to policies and procedures, the standard requires them all. The FI needs to discuss them, understand them, and document them.

The first step is to discuss all the decisions that must be made regarding the CECL process. This includes everything from the definition of default to the justification of which methodology to use for which segment of the data. Although these discussions may be onerous, the CECL standard requires that all decisions be complete and fully understood. Once there is understanding, all decisions must be documented for regulatory purposes:

CECL Topic 326-20-50-10: An entity shall provide information that enables a financial statement user to do the following:

  1. Understand management’s method for developing its allowance for credit losses.
  2. Understand the information that management used in developing its current estimate of expected credit losses.
  3. Understand the circumstances that caused changes to the allowance for credit losses, thereby affecting the related credit loss expense (or reversal) reported for the period.

CECL Topic 326-20-50-11: To meet the objectives in paragraph 326-20-50-10, an entity shall disclose all of the following by portfolio segment and major security type:

  1. A description of how expected loss estimates are developed
  2. A description of the entity’s accounting policies and methodology to estimate the allowance for credit losses, as well as discussion of the factors that influenced management’s current estimate of expected credit losses, including:
    • Past events
    • Current conditions
    • Reasonable and supportable forecasts about the future
  3. A discussion of risk characteristics relevant to each portfolio segment
  4. Etc.

Although these may seem like a surprising jump in requirements for CECL, they are simply more explicitly defined versions of the requirements under existing ALLL guidance. Note that some of the general requirements under the existing guidance will remain relevant under CECL, such as:

  • “the need for institutions to appropriately support and document their allowance estimates”
  • the “…responsibility for developing, maintaining, and documenting a comprehensive, systematic, and consistently applied process for determining the amounts of the ACL and the provision for credit losses.”
  • the requirement “…that allowances be well documented, with clear explanations of the supporting analyses and rationale.”

As you can see, documentation is an important component of the CECL standard. While the documentation will, at least initially, require more effort to produce, it will also give the FI the opportunity to fully understand the inner workings of its CECL process.

Lastly, a piece of advice to avoid some headaches: take the time to document throughout the entire CECL process. As my math professor always said, “the due date is not the do date.”

Resources:

  1. FASB Accounting Standards Update, No. 2016-13, Financial Instruments – Credit Losses (Topic 326).
  2. Frequently Asked Questions on the New Accounting Standard on Financial Instruments – Credit Losses. FIL-20-2019. April 3, 2019.

Samantha Zerger, business analytics consultant with FRG, is skilled in technical writing. Since graduating from the North Carolina State University’s Financial Mathematics Master’s program in 2017 and joining FRG, she has taken on leadership roles in developing project documentation as well as improving internal documentation processes.

CECL – The Power of Vintage Analysis

I would argue that a critical step in getting ready for CECL is to review the vintage curves of the segments that have been identified. Not only do the resulting graphs provide useful information, but the process itself also requires thought on how to prepare the data.

Consider the following graph of auto loan losses for different vintages of Not-A-Real-Bank bank[1]:

[Figure: stylized cumulative loss curves for several auto loan vintages]

While this is a highly stylized depiction of vintage curves, its intent is to illustrate the information that can be gleaned from such a graph. Consider the following:

  1. A clear end to the seasoning period can be determined (period 8)
  2. Outlier vintages can be identified (2015Q4)
  3. Visual confirmation that segmentation captures risk profiles (there isn’t a substantial number of vintages behaving oddly)

But that’s not all! To get to this graph, some important questions need to be asked about the data. For example (a small sketch of how such curves might be constructed follows these questions):

  1. Should prepayment behavior be captured when deriving the loss rates? If so, what’s the definition of prepayment?
  2. At what time period should the accumulation of losses be stopped (e.g., contractual term)?
  3. Is there enough loss[2] behavior to model on the loan level?
  4. How should accounts that renew be treated (e.g., put in new vintage)?
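
To make these questions concrete, here is a minimal sketch of how cumulative loss rate curves by vintage might be assembled. The file and column names (loan_id, vintage_qtr, orig_balance, period_on_book, loss_amount) are hypothetical; the actual data preparation depends on how the questions above are answered.

```python
import pandas as pd
import matplotlib.pyplot as plt

# Hypothetical loan-level extracts; column names are illustrative.
loans = pd.read_csv("auto_loans.csv")    # loan_id, vintage_qtr, orig_balance
losses = pd.read_csv("charge_offs.csv")  # loan_id, period_on_book, loss_amount

# Question 2 above: cap the accumulation of losses, e.g., at the contractual term.
MAX_PERIOD = 20
losses = losses[losses["period_on_book"] <= MAX_PERIOD]

# Denominator of the loss rate: originated balance per vintage.
orig = loans.groupby("vintage_qtr")["orig_balance"].sum()

# Numerator: losses by vintage and period on book, accumulated over time.
cum_loss = (
    losses.merge(loans[["loan_id", "vintage_qtr"]], on="loan_id")
          .groupby(["vintage_qtr", "period_on_book"])["loss_amount"].sum()
          .groupby(level="vintage_qtr").cumsum()
)

# One cumulative loss rate curve per vintage (x-axis = period on book).
curves = cum_loss.div(orig, level="vintage_qtr").unstack("vintage_qtr")
curves.plot(title="Cumulative loss rate by vintage")
plt.show()
```

Plotting these curves for each segment is what surfaces the end of the seasoning period, the outlier vintages, and the odd behavior described above.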

In conclusion, performing vintage analysis is more than just creating a picture with many different colors. It provides insight into the segments, makes one consider the data, and, if the data is appropriately constructed, positions one for subsequent analysis and/or modeling.

Jonathan Leonardelli, FRM, Director of Business Analytics for the Financial Risk Group, leads the group responsible for model development, data science, documentation, testing, and training. He has over 15 years’ experience in the area of financial risk.

 

[1] Originally I called this bank ACME Bank, but when I searched to see if one existed I got this, this, and this…so I changed the name. I then did a search of the new name and promptly fell into a search-engine rabbit hole that, after a while, I climbed out of with the realization that for any one- or two-word combination I come up with, someone else has already done the same and then added “bank” to the end.

[2] You can also build vintage curves on defaults or prepayment.

 

RELATED:

CECL—Questions to Consider When Selecting Loss Methodologies

CECL—The Caterpillar to Butterfly Evolution of Data for Model Development

CECL—Data (As Usual) Drives Everything

CECL—Questions to Consider When Selecting Loss Methodologies

Paragraph 326-20-30-3 of the Financial Accounting Standards Board (FASB) standards update[1] states: “The allowance for credit losses may be determined using various methods”. I’m not sure if any statement, other than “We need to talk”, can be as fear-inducing. Why is it scary? Because in the world of details and accuracy, this statement is remarkably vague and not prescriptive.

Below are some questions to consider when determining the appropriate loss methodology approaches for a given segment.

How much history do you have?

If a financial institution (FI) has limited history[2], then the options available to it are, well, limited. To build a model, one needs sufficient data to capture the behavior (e.g., performance or payment) of accounts. Without enough data the probability of successfully building a model is low. Worse yet, even if one builds a model, the likelihood of it being useful and robust is minimal. As a result, loss methodology approaches that do not need a lot of data should be considered (e.g., discounted cash flow or a qualitative factor approach based on industry information).
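
For illustration, under a discounted cash flow approach the allowance is the amortized cost basis less the present value of the cash flows the FI expects to collect, discounted at the effective interest rate. Below is a minimal sketch with purely hypothetical figures; in practice the expected cash flows would reflect prepayment, default, and recovery assumptions.

```python
# Minimal sketch of a discounted cash flow (DCF) allowance estimate for a
# single loan. All inputs are hypothetical.
def dcf_allowance(amortized_cost, expected_cash_flows, effective_rate):
    """Allowance = amortized cost - PV of expected cash flows, discounted
    at the loan's effective interest rate."""
    pv = sum(cf / (1 + effective_rate) ** t
             for t, cf in enumerate(expected_cash_flows, start=1))
    return max(amortized_cost - pv, 0.0)

# Example: $100,000 loan, annual expected cash flows over 3 years,
# 6% effective interest rate.
print(dcf_allowance(100_000, [35_000, 35_000, 34_000], 0.06))  # ~7,284
```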

Have relevant business definitions been created?

The loss component approach (decomposing loss into PD, LGD, and EAD) is considered a leading practice at banks[3]. However, in order to use this approach, definitions of default and, arguably, paid-in-full need to be created for each segment being modeled. (Note: these definitions can be the same or different across segments.) Without these definitions, one does not know when an account has defaulted or paid off.
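
For a given segment, the decomposition simply multiplies the three components. A minimal illustration with hypothetical numbers follows; in practice each component would come from a model estimated using the default and paid-in-full definitions described above.

```python
# Minimal sketch of the loss component decomposition for one segment.
# All figures are hypothetical.
pd_rate = 0.02      # probability of default over the period
lgd = 0.45          # loss given default (share of exposure not recovered)
ead = 1_500_000.00  # exposure at default for the segment

expected_loss = pd_rate * lgd * ead
print(f"Expected loss: {expected_loss:,.2f}")  # 13,500.00
```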

Is there a sufficient number of losses or defaults in the data?

Many of the loss methodologies available for consideration (e.g., loss component or vintage loss rates) require enough losses to discern a pattern. As a result, banks that are blessed with infrequent losses can feel cursed when they try to implement one of those approaches. While low losses do not necessarily rule out these approaches, they do make for a more challenging process.

Are loan level attributes available, accurate, and updated appropriately?

This question tackles the granularity of an approach rather than the approach itself. As mentioned in the post CECL – Data (As Usual) Drives Everything, there are three different levels of data granularity a model can be built on. Typically, the decision is between loan-level and segment-level. Loan-level models are great for capturing sensitivities to loan characteristics and macroeconomic events, provided the loan characteristics are accurate and updated (if needed) at a regular interval.

Jonathan Leonardelli, FRM, Director of Business Analytics for the Financial Risk Group, leads the group responsible for model development, data science, documentation, testing, and training. He has over 15 years’ experience in the area of financial risk.

 

[1] The FASB accounting standards update can be found here

[2] There is no consistent rule, at least that I’m aware of, that defines “limited history”. That said, we typically look for clean data reaching back through an economic cycle.

[3] See: Capital Planning at Large Bank Holding Companies: Supervisory Expectations and Range of Current Practice, August 2013

RELATED:

CECL—The Caterpillar to Butterfly Evolution of Data for Model Development

CECL—Data (As Usual) Drives Everything

CECL – Data (As Usual) Drives Everything

To appropriately prepare for CECL, a financial institution (FI) must have a hard heart-to-heart with itself about its data. Almost always, simply collecting data in a worksheet, reviewing it for gaps, and then giving it the thumbs up is insufficient.

Data drives all parts of the CECL process. The sections below, by no means exhaustive, highlight key areas where your data, simply by being your data, constrains your options.

Segmentation

Paragraph 326-20-30-2 of the Financial Accounting Standards Board (FASB) standards update[1] states: “An entity shall measure expected credit losses of financial assets on a collective (pool) basis when similar risk characteristic(s) exist.” It then points to paragraph 326-20-55-5 which provides examples of risk characteristics, some of which are: risk rating, financial asset type, and geographical location.

Suggestion: prior to reviewing your data, consider what risk profiles are in your portfolio. After that, review your data to see if it can adequately capture those risk profiles. As part of that process, consider reviewing (a minimal sketch of such checks follows this list):

  • Frequency of missing values in important variables
  • Consistency in values of variables
  • Definitional consistency[2]
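
A minimal sketch of such checks, assuming hypothetical column names (risk_rating, asset_type, geography, orig_balance, snapshot_date, default_flag, days_past_due) and a 90+ days-past-due default definition:

```python
import pandas as pd

# Hypothetical loan history extract; column names are illustrative.
df = pd.read_csv("loan_history.csv")

# Frequency of missing values in variables that drive segmentation.
key_vars = ["risk_rating", "asset_type", "geography", "orig_balance"]
print(df[key_vars].isna().mean().sort_values(ascending=False))

# Consistency in values: the same field should not mix codings over time.
print(df.groupby(df["snapshot_date"].str[:4])["risk_rating"].unique())

# Definitional consistency: flag records where the recorded default flag
# disagrees with the days-past-due threshold used in the definition.
inconsistent = df[(df["default_flag"] == 1) & (df["days_past_due"] < 90)]
print(f"{len(inconsistent)} records violate the 90+ DPD default definition")
```
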
Methodology Selection

The FASB standards update does not provide guidance as to which methodologies to use[3]. That decision is entirely up to the FI[4]. However, the methodologies that are available to the FI are limited by the data it has. For example, if an FI has limited history, then any of the methodologies that are rooted in historical behavior (e.g., vintage analysis or loss component) are likely out of the question.

Suggestion: review the historical data and ask yourself these questions: 1) Do I have sufficient data to capture the behavior of a given risk profile? 2) Is my historical data of good quality? 3) Are there gaps in my history?

Granularity of Model

Expected credit loss can be determined at three different levels of granularity: loan, segment (i.e., risk profile), and portfolio. Each granularity level has a set of pros and cons, but which level an FI can use depends on its data.

Suggestion: review variables that are account specific (e.g., loan-to-value, credit score, number of accounts with the institution) and ask yourself: are the sources of these variables reliable? Do they get refreshed often enough to capture changes in customer behavior or the macroeconomic environment?

Hopefully, this post has started you thinking critically about your data. While data review might seem daunting, I cannot stress enough: it’s needed, it’s critical, it’s worth the effort.

 

Jonathan Leonardelli, FRM, Director of Business Analytics for the Financial Risk Group, leads the group responsible for model development, data science, documentation, testing, and training. He has over 15 years’ experience in the area of financial risk.

 

[1] You can find the update here

[2] More on what these mean in a future blog post

[3] Paragraph 326-20-30-3

[4] A future blog post will cover some questions to ask to guide in this decision.

 

RELATED:

CECL—The Caterpillar to Butterfly Evolution of Data for Model Development

Does the Liquidity Risk Premium Still Exist in Private Equity?

FRG has recently been investigating the dynamics of the private capital markets.  Our work has led us to a ground-breaking product designed to help allocators evaluate potential cash flows and risks, and plan future commitments to private capital.  You can learn more here and read about our modeling efforts in our white paper, “Macroeconomic Effects On The Modeling of Private Capital Cash Flows.”

As mentioned in a previous post, we are investigating the effects of available liquidity in the private capital market.  This leads to an obvious question: Does the Liquidity Risk Premium Still Exist in Private Equity?

It is assumed by most in the space that the answer is “Yes.”  Excess returns provided by private funds are attributable to reduced liquidity.  Lock-up periods of 10+ years allow managers to find investments that would not be possible otherwise.  This premium is HIGHLY attractive in a world of low rates and cyclically high public equity valuations.  Where else can a pension or endowment find the rates of return required?

If the answer is “No,” then Houston, we have a problem.  Money continues to flow into PE at a high rate.  A recent article in the FT (quoting data from FRG partner Preqin) shows there is nearly $1.5 trillion in dry powder.  Factoring in leverage, there could be in excess of $5 trillion in capital waiting to be deployed.  In the case of a “No” answer, return chasing could have gone too far, too fast.

As mentioned, leverage in private capital funds is large and may be growing larger.  If the liquidity risk premium has been bid away, what investors are left with may very well be just leveraged market risk.  What is assumed to be high alpha/low beta might, in fact, be low alpha/high beta.  This has massive implications for asset allocation.

We are attempting to get our heads around this problem in order to help our clients understand the risk associated with their portfolios.

 

Dominic Pazzula is a Director with the Financial Risk Group specializing in asset allocation and risk management.  He has more than 15 years of experience evaluating risk at a portfolio level and managing asset allocation funds.  He is responsible for product design of FRG’s asset allocation software offerings and consults with clients helping to apply the latest technologies to solve their risk, reporting, and allocation challenges.


Private Equity and Debt Liquidity, the “Secondary” Market

A significant consideration in several aspects of Private Equity and Private Debt is the liquidity (or lack thereof) of these investments.  The liquidity factor has been cited as a basic investment decision, influencing complex pricing, return of investment, and financial risk management.  But as the environment has changed and matured, is liquidity being considered as it should be?

FRG’s ongoing research suggests that some of the changes this asset class is experiencing may be attributable to changes in the liquidity profile of these investments, which in turn may affect asset management decisions.  As modeling techniques continue to evolve in the asset management space, as illustrated in our recent paper Macroeconomic Effects On The Modeling of Private Capital Cash Flows, their use as both an asset management tool and a risk management tool becomes more valuable.

To this point in time, the extreme importance placed on liquidity risk for all types of financial investments, and by the financial community in general, has been primarily associated with public investments.  However, a burgeoning “secondary” market in Private Equity and Private Debt will change the liquidity consideration of this asset class, and a better understanding of it is necessary for investment managers active in this space.  Achieving this understanding will in turn provide private equity and private debt investment managers with another perspective with which to assess management decisions, aligning a bit more with the perspective traditionally available for public investments. FRG is refining its research into the liquidity of Private Capital investments through an appreciation of the dynamics of the environment, to provide a better understanding of the behavior of these investments. Watch for more from us on this intriguing subject.

Read more about FRG’s work in Private Capital Forecasting via the VOR platform.

Dr. Jimmie Lenz is a Principal with the Financial Risk Group and teaches Finance at the University of South Carolina.  He has 30 years of experience in financial services, including roles as Chief Risk Officer, Chief Credit Officer, and Head of Predictive Analytics at one of the largest brokerage firms and Wealth Management groups in the U.S.

Change in CECL Approved by the FDIC

Yesterday (12/18/18), the Federal Deposit Insurance Corporation (FDIC) approved a measure that will allow a three-year phase-in of the impact of CECL on regulatory capital. This change will also delay the impact on bank stress tests until 2020.  The change does not affect the rule itself but gives banks the option to phase in the impacts of CECL on regulatory capital over a three-year period. The details of this change can be found in the FDIC memorandum released yesterday.  The memorandum also adjusts how reserves for “bad loans” will be accounted for in regulatory capital.

The Financial Risk Group is recommending that banks utilize this time to better understand the impact, and the opportunities, that result from the mandated changes. “Time to implementation has been a limiting factor for some institutions to explore the identification of additional stakeholder value, but this should no longer be the case,” stated John Bell, FRG’s managing partner. FRG has partnered (and is currently partnering) with clients of all types on a number of assessments and implementations of CECL.  The lessons learned to date regarding CECL are available in a number of our publications, including CECL: Considerations, Developments, and Opportunities and Current Expected Credit Loss: Why the Expectations Are Different.

IFRS 17: Killing Two Birds

The clock is ticking for the 450 insurers around the world that must comply with International Financial Reporting Standard 17 (IFRS 17) by January 1, 2021 (for companies whose financial year starts on January 1).

Insurers are at different stages of preparation, ranging from performing gap analyses, to issuing requirements to software and consulting vendors, to starting the pilot phase with a new IFRS 17 system, with a few already embarking on implementing a full IFRS 17 system.

Unlike banks, the insurance industry has historically spent less on large IT system revamps. This is in part due to the greater volume, frequency, and variety of banking transactions compared to insurance transactions.

IFRS 17 is one of the biggest ‘people, process and technology’ revamp exercises the insurance industry has undertaken in a long while. The Big 4 firms have published a multitude of papers and videos on the Internet highlighting the impact of the new reporting standard for insurance contracts, which was issued by the IASB in May 2017. In short, it is causing a buzz in the industry.

As efforts are focused on ensuring regulatory compliance to the new standard, insurers must also ask: “What other strategic value can be derived from our heavy investment in time, manpower and money in this whole exercise?”

The answer—analytics to gain deeper business insights.

One key objective of IFRS 17 is to provide information at a level of granularity that helps stakeholders assess the effect of insurance contracts on financial position, financial performance and cash flows, increasing transparency and comparability.

Most IFRS 17 systems in the market today achieve this by bringing the required data into the system, computing, reporting, and integrating with the insurer’s GL system. From a technology perspective, such systems comprise a data management tool, a data model, a computation engine, and a reporting tool. However, most of these systems are not built to provide strategic value beyond pure IFRS 17 compliance.

Apart from the IFRS 17 data, an insurer can use this exercise to put in place an enterprise analytics platform that caters to more than IFRS 17 reporting: broader and deeper financial analytics, customer analytics, and operational and risk analytics. To leverage new predictive analytics technologies like machine learning and artificial intelligence, a robust enterprise data platform to house and make available large volumes of data (big data) is crucial.

Artificial Intelligence can empower important processes like claims analysis, asset management, risk calculation, and prevention; for instance, better forecasting of claims experience based on a larger variety and volume of real-time data. The same technology can be used to make informed decisions about investments based on intelligent algorithms, among other use cases.

As the collection of data becomes easier and more cost effective, Artificial Intelligence can drive whole new avenues of growth for the insurance industry.

The key is centralizing most of your data onto a robust enterprise platform to allow cross-line-of-business insights and prediction.

As an insurer, if your firm has not yet embarked on such a platform, selecting a robust system that can cater to IFRS 17 requirements AND beyond will be a case of killing two birds with one stone.

FRG can help you and your teams get ready for IFRS 17.  Contact us today for more information.

Tan Cheng See is Director of Business Development and Operations for FRG.

Is Your Business Getting The Full Bang for Its CECL Buck?

Accounting and regulatory changes often require resources and efforts above and beyond “business as usual”, especially those like CECL that are significant departures from previous methods. The efforts needed can be as complex as those for a completely new technology implementation and can take precedence over projects that are designed to improve your core business … and stakeholder value.

But with foresight and proper planning, you can prepare for a change like CECL by leveraging resources in a way that will maximize your efforts to meet these new requirements while also enhancing business value. At Financial Risk Group, we take this approach with each of our clients. The key is to start by asking “how can I use this new requirement to generate revenue and maximize business performance?”

 

The Biggest Bang Theory

In the case of CECL, there are two significant areas that will create the biggest institution-wide impact: analytics and data governance. While the importance of these is hardly new to financial institutions, we are finding that many neglect to leverage their CECL data and analytics efforts to create that additional value. Some basic first steps you can take include the following.

  • Ensure that the data utilized is accurate and that its access and maintenance align to the needs and policies of your business. In the case of CECL these will be employed to create scenarios, model, and forecast … elements that the business can leverage to address sales, finance, and operational challenges.
  • For CECL, analytics and data are leveraged in a much more comprehensive fashion than previous methods of credit assessment provided.  Objectively assess the current state of these areas to understand how the efforts being put toward CECL implementation can be leveraged to enhance your current business environment.
  • Identify existing available resources. While some firms will need to spend significant effort creating new processes and resources to address CECL, others will use this as an opportunity to retire and re-invent current workflows and platforms.

Recognizing the business value of analytics and data may be intuitive, but what is often less intuitive is knowing which resources earmarked for CECL can be leveraged to realize that broader business value. The techniques and approaches we have put forward provide good perspective on the assessment and augmentation of processes and controls, but how can these changes be quantified? Institutions without in-house experienced resources are well advised to consider an external partner. The ability to leverage expertise of staff experienced in the newest approaches and methodologies will allow your internal team to focus on its core responsibilities.

Our experience with this type of work has provided some very specific results that illustrate the short-term and longer-term value realized. The example below shows the magnitude of change and benefits experienced by one of our clients: a mid-sized North American bank. A thorough assessment of its unique environment led to a redesign of processes and risk controls. The significant changes implemented resulted in less complexity, more consistency, and increased automation. Additionally, value was created for business units beyond the risk department. While different environments will yield different results, those illustrated through the methodologies set forth here provide a good example to better judge the outcome of a process and controls assessment.

 

| | Legacy Environment | Automated Environment |
| --- | --- | --- |
| Reporting Output | No daily available manual controls for risk reporting | Daily in-cycle reporting controls are automated with minimum manual interaction |
| Process Speed | Credit run 40+ hours; manually-input variables prone to mistakes | Credit run 4 hours; cycle time for variable creation reduced from 3 days to 1 |
| Controls & Audit | Multiple audit issues and Regulatory MRAs | Audit issues resolved and MRA closed |
| Model Execution | Spreadsheet driven | 90 models automated, resulting in 1,000 manual spreadsheets eliminated |

 

While one approach will not fit all firms, providing clients with an experienced perspective on more fully utilizing their specific investment in CECL allows them to make decisions for the business that might otherwise never be considered, thereby optimizing the investment in CECL and truly ensuring the business receives full value from its CECL buck.

More information on how you can prepare for, and drive additional value through, CECL is available on our website and includes:

White Paper – CECL: Why the expectations are different

White Paper – CECL Scenarios: Considerations, Development and Opportunities

Blog – Data Management: The Challenges

Macroeconomic Effects on the Modeling of Private Capital Cash Flows

The demand for private capital investing has investors clamoring for more information about prospective cash flows. Historically, that data has been hard to estimate. Because the investments aren’t traded on a public venue, few figures are generated beyond the data received by existing investors.

So what’s an investor to do? FRG has developed a Private Capital Model solution that provides more insight into and understanding of the probable cash flows, one that includes the macroeconomic variables that have been found to influence cash flows and significantly improve forecasting performance. We have found those variables create a more complete picture than the Takahashi and Alexander model, commonly used within the industry to provide guidance around cash flows.
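
For context, below is a minimal sketch of a Takahashi-Alexander style (“Yale model”) cash flow projection. The parameters (contribution schedule, growth, yield, fund life, bow) are illustrative only, not calibrated; FRG’s Private Capital Model augments this kind of deterministic baseline with macroeconomic variables, as described in the white paper.

```python
# Minimal sketch of a Takahashi-Alexander style ("Yale model") projection.
# All parameters are hypothetical and for illustration only.
def ta_projection(commitment, rates_of_contribution, growth, yield_rate, life, bow):
    paid_in, nav = 0.0, 0.0
    rows = []
    for t in range(1, life + 1):
        rc = rates_of_contribution[t - 1] if t <= len(rates_of_contribution) else 0.0
        contribution = rc * (commitment - paid_in)         # call on unfunded commitment
        paid_in += contribution
        rate_of_dist = 1.0 if t == life else max(yield_rate, (t / life) ** bow)
        distribution = rate_of_dist * nav * (1 + growth)   # distribution from grown NAV
        nav = nav * (1 + growth) + contribution - distribution
        rows.append((t, contribution, distribution, nav))
    return rows

# Example: $100 commitment, front-loaded calls, 10% growth, 12-year life.
for t, c, d, nav in ta_projection(100.0, [0.25, 0.33, 0.5, 0.5], 0.10, 0.08, 12, 2.5):
    print(f"year {t:2d}: calls {c:6.2f}  dists {d:6.2f}  NAV {nav:6.2f}")
```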

Three of FRG’s modeling and investment experts – Dr. Jimmie Lenz, Dominic Pazzula and Jonathan Leonardelli – have written a new white paper detailing the methodology used to create the Private Capital Model, and the results the model provides. Download the paper, “Macroeconomic Effects on the Modeling of Private Capital Cash Flows” from the Resources section of the FRG website. Interested in a perspective on an investor’s need and utilization of cash flow information? Download FRG’s first Private Capital Fund Cash Flows paper.

Subscribe to our blog!