Documenting CECL

The CECL standard requires more than just another update to the calculation of a financial institution’s (FI’s) allowance for credit losses; the new standard also pushes institutions to be more involved in the entire allowance process, especially at the management/executive level. From explanations, justifications, and rationales to policies and procedures, the standard requires them all. The FI needs to discuss them, understand them, and document them.

The first step is to discuss all decisions that must be made regarding the CECL process. This covers everything from the definition of default to the justification of which methodology to use for each segment of the data. Although these discussions may be onerous, the CECL standard requires a full understanding of every decision. Once that understanding exists, each decision must be documented for regulatory purposes:

CECL Topic 326-20-50-10: An entity shall provide information that enables a financial statement user to do the following:

  1. Understand management’s method for developing its allowance for credit losses.
  2. Understand the information that management used in developing its current estimate of expected credit losses.
  3. Understand the circumstances that caused changes to the allowance for credit losses, thereby affecting the related credit loss expense (or reversal) reported for the period.

CECL Topic 326-20-50-11: To meet the objectives in paragraph 326-20-50-10, an entity shall disclose all of the following by portfolio segment and major security type:

  1. A description of how expected loss estimates are developed
  2. A description of the entity’s accounting policies and methodology to estimate the allowance for credit losses, as well as discussion of the factors that influenced management’s current estimate of expected credit losses, including:
    • Past events
    • Current conditions
    • Reasonable and supportable forecasts about the future
  3. A discussion of risk characteristics relevant to each portfolio segment
  4. Etc.

Although these may seem like surprising jumps in requirements, they are simply more explicitly defined versions of the requirements under existing ALLL guidance. Note that some of the general requirements under the existing guidance will remain relevant under CECL, such as:

  • “the need for institutions to appropriately support and document their allowance estimates”
  • the “…responsibility for developing, maintaining, and documenting a comprehensive, systematic, and consistently applied process for determining the amounts of the ACL and the provision for credit losses.”
  • the requirement “…that allowances be well documented, with clear explanations of the supporting analyses and rationale.”

As you can see, documentation is an important component of the CECL standard. While the documentation will, at least initially, require more effort to produce, it will also give the FI the opportunity to fully understand the inner workings of its CECL process.

Lastly, advice to avoid some headache—take the time to document throughout the entire process of CECL. As my math professor always said, “the due date is not the do date.”

Resources:

  1. FASB Accounting Standards Update, No. 2016-13, Financial Instruments – Credit Losses (Topic 326).
  2. Frequently Asked Questions on the New Accounting Standard on Financial Instruments – Credit Losses. FIL-20-2019. April 3, 2019.

Samantha Zerger, business analytics consultant with FRG, is skilled in technical writing. Since graduating from the North Carolina State University’s Financial Mathematics Master’s program in 2017 and joining FRG, she has taken on leadership roles in developing project documentation as well as improving internal documentation processes.

CECL – The Power of Vintage Analysis

I would argue that a critical step in getting ready for CECL is to review the vintage curves of the segments that have been identified. Not only do the resulting graphs provide useful information, but the process itself also requires thought about how to prepare the data.

Consider the following graph of auto loan losses for different vintages of Not-A-Real-Bank bank[1]:

[Figure: stylized vintage curves — cumulative auto loan loss rates by periods since origination, one curve per vintage]

While this is a highly stylized depiction of vintage curves, its intent is to illustrate what information can be gleaned from such a graph. Consider the following:

  1. A clear end to the seasoning period can be determined (period 8)
  2. Outlier vintages can be identified (2015Q4)
  3. Visual confirmation that the segmentation captures risk profiles (there isn’t a substantial number of vintages behaving oddly)

But that’s not all! To get to this graph, some important questions need to be asked about the data. For example:

  1. Should prepayment behavior be captured when deriving the loss rates? If so, what’s the definition of prepayment?
  2. At what time period should the accumulation of losses be stopped (e.g., contractual term)?
  3. Is there enough loss[2] behavior to model on the loan level?
  4. How should accounts that renew be treated (e.g., put in new vintage)?
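The answers to these questions shape how the loss rates behind such a graph are computed. As a rough sketch (pure Python, with hypothetical loan and loss records; the cutoff age stands in for the contractual-term decision in question 2), cumulative loss-rate curves by vintage could be built like this:

```python
from collections import defaultdict

def vintage_loss_curves(loans, losses, max_age):
    """Build cumulative loss-rate curves by vintage.

    loans:  list of (loan_id, vintage, orig_balance)
    losses: list of (loan_id, age, loss_amount)
    Returns {vintage: [cumulative loss rate at age 0..max_age]}.
    """
    orig = defaultdict(float)   # total originated balance per vintage
    vint = {}                   # loan_id -> vintage
    for loan_id, vintage, bal in loans:
        orig[vintage] += bal
        vint[loan_id] = vintage

    # Incremental losses per vintage per age, cut off at max_age
    inc = defaultdict(lambda: [0.0] * (max_age + 1))
    for loan_id, age, amt in losses:
        if age <= max_age:
            inc[vint[loan_id]][age] += amt

    curves = {}
    for vintage, bal in orig.items():
        cum, total = [], 0.0
        for age in range(max_age + 1):
            total += inc[vintage][age]
            cum.append(total / bal)   # cumulative loss rate at this age
        curves[vintage] = cum
    return curves
```

Plotting one curve per vintage from this output yields a graph like the one above; an outlier vintage shows up as a curve that separates from the pack.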

In conclusion, performing vintage analysis is more than just creating a picture with many different colors. It provides insight into the segments, makes one consider the data, and, if the data is appropriately constructed, positions one for subsequent analysis and/or modeling.

Jonathan Leonardelli, FRM, Director of Business Analytics for the Financial Risk Group, leads the group responsible for model development, data science, documentation, testing, and training. He has over 15 years’ experience in the area of financial risk.

 

[1] Originally I called this bank ACME Bank but when I searched to see if one existed I got this, this, and this…so I changed the name. I then did a search of the new name and promptly fell into a search engine rabbit hole that, after a while, I climbed out with the realization that for any 1 or 2 word combination I come up with, someone else has already done the same and then added bank to the end.

[2] You can also build vintage curves on defaults or prepayment.

 

RELATED:

CECL—Questions to Consider When Selecting Loss Methodologies

CECL—The Caterpillar to Butterfly Evolution of Data for Model Development

CECL—Data (As Usual) Drives Everything

CECL—Questions to Consider When Selecting Loss Methodologies

Paragraph 326-20-30-3 of the Financial Accounting Standards Board (FASB) standards update[1] states: “The allowance for credit losses may be determined using various methods.” I’m not sure any statement, other than “We need to talk,” can be as fear-inducing. Why is it scary? Because in a world of details and accuracy, this statement is remarkably vague and not prescriptive.

Below are some questions to consider when determining the appropriate loss methodology approaches for a given segment.

How much history do you have?

If a financial institution (FI) has limited history[2], then the options available to it are, well, limited. Building a model requires sufficient data to capture the behavior (e.g., performance or payment) of accounts. Without enough data, the probability of successfully building a model is low. Worse yet, even if one builds a model, the likelihood of it being useful and robust is minimal. As a result, loss methodology approaches that do not need a lot of data should be considered (e.g., discounted cash flow or a qualitative factor approach based on industry information).
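A discounted cash flow approach, for instance, needs only a loan’s expected cash flows and its effective interest rate rather than a deep loss history: the allowance is the amortized cost basis minus the present value of the cash flows the FI expects to collect. A minimal sketch (the function name and inputs are illustrative, not from the standard):

```python
def dcf_expected_credit_loss(amortized_cost, expected_cashflows, effective_rate):
    """Allowance under a discounted cash flow approach: amortized cost
    basis minus the present value of expected cash flows, discounted
    at the effective interest rate."""
    pv = sum(
        cf / (1 + effective_rate) ** t   # discount each period's expected cash flow
        for t, cf in enumerate(expected_cashflows, start=1)
    )
    return amortized_cost - pv
```

A loan carried at 1,000 with three expected annual collections of 300 and a 5% effective rate, for example, produces an allowance of roughly 183.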

Have relevant business definitions been created?

The loss component approach (decomposing loss into PD, LGD, and EAD) is considered a leading practice at banks[3]. However, in order to use this approach, definitions of default and, arguably, paid-in-full need to be created for each segment being modeled. (Note: these definitions can be the same or different across segments.) Without these definitions, one does not know when an account has defaulted or paid off.
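Once those definitions exist, the decomposition itself is simple arithmetic: the expected loss for a segment is the sum of PD × LGD × EAD across its accounts. A toy illustration (the inputs are hypothetical):

```python
def segment_expected_loss(accounts):
    """Loss component approach: sum PD * LGD * EAD over the accounts
    in a segment. Each tuple is (pd, lgd, ead), where PD and LGD are
    rates in [0, 1] and EAD is a currency amount."""
    return sum(p * lgd * ead for p, lgd, ead in accounts)
```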

Is there a sufficient number of losses or defaults in the data?

Many of the loss methodologies available for consideration (e.g., loss component or vintage loss rates) require enough losses to discern a pattern. As a result, banks that are blessed with infrequent losses can feel cursed when they try to implement one of these approaches. While low losses do not necessarily rule out these approaches, they do make for a more challenging process.

Are loan level attributes available, accurate, and updated appropriately?

This question tackles the granularity of an approach rather than the approach itself. As mentioned in the post CECL – Data (As Usual) Drives Everything, there are three different levels of data granularity on which a model can be built. Typically, the decision is between the loan level and the segment level. Loan-level models are great for capturing sensitivities to loan characteristics and macroeconomic events, provided the loan characteristics are accurate and updated (if needed) at a regular interval.

Jonathan Leonardelli, FRM, Director of Business Analytics for the Financial Risk Group, leads the group responsible for model development, data science, documentation, testing, and training. He has over 15 years’ experience in the area of financial risk.

 

[1] FASB accounting standards update can be found here

[2] There is no consistent rule, at least that I’m aware of, that defines “limited history”. That said, we typically look for clean data reaching back through an economic cycle.

[3] See: Capital Planning at Large Bank Holding Companies: Supervisory Expectations and Range of Current Practice, August 2013

RELATED:

CECL—The Caterpillar to Butterfly Evolution of Data for Model Development

CECL—Data (As Usual) Drives Everything

CECL—The Caterpillar to Butterfly Evolution of Data for Model Development

I don’t know about you, but I find caterpillars to be a bit creepy[1]. On the other hand, I find butterflies to be beautiful[2]. Oddly enough, this aligns to my views on the different stages of data in relation to model development.

As a financial institution (FI) prepares for CECL, it is strongly suggested (by me at least) to know which stage the data falls into. Knowing its stage provides one with guidance on how to proceed.

The Ugly

At FRG we use the term dirty data to describe data that is ugly. Dirty data typically has the following characteristics (the list is not comprehensive):

  • Unexplainable missing values: The key word is unexplainable. Missing values can mean something (e.g., a value has not been captured yet) but often they indicate a problem. See this article for more information.
  • Inconsistent values: For example, a character variable that holds values for state might have Missouri, MO, or MO. as values. A numeric variable for interest rate might hold values as a percent (7.5) and as a decimal (0.075).
  • Poor definitional consistency: This occurs when a rule that is used to classify some attribute of an account changes during history. For example, at one point in history a line of credit might be indicated by a nonzero original commitment amount, but at a different point it might be indicated by whether a revolving flag is non-missing.

The Transition

You should not model or perform analysis using dirty data. Therefore, the next step in the process is to transition dirty data into clean data.

Transitioning to clean data, as the name implies, requires scrubbing the information. The main purpose of this step is to address the issues identified in the dirty data. That is, one would want to fix missing values (e.g., through imputation), standardize variable values (e.g., all states identified by a two-character code), and correct inconsistent definitions (e.g., a line indicator always based on a nonzero original commitment amount).
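A sketch of what that scrubbing might look like in code (pure Python; the lookup table and the percent-versus-decimal heuristic are illustrative assumptions, not general rules):

```python
# Hypothetical lookup for standardizing inconsistent state values
STATE_CODES = {"missouri": "MO", "mo": "MO", "mo.": "MO"}

def clean_record(raw):
    """Scrub one raw record: map state values to a two-character code
    and store interest rates as decimals, whether the source held a
    percent (7.5) or a decimal (0.075)."""
    rec = dict(raw)
    key = raw["state"].strip().lower()
    rec["state"] = STATE_CODES.get(key, raw["state"])
    rate = raw["rate"]
    # Assumption: no real rate exceeds 100%, so values > 1 are percents
    rec["rate"] = rate / 100 if rate is not None and rate > 1 else rate
    return rec
```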

The Beautiful

A final step must be taken before data can be used for modeling. This step takes clean data and converts it to model-ready data.

At FRG we use the term model-ready to describe clean data with the application of relevant business definitions. An example of a relevant business definition would be how an FI defines default[3]. Once the definition has been created the corresponding logic needs to be applied to the clean data in order to create, say, a default indicator variable.
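Continuing the example in footnote [3], applying one hypothetical definition of default (90+ DPD, bankruptcy, or non-accrual) to clean data might look like:

```python
def default_indicator(days_past_due, in_bankruptcy=False, non_accrual=False):
    """Apply a hypothetical default definition (90+ DPD, or in
    bankruptcy, or in non-accrual) to produce a model-ready
    default flag (1 = defaulted, 0 = not)."""
    return int(days_past_due >= 90 or in_bankruptcy or non_accrual)
```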

Just like a caterpillar metamorphosing to a butterfly, dirty data needs to morph to model-ready for an FI to enjoy its true beauty. And, only then, can an FI move forward on model development.

 

Jonathan Leonardelli, FRM, Director of Business Analytics for the Financial Risk Group, leads the group responsible for model development, data science, documentation, testing, and training. He has over 15 years’ experience in the area of financial risk.

 

[1] Yikes!

[2] Pretty!

[3] E.g., is it 90+ days past due (DPD) or 90+ DPD or in bankruptcy or in non-accrual or …?

 

RELATED:

CECL—Questions to Consider When Selecting Loss Methodologies

CECL—Data (As Usual) Drives Everything

CECL – Data (As Usual) Drives Everything

To appropriately prepare for CECL a financial institution (FI) must have a hard heart-to-heart with itself about its data. Almost always, simply collecting data in a worksheet, reviewing it for gaps, and then giving it the thumbs up is insufficient.

Data drives all parts of the CECL process. The sections below, by no means exhaustive, highlight key areas where your data, simply by being your data, constrains your options.

Segmentation

Paragraph 326-20-30-2 of the Financial Accounting Standards Board (FASB) standards update[1] states: “An entity shall measure expected credit losses of financial assets on a collective (pool) basis when similar risk characteristic(s) exist.” It then points to paragraph 326-20-55-5 which provides examples of risk characteristics, some of which are: risk rating, financial asset type, and geographical location.

Suggestion: prior to reviewing your data consider what risk profiles are in your portfolio. After that, review your data to see if it can adequately capture those risk profiles. As part of that process consider reviewing:

  • Frequency of missing values in important variables
  • Consistency in values of variables
  • Definitional consistency[2]
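A first pass at the review checks listed above can be as simple as profiling each important variable (a sketch; the field names are hypothetical):

```python
def profile_field(records, field):
    """Report the missing-value rate and the distinct values for one
    field — a first pass at spotting missing-value frequency and
    value-consistency problems."""
    values = [rec.get(field) for rec in records]
    n_missing = sum(v is None for v in values)
    present = sorted({v for v in values if v is not None})
    return {"missing_rate": n_missing / len(values), "distinct_values": present}
```

A state field that comes back with distinct values like MO, Missouri, and MO. signals exactly the consistency problems the review is meant to catch.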

Methodology Selection

The FASB standard update does not provide guidance as to which methodologies to use[3]. That decision is entirely up to the FI[4]. However, the methodologies that are available to the FI are limited by the data it has. For example, if an FI has limited history then any of the methodologies that are rooted in historical behavior (e.g., vintage analysis or loss component) are likely out of the question.

Suggestion: review the historical data and ask yourself these questions: 1) Do I have sufficient data to capture the behavior for a given risk profile? 2) Is my historical data of good quality? 3) Are there gaps in my history?

Granularity of Model

Expected credit loss can be determined on three different levels of granularity: loan, segment (i.e., risk profile), and portfolio. Each granularity level has a set of pros and cons but which level an FI can use depends on the data.

Suggestion: review variables that are account specific (e.g., loan-to-value, credit score, number of accounts with institution) and ask yourself: are the sources of these variables reliable? Do they get refreshed often enough to capture changes in customer or macroeconomic environment behavior?

Hopefully, this post has started you critically thinking about your data. While data review might seem daunting, I cannot stress enough—it’s needed, it’s critical, it’s worth the effort.

 

Jonathan Leonardelli, FRM, Director of Business Analytics for the Financial Risk Group, leads the group responsible for model development, data science, documentation, testing, and training. He has over 15 years’ experience in the area of financial risk.

 

[1] You can find the update here

[2] More on what these mean in a future blog post

[3] Paragraph 326-20-30-3

[4] A future blog post will cover some questions to ask to guide in this decision.

 

RELATED:

CECL—The Caterpillar to Butterfly Evolution of Data for Model Development

Change in CECL Approved by the FDIC

The Federal Deposit Insurance Corporation (FDIC) yesterday (12/18/18) approved a measure that will allow a three-year phase-in of the impact of CECL on regulatory capital. This change will also delay the impact on bank stress tests until 2020. The change does not affect the rule itself but gives banks the option to phase in the impacts of CECL on regulatory capital over a three-year period. The details can be found in the FDIC memorandum released yesterday, which also adjusts how reserves for “bad loans” will be accounted for in regulatory capital.

The Financial Risk Group is recommending that banks utilize this time to better understand the impact, and the opportunities, that result from the mandated changes. “Time to implementation has been a limiting factor for some institutions to explore the identification of additional stakeholder value, but this should no longer be the case,” stated John Bell, FRG’s managing partner. FRG has partnered (and is currently partnering) with clients of all types on a number of CECL assessments and implementations. The lessons learned to date are available in a number of our publications, including CECL-Considerations, Developments, and Opportunities and Current Expected Credit Loss-Why The Expectations Are Different.

Is Your Business Getting The Full Bang for Its CECL Buck?

Accounting and regulatory changes often require resources and efforts above and beyond “business as usual”, especially those like CECL that are significant departures from previous methods. The efforts needed can be as complex as those for a completely new technology implementation and can take precedence over projects that are designed to improve your core business … and stakeholder value.

But with foresight and proper planning, you can prepare for a change like CECL by leveraging resources in a way that will maximize your efforts to meet these new requirements while also enhancing business value. At Financial Risk Group, we take this approach with each of our clients. The key is to start by asking “how can I use this new requirement to generate revenue and maximize business performance?”

 

The Biggest Bang Theory

In the case of CECL, there are two significant areas that will create the biggest institution-wide impact: analytics and data governance. While the importance of these is hardly new to financial institutions, we are finding that many neglect to leverage their CECL data and analytics efforts to create that additional value. Some basic first steps you can take include the following.

  • Ensure that the data utilized is accurate and that its access and maintenance align to the needs and policies of your business. In the case of CECL these will be employed to create scenarios, model, and forecast … elements that the business can leverage to address sales, finance, and operational challenges.
  • For CECL, analytics and data are leveraged in a much more comprehensive fashion than under previous methods of credit assessment. Objectively assess the current state of these areas to understand how the efforts being put toward CECL implementation can be leveraged to enhance your current business environment.
  • Identify existing available resources. While some firms will need to spend significant effort creating new processes and resources to address CECL, others will use this as an opportunity to retire and re-invent current workflows and platforms.

Recognizing the business value of analytics and data may be intuitive, but what is often less intuitive is knowing which resources earmarked for CECL can be leveraged to realize that broader business value. The techniques and approaches we have put forward provide good perspective on the assessment and augmentation of processes and controls, but how can these changes be quantified? Institutions without in-house experienced resources are well advised to consider an external partner. The ability to leverage expertise of staff experienced in the newest approaches and methodologies will allow your internal team to focus on its core responsibilities.

Our experience with this type of work has provided some very specific results that illustrate the short-term and longer-term value realized. The example below shows the magnitude of change and benefits experienced by one of our clients: a mid-sized North American bank. A thorough assessment of its unique environment led to a redesign of processes and risk controls. The significant changes implemented resulted in less complexity, more consistency, and increased automation. Additionally, value was created for business units beyond the risk department. While different environments will yield different results, those illustrated through the methodologies set forth here provide a good example to better judge the outcome of a process and controls assessment.

 

| | Legacy Environment | Automated Environment |
|---|---|---|
| Reporting Output | No daily available manual controls for risk reporting | Daily in-cycle reporting controls are automated with minimum manual interaction |
| Process Speed | Credit run 40+ hours; manually-input variables prone to mistakes | Credit run 4 hours; cycle time reduced from 3 days to 1 for variable creation |
| Controls & Audit | Multiple audit issues and regulatory MRAs | Audit issues resolved and MRAs closed |
| Model Execution | Spreadsheet driven | 90 models automated, resulting in 1,000 manual spreadsheets eliminated |

 

While one approach will not fit all firms, an experienced perspective on how to more fully utilize a specific investment in CECL allows clients to make business decisions that might otherwise never be considered. That optimizes the investment in CECL and truly ensures you receive the full value from your CECL buck.

More information on how you can prepare for—and drive additional value through—your CECL preparation is available on our website and includes:

White Paper – CECL: Why the expectations are different

White Paper – CECL Scenarios: Considerations, Development and Opportunities

Blog – Data Management: The Challenges

Subscribe to our blog!