IFRS 17: Killing Two Birds

The clock is ticking for the 450 insurers around the world that must comply with International Financial Reporting Standard 17 (IFRS 17) by January 1, 2021, the effective date for companies whose financial year starts on January 1.

Insurers are at different stages of preparation: some are performing gap analyses, others are issuing requirements to software and consulting vendors or starting the pilot phase with a new IFRS 17 system, and a few have already embarked on implementing a full IFRS 17 system.

Unlike banks, the insurance industry has historically spent less on large IT system revamps, in part because banking transactions exceed insurance transactions in volume, frequency and variety.

IFRS 17 is one of the biggest ‘people, process and technology’ revamp exercises the insurance industry has seen in a long while. The Big 4 firms have published a multitude of papers and videos on the Internet highlighting the impact of the new reporting standard for insurance contracts, which was issued by the IASB in May 2017. In short, it is causing a buzz in the industry.

As efforts are focused on ensuring regulatory compliance with the new standard, insurers must also ask: “What other strategic value can be derived from our heavy investment of time, manpower and money in this whole exercise?”

The answer—analytics to gain deeper business insights.

One key objective of IFRS 17 is to provide information at a level of granularity that helps stakeholders assess the effect of insurance contracts on financial position, financial performance and cash flows, increasing transparency and comparability.

Most IFRS 17 systems on the market today achieve this by bringing the required data into the system, performing the computations, producing reports and integrating with the insurer’s general ledger (GL) system. From a technology perspective, such systems comprise a data management tool, a data model, a computation engine and a reporting tool. However, most of these systems are not built to provide strategic value beyond pure IFRS 17 compliance.

Beyond the IFRS 17 data itself, an insurer can use this exercise to put in place an enterprise analytics platform that extends past IFRS 17 reporting to broader and deeper financial analytics, as well as customer, operational and risk analytics. To leverage new predictive technologies such as machine learning and artificial intelligence, a robust enterprise data platform that can house and make available large volumes of data (big data) is crucial.

Artificial intelligence can empower important processes such as claims analysis, asset management, risk calculation and loss prevention. For instance, it enables better forecasting of claims experience based on a greater variety and volume of real-time data. The same platform can be used to make informed investment decisions based on intelligent algorithms, among other use cases.

As the collection of data becomes easier and more cost effective, artificial intelligence can drive whole new avenues of growth for the insurance industry.

The key is centralizing most of your data on a robust enterprise platform that enables cross-line-of-business insights and prediction.

As an insurer, if your firm has not yet embarked on such a platform, selecting a robust system that can cater to IFRS 17 requirements AND beyond will be a case of killing two birds with one stone.

FRG can help you and your teams get ready for IFRS 17. Contact us today for more information.

Tan Cheng See is Director of Business Development and Operations for FRG.

Is Your Business Getting The Full Bang for Its CECL Buck?

Accounting and regulatory changes often require resources and efforts above and beyond “business as usual”, especially those like CECL that are significant departures from previous methods. The efforts needed can be as complex as those for a completely new technology implementation and can take precedence over projects that are designed to improve your core business … and stakeholder value.

But with foresight and proper planning, you can prepare for a change like CECL by leveraging resources in a way that will maximize your efforts to meet these new requirements while also enhancing business value. At Financial Risk Group, we take this approach with each of our clients. The key is to start by asking “how can I use this new requirement to generate revenue and maximize business performance?”

The Biggest Bang Theory

In the case of CECL, there are two significant areas that will create the biggest institution-wide impact: analytics and data governance. While the importance of these is hardly new to financial institutions, we are finding that many neglect to leverage their CECL data and analytics efforts to create that additional value. Some basic first steps you can take include the following.

  • Ensure that the data utilized is accurate and that its access and maintenance align to the needs and policies of your business. In the case of CECL these will be employed to create scenarios, model, and forecast … elements that the business can leverage to address sales, finance, and operational challenges.
  • For CECL, analytics and data are leveraged in a much more comprehensive fashion than previous methods of credit assessment allowed. Objectively assess the current state of these areas to understand how the effort being put toward CECL implementation can be leveraged to enhance your current business environment.
  • Identify existing available resources. While some firms will need to spend significant effort creating new processes and resources to address CECL, others will use this as an opportunity to retire and re-invent current workflows and platforms.

Recognizing the business value of analytics and data may be intuitive, but what is often less intuitive is knowing which resources earmarked for CECL can be leveraged to realize that broader business value. The techniques and approaches we have put forward provide good perspective on the assessment and augmentation of processes and controls, but how can these changes be quantified? Institutions without experienced in-house resources are well advised to consider an external partner. The ability to leverage the expertise of staff experienced in the newest approaches and methodologies will allow your internal team to focus on its core responsibilities.

Our experience with this type of work has produced some very specific results that illustrate the short-term and longer-term value realized. The example below shows the magnitude of change and the benefits experienced by one of our clients, a mid-sized North American bank. A thorough assessment of its unique environment led to a redesign of processes and risk controls. The significant changes implemented resulted in less complexity, more consistency, and increased automation. Additionally, value was created for business units beyond the risk department. While different environments will yield different results, the outcomes illustrated here provide a good benchmark for judging what a process and controls assessment can deliver.

  • Reporting Output – Legacy: no daily manual controls available for risk reporting. Automated: daily in-cycle reporting controls, automated with minimal manual interaction.
  • Process Speed – Legacy: credit run of 40+ hours, with manually input variables prone to mistakes. Automated: credit run of 4 hours, with cycle time for variable creation reduced from 3 days to 1.
  • Controls & Audit – Legacy: multiple audit issues and regulatory MRAs. Automated: audit issues resolved and MRAs closed.
  • Model Execution – Legacy: spreadsheet driven. Automated: 90 models automated, eliminating 1,000 manual spreadsheets.
While one approach will not fit all firms, an experienced perspective on how to more fully utilize your specific investment in CECL allows you to make decisions for the business that might otherwise never be considered, thereby optimizing that investment and truly ensuring you receive the full value from your CECL buck.

More information on how you can prepare for CECL, and drive additional value through that preparation, is available on our website and includes:

White Paper – CECL: Why the expectations are different

White Paper – CECL Scenarios: Considerations, Development and Opportunities

Blog – Data Management: The Challenges

Macroeconomic Effects on the Modeling of Private Capital Cash Flows

The demand for private capital investing has investors clamoring for more information about prospective cash flows. Historically, those cash flows have been hard to estimate: because the investments aren’t traded on a public venue, few figures are generated beyond the data received by existing investors.

So what’s an investor to do? FRG has developed a Private Capital Model solution that provides more insight into probable cash flows by incorporating the macroeconomic variables that have been found to influence those flows and to significantly improve forecast accuracy. We have found those variables create a more complete picture than the Takahashi and Alexander model commonly used within the industry to provide guidance around cash flows.
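
For context, the Takahashi and Alexander model projects a fund’s contributions, distributions, and net asset value deterministically from a small set of parameters. Below is a minimal sketch of that baseline, with purely illustrative parameter values; FRG’s Private Capital Model augments this kind of projection with macroeconomic variables.

```python
# Minimal sketch of the Takahashi-Alexander (Yale) cash flow model.
# All parameter values below are illustrative assumptions.

def ta_model(commitment=100.0, life=12, bow=2.5, growth=0.08, yield_floor=0.0):
    """Project annual contributions, distributions, and NAV for one fund."""
    # Assumed contribution schedule: 25% of remaining commitment in year 1,
    # then 33% of the remainder each year thereafter.
    rates_of_contribution = [0.25] + [0.33] * (life - 1)

    paid_in, nav, rows = 0.0, 0.0, []
    for t in range(1, life + 1):
        contribution = rates_of_contribution[t - 1] * (commitment - paid_in)
        paid_in += contribution
        # Distribution rate rises with fund age: max[yield_floor, (t/life)^bow]
        rate_of_distribution = max(yield_floor, (t / life) ** bow)
        grown_nav = nav * (1 + growth)
        distribution = rate_of_distribution * grown_nav
        nav = grown_nav - distribution + contribution
        rows.append((t, contribution, distribution, nav))
    return rows

for year, c, d, n in ta_model():
    print(f"year {year:2d}: contribution {c:6.2f}  distribution {d:6.2f}  NAV {n:6.2f}")
```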

Three of FRG’s modeling and investment experts – Dr. Jimmie Lenz, Dominic Pazzula and Jonathan Leonardelli – have written a new white paper detailing the methodology used to create the Private Capital Model and the results the model provides. Download the paper, “Macroeconomic Effects on the Modeling of Private Capital Cash Flows,” from the Resources section of the FRG website. Interested in a perspective on an investor’s need for, and use of, cash flow information? Download FRG’s first Private Capital Fund Cash Flows paper.

Current Expected Credit Loss (CECL) a New Paradigm for Captives, Too

Discussion of the ramifications of CECL for financial institutions has in large part focused on banks, but as we addressed in a recent paper, “Current Expected Credit Loss: Why the Expectations Are Different,” this new accounting treatment extends to a much larger universe. One example is the captives that finance Americans’ love affair with cars: their portfolios of leases and loans have become much larger, and the implications of CECL more significant.

As with other institutions, data, platforms, and modeling make up the challenges that captives will have to address. But unlike other types of institutions, captives have more concentrated portfolios, which may aid in “pooling” exercises but may also be inadvertently affected by scenario modeling. A basic tenet for all institutions is the life-of-loan estimate and the use of reasonable and supportable forecasts. While some institutions may have had “challenger” models that moved in this direction, captives have not tended to utilize this type of approach.

The growth of captives’ portfolios, and their correlation to a number of macroeconomic factors (interest rates, commodity prices, tariffs, and the like), call for data and scenarios that require a different level of modeling and forecasting. Because FASB does not provide template methodologies or calculations, these scenarios will need to be developed with the “reasonable and supportable” requirement in mind. While different approaches will likely be adopted, those that utilize transaction-level data can provide a higher level of accuracy over time, helping to achieve the goals laid out in the new guidelines. As might be imagined, the ability to leverage experience in the development and deployment of these types of models can’t be overemphasized.

We have found that the ability to manage the following functional components is critical to building a flexible platform that can keep pace with users’ changing needs (a brief sketch of how these might fit together follows the list):

  • Scenario Management
  • Input Data Mapping and Registry
  • Configuration Management
  • Model Management
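
As a purely hypothetical illustration (the names and fields below are assumptions, not any particular vendor’s API), these four components might surface to users as a declarative run configuration along these lines:

```python
# Hypothetical sketch of the four functional components as a declarative
# run configuration; all names and fields are illustrative assumptions.
from dataclasses import dataclass, field

@dataclass
class Scenario:                 # Scenario Management
    name: str
    macro_drivers: dict         # e.g. {"interest_rate": [...], "used_car_index": [...]}

@dataclass
class InputMapping:             # Input Data Mapping and Registry
    source_table: str
    field_map: dict             # source column -> model variable

@dataclass
class RunConfig:                # Configuration Management
    as_of_date: str
    pools: list                 # segmentation used for pooling
    inputs: list = field(default_factory=list)
    scenarios: list = field(default_factory=list)
    models: dict = field(default_factory=dict)   # Model Management: pool -> model id

run = RunConfig(
    as_of_date="2019-12-31",
    pools=["auto_lease_prime", "auto_loan_subprime"],
    inputs=[InputMapping("lease_book", {"bal_amt": "ead", "score": "risk_grade"})],
    scenarios=[Scenario("baseline", {"interest_rate": [2.00, 2.25, 2.50]})],
    models={"auto_lease_prime": "pd_lgd_v2"},
)
print(run.pools, "->", run.models)
```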

Experience has taught that there are significant considerations in implementing CECL, but there are also some improvements that can be realized for institutions that develop a well-structured plan. Captives are advised to use this as an opportunity to realize efficiencies, primarily in technology and existing models. Considerations around data, platforms, and the models themselves should leverage available resources to ensure that investments made to address this change provide as much benefit as possible, both now and into the future.

Real Time Learning: A Better Approach to Trader Surveillance

An often-heard question in any discussion of Machine Learning (ML) tools is maybe the most obvious one: “So, how can we use them?”

The answer depends on the industry, but we think there are especially useful (and interesting) applications for the financial services sector. Firms in this sector have historically been open to the ML concept but haven’t been quick to jump on some of its potential solutions to common problems.

Let’s look at risk management at the trading desk, for example. If you want to mitigate risk, you need to be able to identify it in advance—say, to ensure your traders aren’t conducting out-of-market transactions or placing fictitious orders. The latest issue of the New Machinist Journal by Dr. Jimmie Lenz (available by clicking here) explains how. Trade desk surveillance is just one way that Machine Learning tools can help monitor a variety of activities that can cause grief for those tasked with risk management.
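
For a flavor of the idea only (a generic sketch, not the specific method in Dr. Lenz’s paper), an unsupervised learner can flag trades whose attributes sit far outside a desk’s normal pattern:

```python
# Generic sketch of unsupervised trade surveillance; the features and
# parameter values here are illustrative assumptions.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(0)
# Per-trade features: deviation from mid price (bps), order size, cancel ratio
normal_trades = rng.normal([0.0, 100.0, 0.10], [5.0, 20.0, 0.05], size=(1000, 3))
suspect_trade = np.array([[80.0, 500.0, 0.90]])  # far off-market, large, mostly cancelled

model = IsolationForest(contamination=0.01, random_state=0).fit(normal_trades)
print(model.predict(suspect_trade))  # -1 => flagged for human review
```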

Would you like to read more about the possibilities ML can bring to financial services process settings? Download “Real Time Learning: A Better Approach to Trader Surveillance,” along with other issues of the New Machinist Journal, by visiting www.frgrisk.com/resources.

IFRS 9: Modeling Challenges

Calculating expected credit losses under IFRS 9 is easy. It requires little more than high school algebra to determine the aggregate present value of future cash flows. But it is not easy to ascertain the key components that are used by the basic equation—regardless of whether the approach taken is “advanced” (i.e., where PD, LGD, and EAD are modeled) or “simplified” (also called “intermediate”). The forward-looking stance mandated by IFRS 9 makes the inherently difficult process of specifying these variables all the more complex.
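
To see why the arithmetic itself is the easy part, here is a minimal sketch of the aggregate present-value calculation, with purely illustrative inputs:

```python
# ECL as the discounted sum of PD_t * LGD_t * EAD_t; all inputs illustrative.
marginal_pd = [0.010, 0.015, 0.020]          # probability of default in year t
lgd         = [0.40, 0.40, 0.45]             # loss given default in year t
ead         = [1_000_000, 900_000, 800_000]  # exposure at default in year t
eir         = 0.05                           # effective interest rate for discounting

ecl = sum(p * l * e / (1 + eir) ** t
          for t, (p, l, e) in enumerate(zip(marginal_pd, lgd, ead), start=1))
print(f"Lifetime ECL: {ecl:,.0f}")           # the hard part is producing the inputs
```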

For the sake of brevity, let’s consider only the advanced approach for this discussion. There are two immediate impacts on PD model estimation: the point-in-time requirements and the length of the forecast horizon.

PD estimates need to reflect point-in-time (PIT) rather than through-the-cycle (TTC) values. What this means is that PDs are expected to represent the current period’s economic conditions instead of some average through an economic cycle. Bank risk managers will have to decide whether they can adapt a CCAR (or other regulatory) model to this purpose, determine a way to convert a TTC PD to a PIT PD, or build an entirely new model.
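
One commonly cited way to make that conversion, shown here only as an illustrative sketch rather than a recommendation, is the Vasicek single-factor adjustment, which shifts a TTC PD up or down according to a credit-cycle index:

```python
# Illustrative Vasicek single-factor shift from TTC to PIT PD.
# rho (asset correlation) and z (credit-cycle index) are assumed values.
from scipy.stats import norm

def ttc_to_pit(ttc_pd: float, z: float, rho: float = 0.12) -> float:
    """z > 0 indicates benign conditions; z < 0 indicates a downturn."""
    return float(norm.cdf((norm.ppf(ttc_pd) - rho ** 0.5 * z) / (1 - rho) ** 0.5))

print(ttc_to_pit(0.02, z=-1.0))  # downturn: PIT PD rises above the 2% TTC value
print(ttc_to_pit(0.02, z=+1.0))  # benign:   PIT PD falls below it
```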

The length of the forecast horizon has two repercussions. First, one must consider how many models to build for estimating PDs throughout the forecast. For example, it may be determined that a portfolio warrants one model for year 1, a second model for years 2 to 3, and a third model for years 3+. Second, one should consider how far into the forecast horizon to use models. Given the impacts of model risk, along with the onus of maintaining multiple models, perhaps PDs for a horizon greater than seven years would be better estimated by drawing a value from some percentile of an empirical distribution.
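
Wiring such a model ladder together is the straightforward part; the sketch below simply mirrors the illustrative segmentation above (the model names and the percentile choice are hypothetical):

```python
# Hypothetical routing of forecast years to PD models, mirroring the
# example segmentation above.
import numpy as np

def pd_source_for_year(year: int, historical_pds) -> str:
    if year == 1:
        return "model_year_1"
    if year <= 3:
        return "model_years_2_to_3"
    if year <= 7:
        return "model_years_3_plus"
    # Beyond seven years: fall back to a percentile of an empirical distribution
    return f"empirical 75th percentile = {np.percentile(historical_pds, 75):.4f}"

history = [0.010, 0.014, 0.018, 0.021, 0.025]
for yr in (1, 3, 5, 10):
    print(yr, "->", pd_source_for_year(yr, history))
```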

Comparatively speaking, bank risk managers may find it somewhat less difficult to estimate LGDs, especially if collateral values are routinely updated and historical recovery rates for comparable assets are readily available in the internal accounting systems. That said, IFRS 9 requires an accounting LGD, so models will need to be developed to accommodate this, or a process will have to be defined to convert an economic LGD into an accounting one.
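
The heart of that conversion is the discount rate: under IFRS 9, expected cash shortfalls are discounted at the loan’s effective interest rate (EIR). A minimal sketch with illustrative figures:

```python
def accounting_lgd(ead: float, recoveries, eir: float) -> float:
    """LGD = 1 - PV(expected recoveries, discounted at the EIR) / EAD.
    recoveries: iterable of (years_after_default, expected_amount) pairs."""
    pv = sum(amount / (1 + eir) ** t for t, amount in recoveries)
    return 1.0 - pv / ead

# Illustrative: 60k of a 100k exposure recovered over two years, 6% EIR
print(round(accounting_lgd(100_000, [(1, 40_000), (2, 20_000)], eir=0.06), 4))
```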

Projecting EADs is similarly challenging. Loan amortization schedules generally provide a valid starting point, but unfortunately they are only useful for installment loans. How does one treat a revolving exposure? Can one leverage, and tweak, the same rules used for CCAR? In addition, embedded options have to be taken into account. There’s no avoiding it: estimating EADs calls for advanced financial modeling.
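
For revolving exposures, one generic device (a sketch of a common convention, not a prescription) is to apply a credit conversion factor to the undrawn portion of the limit:

```python
def revolver_ead(drawn: float, limit: float, ccf: float = 0.50) -> float:
    """EAD = drawn balance + CCF x undrawn headroom; the CCF is an assumption."""
    return drawn + ccf * max(limit - drawn, 0.0)

print(revolver_ead(drawn=30_000, limit=100_000))  # 65,000.0 with a 50% CCF
```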

As mentioned above, there are differences between the requirements of IFRS 9 and those of other regulatory requirements (e.g., CCAR). As a result, the models that banks use for stress testing or other regulatory functions cannot be used as-is for IFRS 9 reporting. Bank risk managers will have to decide, then, whether their CCAR models can be adapted with relatively minor modifications. In many cases they may conclude that it makes more sense to develop new models. Then all the protocols and practices of sound model design and implementation come into play.

Of course, it is also important to explain the conceptual basis and present the supporting evidence for PD, LGD, and EAD estimates to senior management—and to have the documentation on hand in case independent auditors or regulatory authorities ask to see it.

In short, given PD, LGD, and EAD, it’s a trivial matter to calculate expected credit losses. But preparing to comply with the IFRS 9 standard is serious business. It’s time to marshal your resources.

Risk Premia Portfolio Case Study

See how FRG’s VOR (Visualization of Risk) platform works for a major U.S. foundation: download a case study that explores how we customized VOR application tools to help its team with day-to-day portfolio management activities, as well as monthly analysis and performance reporting.

The study shows how FRG was able to leverage its econometric expertise, system development capability and logistical strength to empower the foundation’s specialized investment team. Read the study, and learn more about VOR, here.
