See how FRG’s VOR (Visualization of Risk) platform works for a major U.S. foundation: download a case study that explores how we customized VOR application tools to help them with their day-to-day portfolio management activities, as well as their monthly analysis and performance reporting.
The Federal Reserve and the OCC define model risk as “the potential for adverse consequences from decisions based on incorrect or misused model outputs and reports.” Statistical models are at the core of stress testing and credit analysis, and banks are increasingly using them in strategic planning as well. The more banks integrate model outputs into their decision making, the greater their exposure to model risk.
Regulators have singled out model risk for supervisory attention; managers who have primary responsibility for their bank’s model development and implementation processes should be no less vigilant. This article summarizes the principles and procedures we follow to mitigate model risk on behalf of our clients.
The first source of model risk is basing decisions on incorrect output. Sound judgment in the design stage and procedural discipline in the development phase are the best defenses against this eventuality. The key steps in designing a model to meet a given business need are determining the approach, settling on the model structure, and articulating the assumptions.
- Selecting the approach means choosing the optimal level of granularity (for example, should the model be built at the loan level or the segment level?).
- Deciding on the structure means identifying the most suitable quantitative techniques (for example, should a decision tree, multinomial logistic regression, or deep learning model be used?).
- Stating the assumptions means describing both those that are related to the model structure (for instance, distribution of error terms) and those pertaining to the methodology (such as default expectations and the persistence of historical relationships over the forecast horizon).
Once the model is defined, the developers refine it progressively, subjecting it to rounds of rigorous testing both in and out of sample and adjusting it until it reliably produces plausible results.
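The in-sample/out-of-sample testing loop can be illustrated concretely. The sketch below fits a loan-level default model in Python with scikit-learn; the data is synthetic and the features (loan-to-value ratio and FICO score) are our own illustrative choices, not a prescribed FRG methodology:

```python
# Minimal sketch: fit a loan-level default model and test it out of sample.
# All data here is synthetic; a real model would use historical loan records.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import roc_auc_score
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(42)
n = 5000
# Hypothetical loan-level features: loan-to-value ratio and borrower FICO score.
ltv = rng.uniform(0.3, 1.2, n)
fico = rng.uniform(550, 820, n)
# Synthetic default indicator: higher LTV and lower FICO raise default odds.
logit = 2.5 * ltv - 0.01 * fico + 1.0
default = rng.binomial(1, 1 / (1 + np.exp(-logit)))

X = np.column_stack([ltv, fico])
# Hold out data the model never sees during fitting ("out of sample").
X_train, X_test, y_train, y_test = train_test_split(
    X, default, test_size=0.3, random_state=0)

model = LogisticRegression().fit(X_train, y_train)
# Compare in-sample and out-of-sample discrimination; a large gap
# between the two AUCs is a classic symptom of overfitting.
auc_in = roc_auc_score(y_train, model.predict_proba(X_train)[:, 1])
auc_out = roc_auc_score(y_test, model.predict_proba(X_test)[:, 1])
print(f"in-sample AUC:     {auc_in:.3f}")
print(f"out-of-sample AUC: {auc_out:.3f}")
```

The same comparison, repeated after each round of refinement, is what keeps developers honest about whether an adjustment improves the model or merely fits noise.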
In addition, an independent model validation team provides a second opinion on the efficacy of the model, which reduces the risk of confirmation bias on the part of the model developers; further refinement may be required as a result.
This iterative design, development, and validation process reduces the first kind of risk by improving the likelihood that the final version will give decision makers solid information.
The second kind of model risk, misusing the outputs, can be addressed in the implementation phase. Risk managers learned the hard way in the financial crisis of 2007-2008 that it is vitally important for decision makers to understand—not just intellectually but viscerally—that mathematical modeling is an art and models are subject to limitations. The future may be unlike the past. Understanding the limitations can help reduce the “unknown unknowns” and inhibit the misuse of model outputs.
Being aware of the potential for model risk is the first step. Acting to reduce it is the second. What hedges can you put in place to mitigate the risk?
First, design, develop, and test models in an open environment that welcomes objective opinions and rewards critical thinking. Allow enough time to complete multiple cycles of the process and refine the model.
Second, describe each model’s inherent limitations, as well as the underlying assumptions and design choices, in plain language that makes sense to business executives and risk managers who may not be quantitatively or technologically sophisticated.
Finally, consider engaging an independent third party with the expertise to review your model documentation, audit your modeling process, and validate your models.
For information on how FRG can help you defend your firm against model risk, please click here.
 Federal Reserve and OCC, “Supervisory Guidance on Model Risk Management,” Attachment to SR Letter 11-07 (April 4, 2011), page 3. Emphasis added.
 See for example the Federal Reserve’s SR letters 15-8 and 12-17.
Middle office jobs are fascinating. In performance analysis, spotting dubious returns and tracing them back to questionable inputs requires insight that seems intuitive or innate but results in fact from a keen understanding of markets, asset classes, investment strategies, security characteristics, and portfolio dynamics. Risk management additionally calls for imagination in scenario forecasting, math and programming skills in model development, judgment in prioritizing and mitigating identified risks, and managerial ability in monitoring exposures that continually shift with market movements and the firm’s portfolio decisions. Few careers so completely engage such a wide range of talents.
Less rewarding is handling the voluminous information that feeds the performance measurement system and risk management models. Financial data management is challenging for small banks and investment managers, and it becomes more and more difficult as the business grows organically, adding new accounts, entering new markets, and implementing new strategies that often use derivatives. Not to mention the extreme data integration issues that stem from business combinations!
And data management has no upside: nobody in your chain of command notices when it’s going well, and everyone reacts when it fails.
Nonetheless, reliable data is vital for informative performance evaluation and effective risk management, especially at the enterprise level. It doesn’t matter how hard it is to collect, format, sort, and reconcile the data from custodians and market data services as well as your firm’s own systems (all too often including spreadsheets) in multiple departments. Without timely, accurate, properly classified information on all the firm’s long and short positions across asset classes, markets, portfolios, issuers, and counterparties, you can’t know where you stand. You can’t answer questions. You can’t do your job.
Adding up the direct, explicit costs of managing data internally is a straightforward exercise; the general ledger keeps track of license fees. The indirect, implicit costs are less transparent. For example, they include the portion of IT, accounting, and administrative salaries and benefits attributable to mapping data to the performance measurement system and the risk models, coding multiple interfaces, maintaining the stress testing environment, correcting security identifiers and input errors—all the time-consuming details that go into supporting the middle office. The indirect costs also include ongoing managerial attention and the potential economic impact of mistakes that are inevitable if your company does not have adequate staffing and well-documented, repeatable, auditable processes in place to support smooth performance measurement and risk management operations.
You can’t delegate responsibility for the integrity of the raw input data provided by your firm’s front office, portfolio assistants, traders, and security accountants. But you can outsource the processing of that data to a proven provider of hosting services. And then your analysts can focus on the things they do best—not managing data but evaluating investment results and enterprise risk.
Learn more about FRG’s Hosting Services here.
Stop and think: how much does your firm — and your work group — depend upon electronic spreadsheets to get mission-critical assignments done? How badly could a spreadsheet error damage your company’s reputation? Its financial results? Your own career?
Here’s an example. Advising Tibco Software on its sale to Vista Equity Partners, Goldman Sachs used a spreadsheet that overstated its client’s shares outstanding and, as a result, overvalued the company by $100 million. The Wall Street Journal reported, “It’s not clear who created the spreadsheet. Representatives for Tibco and Goldman declined to comment. Vista couldn’t be reached for comment.” (October 16, 2014.) Nonetheless, it’s safe to assume that the analyst who prepared the spreadsheet was identified, along with his or her manager, and that they both were penalized for the mistake.
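The mechanics of that error are simple enough to show in a few lines. The figures below are purely illustrative (the actual share counts were not disclosed in the coverage); the point is that a per-share price multiplied by an overstated share count inflates the headline deal value dollar for dollar:

```python
# Illustrative only: hypothetical numbers showing how an overstated
# share count in a spreadsheet propagates into an overstated deal value.
price_per_share = 25.00                   # hypothetical offer price

true_shares = 160_000_000                 # hypothetical actual shares outstanding
stale_shares = true_shares + 4_000_000    # overstated count in the spreadsheet

true_value = price_per_share * true_shares
stale_value = price_per_share * stale_shares

overstatement = stale_value - true_value
print(f"overvaluation: ${overstatement:,.0f}")  # → overvaluation: $100,000,000
```

A four-million-share discrepancy, invisible in a crowded worksheet, is all it takes to misstate a valuation by nine figures.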
Spreadsheets proliferate in financial organizations for good reasons. They offer convenient, flexible, and surprisingly powerful ad hoc solutions to all sorts of analytical problems. But as risk managers we are an impatient lot, and all too often results-oriented people like us turn to spreadsheets even for production applications because we cannot wait for IT resources to become available. We know that the IT department has a hard-and-fast policy of disavowing the business lines’ spreadsheets, but that’s all right, we tell ourselves, because “it’s only temporary.” Then we turn our attention to another problem….
Let’s take it as axiomatic that the firm’s risk management operations should not exacerbate the firm’s exposure to operational risk.
You may already have established some controls to mitigate spreadsheet risk in production applications. For example, key spreadsheets may be encrypted, stored on dedicated, non-networked PCs with password protection, and backed up every night. And it might be said that spreadsheets are self-documenting because the macros and formulas are visible and the functions are vendor-defined. As a practical matter, however, only the analyst who originally developed a spreadsheet fully understands it. When she leaves, and other analysts add enhancements — possibly with new names for existing variables — the spreadsheet becomes much more difficult to troubleshoot.
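One lightweight control worth adding to that list is change detection: record a cryptographic hash of each approved production spreadsheet and compare it on a schedule, so untracked edits are flagged before they feed downstream processes. A minimal sketch in Python (the file name is hypothetical, and a demo byte string stands in for real spreadsheet contents):

```python
# Minimal sketch of a spreadsheet change-detection control: hash the
# approved version of a file and flag any subsequent modification.
import hashlib
from pathlib import Path

def file_sha256(path: Path) -> str:
    """Return the SHA-256 digest of a file's contents."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            h.update(chunk)
    return h.hexdigest()

# Demo with a throwaway file standing in for a production spreadsheet.
demo = Path("var_model.xlsx")           # hypothetical file name
demo.write_bytes(b"approved contents")
baseline = file_sha256(demo)            # record at approval time

demo.write_bytes(b"tampered contents")  # an untracked edit happens
changed = file_sha256(demo) != baseline
print("modified since approval:", changed)
demo.unlink()                           # clean up the demo file
```

Storing the baseline digests outside the analysts' reach turns an honor-system control into a verifiable one.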
We recommend taking these steps now:
- Starting in the risk management area, inventory all the spreadsheets in use across the firm’s operations.
- Confirm that every time a spreadsheet enters a workflow it is identified as such. Cross-check the workflow documentation and swim lane diagrams against the spreadsheet inventory and update them where necessary.
- Document every non-trivial spreadsheet, minimally including its purpose, the data sources, and any procedural tips.
- Select the operationally embedded spreadsheets whose failure would be most injurious to departmental objectives and downstream processes, and look for permanent solutions with proper controls.
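As a starting point for the inventory step above, a short script can sweep a shared drive for spreadsheet files and record where they live and when they were last modified. A sketch, with the search root and output file name as placeholder choices:

```python
# Sketch of a spreadsheet inventory sweep: walk a directory tree,
# collect every spreadsheet file, and write a CSV for follow-up review.
import csv
import os
from datetime import datetime, timezone

SPREADSHEET_EXTS = {".xls", ".xlsx", ".xlsm", ".ods"}

def inventory(root: str) -> list[dict]:
    """Return one record per spreadsheet file found under root."""
    records = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            ext = os.path.splitext(name)[1].lower()
            if ext not in SPREADSHEET_EXTS:
                continue
            path = os.path.join(dirpath, name)
            stat = os.stat(path)
            records.append({
                "path": path,
                "size_bytes": stat.st_size,
                "last_modified": datetime.fromtimestamp(
                    stat.st_mtime, tz=timezone.utc).isoformat(),
            })
    return records

if __name__ == "__main__":
    rows = inventory(".")  # start with the current directory; widen later
    with open("spreadsheet_inventory.csv", "w", newline="") as f:
        writer = csv.DictWriter(
            f, fieldnames=["path", "size_bytes", "last_modified"])
        writer.writeheader()
        writer.writerows(rows)
    print(f"found {len(rows)} spreadsheet files")
```

The resulting CSV gives the risk team a concrete worklist to annotate with owners, purposes, and workflow references.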
Whether or not it’s explicitly listed in your performance objectives, you owe it to your firm and yourself to migrate mission-critical spreadsheet applications to a reliable platform with codified controls. Systems development life cycle (SDLC) methodologies impose the discipline that’s needed in all phases of the project, from requirements analysis through deployment and maintenance, to minimize operational risk. This is not a trivial task; transferring the ad hoc functionality you currently have embedded in spreadsheets to a system that is well designed and amply supported takes determination. But the potential consequences of inaction are unacceptable. We strongly encourage you to take the necessary steps before a problem comes to light because a key person leaves the organization, a client spots a costly mistake, or — in the worst case — an operational crisis prevents the firm from meeting its contractual or regulatory obligations. And you lose your job.
The U.S. central bank finalized its rule exempting large and noncomplex banks from the qualitative component of the Comprehensive Capital Analysis and Review (CCAR) program. Bank holding companies and U.S. intermediate holding companies of foreign banking organizations that have total consolidated assets between $50 billion and $250 billion and total nonbank assets of less than $75 billion, and that are not identified as global systemically important banks, must conduct stress tests but are not required to undergo the qualitative assessment to which the largest banks are additionally subject. The Fed’s press release also states that the scenarios and instructions for the 2017 CCAR cycle will be released by the end of this week.
A 10th anniversary calls for a special celebration with family and friends. At our annual party on Dec. 16, our consultants — from as nearby as Cary and as far away as Kuala Lumpur, Malaysia — enjoyed a night of casino-style gaming, music and fun at Market Hall in Raleigh. Here’s to the next 10 years!
The Department of Labor’s fiduciary rule became effective in June with an implementation date that is now less than four months away. It is, of course, uncertain whether the regulation will stay in place under the new administration. President-elect Trump has named Andrew Puzder as Labor Secretary, and he may wish to buy time to repeal the rule by deferring the implementation date. According to Barron’s, however, “delaying or repealing the rule could easily take a year or more.”[i]
Major financial services institutions have already made key decisions and launched the related systems, training, and client communications projects to meet the current April 10, 2017 deadline. Smaller banks may be less prepared.
We’re not qualified to offer legal or regulatory advice; if your company is affected by the new rule, we urge you to call in a compliance consultant. Nonetheless, as financial and operational risk managers, we’d like to offer a few practical suggestions to help you get started.
All firms that offer investment advice to retirement savers have to meet certain fundamental requirements by the implementation date:
- Comply with the Impartial Conduct Standards set forth in the rule (act in the investor’s best interest, give no misleading information, and charge no more than reasonable compensation).
- Notify investors that your institution and its advisors are acting as fiduciaries, and describe the firm’s conflicts of interest.
- Appoint a person responsible for addressing conflicts of interest and ensuring compliance with the Impartial Conduct Standards.
The first step is to determine whether to continue offering commission-based products and services (Wells Fargo’s plan) or to convert to a fee-only basis (Merrill Lynch’s approach). This is a board-level decision, and the answer will depend upon such factors as the competitive environment and the bank’s ability to manage cultural as well as technological and process changes under deadline pressure.
The second step is to appoint the person who will have ongoing responsibility for resolving conflicts of interest and monitoring advisors’ investment recommendations. In addition to technical competence, desirable qualities include strong communications and negotiating skills.
The third step is to assemble the project team that will analyze the business requirements and make any necessary changes to the firm’s document management, accounting, and client reporting systems.
A sound fourth step, in our view, is to confirm that your firm has current know-your-customer and suitability documentation for every client relationship. You can’t defend any investment recommendations without this information. Now is a good time to make sure it’s complete and up-to-date.
The fifth step can be initiated in parallel with the fourth one. It is to develop advisor training materials in two areas: how to explain “fiduciary status” to their clients, and what the Impartial Conduct Standards require of them.
If it remains in force, the fiduciary rule will massively change the financial services industry over the next few years. Certainly, it will squeeze the profitability of retail investment advisory services, and it may also foster the spread of robo-advising. In the short term, however, banks primarily face regulatory and operational risk, and hoping for a timely reprieve is not a prudent compliance strategy. These first five steps mitigate the risk of getting caught short as the implementation date approaches.
Philip Lawton, CFA, CIPM, is a guest blogger for Financial Risk Group.
[i] “Trump’s DOL Pick: Fiduciary Friend or Foe?” Barron’s, November 9, 2016.
The Federal Reserve has announced its plans to test the ability of the country’s largest banks to weather a prolonged period of negative interest rates.
As reported in Bloomberg Business, the Fed wants reassurance from this year’s stress test that these banks can withstand the economic scenarios typical of a severe global recession, which could be accompanied by three-month bill rates falling below zero for three years.
Read more about this year’s stress tests in Bloomberg Business.
The experts at Financial Risk Group are uniquely positioned to help banks test their vulnerabilities. Contact us today for more information.
A new survey conducted by American Banker Research provides insight into the responsibilities of risk managers at banks with assets ranging from less than $100 million to more than $10 billion. They answered questions about the tasks that require the biggest chunk of their time, their top challenges, and more. To read the survey, visit AmericanBanker.com.
The Dodd-Frank law turned five years old in July, and its sponsors say they are pleased with what the financial legislation has accomplished. Former Sen. Christopher Dodd and former Rep. Barney Frank discussed the law, its importance and its shortfalls with The Wall Street Journal. Click below to read the article.