AI in FIs: Foundations of Machine Learning in Financial Risk

This blog series will focus on Financial Institutions as a premier business use case for Machine Learning through the lens of financial risk.

Today, opportunities exist for professionals to delegate time-intensive, dense, and complex tasks to machines. Machine Learning (ML), the branch of Artificial Intelligence (AI) in which systems learn from data rather than from explicit programming, is becoming much more robust as technological advances ease resource constraints.

Financial Institutions (FIs) are constantly under pressure to keep up with evolving technology and regulatory requirements. Compared with the tools used in the past, modern tools are more user-friendly and flexible, and they integrate easily with existing systems. This evolution is enabling advanced techniques such as ML to regain relevance across industries, including finance.

So, how does ML work? Imagine someone learning to throw a football. Over time, the aspiring quarterback learns to adjust the speed of the ball, the strength of the throw, and the trajectory to meet the expected routes of the receivers. In a similar way, a machine is trained to perform a specific task, such as clustering, by means of an algorithm. Just as the quarterback learns from a coach, the machine learns from the feedback of an ML algorithm. This expands the ways technology can add value to the business.
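To make that training loop concrete, here is a minimal sketch of one such algorithm, k-means clustering, in Python. It is illustrative only: the data values are invented, and production work would typically use a library such as scikit-learn rather than hand-rolled code.

```python
# A minimal 1-D k-means loop: like the quarterback adjusting each throw,
# the algorithm repeatedly adjusts its cluster centers until they settle.

def kmeans_1d(values, centers, iterations=20):
    for _ in range(iterations):
        # Assignment step: each point joins its nearest center
        groups = {c: [] for c in centers}
        for v in values:
            nearest = min(centers, key=lambda c: abs(v - c))
            groups[nearest].append(v)
        # Update step: each center moves to the mean of its group
        centers = [sum(g) / len(g) if g else c for c, g in groups.items()]
    return sorted(centers)

# Toy "transaction size" data with two obvious groups
data = [1.0, 1.2, 0.8, 9.8, 10.1, 10.3]
print(kmeans_1d(data, centers=[0.0, 5.0]))  # converges to about [1.0, 10.07]
```

Each pass mirrors the coaching loop: observe where the data lands, then adjust.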

What does this mean for FIs? The benefit of ML is that value can be added in areas where efficiency and accuracy are most critical. To accomplish this, a company must align four components: data, applications, infrastructure, and business needs.

 

A flow chart showing the relationship between technology and data

 

The level of data maturity of an FI determines its capacity to use both structured and unstructured data effectively. A well-established data governance framework lays the foundation for the proper use of data across a company. Once its structured data is effectively governed, sourced, analyzed, and managed, an FI can employ more advanced tools such as ML to supplement its internal operations. Unstructured data can also be used, but the company must first acquire the tools and computing power capable of handling it.

Many companies are turning to cloud computing both for business-as-usual processes and for deploying machine learning. Cloud infrastructure can be hosted on-premises or with public cloud services; the choice is largely a matter of preference. Either approach provides scalable computing power, which is essential when using ML algorithms to unlock the potential value of massive amounts of data.

Interested in reading more? Subscribe to the FRG blog to keep up with AI in FIs.

Hannah Wiser is an assistant consultant with FRG. After graduating with her Master’s in Quantitative Economics and Econometrics from East Carolina University in 2019, she joined FRG and has worked on projects focusing on technical communication and data governance.

 

Chart shows relevant terms and definitions

 

Data Management – Leveraging Data for a Competitive Advantage

This is the first in a series of blogs that explore how data can be an asset or a risk to organizations in an uncertain economic climate.

Humanity has always valued novelty. Since the advent of the Digital Age, this preference has driven change at an astronomical pace. For example, more data was generated in the last two years than in all of prior human history, a figure made more staggering by Machine Learning and Artificial Intelligence tools that allow users to access and analyze data as never before. The question now is: how can business leaders and investors best make sense of this information and use it to their competitive advantage?

Traditionally, access to good data has been a limiting factor. Revolutionary business strategies were reserved for those who knew how to obtain, prepare, and analyze it. While top-tier decision making is still data- and insight-driven, today’s data challenges are characterized more by glut than scarcity, both in terms of overall volume of information and the tools available to make sense of it. As of today, only 0.5% of data that is produced is even analyzed.

This overabundance of information and tech tools has, ironically, led to greater uncertainty for business leaders. Massive data sets and powerful, user-friendly tools often mask underlying issues: many firms maintain and process duplicates of their data, creating silos of critical but unconnected data that must be sorted and reconciled. Analysts still spend 80% of their time collecting and preparing their data and only 20% analyzing it.

Global interconnectivity is making the world smaller and more competitive. Regulators, who understand the power of data, are increasing controls over it. Now, more than ever, it is critical for firms to take action. To remain competitive, organizations must understand the critical data that drives their business so they can use it, along with alternative data sets, for future decision making; otherwise they face obsolescence. These are not just internal concerns. Clients are also requesting more customized services and demanding to understand how firms are using their information. Firms must identify critical data and recognize that not all data is, or should be, treated the same, so they can extract the full power of the information and meet client and regulatory requirements.

Let’s picture data as an onion. As the core of the onion supports its outer layers, the ‘core’ or critical enterprise data supports all the functions of a business. When the core is strong, so is the rest of the structure. When the core is contaminated or rotten – that is a problem, for the onion and for your company.

A comparison picture showing an onion with healthy core data vs. an onion with a contaminated core.

Data that is core to a business – information like client IDs, addresses, products, and positions, to name a few examples – must be solid and healthy enough to support the outer layers of data use and reporting in the organization. This enterprise data must be defined, clean, and unique; otherwise the firm will waste time cleaning and reconciling it, and the client, business, and regulatory reports it supports will be inaccurate.
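As a concrete illustration, a basic health check on core data can be sketched in a few lines of Python. The records and field names below are hypothetical, chosen only to show the idea of flagging undefined (missing) values and non-unique client IDs before the data feeds reports further out in the onion.

```python
# Sketch of a "core data" health check on hypothetical client records:
# flag missing field values and duplicate client IDs.

records = [
    {"client_id": "C001", "address": "1 Main St", "product": "Loan"},
    {"client_id": "C002", "address": "",          "product": "Deposit"},
    {"client_id": "C001", "address": "1 Main St", "product": "Loan"},  # duplicate
]

def core_data_issues(rows, key="client_id"):
    issues, seen = [], set()
    for i, row in enumerate(rows):
        if row[key] in seen:
            issues.append((i, f"duplicate {key}: {row[key]}"))
        seen.add(row[key])
        for field, value in row.items():
            if not value:
                issues.append((i, f"missing value for {field}"))
    return issues

for row_index, problem in core_data_issues(records):
    print(row_index, problem)
```

A real governance program would run checks like these continuously, against defined data quality rules, rather than ad hoc.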

How do you source, define, and store your data so you can cleanly extract the pieces you need? Look at the onion again. You could take the chainsaw approach and slice the whole onion, which would give you instant access to everything inside, good and contaminated, and would probably spoil your dish. Likewise, if you use bad data at the core, any calculations you perform on it, or reports aggregating it, will be incorrect. If you need a clean slice of onion for a specific recipe (or calculated data for a particular report), the precision and cleanliness of the slice (good core data and a unique contextual definition) are key.

Once your core data is unique, supported, and available, clients, business, and corporate users can combine it with alternative and non-traditional data sets to extract information, enhance it, and add value. As demand for new “recipes” of data (for management, client, or regulatory reporting) is ever increasing, firms that do not clean up and leverage their core data effectively will become obsolete. These demands range from data needed for instant access and client reporting across different form factors (e.g., web, iOS, and Android apps), to data visualization and manipulation tools for employees analyzing new and enhanced information to determine trends. Demand also stems from the numerous requirements of the complex patchwork of regional financial regulations across the globe. Many different users, many different recipes, all reliant on the health of the core data (the onion core).

What is the actionable advice when you read a headline like: “A recent study in the Harvard Business Review found that over 92% of surveyed firms agreed that data analytics for decision making will be more important 2 years from now”? We have some ideas. In this blog series, FRG Data Advisory & Analytics will take you through several use cases to outline what data is foundational or core to business operations and how to achieve the contextual precision demanded from the market and regulators within our current environment of uncertainty, highlighting both how data can be an asset, or a potential risk, if not treated appropriately.

Dessa Glasser, Ph.D., is an FRG Principal Consultant, with 30+ years experience designing and implementing innovative solutions and organizations in data, risk, and analytics. She leads the Data Advisory & Analytics Team for FRG and focuses on data, analytics and regulatory solutions for clients.  

Edward Hanlon is a Senior Consultant and Engagement Manager on FRG’s Data Advisory & Analytics Team. He focuses on development and implementation of data strategy solutions for FRG, leveraging previous experience launching new Digital products and reengineering operational models as a Digital Technology platform owner and program lead in financial services.

 

Economic Impact Analysis for Credit Unions

In a recent webinar I participated in with SAS we discussed Economic Impact Analysis (EIA). While EIA is similar in concept to stress testing, its main goal is to allow credit unions to move quickly to evaluate economic changes to their portfolio—such as those brought about by a crisis like the COVID-19 pandemic.

There are four main components to EIA.

  1. Portfolio data: At a minimum this needs to be segment-level data with loss performance through time. If needed, it can be augmented with external data.
  2. Scenarios: Multiple economic possibilities are necessary to help assess the timing and magnitude of potential future loss.
  3. Models or methodologies: These are required to link scenarios to the portfolio and forecast loss amounts.
  4. Visualization of results: This is essential to clearly understand portfolio loss behavior. While tables are useful, nothing illustrates odd behavior better than a picture (e.g., a box plot, tree map, or histogram).

A credit union looking for a practical approach for getting started should consider the following steps:

  • Start with segment level data instead of account level. This should reduce the common complexities that arise when sourcing and cleaning account level data.
  • Develop segment level models or methodologies to capture the impacts of macroeconomic changes.  These can be simple provided they incorporate relationships to macroeconomic elements.
  • Create multiple scenarios. The more the better. Different scenarios will provide different insights into how the portfolio reacts to changing macroeconomic environments.
  • Execute models and explore results. This is where (I believe) the fun begins. Be curious – change portfolio assumptions (e.g., growth or run-off), and scenarios, to see how losses will react.
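The steps above can be sketched in miniature. The segment sensitivities and scenario paths below are invented for illustration, not calibrated estimates; a real methodology would be fit to the credit union’s own loss history.

```python
# A hypothetical segment-level methodology: the quarterly loss rate responds
# linearly to the change in unemployment under each scenario. All numbers
# (baseline rates, betas, scenario paths) are illustrative only.

SEGMENTS = {   # segment: (baseline quarterly loss rate, unemployment beta)
    "auto":     (0.004, 0.0015),
    "mortgage": (0.001, 0.0010),
    "card":     (0.008, 0.0030),
}

SCENARIOS = {  # scenario: quarterly change in unemployment (pct points)
    "base":      [0.0, 0.0, 0.1, 0.0],
    "recession": [0.5, 1.5, 1.0, -0.5],
}

def forecast_losses(balance, segment, scenario):
    base_rate, beta = SEGMENTS[segment]
    return [balance * (base_rate + beta * du) for du in SCENARIOS[scenario]]

# Explore results: total four-quarter loss on a $100MM auto segment
for name in SCENARIOS:
    total = sum(forecast_losses(100_000_000, "auto", name))
    print(f"{name}: ${total:,.0f}")
```

Even a linear link like this, run across several scenarios, starts to reveal how a portfolio reacts to a changing economy.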

Now is the time to act, to gain an understanding about the economy’s impact on one’s portfolio. But it is worth mentioning this is also an investment into the future. As mentioned earlier, EIA has its roots in stress testing. By creating an EIA process now, a credit union not only better positions itself to build a robust stress test platform but also has the foundation to tackle CECL.

To view the webinar on demand, please visit NAFCU.

Jonathan Leonardelli, FRM, Director of Business Analytics for the Financial Risk Group, leads the group responsible for model development, data science, and technical communication. He has over 15 years’ experience in the area of financial risk.

Avoiding Bureaucratic Phrasing


Employees developed plans during the course of the project in order to create a standardized process with respect to regulation guidelines.

Did you understand that sentence on the first read-through? It is filled with bureaucratic phrasing, which makes the information more complex than necessary.

In the workplace, “bureaucratic” means involving complicated rules and processes that make something unnecessarily slow and difficult. People tend to use this style of phrasing because they believe there is permanence in writing. Say something and it’s gone, but write it down and it’s with us forever.

When people believe their writing is out there for all to see, they want to sound as professional and as knowledgeable as possible. But adding bureaucratic language isn’t the best way to sound like an expert. Many complex phrases read better when they are stripped down into simple words. For example, in the original sentence above, “in order to” can be reduced to “to” and “during the course of” can be simplified to “during”:

Employees developed plans during the project to create a standardized process with respect to regulation guidelines.

Using bureaucratic phrasing can make readers feel inadequate and indirectly exclude them from the conversation. This is why using plain, straightforward language in your writing is recommended instead.

The key is learning how to turn those overly complex phrases into simple words that mean the same thing. Here are some examples:

Bureaucratic Phrase           Simple Word / Phrase
Along the lines of            Like
As of this date               Yet, still, or now
At all times                  Always
Due to the fact that          Because
Concerning the matter of      About
For the purpose of            For, to
In spite of the fact that     Although
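For documents in electronic form, substitutions like those in the table can even be mechanized. The sketch below applies a small phrase map with a simple search-and-replace pass (capitalization handling is deliberately minimal):

```python
# Replace common bureaucratic phrases with their simple equivalents.

import re

REPLACEMENTS = {
    "in order to": "to",
    "during the course of": "during",
    "due to the fact that": "because",
    "for the purpose of": "for",
    "at all times": "always",
}

def simplify(text):
    for phrase, simple in REPLACEMENTS.items():
        text = re.sub(phrase, simple, text, flags=re.IGNORECASE)
    return text

print(simplify("Employees developed plans during the course of the project "
               "in order to create a standardized process."))
```

No script replaces editorial judgment, but a pass like this catches the most common offenders.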

One guideline is to avoid words and phrases that you would not use in everyday speech. You would never say, “May I look over your paper in the near future in order to review it?” Why write it?

The goal of any documentation, whether it be a technical design document or an email, is to state your main point in a simple manner. Your readers should be able to easily find important information, quickly understand what information they should store for later use, and immediately recognize what is being asked of them. Avoiding bureaucratic phrasing can help you accomplish this.

Resources:

  • Hopkins, Graham. “Why Bureaucratic Jargon Is Just a Pompous Waste of Words.” The Guardian, 12 Sept. 2000.
  • Johnson-Sheehan, Richard. Technical Communication Today: Special Edition for Society for Technical Communication Foundation Certification. Fifth Edition.

Samantha Zerger, business analytics consultant with FRG, is skilled in technical writing. Since graduating from the North Carolina State University’s Financial Mathematics Master’s program in 2017 and joining FRG, she has taken on leadership roles in developing project documentation as well as improving internal documentation processes.

RELATED:

Improving Business Email

 

How COVID-19 Could Affect Private Capital Investors

A new blog by Preqin explores what COVID-19 could mean for private capital investors.

FRG and Preqin, an industry-leading provider of data, analytics and insights for the alternative assets community, partnered to develop a novel cash flow prediction model. The model is guided by FRG’s innovative methodology and powered by Preqin’s fund-level cash flow data.

Analysts used this tool in conjunction with the release of FRG’s Pandemic Economic Scenario to assess the impact of a recession triggered by the novel coronavirus on capital calls, distributions and net cash flows.

In the blog, Preqin’s Jonathon Furer examines an analysis created by FRG, focusing on the pandemic’s effect on 2017-2019 vintage funds, which represent 72% of the $2.63tn in callable dry powder the private capital industry has raised since 2000. “Assuming the global economy undergoes a significant but brief recession, and then recovers, our model suggests GPs will respond in two stages,” Furer writes.

Read about the projected stages in the full analysis, Why COVID-19 Means Investors Should Expect Lower Capital Calls and Distributions in 2020.

FRG has 20+ years of experience applying stress testing to portfolios for banks and asset allocators. We developed this unique model to enable investors to stress test private capital portfolios against a wide range of macroeconomic shocks. We are ready to help investors looking to better understand portfolio dynamics for capital planning and pacing, or risk control for a black swan event.

Download the Pandemic Economic Scenario or get in contact with Preqin at info@preqin.com for the most accurate private capital cash flow forecasting model.

If FRG can help you better understand the effects of macroeconomic shocks on your private capital portfolios, contact us at info@frgrisk.com.


Is a Pandemic Scenario Just a Recession Scenario?



Recently, I wrote about how a pandemic might be a useful scenario to have for scenario analysis. As I thought about how I might design such a scenario I considered: should I assume a global recession for the pandemic scenario?

A pandemic, by definition, is an outbreak of a disease that affects people around the globe. Therefore, it is reasonable to think that it would slow the flow of goods and services through the world. Repercussions would be felt everywhere – from businesses reliant on tourism and travel to companies dependent on products manufactured in countries hit the hardest.

For an initial pass, using a recession seems sensible. However, I believe this “shortcut” omits a key trait needed for scenario development: creativity.

The best scenarios, I find, come from brainstorming sessions. These sessions allow challenges to be made to status quo and preconceptions. They also help identify risk and opportunity.

To immediately consider a recession scenario as “the pandemic scenario,” then, might not be advantageous in the long run.

As an exercise, I challenged myself to come up with questions that aren’t immediately answered when assuming a generic recession. Some that I asked were:

  • How do customers use my business? Do they need to be physically present to purchase my goods or use my services?
  • How will my business be impacted if my employees are not able to come into work?
  • What will happen to my business if there is a temporary shortage of a product I need? What will happen if there is a drawn-out shortage?
  • How dependent is my business on goods and services provided by other countries? Do these countries have processes in place to contain or slow down the spread of the disease?
  • Does my business reside in a region of the country that makes it more susceptible to the impact of a pandemic (e.g., ports, major airports, large manufacturing)?
  • How are my products and services used by other countries?
  • How can my company use technology to mitigate the impacts of a pandemic?
  • Is there a difference in the impact to my company if the pandemic is slow moving versus fast moving?

These are just a few of the many questions to consider for this analysis. Ultimately, the choice of whether to use a recession or not rests with the scenario development committee. To make the most informed decision, I would urge the committee to make questions like these a part of the discussion rather than taking the “shortcut” approach.

Jonathan Leonardelli, FRM, Director of Business Analytics for FRG, leads the group responsible for model development, data science, documentation, testing, and training. He has over 15 years’ experience in the area of financial risk.


RELATED:

Do You Have a Pandemic Scenario?

VOR Scenario Builder: Obtaining Business Insight From Scenarios

 

The Financial Risk Group Is Now FRG

We’re making it official: After more than a decade of operating as “The Financial Risk Group,” we’re changing our name to reflect what our clients have called us since the early days. We are excited to formally debut our streamlined “FRG” brand and logo.

Our new look is a natural progression from where we started 14 years ago, when the three founding partners of this company set a lofty goal. We wanted to become the premier risk management consulting company. It seemed ambitious, considering we were operating out of Ron Holanek’s basement at the time, but we knew we had at least two things going for us: a solid business plan and a drive to do whatever it took to deliver success for our clients.

And look at us now. It would take a while to list everything we’ve accomplished over the last decade-plus, but here’s a quick rundown of some of the items we’ve crossed off the company bucket list since 2006.

  • We’ve grown our numbers from the original three to more than 50 talented risk consultants, analysts, and developers.
  • We moved out of the basement (it would have been a tight fit, considering). We settled in historic downtown Cary in 2008, but quickly spilled out of our main office there and into several satellite locations. In 2018 we bought an older building a few blocks away and renovated it into a gleaming modern office hub for our US headquarters.
  • We opened offices in Toronto, Canada and Kuala Lumpur, Malaysia, to better serve our clients around the world.
  • We opened several new business units, expanding on our original core focus of delivering automated technology solutions. Adding dedicated Data and Risk, Business Analytics, and Platform Hosting teams enlarged our wheelhouse, so that we have experts that can walk our clients through the entire lifecycle of risk management programs. (Shameless plug: you can learn more about a number of them via a series of videos that are sprinkled throughout the website). We now also work with institutional investors on innovative models and product offerings to help streamline processes and drive excess returns.
  • We formalized our NEET (New Employee Excellence Training) apprenticeship program, so we can nurture and enhance the specific blend of skills that risk management professionals need to solve real-world business challenges. The program has struck a chord with our clients, so we built a program to recruit and develop risk management talent for them, as well.

Obviously, we couldn’t have done any of this without continued trust and support from our clients. Our clientele represents a cross section of the world’s largest banking, capital markets, insurance, energy and commodity firms – stretching across continents and across industries – and we recognize that they’re some very smart people. When they talk, we listen, and what they have been saying for a few years now is that the brand we started with in 2006 should evolve with the evolution of the company.

It is natural for people to streamline words into acronyms.  In our industry, there are many, and knowing them is very important to our job.  Our clients, partners, and even our internal teams used FRG from day one, but now is the time to make it official.  By rebranding and fully embracing the FRG name, we hope that it, too, becomes a well-known acronym in the risk management space, one that people equate with integrity and quality of work.

So we’re celebrating 2020 with the new name, a new look, and a new logo. But it’s like they say. The more things change, the more they stay the same. That’s why you can be sure that our core values, our core principle – to fulfill our clients’ needs, while surpassing their expectations – still guide us every day. We are our reputation. We are FRG.

Mike Forno is a Partner and Senior Director of Sales with FRG.

 

Stress Testing Private Equity


FRG, in partnership with Preqin, has developed a system for simulating cash flows for private capital investments (PCF). PCF allows an analyst to change assumptions about future economic scenarios and investigate the resulting changes in output cash flows. This post picks a venture fund, shocks the economy with a mild recession in the following quarters, and views the change in cash flow projections.

FRG develops scenarios for our clients. Our most often used are the “Growth” (or “Base”) scenario and the “Recession” scenario. Both are based on the Federal Reserve’s CCAR scenarios, “Base” and “Adverse”, published yearly and used for banking stress tests.

The “Growth” scenario (using the FED “Base” scenario) assumes economic growth more or less in line with recent experience.

The “Recession” scenario (FED “Adverse”) contains a mild recession starting late 2019, bottoming in Q2 2020.  GDP recovers back to its starting value in Q2 2021.  The recovery back to trend line (potential) GDP goes through Q2 2023.

Real GDP Growth chart

 

The economic drawdown is mild; the economy loses only 1.4% from the high.

Start Date    Trough Date    Recovery Date    Full Potential    Depth
Q4 2019       Q2 2020        Q2 2021          Q2 2023           -1.4%

Equity market returns are a strong driver of performance in private capital. The total equity market returns in the scenario include a 34% drawdown in the index. The market bottoms in Q1 2022 and recovers to new highs by Q1 2024.

This drawdown is shallow compared to previous history, and the recovery period is shorter:

Begin Date    Trough Date    Recovery Date    Depth    Total Length (qtrs)    Peak to Trough (qtrs)    Trough to Recovery (qtrs)
06/30/2000    09/30/2002     12/31/2006       -47%     27                     10                       17
12/31/2007    03/31/2009     03/31/2013       -49%     22                     6                        16
12/31/2019    03/31/2022     03/31/2024       -34%     18                     10                       8

The .COM and Global Financial Crisis (GFC) recessions took off nearly 50% of the market value; this recession draws down only 34%. The time from peak to trough is 10 and 6 quarters for the .COM and GFC episodes, respectively; here we are in line with the .COM crash, with a 10-quarter peak-to-trough period. The recovery, at 8 quarters versus 17 and 16, is nearly twice as fast as either of the recent large drawdowns.
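For readers who want to reproduce statistics like those in the table, the sketch below derives depth, peak-to-trough length, and trough-to-recovery length from a quarterly index series. The series used here is a stylized stand-in shaped to match this scenario’s 34% drawdown, not actual market data.

```python
# Derive drawdown depth, peak-to-trough quarters, and trough-to-recovery
# quarters from a quarterly index series. Assumes one major drawdown and
# that the series eventually regains its prior peak.

def drawdown_stats(levels):
    peak_i = trough_i = 0
    for i in range(1, len(levels)):
        if levels[i] > levels[peak_i] and trough_i == peak_i:
            peak_i = trough_i = i        # still making new highs
        elif levels[i] < levels[trough_i]:
            trough_i = i                 # new, deeper trough
    recovery_i = next(i for i in range(trough_i, len(levels))
                      if levels[i] >= levels[peak_i])
    depth = levels[trough_i] / levels[peak_i] - 1
    return depth, trough_i - peak_i, recovery_i - trough_i

# Stylized index: peak at 100, 10 quarters down to 66, 8 quarters back up
stylized_index = [100, 97, 94, 91, 88, 85, 82, 78, 74, 70, 66,
                  70, 74, 79, 84, 89, 94, 99, 101]
depth, to_trough, to_recovery = drawdown_stats(stylized_index)
print(f"depth {depth:.0%}, peak-to-trough {to_trough}q, recovery {to_recovery}q")
```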

We start by picking a 2016-vintage venture capital fund. This fund has called around 89% of its committed capital, has an RVPI of 0.85, and currently sports about an 18% IRR. For this exercise, we assume a $10,000,000 commitment.

Feeding the two scenarios, this fund, and a few other estimates into the PCF engine, we can see a dramatic shift in expected J-curve.

Under the “Growth” scenario, the fund’s payback date (the date when cumulative cash flow turns positive) is Q1 2023. The recession prolongs the payback period, pushing the expected payback date to Q3 2025, an additional 2.5 years. Further, the total cash returned to investors is much lower.

This lower cash returned, together with the lengthening of the payback period, has a dramatic effect on the fund IRR.

That small recession drops the expected IRR of the fund by a full 7 percentage points annualized. The distribution shown in the box-and-whisker plot above illustrates the dramatic shift in possible outcomes. Whereas before there were only a few scenarios in which the fund returned a negative IRR, in the recession nearly a quarter of all scenarios produce a negative return. In more than a few cases the fund’s IRR is well below -10% annually!
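The two fund metrics discussed here, payback quarter and IRR, can be computed from a net cash flow series as sketched below. The quarterly flows are invented to show a J-curve shape; they are not output from the PCF engine.

```python
# Payback quarter and annualized IRR from hypothetical quarterly net cash
# flows (negative = capital calls, positive = distributions).

def payback_quarter(flows):
    """First quarter at which cumulative cash flow is non-negative."""
    total = 0.0
    for q, cf in enumerate(flows):
        total += cf
        if total >= 0:
            return q
    return None

def annual_irr(flows, lo=-0.99, hi=1.0, tol=1e-8):
    """Bisection for the quarterly IRR, then annualize; assumes one sign change."""
    npv = lambda r: sum(cf / (1 + r) ** q for q, cf in enumerate(flows))
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if npv(mid) > 0:
            lo = mid        # rate too low: NPV still positive
        else:
            hi = mid
    return (1 + (lo + hi) / 2) ** 4 - 1

j_curve = [-4, -3, -2, -1, 0, 1, 2, 3, 4, 5, 4]   # $MM per quarter
print(payback_quarter(j_curve))                   # cumulative flow turns positive in quarter 8
print(f"annualized IRR: {annual_irr(j_curve):.1%}")
```

Shifting the distribution flows later, or shrinking them, lengthens payback and pulls the IRR down, which is exactly the recession effect described above.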

This type of analysis should provide investors in private capital food for thought.  How well do your return expectations hold up during an economic slowdown?  What does the distribution of expected cash flows and returns tell you about the risk in your portfolio?

At FRG, we specialize in helping people answer these questions.  If you would like to learn more, please visit www.frgrisk.com/vor-pcf  or contact us.

Dominic Pazzula is a Director with FRG, specializing in asset allocation and risk management. He has more than 15 years of experience evaluating risk at a portfolio level and managing asset allocation funds. He is responsible for product design of FRG’s asset allocation software offerings and consults with clients helping to apply the latest technologies to solve their risk, reporting, and allocation challenges.

 

 

CECL Preparation: How Embracing SR 11-7 Guidelines Can Support the CECL Process

The Board of Governors of the Federal Reserve System’s SR 11-7 supervisory guidance (2011) provides an effective model risk management framework for financial institutions (FIs). SR 11-7 covers everything from the definition of a model to the robust policies and procedures that should exist within a model risk management framework. To reduce model risk, any FI should consider following the guidance throughout internal and regulatory processes, as its guidelines are comprehensive and reflect a banking industry standard.

The following items and quotations represent an overview of the SR 11-7 guidelines (Board of Governors of the Federal Reserve System, 2011):

  1. The definition of a model – “the term model refers to a quantitative method, system, or approach that applies statistical, economic, financial, or mathematical theories, techniques, and assumptions to process input data into quantitative estimates.”
  2. A focus on the purpose/use of a model – “even a fundamentally sound model producing accurate outputs consistent with the design objective of the model may exhibit high model risk if it is misapplied or misused.”
  3. The three elements of model risk management:
    • Robust model development, implementation, and use – “the design, theory, logic underlying the model should be well documented and generally supported by published research and sound industry practice.”
    • Sound model validation process – “an effective validation framework should include three core elements: evaluation of conceptual soundness, …, ongoing monitoring, …, and benchmarking, outcomes analysis, …”
    • Governance – “a strong governance framework provides explicit support and structure to risk management functions through policies defining relevant risk management activities, procedures that implement those policies, allocation of resources, and mechanisms for evaluating whether policies and procedures are being carried out as specified.”

Much of what the SR 11-7 guidelines discuss applies to the new requirements introduced by the accounting standard CECL (FASB, 2016). Any FI subject to CECL must provide explanations, justifications, and rationales for the entirety of the CECL process, including (but not limited to) model development, validation, and governance. The SR 11-7 guidelines will help FIs develop effective CECL processes that limit model risk.

Some considerations from the SR 11-7 guidelines in regards to the components of CECL include (but are not limited to):

  • Determining appropriateness of data and models for CECL purposes. Existing processes may need to be modified due to some differing CECL requirements (e.g., life of loan loss estimation).
  • Completing comprehensive documentation and testing of model development processes. Existing documentation may need to be updated to comply with CECL (e.g., new models or implementation processes).
  • Accounting for model uncertainty and inaccuracy through the understanding of potential limitations/assumptions. Existing model documentation may need to be re-evaluated to determine if new limitations/assumptions exist under CECL.
  • Ensuring validation independence from model development. Existing validation groups may need to be further separated from model development (e.g., external validators).
  • Developing a strong governance framework specifically for CECL purposes. Existing policies/procedures may need to be modified to ensure CECL processes are being covered.

The SR 11-7 guidelines can provide FIs with the information they need to start their CECL process. Although not mandated, following these guidelines is important for reducing model risk and for establishing standards that all teams within and across FIs can follow and regard as a true industry standard.

Resources:

  1. Board of Governors of the Federal Reserve System. “SR 11-7 Guidance on Model Risk Management”. April 4, 2011.
  2. Daniel Brown and Dr. Craig Peters. “New Impairment Model: Governance Considerations”. Moody’s Analytics Risk Perspectives. The Convergence of Risk, Finance, and Accounting: CECL. Volume VIII. November 2016.
  3. Financial Accounting Standards Board (FASB). Financial Instruments – Credit Losses (Topic 326). No. 2016-13. June 2016.

Samantha Zerger, business analytics consultant with FRG, is skilled in technical writing. Since graduating from the North Carolina State University’s Financial Mathematics Master’s program in 2017 and joining FRG, she has taken on leadership roles in developing project documentation as well as improving internal documentation processes.

 

Improve Your Problem-Solving Skills

This is the fifth post in an occasional series about the importance of technical communication in the workplace.

“Work organisations are not only using and applying knowledge produced in the university but they are also producing, transforming, and managing knowledge by themselves to create innovations” (Tynjälä, Slotte, Nieminen, Lonka, & Olkinuora, 2006).

Problem-solving skills are rooted in learning how to think, not what to think. Most classes in high schools and colleges teach you what to think (e.g., history dates, mathematical equations, grammar rules), but you must develop problem-solving skills to learn how to think.

In the technical workplace, you are expected to take a problem and come up with a solution, possibly one that has never been thought of before. Employers are looking for people with the skills to do exactly that. Because of this, most interview processes inevitably include at least one problem-solving question.

  • “How have you handled a problem in your past? What was the result?”
  • “How would you settle the concerns of a client?”
  • “How would you handle a tight deadline on a project?”

The way you answer the problem-solving question usually gives the interviewer a good sense of your problem-solving skills. Unfortunately for the interviewee, problem solving encompasses a broad skill set made up of:

  • Active listening: in order to identify that there is a problem
  • Research: in order to identify the cause of the problem
  • Analysis: in order to fully understand the problem
  • Creativity: in order to come up with a solution, either based on your current knowledge (intuitively) or using creative thinking skills (systematically)
  • Decision making: in order to make a decision on how to solve the problem
  • Communication: in order to communicate the issue or your solution to others
  • Teamwork: in order to work with others to solve the problem
  • Dependability: in order to solve the problem in a timely manner

So how do you, as the interviewee, convey that you have good problem-solving skills? First, acknowledge the skill set needed to solve the problem relating to each step in the problem-solving process:

Step in Problem Solving | Skill Set Needed
1. Identifying the problem | Active listening, research
2. Understanding and structuring the problem | Analysis
3. Searching for possible solutions or coming up with your own solution | Creativity, communication
4. Making a decision | Decision making
5. Implementing a solution | Teamwork, dependability, communication
6. Monitoring the problem and seeking feedback | Active listening, dependability, communication

Then, note how you plan to improve, or are already improving, your problem-solving skills. This may include gaining more technical knowledge in your field, putting yourself in new situations that require problem solving, observing others known for their good problem-solving skills, or simply practicing problems on your own. Problem solving involves a diverse skill set and is key to surviving in a technical workplace.

Resources:

  1. Problem-Solving Skills: Definitions and Examples. Indeed Career Guide.
  2. Tynjälä, Päivi & Slotte, Virpi & Nieminen, Juha & Lonka, Kirsti & Olkinuora, Erkki. (2006). From university to working life: Graduates’ workplace skills in practice.

 

Samantha Zerger, business analytics consultant with the Financial Risk Group, is skilled in technical writing. Since graduating from the North Carolina State University’s Financial Mathematics Master’s program in 2017 and joining FRG, she has taken on leadership roles in developing project documentation as well as improving internal documentation processes.

 

 
