Introduction
- Credit plays a crucial role in both private and public sectors, facilitating economic activities and funding sources. Over the years, credit has evolved significantly, impacting its volume, provision channels, types, and regulatory framework.
- Credit risk, the likelihood of borrowers failing to meet debt obligations, is a major concern across industries. While commonly associated with financial institutions, it also affects non-financial sectors, where firms extend and receive credit.
- Statistics from the Bank for International Settlements indicate substantial credit expansion in the Eurozone and the USA, with private debt exceeding 150% of GDP in both regions. Notably, credit composition varies: the Eurozone sees more credit to non-financial corporations, while the USA has higher household credit. The banking sector remains a significant contributor to credit provision, though new financial instruments and platforms, such as credit derivatives and peer-to-peer lending, introduce additional complexities and risks.
- Managing credit risk has become multifaceted, involving regulatory, methodological, and technical challenges, impacting various organizations exposed to credit risk.
- Two major developments have significantly impacted credit risk management in recent years. Firstly, the regulatory framework, particularly concerning credit institutions like banks, underwent significant changes. The Basel Committee on Banking Supervision introduced the Basel I Accord in 1988, establishing basic capital requirements based on asset risk. This was succeeded by the more comprehensive Basel II Accord in 2004, covering credit, operational, and market risks, and emphasizing supervisory guidelines and market discipline. Basel III, developed in response to the 2007-2008 financial crisis, enforces stricter capital, liquidity, and leverage requirements. These regulations have heavily shaped credit risk management practices in the financial sector.
- Secondly, there has been a widespread adoption of analytical methods for credit risk modeling and management. Early methods relied on empirical evaluation systems like CAMEL, which combined various factors but lacked objectivity. The introduction of advanced analytical approaches dates back to the 1960s and 1970s with bankruptcy prediction models and the 1980s with credit scoring models like the FICO score. Tighter regulatory frameworks incentivized the adoption of analytical models, which were seen as more sophisticated and reliable. This trend was further fueled by advancements in mathematical finance, data science, and operations research. Analytical credit risk modeling applies to both individual loans and portfolios, aiding in assessing creditworthiness, estimating expected losses, and guiding portfolio management decisions.
- Despite these advancements, credit risk models faced criticism during the global credit crisis of 2007-2008 for their role in facilitating excessive credit expansion and failing to accurately estimate risk exposure. However, they have undeniably played a crucial role in facilitating access to credit by reducing risk premiums for borrowers.
The CAMEL System
- When bank credit analysts have all the required information, including multi-year financial data, they typically use the CAMEL system or its variations to assess credit risk. Though initially designed by U.S. bank supervisors for examination, it’s now widely embraced by rating agencies, counterparty analysts, and even equity analysts for valuing bank stocks.
- CAMEL is a simplified acronym that doesn’t encompass all necessary aspects or allocate appropriate weights to them. It represents the five critical factors in bank financial health:
- C for Capital – This reflects the bank’s ability to absorb losses and remain solvent. The bank’s capital adequacy is determined by both the amount and quality of available capital. Tier 1 capital, such as common equity, holds more significance than tier 2 capital, such as subordinated debt. Factors such as adhering to interest and dividend regulations, growth strategies, economic conditions, and investment focus are also analyzed.
- A for Asset Quality – The quality of a bank’s assets, such as loans and investments, plays a pivotal role. High-quality assets reduce the risk of defaults and losses.
- M for Management – The bank’s leadership and management practices are fundamental to ensure prudent risk-taking, strategic planning, and efficient operations.
- E for Earnings – A bank’s profitability and consistent earnings are indicative of its financial strength, including its ability to generate revenue and cover expenses. Analysts evaluate a bank’s capacity to generate growth-fueling returns and attain necessary capital levels. Key considerations include the bank’s growth, stability, net interest margin, and net worth.
- L for Liquidity – Liquidity signifies the bank’s ability to meet its short-term obligations promptly. Sufficient liquidity ensures the bank can manage unexpected financial challenges without resorting to distress measures. The liquidity level is significantly influenced by the ratio of cash held by banks and central bank balances to total assets. Additionally, the analysis includes evaluating a bank’s reliance on short-term and potentially unstable financial resources.
- Except for “management”, most elements can be analyzed using ratios. However, quantifying “liquidity” is challenging.
- Since CAMEL’s inception, banks have engaged in many transactions that no longer fit its categories. Some transactions are off the balance sheet or driven by asset/liability management, making “asset quality” too narrow. While often termed a model, CAMEL is essentially a checklist of critical bank attributes for financial evaluation. It can lay the foundation for systematic approaches to credit assessment.
- In the United States, the CAMEL system operates as a scoring model for regulators. Examiners assign scores from “1” (best) to “5” (worst) for each acronym letter. Composite scores combine attribute scores, and scores of 3 or higher signal concerns, prompting regulatory review.
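The scoring mechanics above can be sketched as follows. The component scores, the equal-weight averaging, and the rounding rule are illustrative assumptions for the sketch, not the examiners' actual aggregation procedure:

```python
# Hypothetical CAMEL-style composite score. Component scores (1 = best,
# 5 = worst) and the equal-weight average are illustrative assumptions.
component_scores = {
    "Capital": 2,
    "Asset quality": 4,
    "Management": 2,
    "Earnings": 1,
    "Liquidity": 5,
}

# Composite: simple average of the component scores, rounded to the 1-5 scale.
composite = round(sum(component_scores.values()) / len(component_scores))

print(f"Composite CAMEL score: {composite}")
if composite >= 3:
    print("Score of 3 or higher: flag for regulatory review")
```

In practice examiners weigh components judgmentally rather than averaging them, but the sketch shows how attribute scores roll up into a composite that triggers review at 3 or above.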
Elements Of Credit Risk Modeling
Capital Adequacy Ratio
Credit Risk Assessment Approaches
- Various methods exist for evaluating the probability of default within credit risk modeling. These approaches differ primarily in the data necessary for their implementation, as well as their scope and applicability. Three main frameworks are used:
- Judgmental approaches, empirical models, and financial models.
- Judgmental Approaches
- Judgmental approaches in credit risk assessment, often dubbed qualitative approaches or expert systems, are the oldest and least analytically complex methods. They rely solely on the expert judgment of credit analysts to evaluate both qualitative and quantitative characteristics of the borrower.
- The “5C analysis” method, a prominent judgmental evaluation scheme, assesses five key dimensions of a borrower’s creditworthiness:
- Character: Reflects the borrower’s personality.
- Capacity: Evaluates the borrower’s ability to repay the loan based on income, expenses, and other financial obligations.
- Capital: Considers the borrower’s own capital at risk.
- Collateral: Examines the guarantees provided for the obligation’s payment.
- Conditions: Describes the general business environment and loan characteristics, such as the interest rate.
- These dimensions undergo analysis using a wide range of qualitative and quantitative factors. For corporate loans, this may include data from financial statements, business plans, industry information, and economic outlooks. Small loans are typically reviewed by a single credit analyst or a small team, while larger loans require more extensive examination by expert analysts and credit committees.
- While judgmental systems offer a structured framework, they rely solely on expertise rather than theory or empirical data, leading to notable limitations. It’s challenging to validate results, update evaluation processes, or analyze scenarios affecting creditworthiness without clear protocols. Attempts in the 1980s and early 1990s to enhance judgmental systems with analytical capabilities focused on emerging expert systems (ESs) technology. Despite limitations, judgmental approaches persist, especially in areas lacking historical data for advanced models, like project finance or specialized sectors such as shipping. They offer deep insights into complex cases where expert credit analysts interpret unstructured information effectively, providing valuable perspectives from both business and financial standpoints.
- Data-Driven Empirical Models
- Data-driven approaches rely on models constructed using historical data on loans, including accepted, rejected, paid-as-agreed, and defaulted cases. Applicable to both corporate and consumer loans when historical data are available, these approaches utilize data from various sources such as internal bank databases, credit rating agencies, and other data providers. The data primarily include borrower characteristics, loan status (defaulted or non-defaulted), and external risk factors. Instead of relying on expertise to aggregate risk factors, empirical models use analytical techniques to identify patterns from the data, establishing explicit relationships between default likelihood and input variables or risk factors.
- Empirical approaches have roots dating back to the late 1960s with the development of statistical models for predicting bankruptcy. Since then, methodological advances in data analytics and improvements in data collection have propelled rapid evolution in this field. A wide range of statistical, machine learning, and operations research techniques are now available for data analytics, enabling the identification of complex non-linear credit default relationships efficiently, even with large-scale datasets. Additionally, advancements in academic research, regulatory frameworks, and credit risk management practices have led to the identification of new relevant risk factors beyond traditional financial data. Factors such as corporate governance issues, information from social networks, news analytics, and real-time financial market data enhance the explanatory and predictive power of empirical models in various contexts.
- Advantages of the empirical approach in credit risk assessment include:
- Transparency and consistency: Empirical models offer greater transparency and consistency across different scenarios, ensuring impartial and standardized evaluation of credit risk.
- Objectivity: Empirical models minimize subjectivity and bias so that conclusions can be supported by measurable data.
- Validation of structure and predictive power: Both the model’s structure and predictive ability can be empirically validated.
- Analytical exploration: They make it possible to conduct scenario analyses and stress tests, as well as to examine hypotheses analytically.
- Real-time data updates: Market data can be continuously updated to adapt to changing conditions in real-time.
- Versatility and scalability: They can effectively handle large datasets and diverse credit portfolios (including both consumer and corporate loans), making them adaptable across different institutions. Specialized models can be developed for different entities, sectors, and types of loans.
- Despite these advantages, the empirical method also comes with significant weaknesses:
- Reliance on historical data: Historical data might not adequately forecast future outcomes, especially during periods of heightened uncertainty and volatility.
- Inadequacy and flaws in data: The data utilized in constructing empirical credit risk models are often incomplete and imperfect.
- Static and retrospective characteristics: Data utilized in empirical models, like financial statements, are commonly perceived as retrospective and static, offering a snapshot of the present condition.
- Limited real-time updates: While market data can be updated in real-time, not all inputs, like financial statement data, may be accessible as frequently. This infrequent updating could lead to slower risk assessment compared to a judgmental approach.
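As a minimal illustration of how an empirical model makes the relationship between default likelihood and risk factors explicit, the sketch below scores a borrower with a logistic function. The variables and coefficients are hypothetical, standing in for parameters that would normally be fitted on historical loan data; they are not a real scorecard:

```python
import math

# Hypothetical pre-fitted logistic scoring model. The risk factors and
# coefficients below are illustrative assumptions, not an actual scorecard.
COEFFS = {
    "intercept": -3.0,
    "debt_to_income": 2.5,      # higher leverage -> higher default risk
    "years_employed": -0.15,    # longer employment -> lower default risk
    "prior_delinquencies": 0.8, # past delinquencies -> higher default risk
}

def probability_of_default(borrower):
    """Map borrower characteristics to a default probability via the logit link."""
    z = COEFFS["intercept"]
    z += COEFFS["debt_to_income"] * borrower["debt_to_income"]
    z += COEFFS["years_employed"] * borrower["years_employed"]
    z += COEFFS["prior_delinquencies"] * borrower["prior_delinquencies"]
    return 1.0 / (1.0 + math.exp(-z))

borrower = {"debt_to_income": 0.4, "years_employed": 6, "prior_delinquencies": 1}
pd_est = probability_of_default(borrower)
print(f"Estimated PD: {pd_est:.3f}")
```

Unlike a judgmental evaluation, the mapping from inputs to default probability here is explicit, so it can be validated, stress-tested, and applied consistently across applicants.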
- Financial Models
- In contrast to data-driven empirical approaches, financial models primarily rely on theoretical principles. Instead of focusing on descriptive and predictive analysis, financial models take a normative approach based on fundamental economic and financial principles. These models aim to explain the mechanism of the default process and are often referred to as market models because they utilize data from financial markets, particularly focusing on corporate debt.
- Two primary types of financial models for credit risk assessment:
- Structural models:
- Assume default as an internal process linked to a firm’s structural characteristics.
- Factors such as asset and debt values are central to this model.
- Reduced form models:
- View default as a random event driven by external factors.
- Often use Poisson jump processes.
- Rely extensively on market data from bonds and credit derivatives.
The Merton Model
- Option pricing theory, pioneered by Black and Scholes (1973), has found extensive application in assessing default-risky debt and equity, notably through the Merton (1974) model. In this framework, a leveraged firm with a single debt issue, no dividend payments, and perfect financial markets is considered. The debt has no coupons and matures at \(T\). Under these idealized conditions, debt holders and equity holders are the sole claimants against the firm, and the value of the firm’s assets equals the sum of the value of debt and the value of equity. At date \(T\), if the firm cannot pay the principal amount \(F\), it is bankrupt: equity has no value, and the firm belongs to the debt holders. If the firm can pay the principal at \(T\), any surplus belongs to the equity holders. For example, if a firm owes $350 million at maturity \(T\) and the total value of its assets at \(T\) is only $280 million, equity holders receive nothing; but if the firm’s assets are worth $380 million at \(T\), it can pay off the full principal of $350 million and equity holders get $30 million.
- Let \(A_T\) be the value of the firm’s assets and \(E_T\) the value of equity at date \(T\). The equity holders receive \(A_T - F\) if \(A_T - F > 0\); otherwise they receive zero. This structure resembles the payoff of a call option on the value of the firm’s assets. At date \(T\):
\[
E_T = \max(A_T - F, 0)
\]
- To price equity and debt using the Black-Scholes formula, the following assumptions are made:
- The value of the firm’s assets follows a log-normal distribution with a constant volatility \(\sigma_A\).
- The interest rate 𝑟 is constant.
- Trading occurs continuously.
- Financial markets are perfect.
- The current value of equity can be derived as:
\(E = A \times N(d_1) - Fe^{-rt} \times N(d_2)\)
where:
\(d_1 = \frac{ \ln \left( \frac{A}{F} \right) + \left( r + \frac{\sigma_A^2}{2} \right) t }{ \sigma_A \sqrt{t} }\) and
\(d_2 = d_1 - \sigma_A \sqrt{t} = \frac{ \ln \left( \frac{A}{F} \right) + \left( r - \frac{\sigma_A^2}{2} \right) t }{ \sigma_A \sqrt{t} }\)
\(N\) is the cumulative standard normal distribution function,
\(F\) is the face value of the debt (the principal due at maturity),
\(A\) is the current value of the firm’s assets,
\(r\) is the risk-free rate of return,
\(t\) is the remaining time to maturity of the debt,
\(\sigma_A\) is the instantaneous volatility (standard deviation) of the firm’s assets.
- The volatility of equity \(\sigma_E\) follows from treating equity as a function of the firm’s assets and time. It is given by
\(\sigma_E = \frac{ \sigma_A \times A_t \times N(d_1) }{ E_t }\)
- Additionally, \(N(-d_2)\) represents the risk-neutral probability of default, which signifies the likelihood that shareholders will opt not to exercise the option to repay the company’s debt. This probability is determined under the assumption of asset growth at the risk-free rate. To obtain the actual probability of default (\(PD\)), the expected return on assets (\(\mu\)) should replace the risk-free rate (\(r\)), giving:
\(PD_{real} = N \left( - \frac{ \ln \left( \frac{A}{F} \right) + \left( \mu - \frac{\sigma_A^2}{2} \right) t }{ \sigma_A \sqrt{t} } \right)\)
- A key idea in credit risk assessment is Distance to Default (\(DD\)), which gauges a company’s proximity to defaulting on its debt by measuring the number of standard deviations by which the value of the company’s assets exceeds the face value of its debt. A higher \(DD\) signifies a reduced likelihood of default, indicating stronger financial stability. It is given by
\(DD = \frac{\ln \left( \frac{A}{F} \right) + \left( \mu - \frac{\sigma_A^2}{2} \right) t}{\sigma_A \sqrt{t}}\)
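The Merton formulas can be sketched in Python. The firm's figures below (asset value, debt face value, rates, volatility, expected return) are assumed purely for illustration:

```python
import math

def norm_cdf(x):
    """Standard normal CDF via the error function."""
    return 0.5 * (1.0 + math.erf(x / math.sqrt(2.0)))

def merton(A, F, r, sigma_A, t, mu):
    """Merton-model equity value, equity volatility, PDs, and distance to default."""
    d1 = (math.log(A / F) + (r + 0.5 * sigma_A**2) * t) / (sigma_A * math.sqrt(t))
    d2 = d1 - sigma_A * math.sqrt(t)
    E = A * norm_cdf(d1) - F * math.exp(-r * t) * norm_cdf(d2)  # equity value
    sigma_E = sigma_A * A * norm_cdf(d1) / E                    # equity volatility
    pd_rn = norm_cdf(-d2)                                       # risk-neutral PD
    dd = (math.log(A / F) + (mu - 0.5 * sigma_A**2) * t) / (sigma_A * math.sqrt(t))
    pd_real = norm_cdf(-dd)                                     # actual PD = N(-DD)
    return E, sigma_E, pd_rn, dd, pd_real

# Hypothetical firm: assets 120, debt face value 100 due in 1 year,
# r = 3%, asset volatility 25%, expected asset return mu = 8%.
E, sigma_E, pd_rn, dd, pd_real = merton(A=120, F=100, r=0.03, sigma_A=0.25, t=1.0, mu=0.08)
print(f"Equity value:        {E:.2f}")
print(f"Equity volatility:   {sigma_E:.2%}")
print(f"Risk-neutral PD:     {pd_rn:.2%}")
print(f"Distance to default: {dd:.2f}")
print(f"Actual PD:           {pd_real:.2%}")
```

Since \(\mu > r\) here, the actual PD comes out below the risk-neutral PD, as expected: assets drifting upward faster make the default boundary harder to hit.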
Limitations of the Merton Model
- Assumption Simplification: The model relies on simplifying assumptions such as constant volatility and risk-free interest rates, potentially oversimplifying actual market conditions and impacting its accuracy.
- Limited Applicability: The model’s applicability is limited to publicly traded and financially liquid companies. Its application to unlisted firms presents challenges due to:
- Lack of Observable Prices: Unlisted companies lack observable prices, making it difficult to apply the model accurately. Proxies or comparables may yield unreliable results due to the model’s sensitivity to key parameters.
- Feasibility Issues with Comparables: Medium-sized enterprises, in particular, may pose challenges for using comparable prices, rendering this method infeasible.
- Reliance on Historical Data: The model’s effectiveness hinges on historical market data, which may become less predictive of future trends due to evolving market conditions and financial regulations.
- Recalibration Costs: Frequent recalibration of the model is necessary, incurring substantial costs.
- Stability Compared to Ratings: Merton’s approaches exhibit higher instability compared to credit rating agencies’ ratings due to continuous market fluctuations. Long-term institutional investors may be hesitant to adopt this approach, preferring less frequent changes in asset allocation.
- Credit Risk Aggregation Challenges: The model faces difficulties in aggregating and comparing credit risks across different business lines or financial institutions due to its focus on market-based risk factors.
The CreditMetrics Model
The CreditRisk+ Model
- The CreditRisk+ model was developed by Credit Suisse Financial Products in 1997.
- Unlike the KMV model, CreditRisk+ does not rely on a company’s capital structure or credit rating to assess the probability of default.
- Also, in contrast to the CreditMetrics model, which aims to evaluate debt securities and potential losses, CreditRisk+ simplifies credit events to only two states: default and non-default. Additionally, CreditRisk+ treats default frequency as a continuous random variable, with potential losses occurring solely in the event of default.
- In the CreditRisk+ model, default probabilities are assumed to be low and independent of other credit events. The model utilizes the Poisson distribution, in which the average default rate equals its variance. Unlike CreditMetrics, CreditRisk+ does not compute credit ratings but focuses solely on default events. This simplicity requires minimal data input, making the model user-friendly.
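The Poisson assumption can be sketched as follows. The portfolio size and per-loan default probability are illustrative assumptions:

```python
import math

# With many small, independent exposures and low default probabilities, the
# number of defaults in a portfolio is approximately Poisson-distributed.
# Portfolio size and default probability below are illustrative assumptions.
n_loans = 1000
p_default = 0.01           # annual default probability per loan
lam = n_loans * p_default  # Poisson mean (and, by assumption, variance)

def poisson_pmf(k, lam):
    """P(exactly k defaults) under the Poisson approximation."""
    return math.exp(-lam) * lam**k / math.factorial(k)

# Probability of observing at most 15 defaults in a year.
p_at_most_15 = sum(poisson_pmf(k, lam) for k in range(16))
print(f"Expected defaults: {lam:.0f}")
print(f"P(defaults <= 15): {p_at_most_15:.3f}")
```

Note how little input the sketch needs: only exposure counts and average default rates, with no ratings, capital structure, or market prices, which is exactly the simplicity the model trades on.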
Moody’s KMV Model
- KMV employs the “Expected Default Frequency” (EDF), derived from an extension of the Merton Model, to calculate default probabilities for each obligor. The EDF model generates an empirical distribution of default frequencies based on historical data, rather than the normal distribution used in the Merton Model. This offers a more robust mapping of Distance to Default (\(DD\)) to a probability-of-default scale. Additionally, it defines the default point as the sum of short-term debt and half of long-term debt, providing a closer approximation to a firm’s actual loan obligations.
- Advantages of KMV include:
- Current equity value determines probabilities of default (PD), ensuring immediate reflection of any firm value changes.
- PD changes continuously rather than waiting for rating adjustments, offering real-time risk assessment.
- Equity value increase reduces default probability, unlike CreditMetrics where firm value fluctuations may not affect default probability due to static ratings.
- KMV adopts a CAPM-inspired approach for expected firm value growth and simplifies correlation structure using a factor model, allowing for an analytical solution for loss distribution and eliminating the need for simulation to compute Credit VaR.
Risk-Adjusted Return On Capital (RAROC)
- Many credit risk assessment methodologies, such as scoring and rating models, are validated through statistical performance measures. However, their outcomes for daily decision-making are evaluated in financial terms. Understanding these financial measures is crucial for assessing loan performance and making acceptance/rejection decisions based on risk-adjusted return.
- The primary financial performance measure used is risk-adjusted return on capital (𝑅𝐴𝑅𝑂𝐶). It has gained significant attention in the financial industry because efficient risk management is vital for institutional profitability. 𝑅𝐴𝑅𝑂𝐶 evaluates the performance of financial institutions, product lines, or loans by comparing financial outcomes (income, profits) to the economic capital required to achieve them. This general description changes based on the specifications for certain components.
- In a basic credit risk modeling scenario, the financial outcomes of a loan are its revenues (excluding expenses and expected losses), while economic capital represents the capital required to cover risk exposure in case of default. 𝑅𝐴𝑅𝑂𝐶, in this context, is defined as
\(RAROC = \frac{\text{Loan Revenues}}{\text{Capital at Risk}}\)
- When loan performance is evaluated using 𝑅𝐴𝑅𝑂𝐶, a straightforward rule of thumb is followed: a loan is considered profitable if its 𝑅𝐴𝑅𝑂𝐶 is higher than the bank’s cost of capital.
- The following elements are included in the 𝑅𝐴𝑅𝑂𝐶 equation’s numerator:
- Loan amount (𝐿)
- The spread between the loan rate and the bank’s cost of capital (𝑠).
- Fees linked to the loan (𝑓).
- Expected loan losses (𝑙).
- Operating costs (𝑐).
- Taxation (𝑥).
- If these components are all expressed as a percentage of the loan amount, then the anticipated revenues from a loan can be defined as:
\(Loan \, revenues = (s + f \,-\, l\, -\, c)(1 \,-\, x) \times L\)
- One method for computing the 𝑅𝐴𝑅𝑂𝐶 equation’s denominator (capital at risk) is to look at how much the loan’s value changes due to interest rate movements over a given time period, usually a year. The duration of the loan measures its sensitivity to interest rate changes. More precisely, the duration approximation for the change in value (Δ𝐿) of a loan valued at 𝐿, with duration 𝐷 and interest rate 𝑖, owing to an interest rate change (Δ𝑖) is:
\(\Delta L = -LD \left( \frac{\Delta i}{1 + i} \right)\)
- As an illustration, let’s consider MF Bank’s loan valued at $2,000,000 with the following parameters.
- The spread between the loan rate and the bank’s cost of capital (𝑠)=0.4%.
- Commission linked to the loan (𝑓)=0.15%.
- Expected loan losses (𝑙)=0.
- Operating costs (𝑐)=0.3%.
- Taxation (𝑥)=20%.
\(Loan \, revenues = (s + f \,-\, l\, -\, c)(1 \,-\, x) \times L\)
\(= (0.004 + 0.0015 \,- \,0 \,-\, 0.003)(1 \,-\, 0.2) \times 2,000,000\)
\(= 4,000\)
Moreover, assume that the loan’s duration is 4 years and the current interest rate is 5.6%. With an expected increase in interest rates of 0.5%, the capital at risk can be estimated using the change in loan value as an approximation:
\(\Delta L = -LD \left( \frac{\Delta i}{1 + i} \right) = -\$2,000,000 \times 4 \times \frac{0.005}{1.056} = -\$37,878.79\), so the capital at risk is the magnitude of this change, $37,878.79.
Hence \(RAROC = \frac{Loan \, Revenues}{Capital \, at \, Risk} = \frac{4000}{37878.79} = 0.1056 \, or \, 10.56\%\)
So, this loan remains profitable for the bank as long as the bank’s cost of capital is below 10.56%.
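The MF Bank arithmetic can be reproduced as a short script, using the figures given in the example above:

```python
# Reproduction of the MF Bank example (all figures from the text).
L = 2_000_000                          # loan amount
s, f, l, c, x = 0.004, 0.0015, 0.0, 0.003, 0.20  # spread, fees, losses, costs, tax

revenues = (s + f - l - c) * (1 - x) * L          # loan revenues

D, i, di = 4, 0.056, 0.005                        # duration, rate, expected rate rise
capital_at_risk = L * D * (di / (1 + i))          # |dL| from the duration formula

raroc = revenues / capital_at_risk
print(f"Loan revenues:   {revenues:,.0f}")
print(f"Capital at risk: {capital_at_risk:,.2f}")
print(f"RAROC:           {raroc:.2%}")
```

The script confirms the figures in the text: revenues of $4,000 against capital at risk of about $37,878.79, giving a RAROC of 10.56%.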
- The above method is commonly known as a market-based approach. An alternative method for assessing the capital at risk relies on historical data rather than market-based indicators like anticipated interest rate fluctuations. This method incorporates the unexpected loan loss into the denominator of the RAROC equation, formulated as follows:
Unexpected loan loss = \(\alpha \times LGD \times EAD\)
- Here, \(\alpha\) is a risk factor representing the unexpected default rate, which can be determined from the historical distribution of default rates for loans resembling the one in focus. For example, assuming a normal distribution of default rates, a value of \(\alpha = 2.576\sigma\) corresponds to a 99.5% confidence level (based on the z-score). However, loan loss distributions typically exhibit skewness, indicating non-normality, so the standard deviation coefficient is frequently set at a higher level to account for this.
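A short numerical sketch of this formula; the default-rate volatility, LGD, and EAD figures below are assumed for illustration:

```python
# Illustrative capital-at-risk via the historical (unexpected-loss) approach.
# All input figures are hypothetical assumptions for the sketch.
sigma = 0.02            # std. dev. of historical default rates for similar loans
alpha = 2.576 * sigma   # unexpected default rate at ~99.5% confidence (normal case)
LGD = 0.45              # loss given default (fraction of exposure lost)
EAD = 2_000_000         # exposure at default

unexpected_loss = alpha * LGD * EAD
print(f"Unexpected loan loss: {unexpected_loss:,.2f}")
```

This unexpected loss would replace the market-based capital-at-risk figure in the RAROC denominator; with skewed loss distributions, a multiplier above 2.576 would be used instead.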