This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.
1. Introduction
With economic globalization and the continuous innovation of financial products, the financial industry faces huge risks even as it booms, and the complexity, linkage, and uncertainty of financial markets are increasing. The outbreak of the U.S. subprime mortgage crisis in 2008 dealt a heavy blow to the global financial market, making the demand for risk management more urgent. Financial risk management is an indispensable part of investment, decision-making, and supervision; it mainly includes four stages: risk identification, risk measurement, risk decision, and risk control. Among them, risk measurement refers to quantifying the size and likelihood of losses caused by various risks, and it is the core content of financial risk management. However, the complexity and time-varying nature of market factors make financial risk measurement difficult. Most risk measurement methods involve complex mathematical formulas and models that are hard to operate. What investment decision makers need is a technique that is easy to grasp and accurately reflects the size of the risk [1–4].
Since the 1990s, value at risk (VaR) has been introduced into financial risk management and has become a widely used risk measurement and management tool for regulatory authorities and financial institutions. The earliest method of measuring the VaR value is the historical simulation method; the Monte Carlo simulation method and the variance-covariance method came into use gradually afterward. In 1994, the VaR method was adopted by J.P. Morgan. The Basel Committee even explicitly stipulated that VaR should be used as the standard model for determining bank regulatory capital. Nowadays, the VaR method is widely used in market risk and credit risk management; however, some scholars still believe that risk management based on the VaR method cannot prevent the outbreak of financial crises [5–9].
2. Literature Review
Since the VaR method was proposed, it has attracted the attention of many scholars. They have conducted extensive research on the theory of VaR and put forward many methods to improve the accuracy of VaR models. Christoffersen and Errunza [10] found that using the normal distribution to calculate VaR would produce a large error. Hence, the GARCH models and their derivative models are used to estimate VaR; they can better describe the distribution characteristics of financial time series and also provide new ideas for calculating VaR. Angelidis et al. [11] studied the performance of GARCH-type models in the daily value at risk of a diversified portfolio using a large number of distributional assumptions and different sample sizes. The results showed that different distributions affect the VaR value; however, there was no correlation between model selection and sample size. So and Yu [12] applied seven different GARCH models to financial data and evaluated the VaR value of each model under different significance levels. To confirm whether the estimation of VaR is affected by the conditional distribution of returns, Makiel [13] collected a large amount of data to estimate various types of ARIMA-GARCH (1, 1) models and compared the differences among the models according to the Kupiec test results. Girardi and Ergün [1] used the multidimensional GARCH model to estimate the CoVaR value. Chen et al. [2] researched the variable smooth transition function of the nonlinear GARCH model. Kuhe [3] used the cointegrated vector generalized autoregressive conditional heteroscedasticity (VAR-GARCH) model to investigate the dynamic relationship between crude oil prices and stock market price fluctuations in Nigeria.
The results show that there is a reliable relationship between the crude oil price and the stock market price; however, the conditional volatility of the logarithmic return of the stock market price is stable and predictable, while the conditional volatility of the logarithmic return of the crude oil price is unstable and unpredictable.
Manganelli and Engle [14] proposed a new method to calculate the VaR value. They built the conditional value at risk (CVaR) model by autoregressive modeling. CVaR is the conditional mean of the loss exceeding VaR, which reflects the average level of excess loss and the potential loss of the financial assets. Alexander and Baptista [15] studied the change of the portfolio selection behavior of the agents after the CVaR constraints were included in the agent constraint set. They found that compared with the VaR constraint, the CVaR constraint can more effectively inhibit the agent’s mild risk aversion, however, it cannot inhibit or even indulge the agent’s high-risk aversion. Kaut et al. [16] used the CVaR model to analyze the robustness of portfolio management. They found that the CVaR model is sensitive to model parameter setting errors. Abad et al. [17] reviewed the existing literature on the value at risk (VaR). From a practical point of view, the method based on the extreme value theory and filtering history simulation is the best method to predict VaR.
Wang et al. [18] focused on the fact that electricity price fluctuations bring more uncertainty and greater risk. They proposed a risk assessment method for the power market based on the EMD algorithm. The impacts of different macro policies implemented by the government on power market risk also differ, such as feed-in tariffs or tax-rebate regulations [19]. The risk assessment of the economic system and financial market can also be analyzed from the perspective of system stability. Ma and Wu [20] established a triopoly price game model based on the derivation of mathematical models to analyze the pricing of enterprises in relevant markets. Wang et al. [21] proposed the idea of combining the EMD algorithm with the Copula theory, constructed a new EMD-Copula model, and applied it to the estimation of value at risk in an actual power market. He et al. [22] proposed a new VaR model based on empirical mode decomposition (EMD) to estimate the downside risk of the power market. Mensi et al. [23] used the wavelet squared coherence method and wavelet analysis to build a VaR model to study the portfolio risk and interaction between the BRICS and South Asian frontier stock markets and each major developed stock market. Berger and Gençay [24] used wavelet analysis to divide the volatility of financial conditions into three components, short-term, medium-term, and long-term, to study VaR. The results showed that the short-term and medium-term components cover the relevant information necessary to estimate an adequate daily VaR.
In general, foreign research on VaR risk management methods began earlier, and the development process of the VaR method is well documented [25–28]. Although domestic research started late, it has progressed rapidly, using statistics and mathematics to optimize the models [29, 30]. At present, domestic and foreign scholars agree that the main research difficulty is how to improve the accuracy and effectiveness of the models. They devote much energy to model building but neglect the processing and design of the research data [31, 32]. For example, many scholars chose a small sample size and did not consider updating the estimation sample data [33, 34]. In this paper, 2494 SSE Composite Index observations are divided into an estimation sample and a forecast sample. The estimation sample is updated with the R software, and an empirical analysis of five VaR models based on the variance-covariance method is carried out. Finally, the effectiveness of the models is examined by comparing the VaR values with the actual loss values.
The rest of this article is organized as follows: Section 3 describes the VaR method, Section 4 describes the models used in this study, Section 5 contains the empirical analysis, and Section 6 presents the main conclusions of this paper.
3. VaR
VaR refers to the maximum possible loss of the value of a certain financial asset or portfolio of assets within a certain period under a certain confidence level. It is intended to illustrate the potential loss under some extreme conditions. VaR is the quantile of the loss variable at a high probability; hence, there is only a small chance that the loss exceeds this amount. The statistical nature of VaR also allows back-testing of this measure so that practitioners can study the accuracy of the underlying loss model. The specific meaning of VaR can be expressed by

P(L > VaR) = 1 − c. (1)
In equation (1), “P” refers to the probability that the actual loss of the asset value is greater than the maximum possible loss, and “L” refers to the loss value of a financial asset in a specific future holding period, whereas “c” refers to the confidence level. Equation (1) demonstrates that in a certain holding period in the future, the probability that the loss does not exceed the VaR value is c.
It can be seen from the definition of VaR that the holding period and confidence level are two important parameters. In addition, the distribution characteristics of the asset prices are also one of the elements of the VaR model. According to different probability distribution models, the VaR value of the financial assets is somewhat different. The basic idea of the VaR method is to use the historical information of the financial assets to infer the future situation, however, what is inferred is a probability distribution. The core of the VaR value calculation is to use the historical data to estimate the probability distribution of the value or the return of financial assets, and the key to inferring this probability distribution is to estimate the volatility of the financial assets.
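The core idea just described, estimating the distribution of returns from historical data and reading off its quantile, can be sketched as follows. This is a minimal illustration, not the paper's procedure; the synthetic returns are hypothetical stand-ins for real data.

```python
import numpy as np

# Hypothetical daily returns standing in for a real historical sample.
rng = np.random.default_rng(0)
returns = rng.normal(0.0005, 0.01, 2000)

def var_from_returns(returns, c=0.95, value=1.0):
    """VaR at confidence level c: the loss level exceeded with probability 1 - c."""
    losses = -value * np.asarray(returns)  # loss = negated return on the position
    return np.quantile(losses, c)

var_95 = var_from_returns(returns, 0.95)
var_99 = var_from_returns(returns, 0.99)
```

A higher confidence level selects a deeper tail quantile, so `var_99` exceeds `var_95`, matching the discussion of the two parameters above.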
3.1. VaR in General Distribution
If the distribution of the financial asset returns is not specifically set, the initial investment amount is assumed to be W_0 and the rate of return over the holding period to be R, so that the end-of-period value of the asset is W = W_0(1 + R).

At a given confidence level of c, let R* denote the minimum rate of return and W* = W_0(1 + R*) the corresponding minimum end-of-period value of the asset.
VaR measures the maximum possible loss at a given confidence level and is generally expressed as a positive number. VaR is divided into absolute VaR and relative VaR. Absolute VaR does not consider the expected value of the financial assets, and it is expressed by the difference between the initial asset value and the minimum end-of-period value. The expression is as follows:

Absolute VaR = W_0 − W* = −W_0 R*.
Relative VaR refers to the maximum possible loss relative to the expected value of the financial assets. Assuming that the expected rate of return is μ, so that the expected end-of-period value is E(W) = W_0(1 + μ), the expression is

Relative VaR = E(W) − W* = W_0(μ − R*).
If the holding period is short and the average yield is low, the VaR value calculated according to the above two definitions is very close. In other cases, especially when measuring the unforeseen loss of the credit risk over a longer period of time, the relative VaR can better reflect the risk that comes from the deviations from the average based on fully considering the time value of the funds.
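The two definitions above amount to two one-line formulas; a small sketch with hypothetical inputs (position size, expected return, and minimum return are illustrative, not from the paper):

```python
def absolute_var(w0, r_star):
    # Absolute VaR = W0 - W* = -W0 * R*  (no expected-return adjustment)
    return -w0 * r_star

def relative_var(w0, mu, r_star):
    # Relative VaR = E(W) - W* = W0 * (mu - R*)
    return w0 * (mu - r_star)

# Hypothetical position: 1,000,000 invested, expected daily return 0.05%,
# minimum return at the chosen confidence level -2%.
abs_var = absolute_var(1_000_000, -0.02)
rel_var = relative_var(1_000_000, 0.0005, -0.02)
```

Since relative VaR adds the forgone expected gain W_0·μ on top of the absolute loss, the two values nearly coincide when μ is small, exactly as the text notes for short holding periods.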
It can be seen from the VaR calculation formula that the key to estimating the VaR value is to determine the minimum value W* of the financial asset or, equivalently, the minimum rate of return R*.

The value of R* depends on the probability distribution of the returns and on the chosen confidence level, and hence different distributional assumptions lead to different VaR values.
3.2. VaR under Normal Distribution
When the returns of the financial assets follow a normal distribution, the standard deviation σ and the mean μ of the return rate completely determine the distribution, and the VaR value can be calculated analytically.

The probability of financial asset loss exceeding the VaR value can be expressed as follows:

P(R < R*) = 1 − c. (7)

Note: R is the return rate of the asset, and R* is the minimum critical rate of return at the confidence level c.

According to equation (7), the minimum critical rate of return can be expressed as R* = μ − z_c σ, where z_c is the quantile of the standard normal distribution corresponding to the confidence level c. Hence, the relative VaR is W_0 z_c σ, and the absolute VaR is W_0(z_c σ − μ).
When the observations of the return rate of the financial assets meet the three assumptions of an efficient market, an independent and identical distribution, and a random walk, the volatility of the annual return rate can be derived from the volatility of the daily return rate. The VaR of different time lengths can be converted into each other, which is called "time aggregation" in quantitative economics. Assuming that the average annual return rate and variance of the financial asset are, respectively, expressed as μ_a and σ_a², and the daily counterparts as μ_d and σ_d², then for T trading days in a year, μ_a = T μ_d and σ_a² = T σ_d², so that

σ_a = σ_d √T. (12)

Equation (12) shows that, under the condition that the return rate series are uncorrelated, when the observation period becomes longer, the volatility increases with the square root of the time. If only the relative VaR is considered, the T-day VaR is √T times the one-day VaR: VaR(T) = √T · VaR(1).
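The normal-distribution VaR formula together with square-root-of-time scaling can be sketched as below; the position size and volatility are hypothetical inputs:

```python
from math import sqrt
from statistics import NormalDist

def normal_var(w0, mu_daily, sigma_daily, c=0.99, horizon=1, relative=True):
    """VaR under normality. Relative VaR = W0 * z_c * sigma * sqrt(h);
    absolute VaR additionally subtracts the expected return W0 * mu * h."""
    z = NormalDist().inv_cdf(c)                 # quantile z_c of N(0, 1)
    rel = w0 * z * sigma_daily * sqrt(horizon)  # square-root-of-time scaling
    return rel if relative else rel - w0 * mu_daily * horizon

one_day = normal_var(1_000_000, 0.0005, 0.01, c=0.99)
ten_day = normal_var(1_000_000, 0.0005, 0.01, c=0.99, horizon=10)
```

The ten-day VaR is exactly √10 times the one-day VaR, which is the "time aggregation" property of equation (12).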
4. Methods of Calculating VaR
This paper mainly introduces the historical simulation method, Monte Carlo simulation method, and variance-covariance method for calculating the VaR value. The first two methods adopt the complete valuation method and use the statistical method to simulate the distribution of the future return of the financial assets without assuming the distribution of the return of the financial assets. The variance-covariance method adopts the local valuation method and mainly uses the sample data to estimate certain parameters to calculate the VaR value.
4.1. Historical Simulation
The historical simulation method is a nonparametric method that uses historical data to simulate the future return of the financial assets at a certain confidence level, assuming that future fluctuations of the financial assets are the same as historical fluctuations. The historical simulation method needs neither assumptions about the distribution of the return rate nor estimates of model parameters. The VaR can be calculated directly from the historical data and the definition of VaR, and hence the method is simple and intuitive, although its results depend largely on the selection of historical data. Moreover, this method assumes that history will repeat itself, whereas the capital market in reality is very volatile. If the historical data contain few extreme values, the method cannot simulate extreme events well.
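A minimal sketch of historical simulation: compute past losses from a price series and take their empirical quantile, with no distributional assumption. The price series below is synthetic and purely illustrative.

```python
import numpy as np

def historical_var(prices, c=0.95, value=1.0):
    """Historical-simulation VaR: empirical quantile of past losses."""
    log_ret = np.diff(np.log(np.asarray(prices, dtype=float)))
    losses = -value * log_ret
    return np.quantile(losses, c)

# Hypothetical index path generated as a random walk in log prices.
rng = np.random.default_rng(1)
prices = 3000 * np.exp(np.cumsum(rng.normal(0, 0.01, 1500)))
hist_var_95 = historical_var(prices, 0.95)
```

Because the estimate is just an order statistic of the sample, it inherits exactly the weakness noted above: if the window contains few extreme moves, tail risk is understated.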
4.2. Monte Carlo Simulation
Compared with the variance-covariance method and the historical simulation method, the biggest difference of the Monte Carlo simulation method is that this method can simulate the model by establishing a random process model and using a computer to generate a large number of random numbers from outside the sample. This method is also similar to the historical simulation method. They both do not need to make assumptions about the distribution of returns, and the basic principle is to use statistical methods to simulate the distribution of the future returns of the financial assets. This model can simulate various possible situations of the future changes in the financial asset prices and is not constrained by the capacity and quality of historical data. Therefore, its simulation results are often closer to the true value and have higher accuracy, however, the method is complicated to operate and requires a large number of calculations. Besides, the cost is high, and it takes a long time. The choice of random model has a great influence on the simulation results, and hence, there is a certain model risk.
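The Monte Carlo approach described above can be sketched as follows, assuming (purely for illustration) a one-day lognormal price move as the random process model:

```python
import numpy as np

def monte_carlo_var(mu, sigma, w0=1.0, c=0.99, n=100_000, seed=42):
    """Monte Carlo VaR: draw many one-period returns from an assumed
    stochastic model and take the loss quantile of the simulated outcomes."""
    rng = np.random.default_rng(seed)
    r = rng.normal(mu, sigma, n)      # simulated one-day log returns
    losses = w0 - w0 * np.exp(r)      # end-of-period loss in value terms
    return np.quantile(losses, c)

mc_var = monte_carlo_var(mu=0.0, sigma=0.01, c=0.99)
```

The choice of the random process is the "model risk" the text mentions: replacing the normal draw with a heavier-tailed one would change the estimate even with identical historical calibration.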
4.3. Variance-Covariance Method
Based on the traditional delta-normal method and gamma-normal method, some simpler and easier-to-operate variance-covariance methods have evolved, e.g., the RiskMetrics model proposed by J. P. Morgan and the VaR method based on the GARCH model.
4.3.1. Delta-Normal Model and Gamma-Normal Model
The core idea of the delta-normal model is to use the approximate relationship between the financial asset portfolio value function and the market risk factor under the assumption of normal distribution and infer the probability distribution of the financial asset portfolio value function from the statistical distribution of risk factors. According to Taylor’s first-order expansion, we approximate the value of VaR to simplify the calculation of VaR. The establishment process of the gamma-normal model is similar to that of the delta-normal model. The difference is that the gamma-normal model describes the nonlinear change characteristics of the financial assets, and it approximates the VaR value according to Taylor’s second-order expansion.
4.3.2. RiskMetrics Model
The RiskMetrics model was proposed by J. P. Morgan in 1994. The core of this model is to assume that the logarithmic returns of the financial assets follow a conditional normal distribution. It uses the exponentially weighted moving average (EWMA) model to estimate the standard deviation sequence of returns. The biggest feature of this model is that, when estimating the volatility of the return rate, the weight of each period's observation is assumed to decay exponentially as the retrospective time increases; that is to say, the closer to the current period, the greater the weight. This assumption better reflects the recent trend of volatility, and the formula for measuring the return rate volatility is shown as equation (14):

σ_t² = λ σ_{t−1}² + (1 − λ) r_{t−1}². (14)

In equation (14), λ (0 < λ < 1) is the decay factor (RiskMetrics recommends λ = 0.94 for daily data), r_{t−1} is the logarithmic return of the previous period, and σ_t² is the conditional variance of the return at time t.

When there are m historical data, equation (16) can be obtained by m iterations:

σ_t² = (1 − λ) Σ_{i=1}^{m} λ^{i−1} r_{t−i}² + λ^m σ_{t−m}². (16)

As m becomes large, λ^m approaches 0, and hence the last term can be ignored.

The weight of the data during period i is (1 − λ) λ^{i−1}, which decreases geometrically as the retrospective time increases.
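The EWMA recursion with these geometrically decaying weights can be sketched directly; this assumes λ = 0.94 and a simple initialization at the first squared return, which is one common choice rather than a prescription from the paper:

```python
import numpy as np

def ewma_volatility(returns, lam=0.94):
    """RiskMetrics EWMA: sigma2_t = lam * sigma2_{t-1} + (1 - lam) * r_{t-1}^2.
    The implied weight on the observation i periods back is (1 - lam) * lam**(i - 1)."""
    r = np.asarray(returns, dtype=float)
    sigma2 = np.empty_like(r)
    sigma2[0] = r[0] ** 2  # simple initialization choice (assumption)
    for t in range(1, len(r)):
        sigma2[t] = lam * sigma2[t - 1] + (1 - lam) * r[t - 1] ** 2
    return np.sqrt(sigma2)

vols = ewma_volatility([0.01] * 400)
```

With a constant return series the recursion sits at its fixed point, so the estimated volatility equals the (constant) return magnitude, a quick sanity check on the weights summing to one.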
4.3.3. Variance-Covariance Method Based on GARCH Model
Engle proposed the autoregressive conditional heteroscedasticity (ARCH) model. This model can better describe the heteroscedasticity in a financial time series; however, it is a nonlinear function, and its parameters cannot be estimated by ordinary methods. In addition, some financial time series with a long memory require a high-order ARCH model; when the lag order is too high, estimating the model parameters becomes more difficult and leads to problems such as multicollinearity. The GARCH model proposed by Bollerslev on the basis of the ARCH model can effectively overcome these shortcomings. It defines the conditional variance as a function of both the lagged squared residuals and the lagged conditional variance. The GARCH (1, 1) model can be written as

r_t = μ + ε_t, ε_t = σ_t z_t, (18)

σ_t² = ω + α_1 ε_{t−1}² + β_1 σ_{t−1}². (19)

Of these, equation (18) is the mean yield equation, and equation (19) is the conditional variance equation, where z_t is an independent and identically distributed innovation with zero mean and unit variance, ω > 0 is the constant term, α_1 ≥ 0 is the coefficient of the ARCH term, and β_1 ≥ 0 is the coefficient of the GARCH term, with α_1 + β_1 < 1 required for stationarity.
After fitting the GARCH (1, 1) model based on the data in the estimation sample, the prediction model can be used to estimate the volatility of the return rate of the next holding period as in equation (21):

σ̂_{t+1}² = ω̂ + α̂_1 ε_t² + β̂_1 σ_t². (21)

Note: the parameters with hats denote the estimates obtained from the estimation sample.
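Given fitted parameters, the one-step-ahead variance forecast is a simple recursion; the sketch below filters the conditional variances through a residual series and returns the next-period forecast. The parameter values and initialization are illustrative assumptions (the sketch does no estimation, which in practice is done by maximum likelihood, e.g. in R):

```python
import numpy as np

def garch_one_step(residuals, omega, alpha, beta):
    """Filter GARCH(1,1) variances sigma2_t = omega + alpha*e_{t-1}^2 + beta*sigma2_{t-1}
    through the residual series and return the one-step-ahead variance forecast."""
    e = np.asarray(residuals, dtype=float)
    sigma2 = np.empty(len(e) + 1)
    sigma2[0] = e.var()  # initialize at the sample variance (assumption)
    for t in range(len(e)):
        sigma2[t + 1] = omega + alpha * e[t] ** 2 + beta * sigma2[t]
    return sigma2[-1]

# Illustrative parameters of the right order of magnitude for daily returns.
forecast = garch_one_step([0.01] * 500, omega=1e-6, alpha=0.05, beta=0.9)
```

With constant residuals the recursion converges to its fixed point (ω + α·e²)/(1 − β), which makes the filter easy to verify.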
4.4. Back-Testing
Back-testing means that after the VaR value is predicted, it is compared with the actual loss value to test the validity of the VaR model. The back-testing of VaR examines how well the model results cover the actual losses, and one feasible method is the failure rate test introduced by Kupiec. The "failure rate" is defined as the probability that the actual maximum loss exceeds the VaR value. By counting failures, the effectiveness of the VaR models can be tested quickly and accurately. Assuming that the sample size is T and the number of failure days is N, the failure rate is f = N/T.
The test is carried out by constructing the likelihood ratio statistic LR (the source and derivation of formula (23) are given in the appendix at the end of the text).
The ideal number of expected failures is N* = T(1 − c); the closer the actual number of failures N is to N*, the more effective the VaR model.
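The Kupiec LR statistic of formula (23) is straightforward to compute; a sketch using only the standard library (the chi-square(1) survival function equals erfc(√(LR/2)), so no statistics package is needed). The T and N values in the example are illustrative back-test counts:

```python
from math import log, sqrt, erfc

def kupiec_test(T, N, c):
    """Kupiec failure-rate LR test. Under H0 the failure probability is p = 1 - c:
    LR = -2 ln[(1-p)^(T-N) p^N] + 2 ln[(1-f)^(T-N) f^N], f = N/T, ~ chi2(1)."""
    p, f = 1 - c, N / T
    lr = (-2 * ((T - N) * log(1 - p) + N * log(p))
          + 2 * ((T - N) * log(1 - f) + N * log(f)))
    p_value = erfc(sqrt(lr / 2))  # chi-square(1) survival function
    return lr, p_value

lr_ok, pv_ok = kupiec_test(1038, 47, 0.95)    # failure rate near 1 - c: small LR
lr_bad, pv_bad = kupiec_test(1038, 28, 0.99)  # far above the ideal rate: large LR
```

A model passes when LR stays below the chi-square(1) critical value (3.84 at the 5% significance level), i.e., when the observed failure rate is statistically indistinguishable from 1 − c.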
Table 1
The comparison table of back-testing acceptance confidence interval.
Confidence level (%) | Acceptance confidence interval |
95 | |
99 |
5. Empirical Analysis
5.1. Data Acquisition and Processing
The Shanghai Composite Index was officially released on July 15, 1991. Its sample stocks include all stocks listed on the Shanghai Stock Exchange. It can not only reflect the price changes of the listed stocks on the Shanghai Stock Exchange but also reflect the economic conditions of various industries, which makes it significant for measuring and managing risk. This article collects the daily closing prices of the Shanghai Composite Index from January 4, 2010, to April 8, 2020, 2494 observations in total. The entire sample is divided into two parts: an estimation sample of 1456 closing prices from January 4, 2010, to December 31, 2015, and a prediction sample of 1038 closing prices from January 4, 2016, to April 8, 2020.
The stock holding period studied in this paper is one day. The closing price of the index on day t is represented by P_t.

The geometric yield of the stock is r_t = ln((P_t + D_t)/P_{t−1}), where D_t denotes the dividend paid during the holding period.

In the research, to simplify the analysis, it is often assumed that the dividend D_t = 0, so the geometric yield reduces to the logarithmic return r_t = ln P_t − ln P_{t−1}.
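The return construction is a one-liner on the closing-price series; the prices below are hypothetical stand-ins for the SSE Composite data:

```python
import numpy as np

# Hypothetical closing prices standing in for the SSE Composite series.
P = np.array([3100.0, 3152.0, 3120.0, 3098.0, 3135.0])

# Daily logarithmic return r_t = ln(P_t) - ln(P_{t-1}), dividends ignored.
r = np.diff(np.log(P))
```

Each of the 2494 closing prices thus yields 2493 daily log returns, one fewer than the number of prices.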
5.2. Forecasting of VaR
According to the VaR calculation formula under the assumption of normal distribution, the key to estimating the VaR value is to predict the standard deviation of the rate of return. Forecasting the return rate fluctuation is very complicated. To forecast the VaR of a certain holding period simply and quickly, the mean and variance of the historical data can be used to approximate the mean and variance of the return rate in the forecast period. This estimation process is somewhat crude, but it still serves as a useful reference for financial asset risk management. We select the Shanghai Composite Index data to conduct an empirical analysis. According to the variance and mean of the initial estimation sample, the VaR value on January 4, 2016, can be estimated. Then, the estimation sample is rolled forward one day at a time to forecast the daily VaR of the entire forecast sample. The final forecasting result is shown in Figure 1. The red curve represents the absolute VaR value at the 99% confidence level, and the blue curve represents the absolute VaR value at the 95% confidence level. The black line represents the actual loss, which is the difference between the closing price of the previous day and that of the current day. The red line in the figure is above the blue line, indicating that the VaR value under higher confidence is larger. The two VaR curves deviate somewhat from the actual loss values and lie above them, which shows that this simple VaR prediction method has some effect in avoiding risk loss; however, the prediction error is large and the method lacks rigor.
[figure omitted; refer to PDF]
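The rolling-window scheme just described can be sketched as follows. This is an illustration of the idea under a normal assumption, not the paper's exact R code; the window size and confidence level are parameters the paper sets to 1456 and 95%/99%:

```python
import numpy as np
from statistics import NormalDist

def rolling_simple_var(returns, window, c=0.95, w0=1.0):
    """For each forecast day, estimate mean and std from the previous `window`
    returns and compute absolute VaR = W0 * (z_c * sigma - mu), rolling the
    estimation sample forward one day at a time."""
    z = NormalDist().inv_cdf(c)
    r = np.asarray(returns, dtype=float)
    out = []
    for t in range(window, len(r)):
        est = r[t - window:t]  # estimation sample ending the day before t
        out.append(w0 * (z * est.std(ddof=1) - est.mean()))
    return np.array(out)

rng = np.random.default_rng(7)
v = rolling_simple_var(rng.normal(0, 0.01, 300), window=250, c=0.95)
```

Each forecast uses only information available before the forecast day, which is what makes the subsequent back-test against realized losses meaningful.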
(3) ARCH Effect Test. It can be judged that the residuals have autocorrelation from the graphical characteristics of the residuals. To further verify this characteristic, this paper uses the R software to test the ARCH effect of the residual series. The basic principle of the test is that if the residual sequence has an ARCH effect, the residual squared sequence will generally exhibit autocorrelation. Therefore, the test of the ARCH effect can be transformed into an autocorrelation test of the residual squared sequence. There are two test methods, namely, the Q test and the LM test.
(a) Q test.
The Q test was proposed by Box and Pierce in 1970, and in 1978, Ljung and Box improved the Q statistic. The null hypothesis of the Q test is that the residual squared sequence is a pure random sequence with no ARCH effect. If the sample size is large, the Q statistic approximately follows a χ²(m) distribution under the null hypothesis, where m is the number of lags tested, and the null hypothesis is rejected when the Q statistic exceeds the corresponding critical value.

The improved (Ljung-Box) Q statistic expression is as follows:

Q(m) = T(T + 2) Σ_{k=1}^{m} ρ̂_k²/(T − k),

where T is the sample size and ρ̂_k is the lag-k sample autocorrelation coefficient of the residual squared sequence.
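A direct sketch of the Ljung-Box Q statistic (here written for a generic series; in the ARCH test it is applied to the squared residuals):

```python
import numpy as np

def ljung_box_q(x, m):
    """Ljung-Box statistic Q(m) = T(T+2) * sum_{k=1..m} rho_k^2 / (T - k);
    under the null of no autocorrelation, Q(m) ~ chi2(m)."""
    x = np.asarray(x, dtype=float)
    x = x - x.mean()
    T = len(x)
    denom = np.dot(x, x)
    q = 0.0
    for k in range(1, m + 1):
        rho_k = np.dot(x[k:], x[:-k]) / denom  # lag-k sample autocorrelation
        q += rho_k ** 2 / (T - k)
    return T * (T + 2) * q

# A linear trend is strongly autocorrelated, so Q(5) far exceeds the
# chi-square(5) critical value of 11.07.
q_trend = ljung_box_q(np.arange(100.0), 5)
```

In practice this matches what R's `Box.test(..., type = "Ljung-Box")` computes.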
(b) LM test.
The LM test is the abbreviation of the Lagrange multiplier test. Its basic principle is to construct an autoregressive auxiliary model of the residual squared sequence and test whether the residual squared sequence has the ARCH effect by checking whether the regression coefficients in the autoregressive model are all 0. The auxiliary model expression is as follows:

ε_t² = α_0 + α_1 ε_{t−1}² + α_2 ε_{t−2}² + ⋯ + α_q ε_{t−q}² + u_t,

where u_t is the error term of the auxiliary regression, q is the lag order, and the null hypothesis is H_0: α_1 = α_2 = ⋯ = α_q = 0.

The statistical expression of the LM test is as follows:

LM = T·R² ~ χ²(q),

where T is the sample size of the regression model and R² is the coefficient of determination of the auxiliary regression. If the LM statistic exceeds the critical value of the χ²(q) distribution, the null hypothesis is rejected, and the residual sequence is considered to have an ARCH effect.
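The LM = T·R² construction can be sketched with an ordinary least-squares fit of the auxiliary regression; a minimal illustration, not R's `ArchTest` implementation:

```python
import numpy as np

def arch_lm(resid, q=5):
    """Engle's LM test: regress e_t^2 on q of its own lags; LM = T * R^2 ~ chi2(q)."""
    e2 = np.asarray(resid, dtype=float) ** 2
    y = e2[q:]
    # Design matrix: intercept plus lags 1..q of the squared residuals.
    X = np.column_stack([np.ones(len(y))] + [e2[q - k:-k] for k in range(1, q + 1)])
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    u = y - X @ beta
    r2 = 1 - u.var() / y.var()
    return len(y) * r2

# Perfectly periodic residuals: e_t^2 is exactly predictable from its lags,
# so R^2 is ~1 and LM is near the regression sample size (195 here).
lm_stat = arch_lm(np.array([1.0, 0.0] * 100), q=5)
```

A large LM relative to the χ²(q) critical value signals conditional heteroscedasticity, exactly the condition the paper verifies before fitting GARCH.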
It can be seen from Table 4 that the results of the Q test and the LM test reject the null hypothesis, indicating that the residual sequence has an ARCH effect and meets the research conditions of the GARCH model.
Table 4
ARCH effect test results.
Testing method | Statistic | p value |
Q test | 854.351 | 0.000 |
LM test | 243.382 | 0.000 |
5.4.3. Fitting Results of GARCH (1, 1) Model
The fitting results of the GARCH (1, 1) model under the assumptions of normal distribution, t distribution, and GED distribution are as follows:
Table 5
GARCH (1, 1) fitting results.
Coefficient | Normal distribution | t distribution | GED distribution |
ω | 9.152E − 06 | 2.523E − 04 | 2.675E − 04 |
 | (0.977) | (0.387) | (0.252) |
α1 | 0.047 | 0.046 | 0.046 |
 | (0.006) | (0.000) | (0.000) |
β1 | 0.094 | 0.945 | 0.943 |
 | (0.000) | (0.000) | (0.000) |
LM statistics | 11.945 | 12.519 | 12.363 |
(0.451) | (0.405) | (0.417) | |
Q (20) | 17.132 | 17.987 | 17.786 |
(0.644) | (0.588) | (0.601) | |
AIC | −5.819 | −5.881 | −5.891 |
Note: Table 5 shows the value of each coefficient or statistic. The value in brackets represents the corresponding p value; ω denotes the constant term of the conditional variance equation, α1 the coefficient of the ARCH term, and β1 the coefficient of the GARCH term.
The fitting results of the GARCH (1, 1) model show that, under any distribution, the coefficients of the ARCH term (α1) and the GARCH term (β1) are statistically significant. Among the three distributions, the GARCH-GED model has the smallest AIC value, indicating the best fit.
It can be seen from Table 5 that the p values of the LM statistic and the Q (20) statistic are greater than 0.05, and the null hypothesis of the ARCH test should be accepted. In other words, the residuals of the GARCH (1,1) model under the three hypothetical distributions have no autocorrelation, and the ARCH effect has disappeared.
5.4.4. Forecasting and Calculating the VaR Value
From the initial estimation sample, a specific GARCH model can be estimated, and the volatility on January 4, 2016, can be predicted to obtain the VaR value of that day. Rolling the estimation sample forward by one day while keeping its size unchanged, another GARCH model can be estimated; the volatility on January 5, 2016, is then forecast and the VaR value of that day calculated, and so on. The estimation samples are rolled forward one day at a time, and the predicted return volatility and the corresponding quantiles of each distribution (as shown in Table 6) are substituted into the VaR calculation formula to estimate the VaR value of each trading day from January 4, 2016, to April 8, 2020. The series of operations is completed with the R software.
Table 6
Quantile table.
Significance level | Normal distribution | t distribution (df = 6) | GED distribution |
5% | 1.645 | 1.943 | 2.417 |
1% | 2.332 | 3.143 | 4.682 |
The R software is used to plot the absolute VaR value and the actual loss value of the Shanghai Composite Index from January 4, 2016, to April 8, 2020. Figures 6–8, respectively, show the daily absolute VaR value of the Shanghai Composite Index under the assumptions of normal distribution, t distribution, and GED distribution at the 95% confidence level. The black curve represents the true daily loss value, expressed as the difference between the closing price of the previous day and that of the day. For convenience, we denote the GARCH (1, 1) models under the three distributions as GARCH-N, GARCH-t, and GARCH-GED, respectively. The comparison of the absolute VaR values and the actual loss values shows that the actual loss value fluctuates around 0, and a negative actual loss value indicates a profit on that day. The VaR curve is almost always above the actual loss value, and only a few VaR values fall on the upper edge of the actual loss curve. Figure 9 shows that the absolute VaR values predicted under the three distribution assumptions largely overlap; however, the VaR curve predicted by the GARCH-GED model is the highest, followed by the GARCH-t model, and the VaR curve predicted by the GARCH-N model is the lowest.
[figure omitted; refer to PDF][figure omitted; refer to PDF][figure omitted; refer to PDF][figure omitted; refer to PDF]5.5. Back-Testing Results
According to the comparison chart of the VaR forecast values and the actual loss values of the Shanghai Composite Index, the VaR value is mostly above the actual loss value, indicating that the estimated VaR value is effective and that the risk can be controlled to a certain extent. In addition, this paper refers to the situation where the actual loss value exceeds the VaR value as a prediction failure. The number of VaR prediction failures and the failure rate in the 1038 prediction samples are counted with the R software. The back-testing results of each VaR model are shown in Table 7.
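Counting failures is a simple elementwise comparison of realized losses against forecasts; a minimal sketch with illustrative numbers:

```python
import numpy as np

def failure_rate(actual_losses, var_forecasts):
    """A forecast 'fails' on days where the realized loss exceeds the VaR value."""
    fails = np.asarray(actual_losses) > np.asarray(var_forecasts)
    return int(fails.sum()), float(fails.mean())

# Illustrative: four days of losses against four VaR forecasts; day 3 fails.
n_fail, rate = failure_rate([1, 2, 3, 4], [2, 2, 2, 5])
```

Applied to the 1038 forecast days, this yields exactly the failure frequencies and failure rates reported in Table 7.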
Table 7
Back-testing results.
Model | 95% confidence level | 99% confidence level | | | | | | |
Relative VaR | Absolute VaR | Relative VaR | Absolute VaR | |||||
Failure frequency | Failure rate (%) | Failure frequency | Failure rate (%) | Failure frequency | Failure rate (%) | Failure frequency | Failure rate (%) | |
Simple VaR | 34 | 3.28 | 34 | 3.28 | 16 | 1.54 | 16 | 1.54 |
RiskMetrics | 55 | 5.3 | 55 | 5.3 | 28 | 2.7 | 28 | 2.7 |
GARCH-N | 47 | 4.53 | 47 | 4.53 | 22 | 2.12 | 22 | 2.12 |
GARCH-t | 29 | 2.79 | 29 | 2.79 | 12 | 1.16 | 12 | 1.16 |
GARCH-GED | 22 | 2.12 | 22 | 2.12 | 3 | 0.29 | 3 | 0.29 |
The results in Table 7 show that there is no difference between the back-testing results of the relative VaR and the absolute VaR. At 95% confidence, only the RiskMetrics model fails the back-testing. At 99% confidence, compared with the ideal failure rate, only the GARCH-GED model can pass the strict back-testing. However, when the number of samples is about 1000, according to the confidence interval given by Kupiec (Table 1), the number of failures of the GARCH-t model falls within the confidence interval, so it is also considered to pass the back-testing. In addition, for the predictions based on the GARCH (1, 1) model, the VaR failure rate under the GED distribution assumption is the lowest, followed by the t distribution, with the normal distribution assumption highest.
Comparing the prediction results of the RiskMetrics model and the GARCH (1, 1) model in Table 7, the failure rate of the former's VaR predictions is higher than that of the latter. Both models consider the dynamics and time variation of the logarithmic return when calculating the conditional variance. However, the RiskMetrics model fixes the weighting of the exponentially weighted moving average with a constant decay factor λ, whereas the parameters of the GARCH model are re-estimated from the sample data, which allows the GARCH model to track changes in volatility more closely and produce more accurate VaR forecasts.
6. Conclusion
To further study the effectiveness of the VaR model, this paper conducts an empirical analysis of the Shanghai Composite Index data based on the variance-covariance method. We selected the daily log return rate of the Shanghai Composite Index from January 4, 2010, to April 8, 2020, as the sample and used the data of the first 1456 days as the estimated sample and the data of the other 1038 days as the forecast sample to update the estimated sample data by rolling backward one day, in turn. Then, the daily VaR value of the forecast sample is predicted, and some conclusions are drawn.
The VaR model has a certain risk prevention function. From the research in this paper, it can be seen that the VaR curves obtained by the five models are generally above the actual loss values. The greater the deviation between the VaR curve and the actual loss curve, the lower the failure rate of the model and the stronger its effectiveness. Compared with the other models, the VaR curve estimated by the RiskMetrics model deviates less from the actual loss curve, and hence its back-testing failure rate is significantly higher than that of the other models.
Different distribution assumptions will affect the validity of the VaR prediction results based on the GARCH (1, 1) model. When estimating the VaR value based on the GARCH (1,1) model, it is found that the daily logarithmic return rate of the Shanghai Composite Index does not obey the normal distribution and has the characteristics of sharp peaks and thick tails. The residuals of the mean equation have an obvious ARCH effect. After introducing the GARCH (1, 1) model under the assumption of t distribution and GED distribution, the new model has better fitting results and can predict the volatility of the return more accurately. From the back-testing results, it can be seen that the GARCH-GED model has the lowest failure rate. The second is the GARCH-t model, and the GARCH-N model has a relatively high failure rate.
The VaR model is easier to pass the back-testing at the 95% confidence level. Comparing the VaR values under the two confidence levels, it can be seen that the higher the confidence level, the greater the corresponding VaR value: the VaR curve at the 99% confidence level is higher than that at the 95% confidence level. At the 95% confidence level, all five models except the RiskMetrics model can pass the back-testing. However, at the 99% confidence level, compared with the ideal failure rate, only the GARCH-GED model can pass the back-testing; according to the confidence interval given by Kupiec, the GARCH-t model can also pass the back-testing.
Acknowledgments
The paper was supported by “the Fundamental Research Funds for the Central Universities”, South-Central University for Nationalities (CSY18013).
Appendix
The likelihood ratio test uses the ratio of two competing likelihood functions, i.e., the likelihood ratio, to test whether a hypothesis is valid.
Definition 1 (Generalized Likelihood Ratio).
Let $x_1, x_2, \ldots, x_m$ be a sample from a population with density $f(x; \theta)$, $\theta \in \Theta$, and consider the testing problem $H_0\colon \theta \in \Theta_0$ versus $H_1\colon \theta \in \Theta \setminus \Theta_0$. Let
$$\lambda(x) = \frac{\sup_{\theta \in \Theta_0} L(\theta; x)}{\sup_{\theta \in \Theta} L(\theta; x)},$$
where $L(\theta; x)$ denotes the likelihood function. Then, $\lambda(x)$ is called the generalized likelihood ratio of the hypothesis testing problem. The failure rate test proposed by Kupiec (1995) measures whether the number of failures in a VaR estimation sequence is consistent with the given confidence level c. Under the null hypothesis that the model is correct, the number of failures N follows a binomial distribution. The known quantities for the failure rate test are therefore the sample size T, the number of failures n, and the confidence level c.
The null hypothesis of the failure rate test is
$$H_0\colon p = p^* = 1 - c,$$
where $p$ is the true failure probability. As the number of failures $N$ follows the binomial distribution $B(T, p)$, its probability mass function is
$$P(N = n) = \binom{T}{n} p^n (1 - p)^{T - n}.$$
Then, the generalized likelihood ratio is
$$\lambda = \frac{(p^*)^n (1 - p^*)^{T - n}}{\hat{p}^{\,n} (1 - \hat{p})^{T - n}}, \qquad \hat{p} = \frac{n}{T}.$$
Let
$$LR = -2 \ln \lambda = -2\left[n \ln p^* + (T - n)\ln(1 - p^*)\right] + 2\left[n \ln \hat{p} + (T - n)\ln(1 - \hat{p})\right].$$
Under the null hypothesis, the LR statistic asymptotically obeys the $\chi^2$ distribution with one degree of freedom, and $H_0$ is rejected when LR exceeds the corresponding critical value.
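The LR statistic above translates directly into code. The sketch below is a minimal version that does not guard the degenerate cases n = 0 or n = T; the constant 3.841 is the 95% critical value of the chi-square distribution with one degree of freedom.

```python
import math

def kupiec_lr(T, n, c):
    """Kupiec (1995) proportion-of-failures test.
    T: sample size, n: number of VaR exceptions, c: VaR confidence level,
    so the expected failure probability is p* = 1 - c.
    Returns the LR statistic, chi-square(1) under H0: p = p*."""
    p_star = 1.0 - c
    p_hat = n / T
    log_l0 = (T - n) * math.log(1 - p_star) + n * math.log(p_star)
    log_l1 = (T - n) * math.log(1 - p_hat) + n * math.log(p_hat)
    return -2.0 * (log_l0 - log_l1)

# 3.841 is the 95% critical value of chi-square(1)
lr = kupiec_lr(T=1000, n=14, c=0.99)   # 14 exceptions vs 10 expected
# LR is about 1.4 here, well below 3.841, so H0 is not rejected
print(f"LR = {lr:.3f}, reject H0: {lr > 3.841}")
```

Note that a small exceedance of the expected failure count (14 versus 10 in 1000 days) is not enough to reject the model, which is consistent with the Kupiec interval admitting the GARCH-t model at the 99% level.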
[1] A. G. Girardi, T. Ergün, "Systemic risk measurement: multivariate GARCH estimation of CoVaR," Journal of Banking & Finance, vol. 37 no. 8,DOI: 10.1016/j.jbankfin.2013.02.027, 2013.
[2] C. W. S. Chen, M. M. C. Weng, T. Watanabe, "Bayesian forecasting of value-at-risk based on variant smooth transition heteroskedastic models," Statistics and Its Interface, vol. 10 no. 3,DOI: 10.4310/sii.2017.v10.n3.a9, 2017.
[3] D. A. Kuhe, "The dynamic relationship between crude oil prices and stock market price volatility in Nigeria: a cointegrated VAR-GARCH model," Current Journal of Applied Science and Technology, vol. 38 no. 3,DOI: 10.9734/cjast/2019/v38i330363, 2019.
[4] C. Wang, H. Wan, W. Zhang, "Financial market risk measurement model VAR," Journal of Systems Engineering, vol. 15 no. 1, pp. 67-75, 2000.
[5] S. K. Peng, L. Y. Gu, "VaR model and its influence on bank capital requirements," Statistics and Decision, no. 1, pp. 109-110, 2007.
[6] Z. Xiao, Y. Su, "Application of VaR model in financial risk management," Productivity Research, vol. 24, pp. 44-46, 2008.
[7] S. Liang, "Monitoring and measurement of systemic financial risk in China: an empirical study based on EGARCH-VaR model," Discussion on Modern Economy, vol. 11, pp. 33-40, 2017.
[8] X. Su, S. Xie, Y. Zhou, "Some progress and application of modeling theory and method of financial risk measurement," Operations Research and Management, vol. 27 no. 1, pp. 185-199, 2018.
[9] H. Du, "Application of VAR model in securities risk management," Securities Market Guide, vol. 8, pp. 57-61, 2000.
[10] P. Christofersen, V. Errunza, "Towards a global financial architecture: capital mobility and risk management issues," Emerging Markets Review, vol. 1 no. 1, 2000.
[11] T. Angelidis, A. Benos, S. Degiannakis, "The use of GARCH models in VaR estimation," Statistical Methodology, vol. 1 no. 1-2, pp. 105-128, DOI: 10.1016/j.stamet.2004.08.004, 2004.
[12] M. K. P. So, P. L. H. Yu, "Empirical analysis of GARCH models in value at risk estimation," Journal of International Financial Markets, Institutions and Money, vol. 16 no. 2, pp. 180-197, 2005.
[13] K. Makiel, "ARIMA-GARCH models in estimating market risk using Value at Risk for the WIG20 index," E-Finanse, vol. 8 no. 2, 2012.
[14] S. Manganelli, R. F. Engle, "Value at risk models in finance," NBER Working paper, vol. 75, pp. 64-78, 2001.
[15] G. J. Alexander, A. M. Baptista, "CVaR as a measure of risk: implications for portfolio selection," Working Paper UCLA and University of Minnesota and University of Arizona, vol. 7, pp. 78-102, 2003.
[16] M. Kaut, H. Vladimirou, S. W. Wallace, S. A. Zenios, "Stability analysis of portfolio management with conditional value-at-risk," Quantitative Finance, vol. 7 no. 4, pp. 397-409, DOI: 10.1080/14697680701483222, 2007.
[17] P. Abad, S. Benito, C. López, "A comprehensive review of value at risk methodologies," The Spanish Review of Financial Economics, vol. 12 no. 1, pp. 15-32, DOI: 10.1016/j.srfe.2013.06.001, 2014.
[18] H. Wang, K. He, Y. Zou, "EMD based value at risk estimate algorithm for electricity markets," Proceedings of the Seventh International Joint Conference on Computational Sciences and Optimization, pp. 445-449.
[19] T. Xu, J. Ma, "Feed-in tariff or tax-rebate regulation? Dynamic decision model for the solar photovoltaic supply chain," Applied Mathematical Modelling, vol. 89 no. 1, pp. 1106-1123, DOI: 10.1016/j.apm.2020.08.007, 2021.
[20] J. Ma, K. Wu, "Complex system and influence of delayed decision on the stability of a triopoly price game model," Nonlinear Dynamics, vol. 73 no. 3, pp. 1741-1751, DOI: 10.1007/s11071-013-0900-1, 2013.
[21] X. Wang, J. Cai, K. He, "EMD copula based value at risk estimates for electricity markets," Procedia Computer Science, vol. 55, pp. 1318-1324, DOI: 10.1016/j.procs.2015.07.115, 2015.
[22] K. J. He, H. Q. Wang, J. Z. Du, Y. C. Zou, "Forecasting electricity market risk using empirical mode decomposition (EMD)—based multiscale methodology," Energies, vol. 9 no. 931,DOI: 10.3390/en9110931, 2016.
[23] W. Mensi, S. J. H. Shahzad, S. Hammoudeh, R. Zeitun, M. U. Rehman, "Diversification potential of Asian Frontier, BRIC emerging and major developed stock markets: a wavelet-based value at risk approach," Emerging Markets Review, vol. 32, pp. 130-147, DOI: 10.1016/j.ememar.2017.06.002, 2017.
[24] T. Berger, R. Gençay, "Improving daily value-at-risk forecasts: the relevance of short-run volatility for regulatory quality assessment," Journal of Economic Dynamics and Control, vol. 92, pp. 30-46, DOI: 10.1016/j.jedc.2018.03.016, 2018.
[25] C.-M. Kuan, J.-H. Yeh, Y.-C. Hsu, "Assessing value at risk with CARE, the conditional autoregressive expectile models," Journal of Econometrics, vol. 150 no. 2, 2008.
[26] C. Jiang, X. Ding, Q. Xu, Y. Tong, "A TVM-Copula-MIDAS-GARCH model with applications to VaR-based portfolio selection," The North American Journal of Economics and Finance, vol. 51, 2020.
[27] G. Tian, "Research review on financial risk measurement principles and methods," Financial Theory and Teaching, vol. 5, pp. 25-29, 2014.
[28] Y. Li, D. Luo, H. Wang, "Financial risk measurement model based on VaR method and its application," Journal of Shenyang University of Technology (Social Science Edition), vol. 2 no. 4, pp. 335-339, 2009.
[29] J. Liu, Q. Wang, X. Liu, C. Chang, "Financial risk measurement based on time-varying copula GARCH model," Journal of Xihua University (Natural Science Edition), vol. 33 no. 3, pp. 81-84, 2014.
[30] N. Li, "Research on volatility risk of China’s stock index futures price based on VaR and stress test method," Journal of Xi’an University of Arts and Sciences (Social Science Edition), vol. 17 no. 5, pp. 92-94, 2014.
[31] Z. He, W. Lan, "Measurement, evaluation and prevention of financial risk," Financial Economy, vol. 10, pp. 74-77, 2014.
[32] X. Gong, S. Peng, S. Yang, Y. Sun, X. Hang, "Research on prudent management of financial risk based on uncertainty distribution," Economic Research, vol. 54 no. 7, pp. 64-77, 2019.
[33] H. Zhang, "Comparative study on risk measurement and performance evaluation of traditional finance and internet finance-VaR method based on EGARCH-GED model," Market Weekly, vol. 10, pp. 136-138, 2019.
[34] W. Lin, M. Chen, L. Zhou, X. Meng, "Analysis of risk forecasting ability of Chinese stock market based on VaR-GARCH model family," Statistics and Decision, vol. 35 no. 21, pp. 151-155, 2019.
Copyright © 2022 Yuling Wang et al. This is an open access article distributed under the Creative Commons Attribution License, which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited. https://creativecommons.org/licenses/by/4.0/
Abstract
With increasing extremal risk, VaR has become a popular methodology because it is easy to interpret and calculate. To compare the performance of extant VaR models, this paper makes an empirical analysis of five models: simple VaR, VaR based on RiskMetrics, and VaR based on GARCH models under three distributional assumptions (GARCH-N, GARCH-GED, and GARCH-t). We exploit the daily closing prices of the Shanghai Composite Index from January 4, 2010, to April 8, 2020, and divide the entire sample into two periods for empirical analysis. A rolling window is used to update the daily estimation of risk. Based on the failure rates under different significance levels, we test whether a specific VaR model passes the back-testing. The results indicate that all models, except the RiskMetrics model, pass the test at the 5% level. According to the ideal failure rate, only the GARCH-GED model passes the test at the 1% level; by the Kupiec confidence interval, the GARCH-t model also passes the back-testing at all aforementioned levels. In particular, the GARCH-GED model has the lowest forecasting failure rate among the GARCH models.