NCER Working Paper Series

  • #120
    Download full text
    JEL-Codes:
    C22, C51, C52, C53, C58
    Keywords:
    Volatility forecasting; Realized variance; HAR model; HARQ model; Robust regression; Box-Cox transformation; Forecast comparisons; QLIKE loss; Model confidence set

    A Practical Guide to Harnessing the HAR Volatility Model

    A Clements and D Preve

    The standard heterogeneous autoregressive (HAR) model is perhaps the most popular benchmark model for forecasting return volatility. It is often estimated using raw realized variance (RV) and ordinary least squares (OLS). However, given the stylized facts of RV and well-known properties of OLS, this combination should be far from ideal. One goal of this paper is to investigate how the predictive accuracy of the HAR model depends on the choice of estimator, transformation, and forecasting scheme made by the market practitioner. Another goal is to examine the effect of replacing its volatility proxy based on high-frequency data (RV) with a proxy based on free and publicly available low-frequency data (the logarithmic range). In an out-of-sample study covering three major stock market indices over 16 years, it is found that simple remedies systematically outperform not only standard HAR but also state-of-the-art HARQ forecasts, and that HAR models using the logarithmic range can often produce forecasts of similar quality to those based on RV.
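
    For readers who want to see the mechanics, here is a minimal sketch of the standard HAR-RV regression described above, estimated by OLS. The daily/weekly/monthly averaging horizons (1, 5 and 22 days) are the conventional choices; the function name and the use of numpy's least squares are illustrative assumptions, not the authors' code.

    import numpy as np

    def har_forecast(rv, train_end):
        """Fit RV_{t+1} = b0 + b1*RV_t + b2*mean(RV_{t-4..t}) + b3*mean(RV_{t-21..t})
        on rv[:train_end] by OLS and return the one-step-ahead forecast."""
        rv = np.asarray(rv, dtype=float)
        daily = rv[21:train_end]
        weekly = np.array([rv[t - 4:t + 1].mean() for t in range(21, train_end)])
        monthly = np.array([rv[t - 21:t + 1].mean() for t in range(21, train_end)])
        X = np.column_stack([np.ones(len(daily) - 1), daily[:-1], weekly[:-1], monthly[:-1]])
        beta, *_ = np.linalg.lstsq(X, daily[1:], rcond=None)
        # Forecast for period train_end from the latest available regressors.
        x_last = np.array([1.0, rv[train_end - 1],
                           rv[train_end - 5:train_end].mean(),
                           rv[train_end - 22:train_end].mean()])
        return x_last @ beta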

  • #119
    Download full text
    JEL-Codes:
    C22, G00
    Keywords:
    Multivariate volatility, combination forecasts, forecast evaluation, model confidence set

    Combining Multivariate Volatility Forecasts using Weighted Losses

    A Clements and M Doolan

    The ability to improve out-of-sample forecasting performance by combining forecasts is well established in the literature. This paper advances this literature in the area of multivariate volatility forecasts by developing two combination weighting schemes that are capable of placing varying emphasis on losses within the combination estimation period. A comprehensive empirical analysis of out-of-sample forecast performance across varying dimensions, loss functions, sub-samples and forecast horizons shows that the new approaches significantly outperform their counterparts in terms of statistical accuracy. Within the financial applications considered, significant benefits from combination forecasts relative to the individual candidate models are observed. Although the more sophisticated combination approaches consistently rank higher relative to the equally weighted approach, their performance is statistically indistinguishable given the relatively low power of these loss functions. Finally, within the applications, further analysis highlights how combination forecasts dramatically reduce the variability in the parameter of interest, namely the portfolio weight or beta.

  • #118
    Download full text
    JEL-Codes:
    C22; G00
    Keywords:
    News, media, linguistic analysis, volatility, crude oil

    Media attention and crude oil volatility: Is there any 'new' news in the newspaper?

    D Aromi and A Clements

    In recent years there has been a growing interest in the analysis of large volumes of unscheduled news flow. Such news flow has often been used as an exogenous variable for explaining asset returns and/or volatility. This paper examines the dynamic relationship between news flow and asset price dynamics from a different perspective. A novel index of media attention is proposed, and in the context of the crude oil market the linkages between media attention, returns and volatility are examined. It is found that media attention reacts strongly to shocks to volatility, whereas there is little impact in the opposite direction. As such, media attention seems to inherit the persistence in volatility but offers only a little more in terms of information relevant to future volatility. Therefore media attention does not offer a great deal of 'new' news useful for explaining volatility.

  • #117
    Download full text
    Keywords:
    Air quality, Particulate matter, Dynamic multiple equations

    A Dynamic Multiple Equation Approach for Forecasting PM2.5 Pollution in Santiago, Chile

    Stella Moisan, Rodrigo Herrera and Adam Clements

    A methodology based on a system of dynamic multiple linear equations is proposed that incorporates hourly, daily and annual seasonal characteristics to predict hourly PM2.5 pollution concentrations for 11 meteorological stations in Santiago, Chile. It is demonstrated that the proposed model has the potential to match or even surpass the accuracy of other linear and nonlinear forecasting models in terms of fit and predictive ability. In addition, the model is successful in predicting various categories of high concentration events, up to 76% of mid-range and 100% of extreme-range events on average across all stations. This forecasting model is a useful tool for government authorities to anticipate critical episodes of air quality and thereby avoid the detrimental economic and health impacts of extreme pollution levels.

  • #116
    Download full text
    JEL-Codes:
    C43; D72; L82
    Keywords:
    media bias, governmental capture, index

    Does the 4th Estate Deliver? Towards a More Direct Measure of Political Media Bias

    Ralf Dewenter, Uwe Dulleck and Tobias Thomas

    This contribution introduces a new direct measure of political media bias by analyzing articles and newscasts with respect to their tonality towards political parties and politicians. On this basis we develop an index sorting the media along the political left-right spectrum. We apply the index to opinion-leading media in Germany, analysing 7,203,351 reports on political parties and politicians in 35 media outlets from 1988 to 2012. With this approach, in contrast to other indexes, we are able to achieve a more direct and reliable measure of media bias. In addition, we apply the index to study whether the media fulfil their role as the fourth estate, i.e. provide another level of control for government, or whether there is evidence of government capture.

  • #115
    Download full text
    JEL-Codes:
    C53; F47; G15
    Keywords:
    Extreme risk, Co-movements, Multivariate Hawkes-POT, Point process, Value at Risk

    Modelling Extreme Risks in Commodities and Commodity Currencies

    Fernanda Fuentes, Rodrigo Herrera and Adam Clements

    This paper analyzes extreme co-movements between the Australian and Canadian commodity currencies and the gold and oil markets respectively, within a multivariate extension of the Hawkes-POT model. The intensity of extreme events in the Australian dollar is influenced by extreme events in gold, while the size of extreme events in the Canadian dollar is driven by extreme events in crude oil. Models with both self-excitation and cross-excitation produce the most accurate predictions of extreme risk in these markets. The results of this paper provide participants in the commodity and currency markets with a deeper understanding of the risks they face.

  • #114
    Download full text
    Keywords:
    DSGE models, shocks

    An Unintended Consequence of Using "Errors in Variables Shocks" in DSGE Models?

    Adrian Pagan

    This note shows that the common practice of adding on measurement errors or "errors in variables" when estimating DSGE models can imply that there is a lack of co-integration between model and data variables, and also between data variables themselves. An analysis is provided of what the nature of the measurement error would need to be if it were desired to ensure co-integration. It is very unlikely that it would be the white noise shocks that are commonly used.

  • #113
    Download full text
    JEL-Codes:
    C12; C15; C32; E47
    Keywords:
    Time-varying Granger causality, subsample Wald tests, Money-Income

    Causal Change Detection in Possibly Integrated Systems: Revisiting the Money-Income Relationship

    Shuping Shi, Stan Hurn and Peter C B Phillips

    This paper re-examines changes in the causal link between money and income in the United States over the past half century (1959-2014). Three methods for the data-driven discovery of change points in causal relationships are proposed, all of which can be implemented without prior detrending of the data. These methods are a forward recursive algorithm, a recursive rolling algorithm and a rolling window algorithm, all of which utilize subsample tests of Granger causality within a lag-augmented vector autoregressive framework. The limit distributions for these subsample Wald tests are provided. The results from a suite of simulation experiments suggest that the rolling window algorithm provides the most reliable results, followed by the recursive rolling method. The forward expanding window procedure is shown to have the worst performance. All three approaches find evidence of money-income causality during the Volcker period in the 1980s. The rolling and recursive rolling algorithms detect two additional causality episodes: the turbulent period of the late 1960s and the start of the subprime mortgage crisis in 2007.
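
    To illustrate the rolling-window idea (though not the authors' lag-augmented procedure or its nonstandard limit distributions), the sketch below re-runs a bivariate Granger causality test over a moving window and records the path of the test statistic. The window length and lag order are illustrative assumptions.

    import numpy as np
    from statsmodels.tsa.stattools import grangercausalitytests

    def rolling_granger_stats(y, x, window=120, lags=4):
        """Test whether x Granger-causes y within each rolling window;
        returns the F statistic for each window."""
        data = np.column_stack([y, x])
        stats = []
        for start in range(len(y) - window + 1):
            res = grangercausalitytests(data[start:start + window], maxlag=lags, verbose=False)
            stats.append(res[lags][0]["ssr_ftest"][0])  # tuple is (F, p, df_denom, df_num)
        return np.array(stats)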

  • #112
    Download full text
    JEL-Codes:
    C30;C36;E13
    Keywords:
    Impulse Responses, DSGE, SVAR

    Investigating the Relationship Between DSGE and SVAR Models

    Adrian Pagan and Tim Robinson

    DSGE models often contain variables for which data are not observed when estimating. Although DSGE models generally imply that there is a finite order SVAR in all the variables, this may no longer be true for SVARs in the observable variables alone, and so there is a VAR-truncation problem. The paper examines this issue. It looks at five different studies using DSGE models that appear in the literature. Generally it emerges that the truncation issue is probably not that important, except possibly in small open economy models with external debt. Even when there is no truncation problem in the VARs which control the dynamics, the structural impulse responses from the two models may differ due to differing initial responses. It is shown that DSGE models incorporate some strong restrictions on the nature of SVAR models, and these would need to be employed for the two approaches to give the same initial estimates.

  • #111
    Download full text
    JEL-Codes:
    C22; G11; G17
    Keywords:
    Volatility, multivariate GARCH, equicorrelation, portfolio allocation

    Volatility Dependent Dynamic Equicorrelation

    Adam Clements, Ayesha Scott and Annastiina Silvennoinen

    This paper explores the link between equicorrelation and market volatility. The standard equicorrelation model is extended to condition the correlation process on volatility, based on the Volatility Dependent Dynamic Conditional Correlation class of model. Analysis of this relationship is presented in two empirical examples, set in both a national and an international context. The various correlation forecasting methods are compared using a portfolio allocation problem, specifically the global minimum variance portfolio and the Model Confidence Set. Relative economic value is also considered. In the case of U.S. equities, the equicorrelation models perform well overall, and conditioning the equicorrelations on volatility improves on the standard equicorrelation model. For large portfolios a simple specification such as constant conditional correlation seems sufficient, particularly during periods of market calm. Internationally, the equicorrelation models perform poorly against the dynamic conditional correlation-based models. Reasoning is provided that the information-pooling advantage equicorrelation has over dynamic conditional correlation models is eroded when forecasting correlations between indices, rather than equities. In both applications, there appears to be no statistically significant difference between the standard equicorrelation model and the Volatility Dependent class, although in general a volatility dependent structure leads to lower portfolio variances.
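
    Equicorrelation itself is simply the average pairwise correlation across the assets. A minimal sketch of the unconditional sample version appears below; the models in the paper make this quantity time-varying and condition it on volatility.

    import numpy as np

    def sample_equicorrelation(returns):
        """returns: (T, N) array of asset returns; average of the
        N*(N-1)/2 distinct pairwise correlations."""
        corr = np.corrcoef(returns, rowvar=False)
        return corr[np.triu_indices(corr.shape[0], k=1)].mean()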

  • #110
    Download full text
    JEL-Codes:
    C22; G00
    Keywords:
    Networks, news, volatility, sentiment

    News and network structures in equity market volatility

    Adam Clements and Yin Liao

    An understanding of the linkages between assets is important for understanding the stability of markets. Network analysis provides a natural framework within which to examine such linkages. This paper examines the impact of firm-specific news arrivals on the interconnections at an individual firm and overall portfolio level. While a great deal of research has focused on the impact of news on the volatility of a single asset, much less attention has been paid to the role of news in explaining the links between assets. It is found that both the volume of news and its associated sentiment are important drivers of the connectedness between individual stocks and the overall market structure. Firms that experience negative news arrivals during periods of market stress become more centrally important in the market structure.

  • #109
    Download full text
    Keywords:
    Smooth transition conditional correlation; Structural breaks; Return comovement
    (Published)

    Crude Oil and Agricultural Futures: An Analysis of Correlation Dynamics

    Annastiina Silvennoinen and Susan Thorp

    Correlations between oil and agricultural commodities have varied over previous decades, impacted by renewable fuels policy and turbulent economic conditions. We estimate smooth transition conditional correlation models for 12 agricultural commodities and WTI crude oil. While a structural change in correlations occurred concurrently with the introduction of biofuel policy, oil and food price levels are also key influences. High correlation between biofuel feedstocks and oil is more likely to occur when food and oil price levels are high. Correlation with oil returns is strong for biofuel feedstocks, unlike with other agricultural futures, suggesting limited contagion from energy to food markets.

  • #108
    Download full text
    JEL-Codes:
    C32; C52
    Keywords:
    autoregressive conditional heteroskedasticity, modelling volatility, testing parameter constancy, time-varying GARCH

    Testing constancy of unconditional variance in volatility models by misspecification and specification tests

    Annastiina Silvennoinen and Timo Terasvirta

    The topic of this paper is testing the hypothesis of constant unconditional variance in GARCH models against the alternative that the unconditional variance changes deterministically over time. Tests of this hypothesis have previously been performed as misspecification tests after fitting a GARCH model to the original series. It is found by simulation that the positive size distortion present in these tests is a function of the kurtosis of the GARCH process. Adjusting the size by numerical methods is considered. The possibility of testing the constancy of the unconditional variance before fitting a GARCH model to the data is discussed. The power of the ensuing test is vastly superior to that of the misspecification test and the size distortion is minimal. The test has reasonable power even in very short time series. It would thus also serve as a test of constant variance in conditional mean models. An application to exchange rate returns is included.

  • #107
    Download full text
    JEL-Codes:
    C12;C15;C32;G17
    Keywords:
    Causality, Forward recursion, Hypothesis testing, Inflation, Output, Recursive rolling test, Rolling window, Yield curve

    Change Detection and the Causal Impact of the Yield Curve

    Stan Hurn, Peter C B Phillips and Shuping Shi

    Causal relationships in econometrics are typically based on the concept of predictability and are established in terms of tests for Granger causality. These causal relationships are susceptible to change, especially during times of financial turbulence, making the real-time detection of instability an important practical issue. This paper develops a test for detecting changes in causal relationships based on a recursive rolling window, which is analogous to the procedure used in recent work on financial bubble detection. The limiting distribution of the test takes a simple form under the null hypothesis and is easy to implement in conditions of homoskedasticity, conditional heteroskedasticity and unconditional heteroskedasticity. Simulation experiments compare the efficacy of the proposed test with two other commonly used tests, the forward recursive and the rolling window tests. The results indicate that both the rolling and the recursive rolling approaches offer good finite sample performance in situations where there are one or two changes in the causal relationship over the sample period. The testing strategies are illustrated in an empirical application that explores the causal impact of the slope of the yield curve on output and inflation in the U.S. over the period 1985-2013.

  • #106
    Download full text
    JEL-Codes:
    C22; G00
    Keywords:
    Volatility; Order flow; News; Dynamic conditional score; forecasting

    Public news flow in intraday component models for trading activity and volatility

    Adam Clements, Joanne Fuller and Vasilios Papalexiou

    Understanding the determinants of, and forecasting, asset return volatility are crucial issues in many financial applications. Many earlier studies have considered the impact of trading activity and news arrivals on volatility. This paper develops a range of intraday component models for volatility and order flow that include the impact of news arrivals. Estimates of the conditional mean of order flow, taking news flow into account, are included in models of volatility, providing a superior in-sample fit. At a 1-minute frequency, it is found that first generating forecasts of order flow, which are then included in forecasts of volatility, leads to superior day-ahead forecasts of volatility. While including overnight news arrivals directly into models for volatility improves in-sample fit, this approach produces inferior forecasts.

  • #105
    Download full text
    Keywords:
    VAR

    A New Method for Working With Sign Restrictions in SVARs

    S Ouliaris and A R Pagan

    Structural VARs are used to compute impulse responses to shocks. One problem that has arisen involves the information needed to perform this task, i.e. how the shocks are to be separated into those representing technology, monetary effects, etc. Increasingly, the signs of impulse responses are used for this task. However, it is often desirable to impose some parametric assumption as well, e.g. that monetary shocks have no long-run impact on output. Existing methods for combining sign and parametric restrictions are not well developed. In this paper we provide a relatively simple way to allow for these combinations and show how it works in a number of different contexts.

  • #104
    Download full text
    JEL-Codes:
    C14; C53
    Keywords:
    Implied volatility, Hawkes process, Peaks over threshold, Point process, Extreme events

    Point process models for extreme returns: Harnessing implied volatility

    R Herrera and Adam Clements

    Forecasting the risk of extreme losses is an important issue in the management of financial risk. There has been a great deal of research examining how option implied volatilities (IV) can be used to forecast asset return volatility. However, the impact of IV in the context of predicting extreme risk has received relatively little attention. The role of IV is considered within a range of models, beginning with the traditional GARCH based approach. Furthermore, a number of novel point process models for forecasting extreme risk are proposed in this paper. Univariate models where IV is included as an exogenous variable are considered, along with a novel bivariate approach where movements in IV are treated as another point process. It is found that in the context of forecasting Value-at-Risk, the bivariate models produce the most accurate forecasts across a wide range of scenarios.
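
    As background, the self-exciting intensity at the heart of such point process models takes the form lambda(t) = mu + sum over past events t_i < t of alpha*exp(-beta*(t - t_i)) for an exponential-kernel Hawkes process: each past extreme event temporarily raises the probability of another. The sketch below evaluates this intensity; parameter values are purely illustrative, and the paper's bivariate extension adds a second, cross-exciting stream of IV events.

    import numpy as np

    def hawkes_intensity(t, event_times, mu=0.1, alpha=0.5, beta=1.0):
        """Conditional intensity lambda(t) = mu + sum_{t_i < t} alpha*exp(-beta*(t - t_i))."""
        past = np.asarray(event_times, dtype=float)
        past = past[past < t]
        return mu + alpha * np.exp(-beta * (t - past)).sum()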

  • #103
    Download full text
    JEL-Codes:
    C32; Q41; Q47
    Keywords:
    Short-term load forecasting, seasonality, intra-day correlation, recursive equation system

    Forecasting day-ahead electricity load using a multiple equation time series approach

    Adam Clements, Stan Hurn and Zili Li

    The quality of short-term electricity load forecasting is crucial to the operation and trading activities of market participants in an electricity market. In this paper, it is shown that a multiple equation time-series model, which is estimated by repeated application of ordinary least squares, has the potential to match or even outperform more complex nonlinear and nonparametric forecasting models. The key ingredient of the success of this simple model is the effective use of lagged information by allowing for interaction between seasonal patterns and intra-day dependencies. Although the model is built using data for the Queensland region of Australia, the methods are completely generic and applicable to any load forecasting problem. The model's forecasting ability is assessed by means of the mean absolute percentage error (MAPE). For day-ahead forecasts, the MAPE returned by the model over a period of 11 years is an impressive 1.36%. The forecast accuracy of the model is compared with a number of benchmarks, including three popular alternatives and one industry standard reported by the Australian Energy Market Operator (AEMO). The performance of the model developed in this paper is superior to all benchmarks and outperforms the AEMO forecasts by about a third in terms of the MAPE criterion.
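
    For reference, the accuracy criterion quoted above is computed as MAPE = (100/T) * sum_t |(y_t - yhat_t)/y_t|. A one-line sketch:

    import numpy as np

    def mape(actual, forecast):
        """Mean absolute percentage error, in percent."""
        actual, forecast = np.asarray(actual, float), np.asarray(forecast, float)
        return 100.0 * np.mean(np.abs((actual - forecast) / actual))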

  • #102
    Download full text
    JEL-Codes:
    C22; G10; G13; G14
    Keywords:
    Information flow; Volatility; Oil futures; Gold futures; Trading activity.

    The impact of information flow and trading activity on gold and oil futures volatility

    Adam Clements and Neda Todorova

    There is a long history of research into the impact of trading activity and information on financial market volatility. Based on 10 years of unique data on news items relating to gold and crude oil broadcast over the Reuters network, this study has two objectives. It investigates the impact on realized volatility of shocks to trading activity and traders' positions which are unrelated to information flows. Additionally, the extent to which the volume of the information flow as well as the sentiment inherent in the news affects volatility is also examined. Both the sentiment and rate of news flow are found to influence volatility, with unexpected positive shocks to the rate of news arrival, and negative shocks to the sentiment of news flow, exhibiting the largest impacts. While volatility is also related to measures of trading activity, their influence decreases after news is accounted for, indicating that a non-negligible component of trading is in response to public news flow. After controlling for the level of trading activity and news flow, the net positions of the various types of traders play no role, implying that no single group of traders leads to these markets being more volatile.

  • #101
    Download full text
    JEL-Codes:
    C22; G00
    Keywords:
    Realized volatility; diffusion; jumps; point process; Hawkes process; forecasting

    The role of index jumps and cojumps in forecasting stock index volatility: Evidence from the Dow Jones index

    Adam Clements and Yin Liao

    Modeling and forecasting realized volatility is of paramount importance. Previous studies have examined the role of both the continuous and jump components of volatility in forecasting. This paper considers how to use index level jumps and cojumps across index constituents for forecasting index level volatility. In combination with the magnitude of past index jumps, the intensities of both index jumps and cojumps are examined. Estimated jump intensity from a point process model is used within a forecasting regression framework. Even in the presence of the diffusive part of total volatility, and past jump size, the intensities of both index jumps and cojumps are found to significantly improve forecast accuracy. An important contribution is that information relating to the behaviour of underlying constituent stocks is useful for forecasting index level behaviour. Improvements in forecast performance are particularly apparent on days when jumps or cojumps occur, or when markets are turbulent.

  • #100
    Download full text
    JEL-Codes:
    C23; C51; L94; Q41
    Keywords:
    Smooth transition, binary choice model, logit model, electricity spot prices, peak loading pricing, price spikes

    A Smooth Transition Logit Model of the Effects of Deregulation in the Electricity Market

    A S Hurn, Annastiina Silvennoinen and Timo Terasvirta

    The paper proposes and develops a smooth transition logit (STL) model that is designed to detect and model situations in which there is structural change in the behaviour underlying the latent index from which the binary dependent variable is constructed. The maximum likelihood estimators of the parameters of the model are derived along with their asymptotic properties, together with a Lagrange Multiplier test of the null hypothesis of linearity in the underlying latent index. The development of the STL model is motivated by the desire to assess the impact of deregulation in the Queensland electricity market by addressing the question of whether or not increased competition has resulted in changes in the behaviour of the spot price of electricity, specifically with respect to the well documented phenomenon of periodic abnormally high prices or price spikes. In testing this conjecture the STL model allows the timing of any change to be endogenously determined, and also allows market participants' behaviour to change gradually over time. The main results reported in the paper provide clear evidence of a structural change in the nature and duration of price spikes in Queensland. The endogenous dating of the structural change by the STL model agrees with the institutional detail surrounding the process of deregulation and indicates that the full effect of the policy change took about a year to occur. Notwithstanding the fact that the STL model was specifically developed to tackle a problem couched in an Australian institutional framework, this research will be of general interest and applicability. In particular, it is applicable to any situation in which the impact and dating of policy changes is required and where the outcome of the policy is naturally measurable as a binary variable.

  • #99
    Download full text
    JEL-Codes:
    C22; G11; G17
    Keywords:
    Volatility, multivariate GARCH, portfolio allocation

    On the Benefits of Equicorrelation for Portfolio Allocation

    Adam Clements, Ayesha Scott and Annastiina Silvennoinen

    The importance of modelling correlation has long been recognised in the field of portfolio management, with large dimensional multivariate problems increasingly becoming the focus of research. This paper provides a straightforward and commonsense approach toward investigating a number of models used to generate forecasts of the correlation matrix for large dimensional problems. We find evidence in favour of assuming equicorrelation across various portfolio sizes, particularly during times of crisis. During periods of market calm, however, the suitability of the constant conditional correlation model cannot be discounted, especially for large portfolios. A portfolio allocation problem is used to compare forecasting methods. The global minimum variance portfolio and Model Confidence Set are used to compare methods, whilst portfolio weight stability and relative economic value are also considered.

  • #98
    Download full text
    JEL-Codes:
    C22
    Keywords:
    Credit risk, Merton model, Stochastic volatility, Particle filter, Default probability, CDS

    Structural Credit Risk Model with Stochastic Volatility: A Particle-filter Approach

    Di Bu and Yin Liao

    This paper extends Merton's structural credit risk model to account for the fact that a firm's asset volatility follows a stochastic process. In the presence of stochastic volatility, the transformed-data maximum likelihood estimation (MLE) method of Duan (1994, 2000) can no longer be applied to estimate the model. We devise a particle filtering algorithm to solve this problem. This algorithm is based on general non-linear and non-Gaussian filtering with sequential parameter learning, and a simulation study is conducted to ascertain its finite sample performance. We then implement the model on real data for companies in the Dow Jones Industrial Average and find that incorporating stochastic volatility into the structural model substantially improves model performance.
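
    To give a flavour of the filtering machinery involved, here is a minimal bootstrap particle filter for a basic log-stochastic-volatility model (h_t = phi*h_{t-1} + sigma*eta_t, r_t ~ N(0, exp(h_t))). The authors' algorithm additionally handles the Merton asset-value structure and sequential parameter learning; everything below, including the parameter values, is a simplified assumption.

    import numpy as np

    def bootstrap_particle_filter(returns, phi=0.95, sigma=0.2, n_particles=1000, seed=0):
        """Log-likelihood of returns under a basic log-SV model."""
        rng = np.random.default_rng(seed)
        # Initialize from the stationary distribution of the AR(1) log-variance.
        h = rng.normal(0.0, sigma / np.sqrt(1.0 - phi**2), n_particles)
        loglik = 0.0
        for r in returns:
            h = phi * h + sigma * rng.standard_normal(n_particles)  # propagate states
            # Log of the measurement density N(0, exp(h)) evaluated at r.
            logw = -0.5 * (np.log(2.0 * np.pi) + h + r**2 * np.exp(-h))
            w = np.exp(logw - logw.max())
            loglik += logw.max() + np.log(w.mean())  # incremental likelihood
            idx = rng.choice(n_particles, size=n_particles, p=w / w.sum())  # resample
            h = h[idx]
        return loglik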

  • #97
    Download full text
    JEL-Codes:
    C32; C36; C51
    Keywords:
    Mixed models, transitory shocks, mixed shocks, long-run restrictions, sign restrictions, instrumental variables

    Econometric Issues when Modelling with a Mixture of I(1) and I(0) Variables

    Lance A Fisher, Hyeon-seung Huh and Adrian Pagan

    This paper considers structural models when both I(1) and I(0) variables are present. It is necessary to extend the traditional classification of shocks as permanent and transitory, and we do this by introducing a mixed shock. The extra shocks arising from introducing I(0) variables into a system are then classified as either mixed or transitory. Conditions are derived on the nature of the SVAR in the event that these extra shocks are transitory. We then analyse what happens when there are mixed shocks, finding that this changes a number of ideas that have become established in the cointegration literature. The ideas are illustrated using a well-known SVAR in which there are mixed shocks. This SVAR is re-formulated so that the extra shocks coming from the introduction of I(0) variables do not affect relative prices in the long run, and it is found that this has major implications for whether there is a price puzzle. It is also shown how to handle long-run parametric restrictions when some shocks are identified using sign restrictions.

  • #96
    Download full text
    Keywords:
    pattern recognition, pattern matching, pattern formation

    Patterns and Their Uses

    Adrian Pagan

    Three major themes have emerged in the literature on patterns. These involve pattern recognition, pattern matching (do a set of observations match a particular pattern?) and pattern formation (how does a pattern emerge?). The talk takes up each of these themes, presenting some economic examples of where a pattern has been of interest and how it has been measured (section 2), some issues in checking whether a given pattern holds (section 3), what theories might account for a particular pattern (section 4), and the predictability of patterns (section 5). Most attention is paid to judging macroeconomic models based on their ability to generate macroeconomic and financial patterns, and some simple tests are suggested to do this. Because sentiment and the origins of patterns are so inextricably linked in macroeconomics and finance, we spend some time looking at the literature which deals with the interaction of series representing sentiment with those representing macroeconomic and financial outcomes.

  • #95
    Download full text
    Keywords:
    history of macroeconometric system modelling

    Macro-Econometric System Modelling @75

    Tony Hall, Jan Jacobs and Adrian Pagan

    We summarize the history of macroeconometric system modelling as having produced four generations of models. Over time the principles underlying the model designs have been extended to incorporate eight major features. Because models often evolve in response to external events, we are led to ask what has happened to models used in the policy process since the financial crisis of 2008/9. We find that models have become smaller but that there is still no standard way of capturing the effects of such a crisis.

  • #94
    Download full text
    Keywords:
    Phillips Curve, structural change

    Issues in Estimating New Keynesian Phillips Curves in the Presence of Unknown Structural Change

    Mariano Kulish and Adrian Pagan

    Many papers which have estimated models with forward looking expectations have reported that the magnitude of the coefficient on the expectations term is very large when compared with the effects coming from past dynamics. This has sometimes been regarded as implausible and has led to the feeling that the expectations coefficient is biased upwards. A relatively general argument that has been advanced is that the bias could be due to structural changes in the means of the variables entering the structural equation. An alternative explanation is that the bias comes from weak instruments. In this paper we investigate the issue of upward bias in the estimated coefficients of the expectations variable based on a model where we can see what causes the breaks and how to control for them. We conclude that weak instruments are the most likely cause of any bias and note that structural change can affect the quality of instruments. We also look at some empirical work in Castle et al. (2011) on the NK Phillips curve in the Euro Area and the U.S., assessing whether the smaller coefficient on expectations that Castle et al. (2011) highlight is due to structural change. Our conclusion is that it is not. Instead it comes from their addition of variables to the NKPC. After allowing for the fact that there are weak instruments in the estimated re-specified model, it would seem that the forward coefficient estimate is actually quite high rather than low.

  • #93
    Download full text
    JEL-Codes:
    C22; G00
    Keywords:
    Realized volatility, diffusion, jumps, point process, Hawkes process, forecasting

    Modeling and forecasting realized volatility: getting the most out of the jump component

    Adam E Clements and Yin Liao

    Modeling and forecasting realized volatility is of paramount importance. Recent econometric developments allow total volatility to be decomposed into its constituent continuous and jump components. While previous studies have examined the role of both components in forecasting, little analysis has been undertaken into how best to harness the jump component. This paper considers how to get the most out of the jump component for the purposes of forecasting total volatility. In combination with the magnitude of past jumps, the intensity of jump occurrence is examined. Estimated jump intensity from a point process model is used within a forecasting regression framework. Even in the presence of the diffusive part of total volatility, and past jump size, intensity is found to significantly improve forecast accuracy. The improvement is particularly apparent on days when jumps occur or when markets are turbulent. Overall, the best way to harness the jump component for volatility forecasting is to make use of both the magnitude and probability of jump occurrences.
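
    A standard way to carry out the decomposition the abstract refers to is to compare realized variance with bipower variation (in the spirit of Barndorff-Nielsen and Shephard), attributing the excess to jumps. The sketch below is that textbook calculation, not the authors' code; the intraday sampling frequency is left to the user.

    import numpy as np

    def rv_bv_jump(intraday_returns):
        """Split realized variance into continuous and jump parts."""
        r = np.asarray(intraday_returns, dtype=float)
        rv = np.sum(r**2)                                          # realized variance
        bv = (np.pi / 2) * np.sum(np.abs(r[1:]) * np.abs(r[:-1]))  # bipower variation
        return rv, bv, max(rv - bv, 0.0)                           # jump component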

  • #92
    Download full text
    JEL-Codes:
    L83;D63;C63
    Keywords:
    Competitive balance, Idealized standard deviation, Ratio of standard deviations, Season length, Sports economics, Simulation

    Competitive Balance Measures in Sports Leagues: The Effects of Variation in Season Length

    P Dorian Owen and Nicholas King

    Appropriate measurement of competitive balance is a cornerstone of the economic analysis of professional sports leagues. We examine the distributional properties of the ratio of standard deviations (RSD) of points percentages, the most widely used measure of competitive balance in the sports economics literature, in comparison with other standard-deviation-based measures. Simulation methods are used to evaluate the effects of changes in season length on the distributions of competitive balance measures for different distributions of the strengths of teams in a league. The popular RSD measure performs as expected only in cases of perfect balance; if there is imbalance in team strengths, its distribution is very sensitive to changes in season length. This has important implications for comparisons of competitive balance for different sports leagues with different numbers of teams and/or games played.
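
    Concretely, the RSD measure under study divides the actual standard deviation of teams' win percentages by the "idealized" standard deviation 0.5/sqrt(G) of a perfectly balanced league playing G games per team. A minimal sketch:

    import numpy as np

    def rsd(win_percentages, games_per_team):
        """Ratio of actual to idealized standard deviation of win percentages."""
        actual_sd = np.std(win_percentages, ddof=1)
        idealized_sd = 0.5 / np.sqrt(games_per_team)
        return actual_sd / idealized_sd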

  • #91
    Download full text
    JEL-Codes:
    C22;G00
    Keywords:
    Realized volatility, correlation, jumps, co-jumps, point process

    The dynamics of co-jumps, volatility and correlation

    Adam Clements and Yin Liao

    Understanding the dynamics of volatility and correlation is a crucially important issue. The literature has developed rapidly in recent years, with more sophisticated estimates of volatility and its associated jump and diffusion components. Previous work has found that jumps at an index level are not related to future volatility. Here we examine the links between co-jumps within a group of large stocks, and the volatility of, and correlation between, their returns. It is found that the occurrence of common jumps, or co-jumps, between the stocks is unrelated to the level of volatility or correlation. On the other hand, both volatility and correlation are lower subsequent to a co-jump. This indicates that co-jumps are a transient event but, in contrast to earlier research, have a greater impact than jumps at an index level.

  • #90
    Download full text
    Keywords:
    Fourier transform, Fourier series, characteristic function, option price

    On the Efficacy of Fourier Series Approximations for Pricing European and Digital Options

    A S Hurn, Kenneth A Lindsay and Andrew McClelland

    This paper investigates several competing procedures for computing the price of European and digital options in which the underlying model has a characteristic function that is known in at least semi-closed form. The algorithms for pricing the options investigated here are the half-range Fourier cosine series, the half-range Fourier sine series and the full-range Fourier series. The performance of the algorithms is assessed in simulation experiments which price options in a Black-Scholes world, where an analytical solution is available, and for a simple affine model of stochastic volatility in which there is no closed-form solution. The results suggest that the half-range sine series approximation is the least effective of the three proposed algorithms. It is rather more difficult to distinguish between the performance of the half-range cosine series and the full-range Fourier series. There are however two clear differences. First, when the interval over which the density is approximated is relatively large, the full-range Fourier series is at least as good as the half-range Fourier cosine series, and outperforms the latter in pricing out-of-the-money call options, in particular those with maturities of three months or less. Second, the computational time required by the half-range Fourier cosine series is uniformly longer than that required by the full-range Fourier series for an interval of fixed length. Taken together, these two conclusions make a strong case for the merit of pricing options using a full-range Fourier series as opposed to a half-range Fourier cosine series.
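
    To illustrate the half-range cosine idea in its simplest form, the sketch below recovers a density on [a, b] from its characteristic function phi via f(x) ~ sum_k' (2/(b-a)) Re[phi(u_k) exp(-i*u_k*a)] cos(u_k*(x-a)), with u_k = k*pi/(b-a) and the k = 0 term halved. It is shown for a standard normal, where phi(u) = exp(-u^2/2); this is a textbook density-recovery example under assumed truncation and series lengths, not the authors' pricing code.

    import numpy as np

    def cosine_density(x, phi, a=-8.0, b=8.0, n_terms=64):
        """Half-range Fourier cosine approximation of a density on [a, b]
        from its characteristic function phi."""
        u = np.arange(n_terms) * np.pi / (b - a)
        coeffs = (2.0 / (b - a)) * np.real(phi(u) * np.exp(-1j * u * a))
        coeffs[0] *= 0.5  # the k = 0 term is halved
        return np.cos(np.outer(np.asarray(x) - a, u)) @ coeffs

    # Standard normal at x = 0: recovers 1/sqrt(2*pi), approximately 0.3989.
    print(cosine_density([0.0], lambda u: np.exp(-u**2 / 2)))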

  • #89
    Download full text
    JEL-Codes:
    C33; E31; R19
    Keywords:
    Relative price convergence; Structural break; Panel unit root test; Half-life; Time

    City Relative Price Dynamics in Australia: Are Structural Breaks Important?

    Hiranya K Nath and Jayanta Sarkar

    This paper examines the dynamic behaviour of relative prices across seven Australian cities by applying panel unit root test procedures with structural breaks to quarterly CPI data for 1972Q1-2011Q4. We find overwhelming evidence of convergence in city relative prices. Three common structural breaks are endogenously determined in 1985, 1995, and 2007. Further, correcting for two potential biases, namely Nickell bias and time aggregation bias, we obtain half-life estimates of 2.3-3.8 quarters that are much shorter than those reported by previous research. Thus, we conclude that both structural breaks and bias corrections are important for obtaining shorter half-life estimates.
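
    The half-life figures quoted above follow from the estimated persistence rho of relative price deviations: the half-life is the horizon h solving rho^h = 0.5, i.e. h = ln(0.5)/ln(rho). A quick sketch (the value of rho below is illustrative, not taken from the paper):

    import numpy as np

    def half_life(rho):
        """Horizon at which a deviation with AR(1) persistence rho has halved."""
        return np.log(0.5) / np.log(rho)

    print(half_life(0.8))  # about 3.1 quarters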

  • #88
    Download full text
    JEL-Codes:
    C22; G00
    Keywords:
    Implied volatility, VIX, hedging, semi-parametric, forecasting

    Forecasting increases in the VIX: A time-varying long volatility hedge for equities

    Adam Clements and Joanne Fuller

    Since the introduction of volatility derivatives, there has been growing interest in option implied volatility (IV). Many studies have examined the informational content and/or forecast accuracy of IV; however, there is relatively little work on directly modeling and forecasting IV. This paper uses a semi-parametric forecasting approach to implement a time-varying long volatility hedge to combine with a long equity position. It is found that such an equity-volatility combination improves the risk-return characteristics of a simple long equity position, and is particularly successful during periods of market turmoil.

  • #87
    Download full text
    JEL-Codes:
    C22;C52
    Keywords:
    stochastic volatility, parameter estimation, maximum likelihood, particle filter

    Estimating the Parameters of Stochastic Volatility Models using Option Price Data

    Stan Hurn, Ken Lindsay and Andrew McClelland

    This paper describes a maximum likelihood method for estimating the parameters of Heston's model of stochastic volatility using data on an underlying market index and the prices of options written on that index. Parameters of the physical measure (associated with the index) and the parameters of the risk-neutral measure (associated with the options) are identified, including the equity and volatility risk premia. The estimation is implemented using a particle filter. The computational load of this estimation method, which previously has been prohibitive, is managed by the effective use of parallel computing using Graphics Processing Units. A byproduct of this focus on easing the computational burden is the development of a simplification of the closed-form approximation used to price European options in Heston's model. The efficacy of the filter is demonstrated under simulation and an empirical investigation of the fit of the model to the S&P 500 Index is undertaken. All the parameters of the model are reliably estimated and, in contrast to previous work, the volatility premium is well estimated and found to be significant.

  • #86
    Download full text
    JEL-Codes:
    C21; L13
    Keywords:
    Retail Gasoline Pricing, Vertical Restraints, Shop-a-Docket Discount Scheme, Spatial Econometrics, Australia

    A Spatial Econometric Analysis of the Effect of Vertical Restraints and Branding on Retail Gasoline Pricing

    Stephen Hogg, Stan Hurn, Stuart McDonald and Alicia Rambaldi

    This paper builds an econometric model of retail gasoline competition to explain the pricing decisions of retail outlets in terms of vertical management structures, input costs and the characteristics of the local market they operate within. The model is estimated using price data from retail outlets in the South-Eastern Queensland region of Australia, but the generic nature of the model means that the results will be of general interest. The results indicate that when the cost of crude oil and demographic variations across different localities are accounted for, branding (i.e. whether the retail outlet is affiliated with one of the major brand distributors - Shell, Caltex, Mobil or BP) has a statistically significant positive effect on prices at nearby retail outlets. Conversely, the presence of an independent (non-branded) retailer within a locality has the effect of lowering retail prices. Furthermore, the results show that participation by service stations in discount coupon schemes with the two major retail supermarket chains largely offsets the price increase derived from branding affiliation. While branding effects are not fully cancelled out, the overall effect is that prices are still higher than if branding did not occur.

  • #85
    Download full text
    JEL-Codes:
    C22; G00
    Keywords:
    Multivariate volatility, portfolio allocation, forecast evaluation, model selection, model confidence set

    Selecting forecasting models for portfolio allocation

    Adam E Clements, Mark Doolan, Stan Hurn and Ralf Becker

    Techniques for evaluating and selecting multivariate volatility forecasts are not yet as well understood as their univariate counterparts. This paper considers the ability of different loss functions to discriminate between a competing set of forecasting models which are subsequently applied in a portfolio allocation context. It is found that a likelihood-based loss function outperforms its competitors, including those based on the given portfolio application. This result indicates that the particular application of forecasts is not necessarily the most effective basis on which to select models.

  • #84
    Download full text
    JEL-Codes:
    I1; J2; O1; O2
    Keywords:
    Child labour, Health, Human capital, Income inequality, Multiple equilibria

    Why does child labour persist with declining poverty?

    Jayanta Sarkar and Dipanwita Sarkar

    The uneven success of poverty-based approaches calls for a re-think of the causes behind persistent child labour in many developing societies. We develop a theoretical model to highlight the role of income inequality as a channel of persistence. The interplay between income inequality and investments in human capital gives rise to a non-convergent dynamic path of income distribution characterised by clustering of steady-state relative incomes around local poles. The child labour trap thus generated is shown to preserve itself despite rising per capita income. In this context, we demonstrate that redistributive policies, such as public provision of education, can alleviate the trap, while a ceteris paribus ban on child labour is likely to aggravate it.

  • #83
    Download full text
    JEL-Codes:
    D03;D81;C93
    Keywords:
    Decision under risk, large losses, natural experiment

    Variation in Risk Seeking Behavior in a Natural Experiment on Large Losses Induced by a Natural Disaster

    Lionel Page, David Savage and Benno Torgler

    This study explores people's risk attitudes after having suffered large real-world losses following a natural disaster. Using the margins of the 2011 Australian floods (Brisbane) as a natural experimental setting, we find that homeowners who were victims of the floods and face large losses in property values are 50% more likely to opt for a risky gamble - a scratch card giving a small chance of a large gain ($500,000) - than for a sure amount of comparable value ($10). This finding is consistent with prospect theory predictions of the adoption of a risk-seeking attitude after a loss.

  • #82
    Download full text
    JEL-Codes:
    C14; C53.
    Keywords:
    Electricity Prices, Price Spikes, Semi-parametric, Multivariate Kernel

    Semi-parametric forecasting of Spikes in Electricity Prices

    Adam E Clements, Joanne Fuller and Stan Hurn

    The occurrence of extreme movements in the spot price of electricity represents a significant source of risk to retailers. Electricity markets are often structured so as to allow retailers to purchase at an unregulated spot price but then sell to consumers at a heavily regulated price. As such, the ability to forecast price spikes is an important aspect of effective risk management. A range of approaches have been considered with respect to modelling electricity prices, including predicting the trajectory of spot prices as well as, more recently, predicting spikes specifically. These models, however, have relied on time series approaches which typically use restrictive decay schemes placing greater weight on more recent observations. This paper develops an alternative, semi-parametric method for forecasting that does not rely on this convention. In this approach, a forecast is a weighted average of historical price data, with the greatest weight given to periods that exhibit similar market conditions to the time at which the forecast is being formed. Weighting is determined by comparing short-term trends in electricity price spike occurrences across time, including other relevant factors such as load, by means of a multivariate kernel scheme. It is found that the semi-parametric method produces forecasts that are more accurate than the previously identified best approach for a short forecast horizon.
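
    A stylized version of the weighting scheme described above: the forecast is a kernel-weighted average of historical outcomes, with weights determined by how close past conditioning variables were to current conditions (Nadaraya-Watson style). The Gaussian product kernel, bandwidths and variable choices below are illustrative assumptions, not the authors' specification.

    import numpy as np

    def kernel_forecast(history_x, history_y, current_x, bandwidths):
        """history_x: (T, d) past conditioning variables; history_y: (T,)
        past outcomes (e.g. spike indicators); returns a weighted average."""
        z = (np.asarray(history_x) - current_x) / bandwidths  # scaled distances
        weights = np.exp(-0.5 * np.sum(z**2, axis=1))         # product Gaussian kernel
        return np.sum(weights * history_y) / np.sum(weights)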

  • #81
    Download full text
    JEL-Codes:
    C91, L15, D82, D40
    Keywords:
    Credence Goods, Experts, Pricing

    The Good, the Bad and the Naive: Do fair prices signal good types or do they induce good behaviour?

    Uwe Dulleck, David Johnston, Rudolf Kerschbamer and Matthias Sutter

    Evidence on behavior of experts in credence goods markets raises an important causality issue: Do "fair prices" induce "good behavior", or do "good experts" post "fair prices"? To answer this question we propose and test a model with three seller types: "the good" choose fair prices and behave consumer-friendly; "the bad" mimic the good types' price-setting, but cheat on quality; and "the naive" fall victim to a projection bias that all sellers behave like the bad types. OLS, sample selection and fixed effects regressions support the model's predictions and show that causality goes from good experts to fair prices.

  • #80
    Download full text
    JEL-Codes:
    C22;G11;G17
    Keywords:
    Volatility, multivariate GARCH, portfolio allocation

    Forecasting multivariate volatility in larger dimensions: some practical issues

    Adam E Clements, Ayesha Scott and Annastiina Silvennoinen

    The importance of covariance modelling has long been recognised in the field of portfolio management and large dimensional multivariate problems are increasingly becoming the focus of research. This paper provides a straightforward and commonsense approach toward investigating whether simpler moving average based correlation forecasting methods have equal predictive accuracy as their more complex multivariate GARCH counterparts for large dimensional problems. We find simpler forecasting techniques do provide equal (and often superior) predictive accuracy in a minimum variance sense. A portfolio allocation problem is used to compare forecasting methods. The global minimum variance portfolio and Model Confidence Set (Hansen, Lunde, and Nason (2003)) are used to compare methods, whilst portfolio weight stability and computational time are also considered.

  • #79
    Download full text
    JEL-Codes:
    D82;H50;H61
    Keywords:
    Electoral control, Fiscal restraints, Credence goods

    Expert Politicians, Electoral Control, and Fiscal Restraints

    Uwe Dulleck and Berthold U Wigger

    Fiscal restraints have been argued to force today's governments to internalize the externalities that extensive borrowing, by causing fiscal instability, imposes on future electorates and governments as well as on other countries. In this article we provide an alternative argument for fiscal restraints, based on an agency perspective on government. A budget-maximizing politician is better informed than the electorate about the spending necessary to ensure the state's ability to provide services for the economy. In this respect, the politician is an expert in the sense of the credence goods literature. The electorate, being able to observe the budget but not the necessary level of spending, will re-elect a government only if its budget does not exceed a critical level. A fiscal restraint limits the maximum spending a government will choose if the re-election level is not sufficient to ensure the state's ability to provide services to the economy. We determine when such a fiscal restraint improves voter welfare and discuss the role of the opposition in situations where very high levels of spending are required.

  • #78

    μ-σ Games

    Uwe Dulleck and Andreas Loffler

    Risk aversion in game theory is usually modelled using expected utility, which was criticized early on, leading to an extensive literature on generalized expected utility. In this paper we are the first to apply μ-σ theory to the analysis of (static) games.
    μ-σ theory is widely accepted in the finance literature, and using it allows us to study the effect of uncertainty endogenous to the game, i.e. mixed equilibria. In particular, we look at the case of linear μ-σ utility functions and determine the best response strategy. In the case of 2x2- and NxM-games we are able to characterize all mixed equilibria.

  • #77
    Download full text
    JEL-Codes:
    E24;E52;F32;F41
    Keywords:
    Open economy macroeconomics, monetary policy, unemployment

    Monetary Policy and Unemployment in Open Economies

    Philipp Engler

    After an expansionary monetary policy shock, employment increases and unemployment falls. In standard New Keynesian models the fall in aggregate unemployment does not affect employed workers at all. However, Luechinger, Meier and Stutzer (2010) found that the risk of unemployment negatively affects the utility of employed workers: an increase in aggregate unemployment decreases workers' subjective well-being, which can be explained by an increased risk of becoming unemployed. I take account of this effect in an otherwise standard New Keynesian open economy model with unemployment as in Gali (2010) and find two important results with respect to expansionary monetary policy shocks. First, the usual wealth effect in New Keynesian models of a declining labor force, which is at odds with the data as highlighted by Christiano, Trabandt and Walentin (2010), is shut down. Second, the welfare effects of such shocks improve considerably, modifying the standard results of the open economy literature that set off with Obstfeld and Rogoff's (1995) redux model.

  • #76
    Download full text
    JEL-Codes:
    C22; G11; G17
    Keywords:
    Volatility, volatility timing, utility, portfolio allocation, realized volatility

    Volatility timing and portfolio selection: How best to forecast volatility

    Adam E Clements and Annastiina Silvennoinen

    Within the context of volatility timing and portfolio selection, this paper considers how best to estimate a volatility model. Two issues are dealt with, namely the frequency of data used to construct volatility estimates, and the loss function used to estimate the parameters of a volatility model. We find support for the use of intraday data for estimating volatility, which is consistent with earlier research. We also find that the choice of loss function is important, and show that a simple mean squared error loss overall provides the best forecasts of volatility upon which to form optimal portfolios.

  • #75
    Download full text
    JEL-Codes:
    C22; E32; E37
    Keywords:
    Business and Financial Cycles; Binary Time Series; BBQ Algorithm

    Econometric Analysis and Prediction of Recurrent Events

    Adrian Pagan and Don Harding

    Economic events such as expansions and recessions in economic activity, bull and bear markets in stock prices, and financial crises have long attracted substantial interest. In recent times there has been a focus upon predicting such events and constructing Early Warning Systems for them. Econometric analysis of such recurrent events is, however, in its infancy. One can represent the events as a set of binary indicators. However, they are different to the binary random variables studied in micro-econometrics, being constructed from some (possibly) continuous data. The lecture discusses what difference this makes to their econometric analysis. It sets out a framework which deals with how the binary variables are constructed, what an appropriate estimation procedure would be, and the implications for their prediction. An example based on Turkish business cycles is used throughout the lecture.
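
    As a concrete example of constructing binary indicators from continuous data, the sketch below implements a toy turning-point rule in the spirit of BBQ-type dating algorithms: a peak (trough) is a local maximum (minimum) within plus or minus k periods. The window k, and the absence of censoring rules on minimum phase and cycle lengths, are simplifications.

    import numpy as np

    def turning_points(y, k=2):
        """Return indices of peaks and troughs: local extrema within +/- k."""
        y = np.asarray(y, dtype=float)
        peaks, troughs = [], []
        for t in range(k, len(y) - k):
            window = y[t - k:t + k + 1]
            if y[t] == window.max():
                peaks.append(t)
            elif y[t] == window.min():
                troughs.append(t)
        return peaks, troughs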

  • #74
    Download full text
    JEL-Codes:
    C91; D81
    Keywords:
    risk preferences, laboratory experiment, elicitation methods, subject heterogeneity

    Within-subject Intra- and Inter-method consistency of two experimental risk attitude elicitation methods

    Uwe Dulleck, Jacob Fell and Jonas Fooken

    We compare the consistency of choices in two methods used to elicit risk preferences on an aggregate as well as on an individual level. We asked subjects to choose twice from a list of nine decisions between two lotteries, as introduced by Holt and Laury (2002, 2005), alternating with nine decisions using the budget approach introduced by Andreoni and Harbaugh (2009). We find that while on an aggregate (subject pool) level the results are (roughly) consistent, on an individual (within-subject) level behavior is far from consistent. Within each method as well as across methods we observe low correlations. This again questions the reliability of experimental risk elicitation measures and the ability to use results from such methods to control for the risk aversion of subjects when explaining effects in other experimental games.

  • #73
    Download full text
    JEL-Codes:
    L14, D82, D44, R50
    Keywords:
    Credence Goods, Design-Build, Competitive Bidding, Sequential Search, Infrastructure Projects

    Contracting for Infrastructure Projects as Credence Goods

    Uwe Dulleck and Jianpei Li

    Large infrastructure projects are a major responsibility of government, which usually lacks the expertise to fully specify the demanded projects. Contractors, typically experts on such projects, advise on the needed design in their bids. Producing the right design is nevertheless costly.
    We model the contracting for such infrastructure projects taking into account this credence goods feature and examine the performance of commonly used contracting methods. We show that when building costs are public information, multistage competitive bidding involving shortlisting of two contractors and contingent compensation of both contractors on design efforts outperforms sequential search and the traditional Design-and-Build approach. While the latter leads to minimum design effort, sequential search suffers from a commitment problem. If building costs are the private information of the contractors and are revealed to them after design cost is sunk, competitive bidding may involve sampling more than two contractors. The commitment problem under sequential search may be overcome by the procurer's incentive to search for low building cost if the design cost is sufficiently low. If this is the case, sequential search may outperform competitive bidding.

  • #72
    Download full text
    JEL-Codes:
    C32; C53; G17
    Keywords:
    Equicorrelation, Implied Correlation, Multivariate GARCH, DCC

    Forecasting Equicorrelation

    Adam E Clements, Christopher A Coleman-Fenn and Daniel R Smith

    We study the out-of-sample forecasting performance of several time-series models of equicorrelation, the average pairwise correlation between a number of assets. Building on the existing Dynamic Conditional Correlation and Linear Dynamic Equicorrelation models, we propose a model that uses proxies for equicorrelation based on high-frequency intraday data, and the level of equicorrelation implied by options prices. Using state-of-the-art statistical evaluation technology, we find that models using both realized and implied equicorrelations outperform models that use daily data alone. However, the out-of-sample forecasting benefits of implied equicorrelation disappear when it is used in conjunction with the realized measures.
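    Equicorrelation itself is straightforward to compute from a panel of returns; the following minimal sketch (with synthetic one-factor data standing in for actual asset returns) makes the definition concrete.

```python
import numpy as np

def equicorrelation(returns: np.ndarray) -> float:
    """Average pairwise correlation across the N columns of a T x N return panel."""
    corr = np.corrcoef(returns, rowvar=False)  # N x N correlation matrix
    n = corr.shape[0]
    # Mean of the off-diagonal entries: (sum of all entries - n ones) / (n^2 - n).
    return (corr.sum() - n) / (n * (n - 1))

rng = np.random.default_rng(1)
# Illustrative panel: one common factor induces positive equicorrelation.
common = rng.standard_normal((500, 1))
returns = 0.6 * common + rng.standard_normal((500, 10))
print(f"equicorrelation = {equicorrelation(returns):.3f}")
```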

  • #71
    Download full text
    JEL-Codes:
    C12; C52; C87; E24; E32
    Keywords:
    unemployment, non-linearity, dynamic modelling, aggregate demand, real wages

    Asymmetric unemployment rate dynamics in Australia

    Gunnar Bardsen, Stan Hurn and Zoe McHugh

    The unemployment rate in Australia is modelled as an asymmetric and nonlinear function of aggregate demand, productivity, real interest rates, the replacement ratio and the real exchange rate. If changes in unemployment are big, the management of demand, real interest rates and the replacement ratio will be good instruments for bringing it down. The model is developed by exploiting recent developments in automated model-selection procedures.

  • #70
    Download full text
    JEL-Codes:
    C14; C52
    Keywords:
    Electricity Prices, Price Spikes, Autoregressive Conditional Duration, Autoregressive

    Forecasting Spikes in Electricity Prices

    Tim Christensen, Stan Hurn and Ken Lindsay

    In many electricity markets, retailers purchase electricity at an unregulated spot price and sell to consumers at a heavily regulated price. Consequently the occurrence of extreme movements in the spot price represents a major source of risk to retailers and the accurate forecasting of these extreme events or price spikes is an important aspect of effective risk management. Traditional approaches to modeling electricity prices are aimed primarily at predicting the trajectory of spot prices. By contrast, this paper focuses exclusively on the prediction of spikes in electricity prices. The time series of price spikes is treated as a realization of a discrete-time point process and a nonlinear variant of the autoregressive conditional hazard (ACH) model is used to model this process. The model is estimated using half-hourly data from the Australian electricity market for the sample period 1 March 2001 to 30 June 2007. The estimated model is then used to provide one-step-ahead forecasts of the probability of an extreme event for every half hour for the forecast period, 1 July 2007 to 30 September 2007, chosen to correspond to the duration of a typical forward contract. The forecasting performance of the model is then evaluated against a benchmark that is consistent with the assumptions of commonly-used electricity pricing models.
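    As a stylized illustration of the modelling idea, the sketch below implements a basic linear autoregressive conditional hazard recursion, in which the expected duration between spikes is updated ACD-style after each event and the one-step-ahead spike probability is the reciprocal of the expected duration. The paper's actual specification is a nonlinear variant, and all parameters and data here are illustrative assumptions.

```python
import numpy as np

def ach_hazard(spikes, omega=2.0, alpha=0.1, beta=0.8):
    """One-step-ahead spike probabilities from a stylized linear ACH recursion:
    the expected duration between spikes follows an ACD-type update,
    psi <- omega + alpha * (last completed duration) + beta * psi,
    and the per-period spike probability is approximated by 1 / psi."""
    T = len(spikes)
    hazard = np.empty(T)
    psi = omega / (1.0 - alpha - beta)  # start at the unconditional mean duration
    since_last = 0
    for t in range(T):
        hazard[t] = min(1.0, 1.0 / psi)
        since_last += 1
        if spikes[t] == 1:
            psi = omega + alpha * since_last + beta * psi  # a spike ends a duration
            since_last = 0
    return hazard

# Illustrative use on a synthetic half-hourly spike indicator series.
rng = np.random.default_rng(2)
spikes = (rng.random(1000) < 0.05).astype(int)
h = ach_hazard(spikes)
print(f"mean one-step-ahead spike probability = {h.mean():.3f}")
```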

  • #69
    Download full text
    Keywords:
    Global Financial Crisis, Great Recession

    Can We Predict Recessions?

    Don Harding and Adrian Pagan

    The fact that the Global Financial Crisis, and the Great Recession it ushered in, was largely unforeseen has led to the common opinion that macroeconomic models and analysis are deficient in some way. Of course, it has probably always been true that businessmen, journalists and politicians have agreed on the proposition that economists can't forecast recessions. Yet we see an enormous published literature that presents results suggesting it is possible to do so, either with some new model or some new estimation method, e.g. Kaufman (2010), Galvao (2006), Dueker (2005), Wright (2006) and Moneta (2005). Moreover, there seems to be no shortage of papers still emerging that make claims along these lines. So a question that naturally arises is how one is to reconcile the existence of an expanding literature on predicting recessions with the scepticism noted above.

  • #68
    Download full text
    JEL-Codes:
    G32;G35
    Keywords:
    Volatility; Trend; Turnover
    (forthcoming)

    Comparing Different Explanations of the Volatility Trend

    Amir Rubin and Daniel Smith

    We analyze the puzzling behavior of the volatility of individual stock returns over the past few decades. The literature has provided many different explanations for the trend in volatility, and this paper tests the viability of these explanations. Virtually all current theoretical arguments for the trend in the average level of volatility over time lend themselves to explanations of the difference in volatility levels between firms in the cross-section. We therefore focus separately on the cross-sectional and time-series explanatory power of the different proxies. We fail to find a proxy that is able to explain both dimensions well. In particular, we find that the market-to-book ratio of Cao et al. (2008) tracks average volatility levels well, but has no cross-sectional explanatory power. On the other hand, the low-price proxy suggested by Brandt et al. (2010) has much cross-sectional explanatory power, but virtually no time-series explanatory power. We also find that the different proxies do not explain the trend in volatility in the period prior to 1995 (R-squared of virtually zero), but explain rather well the trend in volatility at the turn of the millennium (1995-2005).

  • #67
    Download full text
    JEL-Codes:
    C12; C14; C52; G11
    Keywords:
    Value-at-Risk, Backtesting, Quantile Regression
    (forthcoming)

    Evaluating Value-at-Risk Models via Quantile Regression

    Wagner Piazza Gaglianone, Luiz Renato Lima, Oliver Linton and Daniel Smith

    This paper is concerned with evaluating Value-at-Risk estimates. It is well known that using only binary variables, such as whether or not there was an exception, sacrifices too much information. However, most of the specification tests (also called backtests) available in the literature, such as Christoffersen (1998) and Engle and Manganelli (2004), are based on such variables. In this paper we propose a new backtest that does not rely solely on binary variables. It is shown that the new backtest provides a sufficient condition to assess the finite sample performance of a quantile model whereas the existing ones do not. The proposed methodology allows us to identify periods of increased risk exposure based on a quantile regression model (Koenker and Xiao, 2002). Our theoretical findings are corroborated through a Monte Carlo simulation and an empirical exercise with the daily S&P500 time series.
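    A minimal sketch of the contrast the paper draws, using synthetic data and the generic quantile regression routine in statsmodels rather than the authors' own test: the hit sequence reduces each day to a binary exception indicator, while a quantile regression of returns on the VaR forecast at the same level checks whether the forecast actually is the conditional quantile (intercept near zero and slope near one under correct specification).

```python
import numpy as np
import statsmodels.api as sm
from statsmodels.regression.quantile_regression import QuantReg

rng = np.random.default_rng(3)

# Synthetic returns with time-varying volatility and a (correct) 5% VaR forecast.
T, tau = 2000, 0.05
sigma = 0.5 + 0.5 * np.abs(np.sin(np.arange(T) / 50.0))
returns = sigma * rng.standard_normal(T)
var_forecast = sigma * (-1.645)  # the true 5% quantile of N(0, sigma^2)

# Binary backtest ingredient: the hit sequence records exceptions only.
hits = (returns < var_forecast).astype(int)
print(f"empirical exception rate = {hits.mean():.3f} (nominal {tau})")

# Quantile-regression check: regress returns on the VaR forecast at level tau.
# Under correct specification the tau-quantile of returns given the forecast
# is the forecast itself, i.e. intercept ~ 0 and slope ~ 1.
X = sm.add_constant(var_forecast)
res = QuantReg(returns, X).fit(q=tau)
print(f"intercept = {res.params[0]:.3f}, slope = {res.params[1]:.3f}")
```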

  • #66
    Download full text
    JEL-Codes:
    C14, C32, C53, C58
    Keywords:
    Nonparametric, variance-covariance matrix, volatility forecasting, multivariate

    A Kernel Technique for Forecasting the Variance-Covariance Matrix

    Ralf Becker, Adam Clements and Robert O'Neill

    The forecasting of variance-covariance matrices is an important issue. In recent years an increasing body of literature has focused on multivariate models to forecast this quantity. This paper develops a nonparametric technique for generating multivariate volatility forecasts as a weighted average of historical volatility, with the weights informed by a broader set of macroeconomic variables. As opposed to traditional techniques, where the weights simply decay as a function of time, this approach employs a kernel weighting scheme in which historical periods exhibiting conditions most similar to those at the time the forecast is formed attract the greatest weight. It is found that the proposed method leads to superior forecasts, with macroeconomic information playing an important role.
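    The following sketch illustrates the weighting idea under stated assumptions: synthetic data, a Gaussian product kernel, and generic conditioning variables standing in for the paper's macroeconomic series. Each historical covariance matrix is weighted by the similarity of its period's conditions to those at the forecast origin, and the convex combination keeps the forecast positive semidefinite.

```python
import numpy as np

def kernel_covariance_forecast(covs, conditions, current, bandwidth=1.0):
    """Forecast a covariance matrix as a kernel-weighted average of history.

    covs:       (T, N, N) array of historical (realized) covariance matrices
    conditions: (T, k) array of conditioning variables at each date
    current:    (k,) conditions at the forecast origin
    Weights are Gaussian-kernel similarities of past conditions to current
    ones, so the most similar historical periods get the greatest weight.
    """
    dist2 = ((conditions - current) ** 2).sum(axis=1)
    w = np.exp(-0.5 * dist2 / bandwidth**2)
    w /= w.sum()
    # A convex combination of positive semidefinite matrices stays PSD.
    return np.einsum("t,tij->ij", w, covs)

# Illustrative use with synthetic data.
rng = np.random.default_rng(4)
T, N, k = 300, 3, 2
conditions = rng.standard_normal((T, k))
covs = np.array([np.eye(N) * (1.0 + 0.5 * c[0] ** 2) for c in conditions])
forecast = kernel_covariance_forecast(covs, conditions, current=conditions[-1])
print(np.round(forecast, 3))
```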

  • #65
    Download full text
    JEL-Codes:
    C22;C52
    Keywords:
    stochastic differential equations, parameter estimation, quasi-maximum likelihood, moments

    A quasi-maximum likelihood method for estimating the parameters of multivariate diffusions

    Stan Hurn, Andrew McClelland and Kenneth Lindsay

    This paper develops a quasi-maximum likelihood (QML) procedure for estimating the parameters of multi-dimensional stochastic differential equations. The transitional density is taken to be a time-varying multivariate Gaussian where the first two moments of the distribution are approximately the true moments of the unknown transitional density. For affine drift and diffusion functions, the moments are shown to be exactly those of the true transitional density, and for nonlinear drift and diffusion functions the approximation is extremely good. The estimation procedure is easily generalizable to models with latent factors, such as the stochastic volatility class of models. The QML method is as effective as alternative methods when proxy variables are used for unobserved states. A conditioning estimation procedure is also developed that allows parameter estimation in the absence of proxies.
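    A scalar sketch of the QML idea, with two simplifications relative to the paper: the model is a univariate Ornstein-Uhlenbeck process rather than a multivariate diffusion, and first-order (Euler) approximations stand in for the exact conditional moments that are available in the affine case.

```python
import numpy as np
from scipy.optimize import minimize

# Stylized scalar example: Ornstein-Uhlenbeck dX = kappa*(theta - X) dt + sigma dW.
# QML treats the transitional density as Gaussian; here first-order (Euler)
# approximations to its mean and variance are used for brevity.

def neg_loglik(params, x, dt):
    kappa, theta, sigma = params
    if kappa <= 0 or sigma <= 0:
        return np.inf
    mean = x[:-1] + kappa * (theta - x[:-1]) * dt  # approximate conditional mean
    var = sigma**2 * dt                            # approximate conditional variance
    resid = x[1:] - mean
    return 0.5 * np.sum(np.log(2 * np.pi * var) + resid**2 / var)

# Simulate a path and estimate by quasi-maximum likelihood.
rng = np.random.default_rng(5)
kappa0, theta0, sigma0, dt, T = 2.0, 1.0, 0.3, 1 / 12, 2000
x = np.empty(T)
x[0] = theta0
for t in range(1, T):
    x[t] = x[t-1] + kappa0 * (theta0 - x[t-1]) * dt \
           + sigma0 * np.sqrt(dt) * rng.standard_normal()

res = minimize(neg_loglik, x0=[1.0, 0.5, 0.5], args=(x, dt), method="Nelder-Mead")
print("kappa, theta, sigma estimates:", np.round(res.x, 3))
```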

  • #64
    Download full text
    JEL-Codes:
    G10;G12
    Keywords:
    Realized volatility, bi-power variation, limit order book, market microstructure, order imbalance

    Volatility and the role of order book structure

    Ralf Becker and Adam Clements

    There is much literature that deals with modeling and forecasting asset return volatility. However, much of this research does not attempt to explain variations in the level of volatility. Movements in volatility are often linked to trading volume or frequency, as a reflection of underlying information flow. This paper considers whether the state of an open limit order book influences volatility. It is found that market depth and order imbalance do influence volatility, even in the presence of the traditional volume related variables.

  • #63
    Download full text
    Keywords:
    Business cycles, binary models, predicting recessions

    Can Turkish Recessions be Predicted?

    Adrian Pagan

    In response to the widespread criticism that macro-economists failed to predict the global recession coming from the GFC, we look at whether recessions in Turkey can be predicted. Because growth in Turkish GDP is quite persistent, one might expect this to be possible. But it is the sign of GDP growth that needs to be forecast if we are to predict a recession, and this is made more difficult by the persistence in GDP growth. We build a small SVAR model of the Turkish economy that is motivated by New Keynesian models of the open economy, and find that using the variables entering it increases predictive success, although it is still the case that the predictive record is not good. Non-linear models for Turkish growth are then found to add little to predictive ability. Fundamentally, recession prediction requires one to forecast future shocks to the economy, and thus one needs some indicators of these. The paper explores a range of indicators for the Turkish economy, but none are particularly advantageous. Developing a bigger range of these indicators should be a priority for future Turkish macro-economic research.

  • #62
    Download full text
    Keywords:
    Rugby league; Rugby Union; favouritism

    Evidence of referees' national favouritism in rugby

    Lionel Page and Katie Page

    The present article reports evidence of national favouritism by professional referees in two major sports: Rugby League and Rugby Union. National favouritism can arise when a referee is in charge of a match where one team (and only one) is from his country. For fear of such favouritism, these situations are avoided in most major sports. In this study we examine two specific competitions that depart from this 'national neutrality' rule: the European Super League in Rugby League (and its second tier competition) and the Super 14 in Rugby Union. In both cases we find strong evidence that referees favour teams of their own nationality, in a way which has a large influence on match results.
    For these two major competitions, the Super League and the Super 14, we compare how a team performs when the referee shares its nationality with how it performs when the referee is of a different nationality. We also analyse referees' decisions within matches (such as penalty and try decisions) in a Rugby League competition, the Championship (the second tier below the Super League). In both Rugby League and Rugby Union we find strong evidence of national favouritism.

  • #61
    Download full text
    JEL-Codes:
    C23; L83
    Keywords:
    playoff uncertainty, match uncertainty, sports league attendance, Australian National Rugby League, fixed effects estimation

    Playoff Uncertainty, Match Uncertainty and Attendance at Australian National Rugby League Matches

    Nicholas King, P Dorian Owen and Rick Audas

    This paper develops a new simulation-based measure of playoff uncertainty and investigates its contribution to modelling match attendance compared to other variants of playoff uncertainty in the existing literature. A model of match attendance that incorporates match uncertainty, playoff uncertainty, past home-team performance and other relevant control variables is fitted to Australian National Rugby League data for seasons 2004-2008 using fixed effects estimation. The results suggest that playoff uncertainty and home-team success are more important determinants of match attendance than match uncertainty. Alternative measures of playoff uncertainty based on points behind the leader, although more ad hoc, also appear able to capture the effects of playoff uncertainty.

  • #60
    Download full text
    JEL-Codes:
    C22; G00
    Keywords:
    Cholesky, Midas, volatility forecasts

    A Cholesky-MIDAS model for predicting stock portfolio volatility

    Ralf Becker, Adam Clements and Robert O'Neill

    This paper presents a simple forecasting technique for variance-covariance matrices. It builds on the contribution of Chiriac and Voev (2010), who propose forecasting the elements of the Cholesky decomposition, which recombine to form a positive definite forecast of the variance-covariance matrix. The method proposed here combines this methodology with advances made in the MIDAS literature to produce a forecasting approach that is flexible, scales easily with the size of the portfolio, and produces superior forecasts in simulation experiments and an empirical application.
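    A minimal sketch of the recombination step, assuming synthetic realized covariance matrices and a beta-polynomial lag weighting in the spirit of MIDAS (the shape parameter and data are illustrative): each Cholesky factor is forecast as a weighted average of its own history, and multiplying the forecast factor by its transpose guarantees a positive definite covariance forecast.

```python
import numpy as np

def cholesky_combine_forecast(covs, weights):
    """Forecast each Cholesky element as a weighted average of past factors,
    then recombine so the forecast is positive definite by construction."""
    chols = np.array([np.linalg.cholesky(c) for c in covs])  # (T, N, N)
    C = np.einsum("t,tij->ij", weights / weights.sum(), chols)
    return C @ C.T

def beta_weights(n_lags, theta=5.0):
    """Beta-polynomial lag weights in the spirit of MIDAS (theta illustrative)."""
    u = np.arange(1, n_lags + 1) / (n_lags + 1)
    w = (1 - u) ** (theta - 1)
    return w[::-1] / w.sum()  # most recent observation gets the largest weight

rng = np.random.default_rng(6)
T, N = 60, 3
A = rng.standard_normal((T, N, 5))
covs = np.array([a @ a.T / 5 + 0.1 * np.eye(N) for a in A])  # synthetic realized covs

forecast = cholesky_combine_forecast(covs, beta_weights(T))
print("symmetric:", np.allclose(forecast, forecast.T),
      "| min eigenvalue:", np.linalg.eigvalsh(forecast).min().round(4))
```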

  • #59
    Download full text
    JEL-Codes:
    D63;L83
    Keywords:
    sports economics, competitive balance, relative standard deviation, idealized standard deviation, draws/ties

    Measuring Parity in Sports Leagues with Draws: Further Comments

    P Dorian Owen

    This paper re-examines the calculation of the relative standard deviation (RSD) measure of competitive balance in leagues in which draws are possible outcomes. Some key conclusions emerging from the exchange between Cain and Haddock (2006) and Fort (2007) are reversed. For any given points assignment scheme, there is no difference between the RSD for absolute points and the RSD for percentages of points. However, variations in the points assignment that change the ratio of points for a win relative to a draw do result in different RSD values, although the numerical differences are minor.
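    The invariance result is easy to verify numerically. In the sketch below (with an illustrative equal-strength benchmark of win/draw/loss probabilities under a 3-1-0 points scheme), RSD is the dispersion of average points per game across teams divided by its value in an idealized league of equal-strength teams; rescaling points to percentages leaves the ratio unchanged because numerator and denominator scale by the same factor.

```python
import numpy as np

def rsd(points, games, sigma_per_game):
    """Relative standard deviation: actual dispersion of average points per game
    across teams, divided by its value in an idealized equal-strength league."""
    per_game = points / games
    idealized = sigma_per_game / np.sqrt(games)
    return per_game.std(ddof=0) / idealized

# Equal-strength benchmark for a (win, draw, loss) = (3, 1, 0) points scheme,
# assuming illustrative outcome probabilities (0.4, 0.2, 0.4) per game.
vals, probs = np.array([3.0, 1.0, 0.0]), np.array([0.4, 0.2, 0.4])
mu = vals @ probs
sigma = np.sqrt(probs @ (vals - mu) ** 2)

rng = np.random.default_rng(7)
games, teams = 38, 20
season = rng.choice(vals, size=(teams, games), p=probs).sum(axis=1)

# RSD is identical for absolute points and for percentages of maximum points.
print(f"RSD (points)      = {rsd(season, games, sigma):.3f}")
print(f"RSD (percentages) = {rsd(season / 3.0, games, sigma / 3.0):.3f}")
```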

  • #58
    Download full text
    JEL-Codes:
    C22;C53;E32;E37
    Keywords:
    Generalized dynamic categorical model, Business cycle; binary variable, Markov process, probit model, yield curve

    Applying shape and phase restrictions in generalized dynamic categorical models of the business cycle

    Don Harding

    To match the NBER business cycle features it is necessary to employ Generalised dynamic categorical (GDC) models that impose certain phase restrictions and permit multiple indexes. Theory suggests additional shape restrictions in the form of monotonicity and boundedness of certain transition probabilities. Maximum likelihood and constraint weighted bootstrap estimators are developed to impose these restrictions. In the application these estimators generate improved estimates of how the probability of recession varies with the yield spread.

  • #57
    Download full text
    JEL-Codes:
    E32;C51;C32
    Keywords:
    Structural Vector Autoregressions, New Keynesian Model, Sign Restrictions

    Sign Restrictions in Structural Vector Autoregressions: A Critical Review

    Renee Fry and Adrian Pagan

    The paper provides a review of the estimation of structural VARs with sign restrictions. It is shown how sign restrictions solve the parametric identification problem present in structural systems but leave the model identification problem unresolved. A market and a macro model are used to illustrate these points. Suggestions have been made on how to find a unique model. These are reviewed, along with some of the difficulties that arise in using the impulse responses found with sign restrictions.

  • #56
    Download full text
    JEL-Codes:
    C1; C32; G14
    Keywords:
    US Treasury markets, high frequency data, cojump test

    Cojumping: Evidence from the US Treasury Bond and Futures Markets

    Mardi Dungey and Lyudmyla Hvozdyk

    The basis between spot and futures prices will be affected by jump behavior in each asset price, challenging intraday hedging strategies. Using a formal cojumping test, this paper considers the cojumping behavior of spot and futures prices in high frequency US Treasury data. Cojumping occurs most frequently at shorter maturities and higher sampling frequencies. We find that the presence of an anticipated macroeconomic news announcement, particularly non-farm payrolls, increases the probability of observing cojumps. However, a negative surprise in non-farm payrolls also increases the probability of the cojumping tests being unable to determine whether jumps in spots and futures occur contemporaneously or whether one market follows the other. On these occasions the market does not clearly signal its short term pricing behavior.

  • #55
    Download full text
    JEL-Codes:
    C93
    Keywords:
    Tournament, first-mover advantage, psychological pressure, field experiment, soccer, penalty shootouts

    Psychological pressure in competitive environments: Evidence from a randomized natural experiment: Comment

    Martin G. Kocher, Marc V. Lenz and Matthias Sutter

    Apesteguia and Palacios-Huerta (forthcoming) report for a sample of 129 shootouts from various seasons in ten different competitions that teams kicking first in soccer penalty shootouts win significantly more often than teams kicking second. Collecting data for the entire history of six major soccer competitions we cannot replicate their result. Teams kicking first win only 53.4% of 262 shootouts in our data, which is not significantly different from random. Our findings have two implications: (1) Apesteguia and Palacios-Huerta's results are not generally robust. (2) Using specific subsamples without a coherent criterion for data selection might lead to non-representative findings.
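    The headline comparison is a one-line binomial test; in the sketch below, the 140 first-mover wins are inferred from 53.4% of 262 shootouts and are therefore an assumption rather than a figure taken from the paper.

```python
from scipy.stats import binomtest

# 53.4% of 262 shootouts implies roughly 140 first-kicker wins (inferred figure).
result = binomtest(k=140, n=262, p=0.5, alternative="two-sided")
print(f"two-sided p-value = {result.pvalue:.3f}")  # well above conventional levels
```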

  • #54
    Download full text
    JEL-Codes:
    C22; G11; G17
    Keywords:
    Volatility, utility, portfolio allocation, realized volatility, MIDAS

    Portfolio allocation: Getting the most out of realised volatility

    Adam Clements and Annastiina Silvennoinen

    Recent advances in the measurement of volatility have utilized high frequency intraday data to produce what are generally known as realised volatility estimates. It has been shown that forecasts generated from such estimates are of positive economic value in the context of portfolio allocation. This paper considers the link between the value of such forecasts and the loss function under which models of realised volatility are estimated. It is found that employing a utility based estimation criterion is preferred over likelihood estimation; however, a simple mean squared error criterion performs in a similar manner. These findings have obvious implications for the manner in which volatility models based on realised volatility are estimated when one wishes to inform the portfolio allocation decision.

  • #53
    Download full text
    JEL-Codes:
    C51; E31; E52
    Keywords:
    Monetary Policy, Bank Credit, VAR, Brazil, Chile

    The Credit Channel and Monetary Transmission in Brazil and Chile: A Structured VAR Approach

    Luis Catão and Adrian Pagan

    We use an expectation-augmented SVAR representation of an open economy New Keynesian model to study monetary transmission in Brazil and Chile. The underlying structural model incorporates key structural features of Emerging Market economies, notably the role of a bank-credit channel. We find that interest rate changes have swifter effects on output and inflation in both countries compared to advanced economies, and that exchange rate dynamics play an important role in monetary transmission, as currency movements are highly responsive to changes in policy-controlled interest rates. We also find credit shocks of typical size to have large effects on output and inflation in the two economies, the effects being stronger in Chile, where bank penetration is higher.

  • #52
    Download full text
    JEL-Codes:
    C22; C53; Q49
    Keywords:
    Stock returns, Technical analysis, Momentum trading rules, Bootstrapping.

    Testing the Profitability of Technical Analysis as a Portfolio Selection Strategy

    Vlad Pavlov and Stan Hurn

    One of the main difficulties in evaluating the profits obtained using technical analysis is that trading rules are often specified rather vaguely by practitioners and depend upon the judicious choice of rule parameters. In this paper, popular moving-average (or cross-over) rules are applied to a cross-section of Australian stocks and the signals from the rules are used to form portfolios. The performance of the trading rules across the full range of possible parameter values is evaluated by means of an aggregate test that does not depend on the parameters of the rules. The results indicate that for a wide range of parameters moving-average rules generate contrarian profits (profits from the moving-average rules are negative). In bootstrap simulations the returns statistics are significant, indicating that the moving-average rules pick up some form of systematic variation in returns that does not correlate with the standard risk factors.
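    A minimal sketch of a single moving-average cross-over rule on a synthetic price path (window lengths and data are illustrative; the paper evaluates the whole parameter range with an aggregate test rather than any one configuration):

```python
import numpy as np

def ma_crossover_signal(prices, short=5, long=50):
    """+1 (long) when the short moving average is above the long one, else -1."""
    def moving_average(x, n):
        return np.convolve(x, np.ones(n) / n, mode="valid")
    s, l = moving_average(prices, short), moving_average(prices, long)
    s = s[len(s) - len(l):]  # align the two averages on the same dates
    return np.where(s > l, 1, -1)

rng = np.random.default_rng(8)
prices = 100 * np.exp(np.cumsum(0.0002 + 0.01 * rng.standard_normal(750)))
signal = ma_crossover_signal(prices)

# Rule returns: yesterday's signal applied to today's log price change.
rets = np.diff(np.log(prices))[-len(signal):]
rule_rets = signal[:-1] * rets[1:]
print(f"annualized mean rule return = {252 * rule_rets.mean():.3%}")
```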

  • #51
    Download full text
    JEL-Codes:
    J24; M51
    Keywords:
    Productivity, leadership

    Substitution Between Managers and Subordinates: Evidence from British Football

    Sue Bridgewater, Lawrence M. Kahn and Amanda H. Goodall

    We use data on British football managers and teams over the 1994-2007 period to study substitution and complementarity between leaders and subordinates. We find for the Premier League (the highest level of competition) that, other things being equal, managers who themselves played at a higher level raise the productivity of less-skilled teams by more than that of highly skilled teams. This is consistent with the hypothesis that one function of a top manager is to communicate to subordinates the skills needed to succeed, since less skilled players have more to learn. We also find that managers with more accumulated professional managing experience raise the productivity of talented players by more than that of less-talented players. This is consistent with the hypothesis that a further function of successful managers in high-performance workplaces is to manage the egos of elite workers. Such a function is likely more important the more accomplished the workers are -- as indicated, in our data, by teams with greater payrolls.

  • #50
    Download full text
    JEL-Codes:
    E12;E13;C51;C52
    Keywords:
    DSGE models;Phillips Curve;Macroeconometric Models;Bayesian Estimation

    Structural Macro-Econometric Modelling in a Policy Environment

    Martin Fukac and Adrian Pagan

    The paper looks at the development of macroeconometric models over the past sixty years, in particular those that have been used for analysing policy options. We argue that there have been four generations of these models. Each generation has evolved new features, drawn partly from the developing academic literature and partly from the perceived weaknesses of the previous generation. Overall, the evolution has been governed by a desire to answer a set of basic questions, and sometimes by what can be achieved using new computational methods. Our account of each generation considers their design, the way in which parameters were quantified, and how the models were evaluated.

  • #49
    Download full text
    JEL-Codes:
    C14; C52.
    Keywords:
    Transitory components, common features, reduced rank, cointegration.

    Detecting Common Dynamics in Transitory Components

    Tim M Christensen, Stan Hurn and Adrian Pagan

    This paper considers VAR/VECM models for variables exhibiting cointegration and common features in the transitory components. While the presence of cointegration reduces the rank of the long-run multiplier matrix, other types of common features lead to rank reduction in the short-run dynamics. These common transitory components arise when linear combinations of the first-differenced variables in a cointegrated VAR are white noise. This paper offers a reinterpretation of the traditional approach to testing for common feature dynamics, namely checking for a singular covariance matrix of the transitory components. Instead, the matrix of short-run coefficients becomes the focus of the testing procedure, thus allowing a wide range of tests for reduced rank in parameter matrices to be potentially relevant tests of common transitory components. The performance of the different methods is illustrated in a Monte Carlo analysis, which is then used to reexamine an existing empirical study. Finally, this approach is applied to analyze whether one would observe common dynamics in standard DSGE models.

  • #48
    Download full text
    Keywords:
    sports betting, inter-market arbitrage

    Inter-market Arbitrage in Sports Betting

    Egon Franck, Erwin Verbeek and Stephan Nuesch

    Unlike the existing literature on sports betting, which concentrates on arbitrage within a single market, this paper examines inter-market arbitrage by searching for arbitrage opportunities through combining bets at the bookmaker and the exchange market. Using the posted odds of eight different bookmakers and the corresponding odds traded at a well-known bet exchange for 5,478 football matches played in the top-five European leagues during three seasons, we find (only) ten intra-market arbitrage opportunities. However, we find 1,450 cases in which a combined bet at the bookmaker as well as at the exchange yields a guaranteed positive return. Further analyses reveal that inter-market arbitrage emerges from different levels of informational efficiency between the two markets.
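    The inter-market condition can be made concrete with standard back/lay algebra: backing an outcome at a bookmaker and laying the same outcome at an exchange locks in a profit whenever the back odds exceed (lay odds - c) / (1 - c), where c is the exchange commission. The sketch below (commission rate and odds are illustrative) chooses the lay stake to equalize profit across outcomes.

```python
def back_lay_arbitrage(odds_back: float, odds_lay: float,
                       commission: float = 0.05, stake: float = 100.0) -> float:
    """Guaranteed profit (if any) from backing at a bookmaker and laying the
    same outcome at an exchange. Decimal odds; the exchange charges
    `commission` on net winnings. The lay stake equalizes profit across
    outcomes:
      outcome wins:  stake*(odds_back - 1) - lay_stake*(odds_lay - 1)
      outcome loses: -stake + lay_stake*(1 - commission)
    """
    lay_stake = stake * odds_back / (odds_lay - commission)
    profit = -stake + lay_stake * (1 - commission)
    return profit if profit > 0 else 0.0

# Arbitrage exists when the bookmaker's odds exceed (odds_lay - c) / (1 - c).
print(back_lay_arbitrage(odds_back=2.10, odds_lay=2.00))  # positive: arbitrage
print(back_lay_arbitrage(odds_back=1.90, odds_lay=2.00))  # zero: no arbitrage
```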

  • #47
    Download full text
    JEL-Codes:
    L83; D62
    Keywords:
    Sport participation, relational goods, crime, Kenneth Boulding

    Relational Good at Work! Crime and Sport Participation in Italy. Evidence from Panel Data Regional Analysis over the Period 1997-2003.

    Raul Caruso

    What is the broad impact of sport participation and sport activities in a society? The first aim of this paper is to tackle this crucial point by studying whether or not there is a relationship between sport participation and crime. A panel dataset has been constructed for the twenty Italian regions over the period 1997-2003. The impact of sport participation on different types of crime has been studied. Results show that: (i) there is a robust negative association between sport participation and property crime; (ii) there is a robust negative association between sport participation and juvenile crime; (iii) there is a positive association between sport participation and violent crime, but it is only weakly significant.

  • #46
    Download full text
    JEL-Codes:
    D81; L83
    Keywords:
    social pressure, nationality, decision-making, referee home bias, football
    (Accepted)

    The Influence of Social Pressure and Nationality on Individual Decisions: Evidence from the Behaviour of Referees

    Peter Dawson and Stephen Dobson

    This study considers the influences on agents’ decisions in an international context. Using data from five seasons of European cup football matches it is found that referees favour home teams when awarding yellow and red cards. Previous research on referee decisions in national leagues has identified social pressure as a key reason for favouritism. While social pressure is also found to be an important influence in this study, the international setting shows that nationality is another important influence on the decision-making of referees. In considering principal-agent relationships account needs to be taken not only of how agents (referees) decide under social pressure but also of how national identity shapes agents’ decision making.

  • #45
    Download full text
    JEL-Codes:
    C12;C22;G00
    Keywords:
    Implied volatility, volatility forecasts, volatility models, volatility risk premium, model confidence sets

    Forecast performance of implied volatility and the impact of the volatility risk premium

    Ralf Becker, Adam Clements and Christopher Coleman-Fenn

    Forecasting volatility has received a great deal of research attention, with the relative performance of econometric models based on time-series data and option implied volatility forecasts often being considered. While many studies find that implied volatility is the preferred approach, a number of issues remain unresolved. Implied volatilities are risk-neutral forecasts of spot volatility, whereas time-series models are estimated on risk-adjusted or real world data of the underlying. Recently, an intuitive method has been proposed to adjust these risk-neutral forecasts into their risk-adjusted equivalents, possibly improving on their forecast accuracy. By utilising recent econometric advances, this paper considers whether these risk-adjusted forecasts are statistically superior to the unadjusted forecasts, as well as a wide range of model based forecasts. It is found that an unadjusted risk-neutral implied volatility is an inferior forecast. However, after adjusting for the risk premia it is of equal predictive accuracy relative to a number of model based forecasts.

  • #44
    Download full text
    JEL-Codes:
    C10;C22;G11;G17
    Keywords:
    Volatility, utility, portfolio allocation, realized volatility, MIDAS

    On the economic benefit of utility based estimation of a volatility model

    Adam Clements and Annastiina Silvennoinen

    Forecasts of asset return volatility are necessary for many financial applications, including portfolio allocation. Traditionally, the parameters of econometric models used to generate volatility forecasts are estimated in a statistical setting and subsequently used in an economic setting such as portfolio allocation. Differences in the criteria under which the model is estimated and applied may reduce the overall economic benefit of a model in the context of portfolio allocation. This paper investigates the economic benefit of direct utility based estimation of the parameters of a volatility model and allows for practical issues such as transactions costs to be incorporated within the estimation scheme. In doing so, we compare the benefits stemming from various estimators of historical volatility in the context of portfolio allocation. It is found that utility based estimation, taking transactions costs into account, of a simple volatility model is preferred on the basis of greater realized utility. Estimation of models using historical daily returns is preferred over historical realized volatility.

  • #43
    Download full text
    JEL-Codes:
    C22; G00
    Keywords:
    Volatility, forecasts, forecast evaluation, model confidence set, nonparametric

    A nonparametric approach to forecasting realized volatility

    Adam Clements and Ralf Becker

    A well developed literature exists in relation to modeling and forecasting asset return volatility, much of which relates to the development of time series models of volatility. This paper proposes an alternative method for forecasting volatility that does not involve such a model. Under this approach a forecast is a weighted average of historical volatility. The greatest weight is given to periods that exhibit market conditions most similar to those at the time the forecast is being formed. Weighting occurs by comparing short-term trends in volatility across time (as a measure of market conditions) through the application of a multivariate kernel scheme. It is found that at a 1 day forecast horizon, the proposed method produces forecasts that are significantly more accurate than competing approaches.
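    A univariate sketch of the weighting scheme under stated assumptions (synthetic persistent volatility, a Gaussian kernel, a five-day trend window): each historical date is weighted by how closely the volatility path leading up to it resembles the most recent path, and the forecast is the weighted average of what followed.

```python
import numpy as np

def trend_kernel_forecast(vol, trend_len=5, bandwidth=0.5):
    """Forecast next-period volatility as a kernel-weighted average of history,
    weighting past dates by how closely their recent short-term volatility
    trend resembles the current one."""
    current = vol[-trend_len:]
    weights, candidates = [], []
    for t in range(trend_len, len(vol) - 1):
        past = vol[t - trend_len:t]
        dist2 = ((past - current) ** 2).mean()
        weights.append(np.exp(-0.5 * dist2 / bandwidth**2))
        candidates.append(vol[t])  # the volatility that followed that trend
    w = np.array(weights)
    w /= w.sum()
    return float(w @ np.array(candidates))

rng = np.random.default_rng(9)
# Synthetic persistent volatility series (log-AR(1)), purely illustrative.
logv = np.empty(1000)
logv[0] = 0.0
for t in range(1, 1000):
    logv[t] = 0.95 * logv[t - 1] + 0.2 * rng.standard_normal()
vol = np.exp(logv)
print(f"1-day-ahead forecast = {trend_kernel_forecast(vol):.3f}")
```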

  • #42

    The Economics of Credence Goods: On the Role of Liability, Verifiability, Reputation and Competition

    Uwe Dulleck, Rudolf Kerschbamer and Matthias Sutter

    Credence goods markets are characterized by asymmetric information between sellers and consumers that may give rise to inefficiencies, such as under- and overtreatment or market break-down. We study in a large experiment with 936 participants the determinants for efficiency in credence goods markets. While theory predicts that either liability or verifiability yields efficiency, we find that liability has a crucial, but verifiability only a minor effect. Allowing sellers to build up reputation has little influence, as predicted. Seller competition drives down prices and yields maximal trade, but does not lead to higher efficiency as long as liability is violated.

  • #41
    Download full text
    JEL-Codes:
    C22; G00
    Keywords:
    Multivariate volatility, forecasts, forecast evaluation, Model confidence set

    Evaluating multivariate volatility forecasts

    Adam Clements, Mark Doolan, Stan Hurn and Ralf Becker

    The performance of techniques for evaluating univariate volatility forecasts is well understood. In the multivariate setting, however, the efficacy of the evaluation techniques is less well developed. Multivariate forecasts are often evaluated within an economic application, such as a portfolio optimisation context. This paper evaluates the efficacy of such techniques, along with traditional statistical methods. It is found that utility based methods perform poorly in terms of identifying optimal forecasts, whereas statistical methods are more effective.

  • #40
    Download full text
    JEL-Codes:
    J71; L83.
    Keywords:
    discrimination, race, gender, basketball
    (forthcoming)

    The Economics of Discrimination: Evidence from Basketball

    Lawrence M. Kahn

    This Chapter reviews evidence on discrimination in basketball, primarily examining studies on race but with some discussion of gender as well. I focus on discrimination in pay, hiring, and retention against black NBA players and coaches and pay disparities by gender among college coaches. There was much evidence for each of these forms of discrimination against black NBA players in the 1980s. However, there appears to be less evidence of racial compensation, hiring and retention discrimination against black players in the 1990s and early 2000s than the 1980s. This apparent decline is consistent with research on customer discrimination in the NBA: in the 1980s, there was abundant evidence of fan preference for white players; however, since the 1980s, these preferences seem much weaker. There appears to be little evidence of pay, hiring or retention discrimination against black NBA coaches, and while male college basketball coaches outearn females, this gap is accounted for by differences in revenues and coaches' work histories. There is some dispute over whether these revenue differences are themselves the result of employer discrimination.

  • #39
    Download full text
    JEL-Codes:
    C22; C53; E32; E37
    Keywords:
    Business cycle; binary variable, Markov process, Probit model, yield curve
    (Accepted)

    An Econometric Analysis of Some Models for Constructed Binary Time Series

    Don Harding and Adrian Pagan

    Macroeconometric and financial researchers often use secondary or constructed binary random variables that differ in terms of their statistical properties from the primary random variables used in micro-econometric studies. One important difference between primary and secondary binary variables is that, while the former are, in many instances, independently distributed (i.d.), the latter are rarely i.d. We show how popular rules for constructing the binary states interact with the stochastic processes of the variables they are constructed from, so that the binary states need to be treated as Markov processes. Consequently, one needs to recognize this when performing analyses with the binary variables, and it is not valid to adopt a model like static Probit which fails to recognize such dependence. Moreover, these binary variables are often censored, in that they are constructed in such a way as to result in sequences of them possessing the same sign. Such censoring imposes restrictions upon the DGP of the binary states and creates difficulties if one tries to utilize a dynamic Probit model with them. Given this, we describe methods for modeling with these variables that both respect their Markov process nature and explicitly deal with any censoring constraints. An application is provided that investigates the relation between the business cycle and the yield spread.

  • #38
    Download full text
    JEL-Codes:
    C61; E52; E58.
    Keywords:
    Discretion, timeless perspective, policy evaluation.

    Timeless Perspective Policymaking: When is Discretion Superior?

    Richard Dennis

    In this paper I show that discretionary policymaking can be superior to timeless perspective policymaking and identify model features that make this outcome more likely. Developing a measure of conditional loss that treats the auxiliary state variables that characterize the timeless perspective equilibrium appropriately, I use a New Keynesian DSGE model to show that discretion can dominate timeless perspective policymaking when the Phillips curve is relatively flat, due, perhaps, to firm-specific capital (or labor) and/or Kimball (1995) aggregation in combination with nominal price rigidity. These results suggest that studies applying the timeless perspective might also usefully compare its performance to discretion, paying careful attention to how policy performance is evaluated.

  • #37
    Download full text
    JEL-Codes:
    C35; D63; D91; P2
    Keywords:
    Expectations; Happiness; Consumption and Savings; China; Political Economy

    Are optimistic expectations keeping the Chinese happy?

    Paul Frijters, Amy Y.C. Liu and Xin Meng

    In this paper we study the effect of optimistic income expectations on life satisfaction amongst the Chinese population. Using a large scale household survey conducted in 2002, we find that the level of optimism about the future is particularly strong in the countryside and amongst rural-to-urban migrants. The importance of these expectations for life satisfaction is particularly pronounced in urban areas, though also highly significant for rural areas. If expectations were to reverse from positive to negative, we calculate that this would double the proportion of unhappy people and reduce the proportion of very happy people by 48%. We perform several robustness checks to see if the results are driven by variations in precautionary savings or reverse causality.

  • #36
    Download full text
    JEL-Codes:
    D000; D600; 8222; 9210; L830
    Keywords:
    Inequality aversion, relative income, positional concerns, envy, social comparison, performance, interdependent preferences

    Inequality Aversion and Performance in and on the Field

    Benno Torgler, Markus Schaffner, Bruno S. Frey, Sascha L. Schmidt and Uwe Dulleck

    The experimental literature and studies using survey data have established that people care a great deal about their relative economic position and not solely, as standard economic theory assumes, about their absolute economic position. Individuals are concerned about social comparisons. However, behavioral evidence in the field is rare. This paper provides an empirical analysis, testing the model of inequality aversion using two unique panel data sets for basketball and soccer players. We find evidence that the concept of inequality aversion helps to understand how the relative income situation affects performance in a real competitive environment with real tasks and real incentives.

  • #35
    Download full text
    JEL-Codes:
    C13; C25; C32.
    Keywords:
    Integer-valued autoregression, Poisson distribution, Bernoulli distribution, latent factors, maximum likelihood estimation

    Discrete time-series models when counts are unobservable

    T M Christensen, A. S. Hurn and K A Lindsay

    Count data in economics have traditionally been modeled by means of integer-valued autoregressive models. Consequently, the estimation of the parameters of these models and their asymptotic properties have been well documented in the literature. The models comprise a description of the survival of counts generally in terms of a binomial thinning process and an independent arrivals process usually specified in terms of a Poisson distribution. This paper extends the existing class of models to encompass situations in which counts are latent and all that is observed is the presence or absence of counts. This is a potentially important modification as many interesting economic phenomena may have a natural interpretation as a series of 'events' that are driven by an underlying count process which is unobserved. Arrivals of the latent counts are modeled either in terms of the Poisson distribution, where multiple counts may arrive in the sampling interval, or in terms of the Bernoulli distribution, where only one new arrival is allowed in the same sampling interval. The models with latent counts are then applied in two practical illustrations, namely, modeling volatility in financial markets as a function of unobservable 'news' and abnormal price spikes in electricity markets being driven by latent 'stress'.
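    The data-generating idea is easy to simulate. The sketch below (parameters illustrative) draws a latent INAR(1)-style count series, combining binomial thinning of surviving counts with Poisson arrivals, and then discards everything except the presence or absence of counts, which is all the econometrician observes in the paper's setting.

```python
import numpy as np

rng = np.random.default_rng(10)

# Latent count process: binomial thinning of survivors plus Poisson arrivals.
T, p_survive, lam = 500, 0.6, 0.3
counts = np.zeros(T, dtype=int)
for t in range(1, T):
    survivors = rng.binomial(counts[t - 1], p_survive)  # each count survives w.p. p
    arrivals = rng.poisson(lam)                         # new latent 'events' arrive
    counts[t] = survivors + arrivals

observed = (counts > 0).astype(int)  # all the econometrician sees
print(f"mean latent count = {counts.mean():.2f}, "
      f"fraction of periods with an observed 'event' = {observed.mean():.2f}")
```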

  • #34
    Download full text
    JEL-Codes:
    C14, C52.
    Keywords:
    Weather Derivatives, Temperature Models, Cooling Degree Days, Maximum Likelihood Estimation, Distribution for Correlated Variables

    Developing analytical distributions for temperature indices for the purposes of pricing temperature-based weather derivatives

    Adam Clements, A S Hurn and K A Lindsay

    Temperature-based weather derivatives are written on an index which is normally defined to be a nonlinear function of average daily temperatures. Recent empirical work has demonstrated the usefulness of simple time-series models of temperature for estimating the payoffs to these instruments. This paper develops analytical distributions of the temperature indices on which temperature derivatives are written. If deviations of daily temperature from its expected value are modelled as an Ornstein-Uhlenbeck process with time-varying variance, then the distribution of the temperature index on which the derivative is written is the sum of truncated, correlated Gaussian deviates. The key result of this paper is to provide an analytical approximation to the distribution of this sum, thus allowing the accurate computation of payoffs without the need for any simulation. A data set comprising average daily temperatures spanning over a hundred years for four Australian cities is used to demonstrate the efficacy of this approach for estimating the payoffs to temperature derivatives. It is demonstrated that computing expected payoffs directly from historical records is a particularly poor approach to the problem when there are trends in underlying average daily temperature. It is shown that the proposed analytical approach is superior to historical pricing.
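    For concreteness, the sketch below simulates the object the analytical approximation targets: daily temperature as a seasonal mean plus a discretized Ornstein-Uhlenbeck deviation, and a cooling-degree-day index as the sum of truncated, serially correlated Gaussian deviates. All parameters, the contract window and the strike are illustrative assumptions, and the Monte Carlo step is precisely what the paper's analytical distribution would replace.

```python
import numpy as np

rng = np.random.default_rng(11)

# Daily temperature = seasonal mean + discretized OU deviation (illustrative).
days = np.arange(365)
seasonal = 20 + 8 * np.sin(2 * np.pi * days / 365)
kappa, sigma = 0.3, 2.5

def simulate_cdd(base=18.0, n_paths=10000, start=30, length=90):
    """Cooling-degree-day index over a contract window: sum of max(T_t - base, 0),
    i.e. a sum of truncated, serially correlated Gaussian deviates."""
    idx = np.zeros(n_paths)
    dev = np.zeros(n_paths)
    for d in range(start, start + length):
        dev = (1 - kappa) * dev + sigma * rng.standard_normal(n_paths)
        temp = seasonal[d] + dev
        idx += np.maximum(temp - base, 0.0)
    return idx

# Monte Carlo payoff of a call on the CDD index; the paper's analytical
# approximation avoids this simulation step entirely.
cdd = simulate_cdd()
strike, tick = 600.0, 1.0
print(f"expected payoff ~= {tick * np.maximum(cdd - strike, 0).mean():.1f}")
```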

  • #33
    Download full text
    JEL-Codes:
    C14;C52.
    Keywords:
    Temperature, Weather Derivatives, Cooling Degree Days, Time-series Models.

    Estimating the Payoffs of Temperature-based Weather Derivatives

    Adam Clements, A S Hurn and K A Lindsay

    Temperature-based weather derivatives are written on an index which is normally defined to be a nonlinear function of average daily temperatures. Recent empirical work has demonstrated the usefulness of simple time-series models of temperature for estimating the payoffs to these instruments. This paper argues that a more direct and parsimonious approach is to model the time-series behaviour of the index itself, provided a sufficiently rich supply of historical data is available. A data set comprising average daily temperature spanning over a hundred years for four Australian cities is assembled. The data is then used to compare the actual payoffs of temperature-based European call options with the expected payoffs computed from historical temperature records and two time-series approaches. It is concluded that expected payoffs computed directly from historical records perform poorly by comparison with the expected payoffs generated by means of competing time-series models. It is also found that modeling the relevant temperature index directly is superior to modeling average daily temperatures.

  • #32
    Download full text
    JEL-Codes:
    C13, C63
    Keywords:
    gradient algorithms, unconstrained optimisation, generalised method of moments.

    The Devil is in the Detail: Hints for Practical Optimisation

    T M Christensen, A S Hurn and K A Lindsay

    Finding the minimum of an objective function, such as a least squares or negative log-likelihood function, with respect to the unknown model parameters is a problem often encountered in econometrics. Consequently, students of econometrics and applied econometricians are usually well-grounded in the broad differences between the numerical procedures employed to solve these problems. Often, however, relatively little time is given to understanding the practical subtleties of implementing these schemes when faced with ill-behaved problems. This paper addresses some of the details involved in practical optimisation, such as dealing with constraints on the parameters, specifying starting values, termination criteria and analytical gradients, and illustrates some of the general ideas with several instructive examples.
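    One of the practical details discussed, handling parameter constraints and supplying analytical gradients, can be illustrated in a few lines: rather than passing a positivity constraint to the optimiser, estimate the logarithm of a variance so the search is unconstrained, and hand the optimiser the exact gradient via the chain rule (a generic sketch, not the paper's own example).

```python
import numpy as np
from scipy.optimize import minimize

rng = np.random.default_rng(12)
x = rng.normal(loc=0.0, scale=2.0, size=500)

def neg_loglik(log_sigma2):
    """Gaussian negative log-likelihood parameterized in log-variance."""
    s2 = np.exp(log_sigma2[0])  # back-transform: s2 > 0 by construction
    return 0.5 * len(x) * np.log(2 * np.pi * s2) + 0.5 * np.sum(x**2) / s2

def gradient(log_sigma2):
    """Analytical gradient: d/d(log s2) = s2 * d/d(s2), by the chain rule."""
    s2 = np.exp(log_sigma2[0])
    return np.array([0.5 * len(x) - 0.5 * np.sum(x**2) / s2])

res = minimize(neg_loglik, x0=[0.0], jac=gradient, method="BFGS")
print(f"sigma^2 estimate = {np.exp(res.x[0]):.3f} "
      f"(sample variance = {x.var():.3f})")
```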

  • #31
    Download full text
    JEL-Codes:
    L81, D83.
    Keywords:
    e-commerce, price comparison, decision theory, heuristics, seller reputation

    Buying Online: Sequential Decision Making by Shopbot Visitors

    Uwe Dulleck, Franz Hackl, Bernhard Weiss and Rudolf Winter-Ebmer

    In this article we propose a two stage procedure to model demand decisions by customers who are balancing several dimensions of a product. We then test our procedure by analyzing the behavior of buyers from an Austrian price comparison site. Although in such a market a consumer will typically search for the cheapest price for a given product, reliability and service of the supplier are other important characteristics of a retailer. In our data, consumers follow such a two stage procedure: they select a shortlist of suppliers by using the price variable only; finally, they trade off reliability and price among these shortlisted suppliers.

  • #30
    Download full text
    JEL-Codes:
    E52; E62; C61.
    Keywords:
    Model uncertainty, robustness, uncertainty aversion, time-consistency.

    Model Uncertainty and Monetary Policy

    Richard Dennis

    Model uncertainty has the potential to change importantly how monetary policy should be conducted, making it an issue that central banks cannot ignore. In this paper, I use a standard new Keynesian business cycle model to analyze the behavior of a central bank that conducts policy with discretion while fearing that its model is misspecified. My main results are as follows. First, policy performance can be improved if the discretionary central bank implements a robust policy. This important result is obtained because the central bank's desire for robustness directs it to assertively stabilize inflation, thereby mitigating the stabilization bias associated with discretionary policymaking. In effect, a fear of model uncertainty can act similarly to a commitment mechanism. Second, exploiting the connection between robust control and uncertainty aversion, I show that the central bank's fear of model misspecification leads it to forecast future outcomes under the belief that inflation (in particular) will be persistent and have large unconditional variance, raising the probability of extreme outcomes. Private agents, however, anticipating the policy response, make decisions under the belief that inflation will be more closely stabilized, that is, more tightly distributed, than under rational expectations. Third, as a technical contribution, I show how to solve an important class of linear-quadratic robust Markov-perfect Stackelberg problems.

  • #29
    Download full text
    JEL-Codes:
    C11, C52, E31, E52.
    Keywords:
    Price adjustment, inflation indexation, Bayesian estimation.

    The Frequency of Price Adjustment and New Keynesian Business Cycle Dynamics

    Richard Dennis

    The Calvo pricing model that lies at the heart of many New Keynesian business cycle models has been roundly criticized for being inconsistent both with time series data on inflation and with micro-data on the frequency of price changes. In this paper I develop a new pricing model whose structure can be interpreted in terms of menu costs and information gathering/processing costs, that usefully addresses both criticisms. The resulting Phillips curve encompasses the partial-indexation model, the full-indexation model, and the Calvo model, and can speak to micro-data in ways that these models cannot. Taking the Phillips curve to the data, I find that the share of firms that change prices each quarter is about 60 percent and, reflecting the importance of information gathering/processing costs, that most firms that change prices use indexation. Exploiting an isomorphism result, I show that these values are consistent with estimates implied by the partial-indexation model.

  • #28
    Download full text
    JEL-Codes:
    Z1; C23; C25; I31
    Keywords:
    Morbidity, Mortality, Lifestyle, Alcohol, Smoking, Exercise, Income

    Robustness in Health Research: Do differences in health measures, techniques, and time frame matter?

    Paul Frijters and Aydogan Ulker

    Survey-based health research is in a boom phase following an increased amount of health spending in OECD countries and the interest in ageing. A general characteristic of survey-based health research is its diversity. Different studies are based on different health questions in different datasets; they use different statistical techniques; they differ in whether they approach health from an ordinal or cardinal perspective; and they differ in whether they measure short-term or long-term effects. The question in this paper is simple: do these differences matter for the findings? We investigate the effects of life-style choices (drinking, smoking, exercise) and income on six measures of health in the US Health and Retirement Study (HRS) between 1992 and 2002: (1) self-assessed general health status, (2) problems with undertaking daily tasks and chores, (3) mental health indicators, (4) BMI, (5) the presence of serious long-term health conditions, and (6) mortality. We compare ordinal models with cardinal models; we compare models with fixed effects to models without fixed-effects; and we compare short-term effects to long-term effects. We find considerable variation in the impact of different determinants on our chosen health outcome measures; we find that it matters whether ordinality or cardinality is assumed; we find substantial differences between estimates that account for fixed effects versus those that do not; and we find that short-run and long-run effects differ greatly. All this implies that health is an even more complicated notion than hitherto thought, defying generalizations from one measure to the others or one methodology to another.

  • #27
    Download full text
    JEL-Codes:
    J22; J13; C31
    Keywords:
    Child Development, Maternal Labor Force Participation, Handedness

    Early Child Development and Maternal Labor Force Participation: Using Handedness as an Instrument

    Paul Frijters, David W. Johnston, Manisha Shah and Michael A. Shields

    We estimate the effect of early child development on maternal labor force participation using data from teacher assessments. Mothers might react to having a poorly developing child by dropping out of the formal labor force in order to spend more time with their child, or they could potentially increase their labor supply to be able to provide the funds for better education and health resources. Which action dominates is therefore the empirical question we seek to answer in this paper. Importantly, we control for the potential endogeneity of child development by using an instrumental variables approach, uniquely exploiting exogenous variation in child development associated with child handedness. We find that having a poorly developing young child reduces the probability that a mother will participate in the labor market by about 25 percentage points.

  • #26
    Download full text
    JEL-Codes:
    C23; C25; I31.
    Keywords:
    Happiness methodology, unobservables, latent variable

    The mystery of the U-shaped relationship between happiness and age.

    Paul Frijters and Tony Beatton

    In this paper we address the puzzle of the relation between age and happiness. Whilst the majority of psychologists have concluded there is not much of a relationship at all, the economic literature has unearthed a possible U-shaped relationship. We replicate the U-shape for the German Socio-Economic Panel (GSOEP) and investigate several possible explanations for it.

  • #25
    Download full text
    JEL-Codes:
    C14, C52
    Keywords:
    Electricity Prices, Extreme Events, Poisson Regressions, Poisson Autoregressive Model

    It never rains but it pours: Modelling the persistence of spikes in electricity prices

    T M Christensen, A S Hurn and K A Lindsay

    During periods of market stress, electricity prices can rise dramatically. This paper treats these abnormal episodes or price spikes as count events and attempts to build a model of the spiking process. In contrast to the existing literature, which either ignores temporal dependence in the spiking process or attempts to model the dependence solely in terms of deterministic variables (like seasonal and day-of-the-week effects), this paper argues that persistence in the spiking process is an important factor in building an effective model. A Poisson autoregressive framework is proposed in which price spikes occur as a result of the latent arrival and survival of system stresses. This formulation captures the salient features of the process adequately, and yields forecasts of price spikes that are superior to those obtained from naïve models which do not account for persistence in the spiking process.
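
    As a rough illustration of the mechanics, the following minimal Python sketch simulates spike counts from a generic autoregressive conditional Poisson (ACP) recursion. It is a simplified stand-in, not the paper's exact latent arrival-and-survival specification, and the parameter names and values are illustrative only.

        import numpy as np

        def simulate_spike_counts(T, omega=0.05, alpha=0.3, beta=0.5, seed=0):
            """Simulate spike counts from an ACP (INGARCH-type) recursion:
            lam_t = omega + alpha*y_{t-1} + beta*lam_{t-1},
            y_t | past ~ Poisson(lam_t).
            Persistence in the spiking process enters through alpha + beta."""
            rng = np.random.default_rng(seed)
            y = np.zeros(T, dtype=int)
            lam = np.empty(T)
            lam[0] = omega / (1.0 - alpha - beta)  # start at the unconditional mean
            y[0] = rng.poisson(lam[0])
            for t in range(1, T):
                lam[t] = omega + alpha * y[t - 1] + beta * lam[t - 1]
                y[t] = rng.poisson(lam[t])
            return y, lam

    The key point the sketch captures is that a spike today raises the expected number of spikes tomorrow, which is precisely the persistence channel that models built only on deterministic covariates ignore.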

  • #24
    Download full text
    JEL-Codes:
    C12, C22, G00, G14
    Keywords:
    Implied volatility, VIX, volatility forecasts, informational efficiency, jumps

    The Jump component of S&P 500 volatility and the VIX index

    Ralf Becker, Adam Clements and Andrew McClelland

    Much research has investigated the differences between option implied volatilities and econometric model-based forecasts in terms of forecast accuracy and relative informational content. Implied volatility is a market-determined forecast, in contrast to model-based forecasts that employ some degree of smoothing to generate forecasts. Therefore, implied volatility has the potential to reflect information that a model-based forecast could not. Specifically, this paper considers two issues relating to the informational content of the S&P 500 VIX implied volatility index: first, whether it subsumes information on how historical jump activity contributed to price volatility; and second, whether the VIX reflects any incremental information, relative to model-based forecasts, pertaining to future jumps. It is found that the VIX index both subsumes information relating to past jump contributions to volatility and reflects incremental information pertaining to future jump activity, relative to model-based forecasts. This issue has not been examined previously in the literature, and the results expand our understanding of how option markets form their volatility forecasts.

  • #23
    Download full text
    JEL-Codes:
    G11, G12
    Keywords:
    Stock returns, Momentum portfolios, Size effect

    Momentum in Australian Stock Returns: An Update

    A. S. Hurn and V. Pavlov

    It has been documented that a momentum investment strategy, based on buying past well-performing stocks while selling past losers, was profitable in the Australian context, particularly in the 1990s. The aim of this short paper is to investigate whether or not this feature of Australian stock returns is still evident. The paper confirms the presence of a medium-term momentum effect, but also provides some interesting new evidence on the importance of the size effect on momentum.
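
    For concreteness, a minimal Python sketch of one conventional winners-minus-losers construction follows. The six-month formation window and decile cutoffs are illustrative assumptions, not necessarily the paper's exact portfolio design.

        import numpy as np

        def momentum_wml(returns, formation=6):
            """Equal-weighted winners-minus-losers returns from a T x N panel of
            monthly stock returns: rank stocks on cumulative returns over the
            previous `formation` months, then go long the top decile and short
            the bottom decile for one month."""
            T, N = returns.shape
            k = max(1, N // 10)                      # decile size
            wml = np.full(T, np.nan)
            for t in range(formation, T):
                past = np.prod(1.0 + returns[t - formation:t], axis=0) - 1.0
                order = np.argsort(past)
                losers, winners = order[:k], order[-k:]
                wml[t] = returns[t, winners].mean() - returns[t, losers].mean()
            return wml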

  • #22
    Download full text
    JEL-Codes:
    F37, C51
    Keywords:
    Contagion, Structural GARCH

    Unobservable Shocks as Carriers of Contagion: A Dynamic Analysis Using Identified Structural GARCH

    Mardi Dungey, George Milunovich and Susan Thorp

    Markets in financial crisis may experience heightened sensitivity to news from abroad and they may also spread turbulence into foreign markets, creating contagion. We use a structural GARCH model to separate and measure these two parts of crisis transmission. Unobservable structural shocks are named and linked to source markets using variance decompositions, allowing clearer interpretation of impulse response functions. Applying this method to data from the Asian crisis, we find significant contagion from Hong Kong to nearby markets but little heightened sensitivity. Impulse response functions for an equally-weighted equity portfolio show the increasing dominance of Korean and Hong Kong shocks during the crisis, whereas Indonesia's influence shrinks.

  • #21
    (forthcoming)

    Extending an SVAR Model of the Australian Economy

    Mardi Dungey and Adrian Pagan

    Dungey and Pagan (2000) present an SVAR model of the Australian economy which models macro-economic outcomes as transitory deviations from a deterministic trend. In this paper we extend that model in two directions. Firstly, we relate it to an emerging literature on DSGE modelling of small open economies. Secondly, we allow for both transitory and permanent components in the series and show how this modification has an impact upon the design of macroeconomic models.

  • #20
    Download full text
    JEL-Codes:
    A110, D100, I310
    Keywords:
    happiness, subjective well-being, perceptions, superstars, economists

    Mirror, Mirror on the Wall, who is the Happiest of Them All?

    Benno Torgler, Nemanja Antic and Uwe Dulleck

    This paper turns Snow White's magic mirror onto recent economics Nobel Prize winners, top economists and happiness researchers, and through the eyes of the "man in the street" seeks to determine who the happiest academic is. The study not only provides a clear answer to this question but also unveils who is the ladies' man and who is the sweetheart of the aged. It also explores the extent to which information matters and whether individuals' self-reported happiness affects their perceptions about the happiness of these superstars in economics.

  • #19
    Download full text
    JEL-Codes:
    Z130; I300; D310
    Keywords:
    Relative income, positional concerns, social capital, social norms, happiness.

    Social Capital And Relative Income Concerns: Evidence From 26 Countries

    Justina AV Fischer and Benno Torgler

    Research evidence on the impact of relative income position on individuals' attitudes and behaviour is sorely lacking. Therefore, using the International Social Survey Programme 1998 data from 26 countries, this paper investigates the impact of relative income on 14 measures of social capital. We find support for a considerable deleterious positional concern effect for persons below the reference income. This effect is far more sizeable than the beneficial impact of a relative income advantage. Most of the results indicate that such an effect is non-linear. Lastly, changing the reference group (regional versus national) produces no significant differences in the results.

  • #18
    Download full text
    JEL-Codes:
    C12; C22; G00
    Keywords:
    Volatility, macroeconomic data, forecast, spline, GARCH.

    Forecasting stock market volatility conditional on macroeconomic conditions.

    Ralf Becker and Adam Clements

    This paper presents a GARCH-type volatility model with a time-varying unconditional volatility which is a function of macroeconomic information. It is an extension of the SPLINE GARCH model proposed by Engle and Rangel (2005). The advantage of the model proposed in this paper is that the macroeconomic information available (and/or forecasts) is used in the parameter estimation process. Based on an application of this model to S&P 500 share index returns, it is demonstrated that forecasts of macroeconomic variables can be easily incorporated into volatility forecasts for share index returns. It transpires that the model proposed here can lead to significantly improved volatility forecasts compared to traditional GARCH-type volatility models.
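
    A minimal sketch of the two-component idea follows, assuming the low-frequency variance level is driven directly by a single macro series rather than by the exponential spline of the original model; all parameter names and values are illustrative.

        import numpy as np

        def macro_garch_variance(eps, macro, c0=0.0, c1=0.1, alpha=0.05, beta=0.9):
            """Conditional variance h_t = tau_t * g_t, where tau_t is a slow-moving
            unconditional level driven by a macro variable (same length as eps)
            and g_t is a unit-mean GARCH(1,1) component on standardized shocks."""
            tau = np.exp(c0 + c1 * macro)            # macro-driven low-frequency level
            g = np.empty(len(eps))
            g[0] = 1.0
            for t in range(1, len(eps)):
                g[t] = (1 - alpha - beta) + alpha * eps[t - 1] ** 2 / tau[t - 1] + beta * g[t - 1]
            return tau * g

    Because tau_t depends only on the macro series, a forecast of that series translates directly into a forecast of the unconditional volatility level, which is the mechanism the paper exploits.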

  • #17
    Download full text
    JEL-Codes:
    C12; C22; G00
    Keywords:
    Implied volatility, volatility forecasts, volatility models, realized volatility, combination forecasts.

    Are combination forecasts of S&P 500 volatility statistically superior?

    Ralf Becker and Adam Clements

    Forecasting volatility has received a great deal of research attention. Many articles have considered the relative performance of econometric model-based and option implied volatility forecasts. While many studies have found that implied volatility is the preferred approach, a number of issues remain unresolved. One such issue is the relative merit of combination forecasts. By utilising recent econometric advances, this paper considers whether combination forecasts of S&P 500 volatility are statistically superior to a wide range of model-based forecasts and implied volatility. It is found that combination forecasts are the dominant approach, indicating that the VIX cannot simply be viewed as a combination of various model-based forecasts.
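
    As a minimal illustration of the combination step (one common scheme, not necessarily the weighting used in the paper), competing forecasts can be pooled with weights inversely proportional to historical mean squared error:

        import numpy as np

        def inverse_mse_combination(forecasts, realized):
            """Combine M competing volatility forecasts (a T x M array) using
            weights inversely proportional to each model's historical MSE
            against a realized-volatility proxy (length-T array)."""
            mse = np.mean((forecasts - realized[:, None]) ** 2, axis=0)
            w = (1.0 / mse) / np.sum(1.0 / mse)
            return forecasts @ w, w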

  • #16
    Download full text
    JEL-Codes:
    F43; O15; O40
    Keywords:
    Capital Goods Imports, Human Capital, Developing Countries, Technology Diffusion

    Imported Equipment, Human Capital and Economic Growth in Developing Countries

    Uwe Dulleck and Neil Foster

    De Long and Summers (1991) began a literature examining the impact of equipment investment on growth. In this paper we examine such a relationship for developing countries by considering imports of equipment from advanced countries as our measure of equipment investment for a sample of 55 developing countries. We examine whether the level of human capital in a country affects its ability to benefit from such investment. We find a complex interrelationship between imported equipment and human capital. Generally, the relationship between imported equipment and growth is lowest, and often negative, for countries with low levels of human capital, highest for countries within an intermediate range and somewhat in between for countries with the highest level of human capital.

  • #15
    Download full text
    JEL-Codes:
    C12; C22; G00; G14
    Keywords:
    Implied volatility, VIX, volatility forecasts, informational efficiency

    Does implied volatility reflect a wider information set than econometric forecasts?

    Ralf Becker, Adam Clements and James Curchin

    Much research has addressed the relative performance of option implied volatilities and econometric model-based forecasts in terms of forecasting asset return volatility. The general theme to come from this body of work is that implied volatility is a superior forecast. Some authors attribute this to the fact that option markets use a wider information set when forming their forecasts of volatility. This article considers this issue and determines whether S&P 500 implied volatility reflects a set of economic information beyond its impact on the prevailing level of volatility. It is found that, while implied volatility subsumes this information, as do model-based forecasts, this is only due to its impact on the current or prevailing level of volatility. Therefore, it appears that implied volatility does not reflect a wider information set than model-based forecasts, implying that implied volatility forecasts simply reflect volatility persistence in much the same way as econometric models do.

  • #14

    Some Issues in Using Sign Restrictions for Identifying Structural VARs

    Renee Fry and Adrian Pagan

    The paper looks at estimation of structural VARs with sign restrictions. Since sign restrictions do not generate a unique model, it is necessary to find some way of summarizing the information they yield. Existing methods present impulse responses from different models, and it is argued that they should come from a common model. If this is not done, the shocks implicit in the impulse responses will not be orthogonal. A method is described that tries to resolve this difficulty. It works with a common model whose impulse responses are as close as possible to the median values of the impulse responses (taken over the range of models satisfying the sign restrictions). Using a simple demand and supply model, it is shown that there is no reason to think that sign restrictions will generate better quantitative estimates of the effects of shocks than existing methods such as assuming a system is recursive.
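
    The following Python sketch shows the generic draw-and-accept step for a bivariate demand/supply example under stated sign assumptions (a demand shock raises both quantity and price, a supply shock raises quantity and lowers price); it illustrates the multiplicity of admissible models, not the paper's proposed summary method.

        import numpy as np

        def sign_restricted_impacts(sigma, ndraws=5000, seed=0):
            """Draw impact matrices B = chol(Sigma) @ Q over random orthonormal Q
            and keep those whose columns match the demand/supply sign pattern.
            Rows are ordered (quantity, price)."""
            rng = np.random.default_rng(seed)
            P = np.linalg.cholesky(sigma)
            kept = []
            for _ in range(ndraws):
                q, r = np.linalg.qr(rng.standard_normal((2, 2)))
                q = q * np.sign(np.diag(r))      # uniform (Haar) rotation draw
                B = P @ q
                B = B * np.sign(B[0, :])         # normalize shocks to raise quantity
                if B[1, 0] > 0 and B[1, 1] < 0:  # column 0 = demand, column 1 = supply
                    kept.append(B)
                elif B[1, 0] < 0 and B[1, 1] > 0:
                    kept.append(B[:, ::-1])      # same pattern with columns swapped
            return np.array(kept)

        # usage: draws = sign_restricted_impacts(np.array([[1.0, 0.3], [0.3, 1.0]]))

    Taking element-wise medians over the accepted draws generally yields responses that belong to no single admissible model, which is exactly the difficulty the paper's common-model approach is designed to resolve.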

  • #13

    Weak Instruments: A Guide to the Literature

    Adrian Pagan

    Weak instruments have become an issue in many contexts in which econometric methods have been used. Some progress has been made into how one diagnoses the problem and how one makes an allowance for it. The present paper gives a partial survey of this literature, focussing upon some of the major contributions and trying to provide a relatively simple exposition of the proposed solutions.

  • #12
    Download full text
    JEL-Codes:
    H20; C90

    Effects of Tax Morale on Tax Compliance: Experimental and Survey Evidence

    Ronald G. Cummings, Jorge Martinez-Vazquez, Michael McKee and Benno Torgler

    There is considerable evidence that enforcement efforts can increase tax compliance. However, there must be other forces at work because observed compliance levels cannot be fully explained by the level of enforcement actions typical of most tax authorities. Further, there are observed differences, not related to enforcement effort, in the levels of compliance across countries and cultures. To fully understand differences in compliance behavior across cultures one needs to understand differences in tax administration and citizen attitudes toward governments. The working hypothesis is that cross-cultural differences in behavior have foundations in these institutions. Tax compliance is a complex behavioral issue and its investigation requires the use of a variety of methods and data sources. Results from laboratory experiments conducted in different countries demonstrate that observed differences in tax compliance levels can be explained by differences in the fairness of tax administration, in the perceived fiscal exchange, and in the overall attitude towards the respective governments. These experimental results are shown to be robust by replicating them for the same countries using survey response measures of tax compliance.

  • #11
    Download full text
    Keywords:
    Relative income, positional concerns, envy, performance, social integration

    The Power of Positional Concerns: A Panel Analysis

    Benno Torgler, Sascha L. Schmidt and Bruno S. Frey

    Many studies have established that people care a great deal about their relative economic position and not solely, as standard economic theory assumes, about their absolute economic position. However, behavioral evidence is rare. This paper provides an empirical analysis of how individuals' relative income position affects their performance. Using a unique data set for 1040 soccer players over a period of eight seasons, our analysis suggests that if a player's salary is below the average and this difference increases, his performance worsens and the productivity-decreasing effects of positional concerns are stronger. Moreover, the larger the income differences within a team, the stronger the observed positional concern effects. We also find that the more the players are integrated in a particular social environment (their team), the more evident a relative income effect is. Finally, we find that positional effects are stronger among high-performing teams.

  • #10
    Download full text
    JEL-Codes:
    C22; C53; Q49
    Keywords:
    electricity prices, regime switching, time-varying probabilities, beta

    Modelling Spikes in Electricity Prices

    Ralf Becker, Stan Hurn and Vlad Pavlov

    During periods of market stress, electricity prices can rise dramatically. Electricity retailers cannot pass these extreme prices on to customers because of retail price regulation. Improved prediction of these price spikes, therefore, is important for risk management. This paper builds a time-varying-probability Markov-switching model of Queensland electricity prices, aimed particularly at forecasting price spikes. Variables capturing demand and weather patterns are used to drive the transition probabilities. Unlike traditional Markov-switching models, which assume normality of the prices in each state, the model presented here uses a generalized beta distribution to allow for the skewness in the distribution of electricity prices during high-price episodes.
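
    A minimal Python sketch of the data-generating idea follows; the logistic link, parameter values, and the particular skewed distributions are illustrative assumptions (the paper itself works with a generalized beta distribution in the spike state).

        import numpy as np

        def simulate_spiky_prices(demand, a0=(-3.0, -1.0), a1=0.05, seed=0):
            """Two-state price process where the probability of entering the spike
            state is logistic in demand and depends on the previous state, so the
            transition probabilities are time-varying."""
            rng = np.random.default_rng(seed)
            T = len(demand)
            price = np.empty(T)
            state = np.zeros(T, dtype=int)
            s = 0
            for t in range(T):
                p_spike = 1.0 / (1.0 + np.exp(-(a0[s] + a1 * demand[t])))
                s = int(rng.random() < p_spike)
                state[t] = s
                if s:  # spike regime: right-skewed prices far above normal levels
                    price[t] = 100.0 + 900.0 * rng.beta(2.0, 5.0)
                else:  # normal regime: log-normal prices around a typical level
                    price[t] = rng.lognormal(np.log(30.0), 0.2)
            return price, state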

  • #9
    Download full text
    JEL-Codes:
    C22; C52
    Keywords:
    stochastic differential equations, maximum likelihood, finite difference, finite element, cumulative

    Teaching an Old Dog New Tricks: Improved Estimation of the Parameters of Stochastic Differential Equations by Numerical Solution of the Fokker-Planck Equation

    A. Hurn, J. Jeisman and K. Lindsay

    Many stochastic differential equations (SDEs) do not have readily available closed-form expressions for their transitional probability density functions (PDFs). As a result, a large number of competing estimation approaches have been proposed in order to obtain maximum-likelihood estimates of their parameters. Arguably the most straightforward of these is one in which the required estimates of the transitional PDF are obtained by numerical solution of the Fokker-Planck (or forward-Kolmogorov) partial differential equation. Despite the fact that this method produces accurate estimates and is completely generic, it has not proved popular in the applied literature. Perhaps this is attributable to the fact that this approach requires repeated solution of a parabolic partial differential equation to obtain the transitional PDF and is therefore computationally quite expensive. In this paper, three avenues for improving the reliability and speed of this estimation method are introduced and explored in the context of estimating the parameters of the popular Cox-Ingersoll-Ross and Ornstein-Uhlenbeck models. The recommended algorithm that emerges from this investigation is seen to offer substantial gains in reliability and computational time.
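
    To make the core computation concrete, here is a minimal Python sketch that evaluates an Ornstein-Uhlenbeck transitional PDF by an implicit finite-difference solution of the Fokker-Planck equation; the grid sizes and the Gaussian stand-in for the initial point mass are simplifying assumptions, and the paper's own algorithm is considerably more refined.

        import numpy as np

        def ou_transition_pdf(x0, x1, theta, mu, sigma, horizon, nx=201, nt=100):
            """Approximate p(x1, horizon | x0) for dX = theta*(mu - X)dt + sigma dW
            by solving p_t = d/dx[theta*(x - mu)*p] + 0.5*sigma^2*p_xx with a
            backward-Euler finite-difference scheme."""
            sd = sigma / np.sqrt(2.0 * theta)        # stationary sd sets the grid
            x = np.linspace(min(mu, x0) - 4 * sd, max(mu, x0) + 4 * sd, nx)
            dx, dt = x[1] - x[0], horizon / nt
            drift, D = theta * (x - mu), 0.5 * sigma ** 2
            p = np.exp(-0.5 * ((x - x0) / dx) ** 2)  # narrow Gaussian ~ point mass
            p /= p.sum() * dx
            A = np.eye(nx)                           # implicit operator (I - dt*L)
            for i in range(1, nx - 1):
                A[i, i - 1] = -dt * (-drift[i - 1] / (2 * dx) + D / dx ** 2)
                A[i, i] = 1.0 + dt * 2.0 * D / dx ** 2
                A[i, i + 1] = -dt * (drift[i + 1] / (2 * dx) + D / dx ** 2)
            for _ in range(nt):
                p = np.linalg.solve(A, p)
                p[0] = p[-1] = 0.0                   # absorbing boundaries
            return np.interp(x1, x, p)               # density at the observed point

    Summing the logs of such densities over consecutive observation pairs gives the sample log-likelihood; the expense of repeating the PDE solve at every parameter trial is exactly the computational burden the paper sets out to reduce.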

  • #8
    Download full text
    Keywords:
    nonlinearity in mean, heteroskedasticity, wild bootstrap, empirical size and power

    Testing for nonlinearity in mean in the presence of heteroskedasticity. Working paper #8

    Stan Hurn and Ralf Becker

    This paper considers an important practical problem in testing time-series data for nonlinearity in mean. Most popular tests reject the null hypothesis of linearity too frequently if the data are heteroskedastic. Two approaches to redressing this size distortion are considered, both of which have been proposed previously in the literature, although not in relation to this particular problem. These are the heteroskedasticity-robust-auxiliary-regression approach and the wild bootstrap. Simulation results indicate that both approaches are effective in reducing the size distortion and that the wild bootstrap offers better performance in smaller samples. Two practical examples are then used to illustrate the procedures and demonstrate the potential pitfalls encountered when using non-robust tests.
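
    A minimal Python sketch of the wild-bootstrap approach follows, using a RESET-type auxiliary regression on one lag as the (illustrative) nonlinearity test; the statistic and lag choice are assumptions for exposition, not the exact tests studied in the paper.

        import numpy as np

        def reset_stat(x, y):
            """LM-type statistic comparing a linear fit of y on (1, x) with one
            augmented by x^2 and x^3 terms."""
            X1 = np.column_stack([np.ones_like(x), x])
            e1 = y - X1 @ np.linalg.lstsq(X1, y, rcond=None)[0]
            X2 = np.column_stack([X1, x ** 2, x ** 3])
            e2 = y - X2 @ np.linalg.lstsq(X2, y, rcond=None)[0]
            return len(y) * (1.0 - (e2 @ e2) / (e1 @ e1))

        def wild_bootstrap_pvalue(series, nboot=499, seed=0):
            """Fixed-design wild bootstrap: rebuild y from the linear fit plus
            Rademacher-weighted residuals, which preserves heteroskedasticity
            while imposing the linear null."""
            rng = np.random.default_rng(seed)
            x, y = series[:-1], series[1:]
            X1 = np.column_stack([np.ones_like(x), x])
            fit = X1 @ np.linalg.lstsq(X1, y, rcond=None)[0]
            resid = y - fit
            s0 = reset_stat(x, y)
            exceed = sum(
                reset_stat(x, fit + resid * rng.choice([-1.0, 1.0], len(resid))) >= s0
                for _ in range(nboot))
            return (1 + exceed) / (1 + nboot)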

  • #7
    Download full text
    Keywords:
    Permanent shocks, structural identification, error correction models, IS-LM models
    (published)

    Econometric Analysis of Structural Systems with Permanent and Transitory Shocks. Working paper #7

    Adrian Pagan and Hashem Pesaran

    This paper considers the implications of the permanent/transitory decomposition of shocks for identification of structural models in the general case where the model might contain more than one permanent structural shock. It provides a simple and intuitive generalization of the influential work of Blanchard and Quah (1989), and shows that structural equations for which there are known permanent shocks must have no error correction terms present in them, thereby freeing up the latter to be used as instruments in estimating their parameters. The proposed approach is illustrated by a re-examination of the identification scheme used in a monetary model by Wickens and Motta (2001), and in a well-known paper by Gali (1992), which deals with the construction of an IS-LM model with supply-side effects. We show that the latter imposes more short-run restrictions than are needed because of a failure to fully utilize the cointegration information.

  • #6
    (published)

    Limited Information Estimation and Evaluation of DSGE Models. Working paper #6

    Martin Fukac and Adrian Pagan

    We advance the proposal that DSGE models should not just be estimated and evaluated with reference to full information methods. These make strong assumptions, and therefore there is uncertainty about their impact upon results. Some limited information analysis, which can be used in a complementary way, seems important. Because it is sometimes difficult to implement limited information methods when there are unobservable non-stationary variables in the system, we present a simple method of overcoming this that involves normalizing the non-stationary variables with their permanent components and then estimating the resulting Euler equations. We illustrate the interaction between full and limited information methods in the context of a well-known open economy model of Lubik and Schorfheide. The transformation was effective in revealing possible mis-specifications in the equations of LS's system, and the limited information analysis highlighted the major influence of priors upon the estimates.

  • #5

    Income and Happiness: Evidence, Explanations and Economic Implications. Working paper #5

    Andrew E. Clark, Paul Frijters and Michael A. Shields

  • #4

    Inventories, Fluctuations and Business Cycles. Working paper #4

    Louis J. Maccini and Adrian Pagan

    The paper looks at the role of inventories in U.S. business cycles and fluctuations. It concentrates upon the goods producing sector and constructs a model that features both input and output inventories. A range of shocks are present in the model, including sales, technology and inventory cost shocks. It is found that the presence of inventories does not change the average business cycle characteristics in the U.S. very much. The model is also used to examine whether new techniques for inventory control might have been an important contributing factor to the decline in the volatility of US GDP growth. It is found that these would have had little impact upon the level of volatility.

  • #3
    Download full text
    Keywords:
    non-linear filtering, stochastic volatility, state-space models, asymmetries, latent factors, two factor volatility models

    Estimating Stochastic Volatility Models Using a Discrete Non-linear Filter. Working paper #3

    Adam Clements, Stan Hurn and Scott White

    Many approaches have been proposed for estimating stochastic volatility (SV) models, a number of which are filtering methods. While non-linear filtering methods are superior to linear approaches, non-linear filtering methods have not gained a wide acceptance in the econometrics literature due to their computational cost. This paper proposes a discretised non-linear filtering (DNF) algorithm for the estimation of latent variable models. It is shown that the DNF approach leads to significant computational gains relative to other procedures in the context of SV estimation without any associated loss in accuracy. It is also shown how a number of extensions to standard SV models can be accommodated within the DNF algorithm.
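
    A minimal Python sketch of a grid-based (discretised) filter for the basic SV model conveys the flavour of the approach; the grid width and parameter values are illustrative, and the paper's DNF algorithm contains refinements omitted here.

        import numpy as np

        def norm_pdf(x, m, s):
            return np.exp(-0.5 * ((x - m) / s) ** 2) / (s * np.sqrt(2.0 * np.pi))

        def dnf_loglik(y, mu=-1.0, phi=0.95, sigma_eta=0.2, ngrid=100):
            """Discretised filter for h_t = mu + phi*(h_{t-1} - mu) + sigma_eta*eta_t,
            y_t = exp(h_t/2)*eps_t: the latent log-variance lives on a fixed grid,
            so prediction and update reduce to matrix-vector products."""
            sd = sigma_eta / np.sqrt(1.0 - phi ** 2)       # stationary sd of h
            h = np.linspace(mu - 4 * sd, mu + 4 * sd, ngrid)
            P = norm_pdf(h[None, :], mu + phi * (h[:, None] - mu), sigma_eta)
            P /= P.sum(axis=1, keepdims=True)              # row-stochastic transitions
            w = norm_pdf(h, mu, sd)
            w /= w.sum()                                   # stationary initial weights
            loglik = 0.0
            for yt in y:
                w = w @ P                                  # predict
                joint = w * norm_pdf(yt, 0.0, np.exp(h / 2.0))
                loglik += np.log(joint.sum())
                w = joint / joint.sum()                    # update (filtered weights)
            return loglik

    Because every step is a dense matrix-vector product on a fixed grid, the cost per observation is deterministic and small, which is the source of the computational gains relative to simulation-based filters.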

  • #2
    Download full text
    Keywords:
    stochastic differential equations, parameter estimation, maximum likelihood, simulation, moments

    Seeing the Wood for the Trees: A Critical Evaluation of Methods to Estimate the Parameters of Stochastic Differential Equations. Working paper #2

    Stan Hurn, J. Jeisman and K. A. Lindsay

    Maximum-likelihood estimates of the parameters of stochastic differential equations are consistent and asymptotically efficient, but unfortunately difficult to obtain if a closed form expression for the transitional probability density function of the process is not available. As a result, a large number of competing estimation procedures have been proposed. This paper provides a critical evaluation of the various estimation techniques. Special attention is given to the ease of implementation and comparative performance of the procedures when estimating the parameters of the Cox-Ingersoll-Ross and Ornstein-Uhlenbeck equations respectively.
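
    As a point of reference for the comparison, the Ornstein-Uhlenbeck process has a closed-form Gaussian transition density, so its exact maximum-likelihood estimates are available directly; a minimal Python sketch (variable names illustrative):

        import numpy as np
        from scipy.optimize import minimize

        def ou_negloglik(params, x, dt):
            """Exact negative log-likelihood for dX = theta*(mu - X)dt + sigma dW,
            using the Gaussian transition density of the OU process."""
            theta, mu, sigma = params
            if theta <= 0.0 or sigma <= 0.0:
                return np.inf
            m = mu + (x[:-1] - mu) * np.exp(-theta * dt)
            v = sigma ** 2 * (1.0 - np.exp(-2.0 * theta * dt)) / (2.0 * theta)
            return 0.5 * np.sum(np.log(2.0 * np.pi * v) + (x[1:] - m) ** 2 / v)

        # usage (x is a series observed at spacing dt):
        # res = minimize(ou_negloglik, x0=[1.0, 0.0, 0.5], args=(x, 1.0 / 12.0),
        #                method="Nelder-Mead")

    Benchmarks of this kind are what make the OU (and CIR) equations natural test beds: approximate estimators can be judged against the exact likelihood.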

  • #1
    Download full text
    Keywords:
    Business cycle; binary variable, Markov chain, probit model, yield curve

    The Econometric Analysis of Constructed Binary Time Series. Working paper #1

    Adrian Pagan and Don Harding

    Macroeconometric and financial researchers often use secondary or constructed binary random variables that differ in terms of their statistical properties from the primary random variables used in microeconometric studies. One important difference between primary and secondary binary variables is that while the former are, in many instances, independently distributed (i.d.), the latter are rarely i.d. We show how popular rules for constructing binary states determine the degree and nature of the dependence in those states. When using constructed binary variables as regressands, a common mistake is to ignore the dependence by using a probit model. We present an alternative non-parametric method that allows for dependence and apply that method to the issue of using the yield spread to predict recessions.
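
    A minimal Python sketch of one illustrative construction rule shows where the dependence comes from (the rule below is a simple stand-in, not the specific dating algorithms analysed in the paper):

        import numpy as np

        def two_quarter_rule(growth):
            """Constructed binary state: enter recession (S_t = 1) after two
            consecutive quarters of negative growth, exit after two positive
            ones, and otherwise carry the previous state forward."""
            s = np.zeros(len(growth), dtype=int)
            for t in range(1, len(growth)):
                if growth[t] < 0 and growth[t - 1] < 0:
                    s[t] = 1
                elif growth[t] > 0 and growth[t - 1] > 0:
                    s[t] = 0
                else:
                    s[t] = s[t - 1]   # the rule itself induces serial dependence
            return s

    Because S_t is a function of S_{t-1} and recent growth, the constructed states are serially dependent by construction, which is why treating them as independent regressands in a probit is the mistake the paper warns against.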