RePEc: Research Papers in Economics

Research Papers in Economics is a collaborative effort of hundreds of volunteers in many countries to enhance the dissemination of research in economics.

repec.org

NCER Working Paper Series

2011

  • #77
    JEL-Codes:
    E24; E52; F32; F41
    Keywords:
    Open economy macroeconomics, monetary policy, unemployment

    Monetary Policy and Unemployment in Open Economies

    Philipp Engler

    After an expansionary monetary policy shock, employment increases and unemployment falls. In standard New Keynesian models the fall in aggregate unemployment does not affect employed workers at all. However, Luechinger, Meier and Stutzer (2010) found that the risk of unemployment negatively affects the utility of employed workers: an increase in aggregate unemployment decreases workers' subjective well-being, which can be explained by an increased risk of becoming unemployed. I take account of this effect in an otherwise standard New Keynesian open economy model with unemployment as in Gali (2010) and find two important results with respect to expansionary monetary policy shocks: First, the usual wealth effect in New Keynesian models of a declining labor force, which is at odds with the data as highlighted by Christiano, Trabandt and Walentin (2010), is shut down. Second, the welfare effects of such shocks improve considerably, modifying the standard results of the open economy literature that began with Obstfeld and Rogoff's (1995) redux model.

  • #76
    JEL-Codes:
    C22; G11; G17
    Keywords:
    Volatility, volatility timing, utility, portfolio allocation, realized volatility

    Volatility timing and portfolio selection: How best to forecast volatility

    Adam E Clements and Annastiina Silvennoinen

    Within the context of volatility timing and portfolio selection this paper considers how best to estimate a volatility model. Two issues are dealt with, namely the frequency of data used to construct volatility estimates, and the loss function used to estimate the parameters of a volatility model. We find support for the use of intraday data for estimating volatility, which is consistent with earlier research. We also find that the choice of loss function is important and show that a simple mean squared error loss overall provides the best forecasts of volatility upon which to form optimal portfolios.
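
    The two ingredients the abstract weighs against each other can be sketched concretely: the standard realized-volatility proxy built from intraday returns, and the mean squared error loss used to compare volatility forecasts. This is a minimal illustration of those textbook definitions, not code from the paper; the function names are mine.

    ```python
    import numpy as np

    def realized_variance(intraday_returns):
        """Daily realized variance: the sum of squared intraday returns."""
        return float(np.sum(np.asarray(intraday_returns, dtype=float) ** 2))

    def mse_loss(realized, forecast):
        """Mean squared error between a volatility proxy and model forecasts."""
        realized = np.asarray(realized, dtype=float)
        forecast = np.asarray(forecast, dtype=float)
        return float(np.mean((realized - forecast) ** 2))

    # Illustrative: 78 five-minute returns for one simulated trading day
    rng = np.random.default_rng(0)
    rets = rng.normal(0.0, 0.001, 78)
    rv = realized_variance(rets)  # one day's volatility proxy
    ```

    A volatility model estimated under this loss is then judged by how well its forecasts track a sequence of such daily proxies.
    
    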

  • #75
    JEL-Codes:
    C22; E32; E37
    Keywords:
    Business and Financial Cycles; Binary Time Series; BBQ Algorithm

    Econometric Analysis and Prediction of Recurrent Events

    Adrian Pagan and Don Harding

    Economic events such as expansions and recessions in economic activity, bull and bear markets in stock prices and financial crises have long attracted substantial interest. In recent times there has been a focus upon predicting these events and constructing Early Warning Systems for them. Econometric analysis of such recurrent events is, however, in its infancy. One can represent the events as a set of binary indicators. However, they are different from the binary random variables studied in micro-econometrics, being constructed from some (possibly) continuous data. The lecture discusses what difference this makes to their econometric analysis. It sets out a framework which deals with how the binary variables are constructed, what an appropriate estimation procedure would be, and the implications for their prediction. An example based on Turkish business cycles is used throughout the lecture.
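
    The abstract's point that these binary indicators are constructed from continuous data can be illustrated with a deliberately simple dating rule (marking quarters inside a spell of at least two consecutive declines). This is a hypothetical stand-in for rules such as the BBQ algorithm named in the keywords, not the paper's own procedure.

    ```python
    import numpy as np

    def recession_indicator(gdp):
        """Construct a binary event series from a continuous level series:
        a quarter is marked 1 if it belongs to a run of at least two
        consecutive declines in log GDP."""
        g = np.diff(np.log(np.asarray(gdp, dtype=float)))  # growth rates
        decline = g < 0
        ind = np.zeros(len(decline), dtype=int)
        for t in range(len(decline)):
            prev_down = t > 0 and decline[t - 1]
            next_down = t + 1 < len(decline) and decline[t + 1]
            if decline[t] and (prev_down or next_down):
                ind[t] = 1
        return ind
    ```

    Because the 0/1 series inherits this construction, its observations are serially dependent by design, which is exactly why it cannot be treated like an ordinary binary outcome from micro-econometrics.
    
    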

  • #74
    JEL-Codes:
    C91; D81
    Keywords:
    risk preferences, laboratory experiment, elicitation methods, subject heterogeneity

    Within-subject Intra- and Inter-method consistency of two experimental risk attitude elicitation methods

    Uwe Dulleck, Jacob Fell and Jonas Fooken

    We compare the consistency of choices in two methods used to elicit risk preferences on an aggregate as well as on an individual level. We asked subjects to choose twice from a list of nine decisions between two lotteries, as introduced by Holt and Laury (2002, 2005), alternating with nine decisions using the budget approach introduced by Andreoni and Harbaugh (2009). We find that while on an aggregate (subject pool) level the results are (roughly) consistent, on an individual (within-subject) level behavior is far from consistent. Within each method as well as across methods we observe low correlations. This further questions the reliability of experimental risk elicitation measures and the ability to use results from such methods to control for the risk aversion of subjects when explaining effects in other experimental games.
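
    The Holt and Laury (2002) list referred to above pairs a "safe" and a "risky" lottery across nine decisions, with the probability of the high payoff rising in steps of one tenth; the number of safe choices before a subject switches measures risk aversion. The sketch below uses the standard published payoff amounts and shows where a risk-neutral subject would switch; it is an illustration of the instrument, not the authors' code.

    ```python
    # Standard Holt-Laury (2002) payoffs, in dollars:
    # lottery A (safe):  $2.00 or $1.60;  lottery B (risky): $3.85 or $0.10
    safe_high, safe_low = 2.00, 1.60
    risky_high, risky_low = 3.85, 0.10

    def expected_values(p):
        """Expected value of each lottery when the high payoff has probability p."""
        ev_a = p * safe_high + (1 - p) * safe_low
        ev_b = p * risky_high + (1 - p) * risky_low
        return ev_a, ev_b

    # A risk-neutral subject chooses B as soon as EV(B) exceeds EV(A)
    choices = []
    for i in range(1, 10):
        ev_a, ev_b = expected_values(i / 10)
        choices.append('B' if ev_b > ev_a else 'A')
    safe_count = choices.count('A')  # the usual risk-aversion score
    ```

    A risk-averse subject makes more than four safe choices; within-subject consistency can then be checked by comparing the safe count across repeated elicitations.
    
    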

  • #73
    JEL-Codes:
    L14; D82; D44; R50
    Keywords:
    Credence Goods, Design-Build, Competitive Bidding, Sequential Search, Infrastructure Projects

    Contracting for Infrastructure Projects as Credence Goods

    Uwe Dulleck and Jianpei Li

    Large infrastructure projects are a major responsibility of government, which usually lacks the expertise to fully specify the demanded projects. Contractors, typically experts on such projects, advise on the needed design in their bids. Producing the right design is nevertheless costly.
    We model the contracting for such infrastructure projects taking into account this credence goods feature and examine the performance of commonly used contracting methods. We show that when building costs are public information, multistage competitive bidding involving shortlisting of two contractors and contingent compensation of both contractors on design efforts outperforms sequential search and the traditional Design-and-Build approach. While the latter leads to minimum design effort, sequential search suffers from a commitment problem. If building costs are the private information of the contractors and are revealed to them after design cost is sunk, competitive bidding may involve sampling more than two contractors. The commitment problem under sequential search may be overcome by the procurer's incentive to search for low building cost if the design cost is sufficiently low. If this is the case, sequential search may outperform competitive bidding.

  • #72
    JEL-Codes:
    C32; C53; G17
    Keywords:
    Equicorrelation, Implied Correlation, Multivariate GARCH, DCC

    Forecasting Equicorrelation

    Adam E Clements, Christopher A Coleman-Fenn and Daniel R Smith

    We study the out-of-sample forecasting performance of several time-series models of equicorrelation, which is the average pairwise correlation between a number of assets. Building on the existing Dynamic Conditional Correlation and Linear Dynamic Equicorrelation models, we propose a model that uses proxies for equicorrelation based on high-frequency intraday data, and the level of equicorrelation implied by options prices. Using state-of-the-art statistical evaluation technology, we find that the use of both realized and implied equicorrelations outperforms models that use daily data alone. However, the out-of-sample forecasting benefits of implied equicorrelation disappear when used in conjunction with the realized measures.
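
    The quantity these models forecast, equicorrelation, is just the average of the off-diagonal entries of the correlation matrix of asset returns. A minimal sketch of that definition (the function name is mine, not from the paper):

    ```python
    import numpy as np

    def equicorrelation(returns):
        """Average pairwise correlation across the columns (assets)
        of a T x N matrix of returns."""
        corr = np.corrcoef(np.asarray(returns, dtype=float), rowvar=False)  # N x N
        n = corr.shape[0]
        upper = corr[np.triu_indices(n, k=1)]  # off-diagonal pairs, each once
        return float(upper.mean())
    ```

    Realized and option-implied versions of this average are what the proposed model feeds in alongside daily data.
    
    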

  • #71
    JEL-Codes:
    C12; C52; C87; E24; E32
    Keywords:
    unemployment, non-linearity, dynamic modelling, aggregate demand, real wages

    Asymmetric unemployment rate dynamics in Australia

    Gunnar Bardsen, Stan Hurn and Zoe McHugh

    The unemployment rate in Australia is modelled as an asymmetric and nonlinear function of aggregate demand, productivity, real interest rates, the replacement ratio and the real exchange rate. If changes in unemployment are large, the management of demand, real interest rates and the replacement ratio will be effective instruments for bringing it down. The model is developed by exploiting recent developments in automated model-selection procedures.

  • #70
    JEL-Codes:
    C14; C52
    Keywords:
    Electricity Prices, Price Spikes, Autoregressive Conditional Duration, Autoregressive

    Forecasting Spikes in Electricity Prices

    Tim Christensen, Stan Hurn and Ken Lindsay

    In many electricity markets, retailers purchase electricity at an unregulated spot price and sell to consumers at a heavily regulated price. Consequently the occurrence of extreme movements in the spot price represents a major source of risk to retailers and the accurate forecasting of these extreme events or price spikes is an important aspect of effective risk management. Traditional approaches to modeling electricity prices are aimed primarily at predicting the trajectory of spot prices. By contrast, this paper focuses exclusively on the prediction of spikes in electricity prices. The time series of price spikes is treated as a realization of a discrete-time point process and a nonlinear variant of the autoregressive conditional hazard (ACH) model is used to model this process. The model is estimated using half-hourly data from the Australian electricity market for the sample period 1 March 2001 to 30 June 2007. The estimated model is then used to provide one-step-ahead forecasts of the probability of an extreme event for every half hour for the forecast period, 1 July 2007 to 30 September 2007, chosen to correspond to the duration of a typical forward contract. The forecasting performance of the model is then evaluated against a benchmark that is consistent with the assumptions of commonly-used electricity pricing models.