Economics - Research Publications

Now showing 1 - 10 of 59
  • Item
    How to proxy the unmodellable: Analysing granular insurance claims in the presence of unobservable or complex drivers
    Avanzi, B ; Taylor, G ; Wong, B ; Xian, A (Institute of Actuaries, Australia, 2018)
    The estimation of claim and premium liabilities is a key component of an actuary's role and plays a vital part in any insurance company's operations. In practice, such calculations are complicated by the stochastic nature of the claims process as well as the impracticality of capturing all relevant and material drivers of the observed claims data. In the past, computational limitations have promoted the prevalence of simplified (but possibly sub-optimal) aggregate methodologies. However, in light of modern advances in processing power, it is viable to increase the granularity at which we analyse insurance data sets so that potentially useful information is not discarded. By utilising more granular and detailed data (that is usually readily available to insurers), model predictions may become more accurate and precise. Unfortunately, detailed analysis of large insurance data sets in this manner poses some unique challenges. Firstly, there is no standard framework to which practitioners can refer, and it can be challenging to tractably integrate all modelled components into one comprehensive model. Secondly, analysis at greater granularity or level of detail requires more intense levels of scrutiny, as complex trends and drivers that were previously masked by aggregation and discretisation assumptions may emerge. This is particularly an issue with claim drivers that are either unobservable to the modeller or very difficult/expensive to model. Finally, computation times are a material concern when processing such large volumes of data, as model outputs need to be obtained in reasonable time-frames. Our proposed methodology overcomes the above problems by using a Markov-modulated non-homogeneous Poisson process framework. This extends the standard Poisson model by allowing for over-dispersion to be captured in an interpretable, structural manner.
The approach implements a flexible exposure measure to explicitly allow for known/modelled claim drivers while the hidden component of the Hidden Markov model captures the impact of unobservable or practicably non-modellable information. Computational developments are made to drastically reduce calibration times. Theoretical findings are illustrated and validated in an empirical case study using Australian general insurance data in order to highlight the benefits of the proposed approach.
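    The central idea of the abstract — a hidden Markov state modulating a Poisson claim intensity, which produces over-dispersed counts — can be illustrated with a small simulation. This is a minimal sketch, not the paper's model: the two states, transition probabilities, intensities, and constant exposure below are invented for illustration.

```python
import math
import random

random.seed(42)

# Hypothetical 2-state Markov-modulated Poisson process: the hidden state
# switches the Poisson intensity, so the marginal count distribution is a
# mixture and its variance exceeds its mean (over-dispersion).
TRANSITION = {0: [0.95, 0.05], 1: [0.10, 0.90]}  # P(next state | state)
RATE = {0: 2.0, 1: 8.0}                          # intensity per unit exposure

def poisson(lam):
    """Poisson draw via Knuth's algorithm (stdlib only)."""
    l, k, p = math.exp(-lam), 0, 1.0
    while True:
        p *= random.random()
        if p <= l:
            return k
        k += 1

def simulate(n_periods, exposure=1.0):
    """Simulate claim counts under the hidden Markov intensity."""
    state, counts = 0, []
    for _ in range(n_periods):
        counts.append(poisson(RATE[state] * exposure))
        state = random.choices([0, 1], weights=TRANSITION[state])[0]
    return counts

counts = simulate(5000)
mean = sum(counts) / len(counts)
var = sum((c - mean) ** 2 for c in counts) / len(counts)
print(mean, var)  # variance well above the mean: over-dispersion
```

    A plain Poisson model forces variance equal to the mean; here the regime-switching intensity delivers the extra dispersion in a structural, interpretable way, which is the property the abstract highlights.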
  • Item
    Thumbnail Image
    Charitable Giving in the Laboratory: Advantages of the Piecewise Linear Public Goods Game
    Menietti, M ; Recalde, M ; Vesterlund, L ; Scharf, K ; Tonin, M (The MIT Press, 2018)
    The vast majority of US households make significant charitable contributions. When examining the effectiveness of the mechanisms fundraisers use to solicit such funds, it is often essential that researchers elicit or control the donor’s return from giving. While much can be gained from examining data on actual donations, insights on giving increasingly result from laboratory studies. An advantage of the laboratory is that it permits control of the donor’s return from giving and thus facilitates the identification of donor motives as well as their responses to different fundraising or solicitation strategies (see Vesterlund 2016 for a review).
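    For readers unfamiliar with the design named in the title, a piecewise linear public goods payoff can be sketched as follows. The single-kink form and all parameter values are illustrative assumptions, not the payoffs used in the paper; the point is that a declining marginal return from the group account lets the experimenter place the equilibrium and the social optimum at interior contribution levels, sharpening control over the donor's return from giving.

```python
def payoff(own_contribution, total_contribution, endowment=20.0,
           mpcr_low=0.6, mpcr_high=0.2, kink=40.0):
    """Hypothetical piecewise linear public goods payoff.

    The marginal per-capita return (MPCR) from the group account drops
    from mpcr_low to mpcr_high once total contributions pass `kink`.
    All parameters are illustrative, not taken from the paper.
    """
    public = (mpcr_low * min(total_contribution, kink)
              + mpcr_high * max(total_contribution - kink, 0.0))
    return endowment - own_contribution + public

# Below the kink each token given returns 0.6; above it only 0.2,
# so individual incentives to contribute vanish past the kink.
print(payoff(10, 40))  # group total at the kink
print(payoff(10, 60))  # group total past the kink
```

    In the standard linear game the MPCR is constant, so the equilibrium sits at a corner (give nothing or give everything); the kink is what moves it inside the choice set.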
  • Item
    Confidence Intervals for Ratios: Econometric Examples with Stata
    Lye, JN ; Hirschberg, JG (Elsevier BV, 2018)
    Ratios of parameter estimates are often used in econometric applications. However, testing these ratios can cause difficulties, since the ratio of asymptotically normally distributed random variables has a Cauchy distribution, for which there are no finite moments. This paper presents a method for the estimation of confidence intervals based on the Fieller approach, which has been shown to be preferable to the usual Delta method. Using example applications in both Stata and R, we demonstrate that a few extra steps in the examination of the estimate of the ratio may provide a confidence interval with superior coverage.
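    The Fieller construction the abstract refers to can be written down in a few lines. This is a generic textbook sketch (the function names and toy numbers are my own; the paper's Stata and R implementations are more complete): the Fieller interval is the set of ratios θ for which the Wald statistic of â − θb̂ is insignificant, which reduces to solving a quadratic in θ.

```python
import math

def fieller_ci(a, b, var_a, var_b, cov_ab=0.0, z=1.96):
    """Fieller confidence interval for the ratio a/b of two
    asymptotically normal estimates. Solves the quadratic in theta:
      (b^2 - z^2 var_b) theta^2 - 2 (a b - z^2 cov) theta + (a^2 - z^2 var_a) = 0
    Returns (lower, upper) when the interval is bounded."""
    A = b * b - z * z * var_b
    B = -2.0 * (a * b - z * z * cov_ab)
    C = a * a - z * z * var_a
    disc = B * B - 4.0 * A * C
    if A <= 0 or disc < 0:
        raise ValueError("Fieller interval is unbounded or empty")
    root = math.sqrt(disc)
    return (-B - root) / (2 * A), (-B + root) / (2 * A)

def delta_ci(a, b, var_a, var_b, cov_ab=0.0, z=1.96):
    """Delta-method interval for comparison: forced symmetric about a/b."""
    theta = a / b
    se = abs(theta) * math.sqrt(var_a / a**2 + var_b / b**2
                                - 2.0 * cov_ab / (a * b))
    return theta - z * se, theta + z * se

# Toy example: ratio 2/4 with equal variances on numerator and denominator.
print(fieller_ci(2.0, 4.0, 0.25, 0.25))  # asymmetric about 0.5
print(delta_ci(2.0, 4.0, 0.25, 0.25))    # symmetric about 0.5
```

    The asymmetry of the Fieller interval is the source of its better coverage: denominator uncertainty stretches the interval toward larger ratios, which the symmetric Delta interval cannot reflect.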
  • Item
    Grading Journals in Economics: The ABCs of the ABDC
    Hirschberg, JG ; Lye, JN (2018-01-01)
  • Item
    Opportunities and Challenges for CGE Models in Analysing Taxation
    Freebairn, J (WILEY, 2018-03)
    Taxation analysis seeks to describe the effects of current taxes, make forecasts and assess proposed reform options. In each case, the effects on market outcomes, distribution of the tax burden and distortions to decisions and economic efficiency are estimated. When second‐round effects are important, including for most taxes on business and where exemptions from comprehensive tax bases are significant, general equilibrium models are required. A computable general equilibrium (CGE) model with detailed and disaggregated industry, product and factor markets has great potential to quantify the general equilibrium effects of taxation. Challenges and areas for development of available CGE models for taxation analysis include the following: disaggregation of households to assess distribution effects and allow for different elasticities; modelling the effects of the hybrid tax treatment of different household saving and investment options; disaggregation of some business decisions to capture the effects of departures from comprehensive tax bases and of decision‐makers facing different tax systems; and modelling and conveying the implications of imperfect knowledge of key assumptions and parameters.
  • Item
    Time series copulas for heteroskedastic data
    Loaiza-Maya, R ; Smith, MS ; Maneesoonthorn, W (WILEY, 2018-04-01)
    We propose parametric copulas that capture serial dependence in stationary heteroskedastic time series. We suggest copulas for first‐order Markov series, and then extend them to higher orders and multivariate series. We derive the copula of a volatility proxy, based on which we propose new measures of volatility dependence, including co‐movement and spillover in multivariate series. In general, these depend upon the marginal distributions of the series. Using exchange rate returns, we show that the resulting copula models can capture their marginal distributions more accurately than univariate and multivariate generalized autoregressive conditional heteroskedasticity models, and produce more accurate value‐at‐risk forecasts.
  • Item
    A Review of the Recent Literature on the Institutional Economics Analysis of the Long-Run Performance of Nations
    Lloyd, P ; Lee, C (WILEY, 2018-02)
    This paper reviews the recent (post‐2000) literature that assesses the importance of institutions as a factor determining cross‐country differences in growth rates or in the contemporary level of “prosperity.” It first sketches how institutional economics has evolved. It then examines critically the methods of analysis employed in the recent literature. The paper finds that this literature has made a major contribution to the analysis of the causes of economic growth, but the relative importance of institutions as a determinant of long‐run growth and prosperity is still a wide-open question.
  • Item
    Environmental Water Efficiency: Maximizing Benefits and Minimizing Costs of Environmental Water Use and Management
    Horne, AC ; O'Donnell, EL ; Loch, AJ ; Adamson, DC ; Hart, B ; Freebairn, J (WILEY, 2018)
    Environmental water management is a relatively new discipline, with concepts, management practice and institutional mechanisms that are still emerging. The efficient and effective use of environmental water to maximize environmental benefits, or environmental water use efficiency, is one such emerging concept. Currently, much of the focus is on allocative efficiency, where the objective is to achieve a better balance between consumptive and environmental water uses in a cost‐effective way. However, this may not provide the most efficient and effective way to manage environmental water in the long term, where managers are seeking productive (or operational) efficiency. Here, the objective is to maximize environmental outcomes relative to the cost of managing the available resource. This paper explores the concept of water use efficiency in the context of environmental water.
  • Item
    Surprised by the Hot Hand Fallacy? A Truth in the Law of Small Numbers
    Miller, JB ; Sanjurjo, A (Econometric Society, 2018-11-01)
    We prove that a subtle but substantial bias exists in a common measure of the conditional dependence of present outcomes on streaks of past outcomes in sequential data. The magnitude of this streak selection bias generally decreases as the sequence gets longer, but increases in streak length, and remains substantial for a range of sequence lengths often used in empirical work. We observe that the canonical study in the influential hot hand fallacy literature, along with replications, are vulnerable to the bias. Upon correcting for the bias, we find that the longstanding conclusions of the canonical study are reversed.
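    The streak selection bias the abstract describes is easy to reproduce by simulation. A minimal sketch under stated assumptions (a fair coin, the sequence-level "proportion of successes after a streak" estimator, and sample sizes chosen for illustration), not the paper's formal derivation:

```python
import random

random.seed(0)

def prop_after_streak(seq, k):
    """Proportion of 1s among flips that immediately follow k consecutive
    1s; None if the sequence offers no such opportunity."""
    hits = opps = 0
    for i in range(k, len(seq)):
        if all(seq[i - j] == 1 for j in range(1, k + 1)):
            opps += 1
            hits += seq[i]
    return hits / opps if opps else None

def expected_prop(n_flips, k, n_sims):
    """Average the per-sequence proportion over many fair-coin sequences,
    conditioning (as empirical studies do) on at least one opportunity."""
    props = []
    for _ in range(n_sims):
        seq = [random.randint(0, 1) for _ in range(n_flips)]
        p = prop_after_streak(seq, k)
        if p is not None:
            props.append(p)
    return sum(props) / len(props)

# Each flip is fair, so the truth is 0.5, yet the sequence-level average
# comes out noticeably below 0.5: selecting flips that follow streaks
# within a finite sequence biases the estimator downward.
est = expected_prop(n_flips=100, k=3, n_sims=2000)
print(round(est, 3))
```

    This is the paper's core observation: a "no hot hand" finding of roughly 50% success after streaks, measured this way, is actually evidence of a hot hand, because the unbiased benchmark under independence is below 50%.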
  • Item
    Policy Forum: Cryptocurrencies Introduction
    Castelnuovo, E (WILEY, 2018-12)