Economics - Research Publications

Search Results

Now showing 1 - 10 of 763
  • Replication: Belief elicitation with quadratic and binarized scoring rules
    Erkal, N ; Gangadharan, L ; Koh, BH (Elsevier, 2020-12-01)
    Researchers increasingly elicit beliefs to understand the underlying motivations of decision makers. Two commonly used methods are the quadratic scoring rule (QSR) and the binarized scoring rule (BSR). Hossain and Okui (2013) use a within-subject design to evaluate the performance of these two methods in an environment where subjects report probabilistic beliefs over binary outcomes with objective probabilities. In a near replication of their study, we show that their results continue to hold with a between-subject design. This is an important validation of the BSR given that researchers typically implement only one method to elicit beliefs. Reported beliefs are less accurate under the QSR than under the BSR, favoring the BSR. Consistent with theoretical predictions, risk-averse subjects distort their reported beliefs under the QSR.
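The two elicitation rules can be made concrete with a short sketch. This is a hypothetical illustration of the generic QSR and BSR payoff formulas only; the payoff scale `a`, `b` and the `prize` are assumed values, not the parameterization used in the replication.

```python
import random

def qsr_payoff(report, outcome, a=1.0, b=1.0):
    """Quadratic scoring rule: payoff falls with the squared error of the
    reported probability. a and b are illustrative scale parameters."""
    return a - b * (outcome - report) ** 2

def expected_qsr(report, true_p):
    """Expected QSR payoff of a risk-neutral agent who believes
    Pr(outcome = 1) = true_p; maximized by reporting truthfully."""
    return true_p * qsr_payoff(report, 1) + (1 - true_p) * qsr_payoff(report, 0)

def bsr_payoff(report, outcome, prize=1.0, u=None):
    """Binarized scoring rule: the subject wins a fixed prize with
    probability 1 - (outcome - report)**2, so beliefs enter payoffs only
    through a win probability."""
    draw = random.random() if u is None else u  # u pins the draw for testing
    return prize if draw < 1.0 - (outcome - report) ** 2 else 0.0
```

For a risk-neutral agent the QSR is already incentive compatible (`expected_qsr` peaks at the truthful report); the BSR's advantage, as the abstract notes, is that truth-telling survives risk aversion because the report affects only the probability of winning a fixed prize.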
  • Intra-industry spill-over effect of default: Evidence from the Chinese bond market
    Hu, X ; Luo, H ; Xu, Z ; Li, J (WILEY, 2021-09)
    We investigate the intra-industry spill-over effect of defaults in the Chinese bond market by using a sample of public corporate debt securities for the period 2014–2018. We find that both industry portfolios and individual firms witness a strong contagion effect, which further spreads to the primary bond market, triggering a surge in the debt financing cost for default industries. Moreover, this contagion effect is stronger for low-competition industries and regulated industries, as well as when a default happens to state-owned enterprises. Better information access and higher bond liquidity alleviate the contagion effect, lending support to the information updates and liquidity dry-up hypotheses.
  • How to proxy the unmodellable: Analysing granular insurance claims in the presence of unobservable or complex drivers
    Avanzi, B ; Taylor, G ; Wong, B ; Xian, A (Institute of Actuaries, Australia, 2018)
    The estimation of claim and premium liabilities is a key component of an actuary's role and plays a vital part in any insurance company's operations. In practice, such calculations are complicated by the stochastic nature of the claims process as well as the impracticality of capturing all relevant and material drivers of the observed claims data. In the past, computational limitations have promoted the prevalence of simplified (but possibly sub-optimal) aggregate methodologies. However, in light of modern advances in processing power, it is viable to increase the granularity at which we analyse insurance data sets so that potentially useful information is not discarded. By utilising more granular and detailed data (that is usually readily available to insurers), model predictions may become more accurate and precise. Unfortunately, detailed analysis of large insurance data sets in this manner poses some unique challenges. Firstly, there is no standard framework to which practitioners can refer, and it can be challenging to tractably integrate all modelled components into one comprehensive model. Secondly, analysis at greater granularity or level of detail requires more intense scrutiny, as complex trends and drivers that were previously masked by aggregation and discretisation assumptions may emerge. This is particularly an issue with claim drivers that are either unobservable to the modeller or very difficult or expensive to model. Finally, computation times are a material concern when processing such large volumes of data, as model outputs need to be obtained in reasonable time-frames. Our proposed methodology overcomes the above problems by using a Markov-modulated non-homogeneous Poisson process framework. This extends the standard Poisson model by allowing for over-dispersion to be captured in an interpretable, structural manner. The approach implements a flexible exposure measure to explicitly allow for known or modelled claim drivers, while the hidden component of the hidden Markov model captures the impact of unobservable or practicably non-modellable information. Computational developments are made to drastically reduce calibration times. Theoretical findings are illustrated and validated in an empirical case study using Australian general insurance data in order to highlight the benefits of the proposed approach.
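A minimal simulation sketch of the model class described above: a hidden Markov chain modulates a Poisson claim intensity, and a per-period exposure measure carries the known or modelled drivers. The state intensities, transition matrix, and exposures below are illustrative assumptions, not the paper's calibrated framework.

```python
import math
import random

def poisson_draw(mean, rng):
    """Knuth's inversion algorithm for a Poisson(mean) variate."""
    limit, k, p = math.exp(-mean), 0, 1.0
    while True:
        p *= rng.random()
        if p <= limit:
            return k
        k += 1

def next_state(state, transition, rng):
    """Sample the next hidden state from a row-stochastic transition matrix."""
    u, cum = rng.random(), 0.0
    for j, prob in enumerate(transition[state]):
        cum += prob
        if u < cum:
            return j
    return len(transition[state]) - 1  # guard against rounding error

def simulate_mmpp(n_periods, intensities, transition, exposure=None, seed=0):
    """Markov-modulated Poisson claim counts: in period t the count is
    Poisson(exposure[t] * intensities[state]) and the hidden state then
    evolves as a Markov chain. Mixing over intensities induces the
    over-dispersion relative to a plain Poisson model."""
    rng = random.Random(seed)
    exposure = exposure or [1.0] * n_periods
    state, counts, path = 0, [], []
    for t in range(n_periods):
        path.append(state)
        counts.append(poisson_draw(intensities[state] * exposure[t], rng))
        state = next_state(state, transition, rng)
    return counts, path
```

Calibration of such a model to data (the hard part the abstract addresses) would typically use an EM or forward-backward scheme over the hidden path; the sketch above only shows the generative structure.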
  • On the Impact, Detection and Treatment of Outliers in Robust Loss Reserving
    Avanzi, B ; Taylor, G ; Wong, B ; Lavender, M (Actuaries Institute, 2016)
    The sensitivity of loss reserving techniques to outliers in the data or deviations from model assumptions is a well-known challenge. For instance, it has been shown that the popular chain-ladder reserving approach is highly sensitive to such aberrant observations, in that reserve estimates can be shifted significantly by even one outlier. In this paper, we first investigate the sensitivity of reserves and mean squared errors of prediction under Mack's Model. This is done through the derivation of impact functions, which are calculated by taking the first derivative of the relevant statistic of interest with respect to an observation. We also provide and discuss the impact functions for quantiles when total reserves are assumed to be lognormally distributed. Additionally, comparisons are made between the impact functions for individual accident year reserves under Mack's Model and the Bornhuetter-Ferguson methodology. It is shown that the impact of incremental claims on these statistics of interest varies widely throughout a loss triangle and is heavily dependent on other cells in the triangle. We then put forward two alternative robust bivariate chain-ladder techniques (Verdonck and Van Wouwe, 2011) based on Adjusted-Outlyingness (Hubert and Van der Veeken, 2008) and bagdistance (Hubert et al., 2016). These techniques provide a measure of outlyingness that is unique to each individual observation, rather than relying largely on graphical representations as is done under the existing bagplot methodology. Furthermore, the Adjusted-Outlyingness approach explicitly incorporates a robust measure of skewness into the analysis, whereas the bagplot captures the shape of the data only through a measure of rank. Results are illustrated on two sets of real bivariate data from general insurers.
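As context for the sensitivity result, here is a minimal deterministic chain-ladder sketch (not Mack's full stochastic model, and not the robust bivariate techniques). The triangle values are made up; the point illustrated is simply that a single aberrant cell propagates through the development factors and shifts every later accident year's reserve.

```python
def chain_ladder_reserves(triangle):
    """Classical chain-ladder on a cumulative claims triangle.

    triangle[i][j] = cumulative claims of accident year i at development
    year j; row i of an n-year triangle has n - i observed entries.
    Returns (development factors f_j, reserve per accident year)."""
    n = len(triangle)
    # Volume-weighted development factors: f_j = sum C[i][j+1] / sum C[i][j]
    factors = []
    for j in range(n - 1):
        num = sum(row[j + 1] for row in triangle if len(row) > j + 1)
        den = sum(row[j] for row in triangle if len(row) > j + 1)
        factors.append(num / den)
    reserves = []
    for row in triangle:
        # Roll the latest diagonal forward to ultimate with the factors
        ultimate = row[-1]
        for j in range(len(row) - 1, n - 1):
            ultimate *= factors[j]
        reserves.append(ultimate - row[-1])
    return factors, reserves
```

The impact functions studied in the paper correspond, loosely, to differentiating these reserve outputs with respect to a single cell of the triangle; because each cell enters several factor ratios, one outlier contaminates every projection that uses those factors.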
  • When Walras meets Vickrey
    Delacrétaz, D ; Loertscher, S ; Mezzetti, C (The Econometric Society, 2022-11)
    We consider general asset market environments in which agents with quasilinear payoffs are endowed with objects and have demands for other agents' objects. We show that if all agents have a maximum demand of one object and are endowed with at most one object, the VCG transfer of each agent is equal to the largest net Walrasian price of this agent. Consequently, the VCG deficit is equal to the sum of the largest net Walrasian prices over all agents. Generally, whenever Walrasian prices exist, the sum of the largest net Walrasian prices is a nonnegative lower bound for the deficit, implying that no dominant-strategy mechanism runs a budget surplus while respecting agents' ex post individual rationality constraints.
  • Double Markups, Information, and Vertical Mergers
    Loertscher, S ; Marx, LM (SAGE Publications, 2022-09-01)
    In vertical contracting models with complete information and linear prices, double markups that arise between independent firms provide an efficiency rationale for vertical mergers since these eliminate double markups (EDM). However, the double markups vanish even without vertical integration if the firms are allowed to use two-part tariffs. Hence, the efficiency rationale for vertical mergers in models of complete information requires restrictions on the contracts that firms can use. In a sense, with complete information, two-part tariffs are simply too powerful. If instead one allows incomplete information and removes the restriction on contract forms, then vertical mergers continue to have an effect that is analogous to EDM, but they also have the potential to affect the overall efficiency of the market to the detriment of society. Consequently, the social surplus effects of vertical integration depend on the underlying market structure, and vertical mergers are, in and of themselves, neither good nor bad. We illustrate through an example that with incomplete information, the private benefits from vertical integration tend to be excessive; that is, vertical mergers remain profitable even when they are socially harmful.
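The double-markup (EDM) logic the abstract starts from is the standard textbook double-marginalization example. The linear inverse demand and constant marginal cost below are illustrative assumptions for that benchmark, not the paper's incomplete-information model.

```python
def linear_demand_outcomes(a, b, c):
    """Double marginalization with inverse demand P = a - b*Q and constant
    marginal cost c (illustrative textbook numbers).

    Integrated monopolist: max (a - b*Q - c) * Q  ->  Q* = (a - c) / (2b).
    Separate firms trading at a linear wholesale price w:
      downstream best response  q(w) = (a - w) / (2b)
      upstream optimum          w*   = (a + c) / 2
    so total output falls to (a - c) / (4b) and the retail price rises:
    the double markup."""
    q_integrated = (a - c) / (2 * b)
    w = (a + c) / 2                      # upstream's optimal linear price
    q_separate = (a - w) / (2 * b)
    return {
        "q_integrated": q_integrated,
        "q_separate": q_separate,
        "p_integrated": a - b * q_integrated,
        "p_separate": a - b * q_separate,
    }
```

With a two-part tariff the upstream firm can instead set w = c and extract profit through the fixed fee, restoring the integrated quantity, which is why, as the abstract argues, the complete-information EDM rationale for vertical mergers hinges on restricting firms to linear prices.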
  • Monopoly Pricing, Optimal Randomization, and Resale
    Loertscher, S ; Muir, EV (UNIV CHICAGO PRESS, 2022-03-01)
  • Leadership selection: Can changing the default break the glass ceiling?
    Erkal, N ; Gangadharan, L ; Xiao, E (ELSEVIER SCIENCE INC, 2022-04)
  • Income and saving responses to tax incentives for private retirement savings
    Chan, MK ; Morris, T ; Polidano, C ; Vu, H (ELSEVIER SCIENCE SA, 2022-02)