Finance - Theses

Now showing 1 - 7 of 7
  • Item
    User information satisfaction: the effect of a strategy-MIS scope alignment and the role of information system management
    Roberts, Elizabeth S. ( 1998)
    This thesis develops a theoretical model to improve our understanding of how the management of information systems/information technology (IS/IT) resources influences user information satisfaction. Understanding this relationship is important not only because of the significant proportion of total resources now devoted to IS/IT but also because organisations increasingly recognise that effective management of these resources is critical to achieving and maintaining competitive advantage. The study focuses on the possible causes of user information satisfaction. The core proposition developed is that the type of information provided by management information systems will be related to, or matched with, an organisational unit's strategy. It is then argued that, when this match exists, user information satisfaction is higher than it would otherwise be. Further, it is argued that the capacity of the information system function to manage the IS/IT resources throughout the organisation is a possible cause of this match. This organisational role of IS/IT resource management also affects the level of user satisfaction both directly and indirectly, through its effect on the alignment of strategy and type of information. The research model was developed by drawing together separate but related strands of the literature in the information systems, management and accounting disciplines, and from the results of previous studies. To that extent, the model is exploratory, but each of the individual propositions identified in the model, and subsequently tested, is theoretically based. To test the model and the research hypotheses, a survey was undertaken in manufacturing organisations. Data were collected from 192 production managers and 117 information system managers in 153 different manufacturing firms. The research model was evaluated both at the firm level of analysis and at the departmental level. Bivariate correlations, path analyses and structural equation modelling techniques were used to test the hypotheses and to establish the goodness-of-fit of the research model. The results from the statistical analysis strongly support the model. Users were found to be more satisfied with the information provided to them when the IS/IT resources were well managed and when organisational unit strategy was aligned with the scope of the information. Moreover, alignment between strategy and information scope was more likely in firms where the IS/IT resources were well managed. These results are encouraging, as they support the theoretical propositions developed here. The study therefore not only has the potential to contribute to the research literature but also has important practical implications for the effective management of IS/IT resources. It contributes to the literature by developing a theoretical framework that integrates a number of different strands of previous research in the related areas of accounting, management and information systems, and it adds to the literature in each of these three disciplines. It contributes to organisational understanding of the role of IS/IT resource management, of user satisfaction, and of the alignment between organisational unit strategy and management information system design. In so doing, it provides insights for more effective management of IS/IT resources and suggests reasons why some users are more satisfied with the information available in their organisations than others. While this research has limitations, it has the potential to make important contributions to theory and practice, and it provides several opportunities for future research.
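    The abstract does not report the model specification, so the following is only a minimal sketch of the kind of path analysis it describes, with hypothetical composite scores (is_management for IS/IT resource management quality, alignment for the strategy-information-scope match, uis for user information satisfaction) standing in for the thesis's constructs. Each structural path is estimated as an ordinary least-squares regression, giving the direct, indirect (mediated) and total effects of IS/IT management on satisfaction.

        # A minimal sketch of the hypothesised path model; variable names
        # are illustrative assumptions, not the thesis's actual measures.
        import pandas as pd
        import statsmodels.formula.api as smf

        def fit_paths(df: pd.DataFrame) -> dict:
            # Path 1: IS/IT resource management -> strategy/scope alignment.
            m1 = smf.ols("alignment ~ is_management", data=df).fit()
            # Path 2: alignment and IS/IT management -> user satisfaction,
            # capturing both the direct and the mediated route.
            m2 = smf.ols("uis ~ alignment + is_management", data=df).fit()
            indirect = m1.params["is_management"] * m2.params["alignment"]
            direct = m2.params["is_management"]
            return {"indirect_effect": indirect,
                    "direct_effect": direct,
                    "total_effect": indirect + direct}

    A full structural equation model, as used in the thesis, would additionally estimate the measurement model and overall goodness-of-fit, which this regression-based sketch omits.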
  • Item
    Loan contracting and the credit cycle
    Jericevic, Sandra Lynne ( 2002-04)
    The performance of financial institutions is significantly influenced by the actions of loan officers. The process by which lending decisions are made is therefore of critical interest to management, shareholders, and regulators alike. Indeed, the drain on bank capital that has often accompanied credit quality problems in the past has encouraged the search for new approaches to the management of lending and related activities. This thesis examines whether the existing governance and incentive techniques found in banks are sufficiently comprehensive in guiding loan decision-making. In the context of lending to the corporate sector, the study investigates the endogenous and exogenous influences surrounding the lending role, and assesses the implications for how loan officers are monitored, evaluated, and motivated to act in a financial institution’s best interests. By first developing an expanded model that conceptualizes the loan officer function, and then grounding this framework within a business cycle context, the study demonstrates the potential for governance and reward systems that are constant through time to have variable outcomes. Support for this hypothesis is provided by publicly available financial market information and other material gathered from private sources. A proposal is then advanced for the development of a management information system that identifies changes in the credit standards being applied, thereby enabling banks to benchmark and influence loan officer performance in the context of cyclically changing attitudes to risk and the effects on negotiating power.
  • Item
    An empirical study of corporate bond pricing with unobserved capital structure dynamics
    Maclachlan, Iain Campbell ( 2007-05)
    This work empirically examines six structural models of the term structure of credit risk spreads: Merton (1974), Longstaff & Schwartz (1995) (with and without stochastic interest rates), Leland & Toft (1996), Collin-Dufresne & Goldstein (2001), and a constant elasticity of variance model. The conventional approach to testing structural models has involved the use of observable data to proxy the latent capital structure process, which may introduce additional specification error. This study extends Jones, Mason & Rosenfeld (1983) and Eom, Helwege & Huang (2004) by using implicit estimation of key model parameters, resulting in an improved level of model fit. Unlike prior studies, the models are fitted to the observed dynamic term structure of firm-specific credit spreads, thereby providing a pure test of model specification. The models are implemented by adapting the method of Duffee (1999) to structural credit models, thereby treating the capital structure process as truly latent and simultaneously enforcing cross-sectional and time-series model constraints. Quasi-maximum likelihood parameter estimates of the capital structure process are obtained via the extended Kalman filter applied to actual market trade prices on 32 firms and 200 bonds for the period 1994 to 2000. We find that including an allowance for time-variation in the market liquidity premium improves model specification. A simple extension of the Merton (1974) model is found to have the greatest prediction accuracy, although all models performed with similar prediction errors. At between 28.8 and 34.4 percent, the root mean squared error of the credit spread prediction is comparable with reduced-form models. Unlike Eom, Helwege & Huang (2004), we do not find a wide dispersion in model prediction errors, as evidenced by an across-model average mean absolute percentage error of 22 percent. However, in support of prior studies, we find an overall tendency for slight underprediction, with a mean percentage prediction error of between -6.2 and -8.7 percent. Underprediction is greatest for bonds with short remaining tenor and low rating. Credit spread prediction errors across all models are non-normal and fatter-tailed than expected, with autocorrelation evident in their time series. More complex models did not outperform the extended Merton (1974) model; in particular, stochastic interest rates and early default accompanied by an exogenous write-down rate appear to add little to model accuracy. However, the inclusion of solvency ratio mean-reversion in the Collin-Dufresne & Goldstein (2001) model results in the most realistic latent solvency dynamics, as measured by its implied levels of asset volatility, default boundary, and mean-reversion rate. The extended Merton (1974) model is found to imply asset volatility levels that are too high on average when compared with observed firm equity volatility. We find that the extended Merton (1974) and the Collin-Dufresne & Goldstein (2001) models account for approximately 43 percent of the credit spread on average. For BB-rated trades, the explained proportion rises to between 55 and 60 percent. For investment-grade trades, our results suggest that the default-related portion of the credit spread is approximately double the previous estimate of Huang & Huang (2003). Finally, we find evidence that the prediction errors are related to market-wide factors exogenous to the models: the percentage prediction errors are positively related to the VIX and the change in GDP, and negatively related to the Refcorp-Treasury spread.
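    As a point of reference for the models above, the following is a minimal sketch of the baseline Merton (1974) credit spread on zero-coupon debt. The thesis's extensions (the latent capital structure estimated by extended Kalman filtering, stochastic interest rates, liquidity adjustments) are not reproduced here, and the parameter values in the usage comment are illustrative only.

        # Baseline Merton (1974) credit spread: the firm's assets follow
        # geometric Brownian motion and default occurs at maturity if
        # assets fall short of the debt's face value.
        from math import exp, log, sqrt
        from scipy.stats import norm

        def merton_spread(V, F, r, sigma, T):
            """V: firm asset value; F: face value of zero-coupon debt;
            r: risk-free rate; sigma: asset volatility; T: maturity (years).
            Returns the credit spread as a decimal."""
            d1 = (log(V / F) + (r + 0.5 * sigma ** 2) * T) / (sigma * sqrt(T))
            d2 = d1 - sigma * sqrt(T)
            # Risky debt value: default-free payoff weighted by the
            # risk-neutral survival probability, plus the firm value
            # recovered in default.
            debt = F * exp(-r * T) * norm.cdf(d2) + V * norm.cdf(-d1)
            y = -log(debt / F) / T   # continuously compounded promised yield
            return y - r             # spread over the risk-free rate

        # Illustrative values only:
        # merton_spread(V=100, F=60, r=0.05, sigma=0.25, T=5) ~ 0.0065 (65 bp)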
  • Item
    The corporate treasury function: risk management and performance measurement
    Sweeney, Mary Elizabeth Blundy ( 1997-05)
    The Australian financial system has changed dramatically in recent years, creating both threats and opportunities for value-adding activities. Many large corporations have set up a separate treasury division or department to handle their financing requirements. This thesis derives the rationale for a separate treasury function from the theory of the firm. A framework has been developed by drawing upon both the old theory of the firm (transaction cost economics) and the new theory of the firm (agency theory) to determine the appropriate governance structure for managing financial arrangements. Formal analysis of corporate treasury functions and performance measurement research has not kept pace with the growth of treasury activities. Appropriate benchmarks provide management with information to manage financial risk and to assess treasury performance more accurately. A benchmark is required for core treasury tasks, including debt portfolio management. Optimal treasury benchmarks are difficult to determine because of the complexity involved in measuring financial exposures for firms that derive income from physical, rather than financial, assets. The inter-relationships between financial risks, including maturity, interest rate and currency risk, further compound the problem. Decomposition of financial risk into these respective elements allows identification of the firm-specific factors that influence financial exposure. Appropriate benchmarks for managing repricing, refunding and foreign exchange risk depend upon the trade-off between transaction costs, agency costs and information signalling costs. Theory suggests that growth options in real assets within the firm's investment opportunity set provide opportunities for natural hedges that offset financial risk. However, empirical analysis of share price sensitivity to interest rates and of debt maturity structure indicates that growth options and agency factors are less important than firm-specific characteristics when setting up benchmark portfolios to manage financial risk. Treasuries are often classified as either active or passive managers, but when managing financial risk a continuum of strategies is possible, rather than just the two points at either end of a spectrum. Tolerance levels around the benchmark constrain activity within a relevant range: the more active the treasury, the broader the range. Constraints allow the degree of activity to be fine-tuned. The decision to actively manage risk depends upon whether value can be added in risk-adjusted terms. This is a function not only of whether opportunities exist but also of whether value can be added consistently, compared with a passive approach. The majority of practising treasurers describe themselves as 'active hedgers'. Subject to caveats outlined in the thesis, field experiments conducted over a three-year period indicate that the ability of corporate treasurers to add value to the firm by outperforming a passive benchmark portfolio of debt is limited. Respondents to an international survey on corporate treasury control and performance standards cited difficulty in setting benchmarks, particularly risk-adjusted benchmarks, as the major reason for not measuring treasury performance. Empirical determinants of benchmark structures for repricing risk, refunding risk and currency risk have been identified. A better understanding of the factors that determine financial risk will assist management in designing or refining benchmarks to manage financial risk and measure treasury performance.
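    The thesis does not publish its benchmark construction, so the following is only a hypothetical sketch of the comparison it describes: a treasury's realised periodic funding cost measured against a passive benchmark debt portfolio, with a tolerance band around the benchmark. The function name, inputs and 25 basis point band are illustrative assumptions, not the thesis's method.

        # Hypothetical benchmark comparison; all names and the tolerance
        # band are illustrative assumptions.
        import numpy as np

        def treasury_value_added(actual_cost_pct, benchmark_cost_pct, band_bp=25):
            """Inputs are periodic funding costs in percent; positive value
            added means the treasury funded more cheaply than the passive
            benchmark. Returns mean value added (bp) and the share of
            periods falling inside the tolerance band."""
            actual = np.asarray(actual_cost_pct, dtype=float)
            bench = np.asarray(benchmark_cost_pct, dtype=float)
            value_added_bp = (bench - actual) * 100.0  # percent -> basis points
            within_band = np.abs(value_added_bp) <= band_bp
            return value_added_bp.mean(), within_band.mean()

    A mean value added near zero, as the field experiments reported above suggest, would indicate little consistent outperformance of the passive benchmark.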
  • Item
    An architecture for computer-based accounting information systems
    Seddon, Peter ( 1991)
    The question addressed in this thesis is whether cost-effective, computer-based accounting systems can be used to generate "better" accounting information than existing transaction processing accounting systems. The first half of the thesis is devoted to gathering and summarizing information about how computer-based accounting systems work today, and what might constitute "better" accounting information. Data about present-day computer-based accounting systems were collected by mail questionnaires and personal reviews of widely used packaged accounting software. Information about what constitutes "better" accounting information was collected, first, by reviewing the normative accounting literature; second, by reviewing the empirical literature on the inflation accounting experiments in the US (SFAS 33) and UK (SSAP 16); and third, by reviewing the academic literature on computer-based accounting, particularly the work of Ijiri, McCarthy, and Weber. To reconcile the apparent conflict between (a) the empirical evidence that inflation accounting is essential in times of very high inflation and (b) the empirical studies' finding of very little additional information content in SFAS 33 and SSAP 16 reports, it is suggested that the benefits of inflation accounting only become apparent when general price-level changes exceed, say, 15-20% p.a. Thus the ideas of the normative theorists are not rejected, and it is concluded that a computer-based general ledger system that (a) is inflation-tolerant, (b) draws its data from the firm's transaction processing system database, and (c) can provide accounting reports based on different sets of accounting rules (called Multiview Accounting) would be likely to meet the objectives of the thesis. The second half of the thesis focuses on the design of such a system. To build inflation-tolerance into the profit measurement system, it is proposed that the constant-value journal entries and constant-value ledger account balances of conventional ledger systems be replaced by formulae like those in spreadsheets. It is shown that a coherent system of double-entry bookkeeping, called Formula Accounting, can be developed, in which ledger account balances may be functions of any variable that is likely to change in value over time, e.g., time itself, stock market prices, and price-index series. For automatic generation of Formula Accounting (FA) journal entries it is proposed that either the firm's many special-purpose transaction processing systems should be modified, or that a combination of (a) a specially-defined accounting data model (called the REE model) and (b) a computer program that encodes the rules used by accountants when they prepare journal entries (called an Interpreter) should be developed. To demonstrate the feasibility of these proposals, a prototype REE-Interpreter-FA system was developed in roughly 4,000 lines of Prolog. Multiview Accounting is illustrated by using the prototype system to generate Historical Cost and Current Cost/Constant Purchasing Power interpretations of representative Exchange Events for both a trading firm and a manufacturing firm.
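    The core Formula Accounting idea, that ledger balances are stored as formulae and evaluated on demand, can be sketched in a few lines. The prototype described above was written in Prolog; the Python below is only an illustrative reconstruction with hypothetical names (FormulaLedger, post, balance), not the thesis's design.

        # Minimal sketch of Formula Accounting: journal entries are
        # formulae, and the same ledger yields different "views"
        # (Multiview Accounting) depending on the environment supplied.
        from typing import Callable, Dict, List

        Env = Dict[str, float]

        class FormulaLedger:
            def __init__(self) -> None:
                self.entries: Dict[str, List[Callable[[Env], float]]] = {}

            def post(self, account: str, formula: Callable[[Env], float]) -> None:
                """A journal entry is a formula, not a constant amount."""
                self.entries.setdefault(account, []).append(formula)

            def balance(self, account: str, env: Env) -> float:
                """Evaluate the account under a given reporting view by
                supplying a different environment (date, index levels)."""
                return sum(f(env) for f in self.entries.get(account, []))

        ledger = FormulaLedger()
        # Inventory restated by a price index relative to acquisition (index 100):
        ledger.post("inventory", lambda env: 5000 * env["cpi"] / 100)
        print(ledger.balance("inventory", {"cpi": 100}))  # historical cost: 5000.0
        print(ledger.balance("inventory", {"cpi": 118}))  # restated: 5900.0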
  • Item
    Takeover waves: behaviour, motives and consequences
    Kendig, Caralee ( 1997)
    The objective of this thesis is to assist in the development of a comprehensive theory of takeover activity. It is proposed that the micro-based theories of takeovers developed by Roll (1986) and Jensen (1986) can be integrated with macro takeover theories and aspects of behavioural finance into an explanation of both firm-specific acquisitions and the takeover wave phenomenon. Specifically, the central hypothesis is that takeover waves are driven by overreactions and agency conflicts which emerge during capital market booms. Four research questions concerning the behaviour, origins, composition, and consequences of takeover activity are examined in the context of this hypothesis. Empirical evidence indicates that Australian takeovers can be characterised by a wave process. A new version of a nonlinear model is developed and applied to a new, consistent series of takeover data for the period 1955 to 1995. The takeover series is shown to follow a two-state Markov switching regime model, where the underlying processes are Poisson-distributed with first-order autoregressive properties. A filter developed by Hamilton (1989) is applied to identify one minor wave (1979-1980) and three major waves (1959-1961, 1969-1973 and 1988-1990). An examination of the origins of takeover waves reveals that they are positively related to increases in share prices and occur during periods of high business confidence. They do not appear to be caused by external shocks in specific industries or by changes in the regulatory environment. Further, the takeover wave phenomenon is evident in both conglomerate and related takeover series, and the incidence of multiple acquirers and competing bids does not change significantly during wave states. Finally, the consequences of takeover waves are explored. Between 1955 and 1995, returns to bidders are insignificant at 0.2%, and the target premium is statistically significant at 40.3%. Bidder abnormal returns are relatively low in takeover waves, although they are not significantly different from returns during normal activity. In contrast, the takeover premium exhibits a significant positive relationship with takeover waves. In general, these results provide considerable support for the managerial hypothesis of takeover waves advanced in this thesis. This has implications for a variety of market participants. It suggests that the market for corporate control is not entirely efficient, and that measures to align shareholder and manager interests may be appropriate during takeover waves. More importantly, it provides insight into the reconciliation of micro and macro theories of takeover motives, enabling an explanation of takeovers that can be both generalised to the aggregate level and reduced to the individual level.
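    The following is a minimal sketch of the Hamilton (1989) filter for a two-state Markov-switching Poisson count series, the class of model described above. The thesis also adds first-order autoregressive dynamics, which are omitted here, and all parameter values in the usage comment are illustrative, not estimates from the thesis.

        # Hamilton filter for a two-state Markov-switching Poisson model:
        # each period's takeover count is Poisson with a state-dependent
        # rate, and the state follows a first-order Markov chain.
        import numpy as np
        from scipy.stats import poisson

        def hamilton_filter(counts, lam, P, pi0=(0.5, 0.5)):
            """counts: observed counts; lam: (rate_normal, rate_wave);
            P[i, j] = Pr(state j at t | state i at t-1).
            Returns filtered Pr(wave regime | data up to t) per period."""
            lam = np.asarray(lam, dtype=float)
            P = np.asarray(P, dtype=float)
            prob = np.asarray(pi0, dtype=float)
            wave_prob = []
            for y in counts:
                pred = prob @ P               # one-step-ahead state probabilities
                lik = poisson.pmf(y, lam)     # state-conditional likelihoods
                post = pred * lik
                prob = post / post.sum()      # filtered state probabilities
                wave_prob.append(prob[1])
            return np.asarray(wave_prob)

        # Illustrative parameters only:
        # hamilton_filter(counts, lam=(8, 30),
        #                 P=np.array([[0.9, 0.1], [0.3, 0.7]]))

    Periods where the filtered wave probability is high correspond to the wave episodes the filter identifies in the annual takeover series.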
  • Item
    Manufacturing strategy and performance measurement system design
    Lillis, Anne M. ( 1998)
    This study explores the link between strategy and the use and design of manufacturing subunit performance measures by profit centre managers. More specifically, it examines the relationship between manufacturing competitive strategy and (1) the relative reliance on financial and non-financial performance measures for manufacturing management control, and (2) the way cost benchmarks used in financial performance measures are constituted to integrate non-financial dimensions of performance. These links between manufacturing strategy, reliance on performance measures and the constitution of financial benchmarks are also examined for their implications for performance measurement system effectiveness. Data were collected using semi-structured interviews in conjunction with a structured questionnaire administered to 36 profit centre managers and 12 manufacturing managers in 36 manufacturing firms in Victoria, Australia. (Open document for complete abstract)