Faculty of Education - Research Publications

  • PISA Data: Raising concerns with its use in policy settings
    Gillis, S; Polesel, J; Wu, M (Springer, 2016-03)
    This article considers the role played by policy makers, government organisations and research institutes (sometimes labelled “think tanks”) in the analysis, use and reporting of PISA data for the purposes of policy advice and advocacy. It draws on the ideas of Rizvi and Lingard (Globalizing Education Policy, 2010), Bogdandy and Goldmann (Governance by Indicators: Global Power through Quantification and Rankings, 2012) and others to explore the ways in which such “agents of change” can interpret, manipulate and disseminate the results of large-scale assessment programs such as PISA to influence and determine political and/or educational research agendas. The article illustrates this issue by highlighting the uncertainty surrounding the PISA data that a number of prominent, high-profile agents of change have used to defend policy directions and advice. The final section highlights the need for policy makers and their advisors to become better informed about the technical limitations of international achievement data if such data are to be used to inform policy development and educational reforms.
  • Standards-referenced assessment for vocational education and training in schools
    Griffin, P; Gillis, S; Calvitto, L (Australian Council for Educational Research, 2007-04)
    This study examined a model of assessment that could be applied nationally to Year Twelve Vocational Education and Training (VET) subjects and that could yield both a differentiating score and recognition of competence. More than fifty colleges across all states and territories of Australia field-tested the approach over one school year. Results showed that a standards-referenced model could be used; that the approach was compatible with the diverse range of senior secondary assessment systems in use throughout Australia; and that there were considerable cost benefits in adopting the logic of item response modelling for the development of rubrics for scoring performances on units of competence from National Training Packages. A change in the logic of competency assessment was proposed: rather than rating performance indicators on a dichotomy, a series of quality-ordered criteria was used to indicate how well students performed specified tasks in the workplace or its simulation. The study validated the assessment development method, demonstrated its consistency, and showed how it could address the issue of consistency across states. It also proposed a set of principles for the joint assessment of quality and competence.