Faculty of Education - Research Publications

  • A Report for the NSW Department of Education on Vocational Education and Training Delivered to Secondary Students
    Polesel, J; Gillis, S; Leahy, M; Guthrie, H; Klatt, M; Suryani, A; Firth, J (2019-11-01)
    This report presents the findings of an external review and analysis of relevant recent practices, research and data on the delivery of Vocational Education and Training (VET) to secondary students. The review and analysis were commissioned by the New South Wales (NSW) Department of Education and were conducted by the Centre for Vocational and Educational Policy at the University of Melbourne to identify best possible practices and make recommendations for future practice.
  • Evaluation of Chances for Children 2015
    Klatt, M; Gillis, S; Polesel, J; North, S (Chances for Children, 2015)
    This study was undertaken to evaluate the Chances for Children program, established in 2000, which supports children and young people in the Mallee region of Victoria and south-west New South Wales. The program aims to remove financial barriers for those who would otherwise be unable to achieve their full potential (in tertiary education, secondary school, or sporting and musical activities), and it also provides assistance for those with learning difficulties.
  • VET in Schools: strengthening delivery and assessment outcomes
    Gillis, S; Bateman, A; Dyson, C (TVET Australia, Commonwealth of Australia, 2011)
    The National Quality Council (NQC) funded a consortium led by Victoria University to design and implement 11 interactive sessions across each state and territory. The aim of the sessions was to communicate the outcomes of the NQC’s current work on assessment to schools involved in the delivery of VET qualifications recognised under the AQF. This included school-based RTOs as well as schools with partnership and/or auspicing arrangements with other RTOs. The findings of the NQC-funded projects to be communicated as part of the dissemination strategy included the reports and support materials available from the NQC website in relation to assessment, validation and moderation. The NQC completed these projects in 2009 and 2010 and was interested in ensuring that their key outcomes were shared with school teachers/assessors delivering nationally recognised VET qualifications.
  • Mapping adult literacy performance — support document
    Gillis, S; Dulhunty, M; Wu, M; Calvitto, L; Pancini, G (Victoria University, 2013)
    In 2010, the National Centre for Vocational Education Research (NCVER) conducted a preliminary study to determine the feasibility of mapping the performance levels of the international Adult Literacy and Life Skills Survey (ALLS) to those of the Australian Core Skills Framework (ACSF) using a Delphi technique (Circelli, Curtis & Perkins, 2011). In that study, a small number of adult literacy and numeracy experts used their professional judgement to qualitatively align a sample of ALLS items to the ACSF levels. At the completion of the study, there was general consensus among the participants that:
    • the mapping process was feasible for the Reading domain of the ACSF against the ALLS prose and document literacy domains, as well as for the Numeracy domains of the two frameworks; and
    • a larger-scale research study should be undertaken to empirically align the two frameworks onto a single scale for each of the two domains (i.e., Reading and Numeracy).
    NCVER subsequently commissioned Victoria University (Shelley Gillis) in conjunction with Educational Measurement Solutions (Margaret Wu and Mark Dulhunty) to undertake the larger-scale research study.
  • Career Development: Defining and Measuring Quality
    Rice, S; Gillis, S; Leahy, M; Polesel, J (Melbourne Graduate School of Education, The University of Melbourne, 2015)
    In August 2014, the NSW Legislative Assembly Public Accounts Committee (PAC) recommended that, by June 2015, the Department of Education and Communities conduct an evaluation of the quality and appropriateness of career advice provided in schools. To undertake such an evaluation, it is first necessary to determine what is meant by ‘quality and appropriateness of career advice provided in schools’. Given that there are no universal definitions or measures of quality career advice, the University of Melbourne has been engaged by the NSW DEC to:
    • undertake an extensive literature review to identify best-practice indicators of quality career advice within school contexts;
    • determine to what extent existing data from the three data collection programs within the DEC (i.e., the Expectations and Destinations of NSW Senior Secondary Students Survey, the Student Pathways Survey and the annual Online School to Work Program reporting) can be re-analysed to provide evidence of ‘quality career advice’ in accordance with the best-practice indicators identified; and
    • recommend further strategies and actions the Department should consider to adequately respond to the Public Accounts Committee’s recommendation.
  • Overview of the Empirical Validation of the Strengthened Australian Qualifications Framework
    Gillis, S; Dulhunty, M; Wu, M; Calvitto, L; Bateman, A (Australian Qualifications Framework Council, Australian Government, 2010)
    The aim of this study was to undertake an empirical analysis of the revised design of the strengthened Australian Qualifications Framework. In particular, four elements of the revised framework were to be examined: the levels structure, with 10 levels expressed as learning outcomes (referred to as 'levels criteria'); revised descriptors for each of the existing 14 qualification types (and two kinds: the Master's and Doctoral Degree qualification types each had two kinds, other and research) expressed as learning outcomes (referred to as 'qualification type descriptors'); the relationship between the qualification types and the levels structure; and an estimate of the notional duration of student learning for each qualification type. The major aims of the empirical validation were to: estimate the complexity of the criteria for each of the levels, and for each set, compare the estimates with the proposed 10-level structure; estimate the complexity of each qualification type descriptor for each of the 14 qualification types; identify any potentially redundant and non-discriminating levels criteria and/or qualification type descriptors; determine where each qualification type is typically positioned within the proposed 10-level structure; and investigate the adequacy of the suggested duration for each qualification type.
  • Empirical validation of the Strengthened Australian Qualifications Framework using Item Response Theory
    Gillis, S; Wu, M; Dulhunty, M; Calvitto, L; Bateman, A (2010)
    This study set out to empirically examine the revised architectural design of the Strengthened Australian Qualifications Framework. Four elements of the strengthened framework were to be examined:
    1. A levels structure with ten levels expressed as learning outcomes (referred to as Levels Criteria).
    2. Revised descriptors for each of the existing 14 Qualification Types (and two kinds) expressed as learning outcomes (referred to as Qualification Type Descriptors).
    3. The interaction between the Qualification Types and the Levels Structure.
    4. A measurement of the notional duration of student learning for each Qualification Type.
    The study was designed to examine the measurement properties of three of the four elements listed above (i.e., 1, 2 and 3). It was also designed to examine the appropriateness of the assigned notional duration of student learning for each Qualification Type (i.e., 4).
  • Industry e-validation of assessment exemplars: Independent Review Report
    Gillis, S; Clayton, B; Bateman, A (Flexible Learning Advisory Group, Australian Government, 2013)
    This report first summarises the common features of the six e-validation programs piloted in terms of the unit/qualification focus, industry areas, partnership arrangements and assessment methods validated. It then attempts to summarise the common steps and activities undertaken by each RTO to prepare and conduct the e-validation. Next, the resourcing implications for the industry partners are discussed in terms of professional development needs, technology requirements, financial and workload implications. Finally, the benefits of the program and its implications for policy and practice are discussed.
  • Validation and Moderation in Diverse Settings: Final Research Report to the National Quality Council
    Gillis, S; Bateman, A; Dyson, C (National Quality Council, 2010)
    The study was undertaken to address concerns raised by some stakeholder groups about the perceived quality and consistency of assessments undertaken by Registered Training Organisations (RTOs) in Australia's VET sector; that is, concerns that assessment standards in the sector were often not comparable. Ensuring the comparability of standards had become particularly pertinent in the VET sector, as assessments could be made across a range of contexts (e.g., vocational, educational and industrial settings) by a diverse range of assessors using highly contextualised, performance-based tasks that required professional judgement.
  • Guide for Developing Assessment Tools
    Gillis, S; Bateman, A; Clayton, B (TVET Australia, 2009)
    This Guide is a practical resource for assessors and assessor trainers seeking technical guidance on how to develop and/or review assessment tools. The Guide is not intended to be mandatory, exhaustive or definitive; rather, it is aspirational and educative in nature. The Guide has three sections. Section 1 explains what an assessment tool is, including its essential components. Section 2 identifies a number of ideal characteristics of an assessment tool and provides examples of how each characteristic can be built into the design of four assessment methods: observation, interview, portfolio and product-based assessments. Section 3 provides an overview of three quality assurance processes (i.e., panelling, piloting and trialling) that could be undertaken prior to implementing a new assessment tool.