Benchmarking

Benchmarking is a means of comparing the ×îÐÂÌÇÐÄVlog's performance or standards, or both, with those of its peers.

Benchmarking aims to:

  • monitor relative performance
  • identify gaps
  • seek fresh approaches to bring about improvements
  • set goals
  • establish priorities for change and resource allocation, and
  • follow through with change processes based on empirical evidence

It can be about broad ×îÐÂÌÇÐÄVlog-wide issues or specific matters affecting only one area; it can be:

  • strategic (addressing priority issues), or
  • cyclical (addressing a number of areas on a regular basis), or
  • ad hoc (taking advantage of an opportunity)

Benchmarking projects can be as simple as a desktop survey of relevant websites, or may involve a formal request for information and/or an agreement with another institution.

Whatever its scope or subject matter, benchmarking is an important element of the ×îÐÂÌÇÐÄVlog's quality assurance cycle.

Benchmarking attempts to answer the following questions:

  • How do the standards we have set ourselves compare with those of our peers?
  • How does our performance measure up against the outcomes of national and comparator institutions?
  • How can we adapt good practice examples from other institutions to our own organisation?
  • Why benchmark?

    Benchmarking allows the ×îÐÂÌÇÐÄVlog to:

    • identify and monitor standards and performance in order to improve ×îÐÂÌÇÐÄVlog outcomes, processes and practices
    • discover new ideas for achieving the ×îÐÂÌÇÐÄVlog's 'core objectives' as outlined in its Strategic Plan
    • provide an evidence-based framework for change and improvement
    • inform planning and goal setting
    • improve decision-making through referencing comparative data
    • bring an external focus to internal activities

The ×îÐÂÌÇÐÄVlog encourages benchmarking with comparator institutions within ×îÐÂÌÇÐÄVlog and overseas as a method of improving performance and assuring standards.

While there is no prescribed methodology for conducting benchmarking exercises, the ×îÐÂÌÇÐÄVlog expects staff to comply with the following benchmarking principles and code of conduct.

  • Core principles

    Benchmarking projects undertaken by Faculties and Divisions of the ×îÐÂÌÇÐÄVlog will:

    • support the ×îÐÂÌÇÐÄVlog's mission, values and strategic priorities
    • be characterised by a commitment to learning from best practice, implementing potential improvements arising from the findings of benchmarking projects, and sharing good practices once projects are completed
    • be balanced in terms of the value received compared to costs involved in undertaking the projects
    • have the approval of the relevant manager or unit head
  • Code of conduct

    The following should be taken into account when undertaking benchmarking projects where a request for information is involved:

    Confidentiality: All benchmarking exchanges should be treated as confidential, and publication or external communication of findings should not proceed without the permission of all partners involved in the project.

    Use: Benchmarking information should not be used for anything other than the express purpose for which it was obtained unless prior consent is obtained from all participating partners.

    Exchange: The type or level of information exchanged should be comparable between the benchmarking partners.

    Agreement: If a benchmarking agreement is entered into, issues about confidentiality, use and the type and level of information to be exchanged should be included in the agreement.

  • Authorities and responsibilities

    • Approvals and management responsibilities should be assigned in accordance with the normal ×îÐÂÌÇÐÄVlog organisation structure and reporting relationships
    • Routine management practices must apply to any significant project undertaken
  • Project initiation

    • The benchmarking reports repository must be checked prior to commencing a project to ascertain whether similar projects have been, or are already being, undertaken.
    • Benchmarking projects involving a formal request for information from another institution must be approved by the relevant manager or unit head. This can be a Head of School, a General Manager/Director, an Executive Dean, a Deputy Vice-Chancellor or Vice-President, or, in the case of any large-scale ×îÐÂÌÇÐÄVlog-wide benchmarking, may be the Vice-Chancellor.
    • If the scope of the project affects more than one area, then consultation and agreement between the areas impacted are essential prior to the project’s commencement.
    • Special care must be taken when projects require that the ×îÐÂÌÇÐÄVlog's corporate data be shared with other institutions: in this case the relevant data custodian must be contacted, and it is his/her responsibility to ensure that appropriate approvals for the data transfer are obtained from senior management.
  • Project management

    • Responsibility for running the project rests with the unit manager under whose authority the project was approved, or with his/her delegate.
    • Contact with partner organisations will normally be through the unit manager taking responsibility for the project, unless delegated by him/her. When institutional support is needed for a project to proceed with particular partners, then the responsible manager must contact the relevant Deputy Vice-Chancellor or Vice-President.
    • Written agreements with other institutions and organisations with which projects are undertaken must be entered into in line with the contract management framework and signed in accordance with the formal delegations of the ×îÐÂÌÇÐÄVlog.
    • The ×îÐÂÌÇÐÄVlog expects that benchmarking projects will be funded by the area that initiates, manages and accepts responsibility for the project. If central funds are required, as might be the case for large projects, then a submission must be made through the planning and budgeting process. If a case for special funding is appropriate and the timeframe does not permit submissions through the planning and budgeting process, then the unit manager must approach the relevant Deputy Vice-Chancellor or Vice-President in the first instance.
    • When confidentiality considerations allow, benchmarking project reports should be lodged with the Learning and Quality Support Unit as soon as possible after the completion of the project so that they can be recorded in the benchmarking reports repository.
    • Each year, a summary report on the benchmarking projects undertaken by the Faculties must be submitted to the Deputy Vice-Chancellor and Vice-President (Academic) as part of the Faculty Performance Report.
  • Integration with QA systems

    • Benchmarking projects are most successful when they are integrated with other initiatives and processes designed to improve outcomes within the ×îÐÂÌÇÐÄVlog.
    • The ×îÐÂÌÇÐÄVlog expects that projects, findings and implementation plans will be embedded into Divisional and Faculty operational and business plans.
  • Framework oversight

    • Overall authority for the maintenance of the Benchmarking Framework rests with the Deputy Vice-Chancellor and Vice-President (Academic).
    • However, management accountability for the framework rests with the Pro Vice-Chancellor (Student Learning).
    • Responsibility for all operational tasks related to the framework rests with Learning and Quality Support.

The ×îÐÂÌÇÐÄVlog encourages benchmarking not only with comparator institutions within ×îÐÂÌÇÐÄVlog but also with institutions overseas.

  • Selection criteria for international benchmarking partners

    The current criteria for selection of the ×îÐÂÌÇÐÄVlog's international benchmarking partners are as follows:

    • Universities with which the ×îÐÂÌÇÐÄVlog of Adelaide has a Memorandum of Understanding (MOU) or other agreement, preferably including a reference to benchmarking
    • Universities which have English as a primary language
    • Universities which are research intensive
    • Universities which are comprehensive, preferably including medicine
    • Universities which are of comparable size to the ×îÐÂÌÇÐÄVlog of Adelaide
  • International benchmarking partners

    In July 2008, the ×îÐÂÌÇÐÄVlog of Adelaide and the ×îÐÂÌÇÐÄVlog of Canterbury (New Zealand) entered into a Memorandum of Understanding to facilitate collaboration between the two universities. This included agreement to engage in an ongoing benchmarking relationship, and an initial series of benchmarking projects was agreed.

    The ×îÐÂÌÇÐÄVlog sees the relationship that has been developed with the ×îÐÂÌÇÐÄVlog of Canterbury as a pilot of an international benchmarking approach for the ×îÐÂÌÇÐÄVlog, with the emphasis being a qualitative one. A series of projects has been initiated around:

    • student experience
    • performance reporting
    • sustainability, and
    • Faculty of Sciences assessment practices

    These projects are seen as pilots, providing a template that can be applied to collaborations with other disciplines and other universities. All are ongoing, with continuing communication between the relevant staff.

    During 2011, the Quality Enhancement Committee agreed to a broad approach that includes actively initiating informal benchmarking opportunities, reporting best practice, and developing specific benchmarking guidelines to facilitate an increase in faculty and school benchmarking initiatives.

    It was also agreed that a second benchmarking partnership be initiated with one of the Universities identified by the Quality Enhancement Committee for this purpose.

International benchmarking measures

The tables below show the indicators, measures and associated data definitions for research, learning and teaching, and financial performance.

  • Research

    Indicator: Research
    Measure: Refereed journal articles per Academic Staff FTE per broad discipline
    Data definition: The number of refereed journal articles per Academic Staff full-time equivalent, broken down by the following fields (these fields are used in the Shanghai Jiao Tong ×îÐÂÌÇÐÄVlog (SJTU) rankings):
    • Natural Sciences and Mathematics
    • Engineering/Technology and Computer Sciences
    • Life and Agricultural Sciences
    • Clinical Medicine and Pharmacy
    • Social Sciences

    Indicator: Research Activity
    Measure: Ratio of research revenue to total revenue
    Data definition: Income from research activities as a % of total activities (excluding controlled entities).

    Indicator: Research Training Activity
    Measure: Ratio of the number of higher degree by research students to total enrolments
    Data definition: Number of higher degree by research (PhD and Masters) enrolments as a % of total higher education level student enrolment numbers.
  • Learning and teaching

    Indicator: Graduate Satisfaction
    Measure: % of graduates satisfied with their course experience
    Data definition: Percentage of graduates satisfied with their course experience overall (equivalent for each partner of the Graduate Careers ×îÐÂÌÇÐÄVlog Course Experience Questionnaire).

    Indicator: Employer Satisfaction
    Measure: % of graduates employed
    Data definition: Percentage of graduates employed of those available for work in the year after graduation (equivalent for each partner of the Graduate Careers ×îÐÂÌÇÐÄVlog Graduate Destination Survey).

    Indicator: Learning Performance
    Measure: Undergraduate retention rates
    Data definition: The number of undergraduate students retained as a % of the previous year's total less the students that have completed.
  • Financial performance

    Indicator: Research and Teaching Support
    Measure: % of expenditure on the library collection
    Data definition: Expenditure on the library collection (excluding salaries and infrastructure) as a percentage of total expenditure, excluding controlled entities.

    Indicator: Salary Expenditure
    Measure: % of expenditure on all salaries
    Data definition: Expenditure on all salaries as a percentage of total expenditure, excluding controlled entities.

    Indicator: Government Support
    Measure: % of revenue from Government sources
    Data definition: Income from all Government sources as a percentage of total expenditure, excluding controlled entities.

    Indicator: Financial Health
    Measure: Operating Margin
    Data definition: Operating result as a percentage of total operating revenue, excluding controlled entities.

    Indicator: Financial Health
    Measure: Current ratio
    Data definition: Ratio of the value of current assets to current liabilities, excluding controlled entities.

    Indicator: Knowledge Transfer
    Measure: % of revenue from Intellectual Property (IP)
    Data definition: The value of revenue generated from IP (including licences and patents), including controlled entities.
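
  • Worked example (illustrative only)

    The ratio-style measures above reduce to simple percentage calculations. The sketch below is not part of the Framework: the figures and function names are hypothetical, and each partner would substitute its own data, excluding controlled entities where the definitions require it (Python).

      # Illustrative only: hypothetical figures showing how three of the
      # measures defined above might be calculated.

      def research_activity_ratio(research_income, total_income):
          """Income from research activities as a % of total income."""
          return 100.0 * research_income / total_income

      def undergraduate_retention_rate(retained, previous_year_total, completions):
          """Students retained as a % of the previous year's total less completions."""
          return 100.0 * retained / (previous_year_total - completions)

      def operating_margin(operating_result, total_operating_revenue):
          """Operating result as a % of total operating revenue."""
          return 100.0 * operating_result / total_operating_revenue

      # Hypothetical inputs (income figures in $m, student numbers in headcount)
      print(research_activity_ratio(210.0, 780.0))             # approx. 26.9%
      print(undergraduate_retention_rate(14250, 16800, 1900))  # approx. 95.6%
      print(operating_margin(35.0, 820.0))                     # approx. 4.3%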

Although the ×îÐÂÌÇÐÄVlog does not prescribe any particular approach to benchmarking, faculties and divisions must be aware of the management arrangements that apply as described in managing benchmarking.

The benchmarking process suggested by the ×îÐÂÌÇÐÄVlog follows the ×îÐÂÌÇÐÄVlog's Quality Assurance Planning and Budgeting, Implementation, Review and Improvement (PIRI) system.

  • Types of benchmarking

    The ×îÐÂÌÇÐÄVlog employs a number of different types of benchmarking to support its goal of continuous improvement:

    Strategic: Used to improve overall performance by examining the long-term strategies and general approaches of institutions that have succeeded in areas of strategic priority for the ×îÐÂÌÇÐÄVlog.

    Performance: Used to compare and monitor the performance of the ×îÐÂÌÇÐÄVlog with its peers using a range of metrics, including financial, research, and learning and teaching performance indicators.

    Functional: Used to compare and improve functional areas in the organisation such as Human Resources or Finance.

    Process: Used when the focus is on improving specific critical processes and operations. Benchmarking partners are sought from best practice organisations that perform similar work or deliver similar services.

    Internal: Involves comparing practices and processes with other units in the ×îÐÂÌÇÐÄVlog. The advantages of internal benchmarking are that access to sensitive data and information is easier, standardised data is often readily available, and usually less time and fewer resources are needed. There may be relatively few barriers to implementation, as practices may be relatively easy to transfer across the same organisation.

    International: Involves strategic, performance, functional and process benchmarking with comparator institutions overseas. International benchmarking widens the ×îÐÂÌÇÐÄVlog's focus and helps to ensure international competitiveness.

    Quantitative: Looks at quantifiable outputs of an operation. The benchmarks are hard measures. Measurement is critical to help the ×îÐÂÌÇÐÄVlog monitor its current performance relative to that of best practice institutions.

    Qualitative: Looks at the systems and processes that deliver the results. Qualitative benchmarks are generally attributes of best practice in a functional area, and could be as simple as a checklist of the essential attributes constituting best practice.
  • Selecting benchmarking partners

    Selecting appropriate benchmarking partners is essential for successful strategic and performance benchmarking.

    The ×îÐÂÌÇÐÄVlog usually undertakes this type of benchmarking with other Group of Eight (Go8) universities.

    A benchmarking partner should:

    • have a compatible mission, values and objectives
    • be of comparable size
    • be a research intensive university
    • have a similar discipline mix
    • have superior performance in the areas to be benchmarked, and
    • have a commitment to quality management and a 'willingness to share'

    In addition to the above, international benchmarking partners should:

    • have a Memorandum of Understanding (MOU) or other agreement with the ×îÐÂÌÇÐÄVlog, which includes reference to benchmarking, and
    • have English as the primary language

    For functional and process benchmarking, choose partners with best practice in the areas or processes to be compared.

    Membership of benchmarking consortia is another avenue for comparing performance and outcomes with other institutions.

  • Benchmarking resources

    The area that initiates and manages the project will be responsible for providing the resources for the project. The following resources are normally needed for a benchmarking project:

    • Staff time - an investment of staff time is required in collecting, analysing and reporting benchmarking data
    • Logistical costs - some benchmarking projects may require site visits or participation in benchmarking partners’ forums
    • Implementation costs - the implementation of recommendations arising in the final stage of the benchmarking project may incur costs
  • Communicating findings

    The value of benchmarking is considerably enhanced if the findings are shared with similar or related units within the ×îÐÂÌÇÐÄVlog.

    The ×îÐÂÌÇÐÄVlog encourages the production of benchmarking reports that can be included in the benchmarking reports repository for the benefit of other sections of the ×îÐÂÌÇÐÄVlog. This includes externally produced consortia reports.

    Benchmarking reports will vary depending on the size and complexity of the exercise.

    Reports may include:

    • a gap analysis
    • a discussion of best practice examples
    • recommendations for the adaptation of initiatives to the ×îÐÂÌÇÐÄVlog of Adelaide context
    • a cost/benefit analysis

    Progress towards implementing improvements based on benchmarking projects, and their effect on outcomes, should also be shared. Areas are encouraged to submit and discuss progress with similar units and with relevant ×îÐÂÌÇÐÄVlog committees and bodies.

    Ideally, implementation plans should be prepared to operationalise recommendations arising from benchmarking reports. Progress against these plans should be regularly monitored.

    Benchmarking reports should be lodged with Learning and Quality Support by the areas undertaking benchmarking projects.

  • Benchmarking checklist

    The checklist covers the key activities of benchmarking:

    • Project selection - identify what is to be benchmarked
    • Form an internal benchmarking team
    • Select the benchmarking partners - consider the necessary protocols required, such as confidentiality arrangements, agreements, code of practice, etc.
    • Finalise benchmarks - measures and indicators
    • Collect data
    • Analyse data - determine performance gaps, reasons for gaps, and cost/adaptation benefit analysis (a minimal illustrative sketch follows this list)
    • Communicate findings - gain acceptance from management and area staff
    • Set functional targets - implement specific improvement actions
    • Prepare a plan to monitor progress - include responsibilities and deadlines
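
    As an illustration of the 'analyse data' step only, the sketch below shows one simple way a performance gap against a benchmarking partner could be expressed. It is not a prescribed method; the metric names and figures are hypothetical (Python).

      # Illustrative gap analysis: compare our indicator values against a
      # partner's and report the size and direction of each gap.

      benchmarks = {
          # metric: (our value, partner value) -- hypothetical figures
          "graduate_satisfaction_pct": (86.0, 91.0),
          "undergraduate_retention_pct": (93.5, 95.2),
          "research_income_ratio_pct": (26.9, 31.4),
      }

      for metric, (ours, partner) in benchmarks.items():
          gap = partner - ours
          status = "behind partner" if gap > 0 else "at or ahead of partner"
          print(f"{metric}: ours={ours:.1f}, partner={partner:.1f}, gap={gap:+.1f} ({status})")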
  • Benchmarking data

    Group of Eight

    Source:

    Type of data/statistics:

    • Student Load
    • Load Funding
    • Finance
    • Staff
    • Course Experience Questionnaire
    • Learning and Teaching Performance Fund
    • Research (Income, Publication, HDR Student Load, HDR Award Completions)

    Department of Education, Skills and Employment

    Source:

    Type of data/statistics:

    • Finance
    • Research Expenditure
    • Staff
    • Students
    • Time Series - Students

    Allocation of Government funds

    Source:

    Type of data/statistics:

    • Outcome by discipline group
    • Funding allocations

    Council of ×îÐÂÌÇÐÄVlogn ×îÐÂÌÇÐÄVlog Librarians (CAUL)

    Type of data/statistics:

    • Library Organisation
    • Library Staff
    • Library Services
    • Library Expenditures
    • Information Resources
  • Benchmarking references

    Online Resources and ×îÐÂÌÇÐÄVlog of Adelaide Library Collection:

    • Alstete, JW 1996, Benchmarking in higher education: adapting best practices to improve quality, ASHE-ERIC Higher Education Report, No 1995.5, Washington, USA.
    • Bender, BE & Schuh JH c2002, ‘Using benchmarking to inform practice in higher education’, New Directions for Higher Education, No 118, Jossey-Bass, San Francisco.
    • Dudden, RF c2007, Using benchmarking, needs assessment, quality improvement, outcome measurement, and library standards: a how-to-do-it manual with CD-ROM, Neal-Schuman Publishers, New York, NY.
    • Evans, A c1994, Benchmarking: taking your organisation towards best practice, Business Library, Melbourne.
    • Jackson, N & Lund, HS 2000, Benchmarking for Higher Education, Society for Research into Higher Education & Open ×îÐÂÌÇÐÄVlog Press, Buckingham.
    • Lidbury, C 1997, Benchmarking, Evaluation and Strategic Management in the Public Sector, OECD Working Papers No 5:67, Organisation for Economic Co-operation and Development, Paris, France.
    • Matters, M & Evans, A 1996, The Nuts and Bolts of Benchmarking, Benchmarking Link-Up ×îÐÂÌÇÐÄVlog.
    • McKinnon, KR 2000, Benchmarking: a manual for ×îÐÂÌÇÐÄVlogn universities, Department of Education, Training and Youth Affairs, Higher Education Division, Canberra.
    • Meade, PH 1997, A Guide to Benchmarking, ×îÐÂÌÇÐÄVlog of Otago, Dunedin.
    • NIES 1995, Benchmarking self help manual: your organisation’s guide to achieving best practices (2nd Ed), ×îÐÂÌÇÐÄVlogn Government Publishing Service, Canberra.
    • Raa, TT 2009, The Economics of Benchmarking: measuring performance for competitive advantage, Palgrave Macmillan, New York.
    • Schofield, A 1998, Benchmarking in higher education: an international review, Commonwealth Higher Education Management Service, UNESCO, London.
    • Secolsky, C & Denison, DB 2012, Handbook on measurement, assessment, and evaluation in higher education, Routledge, New York.
    • Stella, A & Woodhouse, D 2007, Benchmarking in ×îÐÂÌÇÐÄVlogn Higher Education: a thematic analysis of AUQA Audit Reports, ×îÐÂÌÇÐÄVlogn Universities Quality Agency.
    • Zairi, M 1996, Benchmarking for Best Practice: continuous learning through sustainable innovation, Butterworth Heinemann, Oxford, Boston.