Guidance note: Monitoring and analysis of student performance
Providers should note that Guidance Notes are intended to provide guidance only. They are not definitive or binding documents. Nor are they prescriptive. The definitive instruments for regulatory purposes remain the TEQSA Act and the Higher Education Standards Framework as amended from time to time.
Context
While many higher education providers monitor and analyse student performance data in some form, TEQSA has identified that, in many instances, student performance data could be enhanced and used more effectively by providers to identify problems and risks early. This includes, for example, identifying problems with English language admissions settings, agents, course delivery, or academic integrity risks (among many others).
The Higher Education Standards Framework (HES Framework) requires providers to monitor and analyse student performance data. TEQSA considers this a key component of a provider’s self-assurance processes.
This guidance aims to assist providers in undertaking monitoring and analysis of student data in line with the requirements of the Standards. TEQSA recognises that every provider is different and compliance with the HES Framework can be demonstrated in different ways according to the context of the provider. Each provider should determine the most effective way to implement this type of monitoring and analysis in its particular circumstances.
What does ‘student performance’ encompass?
All providers want their students to perform well and achieve the expected learning outcomes. Providers must be able to identify students who are at risk of not performing well (whether the risk is predicted or observed). This enables providers to intervene early, support those students and reduce the likelihood of these risks materialising in the future.
Typical indicators of student performance include:
- attrition rates
- progress rates
- completion rates
- grade distributions
- student satisfaction
- graduate success.
Analysis of student performance: How it relates to quality
Understanding student performance through monitoring and analysis is critical to successful higher education. The insights and benefits are considerable, including enabling:
- early identification of problems, allowing remedial action to be taken to avoid a lasting adverse impact
- identification (and support) of students at educational risk, especially previously unpredicted or unmitigated risks
- testing/validation of a provider’s capability in predicting risks for particular groups of students
- evaluation of the effectiveness of a provider’s management of predicted risks
- identification and correction of underlying causes of poor achievement
- development of an evidence-based diagnostic understanding of risks and causal factors to improve performance and prevent future under-performance
- demonstration that a provider meets, and can continue to meet, the requirements of the HES Framework in relation to student achievements.
Together, these benefits can enhance outcomes for students, with consequent enhancement of the reputation of both the provider and Australian higher education overall.
What do ‘monitoring and analysis’ encompass?
For the purposes of this Note, ‘monitoring’ encompasses the regular collection of data on student performance as required by the HES Framework. The data will include data sets routinely collected by the provider, data from national databases such as the Higher Education Information Management System (HEIMS), and data collected by the provider for particular purposes (such as monitoring breaches of academic integrity, agent performance or students at risk).
‘Analysis’ encompasses the provider’s approach to understanding the underlying patterns and causes of any identified lapses or deteriorations in student performance (i.e. whether they are apparently temporary or part of longer-term trends), as a foundation for corrective and preventive actions. Such analysis ideally includes:
- predetermined elements such as routine analysis of data from pre-identified groups of students, e.g. international students, annual intakes to a course of study or students studying within a particular field of education, and
- ‘data-driven’ analysis of performance data aimed at detecting areas of risk that are not necessarily pre-determined or anticipated, for example, detecting a group of underperforming international students recruited through a particular agent, or a cluster of students with academic integrity breaches who share a similar profile (e.g. country of origin, agent used, basis of English language admission).
TEQSA acknowledges that the scope and depth of monitoring and analysis that can be undertaken by a provider will be determined in part by the scale of the provider and the types of methods that are applicable to that scale. For example, a large provider may be able to obtain large data samples that are amenable to sophisticated statistical analyses and/or data-driven business intelligence systems, while a small provider may need to place greater emphasis on a detailed understanding of individual circumstances in relatively small groups of students. Despite such variations in scale and approach to analysis, all providers must analyse and understand the performance of their students to address risks, inform continual improvement and continue to meet the requirements of the HES Framework.
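By way of illustration only, the sketch below (in Python, using the pandas library) shows one way routine cohort-level monitoring of this kind might be implemented. The file name, column names and benchmark value are assumptions made for the purposes of the example rather than anything prescribed by TEQSA, and a smaller provider might achieve the same end with a simple spreadsheet.

```python
import pandas as pd

# Hypothetical enrolment extract with columns:
# student_id, year, cohort (e.g. 'domestic', 'international'),
# field_of_education, attrited (True/False)
enrolments = pd.read_csv("enrolments.csv")

# Attrition rate for each pre-identified cohort, by year and field of education
attrition = (
    enrolments
    .groupby(["year", "cohort", "field_of_education"])["attrited"]
    .mean()
    .rename("attrition_rate")
    .reset_index()
)

# Flag any group whose attrition rate exceeds the provider's own benchmark
BENCHMARK = 0.15  # illustrative value only; providers set their own benchmarks
flagged = attrition[attrition["attrition_rate"] > BENCHMARK]
print(flagged.sort_values("attrition_rate", ascending=False))
```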
What are identified student cohorts?
For the purposes of this Note, ‘identified student cohorts’1 are groups of students whose members share particular characteristics that may have a bearing on their success in a proposed course of study, such as a particular educational background. ‘Cohorts’ are typically identified prospectively and monitored routinely, although commonalities may also be identified retrospectively through data-driven analyses (see ‘Other identified students’ below).
Identified cohorts typically fall into one or more of the following classes:
- pre-determined cohorts that are traditionally identified in the Australian education system, such as students in individual fields of education, courses or units of study, international students, mature-age students, socially disadvantaged students, Indigenous students, annual intakes and students at different locations or participating in different modes of study (ideally the performance of such groups is monitored over time, i.e. cohort analysis in the formal sense)
- cohorts that providers should deliberately and predictively identify in the course of admission, such as students in diversity groups (e.g. Indigenous students), students who may have some potential educational disadvantage and students who may be at risk and are expected to require additional support after admission
- students who have been offered substantial credit for prior learning (e.g. a third or more of the course of study) through a standing arrangement or other mechanism
- other routinely predetermined groups of significance to particular providers, such as students with particular post-graduate requirements, e.g. initial teacher education.
The matters raised in this Note generally apply to the individuals within a cohort as well as to the cohort as a whole.
Other identified students
In addition to prospective identification of cohorts, a provider’s monitoring and retrospective analysis of its overall student performance data (so called ‘data-driven’ analysis) may reveal other groups/individuals who demonstrate poor performance that was not necessarily anticipated, such as:
- groups of students (or individuals) that are identified by either an episode, or a continuing history, of low academic achievements, including poor performance in early assessments, failure in other assessments, slow completions or attrition
- previously unrecognised groups or individuals that are identified in diagnostic analyses of performance (e.g. where a provider’s assumptions about educational preparedness of a particular group are not realised)
- low-performing students who can be associated with particular market niches (e.g. a new international market, admission by a particular mode of instruction, admission by type of English language proficiency evidence, or admission through particular agents or pathway providers, both onshore and offshore)
- students who are demonstrating particular difficulties that are affecting their education (e.g. breaches of academic integrity, or the emergence of particular learning difficulties).
Such retrospective data-driven analysis, like prospective cohort analysis, gives providers important information to help identify problems that have occurred and their causes. Most importantly, the analyses should lead to the identification and correction of underlying root causes, so that the problems do not recur.
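As a purely illustrative sketch of this kind of ‘data-driven’ retrospective analysis, the example below looks for clusters of academic integrity breaches by recruitment agent and basis of English language admission. The extracts, column names and minimum group size are hypothetical and would differ between providers.

```python
import pandas as pd

# Hypothetical extracts; the file and column names are assumptions for this example
students = pd.read_csv("student_admissions.csv")  # student_id, agent, english_admission_basis, ...
breaches = pd.read_csv("integrity_breaches.csv")  # student_id of each recorded breach

# Mark students with at least one recorded breach
students["breached"] = students["student_id"].isin(breaches["student_id"])

# Breach rate by recruitment agent and basis of English language admission
clusters = (
    students
    .groupby(["agent", "english_admission_basis"])
    .agg(n_students=("student_id", "count"), breach_rate=("breached", "mean"))
    .reset_index()
)

# Ignore very small groups, then rank the remainder by breach rate
candidates = clusters[clusters["n_students"] >= 20].sort_values("breach_rate", ascending=False)
print(candidates.head(10))
```

Findings from an analysis of this kind would then feed into the diagnostic and corrective steps described above, rather than being an end in themselves.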
Relevant Standards in the HES Framework
The HES Framework addresses, or has a direct bearing on, student performance in several ways at a number of levels. This begins with a fundamental requirement of admission that students who are admitted to a provider will have no known limitations that would impede their progression or completion (1.1.1)2, as well as ensuring students are informed about their prospective experience and obligations (1.1.2, 7.1.1 – 7.1.5, 7.2.1 – 7.2.4) and that any credit offered for prior learning does not disadvantage them (1.2.2a).
Section 1.3 (Orientation and Progression 1.3.1 – 1.3.6) sets out the obligations on providers to assess the needs of cohorts, to provide early assessment of student progress and targeted support, if required, and to monitor trends in student performance to enable review and improvement. In particular, this section of the standards notes that students should have equivalent opportunities for progression, irrespective of their background, entry pathway or mode or place of study (1.3.6).
Section 2.2 (Diversity and Equity 2.2.1 – 2.2.3) deals with accommodation of diverse groups such as Indigenous students and disadvantaged groups, and imposes specific requirements for monitoring the performance of identified sub-groups of students and using the findings to improve admissions policies, teaching, learning and support for those groups.
The HES Framework sets requirements at the institutional level for monitoring and review of academic performance, including a requirement to obtain student feedback (5.3.5) and to use student performance data and feedback to inform both admission practices and the provider’s other academic approaches (5.3.7).
The Standards also specify corporate and academic governance responsibilities for overarching oversight of the range of activities already mentioned above. These include ensuring that there is corporate oversight of academic governance (6.2.1f), that the corporate governing body has identified risks to the provider’s education operations (6.2.1e) and that academic oversight of monitoring, review and improvement of academic activities is effective (6.3.2g). The HES Framework also requires effective monitoring and reporting to the corporate governing body on the quality of teaching and research (6.3.2h), appropriate delegation of authority (in this case, for monitoring and reporting on student performance), and monitoring and review of the implementation and effectiveness of those delegations (6.1.3b). A provider is also expected to set and monitor institutional benchmarks for academic quality and outcomes (6.3.1b), for which monitoring and analysis of student performance is a fundamental requirement.
A provider is required to maintain accurate and up-to-date records, including data on enrolments, progression and completion (7.3.3a) and any lapses in compliance with the HES Framework (7.3.3d).
Intent of the Standards
The general intent of the Standards is for providers to develop a detailed understanding of the performance of their students and to create an evidence base for improvements to all aspects of the provider’s academic activities, both at the local level (e.g. delivery of a course of study at faculty/departmental level) and for the provider as a whole, through improved oversight and policy refinement that leads to enhanced student outcomes.
This understanding is expected to be nuanced according to identified (or identifiable) cohort data, where relevant, and is intended to extend to:
- the effectiveness of a provider’s predictions and assumptions that underlie admissions policies and practices, and
- the causes of poor performance of admitted students, both in transition to their course of study and throughout their studies (whether because of, or irrespective of, deficiencies in admission practices).
The Standards intend that a provider will develop a sound quantitative understanding of student achievements, which will inform both established practices and improvement strategies. Such an understanding is intended to be evidence-based, to be able to demonstrate correlations and associations, and to identify underlying causal relationships that will inform improvements.
The necessary analyses are intended to be nuanced by examination of the needs and performance of identified groups of students, while at the same time demonstrating that all students have equal opportunity for successful progress irrespective of background. A provider’s data-driven analyses may identify a previously unrecognised focus of potential disadvantage.
While the Standards seek to proscribe admission of students with known impediments to success, this does not preclude admission of students who may face additional but manageable risks, e.g. students who are expected to need additional academic support. This requires sound judgement by a provider, including predictive analyses whose assumptions are expected to be tested through the provider’s subsequent analyses of student performance, as required by the HES Framework. It also requires the provider to make that additional support available.
Risks to quality
The principal risks to quality stemming from an insufficient understanding of student performance relate to poor student outcomes (with potential reputational risks to the provider and to Australian higher education). The causes of poor outcomes generally fall into three broad classes:
- personal factors
- admission of students who are inadequately prepared to undertake their course of study
- deficiencies in the learning environment such as inadequate teaching or insufficient access to, or uptake of, student support services.
TEQSA has identified a number of shortcomings among providers in relation to understanding student performance, including:
- paying insufficient attention to, or ignoring, available data to detect particular risks (e.g. not responding to obvious data that demonstrate poorer performance by a particular group of students, which was not predicted by the provider)
- failure to establish an evidence base to fully understand and validate a provider’s policies and approaches (e.g. admission practices, detection of students at risk, provision of targeted learning support, related institutional policies)
- not undertaking sufficient in-depth analyses of cohort performance to identify underlying causes of poor performance
- failing to track cohorts over time (e.g. systematically tracking and monitoring student performance data based on identified risks such as English language proficiency, or on the basis of recruitment and admission)
- undertaking analyses of performance, but not acting on the findings to bring about improvements, particularly through institutional academic governance and quality assurance processes
- deficiencies in academic and corporate governance, e.g. governing bodies not seeking sufficient information to understand risks to student performance, to be satisfied about educational risk management and to oversee corrective and preventive actions that are, or should be, implemented
- unclear or insufficiently accountable delegations of authority for performance analyses and tracking, and/or failure to monitor the effectiveness of such delegations in detecting and addressing issues of concern.
Particular issues of concern include inadequate analyses and tracking to understand and address:
- insufficient English proficiency that is traceable to different types of admission (e.g. alternative language testing vs standardised testing such as IELTS, criteria not based on testing such as language of previous instruction, exemptions from the normal criteria, the effectiveness or otherwise of different agents, onshore and offshore cohorts)
- whether or not a provider’s admission policies or other policies are effective in achieving their intended policy outcomes (e.g. whether the additional support provided to cohorts with known risks on admission is indeed effective, and whether the particular approach to admission is tenable in the light of performance data)
- whether or not all students have equivalent chances of success irrespective of their background, mode of entry and mode of participation (one way of approaching such a comparison is sketched after this list)
- whether or not groups of students were sufficiently informed, and not misinformed, about the requirements of their chosen course of study
- whether or not particular types of students are prone to particular concerns e.g. breaches of academic integrity.
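For example, the question of equivalent chances of success lends itself to a straightforward comparison of outcomes across entry pathways and modes of study. The sketch below illustrates one possible approach; the extract and column names are assumptions rather than prescribed data definitions.

```python
import pandas as pd

# Hypothetical completions extract with columns:
# student_id, entry_pathway (e.g. 'ATAR', 'pathway college', 'mature age'),
# mode (e.g. 'on campus', 'online'), completed (True/False)
df = pd.read_csv("completions.csv")

# Completion rate and group size for each combination of pathway and mode
equivalence = (
    df.groupby(["entry_pathway", "mode"])
      .agg(n_students=("student_id", "count"), completion_rate=("completed", "mean"))
      .reset_index()
)

# Compare each group against the overall completion rate
overall = df["completed"].mean()
equivalence["gap_vs_overall"] = equivalence["completion_rate"] - overall
print(equivalence.sort_values("gap_vs_overall"))
```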
What will TEQSA look for?
This part of the guidance note covers the full extent of the Standards, and corresponding evidence that TEQSA may require, in relation to the analysis and understanding of student performance.
For new applicants seeking initial registration and course accreditation, TEQSA will require evidence to be provided in relation to all relevant Standards.
For existing providers, the scope of Standards to be assessed and the evidence required may vary. This is consistent with the regulatory principles in the TEQSA Act, under which TEQSA has discretion to vary the scope of its assessments and the related evidence required. In exercising this discretion, TEQSA will be guided by the provider’s regulatory history, its risk profile and its track record in delivering high quality higher education.
TEQSA’s case managers will discuss with providers the scope of assessments and evidence required well ahead of the due date for submitting an application.
The evidence required for particular types of application is available from the Application Guides on the TEQSA website.
Providers are required to comply with the Standards at all times, not just at the time of application, and TEQSA may seek evidence of compliance at other times if a risk of non-compliance is identified.
TEQSA expects a provider to be able to demonstrate an effective system to track and analyse the performance of identified student cohorts and that this provides an evidence base sufficient to diagnose, address and prevent issues with particular cohorts. The scope of such a system must encompass the relevant sections of the HES Framework (see above) and involve all relevant levels of the organisation, as required by the HES Framework.
An example would be a scenario in which international students in Information Technology had higher and increasing rates of attrition compared to domestic students, and compared to international students in Business. TEQSA would expect management to inquire into the possible causes of this, under the oversight of the governing bodies, and initiate improvements, which might take the form of changes to the relevant admissions criteria, agent management, delivery or assessment. Any improvements would then be reported back to the governing bodies to complete the improvement loop at governance level.
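An illustrative sketch of the kind of comparison this scenario calls for is shown below. The summary extract and its column names are assumptions made for the purposes of the example; the subsequent diagnostic inquiry and reporting to governing bodies remain matters for the provider.

```python
import pandas as pd

# Hypothetical summary extract with columns: year, residency, field, attrition_rate
df = pd.read_csv("attrition_by_cohort.csv")

# Attrition over time for the cohort of concern and two reference groups
trend = df.pivot_table(index="year", columns=["residency", "field"], values="attrition_rate")
print(trend[[("international", "Information Technology"),
             ("domestic", "Information Technology"),
             ("international", "Business")]])
```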
Overall, TEQSA expects that a provider is able to demonstrate that there is an established framework of regular review and response to quantitative analysis3 and resultant evidence to show that:
- the provider knows that admitted students have no known impediments to their prospective progress (1.1.1)
- admitted students have sufficient academic preparation and proficiency in English to participate in their chosen course of study (1.1.1)
- student cohorts have been identified meaningfully and rationally (evidence based) in the context of the provider’s mission and that the needs and risks for those cohorts are understood and anticipated (1.3.2a)
- identified student cohorts have equivalent opportunity for success, irrespective of their educational background, entry pathway and mode or place of study (1.3.6)
- there is both an evidence-based rationale and a framework of delegated authority for adopting or varying admission requirements for any cohort (6.1.3b), including the admission of students who have some identified educational disadvantage that is believed (with evidence) to be manageable with additional support
- additional targeted support is provided where needed and there is evidence that it is effective (1.3)
- granting of credit for prior learning does not disadvantage any cohort (1.3.2c)
- student progression is monitored during transition and throughout their course of study (1.3.5) and the resulting data are used to guide provision of additional support where needed (1.3.2c) and to inform institutional review and improvement (5.3.7), including improving the effectiveness of policies and procedures that are intended to enhance student achievement
- data on student progression are considered and acted on at the institutional level (6.1.3b, 6.2.1e, 6.2.1f, 6.3.2e – h)
- the provider’s data on student progress are accurate and up to date (so far as is reasonably practicable) (7.3.3a)
- known difficulties with student progress do not reflect deficiencies in the provider’s representation of its offerings (whether directly or via agents) (7.1.1 – 7.1.5) or the information that is provided to students (7.2.1 – 7.2.4)
- the effectiveness of delegated authority for understanding and reporting on student progress is monitored at institutional level (6.1.3b, 6.3.1a-d)
- agents and third-party arrangements operate in the interests of all students involved with those parties (5.4.1 – 5
Notes
1. The term ‘cohort’ is used more broadly here than in the specific technical sense of formal statistical cohort analysis (i.e. tracking the performance of a defined group over time).
2. Encompassing, but not limited to, proficiency in English, educational preparedness and appropriate recognition of prior learning. The intent of this standard is to ensure that providers actively consider the preparedness of particular types of students and predict any likely challenges a group may face, with a view to providing targeted support where warranted.
3. Except in rare circumstances where it is impractical to do so (such as an immature provider) or when qualitative evidence may be more appropriate.
TEQSA welcomes the diversity of educational delivery across the sector and acknowledges that its Guidance Notes may not encompass all of the circumstances seen in the sector. TEQSA also recognises that the requirements of the HESF can be met in different ways according to the circumstances of the provider. Provided the requirements of the HESF are met, TEQSA will not prescribe how they are met. If in doubt, please consult your TEQSA case manager.
| Version # | Date | Key changes |
|---|---|---|
| 1.0 | 6 January 2020 | Made available as beta version for consultation. |