Using data to inform subject redesign

'Good teaching is open to change; it involves constantly trying to find out what the effects of instruction are on learning, and modifying that instruction in light of the evidence collected.'
Paul Ramsden, Learning to Teach in Higher Education, 2nd ed, 2003, p. 98

Review the available data

The first step in any process of curriculum redesign is to seek clarity on what you are trying to achieve or remedy within the subject and the course overall; in other words, to nail down the otherwise moving targets of subject-level outcomes. This means gathering and analysing all of the available data in an effort to identify areas of relative strength and weakness. At the end of this process you should be able to identify a few key areas for targeted attention and, ideally, produce measures or goals that will allow you to determine the overall effectiveness of the subject redesign.

Faculty- and university-level data

La Trobe University participates in a range of national and institutional surveys that can be very helpful in providing feedback on our students' learning experience and achievement. For a full overview of the range of survey results available to staff, please see the recent document prepared by PIPU, Student Feedback: Study of Key Findings and Responses.

*PIPU has provided faculty-level data for most of the surveys on this list. In most cases, PIPU should be able to provide data from previous years as needed.

  • La Trobe University, Access and Achievement Research Unit (AARU) Reports – AARU has produced a number of reports relating to student attrition and related issues that will also be of use to faculties. For more information on completed and in-progress research projects, please see the AARU website.
  • Colmar Brunton Report, La Trobe: Improving Undergraduate Retention Rates (2013) – "The overarching aim of the research is to identify the factors contributing to student attrition that are within the University's control, with a view to inform the University on problem areas/elements to find solutions to reduce attrition." It provides a detailed, faculty-level report on students who left the University in 2013. This report is "commercial-in-confidence" and is not to be shared with any external parties, or with people within La Trobe for whom it is not directly relevant. For a copy please contact Bret Stephenson, LTLT (B.Stephenson@latrobe.edu.au).
  • The Course Experience Questionnaire (CEQ) collects graduates' perceptions of their higher education experience at the course level. PIPU has released national and state rankings, at the level of Field of Education (FOE), based on CEQ results. Some CEQ results are made public, by subject area, on the My University website.
  • The Australasian Survey of Student Engagement: "Data from the Australasian Survey of Student Engagement (AUSSE) provides information on the time and effort students devote to educationally purposeful activities and on students' perceptions of the quality of other aspects of their university experience." The AUSSE is administered to first- and third-year students.
    • AUSSE Engagement Scales: Academic Challenge, Active Learning, Student Staff Interactions, Enriching Educational Experiences, Supportive Learning Environment, Work Integrated Learning
    • AUSSE Outcome Scales: Higher Order Thinking, General Learning Outcomes, Career Readiness, Overall Avg. Grade, Departure Intention, Overall Satisfaction
  • The University Experience Survey (UES) has largely replaced the AUSSE at many institutions. The UES is now one of the primary governmental measures of institutional performance – teaching quality, skills development, learner engagement, student support and learning resources – and results are made public via the My University website. The UES was first administered at La Trobe in 2013.
  • Australian Graduate Survey Summary Report; Graduate Destination Survey (by faculty)

Subject-level data

  • Student Feedback on Subject – Overall quality, value of learning, student interest in subject material, quality of feedback, assessment tasks, etc.
  • Student Feedback on Teaching – The overall design of a subject is, of course, important, but so too is the actual teaching delivery of a subject. Student feedback on teaching can often inform the process of subject redesign and should be part of the review process wherever possible.
  • Subject Retention Rates – There are several ways in which measures of retention/attrition can inform the subject redesign process (a worked sketch of the first two measures appears after this list):
    • 1) The per cent of students retained or lost before census can show whether students are leaving a subject at an unexpected rate. This may indicate scheduling conflicts, lack of interest, worries about the level of difficulty, etc.
    • 2) The per cent of students who take a particular subject and remain (re)enrolled at census in the following semester. For this measure to have any meaning it must be compared against overall course retention averages, other cohort measures, or multi-year comparisons. At LTU we have already seen that students who complete first-year subjects with a strong element of active or social learning often have a higher rate of retention than students exposed solely or primarily to traditional lecture-tutorial formats. This is, of course, an imperfect measure and requires a high level of contextualisation if it is to be meaningful.
    • 3) It can also be useful to clearly distinguish between institutional and course retention. A course may be experiencing high rates of attrition while many of those students are retained by the institution overall. In these cases, our interest is in maximising institutional retention even where course retention is relatively low (e.g. "pathway" or "feeder" courses). This is particularly important in our generalist degrees.
  • Success Rates – Per cent of students who attain successful completion within a subject. It is important to look at the overall grade distribution, grade distribution by cohort/course, and distributions by campus. Disaggregating the data in this way can often reveal important insights into the varieties of student (their preparedness, motivation, etc.) and of student experience within individual subjects (see the disaggregation sketch after this list).
  • Student Demographic Data – ATAR bands (multi-year comparison), course, course preference (1st, 2nd or 3rd preference), First in Family status, SES, campus. Demographic data can shed some light on your students' background characteristics, including their abilities, preparedness and priorities. Demographic data is available from PIPU and will be made more widely available with the release of the Subject/Course Dashboard.
  • Number of Late Enrolments – Late enrolees are at particularly high risk and may be skewing results (success rates). This may indicate a need for targeted support strategies. Your faculty student administration office may be able to help with the collection of this data.
  • Assessment Completion/Success Rates – Per cent of students completing and passing each assessment task (early assessment, essays, labs, exams, etc). It is particularly helpful to map completion rates over several years for each assessment task (see the sketch after this list). This allows you to then set completion goals for each of your subject's assessment tasks. Low completion rates for formal exams may indicate the need for a new examination strategy. This process may also indicate that the assessment strategy requires overall renewal. Finally, by tracking completion and success rates (grade distributions) for individual assessments, we have a quick and easy way of measuring the effectiveness of redesign efforts as the subject is being delivered. Moreover, this is one of the few measures that carries little time lag.
  • Number of Plagiarism and Academic Integrity Cases – If a subject or a particular assessment task generates an unusual number of academic integrity cases, this may indicate a problem area within the subject, or with student preparedness, understanding, or even anxiety. It is well worth keeping track of academic integrity breaches within the subject for this reason.
  • Student Focus Groups – In some cases it will be useful to run focus groups with students who have recently completed the subject. Focus group discussions will help to flesh out the student experience as captured in SFS feedback and further elucidate students' background characteristics (preparedness, motivation, etc.).
  • Peer Observation of Teaching – LTLT is currently developing a model to support the process of peer observation of teaching. The model will be applicable across multiple delivery modes, including traditional lecture/tutorial, blended and fully online. For more information please contact Rhonda Hallett (LTLT) at r.hallett@latrobe.edu.au.
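
The first two retention measures above reduce to simple cohort arithmetic once enrolment records are available in tabular form. The sketch below uses Python and pandas; the enrolments table, its column names, the subject code and all figures are hypothetical, invented for illustration rather than drawn from any actual La Trobe data source.

  import pandas as pd

  # Illustrative enrolment extract: one row per student per subject offering.
  # All column names, codes and values are hypothetical.
  enrolments = pd.DataFrame({
      "student_id":         [1, 2, 3, 4, 5, 1, 2, 4],
      "subject":            ["ABC101"] * 5 + ["ANY"] * 3,
      "semester":           ["2014-S1"] * 5 + ["2014-S2"] * 3,
      "enrolled_at_census": [True, True, False, True, True, True, True, True],
  })

  offering = enrolments[(enrolments["subject"] == "ABC101") &
                        (enrolments["semester"] == "2014-S1")]

  # Measure 1: per cent of initial enrolees still enrolled at census.
  pre_census_retention = offering["enrolled_at_census"].mean() * 100

  # Measure 2: per cent of the subject's census cohort (re)enrolled in any
  # subject at census in the following semester.
  cohort = set(offering.loc[offering["enrolled_at_census"], "student_id"])
  next_sem = enrolments[(enrolments["semester"] == "2014-S2") &
                        enrolments["enrolled_at_census"]]
  retained = cohort & set(next_sem["student_id"])
  next_semester_retention = 100 * len(retained) / len(cohort)

  print(f"Pre-census retention:    {pre_census_retention:.1f}%")    # 80.0%
  print(f"Next-semester retention: {next_semester_retention:.1f}%")  # 75.0%

The comparison baselines discussed above (course averages, other cohorts, multi-year trends) would be computed the same way over the full enrolment extract.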
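
Disaggregating success rates by cohort or campus, as suggested under Success Rates, is equally mechanical once results are tabulated. A minimal sketch under the same assumptions (all column names, grades, courses and campuses are hypothetical):

  import pandas as pd

  # Hypothetical results table: one row per student with a final grade and
  # the cohort attributes used for disaggregation.
  results = pd.DataFrame({
      "grade":  ["A", "B", "C", "D", "N", "A", "N", "C", "B", "N"],
      "course": ["BA", "BA", "BSc", "BSc", "BA", "BSc", "BA", "BA", "BSc", "BSc"],
      "campus": ["Bundoora", "Bendigo"] * 5,
  })

  # Overall success rate: per cent of students with a passing grade.
  passing = {"A", "B", "C", "D"}
  print(f"Success rate: {results['grade'].isin(passing).mean() * 100:.1f}%")

  # Grade distributions disaggregated by campus and by course cohort,
  # row-normalised into percentages.
  for attribute in ("campus", "course"):
      table = pd.crosstab(results[attribute], results["grade"], normalize="index")
      print((table * 100).round(1))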
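
Mapping assessment completion and success rates over several years, as suggested under Assessment Completion/Success Rates, amounts to one grouped summary. Again, the per-task records and column names below are assumptions for illustration only:

  import pandas as pd

  # Hypothetical per-task records: one row per student per assessment task,
  # per year of delivery.
  tasks = pd.DataFrame({
      "year":      [2012] * 4 + [2013] * 4 + [2014] * 4,
      "task":      ["essay", "essay", "exam", "exam"] * 3,
      "submitted": [True, True, True, False, True, False, True, True,
                    True, True, True, True],
      "passed":    [True, False, True, False, True, False, False, True,
                    True, True, True, False],
  })

  # Completion and success rates for each task, mapped across years.
  summary = (tasks.groupby(["task", "year"])
                  .agg(completion_rate=("submitted", "mean"),
                       success_rate=("passed", "mean"))
                  .mul(100).round(1))
  print(summary)

  # A task-by-year grid of completion rates makes multi-year trends (and
  # slippage against per-task completion goals) easy to spot.
  print(summary["completion_rate"].unstack("year"))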

Action Checklist

  1. Completed a thorough review of available data by faculty/course/subject
  2. Identified key areas for improvement in SFS and SFT evaluations
  3. Identified target cohorts within the subject where necessary
  4. Gathered all necessary data on retention, success, and late enrolments
  5. Mapped completion/success rates for particular assessment tasks
  6. Agreed on key intended outcomes for the redesign process, including specific numerical targets