
How data from digital learning tools can refine teaching

Digital learning tools enable educators to quickly collect and analyse student performance data in order to refine their teaching, as Paul Moss explains

26 Sep 2022

Created in partnership with the University of Adelaide

Used effectively, digital tools can help educators collect and analyse student performance data and refine their teaching practices accordingly. One example is the Learning Mastery Gradebook view in the Canvas learning management system (LMS), which collects data on progress towards individual course learning outcomes and presents the information in a highly visual way.

Average student attainment for each learning outcome is displayed in real time at the top of the view, providing a clear snapshot of the cohort’s success or failure in each one. The progress of individual students can also be seen.

To populate the Learning Mastery Gradebook view, outcomes are added to assessment rubrics and marked in the Canvas grading tool, SpeedGrader, where you can view and grade student assignment submissions in one place using a simple point scale or rubric. Pedagogically, the strength of using rubrics in this way is in the explicit promotion of constructive alignment, in which teaching and assessment are aligned with desired learning outcomes.
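
For educators who want to work with this data outside Canvas, it can also be retrieved programmatically. Below is a minimal sketch using the Outcome Rollups endpoint of the Canvas REST API; the base URL, course ID and API token are placeholders for your own instance, and the response shape should be verified against your institution’s API documentation.

```python
# Minimal sketch: fetch per-student outcome attainment ("rollups") from
# the Canvas REST API. BASE_URL, COURSE_ID and TOKEN are placeholders.
import requests

BASE_URL = "https://your-institution.instructure.com"  # placeholder
COURSE_ID = "12345"                                    # placeholder
TOKEN = "your-api-token"                               # placeholder

resp = requests.get(
    f"{BASE_URL}/api/v1/courses/{COURSE_ID}/outcome_rollups",
    headers={"Authorization": f"Bearer {TOKEN}"},
    params={"include[]": "outcomes"},  # also return the outcome details
)
resp.raise_for_status()
data = resp.json()

# Each rollup holds one student's current score against each outcome.
for rollup in data["rollups"]:
    user_id = rollup["links"]["user"]
    for score in rollup["scores"]:
        print(user_id, score["links"]["outcome"], score["score"])
```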

Constructive alignment

Course co-ordinators are encouraged to thoughtfully map course learning outcomes to assessment and then design rubrics that serve three purposes:

  • to guide students towards achieving success in the task
  • to help those marking their work maintain consistency
  • to act as a checklist to evaluate if the criteria being assessed sufficiently match the content taught.

This is where the design process can be powerful: if there is an obvious disconnect in alignment among outcomes, assessment and content, then adjustments can be made.

If an outcome is measured in more than one assessment, Canvas can aggregate the scores into a single attainment figure. The designer can choose the weighting the outcome receives each time it is measured, giving a particular assessment a stronger bearing on the overall measure.
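
As an illustration of the underlying idea, the sketch below computes a weighted mean of the scores a single outcome receives across several assessments. The scores and weights are invented; in Canvas itself the aggregation is governed by the outcome’s chosen calculation method, so treat this only as a model of the arithmetic.

```python
# Illustrative only: weighted aggregation of one outcome's scores across
# several assessments. The scores and weights below are hypothetical.
def aggregate_outcome(scores, weights):
    """Weighted mean; scores and weights are parallel lists, one entry
    per assessment in which the outcome is measured."""
    return sum(s * w for s, w in zip(scores, weights)) / sum(weights)

# An outcome measured in a quiz, the mid-semester exam and the final
# exam, with the final exam given the strongest bearing (weight 4).
print(aggregate_outcome(scores=[2.5, 3.0, 3.5], weights=[1, 2, 4]))
# -> approximately 3.21, pulled towards the final-exam result
```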

Compared with overall course grades at the end of a semester, individual learning outcome data provide a more detailed understanding of how each outcome affects or contributes to the whole. Using this approach, the educator can reflect on why certain outcomes were or were not successfully attained, the impact that any weakness may have had on overall performance, and what could be done to improve outcomes in future versions of the course.

More precise feedback: a case study

Ivan Obaydin, senior lecturer in the University of Adelaide Business School, has taken the use of outcomes in Learning Mastery Gradebook to a more granular level. Rather than looking at course learning outcomes aggregated as a single figure based on all relevant assessments, Ivan breaks them up, collecting data on outcomes tied to each individual assessment.

For example, the mid-semester exam tests learning outcomes one and six, labelled MSE-LO1 and MSE-LO6 respectively. The same learning outcomes are tested again in the final exam, labelled FE-CLO1 and FE-CLO6. Visually, he can quickly see when students have struggled with a certain learning outcome and when they have improved or perhaps regressed.

This data can then be aggregated, so Ivan can see how students fared in each of the outcomes over the semester and identify where refinement might be needed for the next roll-out of the course.
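
A sketch of what that comparison might look like once the per-assessment scores are exported, following the MSE-/FE- labelling described above. The student IDs and numbers are invented for illustration.

```python
# Hypothetical export of per-assessment outcome scores, one row per
# (student, labelled outcome, score). All values are invented.
import pandas as pd

scores = pd.DataFrame(
    [
        ("s1", "MSE-LO1", 3.0), ("s1", "FE-CLO1", 2.0),
        ("s2", "MSE-LO1", 3.5), ("s2", "FE-CLO1", 2.5),
        ("s1", "MSE-LO6", 2.5), ("s1", "FE-CLO6", 3.0),
        ("s2", "MSE-LO6", 3.0), ("s2", "FE-CLO6", 3.5),
    ],
    columns=["student", "outcome_label", "score"],
)

# Cohort average per labelled outcome. A drop from MSE-LO1 to FE-CLO1
# flags an outcome needing attention in the next run of the course.
print(scores.groupby("outcome_label")["score"].mean())
```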

In one instance, Ivan noticed that students had performed worse in two learning outcomes than in the others, so he investigated why. He was particularly interested in why performance dropped so sharply between the mid-semester and final exams.

As a result of his evaluation, Ivan made adjustments to the delivery and teaching of the course for this semester. He placed greater emphasis on explanations of specific content in his lectures and began using the LMS to support more active learning and exercises to deepen students’ understanding of concepts.

Ivan encouraged tutors to focus more on these two outcomes in tutorial activities, enabling students to practise applying them. He believes the changes have already produced positive results, with students showing increased engagement and willingness to answer questions.

Herein lies the power of this approach. This form of action research is a proactive way of using data to inform design. The changes Ivan made to this semester’s course will shape the next round of student performance data, which in turn will allow him to analyse whether the adjustments have made a difference.

Tailoring learning to the needs of the cohort

As course co-ordinator, Ivan can also see the potential of this data-informed approach to support tutors. The Learning Mastery Gradebook data can be filtered to highlight individual “sections” in the gradebook, which usually represent tutorial groups. This lets him see whether a weakness in a learning outcome spans all students or only particular groups.

Isolating the data in this way enables Ivan to open conversations with tutors about how their students progressed in each outcome. The data might tell a tutor that more time should be spent on certain outcomes in the lead-up to the final exam. Deeper analysis could reveal patterns (for instance, outcomes that rely heavily on prior knowledge), so future cohorts could be grouped on that basis, or tutors could place greater emphasis on prior knowledge in retrieval activities.
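
Continuing the earlier sketches, the snippet below shows the kind of section-level breakdown involved, assuming the outcome scores have been exported together with each student’s tutorial group. The section names and values are invented.

```python
# Hypothetical per-section outcome averages. A weakness confined to one
# tutorial group suggests a tutor-level conversation rather than a
# course-wide redesign. All names and values are invented.
import pandas as pd

rows = [
    ("Tutorial A", "CLO1", 2.1), ("Tutorial A", "CLO6", 3.2),
    ("Tutorial B", "CLO1", 3.4), ("Tutorial B", "CLO6", 3.3),
]
df = pd.DataFrame(rows, columns=["section", "outcome", "score"])

# Average attainment per outcome within each tutorial group.
print(df.groupby(["section", "outcome"])["score"].mean().unstack())
```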

Using data in this way, university teachers can continually tailor curricula and course delivery to suit the needs of their cohort.

Paul Moss is a learning design and capability manager at the University of Adelaide.

If you found this interesting and want advice and insight from academics and university staff delivered direct to your inbox each week, sign up for the THE Campus newsletter.
