The SRES is best described as an online database to capture and manage student data.31 Its value comes from its specialised tools that help deliver personalised feedback to students and automate repetitive tasks for the teacher (e.g., emailing students). We have outlined a few case studies below where SRES has been utilised to deliver personalised feedback to students based on the outcome of an assessment in a large cohort (∼2000 students).
24.3.1 Case Study 1: Rubric-to-Feedback
SRES is used extensively in our laboratory environment, primarily to capture the grading of non-digital submissions or practical assessments that are difficult or impossible to mark via LMS options. These include assessments such as grading laboratory samples, marking competency- or technique-based assessments (practical exams), or even oral presentations. This is due to the flexibility of SRES to accommodate basic computing logic, including complex calculations to set correct parameters for each submission, personalised feedback generated based on each rubric criterion grade, and conditional (what-if) statements. Teaching assistants (TAs) are provided with a marking rubric which is captured on a mobile-friendly web interface generated in SRES. Whilst constructing the rubric, SRES allows you to assign a specific feedback statement and grade to each criterion (see Figure 24.1).
TAs do not see the grades or feedback associated with each criterion and have only descriptor tiles to press when assessing work. Once a TA has assessed a student’s work, the grades are immediately tallied by the system and presented to the student in an embedded webpage in their LMS. In addition, the feedback statements that were tied to each criterion selected by the TA are concatenated to generate a detailed feedback account. Generally, this feedback highlights where and why the student lost marks and presents rectifying actions or directions to further resources to improve in future opportunities. Where students scored full marks, congratulatory remarks were also included as positive reinforcement. The total score and feedback are generated immediately after the assessor submits their grading.
Figure 24.1 A marking criterion for the assessment of thin-layer chromatography (TLC) performed by students in the chemistry laboratory. “Text to display” shows the criterion descriptor presented to the assessor. “Value to save” is the respective grade (out of 2) associated with the criterion. “Longer description” captures the feedback statement presented to the student if that criterion is selected by the assessor.
In the past, this grade and feedback was released to the students via an email from the lab coordinator; however, this generated an unexpected amount of work when many students replied to the email wishing to express their gratitude for the time spent providing such a detailed account of their samples. In subsequent lab classes, the results were instead issued via a web portal within the students’ LMS (Figure 24.2). Although less personal than a “tailored” email, anecdotal and unpublished research data from students emphasised their gratitude for the detailed feedback. Through SRES, personalised, detailed feedback that would have taken hours to write for a cohort of over 250 students took less than an hour, by only writing criterion-specific feedback statements as part of the marking rubric creation.
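To make the rubric-to-feedback mechanics concrete, the short Python sketch below mimics the logic described above. It is a hypothetical illustration only: the criterion options, grades, and feedback strings are invented, and the field names simply mirror the “Text to display”, “Value to save” and “Longer description” fields of Figure 24.1, not any actual SRES internals.

```python
# Minimal sketch of the rubric-to-feedback idea. All data and field names
# are hypothetical, loosely mirroring the SRES fields in Figure 24.1.

from dataclasses import dataclass

@dataclass
class Option:
    text_to_display: str      # descriptor tile shown to the TA
    value_to_save: float      # grade associated with this option
    longer_description: str   # feedback statement shown to the student

# One criterion (e.g. TLC technique), graded out of 2.
tlc_criterion = [
    Option("Clear, well-resolved spots", 2.0,
           "Excellent TLC technique; your spots were well resolved."),
    Option("Spots visible but streaked", 1.0,
           "Your spots streaked; try loading less sample next time."),
    Option("No spots visible", 0.0,
           "No spots were visible; review the plate-spotting procedure."),
]

def grade_submission(selections):
    """Tally the grades and concatenate the feedback statements
    for the descriptor tiles a TA selected (one per criterion)."""
    total = sum(opt.value_to_save for opt in selections)
    feedback = " ".join(opt.longer_description for opt in selections)
    return total, feedback

# The TA presses one tile per criterion; here only one criterion is shown.
score, feedback = grade_submission([tlc_criterion[1]])
print(score)     # 1.0
print(feedback)  # "Your spots streaked; try loading less sample next time."
```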
24.3.2 Case Study 2: Peer Assessment
SRES was used to facilitate the peer assessment of student oral presentations as a strategy to engage students and foster critical thinking by involving them in evaluating the contents of peer presentations. In addition, we allowed 30% of a student’s oral mark to be determined by the average mark given through peer assessment. Students used their mobile devices to access the marking rubric presented to them in a web app interface from SRES (Figure 24.3). As students presented, their peers (and the academic) were grading the presentation live.
Upon submitting their gradings, SRES averaged the peers’ scores, combined the result with the academic-determined grade, and posted it immediately on the presenter’s LMS. In addition, what-if statements were applied to the average score of each criterion to construct appropriate feedback statements, which were stitched together to provide an immediate overall feedback account of the presentation. Presenters had a final score and feedback available to them before they could sit down from their presentation. This instant feedback allowed students to immediately reflect on the experience and start considering their approach for future opportunities.
Figure 24.2 Example of an SRES lab results page projected into a student’s LMS. The grade of 6.2 was tallied and the “Sample Feedback” was constructed automatically and immediately after the assessor had graded this student’s sample.
In addition to summative and student peer feedback, students also experience self-feedback by having the opportunity to reflect on and critically evaluate the reliability of their data, the quality of their (laboratory) work, and their performance in verbal communication against their peers.
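A minimal sketch of the aggregation just described is given below. The 30%/70% peer/academic weighting comes from the text above; the score thresholds, criterion names, and feedback wording are invented for illustration and do not reflect SRES internals.

```python
# Hypothetical sketch of the peer-assessment aggregation described above.
# Only the 30% peer / 70% academic weighting is taken from the text.

PEER_WEIGHT = 0.30
ACADEMIC_WEIGHT = 0.70

def final_mark(peer_scores, academic_score):
    """Average the peers' scores and combine with the academic's grade."""
    peer_average = sum(peer_scores) / len(peer_scores)
    return PEER_WEIGHT * peer_average + ACADEMIC_WEIGHT * academic_score

def criterion_feedback(name, average, max_score):
    """A simple 'what-if' rule mapping a criterion's peer average to a
    feedback statement (thresholds and wording are invented)."""
    fraction = average / max_score
    if fraction >= 0.8:
        return f"{name}: excellent; keep doing what you are doing."
    if fraction >= 0.5:
        return f"{name}: sound, but there is clear room to improve."
    return f"{name}: this needs attention; revisit the relevant guidance."

# Example: per-criterion peer averages (each criterion marked out of 5).
averages = {"Delivery": 4.4, "Slide design": 3.1, "Scientific content": 2.2}
feedback = " ".join(criterion_feedback(k, v, 5) for k, v in averages.items())

print(final_mark(peer_scores=[4.0, 3.5, 4.5], academic_score=4.2))  # ≈ 4.14
print(feedback)
```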
Unlike traditional assessments, where more of the educator’s time is required post-assessment for grading and supplying feedback, SRES shifts all the labour towards the planning phase of an assessment. The feedback is generated alongside the marking criteria, which has the additional benefit of making educators reflect upon their assessment structure and how they would advise students on improving different elements of the assessment. However, it is in the long run that the efficiency is truly appreciated by the educator, as the marking and feedback platform generated in SRES can be cloned and re-used from year to year. This allows for an equally content-rich feedback experience for students with little time commitment from the educator.