Do you really care?

I’m going to ask you a question. I want you to answer it quickly and without thinking about it. Please be honest; it’s only for yourself. Here goes:

Can you teach someone without caring about them?

Did you have a quick answer? Now that you’ve had a moment to reflect on it, does it surprise you? I know the rational, nice answer is “of course we have to care”, but the instinctive (honest?) answer is often “no”.

Your initial thoughts might have gone to the students or classmates you might not know or particularly like. Out of a class of 10, 30, or 200, you may not know everyone’s names, their stories, who they are…

I have been asking quite a few people this same question, and although a few sat on the fence, the vast majority of first answers were “no”. In fact, there were very few “yes” answers at all.

So what does that mean about education? Are we all uncaring sociopaths, or is it more nuanced?

Normally, after asking that question, I follow up with a conversation and hear similar post-hoc rationalisations such as “what about YouTube videos? They are teaching you things.”

On the surface, this is a great argument, but one I want to counter by suggesting that YouTube how-to videos, and the like, are instructing, not teaching. You may think that is a minor semantic difference and that I am being overly pernickety. And maybe I am.

However, I would argue that the fundamental difference between teaching and instruction is assessment. If assessment does not occur, this is instruction/training, and if assessment does occur, this is teaching. 

There is some nuance in what we call assessment, granted. However, if there is a genuine requirement to take information or instruction and to process it, e.g. extrapolating it to different scenarios, then that is assessment. You could call it ‘authentic assessment’ if the wind is blowing in that direction and that phrase is still in vogue. Parroting facts via poorly executed multiple-choice questions (the longest answer wins, or it is ‘all of the above’) is pseudo-assessment and doesn’t count.

Now, I want to be very clear here: the industries of professional development (i.e. training) and education both utilise teaching and instruction/training. The best corporate training isn’t training at all; it is actually teaching. In the same way, the worst teaching is actually training, as the assessments and what’s taught don’t align or have value. There is a place for both, and both will likely exist concurrently without any negative value judgement. But there is a fundamental difference: assessment.

So, coming back to the original question, my hypothesis is that assessment is an act of care about your students and their development. To instruct or train, without the opportunity to test if that knowledge or skill has been developed – i.e. usable in other contexts – is not teaching, and it isn’t an act of caring. You are passing across information to your students without any interest in how it might affect their lives. 

Assessment is the activity we use to show we care about not only what our students have learned, but also what change in their actions and behaviours that learning entails.

It’s worth noting that an act of care may not be personalised; but if you are teaching someone, there is an intention that they develop, even if you don’t like them. It’s also important to consider time and space within this hypothesis: I suggest that displacement in time or location does not matter, which means it is possible for asynchronous, online learning to be teaching. What matters is the intention of the educator.

If I had asked the initial question differently, “do you teach to make a difference to your students’ lives?”, then I think the answers would have been overwhelmingly positive. If my thesis is correct, then you have the mechanism to know how and why you do care. It might also help you think about making your assessments more meaningful.

NB: There are no references in here because I wanted to extract thoughts from my brain and get them down on the page without the risk of them being actively viewed through someone else’s lens. I would very much like your thoughts and critique of this thesis. I may return to provide more explanation or rigour in the future, but I also may not.

Digital Assessment Maturity Model v0.2

Release notes

We released v0.1 of the Digital Assessment Maturity Model in March 2023, and spent the rest of the month talking to people from the sector and refining our initial work. This blog post lays out version 0.2 of the model and outlines how it may be used.

Current assessment landscape

Assessment processes are complex [Citation needed]. Different universities do things differently and use differing terminology; however, the general process is somewhat similar. Our first step in developing a maturity model was to attempt to define the assessment process, shown below:

[Image: a flow diagram taking you through the five stages of assessment: design, creation and resourcing, completion and submission, marking and feedback, and quality assurance and enhancement. It is a highly branched process with many dotted lines.]

Note that there will be considerable institutional nuance, but the process chart aims to capture most of the activities that take place to ensure assessments run effectively.

Our second step was then to identify and define areas of activity within the assessment process. We have called these:

  • Assessment design
  • Assessment creation and resourcing
  • Assessment completion and submission
  • Marking and feedback
  • Quality assurance and enhancement

These were chosen because they often require the work of different groups of people within the institution. For example, timetabling and estates teams may only focus on assessment creation and resourcing activities in relation to scheduling on-campus digital exams. It also allows us to potentially tag relevant resources in the future.
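As a purely illustrative sketch of that tagging idea, the snippet below shows how resources might be tagged against the five areas of activity so that, for example, a timetabling or estates team could pull out only the material relevant to creation and resourcing. Only the area names come from the model; the resources, data structure and helper function are hypothetical.

```python
# Hypothetical sketch: tagging resources against the five areas of activity.
# Only the area names come from the model; everything else is illustrative.

AREAS = [
    "assessment design",
    "assessment creation and resourcing",
    "assessment completion and submission",
    "marking and feedback",
    "quality assurance and enhancement",
]

# Made-up resources, each tagged with one or more areas of activity.
resources = [
    {"title": "Scheduling on-campus digital exams",
     "areas": ["assessment creation and resourcing"]},
    {"title": "Designing authentic digital assessments",
     "areas": ["assessment design"]},
]

def resources_for(area):
    """Return the resources tagged with a given area of activity."""
    assert area in AREAS, f"Unknown area: {area}"
    return [r for r in resources if area in r["areas"]]

# e.g. a timetabling or estates team pulling only the material relevant to them
print([r["title"] for r in resources_for("assessment creation and resourcing")])
```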

We are aware that this is an incredibly simplified process map compared to sector practice, and that each of the five activity areas is inextricably linked to the others – even if that is not immediately apparent from the process map.

Growing digital assessment

Digital assessment has increased in UK Higher Education over the last five years, driven by underlying strategy and greatly accelerated by the COVID-19 pandemic. Universities and colleges may have found themselves un- or under-prepared for a mass move to online exams, for example, and innovative practice happened at pace.

In these circumstances, enhancements to digital assessment may not have occurred evenly across all aspects of the assessment process. You may have concentrated efforts on creation and resourcing, or completion and submission, as these are the most visible to students, while giving less attention to digitising the quality assurance and enhancement element.

It is worth noting that, while even enhancement across the five activity areas is not strictly necessary, as you progress through the digitisation of assessment the links between areas can cause bottlenecks in your overall processes where areas are at different points in their maturity journey.

Creating the maturity model

We started from the assumption that accessibility is baked into all aspects of the assessment process, and that digital assessment meets all the legal and moral requirements currently in place. This assumption may be fair for many institutions; where it is not, accessibility should be addressed before any of the other changes suggested by this maturity model.

We have split the maturity model into four stages, with each stage denoting a significant shift in what you may observe. The stages are:

  • Approaching and understanding
  • Experimenting and exploring
  • Operating and embedding
  • Transforming and evolving

The progression from one stage to the next is cumulative; each stage builds on the one before. Within each stage, we have attempted to identify a non-exhaustive list of activities from across the wide spectrum of assessment practice as examples of what you might see at that stage.

The model is not intended to be a pathway you progress along, but rather a self-assessment checkpoint that offers suggestions for the future of digital assessment in your institution. For many institutions, moving along the Digital Assessment Maturity Model may not be appropriate or favourable. Indeed, achieving ‘transforming and evolving’ may not be compatible with your internal practices or strategies.

Please don’t use this as a checklist to determine your stage; use it as a holistic tool to give a potential direction of travel. The model will bear some relation to your institutional assessment strategy; however, you should prioritise alignment between your strategy and your practices over alignment with this model. The most value may come from comparing the maturity stages across all five areas of activity of the assessment process, especially if there are wide discrepancies, and from using that comparison to help update and renew institutional strategies.
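To make that comparison concrete, here is a minimal sketch of how an institution might record a self-assessment against the model and flag wide discrepancies between areas of activity. The stage and area names come from the model; the data, structure and threshold are hypothetical and not part of the published model.

```python
# Hypothetical sketch: recording a self-assessment against the Digital
# Assessment Maturity Model and flagging wide gaps between activity areas.
# Stage and area names come from the model; the data and threshold are
# illustrative only.

STAGES = [
    "approaching and understanding",
    "experimenting and exploring",
    "operating and embedding",
    "transforming and evolving",
]

# A made-up institutional self-assessment: one stage per area of activity.
self_assessment = {
    "assessment design": "experimenting and exploring",
    "assessment creation and resourcing": "operating and embedding",
    "assessment completion and submission": "operating and embedding",
    "marking and feedback": "experimenting and exploring",
    "quality assurance and enhancement": "approaching and understanding",
}

# Convert each area's stage to its position in the model (0 = leftmost stage).
indices = {area: STAGES.index(stage) for area, stage in self_assessment.items()}
spread = max(indices.values()) - min(indices.values())

if spread >= 2:  # arbitrary threshold for a "wide discrepancy"
    lagging = [area for area, i in indices.items() if i == min(indices.values())]
    print("Wide discrepancy across areas; potential bottlenecks in:",
          ", ".join(lagging))
```

The point of the sketch is not the code itself, but that gaps of more than a stage or two between linked areas are where the bottlenecks described above are most likely to appear.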

Remember that Jisc has a range of experts and experience in supporting the development of digital assessment, purchasing frameworks and services, so please reach out to your Relationship Manager for more information.

Considerations and constraints

We have purposely avoided assessment methods and methodologies, as they are often subjective and pedagogically driven (don’t worry, we are looking at that elsewhere in Jisc). We have tried to focus on the digital assessment process and the underlying technologies as enablers.

Introducing the model v0.2

[Image: a table showing examples of Digital Assessment Maturity. The column headings are approaching and understanding, experimenting and exploring, operating and embedding, and transforming and evolving. The row headings are assessment design, assessment creation and resourcing, assessment completion and submission, marking and feedback, and quality assurance and enhancement. Within the table are specific examples.]

We have structured the Digital Assessment Maturity Model so you can scan across each area of activity of the assessment process and get a quick understanding of the differences between each stage of maturity. Each element has a summary title that gives a very basic overview, followed by some examples of what you might see.

In addition, we have created an abridged version.

[Image: a table showing examples of Digital Assessment Maturity, with the same column and row headings as above. Within the table are brief summaries of what you might see.]

Key changes

Based on sector feedback, we made a number of notable changes to the model.

The most obvious change from v0.1 is the move away from a graph to a more tabular representation. While it is anticipated that institutions will move through the model from left to right, the graph gave the appearance that ‘rightmost was better’. Some institutions will not find that to be the case.

The second biggest change was the reduction of the model from five to four stages through the amalgamation of ‘operational’ and ‘embedding’. We also took the opportunity to tweak some of the terminology and provide more context for what each stage means. Although this does take us away from existing Jisc maturity model nomenclature, we felt it made the model more relevant and usable.

The third biggest change is the restructuring of information to provide better coverage of activities and to layer content detail. While we previously had the model and the blog posts as two levels of detail, we now have the summary headline, the ‘what you might see’ examples, and the blog posts. This should ensure the model is useful both as an in-depth guide and as an ‘at-a-glance’ tool. This may necessitate rewriting the blog posts, but that was outside the scope of the current work.

Further changes included:

  • Changing ‘quality management’ to ‘quality assurance and enhancement’ to better match sector terminology
  • Adding the assessment process, colour-coded against the model for ease of reference
  • Placing additional emphasis on the examples in each stage being indicative of what you might see, rather than a checklist of things to be achieved
  • Dropping ‘phases’ in favour of ‘areas of activity’ when describing the assessment process
  • Creating two versions of the model, one with ‘what you might see’ examples and one without, with a view to adding interactivity in the future

Next steps

Working alongside colleagues from Jisc and the sector, we will be further refining the model, particularly the terminology and examples at each stage. We especially hope to engage with colleagues in quality enhancement, timetabling and estates to assist with these refinements.

As the accompanying blog posts are now largely redundant, with much more information having been moved into the model, we intend to rewrite them. We may repurpose them to provide additional context or next steps.

If you are interested in being part of this review, please contact us. If you have any comments or questions, please add them below.
