Fight for your right to be accessible: the imperative of upskilling students for a more inclusive future 

As the landscape of tertiary education (UK FE and HE sectors) continues to evolve, the integration of accessibility and accessible practices has become a focal point for educational institutions. In part, this has been driven by the introduction of the Public Sector Bodies Accessibility Regulations in 2018 and by the social shift towards inclusive practice over the last decade.

While the immediate benefits of accessibility in education are often discussed in terms of inclusivity and equal opportunities for students with disabilities, there is another significant advantage that is often overlooked: the upskilling of students in accessibility practices and the wider, long-term effect this has on society.

In the summer of 2023, I worked with South Gloucestershire and Stroud College to investigate embedding inclusive practice, and developing students' accessibility skills was highlighted as one potential approach. This blog post aims to suggest how incorporating accessibility into tertiary education not only fosters an inclusive learning environment but also equips students with valuable skills and graduate attributes that they can apply in the wider world.

The Current State of Accessibility in Tertiary Education 

Despite legislative frameworks and guidelines advocating for accessibility, such as the Equality Act 2010 in the United Kingdom, there remains a gap in the full implementation of accessible practices within tertiary education. While many institutions have made strides in providing physical accommodations and assistive technologies, there is still room for improvement in embedding accessibility into the curriculum and teaching methodologies. 

Upskilling Students in Accessibility Practices 

Technical Skills 

Incorporating accessibility into the curriculum means that students will naturally acquire technical skills related to making digital content more accessible. This includes understanding how to create accessible websites, documents, and multimedia content. These skills are not only beneficial for students pursuing careers in technology and design but are increasingly becoming essential competencies across various sectors. 
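
To make this concrete, here is a minimal, hedged sketch of the kind of technical skill involved: a short Python script (the sample markup and the single rule are invented for this post; real audits would use full WCAG-based tooling and human judgement) that flags images missing alternative text, one of the most common accessibility failures in digital content.

```python
# A minimal, illustrative accessibility check using only the standard library.
from html.parser import HTMLParser

class AltTextChecker(HTMLParser):
    """Flag <img> tags that carry no alt attribute at all."""

    def __init__(self):
        super().__init__()
        self.issues = []

    def handle_starttag(self, tag, attrs):
        attributes = dict(attrs)
        # Decorative images should still carry alt="" so screen readers skip them;
        # an image with no alt attribute at all is the clearest failure to flag.
        if tag == "img" and "alt" not in attributes:
            self.issues.append(f"image '{attributes.get('src', '?')}' has no alt text")

checker = AltTextChecker()
checker.feed("<p>Our campus</p><img src='library.jpg'>")
print(checker.issues)  # ["image 'library.jpg' has no alt text"]
```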

Ethical and Social Responsibility 

Teaching accessibility as a part of tertiary education also instils a sense of ethical and social responsibility in students. Understanding the importance of inclusivity and equal access encourages empathy and a commitment to social justice, qualities that are highly valued in today’s globalised world. 

The 2018 legislation focuses only on public sector organisations, however, by embedding the skills and behaviours of inclusiveness in the education of students, those practices will propagate into private and third sector organisations, leading to a fairer, more accessible future for all. 

Problem-Solving and Innovation 

Accessibility challenges often require creative solutions. By engaging with these challenges as part of their education, students develop strong problem-solving skills. They learn to approach issues from multiple perspectives and to innovate, skills that are transferable to any professional setting. 

Employability 

The employability agenda has become a major discourse, especially in Higher Education. The roles and salaries of recent graduates are used to score institutions on whether they produce ‘high-quality graduates’. These scores carry great weight in university league tables and have therefore become a major focus for institutions. Developing students’ accessibility skills and behaviours can be a key selling point to potential employers, especially those that are public-facing and outwardly support inclusion.

A shared journey 

In many institutions, accessible practices are only just beginning to be developed. Involving students in skill and behaviour development from the outset helps foster a shared, better experience and understanding. Approximately 17% of Higher Education students have some form of accessibility need (comparable with the general population), and since students greatly outnumber staff in an institution, they supply a much greater number of experiences on which to base recommendations and actions.

The Wider Impact 

When students are upskilled in accessibility practices, they carry these skills into the workforce, thereby contributing to a more inclusive society. Whether they enter the fields of technology, healthcare, public policy, or any other sector, their understanding and application of accessibility practices can make a significant impact. For instance, a student who has learned about accessible web design could influence the development of more inclusive digital platforms in their future employment. 

Be the change

The integration of accessibility and accessible practices into tertiary education serves a dual purpose. Not only does it create a more inclusive educational environment, but it also equips students with valuable graduate attributes that have far-reaching implications. As educators and policymakers, it is our responsibility to recognise the broader benefits of teaching accessibility and to strive for its comprehensive integration into the tertiary education curriculum. 

By doing so, we are not just adhering to legal mandates or ethical principles; we are investing in the future, empowering our students to become agents of change in creating a more inclusive world. 

Tour de Bases – a charity bike ride

Having told everyone, I guess I should announce my charity ride on my website too!

From Sunday 12 August, a friend (Dan) and I will be cycling 1000km over 7 days, in memory of my dad and Dan’s mum, who were both lost to cancer. We will be starting in Newquay, Cornwall and criss-crossing the country until we reach Lincoln, Lincolnshire on Saturday 19 August.

We chose this route because it goes through all the English military bases my dad was stationed at when he was in the RAF (the wife said a trip to Diego Garcia in the Indian Ocean was out of the budget), and those where Dan was stationed whilst serving in the British Army.

We are raising money for three important charities:

  1. Cancer Research UK – to help fund research so patients and their families don’t have to go through the pain of a terminal cancer diagnosis.
  2. SSAFA – who support service personnel and their families (Dan’s choice).
  3. RAF Benevolent Fund – who do the same for RAF families (Marcus’ choice).

If you want to know more, you can check out our charity ride route, see more about the charities and possibly donate, view and interact with our social media updates, and follow us on a live tracker.

If you can spare a couple of quid, any donations would be gratefully received. We are funding the trip, so all money goes to charity. It helps us to keep turning the pedals knowing we’ve raised loads of money. Thanks.

Do you really care?

I’m going to ask you a question. I want you to answer the question quickly and without thinking about it. Please be honest, it’s only for yourself. Here goes:

Can you teach someone without caring about them?

Did you have a quick answer? Now you’ve had a moment to reflect on your answer, does it surprise you? I know the rational, nice answer is that “of course we have to care”, but the instinctive (honest?) answer is often “no”.

Your initial thoughts might have gone to students or classmates whom you might not know or even like. Out of a class of 10, 30, 200, you may not know everyone’s names, their stories, who they are…

I have been asking quite a few people this same question, and although a few might sit on the fence, the vast majority of first answers were “no”. In fact, there were very few “yes” answers at all. 

So what does that mean for education? Are we all uncaring sociopaths, or is it more nuanced?

Normally, after asking that question, I follow up with a conversation and hear similar post-hoc rationalisations such as “what about YouTube videos? They are teaching you things.”

On the surface, this is a great argument, but one I want to counter by suggesting that YouTube how-to videos, and the like, are instructing, not teaching. You may think that is a minor semantic difference and I am being overly pernickety. And maybe I am.

However, I would argue that the fundamental difference between teaching and instruction is assessment. If assessment does not occur, this is instruction/training, and if assessment does occur, this is teaching. 

There is some nuance in what we call assessment, granted. However, if there is a genuine requirement to take information or instruction and to process it, e.g. extrapolating it to different scenarios, then that is assessment. You could call it ‘authentic assessment’ if the wind is blowing in that direction and that phrase is still in vogue. Parroting facts via poorly executed multiple choice questions (the longest answer wins, or it is ‘all of the above’) is pseudo-assessment and doesn’t count.

Now, I want to be very clear here: the industries of professional development (i.e. training) and education both utilise teaching and instruction/training. The best corporate training isn’t training at all, it is actually teaching. In the same way, the worst teaching is actually training, because the assessments and what is taught don’t align or have value. There is a place for both, and both will likely exist concurrently without any negative value judgement. But there is a fundamental difference: assessment.

So, coming back to the original question, my hypothesis is that assessment is an act of care about your students and their development. To instruct or train, without the opportunity to test if that knowledge or skill has been developed – i.e. usable in other contexts – is not teaching, and it isn’t an act of caring. You are passing across information to your students without any interest in how it might affect their lives. 

Assessment is the activity we use to show we care about not only what our students have learned, but also what change in their actions and behaviours that learning entails.

It’s worth noting that an act of care may not be personalised; but if you are teaching someone, there is an intention that they develop, even if you don’t like them. It’s also important to consider time and space within this hypothesis: I suggest that displacement of time or location does not matter. This means that it is possible for asynchronous, online learning to be teaching. What matters is the intention of the educator.

If I had asked the initial question differently – do you teach to make a difference to your students’ lives? – then I think the answers would have been overwhelmingly positive. If my thesis is correct, then you have the mechanism to know how and why you do care. It might help you think about making your assessments more meaningful.

NB: There are no references in here because I wanted to extract thoughts from my brain and get them down on the page without the risk of it being actively viewed through someone else’s lens. I would very much like your thoughts and critique of this thesis. I may return to provide more explanation or rigour in the future, but I also may not.

Digital Assessment Maturity Model v0.2

Release notes

We released v0.1 of the Digital Assessment Maturity Model in March 2023, and spent the rest of the month talking to people from the sector and refining our initial work. This blog post lays out version 0.2 of the model and outlines how it may be used.

Current assessment landscape

Assessment processes are complex [Citation needed]. Different universities do things differently and use differing terminology; however, the general process is broadly similar. Our first step in developing a maturity model was to attempt to define the assessment process, shown below:

A flow diagram taking you through the five stages of assessment: design, creation and resourcing, completion and submission, marking and feedback, and quality assurance and enhancement. It is a highly branched process with many dotted lines.

Note, there will be considerable institutional nuance, but the process chart aims to capture most of the activities that take place to ensure assessments run effectively.

Our second step was then to identify and define areas of activity within the assessment process. We have called these:

  • Assessment design
  • Assessment creation and resourcing
  • Assessment completion and submission
  • Marking and feedback
  • Quality assurance and enhancement

These were chosen because they often require the work of different groups of people within the institution. For example, timetabling and estates teams may only focus on assessment creation and resourcing activities in relation to scheduling on-campus digital exams. It also allows us to potentially tag relevant resources in the future.

We are aware that this is an incredibly simplified process map compared to sector practice, and that each of the five activity areas is inextricably linked to each other – even if that is not immediately apparent from the process map.

Growing digital assessment

Digital assessment has increased in UK Higher Education over the last five years, driven by underlying strategy and greatly accelerated due to the COVID-19 pandemic. Universities and Colleges may have found themselves un- or under-prepared for a mass move to online exams, for example, and innovation practice happened at pace.

In these circumstances, enhancements to digital assessment may not have occurred evenly across all aspects of the assessment process. You may have concentrated efforts on creation and resourcing, or completion and submission, as these are the most visible to students, while giving less consideration to digitising the quality assurance and enhancement element.

It is worth noting that while enhancing all five activity areas evenly is not strictly necessary, as you progress through the digitisation of assessment, the links between stages can cause bottlenecks in your overall processes where areas are at different points in their maturity journey.

Creating the maturity model

We started from the position that accessibility is baked in to all aspects of the assessment process, and that digital assessment meets all the legal and moral requirements currently in place. This assumption may not always hold; where it does not, accessibility should be addressed before any of the other changes suggested by this maturity model.

We have split the maturity model into four stages, with each stage denoting a significant shift in what you may observe. The stages are:

  • Approaching and understanding,
  • Experimenting and exploring,
  • Operating and embedding,
  • Transforming and evolving.

The progression from one stage to the next is cumulative; each stage builds from the one before. Within each stage, we have attempted to identify a non-exhaustive list of activities from across the wide spectrum of assessment practice as examples of what you might see at that stage.

The model is not intended to be a pathway you progress along, but rather a self-assessment checkpoint that gives you suggestions for the future of digital assessment in your institution. For many institutions, moving along the Digital Assessment Maturity Model may not be appropriate or favourable. Indeed, achieving ‘transforming and evolving’ may not be compatible with your internal practices or strategies.

Please don’t use this as a checklist to determine your stage; use it as a holistic tool to give a potential direction of travel. There will be some relationship between this model and your institutional assessment strategy; however, you should prioritise alignment between your strategy and practices over alignment with this model. The most value may come from comparing the maturity stages across all five areas of activity of the assessment process, especially where there are wide discrepancies, and from using that comparison to help update and renew institutional strategies.
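
As a purely hypothetical illustration of that comparison, the Python sketch below (the entries are placeholders, not the published wording) shows how the four stages and five areas of activity form a simple grid, and how a self-assessment becomes a judgement per area rather than a single score.

```python
# Purely illustrative: placeholders standing in for the published table's wording.
STAGES = [
    "approaching and understanding",
    "experimenting and exploring",
    "operating and embedding",
    "transforming and evolving",
]

AREAS = [
    "assessment design",
    "assessment creation and resourcing",
    "assessment completion and submission",
    "marking and feedback",
    "quality assurance and enhancement",
]

# Each (area, stage) cell holds a summary title plus 'what you might see' examples.
maturity_model = {
    (area, stage): {"summary": "...", "what_you_might_see": []}
    for area in AREAS
    for stage in STAGES
}

# A self-assessment is a judgement per area of activity, not a single score,
# which makes discrepancies between areas easy to spot and discuss.
self_assessment = {
    "marking and feedback": "experimenting and exploring",
    "assessment completion and submission": "operating and embedding",
}
```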

Remember that Jisc has a range of experts and experience in supporting the development of digital assessment, purchasing frameworks and services, so please reach out to your Relationship Manager for more information.

Considerations and constraints

We have purposely avoided assessment methods and methodologies, as they are often subjective and pedagogically-driven (don’t worry, we are looking at that elsewhere in Jisc). We have tried to focus on the digital assessment process and the underlying technologies as enablers.

Introducing the model v0.2

A table showing examples of Digital Assessment Maturity. The column headings are approaching and understanding, experimenting and exploring, operating and embedding, and transforming and evolving. The row headings are assessment design, assessment creation and resourcing, assessment completion and submission, marking and feedback, and quality assurance and enhancement. Within the table are specific examples.

We have structured the Digital Assessment Maturity Model so you can scan across each area of activity of the assessment process and get a quick understanding of the differences between each stage of maturity. Each element has a summary title that gives a very basic overview, followed by some examples of what you might see.

In addition, we have created an abridged version.

A table showing examples of Digital Assessment Maturity. The column headings are approaching and understanding, experimenting and exploring, operating and embedding, and transforming and evolving. The row headings are assessment design, assessment creation and resourcing, assessment completion and submission, marking and feedback, and quality assurance and enhancement. Within the table are brief summaries of what you might see.

Key changes

Based on sector feedback, we made a number of notable changes to the model.

The most obvious change from v0.1 is the move away from a graph to a more tabular representation. While it is anticipated that institutions will move through the model from left-to-right, the graph gave the appearance that ‘rightmost was better’. Some institutions will not find that to be the case.

The second biggest change was reduction of the model from five to four stages through the amalgamation of ‘operational’ and ‘embedding’. We also took the opportunity to tweak some of the terminology and provide more context for what each stage means. Although this does take us away from existing Jisc maturity model nomenclature, we felt it made the model more relevant and usable.

The third biggest change is the restructuring of information to provide better coverage of activities and to layer content detail. While we previously had the model and blog posts for two levels of detail, we now have a summary headline, ‘what you might see’ examples, and the blog posts. This should ensure the model is useful as an in-depth guide and as an ‘at-a-glance’ tool. This may necessitate the rewriting of the blog posts, but that was outside the scope of the current work.

Further changes include:

  • ‘Quality management’ was changed to ‘quality assurance and enhancement’ to better match sector terminology.
  • The assessment process was added and colour-coded against the model for ease of reference.
  • Additional emphasis was placed on the examples in each stage being indicative of what you might see, rather than a checklist of things to be achieved.
  • ‘Phases’ was dropped in favour of ‘areas of activity’ when describing the assessment process.
  • With a view to adding interactivity in the future, we created two versions of the model: one with ‘what you might see’ examples, and one without.

Next steps

Working alongside colleagues from Jisc and the sector, we will be further refining the model, particularly the terminology and examples at each stage. We especially hope to engage with colleagues in quality enhancement, timetabling and estates to assist with these refinements.

As the accompanying blog posts are largely redundant now that much more information has been moved into the model, we intend to rewrite them. We may repurpose them to provide additional context or next steps.

If you are interested in being part of this review, please contact us. If you have any comments or questions, please add them below.


AI vs AI: a story

A bead of sweat trickled slowly down my forehead. I wiped it away quickly before reaching for the bottle of water in front of me. Taking a quick swig, I wished it was something a little stronger. How did it come to this?

Across the table from me were two police officers. They couldn’t have been more different if they had been picked straight out of a cheap novel from an airport bookstore. One, the one who seemed to be in charge, stood up, walked around the room and leant against the wall in the corner. The shadows made it difficult to make out his features, but he was tall and lean. His hair was neat, his face shaved smooth, and his demeanour quiet and reserved. He barely spoke throughout, but when he did, his voice was deep, with the hint of a northern accent.

His partner was a talker, his arms emphasising every point. He was unshaven, his hair dishevelled, and you could tell he was only a couple of lucky life choices away from sitting on the other side of this desk. His name was Smith, and he smelt of raspberry vape. His accent was local, and he was the sort of person who had never left London since the day he was born. You could see a glimpse of the West Ham crest peeking out from his rolled-up sleeves.

Smith leant forward and lowered his voice. “I’m going to ask you again, where were you on the afternoon of 22 January? And this time I don’t want any excuses or bullshit. I want the truth.”

I looked him in the eyes. “I was at uni. All day. I had lectures in the morning, then grabbed some lunch. I popped into Dr Yafiq’s office to ask him a question, then sat down with my laptop before my seminar at 3pm. After that had finished, I hung around for a while for questions, then I went home.”

“Would ya say that it was a busier day than normal?”

“No, it was pretty typical, I am normally rushed off my feet. At least I got lunch that day – but only because Dr Yafiq’s office is on the opposite side of the canteen.”

Smith looked at me, narrowing his eyes. “And what about the night before?”

“I told you, I have a part-time job in a bar in the evening. I was there from 7pm until we closed up at about 11:30pm.”

“And after, I expect you went home, did you?” Smith asked.

“Yes. I had some food at work at the beginning of my shift, so I went straight home. I had a quick look at my essays but decided I was too tired, so went to bed.”

“You see, somethin’ doesn’t quite add up for me. You had all this work to do, however many thousand words, and I can’t see how you could squeeze that into your busy life.” Smith said.

I squirmed in my seat. “I know it looks bad, but I didn’t do anything wrong. My life is really busy, I knew that it would be when I came here.”

“Too busy to do everything you’re meant to? Or do things get missed out?”

“I try to do as much as I can…”

“But there’s never enough time is there? Not if you wanna do a good job, eh?”

“I prioritise though – I make sure nothing comes before my work.”

“And you have enough time to do that?” Smith raised an eyebrow.

“I try my best, I’ve always tried my best.” Smith stared at me. “No-one cares how hard you try. It’s about what’s on the fancy piece of paper at the end of the day. So you do what you need to.”

Smith sat back in his seat and folded his arms. He stared intently at my face, studying every feature. He picked up a piece of paper, stared at it for a second, and then waved it towards me.

“It says here your submission day was on the 22 January too.”

“Um, yeah, if you say so,” I said.

“You know exactly when it was. You’ve got it written in your calendar.” He picked up another piece of paper from the file and pointed at it.

I leant forward to read it, but before I could, Smith pulled it away. “Would you say you are pretty organised?”

“I try to be, as much as I can.”

“Try to plan ahead, do ya?”

“I don’t really have a choice. You have to be otherwise you will sink.”

Smith stood up, walked around the table, and crouched next to me, his eyes level with mine. I could smell the hint of raspberry on his clothes. His voice softened. “Tell me about this particular piece of work. I want to know all the details.”

“Erm, it’s just one of the things I have to do. Professor Blenkinsop told us we needed to do it. It’s 5000 words. A reflective account.”

“Go on…”

“Well, it is like four parts, four different examples. They said we should write about a thousand words for each part, and then a thousand words for the introduction, conclusion and overall reflections.”

“How long have you known you’ve had to do it?” Smith put his hand on my shoulder. “It’s okay.”

“I’ve had the deadline in my calendar for months.” I looked down. I could feel myself getting upset. I choked it back. “It’s not like they help though. Yes, they give you ‘advice’ about what to write and how to structure it, but not once did anyone check to see if I had done it right. I don’t know how to write these things, it’s not my fault.”

Smith paused. He glanced at his watch. “What help did you want?”

“I don’t know. Anything. Like someone to read a draft or something. None of us knew how to write reflectively. We hadn’t done anything like this before. If it was an essay, that’s fine. I know how to write an essay, I am good at writing essays, but not this.”

“Were there opportunities to meet with someone to discuss your work before you submitted?”

“Yeah, but not when I could go. I was busy every time. It’s not like they were at times I could make. We kept getting reminders there were help sessions but none of us could ever get there because we had lectures or whatever.”

“So what you are saying is that you really didn’t have a choice? You were forced into cheating?”

“No, no that’s not what I’m saying… I, er, it’s just that it’s tough. It wasn’t on purpose. I didn’t try to cheat anyone, I just needed help that I couldn’t get at the uni.”

Smith stroked his chin. “So you felt helpless then?”

I shook my head. “Well, no. But. Look, it is complicated. It’s the same assessment as they’ve had for years. I spoke to other people who did it last year and they said it was a tick box exercise. I mean, if I’m not going to learn from it, why should I care? There’s loads of stuff online about it, from other unis. I mean, I looked at that stuff, that isn’t a crime.”

“No, it isn’t. But what you did was, wasn’t it?”

“You keep saying so. You assume it is, but I don’t think it is that clear.”

The officer in the shadows shifted his position. Smith glanced over his shoulder at him, nodded and turned back to me.

“So why don’t you tell us? What did you do?”

“Well, I had heard about this AI thing that will give you an answer when you ask it a question.”

“And…” said Smith.

“And I was playing around with it, testing it with different questions to try to catch it out.”

“So it was a bit of a game then?”

“Well kind of. It seems really clever. I don’t know how it works, something to do with scanning the internet and working out the best possible answer to a question.” I took a deep breath. “So I thought it might be a good idea to test it on my work.”

“Can you tell me what happened then?”

“Well, the answer it gave was pretty good. You can change what you ask it and it will run the answer again and results come out even better.”

“So what did you do?”

“Well, I asked more questions and gathered together all the answers, put them in a Word doc, and edited them together.”

“So what you’re saying is you let a machine come up with all your answers. You didn’t do the work at all?”

“No, I did the work. I asked the right questions, I edited and rewrote the answers. It’s my work.”

“If I had done the same, would I have come up with the same answers?”

“No, probably not. You might have worded your questions slightly differently, and anyway, the answers come out different each time.”

“Like when someone is plagiarising something, they change a few words here and there…?” Smith left the question hanging.

“I don’t know about that, but this was my work. I added my own reflections in there and changed the details.”

“But what you are saying is this AI did the heavy lifting?”

“Yes.” I took a deep breath. “It wasn’t cheating. It wasn’t like I had copied someone.”

“You know these are serious allegations.”

I looked down. “I know.”

“I wouldn’t say you’ll be unemployable, but I know McDonald’s are getting pickier about who they hire.”

“I’m pretty much unemployable anyway. Universities hardly hold the kudos they used to.”

The tall, lean officer moved out of the shadows and tapped his colleague on the shoulder. They swapped places in the seat. It was a smartly choreographed move, as if they had done this many times before.

“I don’t think I properly introduced myself. My name is Kitchener. I just want to get to the bottom of this so we can all move on.”

“I want that too,” I said.

“Should I tell you how this AI gets to be so clever?”

“Erm, okay”

“Well, it scrapes the entire internet. Gathering up everything anyone has ever written and published. Millions and millions of words. Then it does some clever mathematics and works out what word is the most likely to come next. So when you ask it a question, it just calculates the probability that this string of words is most likely to come in that order.”

“I didn’t know that”

“And here’s the thing. It doesn’t know what any of the words mean. It hasn’t a clue. In fact, it doesn’t care. It knows that if the first word is ‘red’, it has to decide whether the next word is ‘bus’ or ‘tomato’.”

“A bit like autocorrect?”

“Yes, something like that… but really it’s just copying whatever everyone else has written.”

“But what does that mean for me?”

Kitchener straightened out his jacket. “Ah yes, I was getting to that. So what I’m saying is that you used AI to copy other people’s work. You didn’t know, or care, whose work you copied either. There are no references, no attribution. These were other people’s words.”

“But it’s not like that…”

Kitchener jumped in before I could finish. “More than that, you did it because you could.”

“I… I…”

He adjusted the cuffs under his suit jacket. “In this game, we talk about crimes needing three things: a motivation, a means, and an opportunity. Do you understand?”

“Er… yes”

“Well, let me run this past you. You are tired, overworked without the time to do your work to a high enough standard: motivation. You heard about this new tool that will write thousands of words for you with a little prompt: means. And as you said yourself, the work was ‘a tick box exercise’ and didn’t really matter: opportunity. What do you say?”

I sat there blinking.

Smith leaned over, “Ain’t that a pickle.”

[Fade to black]

Epilogue: Dr Jones received a formal warning on her HR record and resubmitted her Advance HE Senior Fellowship application after re-writing it in her own words. She was successful. She has since left her HPL role and found a permanent job at another university. She no longer works part-time in a bar.

1000 weeks and counting

This is a nascent blog post, or notes, from about 2017 that I've decided to share.

1000 weeks. It doesn’t sound much. 7,000 days. 168,000 hours. 10,080,000 minutes or even 604,800,000 seconds.

If we go the other way, 1000 weeks is about 230 months, or just over 19 years. It represents the time from the literal cutting of my children’s cord to the metaphorical one. 1000 weeks is my mantra when I’m struggling as a parent: “It’s only 1000 weeks until we have got rid of the kids”.

1000 weeks represents a childhood.

I started saying it when my daughter was born a few years ago. I’m still saying there are just 1000 weeks.

1000 weeks represents the time for someone to transform from a parasitic mush of their parents’ DNA to become a real person. Who can vote. Who can love. Who can lose. Who can be themselves.

What is childhood? I don’t know, but there’s a lot to fit in.

Digital Assessment Maturity Model (alpha-release)

This originally appeared on the JiscInvolve Co-design blog

When the COVID-19 pandemic hit in early 2020, it forced UK education institutions to shift teaching online. Many of us assumed that it might only be for a few weeks, but then as spring gave way to summer, it was clear that assessments would have to change markedly and quickly. Three years on, we have reached a point where we can reflect on these rapid changes in practice. Maybe you introduced a number of new software services and are struggling to embed their use, or maybe you feel you failed to capitalise on some of the benefits of digitalisation throughout your digital assessment process.

At Jisc, we have begun the process of developing a digital assessment maturity model, to allow institutions the chance to assess their current level of maturity and use it to plan their next steps.

Creating the maturity model

We have split the maturity model into five stages, with each stage denoting a significant shift in practice. The stages are:

  • Approaching and understanding;
  • Experimenting and exploring;
  • Operational;
  • Embedded;
  • Transformational.

The progression from one stage to the next is cumulative; each stage builds from the one before. Within each stage, we have attempted to identify activities from across the wide spectrum of assessment practice within an institution. After introducing the generalised model, you will find more detailed sections for each phase of the assessment process below.

We didn’t intend for the model to be a linear pathway you progress along, but rather a self-assessment checkpoint for your institution (previously I did an individual digital literacies self-assessment tool) that gives you suggestions for the future of digital assessment. For many institutions, moving along the digital assessment maturity model may not be appropriate or favourable. Indeed, achieving ‘transformational’ may not be compatible with your internal practices or strategies.

Please don’t use this as a checklist to determine your stage; use it as a holistic tool to give a potential direction of travel. Remember that Jisc has a range of experts and experience in supporting the development of digital assessment, so please reach out to your Relationship Manager for more information.

Considerations and constraints

We have purposely avoided assessment methods and methodologies, as they are often subjective and pedagogically-driven (Jisc are looking at that elsewhere). Therefore, it was important to focus on the digital assessment process and the underlying technologies as enablers.

We identified five phases of the assessment process: design, creation and resourcing, completion and submission, marking and feedback, and quality management. Although institutions may have their own unique way of approaching digital assessment, we anticipate these areas are suitable for all. However, these phases may appear linearly or concurrently, and different teams in your institution may carry out the related activities. We will be following up this post with a series that unpacks each area in more detail, giving specific examples of some of the activities you might see for each phase.

Introducing the digital assessment maturity model

A graph background, with an exponential curve increasing from left to right. The graph is split into five sections horizontally. The left-most section is labelled 'approaching and understanding'; below the label it says "Digital assessments may be used as a supplement to traditional paper-based assessments. Technology is used primarily for administration and data collection purposes." The second section is titled 'experimenting and exploring', with "Institutions are actively exploring and experimenting with different digital assessment tools." below. The middle section is titled 'operational'; underneath it says "Institutions are beginning to integrate digital assessment tools into their courses and have implemented processes to ensure that their assessments are reliable and valid." The fourth section is called 'embedded'; the text says "Technology is used to provide adaptive and personalised learning experiences, to support collaboration and co-creation among students, and to provide real-time feedback and data analysis." The right-most section is called 'transformational', with the text "Technology is used to support the full integration of assessment and instruction, and to support the use of data for continuous improvement of process and performance." There are a number of examples of evidence below that are elaborated on in the accompanying text.
Digital Assessment Maturity Model. Emma Beatson, 2023

Digital assessments and the digitalisation of the processes surrounding assessment may be rare or non-existent at the approaching and understanding stage. Many activities will be paper-based and manual. Where technology is used to support assessment, it is primarily around administration or data collection. Assessments will be designed to be accessible to all students, which may include the use of digital alternatives, assistive technologies or interventions, such as an amanuensis.

Institutions that are actively experimenting and exploring may be using digital assessments in small-scale trials, or looking at alternative assessment modes and methods. This may include automated knowledge-check MCQs (multiple choice questions) and formative assessment. They are beginning to understand the benefits of using technology to assess student learning, and to build the skills and knowledge required to effectively implement digital assessment tools. Institutions may be looking at BYOD (bring your own device) or other policies to ensure students have access to suitable learning technologies.

When digital assessment is operational, it may be used to enhance traditional assessment practice and to add optimisations into the assessment process. Essays are submitted via a VLE, exams may have moved online, and the use of anti-plagiarism software may be commonplace. There may be processes and policies designed to ensure digital assessments are consistent, reliable and valid. Discrete systems to support assessment may start being integrated and specified in the curriculum. There is suitable infrastructure and resource to support digital assessment, including Wi-Fi, hardware, and software. Staff and students are trained and supported in the use of new technologies for assessment.

Digital assessments become fully integrated and embedded into the curriculum when technology begins to provide adaptive and personalised learning and assessment experiences for students, and data analytics for staff use. Assessment technologies are integrated with various institutional systems, such as the VLE, student record system, and curriculum management system, to allow the creation of a holistic picture of student performance. Analytics may be used to trigger interventions for students at risk of failing. There may be purpose-designed spaces on campus for student assessment.

At the transformational stage, digital assessments are used to support and enable different forms of pedagogic practice, such as problem-based learning, competency-based education and personalised learning. Data generated by assessment is used for continuous improvement of process and performance. Institutions have fully integrated digital assessment tools into their teaching and learning. They are actively and critically exploring new technologies and approaches for assessment, and are beginning to use predictive analytics to better understand student learning.

Digital assessment design

When approaching and understanding assessment design, most assessment will be simple, and often paper-based. The design process is often manual. Digital assessment may sometimes be used to supplement traditional assessment methods.

Experimenting with digital assessment may introduce additional modes of assessment, such as MCQs, and may use existing technologies, such as VLE tools, to introduce elements of question randomisation and auto-grading of basic assessments. The assessment process may involve increased digitalisation. The use of digital assessment may initially be focused on low-/no-stakes formative assessment. Technology use is primarily aimed at replicating existing methods of assessing.
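
As a purely illustrative example of the kind of VLE quiz functionality described here, the following Python sketch (questions, answers, and feedback strings are invented) shows question randomisation and auto-grading of a basic MCQ with per-question feedback.

```python
# Minimal sketch: question randomisation plus auto-grading with per-question feedback.
import random

questions = [
    {"id": "q1", "text": "2 + 2 = ?", "answer": "4", "feedback": "Basic addition."},
    {"id": "q2", "text": "Capital of France?", "answer": "Paris", "feedback": "See week 1 notes."},
]

def randomised_paper(qs, seed=None):
    """Return the questions in a shuffled order (e.g. one order per attempt)."""
    paper = qs[:]
    random.Random(seed).shuffle(paper)
    return paper

def auto_grade(qs, responses):
    """Score each response and attach the pre-written per-question feedback."""
    results = []
    for q in qs:
        correct = responses.get(q["id"], "").strip().lower() == q["answer"].lower()
        results.append({"id": q["id"], "correct": correct, "feedback": q["feedback"]})
    return results

paper = randomised_paper(questions, seed=42)
print(auto_grade(questions, {"q1": "4", "q2": "Lyon"}))
```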

As things become operational, elements of personalisation may become apparent, especially to support additional needs. There may be some level of automation of the design process through the use of rubric builders or shared resources. Disciplines may receive additional software to support subject-specific requirements, such as mathematical notation. Assessment methods or approaches may be augmented through the use of technology. Specific technologies may be used beyond the VLE, such as in-class MCQ services.

The shift to embedded digital assessment will see the design process informed by data analytics, and may include elements of machine learning to determine the optimum assessment strategy. There will be increased personalisation designed into assessments. The use of automated knowledge and application checks as formative assessment will be extensive, giving students real-time feedback on their performance. The assessment methods or approaches may be significantly modified through the use of technology.

In a transformational scenario, assessments are fully dynamic, constantly updated, and personalised to each student. Machine learning algorithms may refine the assessment design to optimise the assessment experience. Assessment design will be integrated into a larger digital learning ecosystem, made of a range of specific services to enable variety and flexibility for staff and students. Assessments will assess a wide-range of skills and applications of knowledge. Assessment design is likely significantly reimagined through the use of technology.

Digital assessment creation and resourcing

At the level of approaching and understanding, assessments are often limited to a small number of assessment types and are manually created. The timetabling and administration of assessments is a manual, paper-based process, relying on the labour of academics and administrators. There may be a lack of consistency in process and practice, potentially leading to errors and scheduling conflicts.

When experimenting and exploring digital assessment creation and resourcing, you may begin to see the use of standardised templates and question banks. There may be some level of automation of the creation process, and the integration of multimedia or interactive elements. Spreadsheets or calendars may be used to record and plan exam and assignment timetables. There is little integration of these systems with others relating to the assessment process.

As digital assessment creation is operationalised, the creation process is more automated and may offer rule-based differentiation or personalisation of assessments, such as the use of adaptive release in VLEs. There will begin to be a level of integration between timetabling, assessment management, VLE, curriculum management, or Student Information systems. Timetabling is fully-automated but may rely on some human ‘tidying’. Assessment load and bunching can be identified and acted upon. Additional resources will be available to students to support them to develop their assessment skills, often within the assessment system itself.

When digital assessment is embedded, integrated timetabling, assessment and student information systems will automatically generate a personalised, flexible and adaptive assessment. Students may have the opportunity to choose the mode of assessment to evidence meeting learning outcomes. The creation process will use data from historical assessments, student performance and demographic data to optimise the creation and resourcing of assessment.

Digital assessment creation and resourcing may be transformational when assessments are fully dynamic. This means the modes, scheduling, and resourcing of assessments may be variable and flexible. Students may be able to complete assessments according to their own schedules, rather than follow the institution’s calendar. Timetabling may be fully automated, actively avoiding bunching, and may allow personalisation through the booking of assessment windows. Curriculum data from other systems may be used to automatically generate assessment points.

Digital assessment completion and submission

There may be no digital element of assessment at the approaching and understanding stage of the completion and submission process. While students may use word processors or other software to generate their work, submission may be paper- or storage media-based. There is a limited use of technology to manage and track submissions; this is likely paper-based or a standalone system, such as a spreadsheet. In some instances, assessments are printed out for marking, feedback, and archival purposes.

At the experimenting and exploring stage, digital platforms are used for submission, such as email, Google Forms, OneDrive, or basic VLE tools — but this is not consistent. Managing and tracking of submissions is carried out within the system, and may provide students with a digital receipt or record of submission.

When digital assessment is operational, there is use of digital submission of assessments for most modes. Often there will be specialist assessment tools, such as Turnitin or Wiseflow. There may be a level of automated plagiarism checking or grading present, as well as the ability to provide instant feedback (especially in the case of MCQs). Late submissions are automatically flagged, and there may be a notification system for staff and students. There may be some level of automation of reasonable adjustments, but this may rely on significant manual intervention.
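
As a hedged illustration of the automated flagging mentioned above, this short Python sketch (the deadline, timestamps, and names are invented) checks submission times against a deadline and produces simple notification messages of the sort a system might send to staff and students.

```python
# Illustrative only: flag submissions received after a deadline.
from datetime import datetime, timedelta

deadline = datetime(2024, 1, 22, 14, 0)
submissions = {
    "student_a": datetime(2024, 1, 22, 13, 55),
    "student_b": datetime(2024, 1, 22, 16, 10),
}

def flag_late(subs, deadline, grace=timedelta(0)):
    """Return (student, minutes_late) pairs for anything after deadline + grace."""
    late = []
    for student, submitted in subs.items():
        if submitted > deadline + grace:
            minutes = int((submitted - deadline).total_seconds() // 60)
            late.append((student, minutes))
    return late

for student, minutes in flag_late(submissions, deadline):
    print(f"Notify staff and {student}: submission was {minutes} minutes late")
```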

The embedded level is similar to operational, but practice is more consistent. The digitalisation of non-traditional assessments, such as videoing presentations or scanning artwork, may be more commonplace, allowing digital marking and feedback. There may be more robust policy and processes to govern submitted media and its archival.

Transformational digital assessment may include the use of analytics and data visualisation to explore long-term trends and personalise student support. Reminders or hints could be released based on student activity, e.g. flagging students who haven’t accessed content or module handbooks. The process for providing additional support or reasonable adjustments may be fully automated, based on student needs, assessment type, and support available. There may be some level of integration between assessment systems and the tool used to complete the assignment.

Digital marking and feedback

The marking and feedback for assessments is manual at the approaching and understanding stage. Staff will grade assignments and provide hand-written feedback to students. The feedback process is often time-consuming and may not be consistent across teachers or assessments.

At the experimenting and exploring stage, some level of automation may be introduced into the process. Multiple choice questions may have pre-determined grading and automated per-question or generalised feedback. Feedback begins to be delivered in digital formats. Academic staff have access to suitable equipment to support digital marking, such as two monitors or input devices.

At an operational stage, personalised, digital feedback is provided for each student. There may be some element of aggregation and analysis of marking and feedback to provide both longitudinal development for individual students, and cohort analysis. Academic staff have access to multimedia creation resources in order to provide marking and feedback in the most appropriate manner. For group assessments, it may be possible to mark and feedback based on whole-group and individual contributions within the system. Rubrics will be used, but may not be marked against digitally. To support staff, assessment content may be converted to different media types to best support their working practices or additional needs. There may be provision for second-marking, multiple markers, team marking or batch marking within cohorts.

When digital assessment is embedded, there may be the incorporation of analytics and predictive models to provide adaptive, automated feedback. Feedback may be aggregated for individual students to allow them to note patterns. Rubric use is systematic and widespread, with grading and associated feedback digitally captured, and shared with other systems. Group assessments may allow peer assessment or weighting. The use of standard feedback phrases may be partially automated or suggested. Second-marking, multiple markers, team marking, or batch marking may be fully automated based on pre-set criteria.

To be transformational, the marking and feedback process is dynamic and adaptive. Aggregated feedback will prompt skill development for students, and will highlight suitable resources and support opportunities automatically. The use of AI or machine learning may automate some of the marking and feedback process. Feedback may be constantly updated and optimised based on real-time student data. Academic integrity detection may encompass new technologies and methods, such as AI authoring. This may include automated checks, or the ability to determine document histories or analysis of snapshots during the production of the assessment submission.

Digital quality management of assessment

When approaching and understanding digital assessment, student assessment data may be stored in ad-hoc, basic tools, such as Microsoft Excel. Sharing data may only take place via email or on-premises network storage. Exam boards may be paper-based, in-person meetings, with manual processes to support them.

You will see the inclusion of digital assessment into the process and management of assessments at the experimenting and exploring stage. Rubrics and standards may be evident, but there may be limited use of technology to support quality control.

At an operational stage, robust quality measures are evident, including peer review or testing. There are likely to be specific software packages through which to record, distribute and analyse academic outcomes. Assessment data will be passed between relevant systems to allow the automation of grade return, feedback collation and management, and exam board processes. Where not explicitly designed out, plagiarism and academic integrity tools may be used.

The embedded stage involves more sophisticated quality control measures, such as digital processes to support external validation, sampling, or accreditation with disciplinary bodies. Assessment data and validated results will be synchronised across multiple systems, and may be used to provide insight into student performance and predictive analytics.

Where digital assessment is transformational, there is advanced use of data analytics and machine learning to monitor and improve assessment quality. Assessment data management from a variety of sources will be fully integrated and seamless.

Conclusions

This work is at its very early stages, and we are looking to road test it with the sector. We have plans to work with colleagues from various universities to further refine this model, with the hope that it provides value to senior leaders, timetabling managers, IT directors and Quality Management leaders.

Please contact me if you are interested in being part of this review. If you have any comments or questions, please add them below.

An experiment with ChatGPT

I’ll start off by saying that I am not a fan of AI – mainly because most of the ‘fan-bois’ are so uncritical and sometimes obnoxious – and I am certainly not a fan of private companies exerting power over people by controlling public spaces (looking at you, Google and Twitter in particular). However, ChatGPT, and its underlying LLM (large language model) GPT-3, are getting a lot of press, so I thought I should dip my toe in that particular sea. I did make a prescient suggestion in 2017 about AI and assessment.


Getting ideas fast

// First published on JiscInvolve Codesign blog

On Tuesday 22 November, Harvey Norman and I were asked to run a session at the Jisc Student Experience Experts group in Birmingham. We took the opportunity to introduce the Pathfinders initiative and to give the delegates a chance to experience one of the many methods we use as part of our innovation process.
