Skill assessments grounded in real-life scenarios are essential to the Lab’s 21st Century Skills Micro-credentials. The current credential-earning process involves an assessment that requires the learner to demonstrate a competency or sub-competency of a 21st century skill within the context of a workplace-based scenario. These existing assessments use rubrics as guides and facilitators as assessors to validate the learner’s demonstration of their skill(s), and they align with more traditional instructional models. As a result, these assessments can be incredibly difficult to scale while keeping time input (and, accordingly, access) manageable for both facilitators and learners. Wide-scale adoption of these types of assessments is just as important as skill validation itself. So, how might we maintain the experience and rigor of real-life learning while reducing time input?
Automation is a key part of the answer. In partnership with Muzzy Lane, we’re co-designing the next iteration of our skill assessments. Once available, our partners will be able to choose between the existing facilitator-graded assessments and the new auto-assessment.
The automated assessment for our Critical Thinking Micro-credential takes a learner about 30–45 minutes to complete and, afterwards, provides a summative deep dive into the learner’s skill performance for facilitators to review (viewable in an example snapshot of the dashboard above). The dashboard visualizes for a facilitator the count of correct answers (green bars) and incorrect answers (red bars) for specific questions. With this visual, facilitators can more easily spot skill deficits across their students and make better decisions about how to close those gaps.
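The aggregation behind a dashboard like this can be sketched in a few lines. As a minimal illustration (assuming each learner response is recorded as a question ID plus a correct/incorrect flag; this record format is hypothetical, not Muzzy Lane’s actual data model):

```python
from collections import Counter

def tally_responses(responses):
    """Aggregate per-question correct/incorrect counts.

    `responses` is a list of (question_id, is_correct) tuples --
    a hypothetical record format for illustration only.
    """
    correct = Counter()
    incorrect = Counter()
    for question_id, is_correct in responses:
        if is_correct:
            correct[question_id] += 1    # would render as a green bar
        else:
            incorrect[question_id] += 1  # would render as a red bar
    return correct, incorrect

# Example: three learners answering two questions
responses = [("Q1", True), ("Q1", False), ("Q1", True),
             ("Q2", False), ("Q2", False), ("Q2", True)]
correct, incorrect = tally_responses(responses)
# Q2 stands out: two of three learners answered incorrectly,
# flagging a possible skill deficit for the facilitator.
```

A question with a tall red bar relative to its green bar is exactly the kind of gap the dashboard is designed to surface at a glance.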
The Lab’s XCredit initiative is developing additional automated assessments, using a variety of technologies, for three of the Lab’s Micro-credentials: Critical Thinking, Oral Communication, and Creative Problem Solving. It is critical that we explore, test, and develop new and more accessible ways to validate existing skills and assess learning outcomes, as part of a collective effort to remove barriers to the digital visibility and portability of skills.