Ant Pugh is great at breaking things down into 4 minute videos and has some great knowledge to share about online training.
“Assessment automation is a key step to – finally – unlocking the real potential of e-learning”
There’s growing recognition that automation, artificial intelligence, and electronic assessment tools have an important role in the future of e-learning. But e-learning company Enrolo sees assessment automation as a vital step towards transforming that future.
“Assessment automation is a key step to – finally – unlocking the real potential of e-learning,” says Enrolo CEO Adam Menary. “Like many companies, we’ve been exploring a range of new innovations and approaches to e-learning. Though we’ve also been approaching e-learning in a way that draws upon the best thinking and best practices from other industries.”
Beyond e-learning, Mr Menary has extensive experience in HACCP, risk and quality management systems. “These involve a systematic approach to assessing performance and achieving continuous improvement”, Mr Menary said. “This approach has been missing from e-learning because most data and systems are focussed on the learner. This can be useful, but ultimately, it’s a bit like trying to understand a tennis match when you’re seeing only one of the players and one side of the tennis court.”
So why does assessment automation matter? “It’s about advancing the assessment process with new technology-based tools, data and innovative approaches,” said Mr Menary. “Through our research projects, we’re starting to see that automation tools can transform the role of assessors, amplifying their ability to deliver value, both for students, and across the entire e-learning process.”
Enrolo CEO Adam Menary talked with e-learning analyst Paul Wilson over coffee and Skype.
Question: What is it about assessment that has been missing from e-learning?
Enrolo CEO Adam Menary:
Two things. Firstly, we’ve missed the opportunity to leverage the knowledge, value and insights that assessors can provide.
In the drive to get e-learning content online and to get student numbers, the actual assessment process has received little attention. Without new assessment tools to match the pace and scale of work generated by learner-side technology advances, the ability of assessors to contribute value has been gradually diminishing. This can lead to a cost-focussed view of assessment, and with that, risks to learning outcomes – for example, through oversimplification of questions to reduce assessment costs, or reduced quality of assessment and feedback.
More broadly, in a competitive online sales environment this situation risks an unhealthy low cost and low value “race to the bottom” scenario for the future of e-learning.
However, it does not need to be this way.
The assessor is the person on the other side of the screen in a student’s e-learning experience. They can be the drivers of e-learning value and continuous improvement. They have front-line contact with students; they provide assessments and feedback that helps students confirm and improve their understanding; they also gain important insights into the strengths and weaknesses of e-learning content. They have the knowledge and insights to make a difference – and that’s usually the reason they’ve got into education in the first place.
Secondly, capturing rich data in the assessment process is essential to an integrated and scientific approach to continuous improvement across the entire e-learning process and organization. While other data is also important, data from the assessment process complements learner data, and helps provide a more complete view of overall e-learning performance.
Question: What do you mean by an “integrated and scientific approach”?
Enrolo CEO Adam Menary:
HACCP (food safety) systems use continuously-collected data to help ensure that food is safe to eat and free from contamination. Historically, e-learning has lacked that grade of data. For example, while SCORM collected data showing that a student had successfully completed an assessment, it might not be configured to keep track of which questions they answered correctly, which they failed, where they got help or feedback, and whether that made a useful difference. This meant the data didn’t support efforts to statistically or scientifically track down areas where issues were occurring, or to identify whether changes were actually improvements.
The limitations of SCORM have been widely recognised, and xAPI emerged to address some of these. However, the focus remains on the learner and capturing learning-related data. Complementing this with data specifically about the assessment process can do two things: support work to streamline and improve assessment activities, and also provide insights about the overall performance of e-learning.
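To make the contrast concrete, here is a minimal sketch of the difference between a completion-only record and a richer xAPI-style statement about a single assessment question. All field values and structure details below are illustrative assumptions for the sake of the comparison, not any particular product’s data model (the `answered` verb URI is from the standard ADL verb vocabulary).

```python
# Sketch: a coarse SCORM-style completion record vs a richer
# xAPI-style statement about one assessment question.
# Field values are illustrative assumptions only.

scorm_style = {
    "learner": "student-42",
    "course": "food-safety-101",
    "assessment_passed": True,   # a completion flag only -- no per-question detail
}

xapi_style = {
    "actor": {"name": "student-42"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/answered"},
    "object": {"id": "course/food-safety-101/quiz-3/question-7"},
    "result": {
        "success": False,        # which question failed...
        "response": "option-b",  # ...and what the learner actually answered
    },
    # was help or feedback provided at this point? (assumed extension name)
    "context": {"extensions": {"feedback_given": True}},
}

# The statement form supports question-level analysis that a
# pass/fail flag alone cannot.
assert "result" in xapi_style and "assessment_passed" in scorm_style
```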
Question: Do you have any thoughts on how this might affect assessment from a learner’s perspective?
Enrolo CEO Adam Menary:
Yes. Multiple choice questions, true/false, checklists and the like are basic online forms of traditional assessment approaches, but today educators should view them as simple “check your progress” tools rather than actual assessments. And sure, like others, Enrolo has done some interesting and innovative things with interactive check-your-progress activities. However, across the industry, e-learning has been constrained by a particular horizon of assessment possibilities, based on the costs, abilities, and practicalities of an assessor working in a somewhat traditional manner. Automation, data and tools can help expand that horizon of assessment possibilities – and so offer further opportunities to transform e-learning.
And really, this is essential. Some of e-learning’s traditional assessment approaches are not that great. They can be blunt and ineffective tools for confirming a learner’s understanding.
We’re starting to explore ways to completely change assessment tasks, and learning content, based on the new possibilities of automation-enabled assessment. For example, what might it mean to have more free-form styles of assessment task, or much faster cycles of submission and feedback? Or imagine assessment feedback going beyond comments – for example, using it to dynamically reconfigure e-learning content to leverage a student’s strengths, or to provide extra examples and explanations that address areas of weakness.
As the traditional true/false, pick-from-a-list, or drag-and-drop question types are relegated to check-your-progress aids, I think we’re going to see some really important innovations and powerful new approaches to assessment and meaningful learning outcomes.
Today’s corporate leaders recognise that the pace of technological change is creating a major skills problem for their organizations. In a recent study, Deloitte found that 90% of CEOs believe their company is facing disruptive change driven by digital technologies, and 70% say their organization does not have the skills to adapt (Deloitte 2017:30). Indicating some of the factors driving this problem, Deloitte points to the changing nature of a career, as follows.
The changing nature of a career (Deloitte 2017:30)
Clearly, with such frequent job changes and rapid skill obsolescence, there’s a significant demand placed on continuous learning and skills development. Thus, in many ways, individual and organizational performance now critically depends on the capability of e-learning providers to meet these demands. But how can e-learning leaders and innovators respond to this problem?
It’s useful to first get a little historical perspective.
The skills challenges outlined by Deloitte are the modern outcome of a long-term trend.
In earlier times, widely-used technologies and skills would remain mostly unchanged for generations. People could learn everything they needed from their parents and grandparents. Broader knowledge was sparse and slow-changing. For example, in Europe, Burke (1985) explains that Martianus Capella’s (5th century) book on the seven liberal arts became “all the education there was for the next 700 years”.
In later times, as knowledge and technologies started to change, schools and universities provided intensive bursts of education – ideally enough to last a lifetime. However, the pace of change has increased, leading to the situation described by Deloitte. For an overall perspective, the long-term trend can be understood by looking at the relationship between human lifespans and technology lifespans, as shown below.
Human lifespan vs technology lifespan (adapted from Scholtes 1988:31)
In modern times, things just move too quickly for many of our traditional approaches to education. Intensive bursts of formal education can no longer last a lifetime. Job descriptions change and blur. Key knowledge and skills become obsolete as entire industries fade and others emerge.
In these circumstances, there is a growing number of reasons why traditional education can’t keep up. And even faster-paced e-learning needs solid innovation and transformation.
The following sections explore ways that e-learning might respond to these challenges.
It takes time to develop quality learning content – to make sense of a domain of activity, to distil the relevant insights, and to design content that helps establish these in the mind and practices of a learner. It can be time-consuming and costly. Given the rate of technological and skills change above, the challenges of content development are likely to increase. Deloitte (2017) points out one pathway forward:
“The good news is that an explosion of high-quality, free or low-cost content offers organizations and employees ready access to continuous learning. Thanks to tools such as YouTube and innovators such as Khan Academy, Udacity, Udemy, Coursera, NovoEd, edX, and others, a new skill is often only a mouse click away.”
This suggests a pathway involving something more along the lines of e-learning content curation, drawing upon the best of what already exists and is freely available. Another related strategy might be to parallel the concept of “mashups”, creatively mixing this free content to support a specific learning need and context.
Such approaches will prove vital – but they will not be enough. This is because, historically, learning content authors have been able to draw insights from already-established and widely-accepted bodies of (mostly published) knowledge, often developed over generations. However, this approach is under threat because it takes time for these bodies of knowledge to form; indeed, in times of rapid technological change, they may never reach that point before another technological disruption occurs. This suggests a future where learning shifts to the workplace front line, actively supporting knowledge development and application in everyday activities.
Another response to the pace of change is to focus on a smaller and ongoing stream of workplace and on-demand learning activity – in essence, learning all the time, using micro-learning and micro-credentials. Eades (2014) explains micro-learning as “delivering content in small, very specific bursts”; on the same theme, micro-credentials provide formal recognition for skills and knowledge gained from on-the-job experience, which can be “stacked” to form recognised qualifications (O’Keefe 2016). Practically, though, the micro-learning approach faces some significant challenges.
A challenge for the micro-learning approach is keeping track of an individual’s learning activities. One approach was explored in the Institute for the Future’s “Learning is Earning 2026” project. This project considered a very broad approach to learning and learning providers, with some interesting possibilities (see the project video, below) – and a central part of it is what they called “the Ledger”. They explain:
“In this future, the currency of learning is tracked and traded on a digital platform called the Ledger. It’s a complete record of everything you’ve ever learned, everyone you’ve learned from, and everyone who’s learned from you. The Ledger not only tracks what you know – it also tracks all of the projects, jobs, gigs, and challenges you’ve used that knowledge to complete.” (http://www.learningisearning2026.org/)
The foundation envisaged for the ledger concept is blockchain – the technology that underpins crypto-currencies like Bitcoin. There are various explanations of its potential for learning, but the essence is that it provides a way to keep an accurate and trusted record of historical activity – in the learning context, records of the activities a learner has completed.
The reality is that this technology faces some hurdles to being applied in a learning context. For example, the Bitcoin blockchain handles relatively simple and well-defined information: it deals with currency transactions. In contrast, a learning-focussed application of blockchain may need to record information about a learner, the person or authority certifying the learning, the deliverer of the learning, the content and version of the learning, and potentially submitted work and assessment information. Meanwhile, Flipo and Berne (2017) point out that the computer facilities that underpin the Bitcoin blockchain already consume electricity on a scale equivalent to the country of Ireland, and that even widespread adoption of Bitcoin would have grave energy consequences globally. A blockchain of complex learning data would, in all likelihood, quickly surpass Bitcoin’s energy impact.
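The core mechanism behind the ledger idea is hash-chaining: each record embeds a cryptographic hash of the record before it, so any later tampering with history invalidates every subsequent link. Here is a minimal sketch of that idea, assuming illustrative record fields (a real blockchain adds distributed consensus, signatures, and much more):

```python
import hashlib
import json

# Minimal sketch of hash-chaining, the idea behind a tamper-evident
# learning "ledger". Record fields are illustrative assumptions only.

def chain(records):
    """Link each record to the previous one via a SHA-256 hash."""
    ledger, prev_hash = [], "0" * 64  # conventional all-zero genesis hash
    for rec in records:
        payload = json.dumps({"prev": prev_hash, **rec}, sort_keys=True)
        own_hash = hashlib.sha256(payload.encode()).hexdigest()
        ledger.append({"prev": prev_hash, "hash": own_hash, **rec})
        prev_hash = own_hash
    return ledger

ledger = chain([
    {"learner": "student-42", "activity": "unit-1-completed"},
    {"learner": "student-42", "activity": "assessment-3-passed"},
])

# Altering the first record would change its hash and break the
# "prev" link stored in the second record.
assert ledger[1]["prev"] == ledger[0]["hash"]
```

Note that even this toy version hints at the data-volume concern above: each learning record, plus its chaining metadata, is far richer than a simple currency transaction.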
Beyond these issues, the need and potential for strategies that address accelerated skills and learning needs remains. There will be ways forward. And while some details are yet to be established, the scenarios contemplated in the Institute for the Future’s project provide some interesting glimpses into ways that micro-learning, rapid skill development, and ubiquitous workplace and lifelong learning might play a role.
[YouTube video: https://www.youtube.com/watch?v=DcP78cLPGtE]
Paul Wilson, e-learning consultant
Burke J (1985), The Day the Universe Changed, BBC, London.
Deloitte University Press (2017), “Rewriting the rules for the digital age”, https://www2.deloitte.com/content/dam/Deloitte/global/Documents/HumanCapital/hc-2017-global-human-capital-trends-gx.pdf
Eades J (2014), “Why Microlearning is huge and how to be a part of it”, eLearningIndustry.com, accessed at: https://elearningindustry.com/why-microlearning-is-huge
O’Keefe D (September 21, 2016), “Old ways not enough to gain workplace credentials”, The Australian, accessed at: http://www.theaustralian.com.au/higher-education/old-ways-not-enough-to-gain-workplace-credentials/news-story/286a526a99fc866f96f99cdac2c6981a
Flipo F & Berne M (2017), “The bitcoin and blockchain: energy hogs”, The Conversation, accessed at: https://theconversation.com/the-bitcoin-and-blockchain-energy-hogs-77761
Learning is Earning 2026: http://www.learningisearning2026.org/
Scholtes PR (1988), The leader’s handbook: a guide to inspiring people and managing the daily workflow, McGraw-Hill, New York.
Takahashi D (December 7, 2016), “New York Times columnist Thomas Friedman tells us how to live in accelerated times”, Venture Beat, accessed at: https://venturebeat.com/2016/12/07/new-york-times-columnist-thomas-friedman-tells-us-how-to-live-in-accelerated-times/
Disclosure statement – Enrolo is a custom online training platform that complies with all of the best practice integrity measures detailed in this document. At Enrolo we believe that, for the betterment of the online training industry as a whole, all online training providers of nationally recognised training must implement these integrity measures as a minimum.
To ensure that learning outcomes are achieved in an online environment, certain minimum integrity measures must be in place. Integrity measures are also needed to reduce fraudulent activity. Whilst examples of industry best practice guidelines for online training do exist, many training providers are either not aware of them or, in some cases, are deliberately not implementing them to cut costs. And whilst there are examples of regulators stipulating mandatory online integrity measures, these requirements are often not enforced due to a lack of resources or technical understanding. Training providers that are not complying with best practice online integrity measures are producing what is known as “page turning” or “tick and flick” learning, with assessments such as true/false or multiple choice that can be passed by guessing. Providers that cut corners with online training are leading a race to the bottom in terms of the quality of training and assessment, and ultimately the student’s value for money. The result of this lack of integrity is not only poor learning outcomes and inadequate training, leading to inefficiencies in the workplace, but also damage to the reputation of the online or “e-learning” industry as a whole. This best practice guide is intended to provide regulators and online providers with the integrity measures that must be in place to ensure the integrity of online learning outcomes.
Automation in the training profession is continuing to advance at an ever-increasing speed. Examples include interactive videos, free-text auto marking, interactive video assessment and virtual reality scenarios for the demonstration of skills. The Experience API (xAPI) is also allowing online training to move beyond the boundaries of formally structured content development.
Many qualifications that are taught using sophisticated online training methods can already deliver the same, if not better, learning outcomes than in-class training. Since online training innovation is growing at an exponential rate, it is advantageous if training providers are allowed to author their own learning materials to keep up with these changes. In the future, regulators that attempt to provide learning material to training providers may face increased costs and struggle to keep up with the rapidly evolving innovation. An alternative approach regulators could consider would be to provide elements, such as videos or interactive content, that must be included as part of a course.
From a training provider perspective, it is essential to realise that to achieve efficient and effective online learning, content and assessment are inseparable. Assessments are directly related to the learning material, and achieving accurate mapping to a unit of competency means assessment questions must be written in a way that may require regular edits to the content. Furthermore, if an assessment question is causing confusion or returning a high rate of incorrect answers, it may be the content that needs editing rather than the question. Another issue with content provided by third parties is that, if those third parties are renamed or restructured, they often lose the funding to maintain the content – and training providers are then unable to update it for name or content changes themselves, as there are often IP restrictions on authored content.
There is a perception that online training is a way for training providers to cut costs and increase revenues. However, if a training organisation is meeting the requirements of vocational education standards then, despite the reduced cost of hiring classrooms, the cost of delivering effective online training is equal to, if not greater than, in-class training. This is because all of the existing requirements – qualified trainers and assessors, assessment material, mapping, internal audits, continuous improvement registers and so on – still need to be in place, and in addition online providers need to purchase and run an online learning environment and build online learning material and robust assessments. Effective and engaging online training includes videos, games, interactive scenarios, and animations with voice-overs, as well as challenging exercises and additional reading materials. If the online content is achieving real learning outcomes, it can cost up to AUD$30,000 per hour of e-learning output. Many “online providers” cut corners by developing simple e-books, or a PDF with a voice-over and a next button between pages. Whilst it is possible to map such material to the elements and criteria of a unit within a training package, there needs to be more emphasis on demonstrating, through rigorous assessment, that a student can achieve competency from such material. If high-integrity online training does not reduce costs for training providers, then what are the benefits?
To achieve integrity, training providers must use a Learning Management System (LMS) sophisticated enough to collect integrity data and track student activity and progress through the learning material and assessments. Whilst some LMSs can track the amount of time a student has interacted with learning content, how many attempts they have made at each assessment, and the result of each attempt, other LMSs may only record which course sections have or have not been accessed and whether assessments have or have not been successfully completed – that is, without capturing the granularity of the data. It is important that regulators understand these differences and improve integrity by requiring high-granularity tracking of student interactions and assessment. Without this granularity of data from the e-learning environment, it is much harder to demonstrate that integrity has been maintained.
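As a rough sketch of the difference in granularity, consider the kind of per-attempt event log a high-granularity LMS might keep. The field names below are illustrative assumptions, not any particular LMS’s schema; the point is that questions like “how many attempts, on which questions, after how much content time?” can only be answered from event-level data:

```python
from collections import Counter

# Sketch: per-attempt events a high-granularity LMS might record.
# A coarse LMS would store only the final pass/fail flag.
# Field names and values are illustrative assumptions.

events = [
    {"student": "s1", "question": "q1", "attempt": 1, "correct": False, "seconds_on_content": 40},
    {"student": "s1", "question": "q1", "attempt": 2, "correct": True,  "seconds_on_content": 95},
    {"student": "s1", "question": "q2", "attempt": 1, "correct": True,  "seconds_on_content": 30},
]

# Questions that needed multiple attempts may signal confusing
# content or a weak question -- exactly the audit trail described above.
attempts_per_question = Counter(e["question"] for e in events)
total_content_seconds = sum(e["seconds_on_content"] for e in events)

assert attempts_per_question["q1"] == 2  # q1 took two attempts
```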
Confirming student identity in an online setting is of paramount importance in reducing fraudulent activity, i.e. by ensuring that the person doing the training is the person obtaining the Statement of Attainment/Certificate.
Fingerprint and retina scans are becoming more prevalent; however, until these are more widely accepted, the following procedures and features are considered best practice to verify that the student who enrols in an online course is the person who completes the course and receives the certification:
Students must provide, via a secure online website, a copy of current government-issued photo ID, which is reviewed and accepted or rejected by trained staff. Details captured and verified must include the student’s full name, date of birth, and contact details (including residential address, a valid email address, and a contact phone number).
The residential address of the student must be validated. This can be improved by using an online address validation service or address look-up tool.
Students must not be allowed to change personal details themselves – they need to lodge a request and provide evidence, such as government-issued photo ID, marriage certificate, change of name certificate, etc. where required.
For administrators, the valid photo ID should be displayed against a student’s account when videos are being assessed, or during video conferencing, so trainers can confirm that they are speaking with the student who made the recording.
In Australia it is a government requirement for nationally recognised training that, before completing their enrolment, students must obtain a Unique Student Identifier (USI) whose details match the valid photo identification provided. The same matching USI details are required before students can be issued with their Statement of Attainment/Certificate (qualification).
Students must create their own unique username, such as an email address.
If an overly complex password is provided to the student, they will need to write it down, which reduces the password’s security – so students must create a secure password themselves.
Students must be provided with best practice information regarding setting up and maintaining the security of their passwords.
Student password recovery must work via a link that sends recovery details to the student’s registered email address. Passwords must not be able to be changed without the student logging into the account.
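The standard pattern behind the emailed-link requirement is a random, single-use, expiring reset token. A minimal sketch follows, assuming an in-memory store and a 15-minute window (both illustrative; a real system would persist tokens and send the link by email):

```python
import secrets
import time

# Sketch of email-link password recovery: issue an unguessable,
# single-use token with an expiry; only a valid token permits a reset.
# The store and expiry window are illustrative assumptions.

RESET_WINDOW_SECONDS = 15 * 60
_tokens = {}  # token -> (username, issued_at)

def issue_reset_token(username):
    token = secrets.token_urlsafe(32)   # cryptographically random, URL-safe
    _tokens[token] = (username, time.time())
    return token  # this value would be embedded in the emailed link

def redeem_reset_token(token):
    entry = _tokens.pop(token, None)    # single use: removed on redemption
    if entry is None:
        return None                     # unknown or already-used token
    username, issued_at = entry
    if time.time() - issued_at > RESET_WINDOW_SECONDS:
        return None                     # expired
    return username

token = issue_reset_token("student-42")
assert redeem_reset_token(token) == "student-42"
assert redeem_reset_token(token) is None  # a token cannot be reused
```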
Identification during phone support
When a student contacts a training provider by phone or email requesting help, the provider must have procedures in place to check their identity before providing any account details or assistance.
Students must agree to terms and conditions at the start of an online course, confirming that they are the person doing the training. This can be done via a statutory declaration or equivalent.
Once the course is complete, and before generating and accessing the final PDF Statement of Attainment/Certificate, the student must declare that they were the person who did the training and that they did so without assistance. This declaration must be recorded in the electronic management system and be able to be provided as evidence in the case of legal breaches.
Assessments must be mapped to a unit of competency to demonstrate that all of the required learning elements have been assessed.
In an online environment assessments that are designed to provide a true reflection of the knowledge and skills that have been achieved are one method of proving competency has been gained. Offline activities can also be undertaken and witnessed by a suitably qualified supervisor.
In an online environment in order for assessments to demonstrate that a student has achieved understanding and competency they must have the following integrity measures:
In contrast to assessments that comply with the above integrity measures, assessments that are 100% computer-marked and predominantly multiple choice or true/false by elimination provide no requirement or opportunity for a student to demonstrate their comprehension of the learning material. Whilst it can be argued that eventually getting a question right is a form of learning, allowing students to keep attempting questions until they hit the correct answer does not deliver meaningful learning outcomes. Demonstrated competency through rigorous and auditable assessment, in combination with the above integrity measures, is considered an integral requirement for the demonstration of online learning outcomes.
Online training providers may try to cut corners and costs by using administration staff with no training qualifications to assist with training. To cut costs, training providers may also restrict access to qualified trainers to very short periods. It is essential that qualified trainers are available in an online environment. These trainers may communicate with students via:
Integrity measures include:
The following integrity measures must be in place to minimise the potential for fraudulent activity:
The volume of learning includes guided learning, individual study, research, learning activities in the workplace and assessment activities. It could be argued that the volume of learning should be applied at a unit of competency level rather than at qualification level, and that the deciding factor in assessing competency should be the quality of the content and its mapping to a rigorous assessment, rather than a minimum duration, as the AQF goes on to say:
“The duration of the delivery of the qualification may vary from the volume of learning specified for the qualification. Providers may offer the qualification in more or less time than the specified volume of learning, provided that delivery arrangements give students sufficient opportunity to achieve the learning outcomes for the qualification type, level and discipline.”
This could easily be applied to a competency rather than the whole qualification.
When looking at VET training, we need to remember that students come from hugely varied backgrounds and experience – from Year 11 students undertaking a VETiS program, to someone with a PhD looking for a career change, to an older person with wide-ranging life experience. This all affects how quickly they can become competent: not only their level of experience and the skills they already have or need to learn, but also their ability to undertake and comprehend the training in the first place.
While setting minimum durations based on the time a learner (who is new to the industry area) would be required to undertake supervised learning and assessment activities (ASQA report A review of issues relating to unduly short training, page 15 https://goo.gl/z9v6jr) has merit, the dilemma remains that some people undertaking formal study will already be to some degree competent and not require the same minimum duration.
In terms of auditing the appropriate volume of learning on a student-by-student basis, training providers using data from the AVETMISS enrolment should be able to demonstrate whether a student is new to the field and requires what could be considered minimal learning hours, as opposed to someone who is requalifying and could reasonably be expected to complete a competency in a much shorter time frame. That is, by evaluating the needs of the student from the language, literacy and numeracy (LL&N) assessment together with the AVETMISS enrolment data, a training provider should be able to specify the expected or minimum volume of learning required. In a classroom this could be part of the learning plan. In an online environment this tailored learning plan must be established and managed in a dynamic way – for example, by having certain locks, or minimum interactive learning hours that must be completed before assessment is made available.
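The dynamic lock described above can be sketched very simply: derive a tailored minimum of interactive learning hours from the learner’s profile, and keep the assessment unavailable until that minimum is met. The profile field and the hour thresholds below are illustrative assumptions, not AVETMISS fields or regulatory figures:

```python
# Sketch of a tailored learning-plan lock: assessment stays unavailable
# until a learner-specific minimum of interactive hours is completed.
# Profile fields and thresholds are illustrative assumptions only.

def minimum_hours(profile):
    """Tailored volume of learning from enrolment data (assumed field)."""
    return 2.0 if profile.get("experienced_in_field") else 8.0

def assessment_unlocked(profile, hours_completed):
    return hours_completed >= minimum_hours(profile)

new_learner = {"experienced_in_field": False}   # new to the field
requalifier = {"experienced_in_field": True}    # requalifying

assert not assessment_unlocked(new_learner, 3.0)  # still locked
assert assessment_unlocked(requalifier, 3.0)      # shorter pathway met
```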
Copying or editing statements of attainment can be done no matter how they have been issued. However if statements of attainment are issued online the following integrity measures must be in place to minimise fraudulent activity: