There is an ever-increasing focus on learner performance and the data our learning management systems (LMSs) and facilitators collect. By adding a few new tools to your learning tech stack, you can collect more robust data, manage and analyze it in meaningful ways, and then share your findings with stakeholders. These tools can give you insight that extends beyond your learning ecosystem.

First, let’s discuss some definitions:

  • Learning Management Systems (LMSs): platforms where learning content, whether on-demand eLearning courses, files or live virtual instructor-led training (VILT) sessions, is stored and consumed by learners. LMSs contain built-in tools to collect simple learner data such as enrollment date, completion date, quiz or test results and time to completion.
  • Sharable Content Object Reference Model (SCORM): the format of most eLearning courses and objects stored in an LMS. Most, although not all, learning content is packaged as SCORM files; it is the most common eLearning file format. Popular tools like Articulate Rise, Adobe Captivate and Lectora all have options to export courses in SCORM formats (i.e., SCORM 1.2 and SCORM 2004).
  • Learning Record Store (LRS): a specially designed database that receives, stores and provides access to learner data from a wide variety of sources, not just an LMS. An LRS will not replace an LMS; it functions alongside it, collecting data as it passes through. The LMS still hosts the learning content and retains basic learner data, while the LRS collects and stores more robust data such as how long a learner spent on a certain page, which option they clicked first on a quiz or test, and how many attempts it took them to pass.
  • Learning Analytics Platform (LAP): an optional add-on to an LRS, the LAP helps you make sense of the stored data and provides analytic insights via dashboards, content recommendations and artificial intelligence (AI)-assisted conclusions.
  • Experience API (xAPI), or Tin Can API: a format that eLearning (and other learning content) can be exported in as an alternative to SCORM. SCORM exposes only basic learner activity (i.e., enrollment date, completion date, quiz or test result and time to completion), but xAPI can track all of that and much more, such as multiple scores, team learning, blended or informal learning, offline learning, and detailed quiz or test results. The biggest advantage of xAPI is that it can track learning no matter where it occurs; it does not need an LMS or even a web browser to function.

Note: More advanced practitioners may write their own custom xAPI statements to collect learner data, but for the purpose of this article, we will just refer to the xAPI export option of eLearning authoring tools, such as Articulate Rise, Adobe Captivate and Lectora.
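To make this more concrete, here is a minimal sketch (in Python) of the "actor, verb, object" statement structure that xAPI-enabled content or custom code sends to an LRS. The LRS endpoint, credentials, learner and activity identifiers are hypothetical placeholders, not values from any particular product:

```python
# A minimal xAPI statement sent to an LRS. The LRS endpoint, credentials,
# learner and activity identifiers are hypothetical placeholders.
import requests

statement = {
    "actor": {
        "name": "Jane Learner",
        "mbox": "mailto:jane.learner@example.com",
    },
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/passed",
        "display": {"en-US": "passed"},
    },
    "object": {
        "id": "https://example.com/courses/safety-101/final-exam",
        "definition": {"name": {"en-US": "Safety 101 Final Exam"}},
    },
    "result": {
        "score": {"scaled": 0.83},
        "success": True,
        "completion": True,
    },
}

resp = requests.post(
    "https://lrs.example.com/xapi/statements",   # hypothetical LRS endpoint
    json=statement,
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("lrs_key", "lrs_secret"),              # placeholder credentials
)
resp.raise_for_status()
```

In practice, an authoring tool's xAPI export generates statements like this for you; writing your own simply gives you control over exactly what gets recorded.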

An LRS also enables you to use data visualization or dashboard tools, such as Tableau or Power BI, to compare your learner data with other data collected by your organization. Because the LRS stores learner data in a consistent, structured format, you can read it alongside other data sources such as data lakes and enterprise data warehouses (EDWs). Analyzing all of this data in one place allows you to spot trends in performance, patterns of change over time and insights that will inform the next era of content design.
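As a rough illustration of that workflow, the sketch below pulls statements from an LRS, flattens them into a table and joins them with a hypothetical performance extract before handing the result to a BI tool. The endpoint, credentials, file name and column names are assumptions for illustration, not any specific vendor's API:

```python
# Sketch: pull xAPI statements from an LRS, flatten them and join them with
# other organizational data. The LRS endpoint, credentials, file name and
# column names below are illustrative assumptions.
import pandas as pd
import requests

resp = requests.get(
    "https://lrs.example.com/xapi/statements",   # hypothetical LRS endpoint
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("lrs_key", "lrs_secret"),              # placeholder credentials
    params={"verb": "http://adlnet.gov/expapi/verbs/passed"},
)
resp.raise_for_status()
statements = resp.json()["statements"]           # paging via "more" omitted for brevity

# Flatten the nested statements into a tabular form.
learning = pd.DataFrame([
    {
        "learner": s["actor"].get("mbox", "").replace("mailto:", ""),
        "activity": s["object"]["id"],
        "score": s.get("result", {}).get("score", {}).get("scaled"),
        "timestamp": s.get("timestamp"),
    }
    for s in statements
])

# Join with a hypothetical extract from the enterprise data warehouse
# (assumed columns: learner, quarterly_performance).
performance = pd.read_csv("quarterly_performance.csv")
combined = learning.merge(performance, on="learner", how="inner")

# Hand the combined data to Tableau or Power BI, or analyze it directly.
combined.to_csv("learning_vs_performance.csv", index=False)
```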

Here are some process flow examples:

An LMS using SCORM content:

An administrator uploads an eLearning course with a built-in final exam, exported in SCORM format, into their LMS. A learner enrolls themselves, passes the final exam with a score of 83% and completes the course within seven days of enrollment. Information that the LMS and SCORM file cannot tell us includes:

  • how many attempts it took the learner to pass the exam;
  • whether there are questions that the majority of learners consistently get wrong;
  • which areas of the content the learner took the longest to consume;
  • which areas of the content the learner spent the least time on; and
  • how many times the learner paused and resumed their session (logging in and out, or even just switching tabs).
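For comparison, here is roughly what the flat, SCORM-derived record for this scenario might look like; the field names are illustrative, as actual report exports vary by LMS:

```python
# Roughly what a flat, SCORM-derived LMS report row looks like for this
# scenario. Field names are illustrative; actual exports vary by LMS.
scorm_report_row = {
    "learner": "jane.learner@example.com",
    "enrollment_date": "2024-03-01",
    "completion_date": "2024-03-08",
    "status": "passed",
    "score": 83,
    "total_time": "04:12:37",
}
# Attempt counts, per-question results, time spent per page and
# pause/resume behavior are not part of this record.
```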

An LMS using xAPI content that passes through an LRS:

An administrator uploads an eLearning course with a built-in final exam, exported in xAPI format, into their LMS, which has an attached LRS. A learner enrolls themselves, spends about two hours in each section of the course, attempts the final exam three times, passes it with a score of 83%, and completes the course within seven days of enrollment.

Additionally, we can see that every learner who attempted the final exam failed the first true-or-false question. On closer examination, it turns out that this question was programmed incorrectly, and the administrator uploads a corrected course file.
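Here is a sketch of how an administrator or analyst might surface that pattern by querying the LRS directly; the endpoint, credentials and question activity ID are hypothetical placeholders:

```python
# Sketch: query the LRS for learners' answers to a specific exam question.
# The endpoint, credentials and question activity ID are hypothetical.
import requests

QUESTION_ID = "https://example.com/courses/safety-101/final-exam/question-1"

resp = requests.get(
    "https://lrs.example.com/xapi/statements",
    headers={"X-Experience-API-Version": "1.0.3"},
    auth=("lrs_key", "lrs_secret"),
    params={
        "verb": "http://adlnet.gov/expapi/verbs/answered",
        "activity": QUESTION_ID,
    },
)
resp.raise_for_status()
answers = resp.json()["statements"]

incorrect = [s for s in answers if s.get("result", {}).get("success") is False]
print(f"{len(incorrect)} of {len(answers)} recorded answers to question 1 were incorrect")
```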

Conclusion

The practice of using these evidence-based data insights to make changes to learning material is referred to as “learning engineering,” a developing, human-centered approach to learning analytics and learning operations being driven by researchers at Arizona State University’s Learning Engineering Institute and Carnegie Mellon’s Open Learning Initiative and LearnLab, among others. By leveraging learning technologies and tools like those outlined in this article, you will be better positioned to gain meaningful insights that help you report training impact to key stakeholders and gain buy-in for future programs.