The ongoing work of learning standards

At the dawn of the video recording age, a battle raged over which standard would prevail: VHS or Betamax. Sony's Betamax was technically the better standard, but Sony kept it to itself. JVC's VHS wasn't quite as good, but JVC licensed it openly. The outcome: VHS won. As Steven Johnson tells us in his book Where Good Ideas Come From, the openness of standards fosters innovation.

Standards provide several benefits. With an agreed-upon protocol, things can work together, and others can build value-added services on top of them. They make collaboration easier. Consider the internet: developments on top of the TCP/IP protocol, such as SMTP and HTML, allowed a variety of tools to work together to bring us email and the world wide web. For interactivity, we originally used Flash, and now have HTML5 as a more secure and reliable replacement.

All of which is a preface to talking about learning standards, a field that is quite dynamic right now. After initial efforts in 1993, and bursts of energy circa 2000 and again in 2004, we're seeing a new resurgence of activity and interest. And you should be paying attention.

Learning Standards

Initially, when you developed eLearning, you were pretty much dependent on the tool you developed it in. You could use any of the many authoring tools, such as Authorware, but you needed the matching player. With the advent of Flash, many authoring tools started creating output that could be played by the Flash player, which meant you didn't need tool-specific software to run the courses, just the Flash player (if your IT organization would let you have it).

Looking for a more systematic approach, the aviation industry created a subcommittee to develop standards for computer-based training: the Aviation Industry CBT Committee (AICC). Then the IMS (an initiative of Educause before being spun out as an independent effort) started creating standards, as did the IEEE (Institute of Electrical and Electronics Engineers), the major body in the US responsible for computing standards. The goal was interoperability of content, so that content created by anyone could play on anyone's system, and vice versa.

The problem with committee work is well known; in this case, vendors who participated wanted the standard to make it easy for their existing content to be made compliant, and academics fought for a variety of theoretical positions. It was all (relatively) well-intentioned, but it was taking a long time.

SCORM

The Advanced Distributed Learning (ADL) initiative of the Department of Defense finally had enough. They wanted a standard, pulled a plausible set of specifications from the existing work, and called it the Sharable Content Object Reference Model (SCORM). And because of the 800 lb. gorilla factor that is the US government, it got traction.

SCORM was updated in 2004, and that is the existing standard for course delivery.  The ADL provided compliance testing, so that content and systems could be validated for interoperability. Vendors got on board, and SCORM became a more-or-less viable mechanism.

The core of SCORM is the ability to launch a course, perhaps made up of a variety of components, and then receive an indication that the course has been completed, possibly along with a score, whether pass/fail or some other assessment mark. It can also store interim status, so learners can stop and restart as needed. This is both the boon and the bane of the standard.
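To make that concrete, here is a minimal sketch of the runtime conversation, assuming a SCORM 2004 LMS that exposes the standard API_1484_11 object to the launched content. The function and data-model names come from the SCORM 2004 run-time specification; the surrounding scaffolding is illustrative, not taken from any particular product.

```typescript
// Minimal sketch of a SCORM 2004 content object reporting its status.
// Assumes the LMS has injected the standard API_1484_11 runtime object
// into an ancestor window, per the SCORM 2004 run-time environment.

interface Scorm2004API {
  Initialize(param: ""): "true" | "false";
  GetValue(element: string): string;
  SetValue(element: string, value: string): "true" | "false";
  Commit(param: ""): "true" | "false";
  Terminate(param: ""): "true" | "false";
}

// Walk up the window hierarchy to find the LMS-provided API object.
function findAPI(win: Window): Scorm2004API | null {
  let current: Window = win;
  for (let i = 0; i < 10; i++) {
    const api = (current as any).API_1484_11;
    if (api) return api as Scorm2004API;
    if (current === current.parent) break;
    current = current.parent;
  }
  return null;
}

const api = findAPI(window);
if (api) {
  api.Initialize("");

  // Restore any interim state saved from a previous session.
  const resumeState = api.GetValue("cmi.suspend_data");

  // ...learner works through the content...

  // Report completion, success, and a scaled score (0..1).
  api.SetValue("cmi.completion_status", "completed");
  api.SetValue("cmi.success_status", "passed");
  api.SetValue("cmi.score.scaled", "0.85");

  api.Commit("");
  api.Terminate("");
}
```

Note what the data model covers: completion, success, and score for a course. That scope is exactly the point of the next few paragraphs.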

The ability to track learning, critical to a Learning Management System (which really is a course management system), requires knowing who has done what, and how well. If we need to track compliance, we may need to know not only that a person completed a course, but also how much time they spent or whether they finished successfully. And for critical business skills, we want to know whether, and how well, someone performed.

Which is exactly what SCORM does, with a limitation: SCORM works for courses, and that's pretty much it. Yet there's more.

Quite simply, there's more to learning than courses. Prompted by web activity streams, people started thinking about tracking other forms of activity. People access resources, e.g., how-to videos and job aids, that may affect their success. They may ask questions of others. This is all learning.

The recognition of the importance of informal learning triggered a desire to look for more.  A concern with what people do, not just what they learn, was also emerging. What about interviews, projects, job aids, searches, and all the other things that could contribute to an individual’s learning and ability to perform?  Something more was needed.

xAPI

ADL decided to create a new standard, the Experience API (xAPI; originally known as Tin Can). An API, or Application Programming Interface, provides a way for an application to make its data available to others. The application specifies a particular syntax that it will respond to; another application can then query it for information and do something with the result. It's a consistent way to share data.

Integrating a consistent mechanism for communication with a desire to track human activity is the underlying concept. With a very simple syntax of [who] [did] [what] (e.g., [Lee] [viewed] [the troubleshooting video]), xAPI gives systems a way to report data about what people do. The goal is to track a richer suite of information about behaviors.
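As a sketch of what that looks like on the wire, here is a hypothetical statement for the [Lee] [viewed] [the troubleshooting video] example, sent to a Learning Record Store. The statement shape and the version header follow the xAPI specification; the LRS endpoint, credentials, and activity identifiers are placeholders.

```typescript
// Hedged sketch: constructing and sending one xAPI statement.
// The actor/verb/object structure follows the xAPI spec; the LRS
// endpoint, credentials, and identifiers below are placeholders.

const statement = {
  actor: {
    name: "Lee",
    mbox: "mailto:lee@example.com", // placeholder identifier
  },
  verb: {
    // An ADL-registered verb commonly used for consuming a resource.
    id: "http://adlnet.gov/expapi/verbs/experienced",
    display: { "en-US": "experienced" },
  },
  object: {
    id: "https://example.com/resources/troubleshooting-video", // placeholder IRI
    definition: {
      name: { "en-US": "Troubleshooting video" },
    },
  },
  timestamp: new Date().toISOString(),
};

async function sendStatement(): Promise<void> {
  const response = await fetch("https://lrs.example.com/xapi/statements", {
    method: "POST",
    headers: {
      "Content-Type": "application/json",
      "X-Experience-API-Version": "1.0.3",
      // Most LRSs accept HTTP Basic auth; these credentials are placeholders.
      Authorization: "Basic " + btoa("lrs-key:lrs-secret"),
    },
    body: JSON.stringify(statement),
  });
  if (!response.ok) {
    throw new Error(`LRS rejected statement: ${response.status}`);
  }
}
```

Because any instrumented system can send such statements, a video portal, a search tool, and a coaching app can all report into the same record store.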

The idea is not simply to track data, but to start relating activity data to business outcomes. It's not just that people are using this resource and not that one, but whether that makes any difference to success. And that measure of success doesn't come from activity or learning data; it comes from business data. The goal is to connect the activity information with business intelligence data.

This doesn't come free. First, you need to either select products that are instrumented to generate xAPI statements, or add code to your systems to do so. You also need to collect this data, typically in a Learning Record Store (LRS). Then you need to connect the LRS to your business intelligence tools to start looking at real outcomes. Simple? Not necessarily. Valuable? Absolutely!
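That last step, pulling activity data out of the LRS so it can sit alongside business metrics, can start with a query against the standard statements resource. The sketch below reuses the placeholder LRS from above and the filter parameters (verb, activity, since, limit) defined in the xAPI specification; joining the results to business data is then a job for your analytics tooling.

```typescript
// Hedged sketch: pulling recent statements about one resource out of an LRS
// so they can be joined with business data elsewhere. The endpoint and
// credentials are placeholders; the query parameters come from the xAPI spec.

async function fetchRecentViews(sinceIso: string): Promise<any[]> {
  const params = new URLSearchParams({
    verb: "http://adlnet.gov/expapi/verbs/experienced",
    activity: "https://example.com/resources/troubleshooting-video",
    since: sinceIso, // ISO 8601 timestamp, e.g. "2024-01-01T00:00:00Z"
    limit: "100",
  });

  const response = await fetch(
    `https://lrs.example.com/xapi/statements?${params}`,
    {
      headers: {
        "X-Experience-API-Version": "1.0.3",
        Authorization: "Basic " + btoa("lrs-key:lrs-secret"), // placeholder
      },
    },
  );
  if (!response.ok) {
    throw new Error(`LRS query failed: ${response.status}`);
  }

  // The LRS returns a StatementResult: { statements: [...], more: "..." }.
  const result = await response.json();
  return result.statements;
}
```

Counting who used which resources, and lining that up against sales figures, error rates, or support tickets, is then an analytics exercise rather than a standards one.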

What you really want is to start aligning your interventions with business gaps. You should be addressing real problems the organization faces, where people measurably perform below the needed level. Then you should be designing interventions to address those gaps. And you should track the data to see how well the interventions address those needs. And now you can.

So, does xAPI replace SCORM?  

Going Forward

Recognizing that xAPI by itself wasn't sufficient to support launching and tracking courses, but wanting to build on the xAPI infrastructure, the ADL is now working on cmi5, a specification that wraps xAPI with the additional information needed to group elements together into full courses. Thus, cmi5 is the next generation of SCORM.
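To give a flavor of what that wrapping adds, the sketch below reads the launch parameters that, as I understand the cmi5 specification, an LMS passes to a launched assignable unit (endpoint, fetch, actor, registration, activityId); the rest of the session then proceeds over ordinary xAPI. Treat the details as illustrative rather than a complete implementation.

```typescript
// Hedged sketch: reading the query-string parameters a cmi5 LMS passes
// when launching an assignable unit (AU). The parameter names follow my
// reading of the cmi5 spec; the surrounding scaffolding is illustrative.

interface Cmi5LaunchParams {
  endpoint: string;     // xAPI endpoint of the LRS to send statements to
  fetchUrl: string;     // one-time URL for retrieving an auth token
  actor: object;        // the learner, as a JSON-encoded xAPI Agent
  registration: string; // UUID tying this session's statements to an enrollment
  activityId: string;   // IRI of the AU being launched
}

function readLaunchParams(locationSearch: string): Cmi5LaunchParams {
  const q = new URLSearchParams(locationSearch);
  const required = ["endpoint", "fetch", "actor", "registration", "activityId"];
  for (const name of required) {
    if (!q.get(name)) throw new Error(`Missing cmi5 launch parameter: ${name}`);
  }
  return {
    endpoint: q.get("endpoint")!,
    fetchUrl: q.get("fetch")!,
    actor: JSON.parse(q.get("actor")!),
    registration: q.get("registration")!,
    activityId: q.get("activityId")!,
  };
}

// Usage in the launched AU:
//   const params = readLaunchParams(window.location.search);
// The AU then sends xAPI statements (initialized, completed, passed, and so on)
// to params.endpoint, and the LMS rolls them up into course-level status.
```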

In addition, IMS has released a specification for activity tracking very much like xAPI, called Caliper. There are now efforts to reconcile the two, though they may not end up interoperable. As the first such effort stated, it may be a matter of best fit. There are background issues as well. Ultimately, it may be Caliper for education and xAPI for business, and crossing those boundaries could be problematic.

Work continues in interesting ways. The ADL has a 'Total Learning Architecture' (TLA) effort underway, in which different systems can be pulled together to create performance ecosystems. In a related endeavor, the IEEE is working on the Actionable Data Book, an ePub-based standard (for ebooks) that supports social interaction and personalization based upon user actions.

While standards work is complex, the benefits accrue from the ability to connect systems in trouble-free ways, and they accrue further from creating an environment where people can build on top of the standards. You should at least be on top of xAPI, and tracking the progress of the TLA. You're not expected to master all of them, but you do want to look at how they can be used to meet your goals. And you should be looking at the new things they facilitate that you may not already be addressing.

Quite simply, people are advancing our ability to do our jobs. We need to understand not only how they're doing it, but also what they're doing. It may be that they're on to things we should be paying attention to!