LRS-as-a-Service: How Learning Platforms Can Add xAPI Support

Learning ecosystems are getting more complex. LMSs, LXPs, digital coaches, mobile apps, videos, XR/VR, content libraries, AI agents, and performance support tools all generate signals and data. Organizations want those signals captured in a consistent way so they can prove impact, spot trends, and report learning and performance outcomes across systems. A key building block of any modern learning ecosystem is a Learning Record Store (LRS), especially for teams who want to be data-driven and support capturing these signals with xAPI.

By using an "LRS-as-a-service" you can:

  • Add xAPI support quickly without spending years of engineering effort building an LRS
  • Offer richer analytics (dashboards, queries, exports, BI integrations) on top of learning activity data
  • Support enterprise requirements like separation of customer data, permissions, and scalable provisioning
  • Focus product development on the learning experience instead of infrastructure
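To make the first bullet concrete, here is a minimal sketch of what "adding xAPI support" looks like at the wire level: building a standard actor/verb/object statement and POSTing it to an LRS's /statements resource, per the xAPI specification. The endpoint URL and credentials below are placeholders, not real Veracity values.

```python
import base64
import json
import urllib.request

LRS_ENDPOINT = "https://example.com/xapi"  # placeholder LRS endpoint
LRS_KEY, LRS_SECRET = "key", "secret"      # placeholder Basic-auth credentials

def build_statement(actor_email: str, activity_id: str,
                    verb_id: str, verb_name: str) -> dict:
    """Assemble a minimal actor/verb/object xAPI statement."""
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {"objectType": "Activity", "id": activity_id},
    }

def send_statement(stmt: dict) -> None:
    """POST the statement to the LRS's /statements resource."""
    creds = base64.b64encode(f"{LRS_KEY}:{LRS_SECRET}".encode()).decode()
    req = urllib.request.Request(
        f"{LRS_ENDPOINT}/statements",
        data=json.dumps(stmt).encode(),
        headers={
            "Content-Type": "application/json",
            "X-Experience-API-Version": "1.0.3",  # required by the spec
            "Authorization": f"Basic {creds}",
        },
        method="POST",
    )
    urllib.request.urlopen(req)  # the LRS responds with the new statement id(s)

stmt = build_statement(
    "learner@example.com",
    "https://example.com/courses/onboarding",
    "http://adlnet.gov/expapi/verbs/completed",
    "completed",
)
```

Every conformant LRS accepts this same request shape, which is exactly why an LRS-as-a-service can slot behind any platform without custom plumbing per integration.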

Examples of Products Integrating Veracity’s LRS Today
Here are a few different ways platforms are using Veracity as the LRS engine inside their product.

SparkLearn
SparkLearn is a new learning experience platform for performance support and learning that integrates Veracity LRS for reporting and analytics, including embedded/custom reporting workflows and a clear emphasis on measuring outcomes (including AI enablement). As a result of integrating Veracity's LRS, SparkLearn offers xAPI-powered analytics built in without requiring customers to procure and integrate a separate LRS.
>> Learn More

Build Capable XCL
Build Capable's XCL tool allows you to send learning data to an LRS, enabling customers to provide xAPI content support without requiring an LMS to deliver the content. XCL uses an on-premises install of Veracity’s LRS by default as part of the product’s core architecture. XCL can also be configured to connect to any external LRS.
>> Learn More

Larmer Brown LRS
Larmer Brown implemented and white-labeled Veracity’s on-premises LRS license to offer secure LRS capabilities to their customers within their own secure enclave in the UK. The Larmer Brown LRS platform is powered by Veracity’s LRS.
>> Learn More

Remote Reviewer
Remote Reviewer captures xAPI-compliant assessment data and stores it in a chosen LRS. Remote Reviewer uses the Veracity LRS as the data store and leverages Veracity APIs to create an LRS on-the-fly for users, making it easy for their customers to stand up an LRS as part of the workflow.
>> Learn More

RePubIT
RePubIT is a digital coach and content publishing and delivery system for interactive learning content and performance support. RePubIT uses Veracity LRS as part of its product offering to capture and store the xAPI data it generates, enabling reporting and analytics.
>> Learn More

Enterprise Multi-tenancy for Your Customers
We also have a growing number of customers using our multi-tenant LRS capabilities to provide their own customers with an LRS. This supports scenarios like customer-by-customer data isolation, separate environments, and clean separation of reporting.
Read more here on why organizations may want to use more than one LRS instance.
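The pattern behind customer-by-customer isolation can be sketched as a small tenant registry: each customer gets its own LRS endpoint and credentials, so statements and reports never mix. The names and URL scheme below are illustrative assumptions, not Veracity's actual provisioning API.

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class TenantLRS:
    endpoint: str  # tenant-specific xAPI endpoint
    key: str       # Basic-auth key scoped to this tenant only
    secret: str

# Registry mapping each customer to its own isolated LRS config.
REGISTRY: dict[str, TenantLRS] = {}

def provision_tenant(tenant_id: str) -> TenantLRS:
    """Register an isolated LRS configuration for a new customer (hypothetical)."""
    lrs = TenantLRS(
        endpoint=f"https://lrs.example.com/{tenant_id}/xapi",
        key=f"{tenant_id}-key",
        secret="generated-secret",  # generate per tenant in practice
    )
    REGISTRY[tenant_id] = lrs
    return lrs

def lrs_for(tenant_id: str) -> TenantLRS:
    """Route statement writes and reads to the correct tenant's LRS."""
    return REGISTRY[tenant_id]

acme = provision_tenant("acme")
globex = provision_tenant("globex")
```

Because each tenant resolves to a distinct endpoint and credential pair, one customer's credentials can never read another customer's data, and reporting stays cleanly separated by construction.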

LMS Without an LRS
If your LMS does not currently include an LRS but needs to support xAPI, there is a straightforward path: integrate our LRS via our APIs or via LTI rather than building and maintaining an LRS internally. Self-hosted, on-premises enterprise LRS customers can also use our System APIs for deeper operational integration. Using these approaches, Veracity’s LRS has been integrated with several popular LMSs, such as Canvas LMS, Cornerstone, and Moodle.
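Once statements are flowing in, the LMS can pull them back out for its own reports using the standard xAPI query interface: a GET against /statements filtered by agent, verb, and time, as defined by the xAPI specification. The endpoint URL below is a placeholder; a real request would also carry the `X-Experience-API-Version` header and Basic-auth credentials shown earlier.

```python
import json
import urllib.parse

def statements_query_url(endpoint: str, actor_email: str,
                         verb_id: str, since_iso: str) -> str:
    """Compose a GET /statements URL filtered by agent, verb, and time."""
    params = {
        # The agent filter is a JSON-encoded Agent object, per the spec.
        "agent": json.dumps({"mbox": f"mailto:{actor_email}"}),
        "verb": verb_id,
        "since": since_iso,  # ISO 8601 timestamp
        "limit": "50",
    }
    return f"{endpoint}/statements?{urllib.parse.urlencode(params)}"

url = statements_query_url(
    "https://example.com/xapi",        # placeholder LRS endpoint
    "learner@example.com",
    "http://adlnet.gov/expapi/verbs/completed",
    "2024-01-01T00:00:00Z",
)
```

Because the query API is part of the xAPI standard, the same report-building code works whether the LRS behind it is Veracity's cloud service or an on-premises install.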

As learning ecosystems expand, the winners will be the vendors who can deliver measurable outcomes. If you want to add xAPI and LRS capabilities to a learning system, platform, or even your LMS, our LRS offerings let you do so cost-effectively. Simply treat the LRS as an interoperable service layer in your architecture!