We are approaching the final stages of the Learning Analytics for AR (LAAR) project. Our next goal is to trial some of our final applications, designed to train stagehands in the event industry. The applications are part of a larger framework developed by the project, whose aim is to provide a complete learning analytics cycle for educational AR software and to act as a demonstrator and template for future adult work-based education. Using competence and skills catalogues such as ETTE and the related ESCO, we are prototyping AR applications that test selected skills. A learner's performance is matched, via the corresponding actions taken within the app, to the related skill. Actions are recorded using the Experience API (xAPI, formerly known as TinCan) and stored in a learning record store, where they are analysed against the skills and competences from the catalogue, feeding learning analytics back to the user and bringing us full cycle.

For the project pilots, we went to Frankfurt, Germany, demoing the results of our labour at Prolight and Sound 2019, one of the largest trade events in the industry. We had a prime location, next to the stand of project partner VPLT.

As with any trial, there were some initial difficulties to overcome, but the event was a huge success. The black box with its single low-frequency LED light source was incredibly dramatic and realistic, but initially played havoc with the HoloLens spatial mapping and the AR markers. With perseverance and practice, we got it working in the end, using some white gaffer tape to create 'trackable' artefacts that increased the stability of the AR spatial anchoring.

Our wonderful research assistants made sure there was a constant stream of people willing to take part in the trial. Alongside the demonstrations, we ran trials of the apps, with the Brookes app placed side by side with the ITU app, almost in competition.

We also gave a keynote at the event, on ‘training 4.0: AR experience capturing and sharing’, presented by Dr Fridolin Wild.

From the pilots: some example xAPI statements in the LearningLocker LRS used for competenceanalytics.com
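For readers unfamiliar with xAPI, a statement is essentially an "actor, verb, object" triple in JSON. The sketch below is a minimal, hypothetical example of how one of the statements recorded during the trials might be constructed; the activity IRI, learner email, and helper function are illustrative assumptions, not the project's actual identifiers (the `completed` verb IRI, however, is a standard ADL verb).

```python
import json

def make_statement(actor_email, verb_id, verb_name, activity_id, activity_name):
    """Build a minimal xAPI statement linking a learner action to an activity."""
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "objectType": "Activity",
            "id": activity_id,
            "definition": {"name": {"en-US": activity_name}},
        },
    }

# Hypothetical trial action: a trainee completes a rigging step in the AR app
stmt = make_statement(
    "trainee@example.com",                                       # illustrative learner
    "http://adlnet.gov/expapi/verbs/completed",                  # standard ADL verb IRI
    "completed",
    "https://competenceanalytics.com/activities/secure-truss",   # hypothetical activity IRI
    "Secure a truss connection",
)
print(json.dumps(stmt, indent=2))
```

In a live app, such a statement would be POSTed to the LRS's statements endpoint, where the learning analytics layer can later query it and match the activity against the ETTE/ESCO skill it evidences.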

We are now very excited about what comes next. The project is converging, consolidating the entire framework for AR-based learning analytics systems, which could be configured for many other areas of work-based education! We hope to use the results to further understand what worked and what didn't, what is practical and what isn't, and how to better connect the virtual records of actions to the real competences and skills outlined in ETTE.

As always, I am but a small cog in a big team. I want to end with a huge thank you to everyone who made this happen. Peter, the project manager, and his two wonderful research assistants. Fridolin for his support and advice. Chris for his wisdom and insights. Tommy for all his hard work setting up and always smiling. Everyone at ITU (Leo, Neils and Oliver) for the wonderful work they have done. And, last but not least, thanks to my partner Goda for all her support and guidance that week. She came along and helped me run all the HoloLens trials without a single complaint, all for free.