My LASI-NL 2014 experience

I participated in the LASI-NL event on 30 June and 1 July as a representative of the EMMA project. In addition to the event itself, I had a chance to discuss several EMMA issues with our OUNL partners, plan the deliverable, and share some EMMA learning analytics insights.

But back to the LASI-NL event, where I was one of the few foreigners. The first day was opened by Dr. Stefan Mol and Dr. Gábor Kismihók from the University of Amsterdam. They shared their experiences with learning analytics, the barriers they faced, and, most importantly, how to overcome those barriers. They discussed using a learning record store (LRS) and the Tin Can API/xAPI specification. As I understood it, they do not use an existing LRS but aim to develop their own, one that stores data from different data sources.
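To make the LRS/xAPI idea concrete, here is a minimal sketch of what a single xAPI ("Tin Can") statement and its submission to an LRS might look like. This is an illustration, not anything shown at the event: the endpoint URL, actor, and activity identifiers are placeholders I made up.

```python
import json
import urllib.request

# A minimal xAPI statement: actor, verb, object.
# All identifiers below are illustrative placeholders.
statement = {
    "actor": {"mbox": "mailto:learner@example.org", "name": "Example Learner"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {
        "id": "http://example.org/courses/la-101/module-1",
        "definition": {"name": {"en-US": "Module 1"}},
    },
}

def post_statement(stmt, endpoint="http://lrs.example.org/xapi/statements"):
    """Send one statement to an LRS over the xAPI REST interface."""
    req = urllib.request.Request(
        endpoint,
        data=json.dumps(stmt).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "X-Experience-API-Version": "1.0.3",
        },
        method="POST",
    )
    return urllib.request.urlopen(req)
```

The appeal of storing such statements in an LRS is exactly what the presenters were after: because every data source emits the same actor-verb-object format, one store can collect traces from many different environments.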

On the practical side, the presenters described their experiences with LA. The first cluster, as they named it, focused on mirroring traditional and non-traditional study performance back to students. Their tool, UvA Mirror (Coach), visualizes the position of individuals in the context of the group. Another tool, qDNA (cluster exam feedback), provides fine-grained mirroring of exam results to give students and teachers insight into development across four competencies and knowledge goals. The dashboard lets individual students choose and set their own goals (and deadlines) against specific course-event deadlines, based on the mirrored data, and shows them at a glance how they are succeeding in attaining those goals compared to their fellow students.

The second cluster focuses on providing feedback to teachers based on specific student data. The tool visualizes students' scores over (partial) assignments or courses and provides web-lecture statistics to spot potential problems and difficult material, and so improve teaching.

The third cluster uses other people's data to build a recommendation system for students and aims to reduce dropout in bachelor programs by first finding predictors and then developing interventions.

The list of lessons learnt identified by the presenters was quite long. One interesting remark was that data is everywhere, but accessibility is the issue, and legal and ethical questions are also on the agenda. I also noticed that in the Netherlands LA is an important direction at the national level, and funds are available for R&D activities in the field.


The second session was about stakeholders' requirements, and Hendrik Drachsler and Maren Scheffel from the LACE project introduced their research. Since 2012 they have run three studies involving LA stakeholders. The first study, in 2012, with 121 participants (75% from higher education), showed that 70% of the participants doubted that students are able to learn from learning analytics.

The two other studies interested me mainly because of their data collection and analysis methods: group concept mapping and point mapping seemed like methods I'd like to try. Results of the second study, conducted at OUNL, indicated that the Netherlands is ready to roll out LA in K-12 and higher education. The third study involved LA experts (55 from all around the world); based on their expertise and opinions, a framework of LA quality indicators was developed. The experts considered LA important because it supports learning and outcomes and raises students' awareness of their learning.

That session was interesting to me because I wanted to see what the OUNL LA research group and the LACE project are doing.


Next came a practical session, guided by Dr. Jeroen Donkers from the WATCHME project. The WATCHME project aims to improve workplace-based feedback, assessment, and professional development by means of learning analytics. Surprisingly, the LA work package is coordinated by the University of Tartu, and by my opponent no less; small and exciting world.

After a short introduction to the project and a theoretical overview of how LA and instructional design could complement each other, we were split into small groups and given a practical task: a MOOC about educational management is being run for the second time; drop-out is high, small groups do not work optimally, and participants complain that the content does not fit their actual situation. The scenario was the best possible one for me, because it is directly associated with the work I'm doing in EMMA. Not too surprisingly, all six groups had a similar idea of what the solution could be. In our group we tried to take an instructional design model as a baseline (e.g. Kemp et al.), but soon concluded that LA is very cyclical: evaluation is part of the implementation, and it is not enough to evaluate an iteration only at the end. We also discussed that the point is not to focus on the drop-out rate as such, but to steer towards collaboration those participants who actually want to collaborate. In any case, all groups seemed to agree that it is important to: a) find out the participants' backgrounds and interests early on, in order to gauge their motivation for the course (pre-screening of the participants); b) monitor progress and redesign the course accordingly, including recommending suitable materials based on each participant's history and background, as well as relevant study peers and study groups; and c) post-evaluate the course.


From the second day I'd like to highlight the second keynote, by Dr. Ulrich Hoppe (University of Duisburg-Essen, Germany). The title of his talk was intriguing to me: Beyond the obvious – advanced analytics tools and applications in educational contexts. Simply put, the "obvious" LA in his talk was the simple LA that focuses on frequencies of activities, participation, and enrollment; to be honest, something that is on my mind in EMMA as well. The non-obvious is the analysis of the sequences and paths that users follow; that is also on my mind, but it is more complicated. Hoppe mentioned three LA methodologies – activity analysis, artefact analysis, and network analysis – and presented two non-obvious examples from his research, which were really close to the artefact-actor-network analysis method that I'm trying to implement in EMMA. To be honest, some of his talk was too complicated for me (e.g. the practical examples of how to analyse), but I warned him that I'll try his tools and contact him when needed. From Dr. Hoppe's talk I learnt that it is not important to visualize everything to the users; complex SNAs or patterns are relevant for researchers.


The next session was too short for me; I wanted more. I have become familiar with xAPI and with the concept of the learning record store, but I wanted to hear more and more. Anyway, Alan Berg, Hendrik Drachsler, and Sander Latour shared their understanding of, and experiences with, xAPI and LRSs. At the theoretical level, it seems that so far I have understood everything correctly and know quite a lot. Sander's practical experience was interesting because they have done something relevant to our Estonian LA project (eDidaktikum): a dashboard that provides an overview of activities, comparison with others, and recommendations, all using the xAPI standard. Activities from two different environments were combined, which is also interesting.
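Combining activities from two environments is conceptually simple once both emit xAPI-style statements. The sketch below is my own illustration of the idea, not Sander's actual implementation: the statement data and field names are made up, and only the fields needed for counting are shown.

```python
from collections import Counter

# Hypothetical statements as fetched from two LRS-backed environments;
# real xAPI statements carry full actor/verb/object structures.
lms_statements = [
    {"actor": "alice", "verb": "viewed"},
    {"actor": "alice", "verb": "posted"},
    {"actor": "bob", "verb": "viewed"},
]
forum_statements = [
    {"actor": "alice", "verb": "replied"},
    {"actor": "bob", "verb": "replied"},
    {"actor": "bob", "verb": "replied"},
]

def activity_overview(*sources):
    """Count recorded activities per actor across several statement streams."""
    totals = Counter()
    for source in sources:
        for stmt in source:
            totals[stmt["actor"]] += 1
    return dict(totals)

overview = activity_overview(lms_statements, forum_statements)
# Each learner's activities from both environments end up in one overview,
# which is the basis for dashboard comparisons and recommendations.
```

Because the shared statement format decouples the dashboard from any single environment, adding a third data source is just another argument to `activity_overview`.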

From Alan's talk I took away two concepts: the LTI protocol, which allows an application to be embedded into an existing LMS and works well with the xAPI standard; and the Caliper framework and its relation to xAPI.


To conclude: it was an extremely valuable experience for me. I positioned myself, seeing where I am and what others are doing. I understood how I'd like to proceed with LA in my research group. And I had a chance to meet my OUNL colleagues and make plans for EMMA. It has been a long time since I was so excited 🙂