
Learning analytics methodology for MOOCs

A few days ago I finished the EMMA project deliverable about learning analytics methodology for MOOCs, together with colleagues from OuNL, Unina, IPSOS, and Atos. It would be fair to say that our deliverable is rather a proposal of the methodology, because the learning analytics application will be piloted in September and adjustments will be made after the first evaluation.

Learning analytics in the EMMA project will focus on: a) real-time analytics through learning analytics dashboards for teachers and students; b) retrospective analysis of the digital traces in the EMMA platform. The first approach aims to support participants’ learning activities, whereas the second is intended for more in-depth analysis of the MOOCs and the overall EMMA evaluation. As EMMA is a MOOC platform, calculating dropout and clustering the participants will be among the research aims. Additionally, uptake of knowledge, students’ progress, and social structures emerging from the MOOCs will be analyzed in the pilot phase.

Theoretically we have relied on the work of Greller and Drachsler (2012). They give a general framework for learning analytics and propose focusing attention on six critical dimensions within the research lens. According to the framework, each dimension can have several values, and the framework can be extended when needed. The dimensions are: stakeholders, objectives, data, instruments, external constraints, and internal constraints.

Different studies about MOOCs and learning analytics were investigated for the deliverable. Most of them focused on clustering the participants and calculating the drop-out rate. The EMMA learning analytics approach takes retention rates and user clusters into account, because dropout is important, but it will redefine drop-out in the context of a MOOC and additionally consider the concept of drop-in. Such clustering makes it possible to approach participants more personally by taking the different types of users and their personal learning objectives into account. This is accomplished through a variety of both qualitative and quantitative analyses.

In the pilot phase of the EMMA MOOCs, the following aspects will be analyzed:

  • Clustering of the participants
  • Progress and performance
  • Uptake of knowledge
  • Social structures
  • Engagement with the content
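To illustrate the first aspect, here is a minimal rule-based clustering sketch. The category names and thresholds are my own illustrative assumptions, not the deliverable’s actual definitions, which will be refined in the pilot:

```python
# Illustrative clustering of MOOC participants by engagement.
# Category names and thresholds are hypothetical, not from the EMMA deliverable.

def cluster_participant(events_viewed: int, assignments_done: int) -> str:
    """Assign a participant to a coarse engagement cluster."""
    if events_viewed == 0:
        return "no-show"
    if assignments_done == 0:
        return "lurker"      # views content but submits nothing
    if assignments_done < 3:
        return "drop-in"     # samples only selected activities
    return "completer"

participants = {
    "anna": (40, 5),
    "ben": (12, 1),
    "carl": (7, 0),
    "dora": (0, 0),
}

clusters = {name: cluster_participant(v, a) for name, (v, a) in participants.items()}
print(clusters)
# {'anna': 'completer', 'ben': 'drop-in', 'carl': 'lurker', 'dora': 'no-show'}
```

A real pipeline would derive these counts from the tracked events in the LRS and likely use proper clustering (e.g. k-means) rather than fixed thresholds; the point here is only the idea of mapping activity traces to participant types.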

The technical architecture consists of a tracking tool in EMMA, a learning record store (Learning Locker), and a dashboard module. The xAPI standard will be used for storing learning experiences, as it offers good opportunities for the personalized advice foreseen in EMMA. The context (social ties, groups, activity duration) as well as the semantics and tags used are also part of the tracked learning activities in EMMA analytics, enabling more in-depth analysis and meaningful dashboards.
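The learning experiences stored this way are xAPI statements: actor–verb–object triples in JSON. A minimal sketch of building such a statement is below; the verb and activity-type IRIs are standard ADL ones, but the lesson URL and the exact vocabulary EMMA will use are my own assumptions:

```python
import json
import uuid
from datetime import datetime, timezone

def build_statement(actor_email: str, verb_id: str, verb_name: str,
                    activity_id: str, activity_name: str) -> dict:
    """Build a minimal xAPI statement (actor - verb - object)."""
    return {
        "id": str(uuid.uuid4()),
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {"id": verb_id, "display": {"en-US": verb_name}},
        "object": {
            "objectType": "Activity",
            "id": activity_id,  # hypothetical lesson identifier
            "definition": {"name": {"en-US": activity_name}},
        },
    }

stmt = build_statement(
    "student@example.org",
    "http://adlnet.gov/expapi/verbs/experienced", "experienced",
    "https://example.org/emma/lesson/42", "Unit 1: Introduction",
)
print(json.dumps(stmt, indent=2))
```

In the architecture described above, the tracking tool would POST such statements to the LRS statements endpoint (with the `X-Experience-API-Version` header set), from where the dashboard module reads them back for analysis.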

EMMA’s learning analytics application is an advanced solution in the learning analytics field for MOOCs, since it combines the xAPI specification with the Learning Record Store (LRS) Learning Locker for storing and sharing learning experiences, which is not yet common among MOOC platforms. In particular, the dashboards for students and instructors that will be developed are based on the events collected and analyzed in the EMMA platform and are geared towards the specific conditions of MOOC settings. Moreover, these dashboards not only provide feedback about the courses and learning activities, but also offer reflection and monitoring opportunities in support of the students’ personalized learning objectives.

The EMMA learning analytics approach will be introduced at the EC-TEL 2014 conference in two different workshops: a MOOCs workshop and a learning analytics workshop.


Greller, W., & Drachsler, H. (2012). Translating Learning into Numbers: A Generic Framework for Learning Analytics. Educational Technology & Society, 15(3), 42–57.


My LASI-NL 2014 experience

I participated in the LASI-NL event on 30 June and 1 July, where I represented the EMMA project. In addition to the LASI-NL event itself, I had a chance to discuss several EMMA issues with our OuNL partners, plan the deliverable, and share some EMMA learning analytics insights.

But back to the LASI-NL event, where I was one of the few foreigners. The first day was opened by Dr. Stefan Mol and Dr. Gábor Kismihók from the University of Amsterdam. They shared their experiences with learning analytics, the barriers they faced, and, most importantly, how to overcome those barriers. They discussed aspects of using a learning record store (LRS) and the Tin Can API/xAPI specification. As I understood it, they do not use an existing LRS, but aim to develop their own LRS that stores data from different data sources.

On the practical side, the presenters described their hands-on experiences with LA. The first cluster, as they named it, focused on mirroring traditional and non-traditional study performance back to students. Their tool, UvA Mirror (Coach), visualizes the position of individuals in the context of the group. Another tool, qDNA (cluster exam feedback), provides fine-grained mirroring of exam results to give students and teachers insight into development on four competencies and knowledge goals. The dashboard lets individual students choose and set their own goals (and deadlines) against specific course-event deadlines based on the mirrored data, and shows them at a glance how they are scoring in attaining those goals compared to their fellows.

The second cluster focuses on providing feedback to teachers based on specific student data. Their tool visualizes scores for students and teachers over (partial) assignments or courses, and provides web-lecture statistics to find potential problems and difficult material in order to improve teaching.

The third cluster uses other people’s data to provide a recommendation system to students, and aims to reduce dropout in bachelor programs by first finding predictors and then developing interventions.

The list of lessons learnt identified by the presenters was quite long. One interesting remark was that data is everywhere, but accessibility is the issue, and additional legal and ethical questions are on the agenda. I also noticed that in the Netherlands LA is an important direction at the national level, and funds are available for R&D activities in the field.


The second session was about stakeholders’ requirements, and Hendrik Drachsler and Maren Scheffel from the LACE project introduced their research. Since 2012 they have conducted three studies involving LA stakeholders. The first study, in 2012, with 121 participants (75% from higher education), showed that 70% of the participants were quite sure that students are not able to learn from learning analytics.

The two other studies were mainly interesting for me because of their data collection and analysis methods: group concept mapping and point mapping seemed like methods that I’d like to try. Results of the second study, conducted at OuNL, indicated that the Netherlands is ready to roll out LA in K-12 and higher education. The third study involved LA experts (55 from around the world), and based on their expertise and opinions a framework of LA quality indicators was developed. The LA experts considered LA important because it provides learning and outcome support and raises students’ awareness of their learning.

That session was interesting for me because I wanted to see what the OuNL LA research group and the LACE project are doing.


The next one was a practical session, guided by Dr. Jeroen Donkers from the WATCHME project. The WATCHME project aims to improve workplace-based feedback, assessment, and professional development by means of learning analytics. Surprisingly, the LA work package is coordinated by the University of Tartu and my opponent – small and exciting world.

After a short introduction to the project and a theoretical overview of how LA and instructional design could complement each other, we were split into small groups and given a practical task: a MOOC about educational management is being conducted for the second time; drop-out is high, small groups do not work optimally, and participants complain that the content does not fit their actual situation. The scenario was the best possible one for me, because it is directly associated with the work I’m doing in EMMA. Not surprisingly, all six groups had the same understanding of what the solution could be. In our group we tried to take an instructional design model as a baseline (e.g. Kemp et al.), but soon came to the conclusion that LA is very cyclical: evaluation is part of the implementation, and it is not enough to evaluate an iteration only at the end. It was also discussed that it is not important to focus on the drop-out rate, but rather to steer those participants who would like to collaborate towards collaboration. In any case, all groups seemed to agree that it is important to: a) find out in the early stages the background of the participants, in order to evaluate their interest in the course (pre-screening of the participants); b) monitor progress and redesign the course, including recommending suitable materials, relevant study peers, and study groups based on history and background; c) post-evaluate the course.


From the second day I’d like to highlight the second keynote, by Dr. Ulrich Hoppe (University of Duisburg-Essen, Germany). The title of his talk was intriguing to me: Beyond the obvious – advanced analytics tools and applications in educational contexts. Simply put, the “obvious” LA in his talk is the simple LA that focuses on frequencies of activities, participation, and enrollment. To be honest, that is something on my mind in EMMA as well. Non-obvious LA is the analysis of the sequences and paths that users step through; that is also on my mind, but it is more complicated. Hoppe mentioned three LA methodologies – activity analysis, artefact analysis, and network analysis – and presented two non-obvious examples from his research, which were really close to the artefact-actor-network analysis method that I’m trying to implement in EMMA. To be honest, some of his talk was too complicated for me (e.g. the practical examples of how to analyse), but I warned him that I’ll try his tools and contact him when needed. From Dr. Hoppe’s talk I learnt that it is not important to visualize everything to the users: complex SNAs or patterns are relevant for the researchers.


The next session was too short for me; I wanted more. I’ve become familiar with xAPI and the concept of a learning record store, but I wanted to hear more and more. Anyway, Alan Berg, Hendrik Drachsler, and Sander Latour shared their understanding of and experiences with xAPI and LRSs. At the theoretical level, it seems that I’ve understood everything correctly so far and know quite a lot. Sander’s practical experience was interesting because they have done something relevant for our Estonian LA project (eDidaktikum): a dashboard that provides an overview of activities, comparison with others, and recommendations, using the xAPI standard. Activities from two different environments were combined, which is also interesting.

From Alan’s talk I took away two concepts: the LTI protocol, which allows embedding an application into an existing LMS and works well with the xAPI standard; and Caliper, a framework, and its relation to xAPI.


To conclude: it was an extremely valuable experience for me. I positioned myself – where am I, and what are others doing. I understood how I’d like to proceed with LA in my research group. I had a chance to meet my OuNL colleagues and make plans for EMMA. It has been a long time since I was so excited 🙂