Author Archives: Kairit Tammets

Experimenting with a Learning Layers tool

This autumn I have a unique chance to be the teacher, not the researcher, who experiments with technology in a course setting. Usually when conducting such case studies, I am the researcher myself and part of the development team. But my colleagues from Learning Layers asked me to agree to the experiment, and sure, let's try.

The course is "Apprenticeship of Educational Technology", in which my students spend eight weeks at a practice placement and try to become familiar with the role of the educational technologist in an educational or other institution. They reflect weekly in their weblogs and perform tasks that I give them.

This time we will experiment with the Attacher. Without being part of the development team and knowing exactly why this tool was developed, I would say that it supports learning in community settings. Now, besides the reflection task, students are supposed to find theoretical materials on the web related to their practice tasks, tag them, share them, browse their peers' materials and reflect on them.

The first seminar went well: the students agreed to be part of the experiment, so let's hope for the best for the research team.

Personal learning record – for me

Recently I learnt that Learning Locker, the learning record store that we use in two of our R&D projects, plans to develop a personal learning record store, or personal data locker, solution. They claim that a personal learning record gives learners control of their learning data: they can store, sort and share their experience with the world. Their roadmap includes concepts like publication of a personal LRS, badges, highlights of user activities and so on. A lot of learning nowadays happens in MOOCs (even as a drop-in, without the aim of finishing the course) or by following social media channels, participating in networks, browsing blogs and following Twitter. I'm that type of learner: my formal PhD studies were highly intertwined with informal studies on the web. Combining an institutional (school/university/workplace) LRS with a personal data locker is also an option for harmonizing informal and formal learning, as stated here. In that sense, the learner can choose what kind of data is shared with whom. But what I miss is the sense-making of my informal learning: what learning networks I have (if any), how much time I spend on different activities, and what activity patterns and sequences emerge in my learning trajectories.
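
What such sense-making could look like in practice: below is a minimal sketch, assuming a personal LRS that exposes the standard xAPI query interface. The endpoint and credentials are hypothetical placeholders, not a real service.

```python
# Sketch: pull statements from a (hypothetical) personal LRS and count
# activity verbs as a first, rough proxy for activity patterns.
import requests
from collections import Counter

LRS = "https://my-personal-lrs.example.org/data/xAPI"  # hypothetical endpoint
AUTH = ("key", "secret")  # placeholder credentials

resp = requests.get(
    f"{LRS}/statements",
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.1"},
    params={"since": "2014-01-01T00:00:00Z", "limit": 500},
)
resp.raise_for_status()
statements = resp.json()["statements"]

# How often do I "read", "comment", "complete", ...?
verbs = Counter(s["verb"]["id"].rsplit("/", 1)[-1] for s in statements)
for verb, count in verbs.most_common():
    print(f"{verb}: {count}")
```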

I participated in EC-TEL last week, and one of the workshops I attended was Learning Analytics Data Sharing (LADS14). Our group-work task was to think through one data-sharing scenario with its possible challenges, risks, possibilities and so on. Our group worked on the case of informal learning, where learning takes place in different settings with different tools and media. Here the concept of a personal LRS seemed appropriate. Because of another workshop I left early, so I don't know what the final solution was or whether the personal LRS was part of it, but that is not important anyway.

I like the idea that the learner is the owner of the data generated in different learning contexts (school, university, workplace learning, vocational training and so on). The learner has access to data about learning experiences across platforms, which promotes sense-making and reflection on the stored data and on learning in general. I believe it supports self-knowledge through self-tracking and would have an impact on my learning behaviour. Finally, I consider sharing my learning experiences with others important, which the personal data locker should support.

My professional activities have revolved around learning analytics for a while now. By searching out and reading different studies, approaches and applications, I aim to improve my research and development activities. But in my personal and informal life.. I would like to play around with a personal data locker.


Learning analytics methodology for MOOCs

A few days ago I finished the EMMA project deliverable on a learning analytics methodology for MOOCs, together with colleagues from OUNL, Unina, IPSOS and Atos. It would be fair to say that our deliverable is rather a proposal for the methodology, because the learning analytics application will be piloted in September and adjustments will be made after the first evaluation.

Learning analytics in the EMMA project will focus on: a) real-time analytics through learning analytics dashboards for teachers and students; b) retrospective analysis of the digital traces in the EMMA platform. The first approach aims to support participants' learning activities, whereas the second is intended for more in-depth analysis of the MOOCs and the overall EMMA evaluation. As EMMA is a MOOC platform, calculating dropout and clustering the participants will be among the research aims. Additionally, uptake of knowledge, students' progress and the social structures emerging from MOOCs will be analyzed in the pilot phase.

Theoretically we have relied on the work of Greller and Drachsler (2012). They provide a general framework for learning analytics and propose focusing attention on six critical dimensions within the research lens. According to the framework, each dimension can take several values and the framework can be extended as needed. The dimensions are: stakeholders, objectives, data, instruments, external constraints and internal constraints.

Different studies about MOOCs and learning analytics were reviewed for the deliverable. Most of them focused on clustering the participants and calculating the dropout rate. The EMMA learning analytics approach takes retention rates and user clusters into account, because dropout is important, but it will redefine dropout in the context of a MOOC while also considering the concept of drop-in. Such clustering makes it possible to approach the participants more personally by taking the different types of users and their personal learning objectives into account. This is accomplished through a variety of both qualitative and quantitative analyses.
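
A minimal sketch of what such clustering could look like, assuming simple per-participant engagement features; the feature file and the number of clusters are illustrative placeholders, not the actual EMMA pipeline.

```python
# Sketch: cluster MOOC participants by engagement features with k-means.
# "participant_features.csv" and its columns are hypothetical examples.
import numpy as np
from sklearn.cluster import KMeans
from sklearn.preprocessing import StandardScaler

# One row per participant: logins, videos watched, posts written, quizzes taken
X = np.loadtxt("participant_features.csv", delimiter=",", skiprows=1)

X_scaled = StandardScaler().fit_transform(X)
labels = KMeans(n_clusters=4, random_state=0).fit_predict(X_scaled)

# Cluster sizes give a first impression of, e.g., drop-ins vs. completers.
for c in range(4):
    print(f"cluster {c}: {(labels == c).sum()} participants")
```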

In the pilot phase of the EMMA MOOCs the following aspects will be analyzed:

  • Clustering of the participants
  • Progress and performance
  • Uptake of knowledge
  • Social structures
  • Engagement with the content

The technical architecture consists of a tracking tool in EMMA, a learning record store (Learning Locker) and a dashboard module. The xAPI standard will be used for storing learning experiences, as it offers good opportunities for the personalized advice foreseen in EMMA. Context (social ties, groups, activity duration) as well as semantics and the tags used are also part of the tracked learning activities in EMMA analytics, in order to conduct more in-depth analysis and provide meaningful dashboards.
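
To give an impression of what one tracked activity could look like, here is a hedged sketch of an xAPI statement carrying such context; all identifiers and the extension URI are illustrative placeholders, not the EMMA vocabulary.

```python
# Sketch: an xAPI statement with group, duration and tags attached.
# Identifiers and the extension URI are made-up placeholders.
statement = {
    "actor": {"mbox": "mailto:participant@example.org"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/experienced",
        "display": {"en-US": "experienced"},
    },
    "object": {
        "id": "http://example.org/mooc/lesson/3",
        "definition": {"name": {"en-US": "Lesson 3"}},
    },
    "result": {"duration": "PT25M"},  # ISO 8601 duration: 25 minutes
    "context": {
        "team": {"objectType": "Group", "name": "Study group A"},
        "extensions": {
            "http://example.org/xapi/tags": ["assessment", "peer-feedback"]
        },
    },
}
```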

EMMA's learning analytics application is an advanced solution in the learning analytics field for MOOCs, since it combines the xAPI specification and the Learning Record Store (LRS) Learning Locker for storing and sharing learning experiences, which is not yet common among MOOC platforms. In particular, the dashboards for students and instructors that will be developed are based on the events collected and analyzed in the EMMA platform and are geared towards the specific conditions that apply to MOOC settings. Moreover, these dashboards not only provide feedback about the courses and learning activities, but also offer reflection and monitoring opportunities in support of the students' personal learning objectives.

The EMMA learning analytics approach will be introduced at the EC-TEL 2014 conference in two different workshops: first in the MOOCs workshop and then in the learning analytics workshop.

References

Greller, W., & Drachsler, H. (2012). Translating Learning into Numbers: A Generic Framework for Learning Analytics. Educational Technology & Society, 15 (3), 42–57.

My LASI-NL 2014 experience

I participated in the LASI-NL event on 30 June and 1 July, where I identified myself as a representative of the EMMA project. In addition to the LASI-NL event itself, I had a chance to discuss several EMMA issues with our OUNL partners, plan the deliverable and share some EMMA learning analytics insights.

But back to the LASI-NL event, where I was one of the few foreigners. The first day was opened by Dr. Stefan Mol and Dr. Gábor Kismihók from the University of Amsterdam. They shared their experiences with learning analytics, the barriers they faced and, above all, how to overcome those barriers. They discussed aspects of using a learning record store (LRS) and the Tin Can API/xAPI specification. As I understood it, they don't use an existing LRS but aim to develop their own, which stores data from different data sources.

On the practical side, the presenters described their hands-on experiences with LA. The first cluster, as they named it, focused on mirroring traditional and non-traditional study performance back to students. Their tool, UvA Mirror (Coach), visualizes the position of individuals in the context of the group. Another tool, qDNA (cluster exam feedback), provides fine-grained mirroring of exam results to give students and teachers insight into development across four competencies and knowledge goals. The dashboard lets individual students choose and set their own goals (and deadlines) against specific course deadlines based on the mirrored data, and shows them at a glance how they are scoring and succeeding in attaining those goals compared to their fellows.
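
As I understood it, the core of the mirroring idea is positioning an individual against the group distribution; here is a tiny sketch with made-up numbers (not the UvA Mirror implementation).

```python
# Sketch of "mirroring": place one student's score within the group.
# Scores are invented for illustration.
import numpy as np

group_scores = np.array([55, 62, 70, 48, 81, 66, 73, 59, 90, 64])
my_score = 73

share_below = (group_scores < my_score).mean() * 100
print(f"Score {my_score} is above {share_below:.0f}% of the group "
      f"(group mean: {group_scores.mean():.1f})")
```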

The second cluster focuses on providing feedback to teachers based on specific student data. The tool visualizes scores for students and teachers over (partial) assignments or courses, and provides web lecture statistics to find potential problems and difficult material in order to improve teaching.

The third cluster uses other people's data to provide a recommendation system for students and aims to reduce dropout in bachelor programs by first finding predictors and then developing interventions.

The list of lessons learnt identified by the presenters was quite long. One of the interesting remarks was that data is everywhere, but accessibility is the issue, and legal and ethical questions are also on the agenda. I also noticed that in the Netherlands LA is an important direction at the national level, and funds are available for R&D activities in the field.

===

The second session was about stakeholders' requirements, and Hendrik Drachsler and Maren Scheffel from the LACE project introduced their research. Since 2012 they have run three studies involving LA stakeholders. The first study, in 2012, with 121 participants (75% from the field of higher education), showed that 70% of the participants are quite sure that students are not able to learn from learning analytics.

The two other studies were mainly interesting to me because of their data collection and analysis methods: group concept mapping and point mapping seemed like methods I'd like to try. The results of the second study, conducted at OUNL, indicated that the Netherlands is ready to roll out LA in K-12 and higher education. The third study involved LA experts (55 from all around the world), and based on their expertise and opinions a framework of LA quality indicators was developed. The experts considered LA important as it provides learning and outcome support and raises students' awareness of their learning.

That session was interesting for me because I wanted to see what the OUNL LA research group and the LACE project are doing.

===

The next one was a practical session, guided by Dr Jeroen Donkers from the WATCHME project. The WATCHME project aims to improve workplace-based feedback, assessment and professional development by means of learning analytics. Surprisingly, the LA work package is coordinated by the University of Tartu and my opponent. Small and exciting world.

After a short introduction to the project and a theoretical overview of how LA and instructional design could complement each other, we were divided into small groups and given a practical task: a MOOC about educational management is being run for the second time; dropout is high, the small groups do not work optimally, and participants complain that the content does not fit their actual situation. The scenario was the best possible one for me, because it is directly associated with the work I'm doing in EMMA. Not that surprisingly, all six groups had the same understanding of what the solution could be. In our group we tried to take an instructional design model as a baseline (e.g. Kemp et al.), but soon we came to the conclusion that LA is very cyclical: evaluation is part of implementation, and it is not enough to evaluate an iteration only at the end. It was also discussed that it is not so important to focus on the dropout rate as it is to direct those participants who would like to collaborate towards each other. Anyway, all groups seemed to agree that it is important to: a) find out the background of the participants at an early stage in order to assess their interest in the course (pre-screening of the participants); b) monitor progress and redesign the course, including recommending suitable materials, relevant study peers and study groups based on participants' history and background; c) post-evaluate the course.

===

From the second day I'd like to highlight the second keynote, by Dr Ulrich Hoppe (University of Duisburg-Essen, Germany). The title of his talk was intriguing to me: Beyond the obvious – advanced analytics tools and applications in educational contexts. Simply put, the "obvious" LA in his talk was simple LA that focuses on frequencies of activities, participation and enrollment. To be honest, that is what I have in mind in EMMA as well. Non-obvious LA is the analysis of the sequences and paths that users step through. To be honest, that is also on my mind, but it is more complicated. Hoppe mentioned three LA methodologies (activity analysis, artefact analysis and network analysis) and presented two non-obvious examples from his research, which were really close to the artefact-actor-network analysis method that I'm trying to implement in EMMA as well. To be honest, some of his talk was too complicated for me (e.g. the practical examples of how to analyse), but I warned him that I'll try his tools and contact him when needed. From Dr Hoppe's talk I learnt that it is not important to visualize everything to the users: complex SNAs or patterns are relevant for the researchers.
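
My rough reading of the "non-obvious" part, sketched in code: instead of counting raw activity frequencies, count transitions between consecutive activities per user. The log data below is invented for illustration.

```python
# Sketch: from frequencies ("obvious") to sequences ("non-obvious"):
# count activity transitions per user from a timestamp-ordered log.
from collections import Counter

log = [  # (user, activity), already ordered by time
    ("u1", "read"), ("u1", "comment"), ("u1", "read"),
    ("u2", "read"), ("u2", "quiz"), ("u2", "comment"),
]

transitions = Counter()
last_activity = {}
for user, activity in log:
    if user in last_activity:
        transitions[(last_activity[user], activity)] += 1
    last_activity[user] = activity

for (a, b), n in transitions.most_common():
    print(f"{a} -> {b}: {n}")
```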

===

The next session was too short for me; I wanted more. I've become familiar with xAPI and with the concept of a learning record store, but I wanted to hear more and more. Anyway, Alan Berg, Hendrik Drachsler and Sander Latour shared their understanding of and experiences with xAPI and LRSs. On a theoretical level it seems that I've understood everything correctly so far and know quite a lot. Sander's practical experience was interesting because they have done something that is relevant for our Estonian LA project (eDidaktikum): a dashboard that provides an overview of activities, comparison with others and recommendations using the xAPI standard. Activities from two different environments were combined, which is also interesting.

From Alan's talk I took away two concepts: the LTI protocol, which allows an application to be embedded into an existing LMS and works well with the xAPI standard; and the Caliper framework and its relation to xAPI.

===

To conclude, it was an extremely valuable experience for me. I positioned myself in relation to what others are doing, and I understood how I'd like to proceed with LA in my research group. I had a chance to meet my OUNL colleagues and make plans for EMMA. It has been a long time since I was this excited 🙂

Course about technology-enhanced learning at the workplace – reflection

This term I had a new problem-based course with IT management master's students about technology-enhanced learning at the workplace. The students in that course work mainly in public institutions, ministries, as IT managers.

The initial idea of the course was to support the students in developing the knowledge and skills to plan, implement and evaluate technology-enhanced informal learning in their organizations. The technologies supporting workplace learning were planned to be those developed in Learning Layers.

The following phases were planned for the course: a) students conduct a survey to find out what kind of technology-enhanced learning practices can be identified in their organization, what could be changed, and what challenges may arise when implementing technology to support learning; b) students design technology-based learning activities and possible scenarios with the technological prototypes in the organization; c) students evaluate the technology-enhanced learning scenarios with the technological prototypes.

The reality was a bit different, mainly for two reasons: the planned technologies were not in the state initially expected, and the institutions the students come from are quite restricted, so planning any new technologies is like.. impossible. I encouraged them to imagine and dream: if it were possible, how would you do it?

Most of them chose technologies that their organization is already using, but not purposefully. Most of the employees in their organizations are knowledge consumers who read newsletters, follow the intranet or e-mails, and never contribute to organizational-level knowledge. So the students designed their scenarios with a focus on knowledge sharing and documenting professional practices. The results turned out to be really interesting. Some of the evaluations of the scenarios with colleagues were really successful, because the scenarios were turned into real plans for the near future.

At the end of the course we had a short reflection. The main thing they said was that the next iteration of the course should focus less on designing and evaluating, because that is more a "managers' issue" and less an "IT issue", and they are more "IT people". They would like to hear more about the concrete technologies used in SMEs and larger organizations for supporting learning and knowledge sharing. Based on different research results, I could introduce the learning and knowledge-sharing practices that work in different organizations. So, let's see.

For me the course was an extremely pleasant experience. I've never had such a group of learners, and I'm glad I had the chance to teach them and to learn from them.

Update: I wrote a short description of the course for the Learning Layers Open Design Library as well; it can be found here.


Learning analytics in teacher education

The rising trend of learning analytics is quite surprising. It is surprising that it is still so popular, with no signs of.. subsiding. Being a PC member of two TEL-related international conferences, I can say that most of the ongoing research seems to be related to learning analytics: general frameworks, small developments, learning analytics in higher education and at the workplace, offline and online analytics, and so on. Researchers, learning designers, learning technologists, educational technologists and practitioners at schools, universities and workplaces still seem to be interested in it. But what can I say, it is an interesting and fascinating field of research.

Last time I wrote that in the EMMA project we are developing a learning analytics methodology for MOOCs. Such a task is challenging, because so far our learning analytics experience relates to formal higher education courses, and we have already seen that the learning practices and learning experience in a MOOC differ significantly from a traditional e-learning course at a small Estonian university. But the work is in progress and, as is usual in large European projects, involves a lot of intensive discussions and negotiations.

Our local development project, the web-based learning environment for teacher education eDidaktikum, is also in a phase where we are focusing on learning analytics. eDidaktikum can be used as a repository of learning resources, a learning environment for conducting courses, or a collaboration space. Additionally, the idea behind eDidaktikum is to support the development of a lifelong learning e-portfolio from initial studies to the end of one's professional activities; meanwhile, the role of the user simply changes from pre-service teacher to in-service teacher, teacher educator or any other. The first idea of our learning analytics framework..

.. We use xAPI, a new specification that is part of the ADL Training and Learning Infrastructure. According to Glahn (2013), the objective of xAPI is to express, store and exchange statements about learning experiences, and the specification has two primary parts: a) the first focuses on the syntax of the data format; b) the second defines the characteristics of "learning record stores" (LRS). At the core of xAPI are the statements about learning experiences.
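
To make the first part concrete: at its core a statement is an actor-verb-object triple. The identifiers below are illustrative placeholders.

```python
# Sketch: a minimal xAPI statement, an actor-verb-object triple.
# The course URI is a made-up placeholder.
minimal_statement = {
    "actor": {"mbox": "mailto:learner@example.org"},
    "verb": {
        "id": "http://adlnet.gov/expapi/verbs/completed",
        "display": {"en-US": "completed"},
    },
    "object": {"id": "http://example.org/edidaktikum/course/101"},
}
```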

The learning record store (LRS) in our development is planned to be Learning Locker. Learning Locker is the closest to our needs, and although it is still under development, we have managed to get the current version working. eDidaktikum sends learning interactions to Learning Locker in accordance with the xAPI standard, and Learning Locker presents the list of learning interactions performed during a specific time. In the near future, Learning Locker should push the visualized data back to students' or facilitators' dashboards in our learning environment.
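
A hedged sketch of what sending one interaction could look like over the standard xAPI REST interface that an LRS such as Learning Locker implements; the endpoint URL and credentials are hypothetical placeholders.

```python
# Sketch: POST one xAPI statement to an LRS over the standard REST interface.
# Endpoint and credentials are hypothetical; the header follows the xAPI spec.
import requests

LRS_STATEMENTS = "https://lrs.example.org/data/xAPI/statements"
AUTH = ("client_key", "client_secret")

statement = {
    "actor": {"mbox": "mailto:learner@example.org"},
    "verb": {"id": "http://adlnet.gov/expapi/verbs/attempted",
             "display": {"en-US": "attempted"}},
    "object": {"id": "http://example.org/edidaktikum/task/42"},
}

resp = requests.post(
    LRS_STATEMENTS,
    auth=AUTH,
    headers={"X-Experience-API-Version": "1.0.1"},
    json=statement,
)
resp.raise_for_status()
print(resp.json())  # the LRS responds with the stored statement id(s)
```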

This is the technical side of learning analytics in eDidaktikum. But what is the aim of implementing learning analytics? Raising students' motivation? Improving course design? Preventing dropout from courses? Something else? At the moment there are two broad aims: to analyze students' progress in order to raise their awareness of their learning, and to get an overview of the course design in order to continuously update it based on the collected feedback. Additionally, in my PhD research I came to the conclusion that the knowledge accumulated in the system acts as a scaffold for teachers' learning. Therefore, as a step further, I'd like to investigate how the accumulated knowledge could become part of the learning analytics by supporting teachers' professional development.

One of the characteristics of eDidaktikum is that it supports competence-based learning. Different objects and practices in the system can be associated with competencies from the teachers' qualification standard. Therefore the first step is to use those competencies to provide an overview of which competencies are covered by the different courses in the teacher education curricula; which competencies are covered within one course; and what the competency profile of the individual student is and how it evolves during the teacher education studies. The figure below illustrates the first idea of how to present the competency profile of a student together with the evidence.

[Figure: ed_learning_analytics – a first sketch of a student's competency profile presented together with the evidence]
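
To sketch what that first step could look like computationally: aggregate the competency codes attached to a student's evidence across courses. The competency codes, courses and names below are illustrative placeholders, not eDidaktikum data.

```python
# Sketch: build a per-student competency profile from tagged evidence.
# All data here is invented for illustration.
from collections import defaultdict

# (student, course, competency from the teachers' qualification standard)
evidence = [
    ("mari", "Didactics I", "B1 planning of learning"),
    ("mari", "Didactics I", "B2 learning environment"),
    ("mari", "EdTech basics", "B2 learning environment"),
    ("jaan", "EdTech basics", "B3 guiding of learning"),
]

profiles = defaultdict(lambda: defaultdict(list))
for student, course, competency in evidence:
    profiles[student][competency].append(course)

for competency, courses in profiles["mari"].items():
    print(f"{competency}: evidenced in {', '.join(courses)}")
```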


I've found myself playing with the idea that our competence-based learning and assessment has some similarities with Open Badges (OB). An Open Badge is a digital recognition of one's achievements or status. In eDidaktikum we also track the achievement of one's competences, but it is not an awarding process. Typically an OB is awarded by a learning provider or an employer for accomplishing tasks or attaining goals. Different authors have considered OBs a motivational mechanism: a badge for accomplishing a task. More interesting is the approach of using OBs to reward students for improvement instead of performance, and it can be assumed that the feedback they receive through digital badges tends to influence their motivation. OBs as a recognition mechanism can also serve as an additional element in the currently prevailing paper-based job application, validation and appraisal processes.
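
For reference, a minimal hosted badge assertion in the Open Badges style could look like the sketch below; all URLs, the identifier and the hash are illustrative placeholders.

```python
# Sketch: a minimal hosted Open Badges assertion (OBI 1.0-style).
# Every value is a made-up placeholder.
assertion = {
    "uid": "ob-2014-001",
    "recipient": {
        "type": "email",
        "hashed": True,
        # SHA-256 hash of the learner's email address (placeholder value)
        "identity": "sha256$c7ef86405ba71b85acd8e2e95166c4b111448089f2e1599f42fe1bba46e865c5",
    },
    "badge": "https://edidaktikum.example.org/badges/b2-learning-environment.json",
    "verify": {
        "type": "hosted",
        "url": "https://edidaktikum.example.org/assertions/ob-2014-001.json",
    },
    "issuedOn": "2014-04-10",
}
```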

Let's imagine a teacher who would like to raise her qualification and apply for the next career rank. Instead of a paper-based CV plus evidence, the teacher presents a link to her competency-based e-portfolio and the OBs in it. This enables the assessment board to immediately check the evidence and make their own judgments. But of course there are some hesitations related to badges: who should award badges in informal learning? What does a badge actually say about the learner? And is the visual badge even needed, or is it perhaps enough that the system visualizes how a pre-service teacher has developed the competences over a period of time?

These are the first quick ideas I've been playing with. This week I have two presentations about eDidaktikum at our local Estonian e-learning conference. One of them is about competency-based learning and the professional development portfolio, and I'm quite sure the idea of analytics will be impossible to avoid there.

Reference:

Glahn, C. (2013). Using the ADL Experience API for Mobile Learning, Sensing, Informing, Encouraging, Orchestrating. Proceedings of the 7th International Conference on Next Generation Mobile Apps, Services and Technologies (NGMAST), Prague, Czech Republic, 25–27 September 2013.

New project – EMMA started

I just arrived back from Naples and the kick-off meeting of the EMMA project. We, at the Centre for Educational Technology at Tallinn University, have started a new project: EMMA, the European Multiple MOOC Aggregator, funded by the European Commission's 7th Framework Programme. The project period is 02.2014 – 08.2014.

The project is coordinated by Università degli Studi di Napoli Federico II from Italy. The consortium includes partners from Spain (Atos Spain SA, Fundació per a la Universitat Oberta de Catalunya, Universitat Politècnica de València), Italy (Ipsos SRL, CSP – Innovazione nelle ICT S.C.A.R.L.), the Netherlands (Open Universiteit Nederland), Portugal (Universidade Aberta), the United Kingdom (University of Leicester), France (Université de Bourgogne) and Belgium (ATiT Bvba).

The project aims to pilot EMMA, a MOOC aggregator, for conducting MOOCs in different European countries. EMMA aims to showcase excellence in innovative teaching methodologies and learning approaches through large-scale piloting of MOOCs, providing a system for the delivery of free, open, online courses from diverse European universities, with the goal of preserving Europe's rich cultural, educational and linguistic heritage and promoting cross-cultural and multilingual learning.

MOOCs are a recent development, rapidly becoming a significant trend in higher education. The open nature of MOOCs provides opportunities for expanding access to higher education and creating a space for experimentation with online teaching and learning. This exploration of new approaches to higher education provision has generated significant interest from governments, institutions and others.

Building on existing and proven technology, EMMA will provide an opportunity, even for smaller institutions, to share high-quality content. EMMA will provide a framework and infrastructure, while participating institutions will remain autonomous in their choice of design, methodology and tools. EMMA will operate in two main modes: as an aggregator and hosting system for courses produced by European universities, and as a system that enables learners to construct their own courses using units from MOOCs as building blocks. The EMMA consortium aims at a multilingual, multicultural approach to learning by offering inbuilt translation and transcription services for courses hosted on the platform.

The pilot will operate in two steps: first by making available a significant number of existing courses from project partners and/or MOOC providers, and then by bringing on board a second tier of universities keen to experiment with MOOCs. Pilots will run in seven countries with a total of 16 MOOCs and will involve at least 60,000 users. Courses will be offered in the language of each country and in English, and the pilot will trial an embryonic form of multilingual translation by offering courses in Italian and Spanish as well. Advances in learning analytics will feature in the analysis and evaluation work, and a series of innovative approaches will be trialed to make the piloted service sustainable in the medium to long term.

In the project, we will:

– Develop the learning analytics methodology, with technical suggestions on how to integrate EMMA with different approaches to analytics;

– Develop and conduct at least one MOOC.

The project seems interesting, and the learning analytics approach is really exciting for us. It will certainly give valuable insights for our other projects that also work with learning analytics.

The kick-off meeting was nice, and I couldn't be happier that we have a consortium consisting mainly of Southern European countries: what a city, what weather, what food, and what people we met.