All Recommendations

Based on the experiences and findings of the project, recommendations were formulated within the different outputs. The main recommendations are summarized here in four categories: data, ethics & privacy, scalability & transferability, and impact.


Before delving into more advanced data sources or creating new learning traces (e.g. by building new online courses, by tracking students using card swipes, or by measuring brain activity during contact moments), every institute should check whether it already makes optimal use of the available data.

Even “small” data such as course grades are usable for learning analytics and have proven to be a solid first basis for student-facing learning dashboards that provide students with useful feedback.

By building student-facing learning dashboards with the data already available, students can gain first-hand experience of how learning analytics can support their learning.

In this way, learning dashboards give students real experience with learning analytics, which can create further buy-in rather than resistance. It also allows learning analytics conversations to be steered to a concrete level, rather than remaining at the abstract level that often induces fear.

Higher education institutes can supplement the data already available in their data warehouses regarding students’ online activities with self-reported data from, for instance, questionnaires or surveys. Such self-reported data might already be available from course evaluations, student-support services, quality assurance, or research. Moreover, self-reported information can often be generated with limited extra investment.

Microinteractions are a new, low-cost opportunity to collect data from students, particularly suited for integration into learning analytics dashboards.
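As an illustration, a microinteraction can be captured as a tiny timestamped event record. The sketch below is a minimal, hypothetical example; the widget names and identifiers are invented and do not come from any of the project dashboards:

```python
from dataclasses import dataclass, field
from datetime import datetime, timezone

@dataclass
class Microinteraction:
    """One lightweight student response captured inside a dashboard."""
    student_id: str  # pseudonymous identifier, never the real name
    widget: str      # e.g. "mood-check" (hypothetical widget name)
    value: str       # the student's answer
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat()
    )

class MicrointeractionLog:
    """Collects microinteractions so they can later feed an analytics pipeline."""
    def __init__(self):
        self.events = []

    def record(self, student_id, widget, value):
        event = Microinteraction(student_id, widget, value)
        self.events.append(event)
        return event

log = MicrointeractionLog()
log.record("s-123", "mood-check", "motivated")
print(len(log.events))  # 1
```

Because each record is both a data point for analytics and evidence of dashboard usage, such a log doubles as a new learning-trace source in its own right.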

Not all available data will be usable for learning analytics. Data from the Virtual Learning Environment, for instance, will only provide useful learning traces if that environment is actually used for learning. One condition that learning analytics data should meet before being usable is that it relates to an outcome measure such as student achievement, retention, or success.

Learning activities that trigger online learning will create learning traces that are more likely to be usable for learning analytics. It is therefore important to keep learning analytics in mind when designing learning activities.

Learning design should be informed by learning analytics, so that, through the learning traces it generates, the learning design in turn enables learning analytics.

Learning dashboards themselves generate traces that are useful for learning analytics. These traces can concern both the usage of the dashboards and the microinteractions within them.

Therefore, the design, development, and implementation of learning dashboards provide a valuable source of new learning traces. This might be especially useful for higher education institutes that currently lack useful learning traces due to their particular context (more traditional education with a lot of face-to-face teaching, a lack of online learning modules in the virtual learning environment, etc.).


Teachers and course builders will provide the contextual information that is often needed to make sense of the data generated in online courses. They can add information on the targeted audience, the timing of the course, the setup of the course (e.g. blended learning), etc.
They can also help to identify the questions to be answered by learning analytics and to understand the particular context and data underlying the learning analytics intervention.

Student advisers provide invaluable support for designing learning analytics interventions. Their expertise concerning the advising process and the typical challenges, experiences, needs, and lines of thought of the students involved is particularly valuable for supporting the ethics of the learning analytics dashboards: which data to display, which messages to deliver, how to refer to actions to take and additional support, etc. Furthermore, they are the experts on the particular context of the students, which allows them to make small adaptations to the dashboards to tailor them as much as possible to the program at hand.

Student advisers and student success researchers will provide the required expertise when choosing a reference group for social comparison. Moreover, they will help to assess the implications of the choices made. 

Visualization experts provide support in choosing the best visualization for the job. Choosing the most appropriate visualization is not easy: the choice should depend on the data to be visualized, the goal of the visualization, the context in which it is used, and its primary users. We therefore recommend involving visualization experts in this decision.

Data-based feedback should be actionable, i.e. it should allow the user (student, prospective student, teacher, etc.) to take action based on the feedback provided. Put differently, feedback is only actionable if it provides an opportunity for improvement to those receiving it.

Feedback can originate from both malleable skills (e.g. student engagement, online learning behaviour, or learning and studying skills) and non-malleable characteristics (e.g. grades or former academic achievement). In the former case, actionable feedback is achieved easily. In the latter case, the interventions should provide feedback on the malleable skills related to student success that underlie the non-malleable characteristics.

Predictive algorithms should be used with care within learning analytics.

Predictive models work with the “data” (background information, learning traces) that is available. This data is only a limited measurement of who the student actually is. Therefore, the data will by definition fail to capture a full picture of the student.

When predictive models are then used to “predict” whether a student can be successful in a program, the prediction is based merely on the data that is available. The prediction will never be “better” than the data it was built on in the first place. As a consequence, the predictions will be uncertain.

Even when accurate predictions are obtained, the actual “goal” of these predictions should be decided on first. How are these predictions going to support the users? Will the prediction provide insights into what is important to be successful? Will the predictions allow for actionable feedback?

Predictive algorithms are only useful if they provide “interpretable insights” into the predictions being made. Only then do such algorithms have the potential to contribute to actionable feedback that allows the recipient to take action based on the feedback provided.

Visualizations can support the interpretation of predictive algorithms by highlighting the variables that contributed to the prediction.
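For a linear or logistic model, such per-variable contributions can be read off directly: each variable’s weight times its value is one additive term in the log-odds, which is exactly what a bar chart in a dashboard could display. The sketch below illustrates this with invented weights and variable names; it is not the project’s actual model:

```python
import math

# Hypothetical fitted logistic-regression weights for a success prediction.
weights = {"attendance": 1.2, "quiz_score": 0.8, "forum_posts": 0.3}
bias = -1.5

def explain(student):
    """Return the predicted probability plus the per-variable contributions.

    For a linear model the log-odds decompose exactly into one additive
    term per variable, so each contribution can be visualized directly.
    """
    contributions = {name: weights[name] * student[name] for name in weights}
    log_odds = bias + sum(contributions.values())
    probability = 1.0 / (1.0 + math.exp(-log_odds))
    # Rank variables by the magnitude of their contribution.
    ranked = sorted(contributions.items(), key=lambda kv: -abs(kv[1]))
    return probability, ranked

prob, ranked = explain({"attendance": 0.9, "quiz_score": 0.5, "forum_posts": 0.1})
# "attendance" contributes most: 1.2 * 0.9 = 1.08
```

For non-linear models, dedicated attribution methods would be needed instead, but the idea of showing ranked contributions next to the prediction remains the same.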

The text supporting the visualizations in learning analytics dashboards should be carefully worded. The wording should support the correct interpretation of the provided data and visualizations and additionally support actionable feedback.

For example, feedback should avoid the phrase “chances of success”, as such wording creates the perception that education is a “random game”. It seems to indicate that any action the student takes afterwards does not matter, which strongly contradicts the idea of actionable feedback. Furthermore, it does not leave any room for individual circumstances that might impact the study career.

Visualizations should properly represent the uncertainty of the underlying data (and, when applicable, of the underlying predictive models) and thereby the complexity of the underlying phenomena.

Visualizations should therefore focus on conveying the underlying uncertainty. In doing so, they can prompt reflection by the user, with room for interpretation and, for instance, the incorporation of individual factors not captured in the data. The visualization should support reflection, interpretation, and action by the user.
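One illustrative way to convey such uncertainty, offered here as an assumption rather than the project’s prescribed method, is to report an interval instead of a point value, for example a Wilson score interval around the observed success rate of comparable students:

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a proportion.

    Unlike the naive normal approximation, it behaves sensibly for
    small samples and proportions near 0 or 1.
    """
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))) / denom
    return centre - half, centre + half

low, high = wilson_interval(60, 100)
print(f"Of 100 comparable students, between {low:.0%} and {high:.0%} succeeded.")
# prints "Of 100 comparable students, between 50% and 69% succeeded."
```

A range worded this way leaves the room for interpretation that a single number, or a traffic light, removes.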

Simplified visualizations such as traffic lights do not provide the necessary nuance and room for interpretation.

Student-facing learning dashboards can provide students with a transparent and useful view of the data that higher education institutes are collecting.
Furthermore, directly targeting students as the primary audience helps to introduce the concept of learning analytics in a positive and non-threatening manner, resulting in increased buy-in and stakeholder-driven demand for the expansion of learning analytics.

Student-facing learning analytics dashboards are thus a powerful means to offer data transparency to students.

Therefore, learning analytics, and learning dashboards in particular, present an opportunity to deliver the data transparency that the GDPR requires.


  • The context of the transition from secondary to higher education is challenging. In particular, the lower educational level of the students in transition and their possible lack of self-regulatory skills should be taken into account when developing and evaluating learning analytics interventions.
    The case study of transferring the Learning Tracker from a Massive Open Online Course (MOOC) to a Small Private Online Course (SPOC) shows that context matters.
    The different learning activities and the particular context of the SPOC, which prepares students for a high-stakes entrance exam, required adaptations to the Learning Tracker, in particular to the summarizing measures of learning activity on which it provides feedback.
  • The case study of the LASSI dashboard reveals that the different programs involved embed the dashboards differently within their support practices. The LASSI dashboard was found to be useful both in private conversations between student and student adviser and in workshops on learning and studying skills.

  • The case study of the REX dashboard reveals that the different programs involved embed the dashboards differently within their support practices. The results regarding the use and perceived usefulness of the dashboard have to be interpreted with these contextual differences in mind. Without knowing the context, the differences in impact are hard to understand or explain.
  • The case study of the POS dashboard reveals that the particular context can strongly influence the learning analytics solutions. For the POS dashboard, the users are aspiring students who, at the time of data collection, have no formal relationship with the participating institutions yet, which poses particular challenges for the feedback and for students’ data privacy. The procedures and technical solutions have to be adapted to this particular context. The use of pseudonymization based on the feedback code provides a particular example.
  • The case study of the SPOC dashboard for teachers and course builders shows that the actual dashboard will depend on the context, while the underlying open-source and modular technology stack can remain constant. Based on interaction with the stakeholders, the dashboard focuses on answering questions specific to the SPOC at hand. While this makes the dashboard more useful for the particular case study, it limits its wider application.
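As a hedged illustration of the pseudonymization idea mentioned for the POS dashboard (the secret key and code format below are invented, not the project’s actual setup), a feedback code can be mapped to a stable pseudonym with a keyed hash, so analytics records can be linked across sessions without storing the code itself or any identity:

```python
import hashlib
import hmac

# Hypothetical institution-held secret; the real POS configuration may differ.
SECRET_KEY = b"institution-secret"

def pseudonym(feedback_code: str) -> str:
    """Derive a stable pseudonym from a student's feedback code.

    The same code always yields the same pseudonym, so dashboard data
    can be linked across sessions, while the code (and any identity
    behind it) never enters the analytics database. Without the secret
    key, the pseudonym cannot be reversed or recomputed from the code.
    """
    return hmac.new(SECRET_KEY, feedback_code.encode(), hashlib.sha256).hexdigest()

alias = pseudonym("AB12-CD34")
assert alias == pseudonym("AB12-CD34")  # deterministic
assert alias != pseudonym("XY98-ZW76")  # distinct codes stay distinct
```

A keyed hash rather than a plain hash is the important design choice here: it prevents anyone without the key from linking a known feedback code back to its analytics records.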

The modularity of the software solutions is key to allowing integration with the existing resources at higher education institutes. A modular design ensures that stakeholders can choose between different software modules (e.g. they can choose to use proprietary software for particular tasks) for easy adjustment to their needs and optimal integration with existing systems.

The experiences from the project show that the developed learning analytics interventions will only be sustainable if they are embedded within the existing institutional IT choices, know-how, and educational policy and practices.

  • The experience with the Learning Tracker has shown that research teams and educational developers can build the required evidence for learning analytics interventions, supporting further continuation and long-term deployment of the initiatives.
  • The experiences with the LASSI, REX, and POS dashboards have shown that bottom-up initiatives, developed within projects, can build the required experience and can already secure support from the stakeholders and maximal integration with the existing institutional IT choices and know-how. However, for long-term continuation, the institutes involved should truly embed the developed learning analytics dashboards within their IT support and overall educational vision and practices.

Starting a learning analytics pilot at an institution of higher education can generally be accomplished without too many difficulties. However, two main factors challenge scalability: the availability of resources and the organizational complexity of the institution and its systems. Commitment of the executive board is necessary to obtain the resources needed to scale up learning analytics initiatives. Diversity in university governance, structures, and IT landscapes poses a challenge for national approaches to introducing learning analytics at scale. These factors translate directly to stakeholders within the university: leadership (which has to support a learning analytics initiative and commit resources) and the administrative apparatus of the institution (i.e. the adaptability of the institution’s administration and the interconnectedness and complexity of the IT landscape).

National cooperation organizations in the field of educational technology have demonstrated in the United Kingdom and the Netherlands that they can provide important support for higher education institutions when considering and implementing learning analytics. In these countries, they have given an impulse to the concrete application of learning analytics and also to accompanying research. National action plans for learning analytics are easier to initiate and execute the more institutions within said country share common organizational structures and IT systems landscapes, and are generally inclined to cooperate at the national level. National cooperation organizations can help in supporting this.


Before impact can be realized with learning analytics dashboards, dashboards have to find their way to practice. Therefore, these dashboards have to be accepted first by the stakeholders.

An important, and often first, step in evaluating learning analytics interventions is to assess the dashboards’ acceptance.

The involvement of the student advisers in the design of the learning dashboards, and in particular the possibility to edit the content of the dashboards directly, has created the required acceptance.

The transfer of the LASSI dashboard from KU Leuven to TU Delft showed that acceptance is key before impact can be realized. At TU Delft there was less buy-in from staff and students before the launch of the learning dashboards, resulting in a low response rate to the questionnaire preceding the intervention.
This shows that it is important that the learning analytics intervention is well accepted within the study programs, so that staff take responsibility for the communication towards students and students can make a well-informed decision on whether or not to participate.

The impact of learning analytics on academic achievement or retention can be hard to measure. While isolated contexts such as Massive Open Online Courses can provide opportunities for assessing the impact of learning analytics interventions on academic achievement and retention, similar applications in broader higher education contexts can prove more challenging.

  • First, learning analytics interventions often have no immediate academic evaluation, and thus no assessment, connected to them.
  • Secondly, real higher education contexts are not well-controlled environments where all conditions except the learning analytics intervention can be kept unchanged. Moreover, learning analytics interventions are often (and preferably) part of a larger strategy to improve learning or student support.
  • Thirdly, ethical restrictions apply within a real university context, which can, for instance, prevent testing with a treatment and a control group.

The impact of learning analytics interventions should, however, be considered more broadly than academic achievement and retention. Learning analytics interventions can also impact learning engagement and behavior, or the overall academic experience.

Learning analytics interventions can reach many stakeholders using two approaches:

  1. by learning analytics interventions that operate on a “general” level such that they are applicable to entire programs or even groups of programs.
    Examples are the LASSI, REX, POS, and NTU student dashboards.
  2. by learning analytics interventions targeting a very specific setting that itself reaches many stakeholders.
    An example is the Learning Tracker, which is implemented and adapted for particular online courses (MOOCs and SPOCs) but can reach a large target audience even within a specific course.