Yesterday’s LAEP/LACE expert workshop on policy and learning analytics focused on the future. Where is learning analytics going? What will be the barriers for moving forward? Below is a summary of the day’s discussions and activities:
The first activity involved considering the year 2019 (3 years ahead), and discussing in small groups what barriers and complications will be faced in learning analytics’ immediate future. Here’s what attendees came up with:
An important consideration is the General Data Protection Regulation (GDPR), which will be enforced within the next two years. This will impact the learning analytics field in many ways, many of which are largely unknown at this point. Europe has taken the standpoint that individual privacy is important, and that changes to current analytics practices are needed. Moving forward, the definition of personal data is going to become larger and more complex, and these legal changes will shift universities from the role of processors of data to that of controllers of data. This will lead to an increased need to help parents and students understand how their data is being used.
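One safeguard the GDPR encourages when handling personal data is pseudonymisation. The sketch below is a hypothetical illustration only — the key, identifiers, and fields are invented — of replacing direct student identifiers before data is analysed or shared:

```python
import hashlib
import hmac

# Hypothetical illustration: pseudonymising student identifiers before
# analysis. The key, identifiers, and fields below are invented.
# The secret key would be held by the institution (the data controller).
SECRET_KEY = b"institution-held-secret"  # placeholder only

def pseudonymise(student_id: str) -> str:
    """Return a stable pseudonym so records can be linked across
    analyses without exposing the raw identifier."""
    digest = hmac.new(SECRET_KEY, student_id.encode("utf-8"), hashlib.sha256)
    return digest.hexdigest()[:16]

records = [
    {"student_id": "s1234567", "logins": 42, "grade": 7.5},
    {"student_id": "s7654321", "logins": 3, "grade": 4.0},
]

# Strip the direct identifier before the data is analysed or shared.
pseudonymised = [
    {"pseudonym": pseudonymise(r["student_id"]),
     "logins": r["logins"],
     "grade": r["grade"]}
    for r in records
]
```

Because the pseudonym is keyed with an institution-held secret, records can still be linked across analyses, while the mapping back to real students stays with the institution.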
Another issue is that organisations, schools and companies that are privacy sensitive will be cautious and slow to adopt learning analytics, while those that are not privacy conscious will be the first on the market. There is a notion (a catch-22, perhaps) that we need to make sure our regulations are not pushing us too far behind: we are potentially at a competitive disadvantage when we follow the book ‘to a T.’
Three main ideas from this group: (1) There is a need to bring people and stakeholders on board by reaching out to teachers, students and staff in institutions. Currently our notion of ‘stakeholder’ is too narrow, which means that important elements of the problem are ignored. (2) Legislation is needed to create new rules and guidelines from an EU perspective, as well as guidelines to help students and teachers understand when learning analytics is safe to use and how to best harness its benefits. (3) Resources: we need to develop and create opportunities from the resources we already have, as well as create new resources to serve the needs we still have.
Three main points from Group 3 as well. (1) Institutional support: teachers need support to scale up their use of data and to experiment, and more time to spend on these issues. (2) More empirical evidence will be needed moving forward. (3) Grant proposals should explicitly include an evaluation phase, not just the design of new tools. Too often the goal is simply to make a tool, not to push that tool forward into use. Grants should also include one or two people who can spend a significant amount of time on them (i.e. more than 10%).
One consideration is that three years is one generation of students, but the lifecycle of a teacher is much longer. The teachers of tomorrow are already on campus, and we must work with what we have. We should also keep in mind the views and sentiments coming from wider society.
- Bringing the data back to the learner. There was repeated reference to the role of the student in the learning analytics process. It was noted that more resources are needed to help students make sense of learning analytics and findings from their own educational data in order to form meaningful conclusions about their own learning. One key aspect of this is the creation of resources and ‘wise advice’ at an administrative level within the university. The point was also pushed that students are not necessarily on a linear or standard career path (as many of those in the room could attest to from their own educational backgrounds).
- It is our duty to act upon the data we have. Learning analytics has now progressed to a level where many universities are able to collect and analyse data to make predictions of individual student success. Many in the workshop felt strongly that universities that can predict a student is at risk of failing or leaving have a duty or obligation to act on that knowledge. There was much discussion about how knowing is not enough, and more resources are needed to act on the ‘red flags’ that learning analytics highlights.
- Do we want learning analytics to change or reinforce the status quo? There seems to be a debate about whether we should approach learning analytics policies and practices from the supply or the demand perspective. Should we create learning analytics practices that change the way we do education? Or should the focus be on enhancing the education systems we already have (perhaps by ‘making life easier’ for the teacher)?
- Intelligent systems need human and cultural awareness. Today’s workshop marked perhaps the first time that I’d seen culture and learning analytics being discussed in depth (yay!). There was mention that human-written algorithms can reinforce institutional and personal biases. Similarly, there was the notion that a multi-cultural lens is needed to interpret data outputs. There are dangers in homogenising students and a need for human interpretation to make sense of data in a realistic, real world setting.
- Desirable learning outcomes must be identified. It’s not enough to simply collect data and analyse it. We need to understand how to interpret data and make sense of it in the context of the curriculum. Important questions include: What do we want students to know? What data can demonstrate this knowledge?
- Learning analytics should enhance teaching, not replace it. There were discussions about dystopian fears of education becoming a machine-like process and learning analytics eliminating the role of the teacher. That’s not the learning analytics we want to move towards — we want teaching practices to be aided by learning analytics, not eliminated.
- Individual achievements are more important than interpersonal comparisons. There were comments about normative ranking of students being problematic. Learning analytics should give insights about the individual student learning process, rather than pitting students against one another in competition for the best classroom ranking.
- What we measure is as important as how we measure it. Many comments warned against ‘easy data’ and interpreting based on the data we have, rather than the data we need. There was a fear that it is easier to measure the ‘wrong things’ than it is the ‘right things.’ There is a need to consider what we want to know about students and what data we need to get to that point, rather than just playing with data until we reach some semblance of a conclusion.
- We need to determine what analysis is too little versus what is too much. Many of the scenarios were considered undesirable because they leaned too strictly towards a rigid notion of behaviourism. There were many comments about the need to bring (innovative) pedagogy into the equation, which can help determine the appropriate ‘level’ of learning analytics for learning needs.
- We need more than just ‘flashy’ data. Big data is charming and many tools can create pretty graphs, but what is truly needed is a translation from ‘flashy’ to results. Important to achieving this is a focus on teachers and learners in realistic classroom settings. There is also a need for continued evidence of the benefits of learning analytics and evaluation of tools.
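The ‘duty to act’ point above rests on at-risk prediction. As a deliberately toy sketch of the kind of scoring such predictions involve — the features, weights, and threshold here are all invented for illustration and not drawn from any real institutional model:

```python
import math

# Toy example: the features, weights, and threshold are invented for
# illustration and not taken from any real system.
WEIGHTS = {"logins_per_week": -0.4, "missed_deadlines": 0.9, "avg_grade": -0.6}
BIAS = 1.0
THRESHOLD = 0.5  # flag students whose estimated risk exceeds 50%

def risk_score(features):
    """Logistic score in (0, 1); higher means greater predicted risk
    of failing or leaving."""
    z = BIAS + sum(WEIGHTS[name] * value for name, value in features.items())
    return 1 / (1 + math.exp(-z))

def flag_at_risk(students):
    """Return the students whose score crosses the threshold -- the
    'red flags' the institution then has a duty to act on."""
    return [name for name, features in students.items()
            if risk_score(features) > THRESHOLD]

cohort = {
    "student_a": {"logins_per_week": 5, "missed_deadlines": 0, "avg_grade": 8.0},
    "student_b": {"logins_per_week": 1, "missed_deadlines": 4, "avg_grade": 4.0},
}
```

With this sample cohort, only `student_b` crosses the threshold. The model itself is the easy part; as the discussion stressed, the hard part is resourcing what happens after a student is flagged.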
The final segment of the workshop considered actionable changes in policy. This was divided into two categories: “EU policy: Action NOW!” and a ‘Wish List’ for the future. Below is a summary of the discussion.
- Innovative pedagogy: There is a need for novel, innovative pedagogy. We need to flip the equation and let pedagogy drive the use of data to solve practical problems.
- Data privacy: a clear statement is needed from privacy commissioners about controls to protect learners, teachers and society
- Orchestration of grants: The current system does not benefit learning analytics research, and more efficiency is needed to move progress forward. Considerations include a need to focus on evidence over tools, steps to counter duplication of work, and shorter tenders to cut down the administrative process
- LACE evidence hub funding: the funding for the LACE evidence hub is due to expire soon, but this project serves a vital, necessary function that drives the field forward. Continued funding is needed to make sure a repository for evidence and transparency continues to exist.
- 21st century skills: There is a risk that learning analytics will push us towards a narrow view of education, as it promotes primarily what is measurable. One key missing area of measurement is 21st century skills and their relationship to learning analytics, so that a particular view of education is not favoured.
- Focus on process rather than outcomes: Policy focus should be not just on measurable outcomes, but also on the holistic process of learning
- Crowd sourced funding support: One consideration would be a crowdsource-like funding of tools that teachers actually need, perhaps with EU top-up funding when crowdsourcing is successful
- Ambassadors of the cause: In the outside world, very few people are actually aware of what is going on in learning analytics. We need more outreach: conferences, active local communities, building learning analytics into national curriculums, etc
- Open access standards: There is currently no standardisation of open standards in Europe. There are now small profiles for standards (JISC work in the UK is a good example), but these need to be put into practice for analytics.
- Teacher education: media competencies and learning analytics knowledge need to be built into the education of both new and existing teachers
- Orchestration of grants (see above)
- Decide which problems we want to solve: We can’t move the field forward until we have collective discussions on what direction we want to go
- Openness and transparency (see above)
- Analytics for 21st century skills (see above)
- Learning analytics for the process of learning (see above)
- Identify success cases that give us a solid foundation: We need more examples of successful analytics (such as those compiled by the LACE evidence hub!), in order to learn from those who are doing it well
- Supporting teachers, standards and technology architecture: There needs to be a cost-benefit analysis of investing in learning analytics. Tools and technology are needed, as well as IT workers in the school to manage it. However, we need pedagogy driving that innovation and teacher involvement in developing systems.
- Identify successful methodologies: (similar to #7)
- Facilitate data amalgamation: More consideration is needed of how to combine data sources into a multi-faceted insight into the problems we seek to solve. Another important problem is the ability to exchange data and findings between institutions.
- Consider where we want to go next: Learning analytics is not just about numbers and statistics. We can use our skills to tackle big problems at the university, such as gender and diversity issues. We need to consider how learning analytics and the combination of data sources can fit in with the wider university mission.
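The data amalgamation point above can be made concrete with a minimal sketch. The sources and field names here are hypothetical — say, a VLE and a library system that share a pseudonymous student identifier:

```python
# Hypothetical sketch: combining per-student records from two data
# sources (a VLE and a library system, both invented here) that share
# a pseudonymous student identifier.
vle = {"p01": {"logins": 42}, "p02": {"logins": 3}}
library = {"p01": {"loans": 7}, "p03": {"loans": 1}}

def amalgamate(*sources):
    """Merge per-student records from several sources into one view;
    a student missing from a source simply lacks those fields."""
    combined = {}
    for source in sources:
        for student, fields in source.items():
            combined.setdefault(student, {}).update(fields)
    return combined

merged = amalgamate(vle, library)
```

The mechanics of merging are trivial; the genuinely hard parts flagged in the workshop are agreeing on shared identifiers and formats so that institutions can exchange such data at all.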
Finally, there was a vote to determine the top needs of the field moving forward. The following won in each category:
EU Policy: Action NOW!
1.) Innovative pedagogy
2.) More evidence and funding for the LACE evidence hub
3.) Ethics policies
Wish List
1.) Teacher education training
2.) Deciding which problems we are solving and how we are solving them