LAEP/LACE expert workshop: Summary of Day 2

Yesterday’s LAEP/LACE expert workshop on policy and learning analytics focused on the future. Where is learning analytics going? What will be the barriers to moving forward? Below is a summary of the day’s discussions and activities:


The first activity involved considering the year 2019 (3 years ahead), and discussing in small groups what barriers and complications will be faced in learning analytics’ immediate future. Here’s what attendees came up with:

Group 1
An important consideration is the General Data Protection Regulation (GDPR), which will come into force within the next two years. It will affect the learning analytics field in many ways, many of which are still largely unknown. Europe has taken the standpoint that individual privacy is important and that changes to current data analytics practices are needed. Moving forward, the definition of personal data is going to become broader and more complex, and these legal changes will shift universities from the role of processors of data to that of controllers of data. This will lead to an increased need to help parents and students understand how their data is being used.

One other issue is that organisations, schools and companies that are privacy sensitive will be cautious and slow to adopt learning analytics, while those that are not privacy conscious will be first to market. The catch-22 is that we need to make sure our regulations are not pushing us too far behind: we are potentially at a competitive disadvantage when we follow the book ‘to a T.’

Group 2
Three main ideas from this group: (1) There is a need to bring stakeholders on board by reaching out to teachers, students and staff in institutions. Currently, our notion of ‘stakeholder’ is too narrow, which means that important elements of the problem are ignored. (2) Legislation is needed to create new rules and guidelines from an EU perspective, as well as guidelines to help students and teachers understand when learning analytics is safe to use and how best to harness its benefits. (3) Resources: we need to develop and create opportunities from the resources we already have, as well as create new resources to serve the needs that remain.

Group 3
Three main points from Group 3 as well. (1) Institutional support: teachers need support to scale up their use of data, and more time to spend on these issues. (2) More empirical evidence will be needed moving forward, which means supporting teachers to experiment. (3) Grant proposals should explicitly include an evaluation phase, not just the design of new tools. Too often the goal is simply to make a tool, not to push that tool forward into use. Grants should also include one or two people who can spend a significant amount of time on them (i.e. more than 10% of their time).

Group 4
One consideration is that 3 years is one generation of students, but the career of a teacher is much longer. The teachers of tomorrow are already on campus, and we must work with what we have. We should also keep in mind the views and sentiments coming from wider society.

Transparency is also needed, as no one truly knows what is happening inside the algorithms. There are discussions about ‘safe spaces’ in education and student aversion to being ‘shocked’ by educational materials; how will that affect their views of learning analytics? There is a need to distinguish ourselves from the negative portrayals of ‘big data’ in the media (e.g. Facebook, Google), otherwise we will fail, because student sentiment will lead in the end.

Group 5
Group 5 highlighted two primary concerns: (1) There needs to be a focus on quality and resource creation. Too often, learning analytics projects are funded for a finite amount of time. What happens when the funding ends? How do we continue to push initiatives forward once projects are finished? (2) In the end, we must consider: what do we want from learning analytics? This workshop has focused on cognitive gains, but we should also consider emotional and social elements.

Group 6
There are often problems because motivations and policy initiatives have effects at different levels. There is a problem when IT departments control data because of institutional policies, leaving faculties unable to access the complete set of data available about students. Institutional policies about continuous assessment enable experimental analytics to come into play. Government and agency policy items, such as fee structures, affect learning at all levels. There is a need for quality control, but there is also the question of who would be responsible for managing and controlling it. Finally, there is an important question of how to drive innovation in the direction it needs to go when there is resistance at multiple levels. We simply don’t have enough empirical evidence at this point to counter that resistance.

Group 7
One issue that needs attention is restraint. When thinking about technology and the future, we need to consider the danger of getting carried away and doing too much too soon. There are many fun and interesting things we can do with data, but at the same time we must keep in mind that the analytics we do must help students succeed and foster education. Perhaps we should aim for broad, shallow adoption across institutions rather than just a small few doing learning analytics at a high level. At a basic level, the field is already mature enough to show results, but institutions must take care not to be distracted by technological possibilities.

Foresight exercise 
The workshop next looked further into the future, to the year 2025. Each group considered a series of fictional case studies from the future. If you’re interested in a play-by-play, my colleague Doug Clow kept an excellent live blog (here and here). As a brief summary, here are a few key themes and considerations put forth during the activity (in no particular order):
  1. Bringing the data back to the learner — there was repeated reference to the role of the student in the learning analytics process. It was noted that more resources are needed to help students make sense of learning analytics and of findings from their own educational data, so that they can form meaningful conclusions about their own learning. One key aspect of this is the creation of resources and ‘wise advice’ at an administrative level within the university. It was also stressed that students are not necessarily on a linear or standard career path (as many of those in the room could attest from their own educational backgrounds).
  2. It is our duty to act upon the data we have. Learning analytics has now progressed to a level where many universities are able to collect and analyse data to make predictions about individual student success. Many in the workshop felt strongly that universities that can predict a student is at risk of failing or leaving have a duty or obligation to act on that knowledge. There was much discussion about how knowing is not enough, and more resources are needed to act on the ‘red flags’ that learning analytics highlights.
  3. Do we want learning analytics to change or reinforce the status quo? There seems to be a debate about whether we should look at adopting learning analytics policies and practices from the supply or the demand perspective. Should we create learning analytics practices that change the way we do education? Or should the focus be on enhancing the education systems we already have (perhaps by ‘making life easier’ for the teacher)?
  4. Intelligent systems need human and cultural awareness. Today’s workshop marked perhaps the first time that I’d seen culture and learning analytics being discussed in depth (yay!). There was mention that human-written algorithms can reinforce institutional and personal biases. Similarly, there was the notion that a multi-cultural lens is needed to interpret data outputs. There are dangers in homogenising students and a need for human interpretation to make sense of data in a realistic, real world setting.
  5. Desirable learning outcomes must be identified. It’s not enough to simply collect data and analyse it. We need to understand how to interpret data and make sense of it in the context of the curriculum. Important questions include: What do we want students to know? What data can demonstrate this knowledge?  
  6. Learning analytics should enhance teaching, not replace it. There were discussions about dystopian fears of education becoming a machine-like process and learning analytics eliminating the role of the teacher. That’s not the learning analytics we want to move towards — we want teaching practices to be aided by learning analytics, not eliminated.
  7. Individual achievements are more important than interpersonal comparisons. There were comments about normative ranking of students being problematic. Learning analytics should give insights into the individual student’s learning process, rather than pitting students against one another in competition for the best classroom ranking.
  8. What we measure is as important as how we measure it. Many comments warned against ‘easy data’ and against interpreting based on the data we have rather than the data we need. There was a fear that it is easier to measure the ‘wrong things’ than the ‘right things.’ We need to consider what we want to know about students and what data will get us there, rather than just playing with data until we reach some semblance of a conclusion.
  9. We need to determine what analysis is too little versus what is too much. Many of the scenarios were considered undesirable because they leaned too strictly towards a rigid notion of behaviourism. There were many comments about the need to bring (innovative) pedagogy into the equation, which can help determine the appropriate ‘level’ of learning analytics for given learning needs.
  10. We need more than just ‘flashy’ data. Big data is charming and many tools can create pretty graphs, but what is truly needed is a translation from ‘flashy’ to results. Important to achieving this is a focus on teachers and learners in realistic classroom settings. There is also a need for continued evidence of the benefits of learning analytics and evaluation of tools.

The final segment of the workshop considered actionable changes in policy. This was divided into two categories: “EU policy: Action NOW!” and a ‘Wish List’ for the future. Below is a summary of the discussion.

EU Policy: Action NOW!
  1. Innovative pedagogy: There is a need for novel, innovative pedagogy. We need to flip the equation so that pedagogy drives the use of data to solve practical problems.
  2. Data privacy: a clear statement is needed from privacy commissioners about controls to protect learners, teachers and society.
  3. Orchestration of grants: The current system does not benefit learning analytics research, and more efficiency is needed to move progress forward. Considerations include a focus on evidence over tools, steps to counter duplication of work, and shorter tenders to reduce the administrative burden.
  4. LACE evidence hub funding: the funding for the LACE evidence hub is due to expire soon, but this project serves a vital function that drives the field forward. Continued funding is needed to make sure a repository for evidence and transparency continues to exist.
  5. 21st century skills: There is a risk that learning analytics will push us towards a narrowly objective view of education, as it primarily promotes what is measurable. One key missing area of measurement is 21st century skills and their relationship to learning analytics; measuring these would help ensure that a particular view of education is not favoured.
  6. Focus on process rather than outcomes: Policy should focus not just on measurable outcomes, but also on the holistic process of learning.
  7. Crowdsourced funding support: One consideration would be crowdsourced funding of tools that teachers actually need, perhaps with EU top-up funding when crowdsourcing is successful.
  8. Ambassadors of the cause: Outside the field, very few people are aware of what is going on in learning analytics. We need more outreach: conferences, active local communities, building learning analytics into national curricula, etc.
  9. Open access standards: There is currently no standardised adoption of open standards in Europe. Small profiles for standards now exist (JISC’s work in the UK is a good example), but these need to be put into practice for analytics.

Wish list

  1. Teacher education: media competencies and learning analytics knowledge need to be built into the education of both new and existing teachers
  2. Orchestration of grants (see above)
  3. Decide which problems we want to solve: We can’t move the field forward until we have collective discussions on the direction we want to take
  4. Openness and transparency (see above)
  5. Analytics for 21st century skills (see above)
  6. Learning analytics for the process of learning (see above)
  7. Identify success cases that give us a solid foundation: We need more examples of successful analytics (such as those compiled by the LACE evidence hub!), in order to learn from those who are doing it well
  8. Supporting teachers, standards and technology architecture: There needs to be a cost-benefit analysis of investing in learning analytics. Tools and technology are needed, as well as IT workers in the school to manage it. However, we need pedagogy driving that innovation and teacher involvement in developing systems.
  9. Identify successful methodologies: (similar to #7)
  10. Facilitate data amalgamation: More consideration is needed of how to combine data sources to gain multi-faceted insight into the problems we seek to solve. Another important challenge is the ability to exchange data and findings between institutions.
  11. Consider where we want to go next: Learning analytics is not just about numbers and statistics. We can use our skills to tackle big problems at the university, such as gender and diversity issues. We need to consider how learning analytics and the combination of data sources can fit in with wider university missions.

Finally, there was a vote to determine the top needs of the field moving forward. The following won in each category:

EU Policy: Action NOW!
1. Innovative pedagogy
2. More evidence and funding for the LACE evidence hub
3. Ethics policies

Wish list
1. Teacher education training
2. Deciding which problems we are solving and how we are solving them

 
