LAEP/LACE expert workshop: Summary of Day 1

Today’s LAEP/LACE expert workshop in Amsterdam brought together nearly 50 learning analytics experts from approximately 14 countries to discuss the role of policy in learning analytics. The culminating activity of Day 1 was a group exercise to suggest key areas that should be considered in future practice and policy creation. So what happens when you put a bunch of learning analytics experts in one room and ask them to make suggestions about moving the field forward? Here is what the groups came up with:

Group 1
Learning analytics needs rich data, but learning management systems can only give us activity data, such as clicks and time stamps. What is needed is data from other, richer sources that are complementary to activity data. Two examples: data from formative assessment (assessment for learning, rather than of learning) and student disposition data. (One example of a university using this approach is Maastricht University School of Business and Economics.)

Group 2
We’ve had a lot of discussions about how to get rich data, but one important question is: how do we get teachers to actually use this data? Many teachers are conservative in their way of teaching and are rather qualitatively oriented. Thus, how do we get them to work with quantitative data, with which they may feel uncomfortable? We first need to convince teachers that using learning analytics is a good idea.

Group 3
We have seen long lists of tools, but an important question is: when we give this exhaustive list of resources to teachers, what do they do with it? In reality, most are probably unsure about which tools best fit their specific needs. Group 3’s idea is to create an evaluation framework that helps teachers make tool selections. Rather than focusing on building more tools, we should instead focus on helping schools and teachers find the right ones for their specific needs.

Group 4
Much of the discussion so far today has been about data, projects and models, but there hasn’t been much discussion about how these connect with education. There is a tendency to think about learning analytics from the supply side (i.e. the IT side). However, you also have to consider the ways in which teachers change how they work in order to adopt the technology. Group 4’s suggestion is to look instead at the demand side, putting the teacher in the spotlight to understand what they want and need. The system should work for the teacher, not the other way around.

Group 5
Group 5 related their ideas to ‘fun.’ They argued for including a human side to learning analytics. Much of the learning analytics discussion has been focussed on performance metrics and how these affect teachers/learners/policy makers. However, there is delight and motivation inherent in education and in gaining information about students’ learning from learning analytics systems. Learning analytics should empower learners and teachers to make the right decisions for their needs. We should work on that empowerment, keeping in mind that it’s not about activity data, but rather about rich data and the human side of learning.

Group 6
Group 6 also focussed on the teacher in learning analytics, with a message similar to Group 4. They argued that there is a plethora of tools available on the supply side of the learning analytics equation, but the demand side hasn’t yet caught up. Perhaps it is time to flip the thinking and focus on teacher wants and needs.

Group 7
Group 7 considered how to bridge the gap between educational research and institutional use of learning analytics. They argued that a bottom-up approach is needed, in coordination with a top-down approach. On one hand, it is important for management to state that learning analytics is important and why the institution needs it. From the bottom, there is a need for enthusiastic people who want to run educational experiments with data. These two groups need to start talking to each other, which is often not the case.

Group 8
Analytic tools are plentiful, and they are capable of presenting nice pictures and graphs about student data to anyone interested. However, the teachers and managers who buy into these systems need more information about which tools are most useful for their specific needs. There should be more resources to help schools and teachers decide which practices will work for them in a real-world setting.

Several key themes could be seen in this activity, as well as discussions throughout Day 1:

  1. There was a focus throughout the day’s discussions on the teacher. Partly this was in the context of incorporating teachers as key stakeholders in the adoption of learning analytics practices. After all, for learning analytics to be sustainable, it must work for and with existing education practices. In other cases, there was a notion that teachers must be convinced of the merits of using learning analytics, and that (in a way) their hesitancy or gap in analytic skills represented a barrier to wide-scale adoption. Key questions for moving forward: Do teachers have time for learning analytics? How do we train teachers to use quantitative data? How does learning analytics enrich the classroom rather than distract from it? What merits do we use to ‘sell’ the use of analytics to teachers?
  2. Learning analytics needs to be more than just numbers. As education is inherently a complex and qualitative process, it is important not just to quantify learning, but to consider and incorporate real, authentic human elements into the discussion. It is important to keep in mind that learning analytics is meant to serve and enrich education. However, to preserve this notion, it is important to consider what we mean by ‘education.’ What goals and values does education serve in our society, and how can learning analytics enhance them?
  3. The field cannot mature without further considering data ethics and privacy. Points were made today that this is not just about student ownership of their own data, but also about the privacy of the teacher. This is particularly important as learning analytics becomes increasingly tied to student performance and can, thus, be linked with teacher performance.
  4. There were several ideas put forth today that student data should go beyond simple activity data. Rather, analysis should include rich data sources such as disposition or assessment data to create a more rounded picture of student performance. At the same time, there was pushback that activity data can be useful as a ‘quick and dirty’ understanding of students’ understanding and engagement. Key questions to consider here: What do we want to know about students? What data do we need to get there?

However, just as important as what we did discuss are the things we didn’t discuss. Here are some notable absences in today’s discussion, which may need more attention in Day 2 of the workshop (and in moving the field forward in general).

  1. Much of today’s discussions centred on institutions, teachers and tools. One critical aspect that was not explicitly addressed was the student. How is learning analytics serving students, and what tangible benefits do they gain from the adoption of learning analytics practices and policies?
  2. Today’s talks mostly considered learning analytics in the higher education environment, whereas more discussion is needed on how this translates to other domains: schools, workplace, informal learning, etc.
  3. It’s easy to make broad suggestions for moving forward, such as ‘include more teachers’ or ‘use more data.’ However, little discussion has centred on how to put these good ideas into practice. What is explicitly needed in policy and practice to push these initiatives forward? What realistic objectives can we work towards in the creation of new policies to support our goals in the field?
  4. Here’s me being a little biased and tying in my own research: there was plenty of discussion about accommodating the diversity of needs from a teacher perspective. However, there was little talk of the diversity of the students themselves. When we consider the ‘human’ elements of learning analytics, we must also consider that the human beings whom the data serves are multi-faceted themselves and come from a wide variety of cultures and ethnicities. How do diversity and culture play a role in (1) the data we collect about student dispositions and activities and (2) students’ views of the ethical use of their data? I personally feel that there is an urgent need in learning analytics to not homogenise students and their backgrounds, and to consider how things such as ethnicity and culture affect student behaviours (as this is also key to transferability of findings between institutions internationally). Similarly, we must remember that this ‘human element’ is not a synonym for, or a ‘one size fits all’ label applied to, all those served by education systems.
