Today’s LAEP/LACE Expert Workshop presented findings so far from the LAEP project. You can find most of the resources online for comment. Here’s what was covered from the project today:
LAEP research aims:
- What is the current state of the art?
- What are the prospects for the implementation of learning analytics?
- What is the potential for European policy to be used to guide and support the take-up and adaptation of learning analytics to enhance education in Europe?
- What do we want in the future from learning analytics (looking 10-15 years on), and how can policies influence this?
This was a nine-month study, which delivers its final report in June. It was made up of several strands: a literature review, a glossary, an inventory, case studies, an expert workshop, and the final report.
The glossary is aimed at someone brand new to learning analytics, so they can look up the main terms. It was developed using frequently used keywords from the LAK dataset.
This literature review focused on implementation, which is a new addition to the field. A few key areas were highlighted, which need to be addressed in order to move the field forward:
- Underpinning technology
Systems need to be technically and semantically interoperable. We don’t want everyone across Europe reinventing the wheel and not sharing expertise; we need systems that can talk to each other and can link to and build on each other. We also need clarity and context in the data sets: are the learners school children or university students? We need to think about the quality of data and whether our data sets are complete. Do they have holes or gaps? Are they out of date? Are people giving us correct data? Who owns the data: the individuals who generated it, the institution, or groups? At the moment we don’t really know the answers to these questions. We need to know who should have access to data and why, and when that access is needed or should be removed. More research is also needed on data warehousing and storage.
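To make “systems that can talk to each other” concrete, here is a minimal sketch (an illustration, not an artefact of the LAEP report) of what an interoperable learning record can look like, loosely following the actor/verb/object shape of the Experience API (xAPI) statement format. The email address and course URL are invented placeholders; the verb URI is a standard xAPI example.

```python
import json

def make_statement(actor_email, verb_id, activity_id):
    """Build a minimal xAPI-style statement as a plain dictionary.

    Because the structure and vocabulary are shared, any system that
    understands the format can store, exchange, and analyse the record.
    """
    return {
        "actor": {"objectType": "Agent", "mbox": f"mailto:{actor_email}"},
        "verb": {
            "id": verb_id,
            # Human-readable label derived from the verb URI's last segment.
            "display": {"en-US": verb_id.rsplit("/", 1)[-1]},
        },
        "object": {"objectType": "Activity", "id": activity_id},
    }

stmt = make_statement(
    "learner@example.org",                          # placeholder learner
    "http://adlnet.gov/expapi/verbs/completed",     # standard xAPI verb URI
    "http://example.org/courses/intro-analytics",   # placeholder course
)
print(json.dumps(stmt, indent=2))
```

The point of such a shared format is exactly the interoperability concern above: two institutions using the same statement structure can pool or compare records without bespoke translation layers.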
- Policy, codes of practice and governance
A lot of data has been gathered without people necessarily knowing it has been collected. How can we tell them about what is going on? How do we involve people in the creation of policy? How do we preserve privacy and adapt practices as things change? There is a need to keep people updated with what we are doing; otherwise their consent is meaningless, as they do not know what they are consenting to.
- Skills and data literacies
Current capability to develop and deploy analytics is still low. There is a skills gap not only among those who can do the analysis, but also among those (teachers) who need to know how to use the analytics once they are produced. Data may mislead teachers, or they may simply set it aside because it is meaningless to them. We need an open and shared analytics curriculum covering both technology and pedagogy. Research is needed on the use of visualisation and on making it meaningful to others.
- Culture, values and professional practice
We need to relate analytics to the purpose of education. To some extent we agree on that, but it can also vary by our context or institution. We need to think about what we are trying to achieve with education and how we can link analytics to that so it really works for people. If we want people to use analytics, how do we build that into training, so they can be confident with using it? Educators need to be able to use the tools and make informed decisions.
The purpose of the inventory was to show the state of the art of the practical adoption of learning analytics. It was a “broad but shallow” collection of informative examples in three areas:
- Policy documents (14 entries)
- Practices (13 entries)
- Tools (19 entries)
These can be viewed online and added to; additions are welcome.
LAEP also did an in-depth case study of six organisations:
- BlueCanary as a commercial provider of data science expertise
- Kennisnet’s work in raising sector awareness, knowledge and skills
- the University of Technology Sydney’s use of learning analytics as part of a data-intensive strategy
- Apereo, and their creation of an open-source software stack for learning analytics
- Norway’s recent government initiatives and funding of a national centre for learning analytics (SLATE)
- the Open University’s institutional initiatives to create an ethics policy specific to learning analytics
As part of the case study process, LAEP also considered the role of policy for each organisation. It is important to note that each comes from a different perspective, with its own motivation. Nevertheless, several key policy considerations were identified across the case studies:
- A need to holistically include stakeholders in policy discussions, including students, teachers and industry. It is important for stakeholders to be defined and actively incorporated in the process.
- Testing and evaluation of schools/students may lead schools/teachers to shun the use of new technologies. There is a perception that evaluation methods may favor outdated practices.
- A need for more explicit ethics policies specific to education, at institutional, national and international levels.
- A shared data dictionary, and the ability to share data ethically between institutions.