
Learning Analytics at ALT-C 

Author: Matt Jenner, Learner Experience Lead at FutureLearn

The Association for Learning Technology (ALT) represents technologists, academics, designers, researchers and policy makers from many organisations and sectors across the UK, all with an interest in learning technology. ALT-C is the annual conference, attended by around 500 members, with an impressive attendee-to-presenter ratio of about 2:1.

Like any conference, it offered plenty of chances to interact with practitioners and vendors, discover small projects, and be inspired. In this piece, I want to share some insights that arose around data and learning analytics and highlight some of the key platforms and tools on show.

Data and Learning Analytics

This year’s theme was Data, Dialogue & Doing, and my goal was to discover more about what the sector is doing in the space of data and learning analytics. From the many sessions I attended, several related themes arose around the use of data and/or learning analytics – but to start, a quick definition taken from JISC:

“Every time a student interacts with their university – be that going to the library, logging into their virtual learning environment or submitting assessments online – they leave behind a digital footprint. Learning analytics is the process of using this data to improve learning and teaching.”

This matters: as the field of learning analytics and the use of data grow, the term can mean different things to different people. While there is broad consensus, the detail of what analytics means, and how it can be used, is still somewhat debated. This wasn’t itself the theme of the conference, but it was evident from a variety of factors that the field is still emerging.

Predictors and indicators

The idea of using data to provide a dashboard or traffic-light system showing students ‘at risk’ is still emerging and immature. There remains a goal (or unfulfilled potential) for learning analytics: that it should provide some level of support for students who may become lost in the system or drop out. The goal of a predictor or indicator is to take data, interpret it with algorithms, and present the result in a human-readable format.

However, at this stage the data collected is not comprehensive, because courses are not consistent. There is no design standard that can be used to model data-driven outputs for 100% accurate reporting. The indicators themselves are therefore missing several important factors that would make them reliable for reporting, or for offering any kind of comprehensive window into students’ learning or progress.
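To make that concrete, here is a minimal sketch of how such a traffic-light indicator might be built, assuming some hypothetical engagement signals (logins, assessment submissions, time in the VLE). The signals, weights and thresholds are illustrative only, not taken from any real system – and note how anything the platform doesn’t capture simply never enters the score:

```python
# A minimal sketch of a traffic-light 'at risk' indicator.
# All signals, weights and thresholds here are hypothetical.
from dataclasses import dataclass


@dataclass
class StudentActivity:
    logins_last_week: int
    assessments_submitted: int
    assessments_due: int
    vle_minutes_last_week: float


def risk_label(activity: StudentActivity) -> str:
    """Interpret raw activity data and return a human-readable label."""
    # Score each signal between 0 and 1. Anything not captured by the
    # platform (library visits, study groups, off-platform reading)
    # never enters the score - the incompleteness problem noted above.
    login_score = min(activity.logins_last_week / 3, 1.0)
    submit_score = (activity.assessments_submitted / activity.assessments_due
                    if activity.assessments_due else 1.0)
    time_score = min(activity.vle_minutes_last_week / 120, 1.0)
    score = (login_score + submit_score + time_score) / 3

    if score >= 0.66:
        return "green"
    if score >= 0.33:
        return "amber"
    return "red"


print(risk_label(StudentActivity(1, 0, 2, 30.0)))  # -> red
```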

Humanising data 

There was a strong call from different sessions that we should remain critical of our increasingly data-driven world. The use of analytics as a measurement of human performance needs to be scrutinised. Much like an individual who can’t take out a financial loan because the bank’s computer has decided ‘no’, we must be sure that a human can intervene when the learning system shows a learner at risk. Learning is one of the most complex processes and sets of actions humans undertake; it is therefore highly risky to draw conclusions from data alone.

There is value, however, in using data to make more informed decisions about which students may be more likely to need support. How the data is used, and the impact it can have, remains an open part of the debate. A clear example came out of a session on improving student feedback at scale – using data-informed approaches while staying linked to learning design, educator support and personalised learning:

OnTask is a platform for sending customised, personalised emails to students based on data. The tool creates emails to send in bulk, but each one includes segments of feedback chosen by logical rules derived from both the course data and the supporting feedback that educators want to send to their students. The rules may draw on participation, activities, assessments, content consumption, group work and more – broadly, whatever data is collected can be linked to the design and support from the course team. We’re exploring this tool and how it may be used to support FutureLearn learners.

Find out more: https://www.ontasklearning.org/ 
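To illustrate the general approach – educator-written feedback segments gated by logical rules over course data – here is a minimal sketch in Python. The data fields, rules and message text are hypothetical, and this is not OnTask’s actual API:

```python
# A sketch of rule-based personalised feedback, in the spirit of OnTask.
# Field names, thresholds and wording are hypothetical examples.

def build_feedback(student: dict) -> str:
    """Assemble a personalised message from educator-written segments,
    each included or excluded by a rule over the course data."""
    segments = [f"Hi {student['name']},"]

    if student["quiz_average"] < 50:
        segments.append("Your quiz scores suggest revisiting this week's "
                        "materials before the next assessment.")
    else:
        segments.append("Your quiz results are on track - keep it up!")

    if not student["posted_in_forum"]:
        segments.append("You haven't joined the group discussion yet; "
                        "your study group is waiting in the course forum.")

    return "\n\n".join(segments)


print(build_feedback(
    {"name": "Sam", "quiz_average": 42, "posted_in_forum": False}))
```

The key design point is that the rules are written by educators as part of the learning design, so the bulk emails stay linked to pedagogy rather than being generated from data alone.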

Quality of data

There were several talks on using data built on poor, or non-existent, learning designs. When the underlying data is unreliable or unstructured, collecting and using it yields very weak capabilities for data-driven decisions and insight. While the problem is more prevalent in blended and on-campus courses, it was recognised as a key challenge for organisations generally in their use of the data generated by the platforms they use.

Student spaces

In addition, it was widely recognised that learners operate in three spaces:

  • Institutional: the official and recognised tools and platforms that form part of a student’s studies. Usually includes the VLE/LMS and other academic and scholarly platforms.
  • Personal: the tools and platforms each student uses for their own studies; these could include social media, office tools, websites and physical technologies.
  • Invisible: more likely to include social media, informal learning environments and groups where content is shared and discussions take place.

Of these three spaces, only the institutional space can generate data usable for any kind of analytics, yet students are active in all three, each to a different degree. This alone shows why using data to support students, and using analytics to predict and support learning outcomes and success, is no easy challenge.

Stakeholder engagement 

There are many stakeholders when it comes to data, and according to a comprehensive EU-wide research project they have somewhat disjointed priorities:

  • Managers: improve student performance, teaching excellence, student satisfaction and retention.
  • Teachers: support students in self-regulated learning, make better decisions, improve student engagement and enhance programme-level quality.
  • Students: personalised support, improved feedback and better access to resources.

There were many other themes at the conference, but given the focus on data and learning analytics I have concentrated, at least in this write-up, on the above themes as the core takeaways. Data-driven innovation in education continues to show a need for further dialogue around the direction of data and learning analytics. From a FutureLearn perspective, my takeaways are to look at how we can use data to help our partners deliver on their KPIs, and to support learners in achieving their goals and outcomes.
