29 July 2016

Learning analytics, even student dashboards, are they the wrong way round?

In a recent session discussing a learning analytics project, I seemed to be the only person in the room who was, once again, anxious about the whole idea. I've been this way ever since George Siemens started the Google Group some 10 years ago. That anxiety culminated in a presentation I made to the University Analytics forum in Melbourne in 2012, which, I'm sad to say, generated little to no response, as have my posts to the forum. Is it just me and my tin foil hat, or is there a general reluctance to talk about an elephant in the room with learning analytics?

The best I've seen from the overall movement is a general agreement that it is ethical and progressive to develop analytics as a "student dashboard": that is, the effort is first and foremost about collecting data so that the individuals the data describes can see and reflect on their own patterns, and in relation to the demographic groups that seem relevant to them, presently and historically. The antithesis of this is the collection of data for teachers and administrators to roughly calibrate their behaviouralist experiments - which is what most learning analytics projects are about.

But in this recent session, it occurred to me that even the projects describing themselves as "student dashboard" projects seem to be allowing themselves to be drawn a very long way away from the principles behind developing that way. Most such projects I have seen seem frustrated by the difficulty of obtaining useful data, and end up narrowing their scope to a single environment like an LMS, or a handful of online social platforms, within a single course. They accept that this renders the project an unscalable proof of concept, and acknowledge that they leave out far too much of a person's wider context to gain any really useful insights. Is there another way to try to uncover insights about learning? A way that better fits the principle of the student dashboard, and potentially encompasses that wider context that seems impossible to account for?

I'm suggesting a closer affiliation with the Quantified Self field. Who in the learning analytics world is investigating the large range of mobile applications designed to assist with time management and task completion, for example? It could be that one of these, or a combination of them, offers students an optional way to record and manage their own data, and even to pool it with an online community or collection for comparison and a bigger picture. This approach would inherently deal with many of the ethical concerns around a university gathering data on students - often without even a research ethics application!

It seems to me that this suggestion would at least qualify the data currently being collected in the more top-down approaches, if not serve as a control for it. But I suspect it's more than that. With the right additions, the voluntary and guided use of such apps and methods might be the very thing that "student dashboard" projects set out to achieve. The outcomes of projects taking this approach might include a range of suggested apps and guided activities to help participants make the most of logging their lives around learning; ways to pool data for comparisons; and ways to better design course curriculum to help students manage time and task completion.

A search for "time management" in Google Play reveals quite a few useful candidates to try out, many with data export ability. Learning designers could, for example, design weekly time management schedules around a course, for participants to run in something like TimeTune to stay on task. We could suggest that participants try an application like Working Time Management, which tracks the time spent on projects, including communications with people in the project; similarly with aTimeLogger; along with simple activities where the group compares records. And these are just the first few apps returned by the search.

I've recently started using Headspace, a charming application which isn't a life logger at all, though it has some optional features that could be used that way. It's primarily a ten-step course in meditation and mindfulness. It seems to be quite popular, and it's interesting as a format for a course. The various tools and techniques for managing time, focus and headspace could conceivably be combined into one, as layers around a course on any topic, where students (if they like) can turn those features on and off - some offering guidance in time management, others an opportunity to measure, manage and compare their engagement with topics and projects.

Does anyone know of a learning analytics project that takes this approach? Such an approach would alleviate some of my anxieties about the field and its elephants, especially if it were to go as far as investigating the source of the applications and determining what the companies do with the data collected.


Jude said...

The limitations of NAPLAN as a measure of achievement in schools, alongside the adoption of poor educational practices, make 'learning analytics' as a way forward concerning, quite aside from the issues of data mining of students. How much is learned through interrogating best practice in assessment? Do the opportunities outweigh the risks? Thanks Leigh

Leigh Blackall said...

Good point Jude and nice to see you here again, it's like old times! :)
Learning analytics, like almost all edutech before it, feels very much like a solution looking for a problem to me... and it's going to create a heap of problems before it ever offers a solution.
