
The lure of learning analytics

Researchers now have access to a flood of educational data on students that they hope will offer insights on how to improve the learning experience. Will it work?
BY NAVNEET ALANG
APR 23 2019


A couple of years ago, Marek Hatala, a professor at Simon Fraser University, faced a conundrum. He was teaching a computer science course to students who weren’t majoring in the field. Tasked with rather difficult projects that often took three or four weeks, students complained that they consistently ran out of time. Dr. Hatala’s problem: was his syllabus too hard for students who weren’t computer science majors? Or did students simply have poor time management?

These kinds of questions about how to best approach pedagogy have, for most of the history of postsecondary education, been answered with a combination of experience, feel and anecdotal evidence. But to be an educator in 2019 is to live in a flood of digital data. For many, this means sitting in front of a screen and tracking marks, attendance, and more, with a learning management system – most commonly, software like Blackboard and Canvas.

These digital dashboards list courses and collect data that instructors can track – say, how many students have looked at an assignment and on what day, or how many have handed in an assignment on time electronically. They can then present that information in the form of graphs or tables – essentially, packaged data that gives an instructor a bird’s-eye view of how a particular assignment or entire class is doing.

That is the idea behind the emerging field in education called learning analytics. At its most basic, learning analytics is simply the analysis of data related to learning, and thus encompasses even rudimentary acts like taking attendance or calculating average test scores in a class. But, as digital technology evolves, the amount and types of data one can collect continue to expand. Think about the variety of digital tools a student might use: online courses in which they work out calculus or physics equations step-by-step; digital textbooks that can track which parts are highlighted and by how many users; or even more futuristic technology that tracks eye movement to see where a student’s attention is focused. What might one be able to do with all these data?

Dr. Hatala, who teaches in the school of interactive arts and technology at SFU, is researching how learning analytics can be used to motivate students, particularly in online learning environments. His specific problem – that students seemed overwhelmed by the workload – prompted him to create his own software to track how students approached the course material.

“In an online scenario, instructors are often in the dark as to how and when students engage with the material,” Dr. Hatala explains. “What I heard anecdotally was that students run out of time, but the data showed that wasn’t true, which was quite an eye-opener.”

Armed with these data, Dr. Hatala adjusted how he guided students through the process of completing assignments, asking them to pay closer attention to how they manage their time. “As an instructor, I want to see as the course is unfolding how students are progressing,” he says. “I can see how students engage with the material in Canvas, and there is a group who haven’t looked at the assignment at all for many days after it was posted. I can then address that in a lecture to tell them, ‘Look, 30 percent of the class haven’t even opened the assignment.’”
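The article doesn’t show Dr. Hatala’s actual scripts, but the kind of check he describes can be sketched. The following Python snippet – with purely illustrative student IDs and field names, not Canvas’s real API schema – computes the share of a roster that has never opened a given assignment, starting from an export of page-view events:

```python
from datetime import date

# Hypothetical export of LMS page-view events (illustrative fields only).
events = [
    {"student": "s01", "page": "assignment-3", "viewed_on": date(2019, 3, 4)},
    {"student": "s02", "page": "assignment-3", "viewed_on": date(2019, 3, 9)},
    {"student": "s03", "page": "lecture-notes-7", "viewed_on": date(2019, 3, 5)},
]
roster = {"s01", "s02", "s03", "s04"}  # everyone enrolled in the course

def unopened_share(events, roster, page):
    """Fraction of enrolled students with no recorded view of `page`."""
    viewers = {e["student"] for e in events if e["page"] == page}
    return len(roster - viewers) / len(roster)

# The kind of number an instructor might quote in lecture:
# here, half the toy roster has never opened assignment 3.
print(f"{unopened_share(events, roster, 'assignment-3'):.0%} have not opened it")
```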

That is, in essence, the ideal scenario of learning analytics: as courses and pedagogy move more and more onto digital platforms, the data they track and produce can be used to refine, for the better, how an instructor teaches. Yet even these comparatively rudimentary results were a product of Dr. Hatala’s ability to design and code his own scripts to work with his institution’s learning management system, a level of technical proficiency that is beyond most professors. It reflects the reality that learning analytics is still an emerging field.

“Learning analytics is a very broad term that refers to a set of tools for different kinds of stakeholders,” says Dr. Hatala. “So they deliver information and analytics to administrators, to educators or advisors, and also to students. Each has different goals and delivers a different kind of analysis.”

So, for example, an administrator may be more interested in the relationship between attendance and extracurricular activities, while an instructor may be more focused on the particular details of a single course or assignment. “In our other project, we developed information both at the level of the curriculum, but also for advisors,” Dr. Hatala says. “So if you see a student with this pattern of behaviour, we can advise them not to, for example, take chemistry and physics at the same time.”


In 2010, George Siemens – then at Athabasca University and now executive director of the Learning Innovation and Networked Knowledge Research Lab at the University of Texas at Arlington – sent out an email to some academics to see who might be interested in studying the burgeoning field of analytics. “At that time, analytics as a field was largely focused on academic analytics,” he says. “It emphasized organizational efficiency, better allocation of classroom resources – basically the application of business intelligence principles to the university.” However, when applied to learning, “it reflects the desire most educators have to focus on producing a more effective, more impactful learning experience for our students.”

The resultant group eventually coalesced into SoLAR, the Society for Learning Analytics and Research. Dr. Siemens was its founding president. The group held its first international conference on Learning Analytics and Knowledge, now an annual event, in Banff, Alberta, in 2011. SoLAR also organizes an annual Learning Analytics Summer Institute, this year to be held in Vancouver at the University of British Columbia in June.

The push toward learning analytics, as in all areas of data analytics, “is being driven by the same thing: the datafication of society,” says Dr. Siemens. “That’s basically due to the availability of data, where we’re capturing what we do at the most granular levels, from buying something or liking things online.”

In an educational context, “what blew things up was, first, the development of learning management systems such as Moodle, Canvas, Blackboard and so on,” Dr. Siemens says. “From those, there was all this data no one was doing anything with, and then a few folks realized this data might provide us with insights: which students are at risk of dropout, what students engage with, or what might help students master a subject or concept.” That is the core of learning analytics currently: as more digital technology gets used in educational settings, more data is generated, and more and finer insights can be gleaned from it, all with the ostensible goal of improving both the experience and results of educational systems.
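As a deliberately naive illustration of the dropout-risk idea Dr. Siemens describes – not any actual system’s model – a first pass might simply flag students whose activity falls well below the class average:

```python
from statistics import mean

# Hypothetical weekly activity counts per student (logins, page views, etc.).
activity = {"s01": 42, "s02": 38, "s03": 5, "s04": 40}

def at_risk(activity, threshold=0.5):
    """Flag students whose activity is below `threshold` times the class mean.

    A toy heuristic: real early-warning models combine many signals
    (grades, submissions, attendance) and are validated against outcomes.
    """
    cutoff = threshold * mean(activity.values())
    return [student for student, count in activity.items() if count < cutoff]

print(at_risk(activity))  # ['s03'] on this toy data
```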

Companies such as Ontario-based Desire2Learn, creator of the Brightspace learning management system, are building on this by developing tools to make learning analytics easier. “Brightspace is targeted towards making online and blended learning easy and flexible,” says Desire2Learn’s Amrita Thakur, who manages data and analytics products for the company.

“At a high level, Brightspace collects information on how learners are progressing and what achievement looks like for them,” she says. “Common metrics include things like login traffic, which things are being accessed, knowing how often you’ve posted to a discussion forum, or whether you submitted an assignment on time. Then there’s grades data, content usage, which modules you’ve completed, and so on.”
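To make those metrics concrete, here is a rough sketch of how such per-learner tallies could be computed from an activity log. The record shapes and event names are hypothetical – this is not Brightspace’s actual schema or API:

```python
from collections import Counter

# Illustrative activity log; event names are invented for the example.
log = [
    {"student": "a", "event": "login"},
    {"student": "a", "event": "discussion_post"},
    {"student": "a", "event": "submission", "on_time": True},
    {"student": "b", "event": "login"},
    {"student": "b", "event": "submission", "on_time": False},
]

def engagement_summary(log):
    """Per-student tallies of the metrics Ms. Thakur lists:
    logins, forum posts, and (on-time) assignment submissions."""
    summary = {}
    for record in log:
        counts = summary.setdefault(record["student"], Counter())
        counts[record["event"]] += 1
        if record["event"] == "submission" and record.get("on_time"):
            counts["on_time_submissions"] += 1
    return summary

for student, counts in engagement_summary(log).items():
    print(student, dict(counts))
```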

However, Desire2Learn is just one tool among many, and there is no standardized format or platform for learning analytics. Instead, the field is mostly ad hoc, marked by bespoke solutions like Dr. Hatala’s. What’s more, the kinds of data currently available from most learning management systems can be less granular than some learners, professors and researchers would like. They might be able to track marks, attendance or when resources are downloaded, but more specific data – what information students struggle with or the specific tactics they use to study – currently escapes most systems.

That’s part of what educational psychology expert Phil Winne has been trying to address with his research program at Simon Fraser University, where he is a professor in the faculty of education. “Learning analytics fundamentally involves gathering data about students and factors that impinge on processes they use in learning, to track how they learn, develop predictions about their success, and help them steer toward success,” says Dr. Winne.

To that end, he and his team have developed an online system called nStudy. The software, still in an experimental form, is a kind of digital study space for students in which they can make notes, highlight sections of text, link information across sources and perform other actions, all of which are tracked. “We’re interested to find out as much as we can about the specifics of what learners do to which bits of information,” says Dr. Winne. “We want to help students understand how they learn so they can move toward career goals and life goals in an informed way.”

Dr. Winne posits a kind of continuum of information in education: on one end is the subjective uniqueness of learners and, at the other end, lies the systemic or institutional dimension. “Learning analytics actually goes in both directions,” he says. “It offers opportunities to help individuals and, at the same time, it helps to develop the groundwork for broad policies.”

Some examples of Dr. Winne’s work include challenging ideas about learning styles – beliefs about how one learns best that, research shows, don’t actually shape how learning occurs – and examining how highlighting relates to achievement. Learning analytics might also help to address other challenges, such as personalizing learning in large classes and MOOCs (massive open online courses).

On another front, Anatoliy Gruzd is part of a research team at Ryerson University that was awarded an Insight grant from the Social Sciences and Humanities Research Council to look at how learning analytics can be used to understand learning processes on social media. Dr. Gruzd’s aim is to examine informal learning on platforms like Twitter and Reddit. Studying how students exchange ideas, or where they deliberate most actively, may help determine which types of social media are most conducive to differing forms of learning, he says.

However, the use of social media also highlights a key issue dogging learning analytics: privacy. It’s top of mind for Dr. Gruzd. “Those of us who are developing these tools also understand there needs to be a robust data privacy model,” he says. “There needs to be rules and guidelines as to how the data is being collected, who has access to it, how it can be used for decision-making.” Indeed, in the midst of what appears to be a broader backlash against tech giants like Facebook and Amazon for the ways they use personal information, the shift to learning analytics – and the data collection on which it is predicated – is becoming controversial.


For many years now, Audrey Watters, an education writer and independent scholar based in the U.S., has been a vocal critic of education technology on her site Hack Education. Calling herself “ed-tech’s Cassandra,” she remains deeply skeptical of learning analytics, not just because of questions of privacy but because of what it signifies more widely about trends in education.

“Historically, learning analytics is in line with a trend in which we believe collecting more data is going to reveal new insight,” she says. “And in the case of learning analytics, I don’t think there’s any demonstrable evidence yet that collecting all the clicks and activity does much more than reveal things teachers already know.”

Ms. Watters points to MOOCs as an example. The kinds of insights that instructors have gleaned from them – for example, that students who do more activities, or who come to class, do better – are hardly revelatory. “I just don’t think the things that people are tracking have revealed themselves to be more predictive,” she says.

What’s more, the focus on predictive data may rely on algorithms that have built-in biases, perpetuating structural inequalities, Ms. Watters argues. For example, the software that guides students in their choice of courses or majors may inadvertently steer disadvantaged students towards less-demanding options. She also worries that these systems prioritize what’s quantitative and measurable over the qualitative and indeterminate.

Either way, these technologies “fit very much with this larger cultural moment we’re in” of collecting ever more data on individuals, says Ms. Watters. Indeed, as companies such as Facebook reach historically unprecedented levels of scale and influence, there is a growing feeling of the inevitability of the datafication of society.

The decisions made in the next decade or so will determine the implementation of learning analytics in both practical and ideological terms. For now, then, learning analytics, like technology at large, is contested ground for differing ideals of what the world should be. And for those students who sit in front of some new problem, baffled and frustrated, the hope of idealistic instructors is that their struggles are seen as individual and not merely one more piece of data in an endless sea of information.
