Applications of Learning Analytics

Are you curious about how learning analytics can support your pedagogical inquiry? Learning data can be used to investigate teaching and learning questions, assess instructional strategies, and inform evidence-based decisions. The examples below highlight ways learning analytics can support evidence-informed teaching practice.

Course Profile Reports

Key questions

  • How many of my students have other classes in common this term?
  • Do students in my class have shared or disparate academic backgrounds?
  • What are the prior courses my students are most likely to have taken?
  • Are most students in the class pursuing a major in the same department, or is the course an elective for them?

Learn more

Many instructors find it challenging to get to know all their students, often because large class sizes leave little opportunity to interact with students as individuals. The anonymity of asynchronous online environments can also exacerbate the issue.

To help instructors get to know their incoming student cohort, reports or dashboards can be distributed to instructors after students register, but before the first day of term.

Examples:

  • Know Your Class at the University of Saskatchewan
  • Atlas at the University of Michigan (available to both instructors and students)

By revealing the breadth of backgrounds and experiences students bring to the classroom, these visualizations are often intended to counter stereotypes about the “typical student.” Specific visualizations may include student demographics, such as the gender, age, and racial distribution of students, or academic backgrounds, such as home faculty, prior courses completed, or GPA.

Often, these reports are developed with an eye towards student diversity, equity and inclusion, so accurately presenting the breadth of student experience while preserving individual student privacy is critical. In some cases, these visualizations are designed to show a specific cohort (e.g., Math 100) in comparison to a reference group (e.g., all first year Science students) to reveal distinctive characteristics about a course cohort to the instructor.
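
To make the aggregation step concrete, here is a minimal sketch in Python (pandas), assuming a hypothetical registration extract with columns such as student_id, course, home_faculty, and year_level, and an assumed small-cell suppression threshold; it compares a course cohort to a reference group while masking categories too small to report.

```python
import pandas as pd

# Hypothetical registration extract (assumed column names): one row per
# student per course, with "student_id", "course", "home_faculty",
# and "year_level".
records = pd.read_csv("registrations.csv")

MIN_CELL = 5  # assumed suppression threshold to protect individual privacy

def faculty_breakdown(df):
    """Count of distinct students per home faculty, with small cells suppressed."""
    counts = df.groupby("home_faculty")["student_id"].nunique()
    return counts.where(counts >= MIN_CELL)  # categories below the threshold become NaN

cohort = faculty_breakdown(records[records["course"] == "MATH 100"])
reference = faculty_breakdown(records[records["year_level"] == 1])

profile = pd.DataFrame({"MATH 100": cohort, "All first-year": reference})
print(profile.fillna("suppressed (<5)"))
```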

Course Engagement and Performance Dashboards

Key questions

  • When are students most active in the online course materials?
  • What fraction of students have accessed each of the course components?
  • Which course materials are most frequently revisited?
  • Are students reading ahead, falling behind, or on-track with the course schedule?

Learn more

In large, asynchronous online learning environments, it can be challenging for instructors to monitor course engagement and performance. Instructors may lack opportunities to connect with each student individually and may also struggle to form a cohesive sense of how the whole class is engaging with the course materials. Learning analytics dashboards have been designed to help instructors understand and monitor the patterns of engagement occurring in their courses and have been incorporated into mainstream learning management systems like Canvas.

Examples:

  • My Learning Analytics (MyLA) – student-facing engagement and performance dashboard

Similar dashboard tools have also been developed for students, often with an emphasis on supporting students in developing self-regulated learning techniques. My Learning Analytics (MyLA) is one such tool: it enables students to view their own course engagement and performance in relation to their class cohort. MyLA's visualizations address questions students commonly ask their peers (“have you looked at…?” and “how did you do on…?”), but they level the playing field in two ways: the same class-wide data is available to all students, not just those who are socially well connected, and the visualizations present objective data from the LMS rather than potentially exaggerated self-reports. How best to reflect LMS data back to learners to support the learning process remains an active research question in the field.
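
As a rough illustration of the kind of aggregation that sits behind such dashboards, the sketch below (Python/pandas) assumes a hypothetical export of LMS page-view events with invented column names, not any particular LMS schema; it computes how much of the class has opened each course component and when activity occurs.

```python
import pandas as pd

# Hypothetical LMS event export (assumed column names): one row per page
# view, with "student_id", "resource", and "timestamp".
events = pd.read_csv("lms_page_views.csv", parse_dates=["timestamp"])
class_size = 250  # assumed number of enrolled students

# What fraction of the class has opened each course component at least once?
reach = events.groupby("resource")["student_id"].nunique() / class_size
reach = reach.sort_values(ascending=False)

# When are students most active? Page views by day of week and hour.
activity = events.assign(
    day=events["timestamp"].dt.day_name(),
    hour=events["timestamp"].dt.hour,
).pivot_table(index="day", columns="hour", values="student_id", aggfunc="count")

print(reach)     # most- and least-accessed components
print(activity)  # heatmap-style table of when students are active
```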

Recommended reading

  • Wise, A. F., & Jung, Y. (2019). Teaching with analytics: Towards a situated model of instructional decision-making. Journal of Learning Analytics, 6(2), 53-69. https://doi.org/10.18608/jla.2019.62.4
  • Jivet, I., Scheffel, M., Specht, M., & Drachsler, H. (2018, March). License to evaluate: Preparing learning analytics dashboards for educational practice. In Proceedings of the 8th international conference on learning analytics and knowledge (pp. 31-40). https://doi.org/10.1145/3170358.3170421
  • Aguilar, S. J., Karabenick, S. A., Teasley, S. D., & Baek, C. (2021). Associations between learning analytics dashboard exposure and motivation and self-regulated learning. Computers & Education, 162, 104085. https://doi.org/10.1016/j.compedu.2020.104085
  • Kaliisa, R., Misiejuk, K., López-Pernas, S., Khalil, M., & Saqr, M. (2024, March). Have learning analytics dashboards lived up to the hype? A systematic review of impact on students’ achievement, motivation, participation and attitude. In Proceedings of the 14th learning analytics and knowledge conference (pp. 295-304). https://doi.org/10.1145/3636555.3636884
  • Fritz, J. (2017). Using analytics to nudge student responsibility for learning. New Directions for Higher Education, 179, 65-75. https://doi.org/10.1002/he.20244

Social Network Analysis

Key questions

  • Who are the central participants in the discussions? Are they the same people across discussion topics?
  • Are there sub-cliques of discussion? Or is everyone engaging with everyone?
  • Which students are *lurking* (i.e. reading, but not posting) and who is disengaged?

Learn more

In small discussions, whether face-to-face or online, it is relatively easy to track patterns of engagement: who responds first to a question or prompt, which cliques of students only talk to each other, whether some individuals dominate the conversation, and which individuals are not participating at all. However, in courses with many students, with many parallel discussions, or even across courses, it can be challenging to identify these same patterns quickly and accurately.

Examples:

  • Threadz – network analysis and visualization tool
  • Gephi – network visualization tool

Network analysis and visualization tools such as Threadz and Gephi can help instructors see the social shape of online discussions.
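
For a sense of what these tools compute, here is a minimal sketch using Python and the networkx library, assuming a hypothetical export of discussion replies with invented "author" and "replied_to" columns; it builds a reply network, surfaces central participants, and finds separate discussion clusters.

```python
import pandas as pd
import networkx as nx

# Hypothetical discussion export (assumed columns): one row per reply, with
# "author" and "replied_to" (the author of the post being replied to).
replies = pd.read_csv("discussion_replies.csv")

# Build a directed reply network; edge weight = number of replies A -> B.
G = nx.DiGraph()
for (author, target), count in replies.groupby(["author", "replied_to"]).size().items():
    G.add_edge(author, target, weight=count)

# Central participants: who receives the most replies, and who sits on
# paths between otherwise separate groups of posters.
in_degree = dict(G.in_degree(weight="weight"))
betweenness = nx.betweenness_centrality(G)
print("Most replied-to:", sorted(in_degree, key=in_degree.get, reverse=True)[:5])
print("Most central:", sorted(betweenness, key=betweenness.get, reverse=True)[:5])

# Sub-cliques of discussion: connected components of the undirected network.
for i, group in enumerate(nx.connected_components(G.to_undirected()), start=1):
    print(f"Cluster {i}:", sorted(group))
```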

These tools can also be highly informative for instructors reflecting on their own role in course discussions: are discussions used as a public forum where students ask questions and the instructor responds, or are peer-to-peer conversations fostered, with the instructor facilitating rather than answering?

Recommended reading

  • Gruzd, A., Paulin, D., & Haythornthwaite, C. (2016). Analyzing social media and learning through content and social network analysis: A faceted methodological approach. Journal of Learning Analytics, 3(3), 46-71. https://doi.org/10.18608/jla.2016.33.4
  • Hernández-García, Á., González-González, I., Jiménez-Zarco, A. I., & Chaparro-Peláez, J. (2016). Visualizations of Online Course Interactions for Social Network Learning Analytics. International Journal of Emerging Technologies in Learning (Online), 11(7), 6. https://doi.org/10.3991/ijet.v11i07.5889
  • Brooks, C., Liu, W., Hansen, C., McCalla, G., & Greer, J. (2007, July). Making sense of complex learner data. In Assessment of Group and Individual Learning through Intelligent Visualization Workshop (AGILeViz) (p. 28).

Mass Personalization of Feedback

Key questions

  • How can we scale the provisioning of feedback in a resource-constrained environment?
  • What are student perceptions and attitudes towards mass personalized feedback?
  • Am I able to use this technology to build better personal relationships with my students?
  • What types of feedback are most informative, effective, or motivating (e.g., assessed by A/B testing feedback content)?

Learn more

Providing individualized feedback is a time-consuming task in large courses, often requiring small armies of teaching assistants. As class sizes grow and budgets shrink, there is pressure to find new ways to provide high-quality feedback with fewer resources.

Examples:

  • OnTask – templated conditional feedback tool

Templated conditional feedback tools like OnTask offer one solution. These tools draw inspiration from the mass-mailing tools used in personalized advertising campaigns: instructors write feedback that includes distinct text for each recipient (e.g., addressing each student by name, inserting individual grades, or adding unique comments) as well as conditional text blocks that are included or excluded for each recipient based on data about them (e.g., students from different academic faculties may receive examples tailored to their program of study).
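
The sketch below illustrates the general idea in plain Python; it is not OnTask's actual template syntax. Personalized fields are filled in per student, and conditional text blocks are included or excluded based on hypothetical data about each recipient.

```python
# A minimal sketch of templated conditional feedback: each student's message
# combines personalized fields with text blocks that are included or excluded
# based on data about that student. All names and values are invented.

students = [
    {"name": "Aisha", "quiz1": 92, "faculty": "Science", "logins_last_week": 6},
    {"name": "Ben",   "quiz1": 48, "faculty": "Arts",    "logins_last_week": 0},
]

def build_message(s):
    parts = [f"Hi {s['name']}, your Quiz 1 score was {s['quiz1']}%."]
    # Conditional block: only low-scoring students see the review suggestion.
    if s["quiz1"] < 60:
        parts.append("The review session on Thursday covers the topics from Quiz 1.")
    # Conditional block: tailor the example to the student's faculty.
    if s["faculty"] == "Science":
        parts.append("Next week's examples will draw on the lab you completed.")
    # Conditional block: nudge students who have not logged in recently.
    if s["logins_last_week"] == 0:
        parts.append("We noticed you haven't visited the course site this week.")
    return " ".join(parts)

for s in students:
    print(build_message(s), "\n")
```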

Recommended reading

  • Pardo, A., Jovanovic, J., Dawson, S., Gašević, D., & Mirriahi, N. (2019). Using learning analytics to scale the provision of personalised feedback. British Journal of Educational Technology, 50(1), 128-138. https://doi.org/10.1111/bjet.12592
  • Mousavi, A., Schmidt, M., Squires, V., & Wilson, K. (2021). Assessing the effectiveness of student advice recommender agent (SARA): The case of automated personalized feedback. International Journal of Artificial Intelligence in Education, 31, 603-621. https://doi.org/10.1007/s40593-020-00210-6

Early Identification of At-Risk Students

Key questions

  • What are the early markers of success/failure in my course? Are students well informed about which pieces of early feedback should be treated as a wake-up call?
  • What sorts of interventions are effective? What are the long term effects?

Learn more

Individual students receiving low or alarming grades may hesitate to seek help for various reasons: they may believe they can make up for a poor grade on future assignments, they may not grasp the seriousness of their situation, they may not know what supports are available, or they may feel shame or embarrassment. Unfortunately, this can lead to students failing courses and delaying their degree completion. In response, many institutions are using LMS data to provide predictive “early-alert” dashboards to academic advisors, particularly in student athletics programs.

Examples:

  • Predictive “early-alert” dashboards for academic advisors
  • UBC Science Advising using midterm exam scores to proactively target support

By providing grade, engagement, and/or attendance data to staff advisors, institutions can target support to students in need, often before the situation becomes dire. At UBC, Science Advising has used midterm exam scores from large, first-year Science courses to proactively offer support and advising appointments to those at risk of failing.
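
As an illustration only, with invented column names and thresholds rather than UBC's actual criteria, a simple rule-based flagging pass might look like the following sketch in Python/pandas.

```python
import pandas as pd

# Hypothetical merged dataset (assumed columns): one row per student, with
# "student_id", midterm score in percent ("midterm_pct"), LMS logins over the
# past two weeks ("logins_2wk"), and attendance rate 0-1 ("attendance_rate").
students = pd.read_csv("course_progress.csv")

# Assumed thresholds for illustration only.
MIDTERM_CUTOFF = 55
LOGIN_CUTOFF = 3
ATTENDANCE_CUTOFF = 0.6

flags = pd.DataFrame({
    "low_midterm": students["midterm_pct"] < MIDTERM_CUTOFF,
    "low_logins": students["logins_2wk"] < LOGIN_CUTOFF,
    "low_attendance": students["attendance_rate"] < ATTENDANCE_CUTOFF,
})
students["risk_flags"] = flags.sum(axis=1)

# Students with two or more flags are referred for proactive advising outreach.
outreach_list = students[students["risk_flags"] >= 2].sort_values(
    "risk_flags", ascending=False
)
print(outreach_list[["student_id", "midterm_pct", "logins_2wk", "risk_flags"]])
```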

Recommended reading

  • Jayaprakash, S. M., Moody, E. W., Lauría, E. J., Regan, J. R., & Baron, J. D. (2014). Early alert of academically at-risk students: An open source analytics initiative. Journal of Learning Analytics, 1(1), 6-47. https://doi.org/10.18608/jla.2014.11.3
  • Gutiérrez, F., Seipp, K., Ochoa, X., Chiluiza, K., De Laet, T., & Verbert, K. (2020). LADA: A learning analytics dashboard for academic advising. Computers in Human Behavior, 107, 105826. https://doi.org/10.1016/j.chb.2018.12.004

Enrolment Pathways and Curricular Analytics

Key questions

  • When students leave a given program, where do they go?
  • How many students take a gap term or gap year (for employment or otherwise)? How is their degree completion trajectory impacted?
  • Where are the bottleneck courses in each program that constrain time to degree completion?
  • Are there unofficial minors (e.g. clusters of courses commonly taken together) within programs?
  • For the body of students earning a given credential, how similar is the list of courses they have taken? Do they have a clear common education, or is each student’s path highly distinct?

Learn more

Many students take longer than the expected four years to earn a degree; they may switch majors or degrees, take time away from their studies to work, or repeat courses due to withdrawal or failure. Learning analytics can help program administrators understand how students move through, between, and out of academic programs. Similarly, in academic programs with a high degree of flexibility in course selection, it can be challenging for program administrators to see patterns in how students select and sequence their courses. Curricular data analysis and visualizations can help answer these questions.

Examples:

  • UBC Student Flows Dataset – a 20-year enrolment history dataset
  • Sankey visualizations to support administrators exploring program enrolment

At UBC, the Learning Technology Hub and PAIR have jointly curated a 20-year enrolment history dataset and a set of Sankey visualizations to support administrators exploring questions about program enrolment. Access to the UBC Student Flows Dataset is available upon request through Enterprise Data Governance.
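
For illustration, the sketch below uses Python and plotly to draw a Sankey diagram from a small set of invented program-to-program transition counts; it is not the UBC Student Flows Dataset or its visualizations.

```python
import plotly.graph_objects as go

# Hypothetical year-over-year transition counts (invented data): how many
# students moved from each Year 1 program to each Year 2 destination.
programs_y1 = ["Science Y1", "Arts Y1"]
destinations = ["Science Y2", "Arts Y2", "Commerce Y2", "Not enrolled"]
labels = programs_y1 + destinations

# (source index, target index, student count) within the combined label list.
links = [
    (0, 2, 820), (0, 3, 60), (0, 4, 45), (0, 5, 75),    # Science Y1 ->
    (1, 2, 30),  (1, 3, 640), (1, 4, 90), (1, 5, 110),  # Arts Y1 ->
]

fig = go.Figure(go.Sankey(
    node=dict(label=labels, pad=20),
    link=dict(
        source=[s for s, _, _ in links],
        target=[t for _, t, _ in links],
        value=[v for _, _, v in links],
    ),
))
fig.write_html("enrolment_flows.html")  # shareable, interactive diagram
```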

Classroom Experimentation and the Obligation to Act

Key questions

  • Pragmatically, what steps should I take to run an experiment in my classroom?
  • How can I assess the generalizability of a new teaching technique within and across institutions?

Learn more

The wealth of student data universities hold in both administrative and learning management systems has led to calls to action to leverage that information to improve the teaching and learning environment for all. Institutional datasets can be used to surface teaching and learning inequities across courses and faculties, as well as across student demographics such as gender, ethnicity, and economic background.

Examples:

  • Terracotta – tool for running classroom experiments in Canvas

At the same time, new tools such as Terracotta are emerging to support researchers engaged in the scholarship of teaching and learning by simplifying the process of running classroom experiments. Terracotta streamlines the process of collecting consent, assigning treatment/control conditions to participants, linking outcome measures, and exporting de-identified data for analysis, all from within Canvas.
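
To make the experimental workflow concrete, here is a minimal sketch in plain Python; it is not Terracotta's internals. Consenting students are randomly assigned to two feedback conditions, outcomes are collected, and a de-identified export is summarized by condition. All names and numbers are invented.

```python
import random
from statistics import mean

random.seed(42)

# Hypothetical list of students who have consented to participate.
consenting = [f"student_{i:03d}" for i in range(1, 61)]

# Balanced random assignment to two conditions.
random.shuffle(consenting)
half = len(consenting) // 2
assignment = {s: ("immediate_feedback" if i < half else "delayed_feedback")
              for i, s in enumerate(consenting)}

# ... the course runs; an outcome measure (e.g., quiz score) is collected ...
scores = {s: random.gauss(72, 10) for s in consenting}  # placeholder outcome data

# De-identified export: condition and outcome only, no student identifiers.
export = [(assignment[s], round(scores[s], 1)) for s in consenting]

for condition in ("immediate_feedback", "delayed_feedback"):
    group = [score for cond, score in export if cond == condition]
    print(condition, "n =", len(group), "mean =", round(mean(group), 1))
```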

Recommended reading

  • Prinsloo, P., & Slade, S. (2017, March). An elephant in the learning analytics room: The obligation to act. In Proceedings of the seventh international learning analytics & knowledge conference (pp. 46-55). https://doi.org/10.1145/3027385.3027406
  • Motz, B. A., Üner, Ö., Jankowski, H. E., Christie, M. A., Burgas, K., del Blanco Orobitg, D., & McDaniel, M. A. (2024). Terracotta: A tool for conducting experimental research on student learning. Behavior Research Methods, 56(3), 2519-2536. https://doi.org/10.3758/s13428-023-02164-8
  • Motz, B. A., Carvalho, P. F., de Leeuw, J. R., & Goldstone, R. L. (2018). Embedding experiments: Staking causal inference in authentic educational contexts. Journal of Learning Analytics, 5(2), 47-59. https://doi.org/10.18608/jla.2018.52.4
  • Fyfe, E. R., de Leeuw, J. R., Carvalho, P. F., Goldstone, R. L., Sherman, J., Admiraal, D., … & Motz, B. A. (2021). ManyClasses 1: Assessing the generalizable effect of immediate feedback versus delayed feedback across many college classes. Advances in Methods and Practices in Psychological Science, 4(3). https://doi.org/10.1177/25152459211027575