
EDS: General Field Literature Review

Project Overview

Project Description

This 5,000-7,000 word literature review will eventually be submitted for your general field examination and will be refined to become a part of a chapter in your dissertation. You will create another part of this chapter in the "special field examination," coming up as course/step 2, so be sure that your literature review covers the broad shape of the field, not the specialized area you will be addressing in your dissertation work.

The literature review should not merely be descriptive—it should be analytical and critical, supported by the literature. What theories are associated with this general field?  What are the main issues arising in this general field? What are the main challenges to be addressed? What are the questions being asked by the intellectual and practical leaders in the field? What are the findings?  What are the absences or gaps in our knowledge? What work needs to be done?


General Field: Learning Analytics Efficacy in Higher Education


Note to Reviewers and Change Notes

First, thanks for your time and effort in peer-reviewing this work. Any constructive feedback will be greatly appreciated.

Change Note (for Faculty Review):

  • I added the structure in Scholar following the guidelines.
  • I revised some word choices and adapted annotations from Kara.
  • I added the purpose to the introduction section.
  • I beefed up the background, definitions, and theories sections and included a discussion of the limitations of the theories presented (big data section).
  • I checked and corrected the APA format throughout the work. (I ensured that years are added the first time a source is mentioned in a paragraph; for subsequent narrative citations the year is not repeated, but it is for parenthetical citations. I capitalized article titles, corrected how I cited direct sources, added page numbers for direct quotations, and used double quotation marks.)
  • I avoided paragraphs dedicated to a single source, diversified my sources, and made connections. I avoided going into so much detail on a source that it reads like an annotated bibliography entry. Some sub-sections have been synthesized or recontextualized (such as the Implementation and Challenges sections).
  • I used the Scholar structure tool for subheadings that I used to have in italics.
  • I ensured that my voice doesn't slip in and out and that the cited authors' voices are clearly attributed.
  • I revised each citation to make sure it is condensed yet still provides some context for the study.
  • I revised the flow so that each section is connected, and combined some sub-sections (such as the Implementation section, where I used to have many sub-headings).
  • I adjusted the format, for example, the spaces between paragraphs.

Title Page

TITLE (ALL CAPS): TBD

 

 

 

 

 

 

 

 

BY

 

DI WU

 

 

 

 

 

 

 

DISSERTATION

 

Submitted in partial fulfillment of the requirements

for the degree of EdD in Learning Design and Leadership

in the Graduate College of the

University of Illinois Urbana-Champaign, [year of conferral]

 

 

 

Urbana, Illinois

 

 

 

Doctoral Committee:

 

Add one member per line

TBD Ex. Associate Professor John Doe, Chair

Abstract (placeholder during early milestones)

Table of Contents (placeholder during early milestones)

Chapter 1 (placeholder)

Chapter 2 Literature Review

Part 1: General Field

Introduction

Digital learning has played a supplemental role in education for the past few decades as society entered the era of the internet and digitalization. Aldosari et al. (2022) acknowledged that the Covid-19 pandemic accelerated the growth of digital learning globally when schools and universities had to offer instruction remotely due to mandated lockdowns. Aldosari et al. noted that many students used learning software and digital devices to receive instruction from home during this global pandemic. Cope and Kalantzis (2016) claimed that learning technologies allow educational institutions to collect large amounts of data from digital devices, which can support educators in improving teaching practices and students' learning outcomes. Cope and Kalantzis also recognized that educational data science has become popular in the past decade and that its potential has gained more attention than ever. However, according to Dhankhar and Solanki (2020), researchers are still in the early phase of the educational data science era.

This general field literature review aims to investigate the efficacy of learning analytics in Higher Education. With over a decade of work experience in education, the author of this literature review has a background in educational administration, learning technology, curriculum design, and innovation. The author currently works as an instructional designer (curriculum and assessment) at a public technical college in Wisconsin, United States. The author has observed some universal challenges among community colleges in America, such as accreditation pressure, decreasing student enrollment, unsatisfactory student dropout rates, massive unemployment during the global pandemic, and disparities in educational attainment. Many studies in the literature reviewed centered on implementing learning analytics and demonstrated its potential to help solve these problems in Higher Education. Lastly, Higher Education institutions have rapidly expanded digital learning, which provides a solid technical foundation for applying learning analytics in practice. Investigating the efficacy of learning analytics in community colleges will support their mission to continuously meet the educational needs of underserved people and transform their lives. The general research question asks how learning analytics can increase the efficacy of online teaching and learning in Higher Education. Thus, this literature review looks at the definitions, theories, and implementations of learning analytics in the Higher Education context and identifies challenges and gaps in the literature to disclose the research directions for the author's doctoral dissertation.

Learning Analytics Background

Cope and Kalantzis (2016) stated that global digitalization and Internet popularization generate enormous amounts of data called "big data." They claimed that big data could transform how we live, work, and think, and even change how we manage businesses, participate in politics, and live our lives. Cope and Kalantzis acknowledged that big data also holds promise for education.

Oliva-Córdova et al. (2021) agreed that education has also changed due to technological development: schools and classrooms have increasingly used digital technology to improve the process of teaching and learning. Similarly, Viberg et al. (2018) indicated that Higher Education has also massively adopted digital technology through online learning, which provides synchronous and asynchronous interactions within a virtual environment. Viberg et al. claimed that the big data obtained mainly from the online learning environment could support students' learning. Dhankhar and Solanki (2020) echoed that when the big data in Higher Education is analyzed, it reveals valuable information that was previously "unseen, unnoticed, and therefore unactionable" (p. 868). Dhankhar and Solanki believed this information might help educators understand and support students' learning.

Several scholars noted that Higher Education institutions still face multiple challenges even as digital technology has been widely integrated (Jonathan et al., 2018; Tsai et al., 2020). Jonathan et al. (2018) pointed out that Higher Education institutions generate significant revenue and are held accountable by the market and accreditation bodies to report their strategic directions and performance trajectories. Najdawi and Stanley (2021) also agreed that Higher Education institutions must create value for learners in return for high tuition. Furthermore, Tsai et al. summarized other challenges in Higher Education, such as underachieving student learning outcomes, unsatisfactory student experiences, variable teaching effectiveness, neglected quality evaluation, weak financial stability, and inconsistent institutional performance. In addition, Al-Tameemi et al. (2020) emphasized that another challenge is low student enrollment and decreasing program completion rates at most institutions.

The rapid development of learning analytics in Higher Education institutions has sparked a number of studies in the last decade. Niet et al. (2016) argued that, with learning analytics, institutions can analyze educational information and enable informed decision-making about teaching and learning. Sonderlund et al. (2019) pinpointed that big data in education could predict students' academic success or failure, increasing course and program completion rates. A few researchers agreed that big data in education has become an opportunity for better schooling in the digitalization age (Cope & Kalantzis, 2015; Najdawi & Stanley, 2021). Viberg et al. (2019) agreed that learning analytics will help educators better understand and support student learning.

 

Learning Analytics Definitions and Key Terms

It is reasonable to begin by looking at the definitions and key terms of big data in Higher Education virtual learning environments. Several related terms should be clarified, including online learning, the learning management system, study success, educational data science, learning analytics, and educational data mining.

Firstly, a few key terms relate to the online learning environment in Higher Education where learning analytics emerges (Muljana & Luo, 2021), such as online learning, the learning management system, and study success. Moore et al. (2011) summarized that most authors define online learning as "access to learning experiences via some technology" (p. 2). Moore et al. also claimed that one of the most popular digital learning tools is the learning management system (LMS). Muljana and Luo (2021) defined the learning management system as a platform that delivers content and tracks student progress in real time, which enhances teaching and learning. Muljana and Luo also stated that the massive integration of the learning management system has contributed to the emergence of learning analytics in Higher Education. Ifenthaler and Yau (2020) defined study success as "to capture any positive learning satisfaction, academic improvement, or social experience in Higher Education" (p. 1962).

Secondly, Najdawi and Stanley (2021) defined learning analytics as discovering hidden patterns in the educational process, assessing student learning, and making predictions, which together provide a better understanding of teaching, learning, and the interpretation of student data. Jonathan et al. (2018) extended the function of learning analytics to institutional-level performance measurement, feeding it into the information gathered for assessing key performance indicators. Tsai et al. (2020) summarized the definition of learning analytics holistically as harnessing big data collected by learning technologies to provide insight for enhancing teaching practices, learning decisions, and educational management.

Thirdly, learning analytics originated from educational data science. Cope and Kalantzis (2016) claimed that educational data science is the umbrella term for big data in education. They defined educational data science as the purposeful recording of activity and interactions in digital learning environments, with the potential to develop learner success, institutional accountability, educational software design, learning resource development, and educational research. Patil and Gupta (2019) suggested that Educational Data Mining (EDM) and learning analytics are the two subdisciplines under educational data science.

Many studies in the literature reviewed compared the differences between Educational Data Mining and learning analytics. Patil and Gupta (2019) stressed that Educational Data Mining is a method that uses large educational databases to uncover useful patterns. Cope and Kalantzis (2015) claimed that Educational Data Mining examines "unstructured data" to interpret evidence of learning from massive and chaotic data sets, whereas learning analytics focuses on "structured data" embedded purposefully within the learning environment. Patil and Gupta further explained the two terms' differences: Educational Data Mining researchers generally use reductionist frameworks that break phenomena into components and analyze individual components and their relationships, whereas learning analytics researchers generally center on complex systems as wholes. They developed a table that shows the significant differences between learning analytics and Educational Data Mining in Figure 1.1 below. Lastly, Najdawi and Stanley (2021) stated the difference between the two terms in their literature review of twenty studies. They summarized that Educational Data Mining can support automatic discovery, identify unique knowledge concepts, and construct automatic personalization, whereas learning analytics focuses on resourcing actors in their roles, providing holistic learning strategies, and empowering learners and instructors to make choices.

Figure 1.1: A brief comparison of Learning Analytics (LA) and Educational Data Mining (EDM) (Patil & Gupta, 2019)

 

Learning Analytics Theories

Learning analytics has employed various methodologies and theories from education, learning science, psychology, sociology, linguistics, statistics, computer science, and artificial intelligence (Dhankhar & Solanki, 2020; Oliva-Córdova et al., 2021). Thus, three major theory themes emerged in the literature reviewed: data mining, technology adoption, and online learning.

Many studies in the literature reviewed have examined the relationship between learning analytics and data mining. Patil and Gupta (2019) argued that "The Educational Data Mining technique helps humans find patterns and data in educational areas" (p. 1) because instructors and students cannot physically interact with each other in the online learning environment. Tsai et al. (2020) also outlined popular data collection and visualization methods; for example, Patil and Gupta defined clustering analysis as a technique that classifies groups or categories to identify data patterns and relationships. Tsai et al. also recommended involving qualitative methods and learning theories when implementing learning analytics. Al-Tameemi et al. (2020) presented a five-step process focusing on data analysis, including capturing, reporting, predicting, acting, and refining (see Figure 1.2 below). Their research provided technical methods that learning analytics can use to support student success. In summary, the above scholars suggested using data mining techniques in learning analytics research.

Figure 1.2: Learning Analytics Components (Al-Tameemi et al., 2020)
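
To make the clustering technique described above more concrete, the following minimal sketch (not drawn from any of the cited studies) groups students by hypothetical LMS engagement features using k-means; the feature names, values, and the choice of two clusters are all illustrative assumptions.

```python
# Illustrative sketch only: k-means clustering of hypothetical LMS engagement data.
# Feature names and values are invented for demonstration, not taken from the cited studies.
import pandas as pd
from sklearn.preprocessing import StandardScaler
from sklearn.cluster import KMeans

# Hypothetical per-student engagement features exported from an LMS
engagement = pd.DataFrame({
    "student_id": [1, 2, 3, 4, 5, 6],
    "logins_per_week": [12, 2, 9, 1, 15, 3],
    "forum_posts": [8, 0, 5, 1, 10, 0],
    "assignments_submitted": [6, 2, 5, 1, 6, 3],
})

features = engagement[["logins_per_week", "forum_posts", "assignments_submitted"]]
scaled = StandardScaler().fit_transform(features)  # put features on a comparable scale

# Group students into two engagement patterns (e.g., higher vs. lower engagement)
kmeans = KMeans(n_clusters=2, n_init=10, random_state=42)
engagement["cluster"] = kmeans.fit_predict(scaled)

print(engagement[["student_id", "cluster"]])
```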

However, Mangaroska et al. (2020) developed a quantitative study with 46 Computer Science students in a debugging activity. They argued that quantitative methods (data mining) for learning analytics research fail to answer some complex questions, such as how a particular learning offering affects the learning process. On the same note, Toro-Troconis et al. (2019) agreed with this statement. They utilized mixed methods, combining quantitative and qualitative elements, to investigate learner engagement in online programs. While the quantitative element of the study demonstrated only weak evidence of an association between overall grades and student engagement, the qualitative element revealed richer insight into students' engagement by including students' voices.

The literature reviewed revealed several theories and frameworks that connect learning analytics in Higher Education to technology adoption and implementation. For example, Muljana and Luo (2021) utilized the Technology Acceptance Model (TAM) framework to explore learning analytics applications for improving course design. They defined TAM through "four synthesized determinants that explain perceived usefulness and perceived ease of use, including (a) individual differences which represent demographics and personality; (b) system characteristics that include features of the system; (c) social influence that covers social pressures or processes; and (d) facilitating conditions which consist of organizational support and resources" (Muljana & Luo, 2021, p. 211). In their review, several studies have verified TAM's suitability for understanding the adoption processes for learning analytics. In contrast, via a case study, Klein et al. (2019) adopted Zellweger Moser's Faculty Educational Technology Adoption Cycle (FETAC) model to discover faculty barriers to implementing learning analytics in Higher Education. The key element of this framework is that "Adoption of new technologies is dependent upon extrinsic and intrinsic support, via organizational structures, resources, and incentives and individual interest and time" (Klein et al., 2019, p. 606). Their findings demonstrated that the barriers include distrust of the technological infrastructure, misalignment between user needs and learning analytics capabilities, and ethical concerns about educational data.

Several studies in the literature reviewed focused on implementation frameworks that could guide the adoption of learning analytics projects or initiatives. Specifically, Arnold et al. (2014) proposed the Learning Analytics Readiness Instrument (LARI), which focuses on institutions' readiness to implement learning analytics and assists Higher Education institutions in identifying their strengths and weaknesses. They stated that the LARI framework could remediate the identified shortcomings to ensure the success of the learning analytics implementation. Similarly, Tsai et al. (2020) indicated that the Latin American Learning Analytics (LALA) framework provides detailed steps to identify the needs of different stakeholders; design, implement, and evaluate learning analytics tools; and eventually form a community to develop learning analytics practice and research. Prieto et al. (2019) presented a learning analytics framework called Orchestrating Learning Analytics (OrLA) that represents learning analytics adoption as a sociocultural activity system. According to Prieto et al., this framework can be a practical tool to enable change management in the classroom via learning analytics adoption and can serve as a boundary object to aid communication between the different stakeholders in the community (see Figure 1.3 below). The frameworks proposed in the literature reviewed support the implementation of learning analytics in Higher Education institutions.

Figure 1.3: Orchestrating Learning Analytics Framework (Prieto et al., 2019)

The literature also explored some theories of online learning science from which learning analytics emerged. Firstly, the Information and Communication Technology Competency Framework for Teachers (ICT-CFT) identifies six aspects of teacher professional practice, which Oliva-Córdova et al. (2021) used to explore how learning analytics can improve teaching practices for online learning. The six aspects are (1) understanding the role of ICT in education policy, (2) curriculum and assessment, (3) pedagogy, (4) application of digital skills, (5) organization and management, and (6) professional learning by teachers. This framework encourages teachers to reinterpret the curriculum to function in a knowledge society, adopt authentic assessment strategies, and utilize learner-centered, problem-based, and project-based pedagogies that integrate collaboration and cooperation. A limitation of this framework is author bias, which a quality assessment instrument can mitigate.

Additionally, Cope and Kalantzis (2017) proposed a theory called e-Learning Affordances, which can create e-learning ecologies that are more engaging, effective, resource-efficient, and equitable in the face of learner diversity. This theory identifies seven "new learning" affordances: ubiquitous learning, active knowledge production, multimodal knowledge representations, recursive feedback, collaborative intelligence, metacognitive reflection, and differentiated learning (see Figure 1.4 below). The purpose of this framework is to propose that "reflexive pedagogy enabled by an emerging wave of educational technologies can create e-learning ecologies that will be more engaging for learners, more effective, more resource efficient, and more equitable in the face of learner diversity" (Cope & Kalantzis, 2017, p. 13). They believe this theory can guide instructors and designers in building a better online learning environment for a better student learning experience. Both theories of digital learning science share a learner-centered focus in the context of online learning in Higher Education institutions.

Figure 1.4: eLearning Affordances (Cope & Kalantzis, 2017, p. 21)

 

 

Learning Analytics Implementations

This section will address the implementation of learning analytics in the Higher Education online learning context. The findings and lessons learned reported in the literature after implementation will also be presented. Due to the complexity and variety of topics in learning analytics implementation, the literature review findings are grouped by the purposes of implementation: student success, teaching excellence, learning design, and institutional goals.

Student Success Prediction, Intervention, and Learning Outcomes

The first theme from the literature reviewed addresses the use of learning analytics to facilitate student success prediction and timely intervention and, ultimately, to improve students' learning outcomes. Students are the key stakeholders in this theme.

Firstly, many studies in the literature reviewed have focused on student success and performance prediction. Dhankhar and Solanki (2020) provided a bird's-eye view of learning analytics usage in Higher Education worldwide by reviewing leading publications and research evidence. Their research summarized learning analytics implementation in different countries and regions, such as the United States, the United Kingdom, Austria, and other parts of Europe. It identified one of the main functions of learning analytics: discovering hidden patterns in the educational process, assessing student learning, and making predictions, which in turn provide a better understanding of teaching, learning, and the interpretation of student data. Furthermore, in another systematic review conducted by Ifenthaler and Yau (2020), learning analytics identified students at risk of dropping out by analyzing students' online behavior data (login frequency, online forum posts, and assignment submissions) combined with their grades.
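
As a concrete illustration of this kind of prediction, the sketch below trains a simple logistic regression model on hypothetical online-behavior records (login frequency, forum posts, assignment submissions) labeled with an at-risk outcome. It is a minimal, assumed example for clarity, not the modeling procedure used in the studies reviewed.

```python
# Illustrative sketch only: predicting at-risk status from hypothetical LMS behavior data.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# Hypothetical historical records: behavior features plus a known at-risk label
data = pd.DataFrame({
    "logins_per_week":       [10, 2, 8, 1, 12, 3, 9, 2],
    "forum_posts":           [ 5, 0, 4, 0,  7, 1, 6, 0],
    "assignments_submitted": [ 6, 2, 5, 1,  6, 2, 5, 1],
    "at_risk":               [ 0, 1, 0, 1,  0, 1, 0, 1],  # 1 = dropped out or failed
})

X = data[["logins_per_week", "forum_posts", "assignments_submitted"]]
y = data["at_risk"]
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.25, random_state=0)

model = LogisticRegression().fit(X_train, y_train)

# Estimated probability of being at risk for each student in the held-out set
print(model.predict_proba(X_test)[:, 1])
```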

Furthermore, students' demographics (academic self-concept, academic history, and financial condition) can also predict student success. Unlike predictions based on students' demographic information, Foster and Siddle (2020) examined non-engagement data (non-attendance or non-submission of coursework) by comparing disadvantaged groups with a control group. Non-engagement data can also indicate at-risk students; indeed, they showed that non-engagement alerts are more efficient at identifying at-risk students than demographic data. This provides a more neutral framework for allocating resources to students who need support without stigmatizing students' backgrounds. Their findings demonstrate the potential of automatic non-engagement alerts to address attainment disparities from an equity, diversity, and inclusion lens. Learning analytics can thus predict students' performance equitably so that students receive immediate remediation from their instructors.
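
A non-engagement alert of the kind Foster and Siddle describe can be as simple as a rule over recent activity. The following minimal sketch flags students whose inactivity or missed submissions cross a threshold; the thresholds, field names, and data are invented for illustration rather than taken from their study.

```python
# Illustrative sketch only: a simple rule-based non-engagement alert
# with invented thresholds and data.
from datetime import date

# Hypothetical per-student activity records
students = [
    {"id": "A01", "last_login": date(2023, 9, 1),  "missed_submissions": 0},
    {"id": "A02", "last_login": date(2023, 8, 10), "missed_submissions": 2},
    {"id": "A03", "last_login": date(2023, 9, 3),  "missed_submissions": 1},
]

TODAY = date(2023, 9, 8)
MAX_DAYS_INACTIVE = 14      # invented threshold
MAX_MISSED_SUBMISSIONS = 1  # invented threshold

def needs_alert(record):
    """Flag a student when inactivity or missed work crosses a threshold."""
    days_inactive = (TODAY - record["last_login"]).days
    return days_inactive > MAX_DAYS_INACTIVE or record["missed_submissions"] > MAX_MISSED_SUBMISSIONS

alerts = [s["id"] for s in students if needs_alert(s)]
print(alerts)  # -> ['A02']
```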

Secondly, some research has shed light on instructors' interventions. Liu et al. (2017) analyzed data from a large-scale online course and identified different learning behaviors and performance patterns. They found significant differences in the learning behaviors of students who earned high grades versus those who earned low grades. Based on these findings, the authors suggested several design principles for adaptive learning systems, such as providing personalized feedback and resources to students based on their learning behaviors, using data to predict which students may be at risk of falling behind and providing personalized interventions to help them catch up, and designing course content and activities to promote more active and engaged learning behaviors. Similarly, Ifenthaler and Yau (2020) summarized several intervention practices that support study success, such as visual signals and dashboard features for instructors, peer interactions, adaptive learning materials, prior knowledge building, reduction of test anxiety, and student-teacher perceptions. Their review emphasized that personalized learning paths and interactions can be effective intervention strategies for students who are at risk. However, Sonderlund et al. (2019) conducted a literature review of 11 studies on the efficacy of learning analytics interventions in Higher Education. They concluded that while there are plenty of studies on predicting student performance and retention, research on the effectiveness of learning analytics interventions is limited because it lacks a synthesis of effective practices across regions and conditions. Their review advocated for more research to investigate the full value of learning analytics interventions. Thus, the above research examined how learning analytics-informed instructor interventions support student success.

Lastly, several studies in the literature reviewed investigated how learning analytics can support different student learning outcomes, such as cognitive gains, self-regulation skills, and student agency. Viberg et al. (2018) reviewed 252 studies investigating the effects of using learning analytics dashboards. Viberg et al. summarized that cognitive gains refer to intellectual abilities such as self-reflection, metacognition, and analysis skills. Sonnenberg and Bannert (2015) conducted a study in a Higher Education context, specifically in a programming course. The authors found that using dashboards was positively associated with cognitive gains among the students: students who used the dashboards improved their understanding of the course material and their ability to apply the learned concepts to practical programming tasks. Overall, the study suggested that learning analytics dashboards can effectively enhance students' cognitive gains in Higher Education settings.

In Viberg et al.'s review, studies showed that learning analytics dashboards can improve students' self-regulated learning, presentation, and problem-solving skills in Higher Education settings. Similarly, Jääskelä et al. (2021) defined student agency as "a student's experience of access to/having (and using of) personal, relational (i.e., interactional), and context-specific participatory resources to engage in intentional and meaningful action and learning." The same authors presented a student agency learning analytics service architecture that can visualize student agency profiles. They also summarized the definition and frameworks of agency in Higher Education and demonstrated that learning analytics can present student agency profiles to optimize learning and teaching through personalized academic advising, self-reflection, and self-regulation.

Jivet et al. (2020) also researched the relationship between learning analytics dashboard design and students' self-regulation skills. The authors conducted a literature review and a quantitative study of 176 students to explore the impact of learning analytics dashboards on self-regulated learning. They identified several studies suggesting a positive relationship between learning analytics dashboards and the development of self-regulated learning skills. In addition, Jivet et al. provided some recommendations for designing learning analytics dashboards that give students immediate feedback on their learning performance, help them reflect on their learning behaviors and strategies, and then build and refine those behaviors. Lastly, Niet et al. (2016) conducted a mixed-methods study of 364 undergraduate students to explore the impact of an academic performance module (APM) on students' academic performance, attitudes, and behaviors. The APM is a learning analytics tool designed to provide students with personalized feedback on their academic performance and promote self-regulated learning. The authors found that the APM positively impacted students' attitudes toward learning, such as their motivation and confidence. Many studies in the literature reviewed thus shed light on how learning analytics can impact learners' cognitive gains, self-regulation, and affect.

Teaching Excellence

This theme focuses on how learning analytics can help instructors improve their teaching skills and practices. The teaching aspect of learning analytics has gained popularity in Higher Education. Viberg et al. (2018) conducted a review and found that 62% of the research focuses on improving learning support and teaching, which suggests we should consider how to transfer research into practice; there is more evidence of learning analytics supporting institutions and teachers than students themselves. Numerous studies in the literature reviewed investigated the following areas: teacher feedback and intervention, online teaching practices, and pedagogical knowledge.

As addressed in the student success section, teachers can use learning analytics to identify students' problems and provide personalized support. Several studies in the literature reviewed examined teachers' feedback and interventions via learning analytics. Viberg et al. (2018) conducted a systematic review of learning analytics for teacher intervention. They identified 17 studies that met the inclusion criteria, and the results indicated that interventions providing personalized feedback and early warning systems improved student learning outcomes. The authors also found that the design and implementation of the interventions and the level of engagement and support from the instructors were critical factors in their effectiveness. While Viberg et al. specifically focused on using learning analytics in interventions, Sonderlund et al. (2019) provided a broader perspective on the effectiveness of educational interventions in Higher Education. The authors conducted a meta-analysis of 67 randomized controlled trials that evaluated the effectiveness of educational interventions in Higher Education. The results indicated that active learning interventions (e.g., flipped classroom, problem-based learning) had a moderate positive effect on student learning outcomes. The authors also found that interventions that provided immediate feedback and targeted study skills improved student learning outcomes.

To make better use of learning analytics for teaching excellence, it is vital to understand the definitions of excellent teaching competencies and practices in the online learning environment. Oliva-Córdova et al. (2021) conducted a literature review of 50 studies on learning analytics and teaching skills. The review summarized the benefits of learning analytics applications for teaching competencies and teaching practices. Oliva-Córdova et al. addressed that learning about and applying technologies should be an integral part of teachers' continuing education throughout their careers. The literature reviewed suggested that learning analytics tools can help instructors improve their teaching practices in several areas, such as learning design, learning management, and pedagogical mediation. The table (see Figure 1.5 below) summarizes the benefits of using learning analytics in teaching practices.

Figure 1.5: Benefits of using Learning Analytics in teaching practice. (Oliva-Córdova et al., 2021)

In Oliva-Córdova et al.'s review of the benefits of using learning analytics in teaching practice, the Learning Management System (LMS) is a significant component. Some examples are making decisions related to redesigning learning experiences, identifying effective teaching methods, and measuring what is happening in the LMS. Using learning analytics requires faculty to gain digital skills, computer literacy, and new pedagogies in order to use these digital tools for teaching.

The following three studies in the literature reviewed focused on the relationship between learning analytics and teaching practices. Cope and Kalantzis (2016) argued that the LMS can embed formative assessments and present structured data to show evidence of learning; this evidence can alert faculty to recalibrate their instruction. The authors argued that learning analytics can provide teachers with real-time data on their students' learning progress, enabling them to make informed decisions about adjusting their teaching to meet their students' needs. Similarly, Dhankhar and Solanki (2020) discussed a "Loop" project in some Australian universities that targeted this practical problem: the project explored how to better support effective online teaching and how to understand the needs and perceptions of teachers in Higher Education, to ensure that learning analytics can be genuinely helpful in teaching and learning practice. In addition, Patil and Gupta (2019) focused more on using learning analytics to support collaborative learning. The authors argued that learning analytics can provide teachers and students with real-time data on their collaborative learning activities, enabling them to make informed decisions about adjusting their learning strategies to better support collaboration. The authors addressed the importance of using learning analytics to support collaborative learning practices that prioritize active participation, equal contribution, and feedback.

A few studies in the literature reviewed focused on the relationship between learning analytics and pedagogical knowledge. Bronnimann et al. (2018) conducted a case study and suggested that academic staff can articulate pedagogical questions around learning analytics based on the six stages of the Scholarship of Teaching and Learning. The six stages provide a structure and process to explore the relationship between learning analytics and instruction. They found that learning analytics can enhance pedagogical knowledge by providing insights into student learning and engagement, facilitating formative assessment, and informing teaching practices. Likewise, Viberg et al. (2018) explored teachers' role in using learning analytics. They argued that teachers must be involved in designing and implementing learning analytics systems to ensure that the systems align with their pedagogical goals and practices. They found that teachers were more likely to adopt learning analytics if they perceived it as valuable and relevant to their teaching practices, and they suggested that developing learning analytics systems should be a collaborative process between teachers and learning analytics designers so that the systems meet the needs of both. Lastly, Cerro Martínez et al. (2020) agreed that using learning analytics could support pedagogical knowledge development by providing teachers with real-time data about their students' learning progress. The authors conducted a case study in a university setting and found that teachers who used learning analytics could better understand their students' learning needs and tailor their teaching approaches accordingly. They suggested that learning analytics can contribute to creating a data-driven culture in education, where teachers use data to make informed decisions about their teaching practices. Overall, all three studies suggested that learning analytics has the potential to enhance pedagogical knowledge by providing teachers with real-time data about their students' learning progress, and they stressed that teachers must be involved in designing and implementing learning analytics systems to ensure alignment with their pedagogical goals and practices.

Learning Design

This section discusses how learning analytics can help designers and instructors continuously improve the learning design of online courses and programs. Rienties et al. (2015) defined the objective of learning design as to "establish the objectives and pedagogical plans which can be evaluated against the outcomes captured through learning analytics" (p. 315). They emphasized that learning design usually influences learners' engagement in the online learning environment. Viberg et al.'s review echoed that learning design is a popular research area in learning analytics. However, Rienties et al. noted that learning design, often ignored by the learning analytics knowledge community, is a process in which educators "make informed design decisions with pedagogical focus and communicate these to colleagues and students" (p. 315). They also valued that learning analytics can evaluate the objectives and teaching plans established by learning design. This theme centers on three major topics under learning design: enhancing curriculum alignment, guiding curriculum redesign, and promoting learner engagement.

Firstly, the literature reviewed examined how learning analytics can support curriculum alignment in online learning. Liu et al. (2017) conducted a study using learning analytics to improve adaptive learning design. In their study, Liu et al. analyzed data from 215 students who completed an online adaptive learning course. They used data mining techniques to identify patterns in the data and then used these patterns to develop recommendations for improving the design of adaptive learning experiences. This study explored the potential of data to inform the design of practical adaptive learning experiences. These findings align with the broader literature on learning design, emphasizing the importance of alignment with learning outcomes, collaboration and interaction, and feedback provision. Likewise, Rienties et al. (2015) agreed with Liu et al.'s finding about curriculum alignment. Using a cluster and correlational study comparing 87 learning modules, Rienties et al. found that academics did not consider blueprints when designing them. They also discovered that it is difficult to interpret learning analytics data without associating learning design activities with Learning Management System usage.

Secondly, in Oliva-Córdova et al.'s (2021) systematic review of 50 studies, learning analytics applications are related to curriculum review. Learning analytics can provide invaluable insight into how students react to different learning designs, and this data-driven approach can also identify what has and has not worked in course design. Here are some examples of how learning analytics can support course design:

  • Identify which learning activities are practical for students.
  • Improve competencies and the mapping of the curriculum.
  • Make decisions related to redesigning learning experiences.

However, Jayashanka et al. (2019) emphasized that creating synergy between learning analytics and learning design is essential to improving learners' performance, engagement, and satisfaction. Jayashanka et al. proposed a framework and a tool to create a dashboard for faculty and students to enhance blended learning in Higher Education. The tool provides faculty with information that allows course revision while teaching the course. The authors found that this tool improved students' engagement and learning outcomes, demonstrating the importance of incorporating learning analytics into effective learning design. Moreover, Nguyen et al. (2018) discovered a mismatch between faculty's estimated learning time and students' actual learning time. They conducted a study that analyzed the timing of students' engagement with online course content using learning analytics and found that students who engaged earlier and more frequently tended to perform better academically. Such research allows faculty to take action on adjusting course materials. Nguyen's study also emphasized the pedagogical context so that faculty can translate learning analytics into actionable insights for continuous improvement in course design.
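
The kind of engagement-timing analysis Nguyen et al. describe can be illustrated with a small, hypothetical calculation: correlating how soon and how often students access course content with their final grades. The data and variable names below are invented for demonstration only.

```python
# Illustrative sketch only: relating the timing and frequency of engagement to final grades,
# loosely in the spirit of the timing analyses described above; all values are invented.
import pandas as pd

records = pd.DataFrame({
    "days_until_first_access": [1, 7, 2, 10, 3, 14],  # days after content release
    "weekly_visits":           [9, 3, 7,  2, 8,  1],
    "final_grade":             [88, 70, 85, 62, 90, 55],
})

# A negative correlation would suggest later first access goes with lower grades;
# a positive correlation for weekly visits would suggest frequent engagement goes with higher grades.
print(records.corr(method="spearman")["final_grade"])
```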

Lastly, Rienties et al. (2015) researched the relationship between learning design and learner engagement and performance in the Learning Management System. The authors conducted a study involving 10,975 undergraduate students across 36 modules at a UK university. One finding is that learning design activities impact students' engagement in the online learning environment; another is that learning design activities can influence learner performance, motivation, and engagement. Similarly, Liu et al. (2017) emphasized students' online engagement in an adaptive learning system by studying 128 first-year pharmacy students. Their findings demonstrated that cognitive ability and affective characteristics (e.g., motivation, mastery goal orientation) play a role in students' engagement and performance. The findings prompted an agenda to develop learner profiles of different characteristics over time to generate more personalized learning content for students. Klein et al. (2019) agreed that learner experience and engagement are associated with the quality of the learning design. Their study used a mixed-methods approach, including surveys and interviews, to collect data from 147 participants, including faculty, administrators, and technical support staff, at 14 Higher Education institutions in the United States. In sum, learning analytics can enhance curriculum alignment, course redesign, and learner engagement.

Institutional Goals

Using learning analytics for institutional goals has been widely studied in the literature reviewed. This theme centers on how administrators can implement learning analytics for staff performance, workload, program review, institutional effectiveness, and the institutional adoption of learning analytics.

Firstly, a few researchers investigated learning analytics to measure the performance of instructors, support staff, and student services employees for institutional-level goal alignment. Jonathan et al. (2018) presented a theoretical model consisting of two stages to measure staff key performance indicators; however, no researchers have yet validated the model. This study demonstrated learning analytics' potential for institutional-level goal alignment and performance measurement beyond predicting students' performance and intervention. Although little research has been done on staff performance measurement, Dhankhar and Solanki (2020) described a case study in which Strayer University utilized learning analytics to increase faculty engagement and improve their behavioral mindset. They concluded that learning analytics is not only a research issue but also an organizational issue, and they suggested that institutions address faculty engagement issues according to their institutional and cultural context. Likewise, Niet et al. (2016) echoed that assigning appropriate workloads to faculty and staff has challenged administrators in Higher Education institutions. To meet administrators' needs, Niet's team proposed a Decision-Making Support System Model and developed software called UDLearn to support administrators' decision-making process. This software provides a tool for assigning new supervisors or examiners to graduation projects, considering the number of documents in each teacher's workload. It also helps with business logic report generation, and a pilot test carried out at a public university in Latin America suggested that the learning analytics application can scale.
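
As an illustration of the workload-aware assignment such a decision-support tool might perform, the sketch below assigns each new graduation project to the supervisor with the lightest current document load. This greedy heuristic and its data are hypothetical and are not the algorithm implemented in UDLearn.

```python
# Illustrative sketch only: assigning new graduation projects to the least-loaded supervisor.
# This is a hypothetical greedy heuristic, not the method implemented in UDLearn.
import heapq

# Hypothetical current workload: number of documents each supervisor already handles
workload = {"Dr. Lee": 4, "Dr. Ortiz": 2, "Dr. Patel": 5}
new_projects = ["Project A", "Project B", "Project C"]

# Min-heap keyed on current document count
heap = [(docs, name) for name, docs in workload.items()]
heapq.heapify(heap)

assignments = {}
for project in new_projects:
    docs, name = heapq.heappop(heap)        # supervisor with the lightest load
    assignments[project] = name
    heapq.heappush(heap, (docs + 1, name))  # assume each project adds one document

print(assignments)
# e.g. {'Project A': 'Dr. Ortiz', 'Project B': 'Dr. Ortiz', 'Project C': 'Dr. Lee'}
```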

Secondly, just as learning analytics can inform course review and continuous improvement, as explored in the earlier section, it can also guide program review and evaluation. Bronnimann et al. (2018) conducted a case study at an Australian university on using learning analytics to inform program review. The study used the Scholarship of Teaching and Learning approach and drew data from the Learning Management System and the Student Information System to conduct program review and redesign. Such data are also helpful for providing ongoing student support in a new program. Thus, they recommended that administrators include institutional-level data about the student experience in the program review process, generating new insight. Ultimately, this new usage of learning analytics leads to data-driven continuous improvement for better student retention and success, which coincides with the earlier section on student success.

Similarly, Jayashanka et al. (2019) outlined a research project on using learning analytics for program research and development. The authors also proposed a framework called the Intelligent Interactive Visualizer, which can provide real-time feedback on students' progress and examine program areas that require improvement. Further, Horn et al. (2019) examined the effectiveness of community colleges via learning analytics. After examining the effectiveness scores of over 800 community colleges in the United States, they found that the effectiveness measure can indicate whether closer scrutiny of institutional conditions is needed. The result also supported using the effectiveness scores as a holistic indicator for community colleges.

The last common theme from the literature reviewed is the adoption process of learning analytics at an institutional level. Several studies in the literature reviewed focused on inventing models and frameworks for successful implementation. Arnold et al. (2014) proposed a survey instrument called the Learning Analytics Readiness Instrument (LARI) to measure institutions' readiness to embark on the implementation of learning analytics. This instrument centers on five primary factors: ability; culture and process; data; governance and infrastructure; and overall readiness perception. The tool can identify deficiencies in the institution and suggest actions to remediate these areas for a higher chance of successful implementation. Following the invention of the LARI survey, Oster et al. (2016) presented an analysis of a pilot administration of the LARI survey with a larger sample. Their findings refine the process of learning analytics implementation, showing that it is critical to consider institutional and role characteristics across the different factors before initiating an action plan. Aiming to implement learning analytics in Higher Education, Tsai et al. (2018) proposed a framework that can generate strategies and policies around learning analytics. This study examined three case studies using the SHEILA (Supporting Higher Education to Integrate Learning Analytics) framework. It demonstrates the potential of this framework to support institutions' strategic planning and policy-making processes. Further, it identifies three areas, including actions, challenges, and policies, that should be considered in the systematic adoption of learning analytics. Thus, learning analytics adoption frameworks have been a critical research emphasis under the institutional goals theme.

Learning Analytics Challenges

Though many studies in the literature reviewed explored learning analytics implementations, a few researchers investigated the challenges that Higher Education professionals face in applying learning analytics. Tsai et al. (2020) summarized four significant challenges that global institutions face: stakeholder engagement and buy-in, weak pedagogical grounding, resource demand, and ethics and privacy. These challenges have also been examined and echoed by other scholars.

 

 

Stakeholder Engagement and Buy-in

Several studies in the literature reviewed examined the challenges of implementing learning analytics internationally. Tsai et al. (2020) pointed out that the imbalance of stakeholder engagement and resistance to change are considered the most common barriers to learning analytics implementation in European Higher Education. In the United States, institutions share challenges similar to those Tsai's team identified. A case study by Klein et al. (2019) discovered a similar barrier of distrust in the technological infrastructure. One faculty member responded in a focus group, "So, you have an institutional learning analytics tool, a degree progress tool, a student information management tool, and then your e-mail and whatever. So, you have to be relatively savvy with all those programs to use them and know which one provides you with which information. Yeah, you have got to have a lot of stuff" (Klein et al., 2019, p. 614). Faculty and advisors who feel frustrated will not use learning analytics, leading to a lack of buy-in, uneven use, and alternative practices. Klein's team also identified barriers to learning analytics adoption from the lens of faculty and advisors. They conducted focus groups and found that the first barrier is a lack of trust in the technology infrastructure among faculty and advisors. The second barrier is the cumbersome infrastructure, which makes the tools difficult to use. The third barrier is a lack of accurate and timely data and of a visualized dashboard. The last barrier is the conflict between actionable data and teaching pedagogy.

Several studies in the literature reviewed focused on the barriers encountered by instructional designers in learning analytics implementation. For example, Muljana and Luo (2021) outlined that instructional designers serve as support staff who provide helpful information generated from learning analytics tools to facilitate the course design process in most Higher Education institutions. The study used semi-structured interviews with 14 instructional designers from various Higher Education institutions in the United States. The findings demonstrated that several factors influence instructional designers' adoption of learning analytics in course design practice: individual differences (prior exposure and pre-perception, pedagogical beliefs), system characteristics, social influences, and facilitating conditions. To clear these barriers, the same authors suggested some practical implications; for example, it would be wise to conduct a needs analysis before jumping into a learning analytics adoption plan. The authors also recommended seeking synergy among different stakeholders on campus to ease the integration and implementation process. To summarize, the disengagement of stakeholders creates barriers to applying learning analytics in Higher Education institutions.

Weak Pedagogical Grounding

The literature reviewed revealed that learning analytics can only function effectively when the tool is based on solid educational foundations and theories. Cerro Martínez et al. (2020) found evidence of this challenge in optimizing the pedagogical opportunities of online learning; however, they posit that learning analytics can support instructors in finding new ways to develop pedagogical dynamics. Tsai et al. (2020) indicated a lack of alignment with pedagogical theories to meet the needs of students and faculty in European Higher Education. Identifying the associated learning science theories is essential, especially when collecting suitable data sets and choosing indicators of learning progression.

Similarly, Nguyen et al. (2018) emphasized that pedagogical context must be provided to users to interpret and translate learning analytics into actionable items. The authors noted that the timing and format of students' engagement can significantly impact their learning outcomes, so designing learning activities and assessments that align with the intended learning outcomes is essential. Finally, Nguyen's team emphasized the need for a pedagogical approach that integrates learning analytics with learning design to ensure that the data generated by learning analytics is meaningful and actionable.

Resource and Training Needs

The integration of learning analytics relies on strategic investment in human, financial, and technical resources (Tsai et al., 2020). Multiple studies in the literature reviewed demonstrated a lack of flexible components to support varying user needs when applying learning analytics. Oliva-Córdova et al. (2021) suggested that institutions must provide ongoing faculty training; lifelong professional development opportunities will be central to addressing challenges such as resistance to change, lack of knowledge, and ethical concerns. Besides the demand for human resources, Klein et al. (2019) indicated that technical support is needed due to a lack of integrated, accurate, and timely information and of a helpful visualization dashboard. The authors conducted a qualitative study that identified the need for user-friendly interfaces that are easy to navigate and use. Users reported that complex and difficult-to-use interfaces were barriers to adoption, hindering their ability to use the tools and interpret the generated data effectively. Al-Tameemi et al. (2020) agreed that learning analytics can be complicated. The authors systematically reviewed the literature to identify challenges associated with predictive learning analytics, including the five-step process of capturing, reporting, predicting, acting, and refining. The authors found that the accuracy and effectiveness of predictive models heavily depend on the quality and availability of data; poor data quality, incomplete data, and data silos were identified as barriers to successfully implementing predictive learning analytics. The research cited above echoed Tsai's summary of the tension between institutions' need to innovate and their existing capacity to tackle priorities.
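
The data-quality concerns noted above can be illustrated with a minimal completeness check run before any predictive modeling; the column names and records below are invented, and the check itself is only a simple example of the kind of screening an institution might perform.

```python
# Illustrative sketch only: basic data-quality checks before building a predictive model,
# reflecting the data-quality concerns noted in the literature; column names are invented.
import pandas as pd
import numpy as np

raw = pd.DataFrame({
    "student_id":      [1, 2, 3, 4],
    "logins_per_week": [10, np.nan, 4, 7],
    "final_grade":     [85, 72, np.nan, 90],
})

# Report the share of missing values per column (a simple completeness check)
print(raw.isna().mean())

# Keep only records complete enough to train on, and log how many were dropped
clean = raw.dropna()
print(f"Dropped {len(raw) - len(clean)} incomplete records out of {len(raw)}")
```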

Ethics and Privacy

Lastly, Tsai et al. (2020) stressed that, in European Higher Education, data collection for learning analytics has raised concerns about intruding on learners' privacy and various ethical implications. In the United States, Cope and Kalantzis (2016) raised some similar challenges around data privacy. The first concern is how big data may negatively impact students and faculty by predetermining their academic performance and professional destinies. The second concern is privacy protection for users when big data methods are used. The last concern is how we can prevent big data from becoming "Big Brother" and instead recruit users as co-collectors and co-researchers so they are at the center of the data-driven decision process.

Besides data privacy, there is also a concern about increased bias against students when using predictive data. For example, Klein et al. (2019) discussed that learning analytics data, including prior academic experiences, could engender bias. Thus, learning analytics data, whether used to predict, surveil, or inform, needs a clear purpose for its use. Apart from the bias issue, learning analytics can also bring surveillance issues for faculty and students. Though faculty may provide quick responses to students, students may feel that their every action in the learning management system is monitored, which can cause discomfort on the student end. Though the ethical and privacy issues in learning analytics concern students and instructors, Viberg et al. (2018) conducted a literature review of the themes in learning analytics research and found that only 18% of the studies addressed ethics and privacy issues. Their results also indicate that only a few empirical studies approach this issue systematically; examples focus on institutional ethical considerations, moral and policy issues, and students' vulnerabilities and autonomy.

Gaps in the Literature

Multiple stakeholders can benefit from learning analytics, from students and instructors to instructional designers and administrators across institutions. However, based on the literature reviewed, many areas of its usage and application need further research.

Firstly, a few studies in the literature reviewed suggested that future research should focus on clear definitions of learning analytics and its combination with learning science. Ifenthaler and Yau (2020) recommended that future research focus on designing large-scale, longitudinal, or quasi-experimental studies with well-defined and operationalized constructs. Standards and research methodologies for producing valid findings are also much needed. Similarly, Tsai et al. (2020) addressed the importance of grounding learning analytics design and implementation in learning science. The authors suggested that future research should address the barriers to implementing learning analytics and develop more effective models that can enhance teaching and learning processes. Viberg et al. (2018) also agreed that more future research with well-designed mixed-method approaches would help us understand the complex learning environment. In particular, future studies should combine pedagogical knowledge and learning analytics when posing research questions so that learning practices can be improved in Higher Education.

Secondly, though a few researchers have proposed frameworks for learning analytics, more studies on their validation and examination are still needed. Muljana and Luo (2021) emphasized that future research could validate the existing learning analytics frameworks to investigate the design process (critical decisions, design interventions, outcomes, contexts, and conditions) from instructional designers' perspectives. Regarding student involvement, several European researchers suggested addressing the imbalance between instructors and students, for example, by examining the existing frameworks to include students' voices in the design process (Jivet et al., 2020; Tsai et al., 2020). Jonathan et al. (2018) recommended future work to create a learning analytics framework that can visualize the performance of all users and the processes involved for accountability. They also suggested that the future of learning analytics in Higher Education lies in developing more sophisticated models to measure and improve student performance and engagement.

Thirdly, more research needs to be done on learning analytics from the instructional design perspective. Muljana and Luo (2021) suggested that scholars consider exploring the perspectives of instructional designers and the current state of their learning analytics-related practices in various settings, such as corporate and healthcare contexts, where utilizing data may be a norm. Furthermore, Klein et al. (2019) encouraged future research focusing on useful visualizations and on fulfilling instructional designers' professional needs. Al-Tameemi et al. (2020) proposed future research to identify the correlation between student activities on learning platforms and their final-year results using appropriate machine learning and data mining algorithms. Cerro Martínez et al. (2020) proposed research on the effect of making learning analytics available to students involved in collaborative learning; exploring how students self-regulate during this process and what strategies they deploy to achieve their objectives would be a promising direction. Also, during the assessment process, researchers can give students access to learning analytics during the course, allowing them to co-create assessment criteria and offering training in data use. Liu et al. (2017) stressed that future research should investigate better alignment between learners' prior knowledge and assessments to improve the efficiency of assessment and the presentation of personalized content; there is also a need to research learner profiling and trace behavior patterns to generate effective personalized learning paths.

Further, more researchers called for future studies on learning analytics efficacy from the learner and user perspective. For example, Viberg et al. (2018) recommended that institutions consider how learning analytics can be facilitated to benefit learners and that guidelines be provided for researchers and practitioners focusing on opportunities, barriers, and challenges. Oster et al. (2016) also suggested that institutions develop an internal communication plan regarding learning analytics policies and practices. Rienties et al. (2015) suggested future research on integrating demographic, individual, and sociocultural data about students to evaluate whether learning designs fit learners' needs. Most importantly, they also advocated combining research and institutional data to explore how context, individual characteristics, and learning design activities influence students' learning journeys.

Lastly, more needs to be understood about learning analytics implementation in terms of privacy and ethical issues. Dhankhar and Solanki (2020) examined various worldwide challenges, such as privacy and ethical issues, and argued that future research should carefully consider and address institutional and cultural contexts. Similarly, Oliva-Córdova et al. (2021) recommended research on how Higher Education institutions can provide learning analytics training around privacy and ethical concerns for all stakeholders; such training is intended to be applied in daily practice so that both students and instructors benefit from it. Viberg et al. (2018) also agreed that future research should consider ethical issues, as 80% of the papers examined in their literature review failed to mention ethics. Furthermore, they called for ethical validation of the learning analytics tools and methods used, to better support the quality and efficiency of Higher Education.

Conclusion

Oliva-Córdova et al. (2021) stated that learning analytics is considered the third wave in educational technology and is a promising, rapidly evolving field of study. The authors summarized that learning analytics has been studied for a decade and is deemed effective in Higher Education for improving student learning outcomes, increasing retention, and guiding decision-making toward better teaching and learning. This general field literature review examined existing definitions of learning analytics in the Higher Education context and then compared related terms, theories, and frameworks. Next, learning analytics implementations were synthesized and grouped according to four major purposes: student success, teaching excellence, course design, and institutional goals. While the deployment of learning analytics prevails across universities worldwide, some common challenges were identified, including the lack of buy-in from stakeholders, a weak learning science foundation, inadequate training and resources, and a lack of studies on ethical and privacy issues. Ultimately, researchers have addressed future research directions in four areas: clear definitions, validated frameworks, instructional design issues, and privacy and ethical issues. This general field literature review will help the author finalize the specific research questions and guide the continuing literature review in the special field of learning analytics.

Part 2: Special Field (placeholder)

Chapter 3: Theory and Methodology (placeholder)

Chapter 4: Findings (placeholder)

Chapter 5: Conclusions (placeholder)


References

Aldosari, A. M., Eid, H. F., & Chen, Y.-P. P. (2022). A Proposed Strategy Based on Instructional Design Models through an LMS to Develop Online Learning in Higher Education Considering the Lockdown Period of the COVID-19 Pandemic. Sustainability, 14(13), Article 13. https://doi.org/10.3390/su14137843

Al-Tameemi, G., Xue, J., Ajit, S., Kanakis, T., & Hadi, I. (2020). Predictive Learning Analytics in Higher Education: Factors, Methods and Challenges. 2020 International Conference on Advances in Computing and Communication Engineering (ICACCE), 1–9. https://doi.org/10.1109/ICACCE49060.2020.9154946

Arnold, K. E., Lonn, S., & Pistilli, M. D. (2014). An exercise in institutional reflection: The learning analytics readiness instrument (LARI). Proceedings of the Fourth International Conference on Learning Analytics and Knowledge.

Bronnimann, J., West, D., Huijser, H., & Heath, D. (2018). Applying Learning Analytics to the Scholarship of Teaching and Learning. Innovative Higher Education, 43(5), 353–367. https://doi.org/10.1007/s10755-018-9431-5

Cerro Martínez, J. P., Guitert Catasús, M., & Romeu Fontanillas, T. (2020). Impact of using learning analytics in asynchronous online discussions in higher education. International Journal of Educational Technology in Higher Education, 17(1), 39. https://doi.org/10.1186/s41239-020-00217-y

Cope, B., & Kalantzis, M. (2015). Sources of Evidence-of-Learning: Learning and assessment in the era of big data. Open Review of Educational Research, 2(1), 194–217. https://doi.org/10.1080/23265507.2015.1074869

Cope, B., & Kalantzis, M. (2016). Big Data Comes to School: Implications for Learning, Assessment, and Research. AERA Open, 2(2), 2332858416641907. https://doi.org/10.1177/2332858416641907

Cope, B., & Kalantzis, M. (Eds.). (2017). e-Learning Ecologies: Principles for New Learning and Assessment. Routledge. https://doi.org/10.4324/9781315639215

Dhankhar, A., & Solanki, K. (2020). State of the art of learning analytics in higher education. International Journal of Emerging Trends in Engineering Research, 8(3), 868–877. https://doi.org/10.30534/ijeter/2020/43832020

Foster, E., & Siddle, R. (2020). The effectiveness of learning analytics for identifying at-risk students in higher education. Assessment & Evaluation in Higher Education, 45, 842–854. https://doi.org/10.1080/02602938.2019.1682118

Horn, A. S., Horner, O. G., & Lee, G. (2019). Measuring the effectiveness of two-year colleges: A comparison of raw and value-added performance indicators. Studies in Higher Education, 44(1), 151–169. https://doi.org/10.1080/03075079.2017.1349741

Ifenthaler, D., & Yau, J. Y.-K. (2020). Utilising learning analytics to support study success in higher education: A systematic review. Educational Technology Research and Development, 68(4), 1961–1990. https://doi.org/10.1007/s11423-020-09788-z

Jääskelä, P., Heilala, V., Kärkkäinen, T., & Häkkinen, P. (2021). Student agency analytics: Learning analytics as a tool for analysing student agency in higher education. Behaviour & Information Technology, 40(8), 790–808. https://doi.org/10.1080/0144929X.2020.1725130

Jayashanka, R., Hewagamage, K. P., & Hettiarachchi, E. (2019). An Intelligent Interactive Visualizer to Improve Blended Learning in Higher Education. 2019 Twelfth International Conference on Ubi-Media Computing (Ubi-Media), 69–73. https://doi.org/10.1109/Ubi-Media.2019.00022

Jivet, I., Scheffel, M., Schmitz, M., Robbers, S., Specht, M., & Drachsler, H. (2020). From students with love: An empirical study on learner goals, self-regulated learning and sense-making of learning analytics in higher education. Internet and Higher Education, 47. https://doi.org/10.1016/j.iheduc.2020.100758

Jonathan, J., Sohail, S., Kotob, F., & Salter, G. (2018). The Role of Learning Analytics in Performance Measurement in a Higher Education Institution. 2018 IEEE International Conference on Teaching, Assessment, and Learning for Engineering (TALE), 1201–1203. https://doi.org/10.1109/TALE.2018.8615151

Klein, C., Lester, J., Rangwala, H., & Johri, A. (2019). Technological barriers and incentives to learning analytics adoption in higher education: Insights from users. Journal of Computing in Higher Education, 31(3), 604–625. https://doi.org/10.1007/s12528-019-09210-5

Liu, M., Kang, J., Zou, W., Lee, H., Pan, Z., & Corliss, S. (2017). Using Data to Understand How to Better Design Adaptive Learning. Technology, Knowledge and Learning, 22(3), 271–298. https://doi.org/10.1007/s10758-017-9326-z

Mangaroska, K., Sharma, K., Gaševic, D., & Giannakos, M. (2020). Multimodal Learning Analytics to Inform Learning Design: Lessons Learned from Computing Education. Journal of Learning Analytics, 7(3), 79–97.

Moore, J. L., Dickson-Deane, C., & Galyen, K. (2011). e-Learning, online learning, and distance learning environments: Are they the same? The Internet and Higher Education, 14(2), 129–135. https://doi.org/10.1016/j.iheduc.2010.10.001

Muljana, P. S., & Luo, T. (2021). Utilizing learning analytics in course design: Voices from instructional designers in higher education. Journal of Computing in Higher Education, 33(1), 206–234. https://doi.org/10.1007/s12528-020-09262-y

Najdawi, A., & Stanley, J. S. (2021). Exploring the Role of Big Data Analytics in Reinnovating Higher Education: The Case of UAE. 2021 International Conference on Innovative Practices in Technology and Management (ICIPTM), 200–204. https://doi.org/10.1109/ICIPTM52218.2021.9388354

Nguyen, Q., Huptych, M., & Rienties, B. (2018). Linking students’ timing of engagement to learning design and academic performance. Proceedings of the 8th International Conference on Learning Analytics and Knowledge, 141–150. https://doi.org/10.1145/3170358.3170398

Oliva-Córdova, L. M., Garcia-Cabot, A., & Amado-Salvatierra, H. R. (2021). Learning Analytics to Support Teaching Skills: A Systematic Literature Review. IEEE Access, 9, 58351–58363. https://doi.org/10.1109/ACCESS.2021.3070294

Oster, M., Lonn, S., Pistilli, M. D., & Brown, M. G. (2016). The learning analytics readiness instrument. Proceedings of the Sixth International Conference on Learning Analytics & Knowledge - LAK ’16, 173–182. https://doi.org/10.1145/2883851.2883925

Patil, J. M., & Gupta, S. R. (2019). Analytical Review on Various Aspects of Educational Data Mining and Learning Analytics. 2019 International Conference on Innovative Trends and Advances in Engineering and Technology (ICITAET), 170–177. https://doi.org/10.1109/ICITAET47105.2019.9170143

Prieto, L. P., Rodríguez-Triana, M. J., Martínez-Maldonado, R., Dimitriadis, Y., & Gašević, D. (2019). Orchestrating learning analytics (OrLA): Supporting inter-stakeholder communication about adoption of learning analytics at the classroom level. Australasian Journal of Educational Technology, 35(4), 14–33. https://doi.org/10.14742/ajet.4314

Rienties, B., Toetenel, L., & Bryan, A. (2015). “Scaling up” learning design: Impact of learning design activities on LMS behavior and performance. Proceedings of the Fifth International Conference on Learning Analytics And Knowledge, 315–319. https://doi.org/10.1145/2723576.2723600

Sonderlund, A. L., Hughes, E., & Smith, J. (2019). The efficacy of learning analytics interventions in higher education: A systematic review. British Journal of Educational Technology, 50(5), 2594–2618. https://doi.org/10.1111/bjet.12720

Sonnenberg, C., & Bannert, M. (2015). Discovering the Effects of Metacognitive Prompts on the Sequential Structure of SRL-Processes Using Process Mining Techniques. Journal of Learning Analytics, 2(1), Article 1. https://doi.org/10.18608/jla.2015.21.5

Toro-Troconis, M., Alexander, J., & Frutos-Perez, M. (2019). Assessing Student Engagement in Online Programmes: Using Learning Design and Learning Analytics. International Journal of Higher Education, 8(6), 171–183.

Tsai, Y.-S., Moreno-Marcos, P. M., Tammets, K., Kollom, K., & Gašević, D. (2018). SHEILA policy framework: Informing institutional strategies and policy processes of learning analytics. Proceedings of the 8th International Conference on Learning Analytics and Knowledge, 320–329. https://doi.org/10.1145/3170358.3170367

Tsai, Y.-S., Rates, D., Moreno-Marcos, P. M., Muñoz-Merino, P. J., Jivet, I., Scheffel, M., Drachsler, H., Delgado Kloos, C., & Gašević, D. (2020). Learning analytics in European higher education—Trends and barriers. Computers & Education, 155, 103933. https://doi.org/10.1016/j.compedu.2020.103933

Vanessa Niet, Y., Díaz, V. G., & Montenegro, C. E. (2016). Academic decision making model for higher education institutions using learning analytics. 2016 4th International Symposium on Computational and Business Intelligence (ISCBI), 27–32. https://doi.org/10.1109/ISCBI.2016.7743255

Viberg, O., Hatakka, M., Bälter, O., & Mavroudi, A. (2018). The current landscape of learning analytics in higher education. Computers in Human Behavior, 89, 98–110. https://doi.org/10.1016/j.chb.2018.07.027


Appendix (placeholder)
