What is Learning Analytics?
“LA [Learning Analytics] collects and analyzes the ‘digital breadcrumbs’ that students leave as they interact with various computer systems to look for correlations between those activities and learning outcomes” (Educause, 2011).
Learning analytics is the process of taking the data produced and collected by the technologies, activities, and assessments used for learning and using it to measure, and provide feedback on, progress toward particular learning outcomes. The process also aims to make the feedback loop much shorter and, in turn, more useful to the learner and the teacher. Historically, the feedback loop in learning assessment (grading and evaluation) has been difficult to speed up enough to act on before the lesson or learning outcome has ended. With faster feedback, the learner and teacher can act on analysis that suggests remediation, review, or advancement into particular disciplines.
The Society for Learning Analytics Research adopted its own definition, similar to Educause's:
"Learning analytics is the measurement, collection, analysis and reporting of data about learners and their contexts, for purposes of understanding and optimising learning and the environments in which it occurs" (Ferguson, 2012).
Learning analytics is different from academic analytics, as illustrated by Long and Siemens in the chart below.
Long and Siemens also indicate that, as we draw the distinction between learning analytics and academic analytics, we should map out stages of learning analytics development. They suggest:
- Course-level: learning trails, social network analysis, discourse analysis
- Educational data-mining: predictive modeling, clustering, pattern mining
- Intelligent curriculum: the development of semantically defined curricular resources
- Adaptive content: adaptive sequence of content based on learner behavior, recommender systems
- Adaptive learning: the adaptive learning process (social interactions, learning activity, learner support, not only content)
(Long & Siemens, 2011)
Learning analytics is meant to look deeply at the learning process and assist teachers in helping each learner succeed. At the course level, it can help teachers analyze student work and make the pedagogical changes and interventions needed to get a student back on track. These analytics can also help teachers identify learners who are more advanced and need to move on to more complex outcomes to stay engaged and studious. Used this way, focused learning analytics could help maintain the rigor required of learning outcomes by consistently alerting teachers when a student is ready to advance, allowing the teacher to act on it.
Course-level analytics can help the teacher gauge engagement levels and time on task and intervene accordingly. Data mining has great potential for helping teachers predict learner struggles and persistence and intervene accordingly. Analytics inside an intelligent curriculum can help curriculum developers and teachers keep the relationships between topics and disciplines intact, allowing learners to move through content and make connections. Analytics in adaptive content and adaptive learning let the learner and teacher see where the student stands, keeping the pedagogical methods appropriate to the individual learner and the content level appropriate to the individual, allowing learners to remediate or advance depending on content mastery.
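As a concrete illustration of course-level analytics, the sketch below aggregates hypothetical LMS click-stream events into per-student time on task and flags low-engagement students for follow-up. The event format, the 30-minute session timeout, and the 60-minute threshold are all illustrative assumptions, not features of any particular LMS.

```python
from collections import defaultdict

SESSION_TIMEOUT = 30  # minutes; an assumed cutoff between working sessions

def time_on_task(events):
    """Estimate minutes on task per student from (student, timestamp) events.

    Gaps longer than SESSION_TIMEOUT are treated as time away from the course.
    """
    by_student = defaultdict(list)
    for student, t in events:
        by_student[student].append(t)
    totals = {}
    for student, times in by_student.items():
        times.sort()
        total = 0
        for prev, cur in zip(times, times[1:]):
            gap = cur - prev
            if gap <= SESSION_TIMEOUT:  # still within the same session
                total += gap
        totals[student] = total
    return totals

def low_engagement(totals, threshold=60):
    """Flag students with under `threshold` minutes on task."""
    return sorted(s for s, t in totals.items() if t < threshold)

# Hypothetical click timestamps (in minutes) for two students
events = [("ana", 0), ("ana", 10), ("ana", 35), ("ana", 60),
          ("ana", 85), ("ana", 110), ("ben", 0), ("ben", 5)]
totals = time_on_task(events)
flagged = low_engagement(totals)
```

A teacher's dashboard could surface `flagged` each morning, turning raw click data into a prompt for intervention.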
“[L]earning analytics can lead to crucial interventions that can help students complete their education programs successfully and in a timely fashion. More broadly, institutions can apply lessons drawn from learning analytics to deepen student learning and engagement overall by fine-tuning pedagogy, enriching student services, and generally honing how they deliver educational content”(“Building Blocks”, 2013).
Teachers and learners, as well as school administrators, can gather data from various sources and use it to make modifications according to individual student needs. “Such data can be drawn from course management systems, e-portfolios, library systems, student information systems, clicker systems, external publishers and other providers, and other sources”(“Building Blocks”, 2013).
Here is an infographic illustrating the potential of Learning Analytics:
From where is Learning Analytics derived?
Learning analytics derives from a variety of fields: educational data mining, business intelligence, and web analytics have all contributed to its beginnings. Other factors contributing to the field's development are the mainstream use of online learning and learning management systems (LMS) and the affordability of "Big Data."
Educational data mining has been used for decades at the institutional, state, and federal levels, but never at the course or classroom level. Big Data use in business intelligence has driven the growth of analytics in business, allowing businesses to target marketing to individual customers based on previous purchases or interest in particular items online. Amazon is the prime example of using data to analyze customer wants and needs. Recommender tools are used by Amazon for purchasing and advertising, and by companies like Netflix, which analyzes a user's previous viewings and rankings to recommend films and television. Web analytics, as modeled by Google, has also contributed to the development of learning analytics by demonstrating targeted, individualized advertising based on searches, clicks, and other datasets (Ferguson, 2012).
Major changes in how the internet was experienced arrived in the first decade of the 21st century, making the read/write web the norm online and allowing people to externalize desires, needs, and thoughts along with a host of other personal information. Social media became the norm, and online learning became regular practice; both took part in this new read/write web, where social interaction through online means became desirable. From these social connections and interactions, data could be extracted and analyzed, and educational analytics began developing along with the new social web. In 2010, the field of analytics split into a few separate areas of research:
- Educational data mining focused on the technical challenge: How can we extract value from these big sets of learning-related data?
- Learning analytics focused on the educational challenge: How can we optimise opportunities for online learning?
- Academic analytics focused on the political/economic challenge: How can we substantially improve learning opportunities and educational results at national or international levels?
(Ferguson, 2012)
Learning analytics is still an emerging field in which we investigate learning through data, allowing us to make changes in learning environments and/or learning behaviors. Buckingham Shum indicates that this work goes well beyond technologists: it should include educational researchers, practitioners, and the learners themselves, so that as many learners as possible can benefit from the information derived from these analytics (Buckingham Shum, 2012).
How could Learning Analytics be used?
Although learning analytics is an emerging field born of the advent of huge amounts of data, some uses already exist, and more will exist in the near future.
1. Dispositions, values and attitudes. “There is substantial and growing evidence within educational research that learners’ orientation towards learning—their learning dispositions—significantly influence the nature of their engagement with new learning opportunities, in both formal and informal contexts. Learning dispositions form an important part of learning-to-learn competences, which are widely understood as a key requirement for life in the 21st Century” (Buckingham Shum and Deakin Crick, 2012). This study is looking into how learning analytics can help teachers and learners themselves understand their “learning power.”
2. Outcome mastery, engagement with learning outcomes, interventions. The U.S. Department of Education published an issue brief to bring together information and research in the field of data mining and learning analytics.
“Educational data mining and learning analytics are used to research and build models in several areas that can influence online learning systems. One area is user modeling, which encompasses what a learner knows, what a learner’s behavior and motivation are, what the user experience is like, and how satisfied users are with online learning. At the simplest level, analytics can detect when a student in an online course is going astray and nudge him or her on to a course correction. At the most complex, they hold promise of detecting boredom from patterns of key clicks and redirecting the student’s attention. Because these data are gathered in real time, there is a real possibility of continuous improvement via multiple feedback loops that operate at different time scales—immediate to the student for the next problem, daily to the teacher for the next day’s teaching, monthly to the principal for judging progress, and annually to the district and state administrators for overall school improvement.” (Bienkowski, Feng, & Means, 2012).
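The brief's "nudge on a course correction" and multi-speed feedback loops can be sketched as simple rules over a student's recent attempts. The three-wrong-in-a-row rule and the data shapes below are illustrative assumptions, far simpler than the user models the brief describes.

```python
def needs_nudge(recent_results, streak=3):
    """Return True if the last `streak` attempts were all incorrect.

    `recent_results` is a list of booleans (True = correct), newest last.
    A stand-in for richer models of knowledge, behavior, and motivation.
    """
    return len(recent_results) >= streak and not any(recent_results[-streak:])

def feedback_loops(attempts):
    """Two feedback loops at different time scales.

    Immediate: which students get a nudge right now.
    Daily: error counts summarized for the teacher's next-day planning.
    `attempts` maps student -> list of (correct, topic) tuples, newest last.
    """
    nudges = {s: needs_nudge([c for c, _ in rs]) for s, rs in attempts.items()}
    daily_errors = {s: sum(1 for c, _ in rs if not c) for s, rs in attempts.items()}
    return nudges, daily_errors

# Hypothetical attempt logs for two students
attempts = {"cai": [(True, "fractions"), (False, "fractions"),
                    (False, "fractions"), (False, "fractions")],
            "dee": [(True, "algebra"), (True, "algebra")]}
nudges, daily_errors = feedback_loops(attempts)
```

Monthly and annual loops for principals and administrators would simply aggregate the same data over longer windows.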
Learning analytics differs from educational data mining in that it enables human intervention: adjusting learning content and instructional practice, and assisting at-risk students.
Here is a chart illustrating potential uses of learning analytics from the Department of Education Brief:
Examples of Use
Dr. Diane Reddy, Professor of Psychology and Director of the Center for Excellence in Teaching and Learning at the University of Wisconsin-Milwaukee, discusses the use of U-Pace within the university's learning management system (LMS) to watch for moments in learning that may require instructional intervention.
She says:
“At the University of Wisconsin-Milwaukee, the U-Pace online instructional approach offers guidance for making these decisions, outlining how instructors can create messages called Amplified Assistance—which is designed to support students as they tackle challenges and take control of their learning. (Multiple studies have found that these proactive, personalized instructor-initiated messages help students to achieve success.)
How It Works
To build Amplified Assistance messages, U-Pace instructors use data from any LMS to identify specific concepts that pose a challenge to individual students, and the scores related to any quiz attempts. With this information, instructors can then send positive messages that incorporate both personalized help with content and an unwavering belief in that student’s ability to succeed. These messages also recognize the effort and small steps that students are taking towards success, encouraging further personal and academic growth.”
(Reddy, 2014)
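A minimal sketch of the U-Pace idea, assuming quiz items are tagged with a concept: group a student's attempts by concept, find the weakest ones, and draft a supportive note. The data shapes, the 70% mastery cutoff, and the message wording are all assumptions for illustration; actual Amplified Assistance messages are composed by instructors.

```python
def weak_concepts(quiz_attempts, mastery=0.7):
    """Return concepts whose share of correct attempts is below `mastery`.

    `quiz_attempts` is a list of (concept, correct) tuples for one student.
    """
    totals, correct = {}, {}
    for concept, ok in quiz_attempts:
        totals[concept] = totals.get(concept, 0) + 1
        correct[concept] = correct.get(concept, 0) + (1 if ok else 0)
    return sorted(c for c in totals if correct[c] / totals[c] < mastery)

def draft_message(name, concepts):
    """Draft an encouraging note in the spirit of Amplified Assistance."""
    if not concepts:
        return f"Great work, {name} - keep it up!"
    topics = ", ".join(concepts)
    return (f"Hi {name}, I noticed {topics} gave you some trouble on recent "
            f"quizzes. That's a normal part of learning - here are some "
            f"resources, and I'm confident you'll master them.")

# Hypothetical attempts: "loops" answered correctly once in three tries
attempts = [("loops", True), ("loops", False), ("loops", False),
            ("recursion", True)]
weak = weak_concepts(attempts)
message = draft_message("Sam", weak)
```

An instructor would review and personalize such a draft before sending, keeping the human in the loop that distinguishes learning analytics from pure data mining.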
Another example of learning analytics in use comes from the Iowa Community College Online Consortium, where seven colleges use common learning analytics to improve student success on learning outcomes.
“The goal of our innovation is to use key data indicators to identify when a student’s trajectory toward success [is] off course and intervene with students very quickly to get them back on track. This project allowed us to provide an at-risk dashboard to instructors so that they could more quickly identify students at-risk of failure. For this project our focus has been College Composition I, Math for Liberal Arts, and College Algebra.” (Iowa Community College, 2012).
There is more about this research project here: http://www.iowacconline.org/NGLC
Another popular example is Purdue University's Course Signals project. Course Signals uses learning analytics to provide feedback to students, to detect early problems with understanding material or with general engagement, and to give students frequent, constant feedback on how well they are doing in a course. Its intervention systems allow the faculty member to engage with students before it is too late, connect them with help from various support areas on campus, and potentially try different pedagogical approaches with a particular student. It uses a basic green/yellow/red stoplight system to indicate to students where they may require additional time and practice and where they are doing well.
Arnold and Pistilli indicate such in their research report on the program. They say:
"Course Signals (CS) is a student success system that allows faculty to provide meaningful feedback to student based on predictive models. The premise behind CS is fairly simple: utilize the wealth of data found at an educational institution, including the data collected by instructional tools, to determine in real time which students might be at risk, partially indicated by their effort within a course" (2012, p. 1).
Potential Strengths
(Brown, 2012; Pinantoan, 2012)
Weaknesses
Ethics and data use
There are many possibilities yet to be discovered within learning analytics, and many questions to ask and ground rules to develop. Educators and administrators should start thinking about the ethical dilemmas that come with all this data collection; issues of privacy, surveillance, and the need for transparency are only a few to tackle. In their article, Sharon Slade and Paul Prinsloo look into major areas of ethical concern. They indicate that although understanding where students are in their progress is extremely beneficial to teacher and student, we must be aware of the ethical implications of such massive data collection. They write, "such collection of data and its use faces a number of ethical challenges, including issues of the location and interpretation of data; informed consent, privacy and the de-identification of data; and the classification and management of data” (2013).
George Siemens even ends his lecture, The Role of Learning Analytics in Improving Teaching and Learning, by indicating that ethics and privacy are the areas that need the most work. He notes that when collecting data, we may focus on those students most at risk of not succeeding and, even by accident, begin to establish a socioeconomic divide along at-risk indicators. Or it could happen purposefully: Siemens describes how researchers can take a person's pattern of liking things on Facebook and determine a number of things about that person. Is that ethical? Does it get into issues of Big Brother? How well can we understand our students? How far should we go? Siemens says the hard question about learning analytics is not what can we do, "but what shouldn't we do" (Siemens, 2013).
New fields
Siemens's lecture on the role of learning analytics in teaching and learning is worth watching, particularly as he discusses how the externalization of thought, relations, and connections of knowledge can be analyzed for areas of success and/or improvement. Social learning, he says, and the analytics behind it can help us reach more authentic learning results, since it encourages us to externalize these thoughts, relations, and connections; once they are externalized, they can be analyzed (Siemens, 2013, 16 min mark).
Learning analytics itself has subsections that researchers are currently developing. Buckingham Shum identifies a number of these subsections in his UNESCO policy brief, and they will have great implications for research and implementation (Buckingham Shum, 2012).
The future is bright for learning analytics, given the prospect of real-time feedback from a system that lets me, as a teacher, adjust my pedagogy or curriculum as it relates to my learners. If a teacher can see when individual students cannot master a particular learning outcome, the teacher can react with positive interventions and changes in pedagogy to accommodate different learning styles. This could have a tremendous effect on student success. Students can see how they are doing, ideally at the outcome level, and make their own decisions about how to remediate or ask the teacher for assistance. Long and Siemens say, "For educators, the availability of real-time insight into the performance of learners—including students who are at-risk—can be a significant help in the planning of teaching activities. For students, receiving information about their performance in relation to their peers or about their progress in relation to their personal goals can be motivating and encouraging" (2011).
Crucial rules need to be put in place regarding privacy, student ownership of data (much like medical records), intrusion, and the misuse of predictive modeling and profiling tools, such as locking students onto a particular track too early in their education or using analytics as summative, punitive devices for school funding and other such practices.
With the right ethics in place and rules established, learning analytics could and should become a real-time, constant formative assessment, informing student and educator about the student's progress, successes, difficulties, and dispositions toward learning.
EDUCause Library - Learning Analytics: http://www.educause.edu/library/learning-analytics
Journal of Learning Analytics is a peer-reviewed, open-access journal, disseminating the highest quality research in the field. The journal is the official publication of the Society for Learning Analytics Research (SoLAR). http://learning-analytics.info/ (new in 2014)
Society for Learning Analytics Research: http://solaresearch.org/
International Educational Data Mining Society: http://www.educationaldatamining.org/
Learning and Knowledge Analytics blog
Learning Analytics: Welcome to the future of assessment? Simon Buckingham Shum: https://www.youtube.com/watch?v=LqDEJtlzMiw
Arnold, K.E. and Pistilli, M.D. (2012). "Course Signals at Purdue: Using Learning Analytics to Increase Student Success." LAK '12 Proceedings of the 2nd International Conference on Learning Analytics and Knowledge. Retrieved from http://www.itap.purdue.edu/learning/docs/research/Arnold_Pistilli-Purdue_University_Course_Signals-2012.pdf
Bienkowski, M., Feng, M., & Means, B. (2012, October). "Enhancing Teaching and Learning Through Educational Data Mining and Learning Analytics: An Issue Brief." U.S. Department of Education: Office of Educational Technology. Retrieved from http://www.ed.gov/edblogs/technology/files/2012/03/edm-la-brief.pdf
Brown, M. (2012, July). "Learning Analytics: Moving from Concept to Practice." Educause Learning Initiative Brief. Retrieved from http://net.educause.edu/ir/library/pdf/ELIB1203.pdf
Buckingham Shum, S. (2012). "Learning Analytics Policy Briefing." UNESCO. Retrieved from http://iite.unesco.org/pics/publications/en/files/3214711.pdf
Buckingham Shum, S. and Deakin Crick, R. (2012). "Learning dispositions and transferable competencies: pedagogy, modelling and learning analytics." In: 2nd International Conference on Learning Analytics & Knowledge, 29 Apr - 02 May 2012, Vancouver, British Columbia, Canada. Retrieved from http://oro.open.ac.uk/32823/1/SBS-RDC-LAK12-ORO.pdf
"Building Blocks for College Completion: Learning Analytics." (2013, September). EDUCause: Next Generation Learning Challenges. Retrieved from http://net.educause.edu/ir/library/pdf/NGI1301.pdf
Clow, D. (2013). "An overview of learning analytics." Teaching in Higher Education, 18(6) pp. 683–695.
EDUCause. (2011). "Seven things you should know about first generation learning analytics." EDUCause Learning Initiative. Retrieved from http://net.educause.edu/ir/library/pdf/ELI7079.pdf
Ferguson, R. (2012). "Learning analytics: drivers, developments and challenges." International Journal of Technology Enhanced Learning, 4(5/6) pp. 304–317. Retrieved from http://oro.open.ac.uk/36374/1/IJTEL40501_Ferguson%20Jan%202013.pdf
Learning Analytics: Leveraging Education Data [infographic] http://www.opencolleges.edu.au/informed/learning-analytics-infographic/
Long, P., & Siemens, G. (2011). "Penetrating the fog: Analytics in learning and education." EDUCAUSE Review September/October, 31-40. Retrieved from http://www.educause.edu/ero/article/penetrating-fog-analytics-learning-and-education
Pinantoan, A. (2012). "Grades 2.0: How Learning Analytics Are Changing The Teacher’s Role." Edudemic. Retrieved from http://www.edudemic.com/grades-2-0-how-learning-analytics-are-changing-the-teachers-role/
Reddy, D. (2014, September). U-Pace and learning analytics: Translating LMS data into successful instructional interventions. Next Generation Learning Challenges Blog. Retrieved from http://nextgenlearning.org/blog/u-pace-learning-analytics-translating-lms-data-successful-instructional-interventions
Siemens, G. (2013, March). "The Role of Learning Analytics in Improving Teaching and Learning." Presented at Penn State TLT Conference. Retrieved from http://youtu.be/0ILt-ERdb64
Slade, S. and Prinsloo, P. (2013). "Learning analytics: ethical issues and dilemmas." American Behavioral Scientist, 57(10) pp. 1509–1528. Retrieved from http://oro.open.ac.uk/36594/2/ECE12B6B.pdf