New Learning’s Updates

Scholar’s New Analytics App: Towards Mastery Learning

We are pleased to announce the launch of Scholar’s new Analytics area, created by our research team at the University of Illinois with the support of funding from the National Science Foundation.[1] This post is a brief conceptual orientation to the tool, now online at http://cgscholar.com.

The Theory and Practice of Mastery Learning

Nearly 50 years ago, the renowned educator Benjamin Bloom suggested an approach to teaching and learning that he called “Mastery Learning” (Bloom 1968). His underlying proposition was that when students are normally distributed by aptitude and they experience uniform teaching, achievement will also be normally distributed. However, “given sufficient time (and appropriate types of help) 95 percent of students … can learn a subject up to a high level of mastery, … [but] it will take some students more effort, time, and help to achieve this level than it will others” (p. 4). Fig. 1 illustrates the contrast between classical uniform instruction and the hypothesized outcomes of mastery learning (Block 1971: 6-7).

Fig. 1: The outcomes of uniform instruction compared to optimal instruction in Mastery Learning

This proposition today underlies standards-based approaches to teaching and learning, where it is assumed that most students at a particular grade level have the potential to reach the prescribed standard (Kalantzis and Cope 2012: chapter 10).

Bloom made a number of suggestions for optimal instruction, including regular formative assessment that prompts teachers to intervene when students fall behind, allowing students to work at their own pace, group work, and intensive tutoring. However, these approaches have consistently been frustrated by the logistical challenges of differentiating instruction in traditional classrooms and by the continued emphasis on summative rather than formative assessment.

A new generation of technologies grouped under the concept of “learning analytics” (Behrens and DiCerbo 2013, Cope and Kalantzis 2015, Cope and Kalantzis 2016, Cope and Kalantzis 2017, Siemens and Baker 2013) may at last make Bloom’s aspirations a practical possibility, offering integral formative assessment that supports and tracks progress towards mastery, providing structured supports for group or peer-to-peer learning, and making it easier for the teacher to manage the logistics of differentiated instruction. Building on research at the University of Illinois in the area of learning analytics, this is what we now intend to offer in the Scholar: Analytics tool.

Scholar: Analytics is a learning visualization tool: in any unit of work, a student can see their own progress towards mastery, and a teacher can see the comparative progress of all members of the class, identifying which students may require more time or special attention to achieve mastery.

The intended outcomes of Scholar: Analytics are:

  1. To increase learner responsibility for learning progress and growth of self-efficacy by making learning expectations explicit, along with an always-available data visualization of progress towards meeting these expectations.
  2. To support a range of types of continuous formative assessment: item-based assessment; rubric-based reviews (peer, self, teacher); online discussion contributions, etc.
  3. To support adaptive and personalized instruction by offering the possibility of re-taking quizzes, revising work, extending contributions to online discussions, etc., until mastery is attained.
  4. To improve on-time mastery with clear time objectives and “focus” credits, which indicate the degree of effort so far expended by the student compared to the degree of effort expected.
  5. To encourage collaborative and peer learning via “help” credits, reflecting what Bloom and colleagues originally framed as group discussion and peer tutoring.
  6. To support teachers by providing clear visualizations of each student’s activity and of class progress towards mastery, with the possibility of “drilling down” into all the constituent datapoints. Because the analysis is based on a very large number of datapoints for every student, teachers are able to see, at any moment during a unit of work, a detailed progress record for every learner, built on data that was previously, in a practical sense, largely invisible.
  7. To support differentiated instruction, whereby learners can work at their own pace towards mastery goals.
  8. To provide extensive supporting evidence for summative assessments made by the teacher in the gradebook. This reduces the grading burden for teachers by providing comprehensive learning progress data.

Scholar: Analytics in Practice

To illustrate how Scholar: Analytics works, consider the following scenario using a sample learning module, Energy Transformations, published in Scholar. This module aligns with the Next Generation Science Standard “PS3.B: Conservation of Energy and Energy Transfer” and the Common Core State Standard for Writing “W.8.2: Write informative/explanatory texts to examine a topic and convey ideas, concepts, and information through the selection, organization, and analysis of relevant content.” In this scenario, the task expectations in the unit have been clearly laid out by the teacher (a sketch of how such expectations might be encoded follows the list):

  1. Respond to posts: The teacher will make 6 posts introducing the basic science concepts, including text, videos, and diagrams, each of which prompts online and in-class discussion. Every student must respond online to the comment prompt.
  2. Make posts of your own: Students will make 2 posts of their own, illustrating energy transformations in their everyday lives. Every student will be expected to comment on at least three other students’ posts.
  3. Knowledge quizzes: Students will take 2 knowledge surveys when requested; they may re-take them at a later point to improve their score, until they achieve 80%.
  4. Science Report: Students choose an energy transformation topic, such as a bow and arrow, a roller coaster, a windmill, or a space ship. They draft their energy transformation project.
  5. Peer Feedback: Students peer review three other students’ energy transformation projects against a rubric based on the NGSS and CCSS standards, and also make annotations.
  6. Share Report: Students revise their energy transformation projects and write a self-review. When published, they read others’ reports and discuss.
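Internally, expectations like these amount to a set of weighted tasks. The following is a minimal sketch, in Python, of how such a unit specification might be encoded; every identifier, weight, and threshold here is our own illustrative assumption, not Scholar’s actual data model.

    # Hypothetical encoding of the Energy Transformations unit expectations.
    # Names, weights, and thresholds are illustrative assumptions only.
    UNIT = {
        "title": "Energy Transformations",
        "standards": ["NGSS PS3.B", "CCSS W.8.2"],
        "tasks": [
            {"name": "Respond to teacher posts", "required": 6, "weight": 0.10},
            {"name": "Make your own posts",      "required": 2, "weight": 0.10},
            {"name": "Comment on peers' posts",  "required": 3, "weight": 0.10},
            # Quizzes may be re-taken until the threshold score is reached.
            {"name": "Knowledge quizzes",        "required": 2, "weight": 0.15,
             "retake_until": 0.80},
            {"name": "Science report draft",     "required": 1, "weight": 0.25},
            {"name": "Peer reviews given",       "required": 3, "weight": 0.15},
            {"name": "Revision and self-review", "required": 1, "weight": 0.15},
        ],
    }

    # The weights determine the relative size of each petal in the
    # progress visualization, so they should sum to 1.
    assert abs(sum(t["weight"] for t in UNIT["tasks"]) - 1.0) < 1e-9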

As the students work, they will each see their progress towards mastery in an aster plot (Fig. 2):

Fig. 2: Scholar: Analytics learning progress visualization
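To give a concrete sense of the geometry of such a plot, here is a minimal sketch in Python with matplotlib, assuming hypothetical petal weights and completion fractions; Scholar’s actual visualization is rendered in the browser and will look rather more polished.

    import numpy as np
    import matplotlib.pyplot as plt

    # Illustrative petals: (weight, completion fraction in [0, 1]).
    # Values are assumptions for this sketch, not real Scholar data.
    petals = [(0.10, 1.00), (0.10, 0.50), (0.10, 0.67), (0.15, 0.70),
              (0.25, 0.40), (0.15, 0.33), (0.15, 0.00)]
    weights = np.array([w for w, _ in petals])
    done = np.array([d for _, d in petals])

    # Each petal's angular width is proportional to its weight;
    # its radial length is the completion fraction.
    widths = 2 * np.pi * weights / weights.sum()
    lefts = np.concatenate(([0.0], np.cumsum(widths)[:-1]))

    ax = plt.subplot(projection="polar")
    ax.bar(lefts, done, width=widths, align="edge", edgecolor="white")
    ax.set_ylim(0, 1)
    ax.set_xticks([])
    ax.set_yticks([])

    # Overall progress (theta, out of 100 learning credits) at the centre.
    theta = 100 * float((weights * done).sum())
    ax.text(0, 0, f"{theta:.0f}", ha="center", va="center", fontsize=20)
    plt.show()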

Our focal learner in this scenario, Tarryn, is able to view her learning visualization at all times during the unit of work. The central image she sees is an “aster plot,” where each petal of the metaphorical flower represents one kind of learner activity. At the start of the unit, the plot is a white circle; as Tarryn progresses towards mastery, the colored petals grow, drawing on data continuously mined by the software during her learning activities. In this example, she is part way through the Energy Transformations unit of work. This is how she interprets the visualization:

  • The width of a petal is the weighting given by the teacher to that aspect of the work; the length of a petal is the learner’s achievement in it to this point. (In this indicative image, we have not keyed all the petals.) In this case, Tarryn has responded to all 6 teacher posts but still has more work to do in other areas. For instance, she could re-take the knowledge surveys in order to achieve a higher score in that petal.
  • θ is progress towards mastery, where 100 learning credits represents mastery. (We use the positive concept of earning “credits” to get away from the frequently negative and judgmental notion of a “mark” or a “score.”) All students can increase their learning credits and achieve mastery by doing more work, for instance revising their projects, adding more comments to the class discussions, or re-taking the knowledge surveys. The teacher may set a target minimum mastery level, such as 75%. Tarryn has not yet achieved the target minimum mastery level because she is only part way through this unit of work. (A worked sketch of this credit arithmetic appears after this list.)
  • Each petal in the aster plot is active, so clicking on it brings Tarryn to more detailed information about the data used to generate the conclusion about learning represented by each petal. Tarryn is able to navigate all the way down to the work itself—an incorrectly answered quiz item, or a low score against a criterion in a rubric. This guides her to places where more work is required.
  • The visualization is divided into three major segments, each labelled with an imperative verb and a symbol representing that variable: φ Focus! … or Bloom’s concept of “perseverance,” measuring variables such as time on task and amount of work produced; ε Know! … which measures knowledge via data elements such as quiz or knowledge survey scores and peer review ratings against rubrics; and β Help! … which measures community contributions and collaboration, such as the extent and quality of comments on others’ posts and peer reviews.
  • This page also serves as a gradebook, where the teacher can enter a grade based on their overall judgment of the quality of student work, supported by Scholar: Analytics’ progress data and calculation of learning credits. In Tarryn’s case, her work is still incomplete, and the teacher has not yet entered a grade.
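To make the credit arithmetic concrete, here is a minimal sketch in Python under the simplest reading of the description above: each petal contributes its weight multiplied by its completion towards the 100 learning credits, and petals roll up into the three segments. The petal values, the segment assignments, the flat weighted-sum model, and names such as learning_credits and TARGET are all our own illustrative assumptions, not necessarily Scholar’s exact calculation.

    # Hypothetical petal records: (segment, weight, completion fraction).
    # Values are illustrative; Scholar's real credit model may differ.
    PETALS = [
        ("focus", 0.10, 1.00),  # responded to all 6 teacher posts
        ("focus", 0.10, 0.50),  # 1 of 2 own posts made
        ("help",  0.10, 0.67),  # 2 of 3 peer comments made
        ("know",  0.15, 0.70),  # best quiz score so far: 70%
        ("know",  0.25, 0.40),  # report draft partly complete
        ("help",  0.15, 0.33),  # 1 of 3 peer reviews given
        ("focus", 0.15, 0.00),  # revision and self-review not yet begun
    ]

    def learning_credits(petals):
        """Roll petals up into segment credits and total credits (theta)."""
        segments = {"focus": 0.0, "know": 0.0, "help": 0.0}
        for segment, weight, completion in petals:
            segments[segment] += 100 * weight * completion
        return sum(segments.values()), segments

    TARGET = 75  # teacher-set minimum mastery level, e.g. 75 credits

    theta, segments = learning_credits(PETALS)
    print(f"theta = {theta:.1f}/100 credits; segments: "
          + ", ".join(f"{k} {v:.1f}" for k, v in segments.items()))
    if theta < TARGET:
        print(f"{TARGET - theta:.1f} more credits needed to reach the target")

Run as written, this reports a θ of roughly 47 credits out of 100, so this hypothetical learner, like Tarryn, still has work to do to reach a 75-credit target.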

Before the unit of work commences, the teacher has specified: a) the timeframe for the task; b) task expectations; and c) the relative weighting of each task.

During the unit of work, the teacher is able to access each student’s progress visualization in the same view as Tarryn’s, including the capacity to dig deep into areas requiring additional attention by an individual learner. This makes visible deficiencies which might otherwise pass unnoticed by the teacher.

Making Mastery Learning Practicable

Twenty years after Bloom and colleagues’ proposal, Kulik et al. were able to conduct a meta-analysis of 108 controlled studies of mastery learning programs. Their conclusion was that “mastery learning programs have positive effects on the examination performance of students in colleges, high schools, and the upper grades of elementary schools. The effects appear to be stronger on the weaker students in the class … Mastery programs have positive effects on student attitudes towards course content but may increase student time on instructional tasks” (Kulik, Kulik and Bangert-Drowns 1990). These results were achieved before the widespread introduction of computers into educational settings.

Notwithstanding this research evidence, mastery learning approaches have not been widely adopted, largely because of the logistical complexity of what Bloom termed “optimal instruction.” Simply put, uniform instruction is easier to implement.

However, in the era of networked and “cloud” computing, where every student has a device, instruction oriented to mastery learning becomes as easy to implement as uniform instruction. This is our vision for Scholar: Analytics. It is our contention that the main possibility opened by computer-mediated teaching and learning is to make logistically feasible long-held educational aspirations that until now were unrealistically ambitious (Cope and Kalantzis 2017). Of course, there is a professional learning curve along which teachers need to change their pedagogical practices. However, with the introduction of a panoply of new tools in schools where computers are extensively used, many teachers have already travelled a long way along this learning curve.

[1] Assessing “Complex Epistemic Performance” in Online Learning Environments, National Science Foundation, Award 1629161, 2016-2018.

References

Behrens, John T. and Kristen E. DiCerbo. 2013. "Technological Implications for Assessment Ecosystems." Pp. 101-22 in The Gordon Commission on the Future of Assessment in Education: Technical Report, edited by E. W. Gordon. Princeton NJ: The Gordon Commission.

Block, James H., ed. 1971. Mastery Learning: Theory and Practice. New York: Holt Rinehart & Winston.

Bloom, Benjamin S. 1968. "Learning for Mastery." Evaluation Comment 1(2):1-12.

Cope, Bill and Mary Kalantzis. 2015. "Assessment and Pedagogy in the Era of Machine-Mediated Learning." Pp. 350-74 in Education as Social Construction: Contributions to Theory, Research, and Practice, edited by T. Dragonas, K. J. Gergen, S. McNamee and E. Tseliou. Chagrin Falls OH: Worldshare Books.

Cope, Bill and Mary Kalantzis. 2016. "Big Data Comes to School: Implications for Learning, Assessment and Research." AERA Open 2(2):1-19.

Cope, Bill and Mary Kalantzis. 2017. "Conceptualizing E-Learning." Pp. 1-45 in E-Learning Ecologies, edited by B. Cope and M. Kalantzis. New York: Routledge.

Kalantzis, Mary and Bill Cope. 2012. New Learning: Elements of a Science of Education. Cambridge UK: Cambridge University Press.

Kulik, Chen-Lin C., James A. Kulik and Robert L. Bangert-Drowns. 1990. "Effectiveness of Mastery Learning Programs: A Meta-Analysis." Review of Educational Research 60(2):265-99.

Siemens, George and Ryan S. J. d. Baker. 2013. "Learning Analytics and Educational Data Mining: Towards Communication and Collaboration." Pp. 252-54 in Second Conference on Learning Analytics and Knowledge (LAK 2012). Vancouver BC: ACM.
