Produced with Scholar

Assessment for Learning

Learning Module

Abstract

For several decades now, assessment has become an increasingly pressing educational priority. Teacher and school accountability systems have come to be based on the analysis of large-scale, standardized summative assessments. As a consequence, assessment now dominates most conversations about reform, particularly as a measure of teacher and school accountability for learner performance. Behind the often heated and at times ideologically gridlocked debate is a genuine challenge: to address gaps in achievement between different demographically identifiable groups of students. There is an urgent need to lift whole communities and cohorts of students out of cycles of underachievement. For better or for worse, testing and public reporting of achievement are seen to be among the few tools capable of clearly informing public policy makers and communities alike about how their resources are being used to expand learners' life opportunities. This learning module offers an overview of current debates about testing, and analyses the strengths and weaknesses of a variety of approaches to assessment. The module also focuses on the use of assessment technologies in learning. It explores recent advances in computer adaptive and diagnostic testing, the use of natural language processing technologies in assessments, and embedded formative assessments in digital and online curricula. Other topics include the use of data mining and learning analytics in learning management systems and educational technology platforms. The module also considers issues of data access and privacy, and the challenges raised by 'big data', including data persistency and student profiling. A final section addresses the processes of educational evaluation. Video presenters include Mary Kalantzis, Bill Cope, Luc Paquette and Jennifer Greene.

1. Intelligence Tests: The First Modern Assessments (Admin Update 1)

For the Participant

[Embedded videos]

Intelligence versus knowledge testing: what are the differences between these assessment paradigms? A good place to begin exploring this distinction is the history of intelligence testing, the first modern form of testing:

And if you would like to read more deeply into a contemporary version of this debate, contrast Gottfredson and Phelps with Shenk in the attached extracts.

Shenk, The Genius in All of Us
Gottfredson and Phelps, Correcting Fallacies

Comment: What are the differences between testing intelligence and testing for knowledge? When might each approach be appropriate or inappropriate?

Make an Update: Find an example of an intelligence test and explain how it works. Or find peer reviewed article(s) or book(s) about intelligence testing. Analyze its strengths and weaknesses as a form of assessment.

For the Instructor

2. Select and Supply Response Assessments (Admin Update 2)

For the Participant

[Embedded videos]

Item-based, standardized tests have epistemological and social bases.

Their epistemological basis is an assumption that there can be right and wrong answers to the things that matter in a discipline (facts, definitions, numerical answers to problems), and from the sum of these answers we can infer deeper understanding of a topic or discipline. (You must have understood something if you got the right answer?) Right answers are juxtaposed beside 'distractors'—plausible, nearly right answers or mistakes it would be easy to make. The testing game is to sift the right from the (deceptively) wrong.

The social basis of item-based tests is the idea of standardization: tests administered to everyone in the same way, so that results can be compared in terms of relative success or failure.

Psychometrics is a statistical measurement process that supports generalizations from what is, at root, survey data. (An item-based test is, essentially, a kind of psychological survey whose purpose is to measure knowledge and understanding.)
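To make the "test as survey data" point concrete, here is a minimal sketch of classical item analysis: from a table of right/wrong responses, it computes each item's difficulty (proportion correct) and discrimination (how well the item separates stronger from weaker test-takers). The data and function names are hypothetical illustrations, not any particular testing program's method.

```python
# A minimal sketch of classical item analysis, treating an item-based
# test as survey data: each row is one test-taker's 1/0 (right/wrong)
# responses. All names and data here are hypothetical.

def item_statistics(responses):
    """Return (difficulty, discrimination) for each item.

    difficulty: proportion of test-takers answering the item correctly.
    discrimination: point-biserial correlation between the item score
    and the rest-of-test score (total minus the item itself).
    """
    n_items = len(responses[0])
    totals = [sum(row) for row in responses]
    stats = []
    for i in range(n_items):
        item = [row[i] for row in responses]
        rest = [t - x for t, x in zip(totals, item)]
        p = sum(item) / len(item)                 # difficulty index
        stats.append((p, _pearson(item, rest)))   # discrimination
    return stats

def _pearson(xs, ys):
    """Pearson correlation, computed from scratch (no libraries)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    vx = sum((x - mx) ** 2 for x in xs)
    vy = sum((y - my) ** 2 for y in ys)
    return cov / (vx * vy) ** 0.5 if vx and vy else 0.0

# Hypothetical responses: 4 test-takers x 3 items (1 = correct).
data = [[1, 1, 0],
        [1, 0, 0],
        [1, 1, 1],
        [0, 0, 0]]
for i, (p, r) in enumerate(item_statistics(data), start=1):
    print(f"item {i}: difficulty={p:.2f}, discrimination={r:.2f}")
```

The generalizations psychometricians draw about "understanding" rest on aggregate statistics like these, which is precisely why item wording and distractor design matter so much.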

Today, some standardized tests, such as PISA and TIMSS, aim to evaluate higher-order disciplinary skills.

Comment: When are standardized tests at their best? And/or worst?

Make an Update: "Parse" a standardized test. Describe the implementation of a standardized test in practice. Or find peer reviewed article(s) or book(s) about standardized testing. What are the strengths and weaknesses of standardized testing?

For the Instructor

3. Standards-Based and Alternative Practices of Assessment (Admin Update 3)

For the Participant

[Embedded videos]

Standards-based assessment allows for the possibility that everyone at a certain level of education, or in the same class, can succeed.

Criterion-referenced, norm-referenced and self-referenced assessments have fundamentally different logics and social purposes. In the following image from Chapter 10 of our New Learning book, we attempt to characterize the different logics. But what are the different social assumptions?
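The three logics can be sketched as three different ways of judging the very same raw score. In this hypothetical example (the pass mark, cohort scores and prior score are all invented for illustration), only the norm-referenced logic forces some learners to rank below others:

```python
# A minimal sketch contrasting the three referencing logics on the
# same raw score. Pass mark, cohort, and prior score are hypothetical.

def criterion_referenced(score, pass_mark=70):
    """Judge against a fixed standard: in principle, everyone can pass."""
    return "meets standard" if score >= pass_mark else "below standard"

def norm_referenced(score, cohort_scores):
    """Judge against the cohort: a percentile rank, so by construction
    some learners must always rank below others."""
    below = sum(1 for s in cohort_scores if s < score)
    return 100 * below / len(cohort_scores)

def self_referenced(score, previous_score):
    """Judge against the learner's own past performance: growth."""
    return score - previous_score

cohort = [55, 62, 70, 74, 81, 90]
print(criterion_referenced(74))      # meets standard
print(norm_referenced(74, cohort))   # 50.0: half this cohort scored lower
print(self_referenced(74, 66))       # 8 points of growth over a prior score
```

Notice that the same score of 74 is a success by the criterion-referenced logic, merely average by the norm-referenced logic, and evidence of improvement by the self-referenced logic; the social assumptions lie in which of these judgments a community chooses to report.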

Comment: What are the social assumptions of each kind of assessment? What are the consequences for learners? For better and/or for worse, in each case?

Make an Update: Find an example of a standards-based assessment, or an alternative form of assessment. Describe and analyze it. Or find peer reviewed article(s) or book(s) about standards-based or alternative assessments. Analyze their strengths and weaknesses.

For the Instructor

4. New Opportunities for Assessment in the Digital Age (Admin Update 4)

For the Participant

[Embedded videos]

The following paper explores the impact of new technologies on assessment:

  • Cope, Bill and Mary Kalantzis. 2015. "Assessment and Pedagogy in the Era of Machine-Mediated Learning." Pp. 350-74 in Education as Social Construction: Contributions to Theory, Research, and Practice, edited by T. Dragonas, K. J. Gergen, S. McNamee and E. Tseliou. Chagrin Falls OH: Worldshare Books.
Cope & Kalantzis, Learning and Assessment 2015 (PDF)

Comment: What are the potentials for new forms of assessment in the digital age? What are the dangers?

Make an Update: Find an example of an innovative, computer-mediated assessment. Describe and analyze it. Or find peer reviewed article(s) or book(s) about innovative, computer-mediated assessment. Discuss the challenges and potentials.

For the Instructor

5. Learning Analytics: A Case Study of CGScholar (Admin Update 5)

For the Participant

[Embedded videos]

Comment: What are the potentials and the challenges in creating and implementing environments with embedded learning analytics?

Make an Update: Find a learning and assessment environment which offers learning analytics. How does it work? What are its effects? Or find peer reviewed article(s) or book(s) about learning analytics. Discuss the challenges and potentials.

For the Instructor

6. Educational Data Mining - Luc Paquette (Admin Update 6)

For the Participant

[Embedded videos]


Comment: What are the possibilities and challenges of educational data mining?

Make an Update: Find a piece of research that uses educational data mining as a source of evidence. What kinds of things can educational data mining tell us, or not tell us? Or find peer reviewed article(s) or book(s) about educational data mining. Discuss the challenges and potentials.

For the Instructor

7. Educational Evaluation - Jennifer Greene (Admin Update 7)

For the Participant

[Embedded videos]

Comment: Why do we need to evaluate what we do in education? How do we do it most effectively?

Make an Update: Find an educational evaluation. Analyze its strengths and weaknesses. Or, propose in outline form an educational evaluation that you would like to undertake. Or find peer reviewed article(s) or book(s) about educational evaluation. Discuss the logistics of evaluation, its potential challenges and benefits.

For the Instructor