
Practice Analysis of a New Learning ‘Ecology’

Project Overview

Project Description

Parse a student learning experience in a computer-mediated learning environment. What are the elements and patterns of this practice in terms of teacher-student interactions, student-resource interactions, student-student interactions, and the nature of student assessment? How are these different from, and perhaps also similar to, traditional classroom interactions? This work could consist of a reflection on practice you have already used, or analyze a new or unfamiliar practice the dimensions of which you would like to explore. Consider and cite the theoretical models of learning ecologies developed by you and your colleagues in Work 1.


STEP Intermediate Project: A New Approach to Literacy Assessment for Students in Grades 4-6

Work 2: Draft 2

The Challenge: Evaluating Peer Discussion and Peer-to-Peer Learning

Although effective peer discussion has traditionally been a standard area of evaluation for individual students' language arts grades (for example, a student's effectiveness in peer discussion may factor into a "class participation" score), there is rarely a systematic approach to evaluating peer discussion in the classroom. Teachers may take a quantitative approach (e.g., a minimum number of questions or comments contributed during an in-class discussion), but more important than setting any minimum amount of contribution is determining the quality of students' contributions: for instance, to what extent is the student able to think on a deep level about a given text and connect themes across different genres or content areas? Moreover, when we factor in the broad spectrum of individual student dispositions, students who are more introverted, who by some accounts may comprise up to 40% of a classroom population (Cain, 2012), may be able to contribute more successfully during small-group discussion or through written contributions than during a large-group discussion. Too often introverted students are overlooked, or teachers try to "coax them out of their shells," instead of adapting their teaching methods to cultivate the talents of each individual student. Letting students contribute to peer discussion in a variety of formats gives students, especially the more introverted ones, more time and autonomy to reflect on a subject before sharing a response.

Additionally, in this 21st-century age of multiliteracies, it is increasingly important to recognize the changing nature of the way students actually learn, in that 1) there is a growing digitization of text sources, 2) the producers of such texts are much more globally diverse and may include the students' own contemporaries, and 3) students can begin to see from an early age that they have a voice that can inform and influence others. This is a hallmark of the new "Generation P" students (for "participatory"), who learn "more in semi-formal and informal settings and from a variety of sources [...] they are the ones who actively participate, who solve problems, who innovate, who take calculated risks and who are creative" (Kalantzis & Cope, 2012, p. 10).

Moreover, the new Common Core State Standards emphasize the importance of students’ ability to be more actively engaged and creative in navigating the multimedia environment as expressed in CCRA.R7: “Integration of Knowledge and Ideas: Integrate and evaluate content presented in diverse media and formats, including visually and quantitatively, as well as in words” (National Governors Association Center for Best Practices, 2010).

As such, the question arises: how do we effectively evaluate student peer discussion and peer-to-peer learning in this new computer-mediated learning environment?

Background on the STEP Intermediate Project

The original STEP ("Strategic Teaching and Evaluation of Progress") Literacy Assessment was developed over twenty years ago at the University of Chicago as a formative, progress-monitoring assessment program with ongoing professional development training for teachers in grades PreK-3. It provides schools the opportunity to analyze student data and adjust instructional practices accordingly throughout the academic year. Currently, the STEP Literacy Assessment program is being implemented in schools in over 21 states and 39 cities. The overarching goal of the STEP Intermediate project is to expand the scope of the original STEP beyond the third grade by developing an objective, evidence-based measure of fluent readers' literacy skills for students in grades 4-6. This assessment will link to the PreK-3 STEP assessment and be appropriate for measuring growth in students who have exceeded STEP 12 (end of third grade). The measure will take into account contemporary views of reading by focusing on the ways that students do and do not construct meaning from non-fiction texts, and it will be aligned to the Common Core State Standards.

Here, we will focus on the project's approach toward evaluating peer-to-peer online learning and discussion through the construction of class blogs. For example, in a pre-pilot study this summer, we asked students to read an informational text on the Earth's rotation. As we only had access to individual students, and the actual online platform with the class blog had not yet been built out, the pre-pilot was limited in scope. The goal was to begin to try out question types to see how students might approach peer-to-peer education. One question we asked was: "If a younger student had difficulty reading this text, how would you help that student better understand it?" We used the example of a "younger student" as opposed to a "classmate" to get at the strategies a student might use to help a struggling reader, which is easier to conceptualize with someone younger than with a classroom peer. Also, asking students how they might help another student gives us a window into strategies they might use themselves when presented with challenging texts. In this example, one interesting finding was that students tended to focus on vocabulary and pictorial clues as reading strategies; e.g., one student explained: "well, I would tell them to read the glossary because they might not understand the words and I would use what I know to give them clues," and another replied: "by helping him reread and look more at the pictures." In that respect, it's a little easier to imagine what a student-led peer-to-peer tutoring or discussion session might look like.

As this project is still in the early stages of development, we welcome any recommendations from current practitioners and researchers in the field of education on how we might improve the current assessment design plan.

Overall Structure of the Assessment

The core readings for the assessment will consist of a total of 36 informational texts presented in two formats: 18 printed booklets (anchor texts, ranging from 800-1000 words) and 18 web-based passages (secondary texts, ranging from 400-700 words). Students will be provided with two parallel forms, A and B, so that if they do not pass a certain STEP Level during an assessment window, they can be retested on an alternate form. The table below summarizes the text levels to be included, which increase in overall length and complexity as students move up the grade and STEP levels.

STEP Intermediate Text Components for Evaluating Reading Comprehension

STEP Level | Grade | ATOS Levels | Anchor Text (Booklet) Forms | Anchor Text Length | Secondary Text (Web-based) Forms | Secondary Text Length
13 | Fourth | 4.0-4.4 | A & B | 800-1000 words | A & B | 400-500 words
14 | Fourth | 4.5-4.9 | A & B | 800-1000 words | A & B | 400-500 words
15 | Fourth | 4.5-5.4 | A & B | 800-1000 words | A & B | 400-500 words
16 | Fifth | 5.0-5.4 | A & B | 800-1000 words | A & B | 500-600 words
17 | Fifth | 5.0-5.9 | A & B | 800-1000 words | A & B | 500-600 words
18 | Fifth | 5.5-5.9 | A & B | 800-1000 words | A & B | 500-600 words
19 | Sixth | 6.0-6.4 | A & B | 800-1000 words | A & B | 600-700 words
20 | Sixth | 6.5-6.9 | A & B | 800-1000 words | A & B | 600-700 words
21 | Sixth | 6.9-7.4 | A & B | 800-1000 words | A & B | 600-700 words

The reason for focusing on informational rather than literary texts is the increasing need for students in the intermediate grades to comprehend the structure and content of informational texts across all content areas. Ogle explains: "Most elementary literacy instruction has been focused on the use of narrative texts and infrequently have students received coherent and ongoing instruction in how to read and learn with informational texts. As a result, many students have problems comprehending the range of textbooks, trade books, web-based articles, chapters and newspapers that they need to read to learn in their school subjects" (Ogle, personal communication, September 24, 2013). We chose ATOS levels as a reading complexity guide, since the ATOS online analyzer is recognized by the Common Core State Standards as a reliable text complexity tool. With ATOS, the number before the decimal point corresponds to a grade level, while the number after it roughly corresponds to months into the school year, so that at STEP 13, the text would be at a complexity level ranging from approximately the beginning of fourth grade (4.0) to four months into the fourth-grade year (4.4).
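To make this mapping concrete, the sketch below (in Python, purely illustrative; the assessment itself does not involve any code, and all names are hypothetical) shows how the STEP-to-ATOS bands and text lengths from the table above could be represented and queried:

```python
# Hypothetical representation of the STEP Intermediate text specifications.
# Values are taken from the table above; the names and structure are illustrative only.

STEP_TEXT_SPECS = {
    # step_level: (grade, atos_range, anchor_word_range, secondary_word_range)
    13: ("Fourth", (4.0, 4.4), (800, 1000), (400, 500)),
    14: ("Fourth", (4.5, 4.9), (800, 1000), (400, 500)),
    15: ("Fourth", (4.5, 5.4), (800, 1000), (400, 500)),
    16: ("Fifth",  (5.0, 5.4), (800, 1000), (500, 600)),
    17: ("Fifth",  (5.0, 5.9), (800, 1000), (500, 600)),
    18: ("Fifth",  (5.5, 5.9), (800, 1000), (500, 600)),
    19: ("Sixth",  (6.0, 6.4), (800, 1000), (600, 700)),
    20: ("Sixth",  (6.5, 6.9), (800, 1000), (600, 700)),
    21: ("Sixth",  (6.9, 7.4), (800, 1000), (600, 700)),
}

def describe_atos(score: float) -> str:
    """Interpret an ATOS score as a grade level plus approximate months into the school year."""
    grade = int(score)
    months = round((score - grade) * 10)
    return f"grade {grade}, about {months} month(s) into the school year"

def text_fits_level(step_level: int, atos_score: float) -> bool:
    """Check whether a candidate text's ATOS score falls in the band for a given STEP level."""
    low, high = STEP_TEXT_SPECS[step_level][1]
    return low <= atos_score <= high

print(describe_atos(4.4))        # grade 4, about 4 month(s) into the school year
print(text_fits_level(13, 4.2))  # True
```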

The texts in this assessment will incorporate various visual elements, such as archival photographs, diagrams, maps, graphs, and timelines, to complement and enhance reading comprehension. (Depending on funding, we would also like to incorporate video, audio, and animations into the web-based text passages.) Students will first read an anchor text on a given topic in a printed, full-color booklet (e.g., "The Human Endocrine System") and then answer a short series of questions in a one-to-one interview with the assessor (their classroom teacher), who provides standardized prompts when needed, such as: "What in the text makes you think that?" "Tell me more," and "Why is that important?" This allows the assessor to systematically and dynamically probe student responses to further analyze reading comprehension. The student will then proceed to read a related web-based text on their own (e.g., "The Effect of Junk Food and Supersizing on the Pancreas").

After answering a series of comprehension questions, the student will be asked to write a short essay online (to be evaluated by the teacher) that compares and contrasts the information gleaned from the two sources. Finally, the student will be asked to select a short portion of the essay to post to the class blog, which will be viewable by all students at that same STEP Level (potentially across various schools). After reading other posts by students who have read these same two core texts, the student will be asked to select and respond to 2-3 postings from their peers and then write a reflection in the class blog on how reading these postings has presented a new way of thinking about the topic. The graphic organizer below, adapted from the work of Zawilinski (2009), is one we are considering using to help scaffold students' thinking and their synthesis of peer comments on the blog.

Synthesis Scaffolding: Thinking Across Text for Deeper Understanding

My Thoughts (student's original posting) | Comments from __________ (copy and paste from a peer's posting) | Comments from __________ (copy and paste from a peer's posting) | Comments from __________ (copy and paste from a peer's posting)
Reflections, Part I: How are my peers' thoughts the same as or different from mine? (Highlight in yellow the ideas that are the same; highlight in green those that are different.)
Reflections, Part II: What questions do I have after reading these posts?
Synthesis: Combine writing from Reflections Parts I and II into a new blog posting.

Students would then be evaluated on a rubric with elements such as whether they were able to identify the main ideas of the two core texts, successfully paraphrase those ideas, provoke further discussion on the topic by posing questions to peers on the blog, respond to questions and comments posed to their own blog posts, and, overall, synthesize the ideas and perspectives of the authors of the core texts and of the blog postings.
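As an illustration only, since the rubric itself is still being designed, the criteria above could be recorded in a simple scoring structure along these lines (criterion names follow the draft description; the 0-4 scale and function names are hypothetical):

```python
# Hypothetical sketch of the blog-discussion rubric described above.
# Criterion names follow the draft description; the score scale is illustrative.

RUBRIC_CRITERIA = [
    "identifies main ideas of both core texts",
    "paraphrases those ideas accurately",
    "poses questions that provoke further peer discussion",
    "responds to questions/comments on own blog posts",
    "synthesizes ideas from the core texts and peer postings",
]

def score_student(ratings: dict[str, int]) -> float:
    """Average the 0-4 ratings across all rubric criteria."""
    missing = [c for c in RUBRIC_CRITERIA if c not in ratings]
    if missing:
        raise ValueError(f"Missing ratings for: {missing}")
    return sum(ratings[c] for c in RUBRIC_CRITERIA) / len(RUBRIC_CRITERIA)

example = {c: 3 for c in RUBRIC_CRITERIA}
example["synthesizes ideas from the core texts and peer postings"] = 4
print(score_student(example))  # 3.2
```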

Various researchers have concluded that when students write in a digital format for peers (such as blogs, wikis, scripted iMovie projects, etc.), they tend to be much more engaged than when writing in a paper-and-pencil format for a teacher audience (Mills & Levido, 2011; Jimoyiannis & Angelaina, 2012; Glewa & Bogan, 2007; Chen et al., 2011). Writing for peers provides a more authentic audience for the writer and helps them over time to develop a stronger, clearer voice as they shape their writing based on continuous feedback in a dynamic published format. This contrasts sharply with the traditional approach to writing for a larger audience (e.g., publication of articles or books), in which the author produces a static work that receives responses from the public and critics only after the hard-copy version has been published. Revision of such a text is more time-consuming (since it takes longer to physically reach the target audience) and more costly (due to printing expenses) than revision in a format such as a blog, in which the original writer and the audience have a dynamic relationship and can explore different ideas and perspectives, potentially in real time.

However, we have not yet determined whether students' comments on the blogs should be anonymous (e.g., students would post under computer-generated IDs) or whether their actual names should be associated with them. For example, Chen et al. (2011) found in their study of fifth-grade students that the students liked having a certain amount of anonymity on the blogs, feeling they could be more honest in their comments; yet using students' real names could help create a greater sense of community within the classroom and perhaps a better appreciation of the perspectives of individual students.

Reading and Writing Comprehension Online: Similarities and Differences

According to the RAND Reading Study Group: "Electronic text can present particular challenges to comprehension, such as dealing with the non-linear nature of hypertext, but it also offers the potential for supporting the comprehension of complex texts, for example through hyperlinks to definitions or translations of difficult words or to paraphrasing of complex sentences" (RAND Reading Study Group, 2002). For this reason, it is important that teachers incorporate such electronic texts into their regular curriculum so that students with limited access to the Internet and electronic texts at home can begin to develop the skills needed to navigate and comprehend them. We felt that if such texts were presented alongside more traditional printed texts in a formative, quarterly assessment such as STEP, this would encourage teachers to help students understand and evaluate informational texts in both traditional and electronic formats throughout the academic year, not just as part of the "gearing up" process for a standardized state summative assessment administered at the end of the school year.

Leu (2005) points out that the Internet is specifically a reading and literacy issue, not a technology issue; with each new age, there is a new technology (e.g., the invention of cuneiform, the printing press, the pencil) that influences literacy learning. In the video clip below, Leu illustrates how reading comprehension online differs from reading comprehension of a traditionally printed text: when reading online, it is often more important to know what NOT to attend to than what to attend to; the student learns to focus on key pieces of information and disregard anything extraneous. This represents a new approach to reading fluency: instead of the traditional focus on accurately capturing every word in a text, given the enormous volume of text sources available online, a reader must learn to quickly skim texts for specific pieces of information rather than get bogged down in details that may not pertain to their area of interest or research inquiry.

In the example below, Leu uses Camtasia software to take a screen recording of a student skimming web passages and justifying, via chat with an individual researcher, why they answered the questions in a certain way. In the case of peer-to-peer learning and discussion with the STEP assessment blog, we would like to take similar screen recordings of how a student reads through other students' postings, toggles back to the original stimulus text(s), reviews their own work, and then posts their ideas to the blog, as well as how much time they spend scrolling/reading each component and writing/revising their own posting(s). In this manner, we hope to gain more insight into students' reading and writing patterns when blogging during the pilot study, so that we may make adjustments for the field test and final product.
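As a purely illustrative sketch (we have not yet chosen recording or logging tools, and every event name and function below is a hypothetical stand-in), timing information extracted from such recordings or from the blog platform's own interaction logs could be summarized along these lines:

```python
# Hypothetical sketch: summarizing how long a student spends on each activity
# during a blogging session, from a list of timestamped interaction events.
# Event names and the data source are assumptions for illustration only.

from collections import defaultdict

def time_per_activity(events: list[tuple[float, str]]) -> dict[str, float]:
    """events: (timestamp_in_seconds, activity) pairs in chronological order.
    Each activity is assumed to last until the next event begins."""
    totals: dict[str, float] = defaultdict(float)
    for (start, activity), (end, _) in zip(events, events[1:]):
        totals[activity] += end - start
    return dict(totals)

session = [
    (0.0,   "read_peer_postings"),
    (95.0,  "reread_stimulus_text"),
    (160.0, "read_peer_postings"),
    (210.0, "write_own_posting"),
    (395.0, "session_end"),
]
print(time_per_activity(session))
# {'read_peer_postings': 145.0, 'reread_stimulus_text': 65.0, 'write_own_posting': 185.0}
```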

Don Leu: Clip from NRC Presidential Address, 2005

Reflections/Conclusions

While prominent organizations such as the RAND Reading Study Group, the Common Core State Standards initiative, ISTE, the IRA, and the NRC recognize and emphasize that the teaching of literacy should include reading and writing in digital formats from an early age, they also recognize that there is much debate and limited research on what the actual instruction and evaluation of reading literacy should look like. The RAND Reading Study Group reflects: "Using computers and accessing the Internet make demands on individuals' literacy skills; in some cases, this new technology requires readers to have novel literacy skills, and little is known about how to analyze or teach those skills" (p. 4). Leu responds to this by stating that it is up to literacy scholars to lead the charge in defining the new literacies required of this new generation; otherwise, others outside the field, such as policymakers, will do it for us (Leu, 2005).

One of the challenges with piloting a classroom blog in this assessment is that initially there will be no peer postings for students to respond to or reflect upon, since it will be a brand new platform. To address this, we are considering seeding the platform with sample anonymous postings at each STEP level, so that when the first "real" student to take the assessment views the blog, there will already be something to respond to and, to a certain extent, to use as a model for their own postings.
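A minimal sketch of what that seeding step might look like, assuming a generic blog platform with some post-creation call (no platform has been selected, so `create_post`, the sample text, and all other names here are hypothetical):

```python
# Hypothetical sketch of seeding the class blog with sample postings per STEP level.
# `create_post` stands in for whatever API the chosen blog platform eventually provides.

SAMPLE_POSTINGS = {
    13: ["The author's main idea was that Earth's rotation causes day and night. ..."],
    14: ["Both texts explain energy, but the web article adds a diagram that ..."],
    # ... one or more model postings for each STEP level 13-21
}

def create_post(step_level: int, author: str, body: str) -> None:
    """Placeholder for the blog platform's post-creation call."""
    print(f"[STEP {step_level}] {author}: {body[:60]}...")

def seed_blog(postings: dict[int, list[str]]) -> None:
    """Publish anonymous model postings so the first real student has something to respond to."""
    for step_level, bodies in postings.items():
        for i, body in enumerate(bodies, start=1):
            create_post(step_level, author=f"Sample Reader {i}", body=body)

seed_blog(SAMPLE_POSTINGS)
```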

Another challenge is simply getting teachers to buy in to the value of using blogs as a teaching and assessment tool. Throughout the pilot, we plan to provide teachers with professional development training to ensure that they understand best practices around using technology, and especially around blogging.

Similarly, there will likely be students who are not as comfortable using computers or tablets for reading and writing. To that end, the assessment retains the anchor text component, a hard-copy printed booklet paired with an oral reading comprehension interview with the teacher/assessor. Additionally, the professional development for teachers will include strategies for incorporating online reading and writing into the regular classroom routine, so that by the time a student is assessed using STEP, it will not be the first time s/he has been asked to perform such online reading and writing tasks.

In creating an assessment that incorporates elements of both online reading (through the web-based passages) and online writing (through the essay and class blog), with a strong emphasis on an individual human reader audience (in some cases the teacher, as with the online essay; in other cases fellow students, as with the class blog), we hope to encourage teachers to create a community of learning in which students can learn from multiple sources, including their peers, as well as through deeper reflection on their own work. Moreover, with the screen recordings of actual students reading and writing online, I think that we will be able to contribute empirical data and useful insight to the field of education about how students actually read and write in these new technology environments.


References

Cain, S. (2012). Quiet: The power of introverts in a world that can't stop talking. New York: Crown Publishers.

Chen, Y., Liu, E., Shih, R., Wu, C., & Yuan, S. (2011). Use of peer feedback to enhance elementary students' writing through blogging. British Journal of Educational Technology, 42(1).

Glewa, M., & Bogan, M. B. (2007). Improving children's literacy while promoting digital fluency through the use of blogs in the classroom: Surviving the hurricane. Journal of Literacy & Technology, 8(1), 40.

Jimoyiannis, A. A., & Angelaina, S. S. (2012). Towards an analysis framework for investigating students' engagement and learning in educational blogs. Journal of Computer Assisted Learning, 28(3), 222-234.

Kalantzis, M., & Cope, B. (2012). New learning: Elements of a science of education (2nd ed.). Cambridge: Cambridge University Press.

Leu, D. (2005). New literacies, reading research, and the challenges of change: A deictic perspective of our research worlds. NRC Presidential Address, keynote video. Available at http://www.newliteracies.uconn.edu/nrc/don_leu_2005.html

Mills, K., & Levido, A. (2011). iPed: Pedagogy for digital text production. Reading Teacher, 65(1), 80-91.

National Governors Association Center for Best Practices & Council of Chief State School Officers. (2010). Common Core State Standards. Washington, DC: Authors.

RAND Reading Study Group. (2002). Reading for understanding: Toward an R&D program in reading comprehension (C. Snow, Chair). Santa Monica, CA: RAND.

Zawilinski, L. (2009). HOT blogging: A framework for blogging to promote higher order thinking. Reading Teacher, 62(8), 650-661.