
Work 2A: Case Study (Educational Practice Analysis)

Project Overview

Project Description

Write a case study of an innovative learning practice—a method, a resource or a technology, for instance. This could be a practice you have already used, or a new or unfamiliar practice which you would like to explore. Analyze an educational practice, or an ensemble of practices, as applied in a clearly specified learning context. Use theory concepts introduced in this course. We encourage you to use theory concepts defined by members of the group in their published Work 1, with references and links to the published works of the other course participants.

Word limit: at least 2000 words

Media: Include images, diagrams, infographics, tables, embedded videos (either uploaded into CGScholar, or embedded from other sites), web links, PDFs, datasets or other digital media. Be sure to caption media sources and connect them explicitly with the text, with an introduction before and discussion afterwards.

References: Include a References “element” or section with at least five scholarly articles or books that you have used and referred to in the text, and all the added media, plus any other necessary or relevant references, including websites.

Rubric: The educational practice rubric is the same as for Work 1, against which others will review your work, and against which you will do your self-review at the completion of your final draft.

Go to Creator => Feedback => Reviews => Rubric to see this rubric. It explores four main knowledge processes, the background and rationale for which are described in the papers at this page.


Peer Feedback through Google Docs

Educators in today’s ever-evolving world are presented with a host of new technologies to improve their practice. Sifting through what has an actual impact on student learning, and what doesn’t, is very difficult when the number of choices seems endless. In the ten years I have been working as a teacher, I have seen the number of tools available multiply exponentially. Some of these have completely changed my instructional practice and lesson planning; others have been failed experiments that I gave up on, whether because the technology did not produce results or because I did not provide, or have, the right setting to implement the tools correctly.

One technology that has become a mainstay in both my professional and personal life is the cloud-based Google Suite, which includes Google Docs and Slides. Students are generally comfortable working with its collection of programs which seamlessly integrate other technology and apps.

At the school where I have been working for the past five years, all student work is assigned, collected and assessed via cloud-based Google products. Although much of our creative work now exists only in the cloud, the consistency and reliability of the Google platform means that we expect few problems and get the results we want. Students all have at least one device (a laptop), and usually a second or third device, such as a phone or an iPad used for reading. The student body expects to receive assignments through their class “drops”, cloud-based folders that teachers can access in order to assign tasks. Given this tech-heavy environment, it is not difficult to assign students revision or feedback tasks that require cloud-based apps; they have come to expect them as part of their learning process. Teachers often give feedback directly on Google Docs through the comments feature, and students are now also being asked to use this feature to provide each other with feedback.

As a humanities teacher, I have included peer review as part of my students’ writing process for some time. In some cases, this peer review has been done by hand and face-to-face, with feedback sheets that students hand back to their peers. More recently, I have begun to ask students to use the comment function to peer review their writing through guided workshops. The idea is that by using Google Docs I am able to track how much peer feedback is happening, and whether it actually makes a difference in what students turn in for teacher feedback. Additionally, students can give each other feedback both inside and outside the classroom, and have a record of what the feedback is, so they can go back and make corrections whenever it is convenient for them. The purpose of this study is to review the effectiveness of my current Google peer review feedback practices in a systematic manner. To do this, I selected the group I teach with the fewest variables: a ninth-grade English class with twenty students.

 

Theory and Concepts

The advent of cloud computing means that education and student interaction are no longer bound to the classroom. Cloud computing uses remote servers to pool resources so that users can access their data and information from any wi-fi enabled device at any time and from any place (Wang, 2017). This availability of tools means that students can collaborate asynchronously and give each other feedback directly on their work, whether in or outside the classroom. Specifically, cloud computing tools such as Google Docs, which our school uses, “provide students the opportunity to edit a document at the same time or asynchronously” (Al-Samarraie & Saeed, 2018, p. 85). Working on a document together, or sharing it for feedback, gives students greater opportunity for rich feedback. From a practical standpoint, peer feedback can be tremendously helpful for teachers because it can reduce teacher workload: if students can peer edit and self-correct less challenging elements, teacher feedback and conference time is left for more challenging tasks (Neumann & Kopcha, 2019).

Additionally, the process of giving each other feedback is important because it is a form of collaborative learning. Collaborative learning involves using active learning and student-reliance to create a community of learners and teachers; it shifts learning from passive reception to active creation, which can be extremely rewarding for students (Suwantarathip & Wichadee, 2014). This process of collaborative learning is supported by Vygotsky’s theories of socially constructed learning and the Zone of Proximal Development (Brodahl, Hadjerrouit & Hansen, 2011). Students can support each other by making meaning together, and push their thinking by peer reviewing with peers who may have more knowledge than they do about certain elements of their text.

Case Study Overview and Findings

As stated above, students at the school where I work are used to completing all their work on the Google Docs platform and are routinely required to use the comments feature and work asynchronously in many of their classes. This means that even with ninth graders, as is the case with the current group I am working with, I did not have to introduce any new tech gear or apps. The focus was on using the comments feature as a conduit to improving their writing. The process took place partially in the classroom and partially at home and was guided via a revision workshop. Students were first asked to write a paragraph response to a story they had read. The response was started in class but finished at home. Each response was written in a Google Doc placed in their class drop folder by me. Students had received explicit large-group instruction in literary analysis and cohesive writing. Once they completed their response, they were asked to sit with a randomly assigned partner and complete a revision workshop on each other’s assignments during class, after which they needed to do revisions at home. The students were not told that there was a minimum requirement for feedback, nor were they told that the feedback was assessed. Students considered this an in-class activity, part of the formative process, and thus would not have expected any impact on their overall grade from participating in the process more or less extensively. Below are a series of images and a screencast video which give an overview of the guided workshop given to the students.

Introduction and Outline of Workshop
Checking Analysis and Evidence
Verifying Cohesion with Example
 
Media embedded October 4, 2019

Once each student completed their feedback and editing, I then reviewed their paragraph, giving feedback of my own. The intention of the whole process was to improve their writing and their writing process so that they could write and edit a cohesive paragraph independently. Ideally, if each student had thoughtfully completed the review, there would have been some growth and changes in the paragraph I reviewed.

After they completed the process, I read through their comments and looked through their revision and comment history in Google Docs, two features that allow me to follow all the actions taken on a document.
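For a full class set of documents, this kind of audit could be partly automated. The sketch below is hypothetical rather than part of my actual workflow: it assumes comments have already been exported (for example via the Google Drive API's `comments.list` endpoint) into simple records with an `author` field, and it simply tallies comments per reviewer and flags anyone below a minimum.

```python
from collections import Counter

def summarize_feedback(comments, minimum=3):
    """Count comments per reviewer and flag reviewers below a minimum.

    `comments` is a list of dicts with an "author" key - a simplified,
    assumed shape for exported Google Docs comments.
    """
    counts = Counter(c["author"] for c in comments)
    flagged = [name for name, n in counts.items() if n < minimum]
    return counts, flagged

# Illustrative sample data (not real student comments).
sample = [
    {"author": "Student A", "content": "Add evidence here."},
    {"author": "Student A", "content": "This claim needs a citation."},
    {"author": "Student B", "content": "Nice topic sentence."},
]
counts, flagged = summarize_feedback(sample, minimum=2)
```

A tally like this would not judge the quality of the feedback, only its quantity, so it could at most point a teacher toward documents worth reading closely.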

I found that of the class of twenty students, seven pairs (fourteen students) completed some sort of feedback. The other six students did not finish their paragraph before coming in for the workshop, so they spent their time completing their paragraphs during the workshop and did not give feedback after class. In each of these cases, I gave feedback directly on the non-workshopped paragraphs. The seven pairs who did participate left approximately three comments each on average as their feedback. The quality of these comments varied significantly, from open-ended critiques to specific actionable points. One notable exception was a pair of students who left 30 and 37 comments respectively and edited their paragraphs significantly in response. These two varied between specifics about mechanics and detailed commentary and suggestions. Below are examples of their comments and responses (which also varied from rejection to actual editing). Importantly, however, both of these students are strong writers and have exceeded expectations in other writing assignments.

In their case, the detailed commentary provoked an equally detailed response from the partner and significant changes. Below are some of these comments.

AN*3

In addition to multiple pieces of feedback - details and suggestions.
Student points out elements of success as well as correction.

Within the remaining six pairs, students were broadly divided into two groups, with no apparent correlation between the detail of their feedback and the response or changes it produced. One group gave more detailed feedback to their partner and included specific suggestions (the biggest differentiator): essentially, they made critiques but also added possible solutions. The second group made broad critiques but offered no help with what might be done to fix the issues. However, as noted, this had no impact on how the students responded - which may mean that the response depended on the receiver's ability to understand and make changes, not on the detail of the change suggested.

The students below included examples of what their partner might do to change areas that are not working well.

 

An implicit compliment with a suggestion for improvement.

 

Describes the issue precisely and suggests change.

 

Other students made broader criticisms, pointing out that things didn’t work or didn’t make sense but adding little else. For example, in the cases below.

Broad criticism with broad indication of what it needs to be.
Points out that something is wrong - no why or suggestions.
Points out lack of coherence, does not make any suggestions. (MB1)
Points out lack of coherence in a different way - again no suggestions.

This last set of images (MB1-2), however, is interesting because this reviewer had the most impact on the overall quality of their partner’s paragraph. Although the reviewer did not suggest how the writer might make their paragraph cohesive, it appears that simply pointing out that there was one idea too many was enough to produce a significant improvement. One reason this may have happened is that the required next step (eliminating one idea) was much simpler than some of the steps needed in other paragraphs. For example, even if a student is told that an appositive phrase would be helpful, if they do not actually know what that is, the advice is useless.

Given the above anecdotal results and the trends they reflect across the classroom, there appear to be some significant issues with the peer review feedback process through Google Docs as I chose to implement it. The first is that students do not appear to be intrinsically motivated to complete thorough feedback on their peers' work. Only one partnership really went through all the steps in the workshop and thoroughly analyzed their writing samples. This is minimal participation. The rest of the class appears to be less interested in a thorough peer review process - although this could also reflect disparities of ability throughout the class that the first partnership did not have. In some cases, for example, the students did not even have their paragraphs ready. Having already fallen behind in their own work, they were most likely less motivated to rush to complete an assessment of their peers. Additionally, given that some of the examples were from partnerships involving different levels of ability, some students may simply not have been able to give feedback that was meaningful to their peers. If they did not have the knowledge to recognize the problems in some of the pieces of writing, they could not possibly articulate them clearly. Although they may have recognized that the writing needed work, in some cases they simply might not have been able to identify what exactly was wrong with it.

Another trend is that students received feedback but did not make any changes. Interestingly, this was one of the findings in Neumann and Kopcha's study of Google peer feedback as used by middle schoolers (2019). They noted that students failed to close the feedback loop even when they received meaningful feedback from their peers, and suggested that this could be because students did not understand how to give actionable feedback. I agree that to some extent, in my classroom, this could be because students were unclear on what to do next - i.e. they received critical feedback but no suggestions and didn't know what to do. It could also be because they prefer to wait for teacher feedback before making any adjustments to their work. In some cases, given the richness of their own feedback to others, it is clear that a student had the ability to make changes like those suggested to them and still held back. Knowing that they would have time to conference with their teacher the following day may have limited their interest in moving forward.

 

Criticism and Future Applications

There are multiple moments in this process that need improvement in order to get better student participation and results. First, students clearly do not see the relevance of the feedback in the same way I do. The initial step would be to clarify the importance of the process and highlight that the reason we are using Google Docs and comments is that the entire process is visible to both their teacher and them - i.e. make them aware that I am looking at what they did to edit their paper before I give my own feedback. Additionally, I could give an example of how I use feedback in my own studies and explain why peer review specifically is used in academia to improve results.

Also, although the Google Slides workshop provides a feedback framework, it is clearly not explicit enough. Especially with younger students, it may be critical to provide a mandatory step in each slide as well as an example of what that step would look like in Google Docs comments. The number and types of comments I expect to see as a minimum may need to be outlined. My school does not allow us to grade steps in the formative process or learning behaviours, so I could not grade the revisions and comments, but I could potentially make a certain minimum a prerequisite for meeting for teacher feedback, which some students are more responsive to. Recently, I had the opportunity to implement this with a group of older students. Instead of a workshop model with suggestions and examples of accurate work, we gave a series of questions and required that each of them be answered through the comments section. For example, we asked: “Does the writer provide evidence supporting their claim? Highlight one example of evidence and comment on why it supports the claim or not. If there is no evidence, comment below the claim.” These instructions produced better results: students provided much more feedback on the text.

Another component that I would approach differently is how I create the partnerships. Although this took place early in the year, I already had some sense of the different skill levels in the class. Peer review partnerships should have reflected this knowledge, because it would have allowed each student to address their next level of learning. Large disparities in skill and knowledge levels meant that some students may have been reading things they did not understand while others were not being extended in their learning at all. That may have diminished the seriousness with which they approached the peer review process.

Finally, this study is obviously very limited. The sample size is very small (20 students), and I am studying my own practice. So in addition to gathering a small amount of data, there is a possibility that I am biased in how I collected the data and in my analysis.

 

 

 

Future Application

In addition to revising the actual process and partnerships, there are two apps that I have started using more frequently this year and would like to apply to the peer review process. The first is Draftback, which allows students and teachers to view the whole writing process through a playback and statistical analysis. Both the playback and the statistics give students an opportunity to reflect on what changes they are actually making to their document once they receive feedback. Additionally, they can be used with Screencast, the app I have been using to give students feedback that they can view at home. This class has now experienced the use of Screencast in teacher feedback. I used Screencast to give each student precise feedback on their paragraph; this allowed me to provide the same type of information I would have provided in person, but outside of the classroom and school hours. I found the impact of this feedback to be immediate: students came into the next class with changes already made. The fact that they had a recording meant they could go back to it several times. Now I would like the students themselves to use Screencast and Draftback to review their writing process and reflect on the changes they have made.


References

Al-Samarraie, H., & Saeed, N. (2018). A systematic review of cloud computing tools for collaborative learning: Opportunities and challenges to the blended-learning environment. Computers & Education, 124, 77-91.

 

Brodahl, C., Hadjerrouit, S., & Hansen, N. K. (2011). Collaborative writing with Web 2.0 technologies: Education students' perceptions.

 

Neumann, K. L., & Kopcha, T. J. (2019). Using Google Docs for peer-then-teacher review on middle school students’ writing. Computers and Composition, 54, 102524.

 

Suwantarathip, O., & Wichadee, S. (2014). The Effects of Collaborative Writing Activity Using Google Docs on Students' Writing Abilities. Turkish Online Journal of Educational Technology-TOJET, 13(2), 148-156.

 

Wang, J. (2017). Cloud Computing Technologies in Writing Class: Factors Influencing Students' Learning Experience. Turkish Online Journal of Distance Education, 18(3), n3.