THE EFFECT OF FORMATIVE FEEDBACK THROUGH SCIENCE INTERACTIVE NOTEBOOKS ON STUDENT LEARNING IN HIGH SCHOOL BIOLOGY

by

Meghan Kathleen Hawkins

A professional paper submitted in partial fulfillment of the requirements for the degree of Master of Science in Science Education

MONTANA STATE UNIVERSITY
Bozeman, Montana

July 2017

©COPYRIGHT by Meghan Kathleen Hawkins 2017
All Rights Reserved

DEDICATION

This paper is dedicated to all those who have helped me fulfill this requirement and succeed in pursuing the degree of Master of Science in Science Education. To my husband, Chan, who did all he could to help me, especially when I had homework to do and papers to complete, and who stood by me as I decided to embark on this endeavor and plan our wedding at the same time; I couldn’t have completed this program without his help. This work is also dedicated to my parents and sister, who gave me encouragement, listened to me go on and on about this or that assignment, read my papers when I needed another set of eyes, and gave me love and support, even though they (mostly Mom) were shocked when I told them that I had decided to apply to Montana State University and that, yes, I would have to travel there for a semester. To my friends: those who read my paper, peer reviewed for me, and listened to my thoughts and ideas for this project; those who embarked on science interactive notebooks this year because I talked about them so much; and those who were there for me when I was overwhelmed. And finally, to my professors at Montana State University who have helped me along the way, Eric Brunsell, my advisor, and Joseph Bradshaw, my science reader, who answered so many e-mails in the middle of this writing process. This paper is dedicated to all of you.

TABLE OF CONTENTS

1. INTRODUCTION AND BACKGROUND ............................................................1

2. CONCEPTUAL FRAMEWORK ............................................................................3

3.
METHODOLOGY ................................................................................................11

4. DATA AND ANALYSIS ......................................................................................17

5. INTERPRETATION AND CONCLUSION .........................................................31

6. VALUE ..................................................................................................................35

REFERENCES CITED ................................................................................................38

APPENDICES .............................................................................................................41

APPENDIX A Interactive Science Notebook Grading Rubric..............................42
APPENDIX B Interactive Notebook Peer Evaluation Form & Grading ...............44
APPENDIX C Graphing Rubric ............................................................................46
APPENDIX D Lab Conclusion Rubric..................................................................48
APPENDIX E Feelings on Feedback Pre and Post Treatment Survey .................50
APPENDIX F Feedback and Self-Reflection Interview Questions ......................53
APPENDIX G Interactive Science Notebook........................................................55
APPENDIX H Montana State University Institutional Review Board Approval ........................................................................................57

LIST OF TABLES

1. Data Triangulation Matrix .....................................................................................17

2. ANOVA Results Comparing the Normalized Gains of the Treatment Areas .......30

LIST OF FIGURES

1. Feelings on Feedback Pre and Post Survey Student Responses on Peer Feedback .................................................................................19

2.
Feelings on Feedback Pre and Post Survey Student Responses on Feedback .........................................................................................20

3. Feelings on Feedback Pre and Post Survey Student Responses on Rubrics ............................................................................................22

4. Feelings on Feedback Pre and Post Survey Student Responses on Grades and Learning .......................................................................23

5. Student Pre and Post Rubric Scores on Graphing ..................................................25

6. Student Pre and Post Rubric Scores on Lab Conclusions ......................................26

7. Student Pre and Post Rubric Scores on Interactive Science Notebooks .................................................................................................27

8. Science Interactive Notebook Survey Response....................................................28

9. Average Pre and Post Intervention Student Score on the Three Skills Evaluated .....................................................................................30

10. Marking Period 2 and 3 Averages for Students .....................................................31

ABSTRACT

Although my students would benefit from completing practice in the form of classroom work and activities, many of them have neither the time nor the interest to do so. The students focus on earning grades instead of gaining knowledge from assignments and work throughout the year; they never worry about how they can learn more or improve their future work, just how they can get the grades they want on their report card. This project investigated how using formative feedback, in the form of student self-reflection and peer assessment, can affect student metacognition and learning.
This feedback came in two main forms, self-assessment and peer assessment, and was reflected on by the students in their interactive science notebooks, allowing students to show the steps they took toward mastery of standards. Data collection for this project included not only the reflections in the interactive science notebooks, but also the students’ attitudes toward formative feedback and standards-based grading, as well as data on students’ cumulative grades both before and after the project. The results indicated that students benefited the most from performing self-assessment and peer assessment on the graphing assignments. Peer and self-assessment had the least effect on the science interactive notebook grades. Peer and self-assessment seemed to be helpful in some areas and to some students, but the results were inconclusive as a whole. The same was seen with the science interactive notebooks: some students benefited from having everything in one organized place, yet others were not organized enough to benefit from the notebooks.

INTRODUCTION AND BACKGROUND

Hackensack High School is located in the city of Hackensack, New Jersey, which had a population of 44,519 people as of 2014. The median household income from 2010-2014 was $53,338, with an estimated 16.6% of the population in poverty (United States Census Bureau, 2016). The city of Hackensack is highly diverse; as of 2013 its population was 34.2% Hispanic, 30.3% Caucasian, 23% Black, 12% Asian, 0.7% two or more races and 0.7% other race (City-data, 2016). Hackensack High School serves students from several districts, including Hackensack, South Hackensack, Rochelle Park and Maywood. Hackensack High School has a population of approximately 2,000 students in the ninth through twelfth grades and is a highly diverse school both socio-economically and racially, with the primary ethnicity being Hispanic.
Additionally, Hackensack High School is considered a Title 1 school, with 52% of the students being economically disadvantaged (J. Montesano, personal communication, September 9, 2015).

Teaching and Classroom Environment

As a teacher of five years, I have worked at only one high school for my whole career. Hackensack High School is where I got my start with student teaching, and I was eventually hired nine months after my student teaching and graduate program were complete. In my five years as a teacher I have taught pre-AP biology, marine biology, environmental science, college preparatory biology and lab biology. Through all of these classes I have seen a number of students and have seen the fluctuations of student effort, work ethic and achievement. I consistently see that students do not care what, or whether, they are learning, but only worry about their final grade. If the students are happy with the grade that they received, they simply put the paper away or throw it out rather than read the comments made to help them improve their understanding and skills. Only if students are not content with their grade do they come and ask what they can do to improve, and even then they are looking for an easy fix. Teacher feedback is generally never looked at or used to improve future assignments or understanding. By having students self-assess their work, they are forced to look back on what they are doing and how they can improve future similar assignments. Having students peer assess the same assignments that they will be self-assessing should build up their self-assessment skills, allowing for better metacognition of their own assignments.
I hypothesize that having the students perform self-assessment and peer assessment of assignments that involve skills used throughout the year, and in future science classes, will improve these skills and lead students to reflect on feedback and correct their errors in future work.

Focus Question

The purpose of this study is to identify whether formative student feedback, in the form of self-reflection on students’ performance in class, can increase students’ overall performance. Secondly, it is to identify what type of formative assessment is most effective and to determine whether interactive science notebooks are an ideal tool for self-assessment. I have observed that students do not value homework; they look at it as a chore or something to finish quickly, not as a tool for learning and a part of their grade. In many cases students just want to get the homework done, so they split up the work with friends or get “help” from each other in the form of copying. Students also rarely use the feedback given by the teacher to improve future work. Once students receive a grade they deem acceptable, they either shove the work in their binders or backpacks or even throw it in the trash. I plan on using interactive science notebooks in the classroom to serve as a place for note taking and for student response and reflection as evidence of learning. The interactive science notebook will also serve as a place to give students feedback and have them respond to that feedback, allowing students to see how they are progressing through the year in their science learning. By performing formative assessments in the classroom in a manner other than homework, and by having students respond to the feedback given by the teacher, I propose that student learning will increase substantially.

CONCEPTUAL FRAMEWORK

Formative assessment and feedback are very much connected to each other.
In order for a student to improve on their prior performance, there needs to be feedback so that they can use the information on future assignments. Teachers can also use formative assessment themselves to improve their instruction and to know whether they were effective in the instruction of different areas of learning. For assessment to be considered formative, it is commonly understood that it needs to help direct the teacher in their instruction and/or assist students in improving their learning by changing their thinking (Fluckinger, Vigil, Pasco & Danielson, 2010).

Formative Assessment

I have found certain major characteristics of successful formative assessment that are consistent in the literature. One of these characteristics is to “provide students with a clear, student-friendly vision of the achievement target to be mastered, including models of strong and weak work” (Stiggins, 2005, p. 328). This provides students with clear guidance as to the teacher’s expectations for their work (Stiggins, 2005; Cauley & McMillan, 2010). Fluckinger et al. echo this concept and add that “effective formative feedback must be specific, simple, descriptive, and focused on the task. This allows learners to set clear expectations of themselves and to make decisions that influence their own success” (p. 137). When students are given feedback early on in a project or chapter, it allows them to change their thinking and improve the overall product or outcome on an assessment (Fluckinger et al., 2010). Cauley and McMillan (2010) gave four reasons why students learn more through formative assessment: “1. Frequent, ongoing assessments allow both for fine-tuning of instruction and student focus on progress. 2. Immediate assessment helps ensure meaningful feedback. 3. Specific, rather than global, assessments allow students to see concretely how they can improve. 4.
Formative assessment is consistent with recent constructivist theories of learning and motivation” (p. 2). Formative assessment helps lower-achieving students the most, allowing us to close the achievement gap between our high- and low-achieving students. It allows students to see themselves as capable of learning and achieving, and it keeps the lower-achieving student from becoming defeated in the learning process, as is often seen by the time the student reaches the high school level (Black & Wiliam, 1998).

Feedback

Hatziapostolou and Paraskakis (2010) explained that feedback is anything communicated to students to improve their learning and performance, and they shed light on the importance of feedback: “feedback is an essential component in all learning contexts and serves a variety of purposes including evaluation of students’ achievements, development of students’ competences and understanding, and elevation of students’ motivation and confidence” (p. 111). According to Lizzio and Wilson (2008), effective feedback should share the following traits: it should include information about the gap between actual and ideal performance, it should be clear and understandable to the student, and it should be fair. Students should feel respected by their teachers in the feedback that they receive, and they should feel that they are being treated the same as others and receiving fair grades and feedback. When students cannot understand what the teacher means in their feedback, the feedback becomes unusable and dispensable. Cauley and McMillan add that “task specific feedback influence students’ interest and commitment more positively than either grades or praise” (2010, p. 4). Hatziapostolou and Paraskakis (2010) add that feedback should be “timely, constructive, motivational, personal, manageable, and directly related to assessment criteria and learning outcomes” (p.
111) and that students need to be engaged in the feedback process for it to be effective. Sadler additionally identified three conditions that must be met in order for students to benefit from feedback. Students must know what good performance is and how their own performance relates to it, which can be supported in the classroom with easy-to-understand goals, exemplars of work, and criteria for the task. Students must also understand what they can do to close the gap between where they are in performance and where they want to be (their goal). This involves the teacher building up the students’ self-assessment skills, because what a student thinks is exemplary might differ drastically from the teacher’s view and the standards. Self-assessment and reflection tasks are essential to building students’ skill in self-regulation, and providing practice with this in the classroom allows them to build skill and confidence. Many students do not use the feedback given by their teachers, despite its value, for many different reasons, including, but not limited to, not being sure how to use the feedback and not being able to understand the feedback given to them. Finding ways to involve students in the feedback given to them would help them considerably, as discussed in the sections below.

Student Response to Formative Assessment

Students in a study conducted by Lizzio and Wilson (2008) affirmed that three characteristics of feedback had high value: transferrable learning, teacher engagement, and recognition of achievements and effort. Transferrable learning was important because the feedback they received on the work could be used to improve different assignments as well as in the real world. When students felt that the feedback could help them in the future, they found it much more valuable.
Students valued when teachers seemed engaged with the work and provided deeper feedback on the assignment, showing that they took time reviewing it. This was valuable to them because they felt that the teacher was trying to help them rather than just marking for correct answers; students like to feel like individuals and not just another number. The recognition of efforts and achievements by teachers in their feedback was valued by students because, although there might have been some comments about needed improvements, the students did not feel that they had done an awful job, but rather that they could make some improvements in the future, as opposed to receiving only negative feedback, which could leave them feeling unable to bounce back. Positive and constructive feedback is important to improving how students see themselves and to improving their feeling of self-worth. Stiggins (2005) also made it clear that students have emotional responses to feedback and that, if they do not feel they can improve, you can end up with a ‘pessimistic response’ rather than an ‘optimistic response’ to the feedback. Students need to feel that learning is worth the effort they are putting in or they will give up, explain Cauley and McMillan (2010). They also reasoned that giving students ways to see what they have accomplished toward their learning goals on a daily basis gives them a greater sense of accomplishment, leading to greater buy-in on their learning. Cauley and McMillan emphasized the importance of formative assessment by stating that “if the only feedback students receive is a final grade (unit, midterm, final, external tests), they can’t see how their efforts improve skills, lowering expectations of success in the future” (p. 4).
Interactive Science Notebooks as a Tool of Self-Reflection

Most of my research was done on formative assessment and formative feedback, although some research was done on different ways to perform formative assessment in the classroom and on how to involve students in the feedback process. One tool that can serve as formative assessment and act as a portfolio of learning is the interactive science notebook. The interactive science notebook is a valuable tool for both teachers and students, enabling students to better see what they have learned and accomplished. Interactive science notebooks can act as a place for students to engage in certain forms of formative assessment, and teachers can give feedback on where they feel the students are compared to where they need to be. It also provides a canvas for students to improve upon their work by replying to teachers’ feedback and showing their improvements. Students and peers are also able to review and modify their notebooks before the teacher grades them. This allows students to get credit for the work even if they did not do it when they were supposed to (although they would not get a participation grade for the day), and to modify their work if they grew to understand the assignment better before the teacher collects the notebook. These notebooks document student growth and understanding over a prolonged period of time, and students can look back and see how they have improved or changed their thinking over time, providing them proof of learning and of achievement. Writing in science has also been shown to help student learning because “they are forced to clarify their thoughts and organize these ideas in a way that others can understand” (Young, 2003, p. 44). Marcarelli (2010, p.
6) states that “the interactive notebook allows students the opportunity to identify preexisting ideas, deepen and refine their scientific ideas throughout the learning activities and reflect on their learning.” The reflection component is the most important for the topic studied here, because without reflection on what they are learning, students are constantly looking forward and not learning from what might need improvement in prior work and assignments, even when those are skills they will use again.

Building Self-Regulated Learners Through Formative Assessment and Feedback

Nicol and Macfarlane-Dick (2006) explain that, “in higher education formative assessment and feedback should be used to empower students as self-regulated learners. The construct of self-regulation refers to the degree to which students can regulate aspects of their learning, motivation and behavior during learning” (p. 199). Self-regulation can be seen in the setting of goals, in devising a way to meet those goals, in the effort put forth in meeting them, in how students react to external feedback, and in how all of these things are manifested in a completed product. Self-regulation is achieved by students being involved in their learning and monitoring their own performance to reach certain goals. Pintrich and Zusho (2002) provided the following definition of self-regulated learning: “self-regulated learning is an active constructive process whereby learners set goals for their learning and monitor, regulate and control their cognition, motivation and behavior, guided and constrained by their goals and the contextual features of the environment” (p. 64). “In developing self-assessment [and regulation] skills, it is important to engage students in both identifying standards/criteria that will apply to their work, and in making judgements about how their work relates to these standards” (Nicol & Macfarlane-Dick, 2006, p. 207).
Peer evaluation is one method of building these skills. By providing the opportunity for students to evaluate and provide objective feedback on each other’s work against standards, students also learn to internally evaluate their own work. And although peer evaluation is crucial to students in developing these skills, feedback from teachers plays a pivotal role, identifying misconceptions and errors more effectively than peers can. In order for external feedback to prove usable by a student, it must be understood and internalized by them. There are many methods to help students do this, one being to create a dialogue in teacher feedback. When teachers ask questions to guide students to a correct conclusion, instead of transmitting information, either on paper or in an oral discussion, it allows students to take ownership of their learning and to better understand and internalize the information. This can also be done by generating discussion in small peer groups or by using classroom response technologies to gather response data in order to create class discussion on particular topics. Peer discussion is a powerful source of external feedback as well, as it exposes students to a variety of perspectives at a level that is more understandable for them, since the discussion is with someone else who is also just learning the information (Nicol & Macfarlane-Dick, 2006). Feedback can be helpful in guiding a student’s production of a piece of work, but it can also manifest in resubmission of the same task when the student did not perform proficiently the first time. Although many teachers do not allow resubmission of work, it is easily argued that allowing students to do so can improve their future performance, especially on topics where the knowledge builds, such as math and science. This also allows students to use the feedback given to them and to improve future work (Nicol & Macfarlane-Dick, 2006).
O’Connor (2002) explains that life is full of second chances and opportunities to show improvement, and that not allowing students to show improvement in school as well does not make sense. For a student to fail one test and have it ruin their grade, perhaps because they did not study as much as they should have, did not have a chance to come in for extra help, or any number of other circumstances, is wrong; students should be able to show that they learned the material and made improvements. The ultimate evaluation should be based on their best assessment instead of an average or some “crunching” of the numbers; even the SAT, required for entry to many colleges, allows students to take the test multiple times and send the best of their scores to prospective colleges. When deemed fit by the teacher, and most likely on the students’ own time, such as after school, retakes or resubmission of work should be accepted to allow students to show improvement and use feedback.

METHODOLOGY

The focus of this study was to investigate whether self-reflection through interactive science notebooks could increase student understanding in a high school biology class. This focus question was broken down into three sub-questions:

1. What form of formative feedback (teacher feedback, peer feedback or self-reflection) is most effective?

2. Does the interactive science notebook provide students with a forum to effectively show growth in their learning?

3. Does peer assessment help students become more reflective of their own work, resulting in better understanding?

Participants

The participants of this study were the students of my two college preparatory biology classes, composed of ninth grade students at Hackensack High School in Hackensack, NJ. The two classes together totaled 36 students: 19 male and 17 female.
Of the students in the classes, 56% are Hispanic, 14% are Caucasian, 22% are Black, and 8% are Asian. This includes six students (14% of the students in the class) who were born in another country: one in Ethiopia, four in Ecuador and one in Brazil. The primary language for these students is mostly English, with 61% claiming it as their primary language; 36% of students have a primary language of Spanish, and 3% have a primary language of Portuguese. Two of the students in these classes have 504 plans, and no students require an IEP.

Intervention

The treatment of this study included feedback reflection and self-reflection through interactive science notebooks. Data collection began at the beginning of the 2016 school year, before the students had practiced self-reflection or peer evaluation in the biology class. Students started the year using their interactive science notebooks to take notes and perform extension activities, applying the information they learned in their own way. In the interactive science notebooks, the right side is used for input, such as notes, labs, background information and other types of factual information. The left side of the notebook is used for output and extension activities, such as concept maps, summaries, graphs, data collected from performing labs, and anything showing student learning taking place. During the first and second marking periods, the left side of the notebook was used only for showing understanding of new material and information. At the onset of the third marking period, the students were introduced to self-reflection and used feedback provided by their peers through peer assessment to reflect on their understanding of concepts and skills in the science classroom. This reflection took place on the left side of the science interactive notebook.
Peer assessment was performed on notebooks so that students could hone their self-evaluation skills and learn to be critical of their own work. This was done in the following way. First, students performed self-evaluation of their notebooks using the provided Interactive Science Notebook Grading Rubric (Appendix A), recording a reflection on what needed improvement and what they did well in their notebooks. Then, students used the Interactive Notebook Peer Evaluation Form and Grading sheet (Appendix B) to peer review two other students’ notebooks every two weeks during the third marking period. Each time the students had a peer review done, they were required to have it completed by a different student in the class. Prior to this treatment, students had access to the rubric the teacher used for notebook grading; however, the teacher was the only one grading the notebooks. Students reflected in their notebooks on how their self-assessment and their peers’ assessment compared. In addition to the grades given by the students, I graded the notebooks to see how their grading compared to mine.

In addition to the peer assessment, self-evaluation and reflection on their interactive science notebooks, students performed this process with graphs and lab conclusions. Studies have found that students valued feedback given on transferrable skills, since they would be able to improve these skills over time and use them again (Lizzio & Wilson, 2008), which influenced my decision to focus on skills that transfer to other science classes as well as to some math classes. Students then performed peer evaluation on two other papers. For evaluating graphing, the students used the Graphing Rubric (Appendix C), and for evaluating lab conclusions, they used the Lab Conclusion Rubric (Appendix D).
Throughout the third marking period, whenever students performed the skill of graphing or of writing a lab conclusion, they went through the peer-assessment process with two papers after self-evaluating their own paper. When students received their peer feedback, they compared their peers’ feedback with their own self-assessment of the work and reflected on how the two compared. At the completion of the treatment, the same Feelings on Feedback Survey (Appendix E) was given to the students once again, via Google Classroom using Google Forms, to determine whether their feelings had changed after the treatment. Students were also interviewed using the Feedback and Self-Reflection Interview Questions (Appendix F) in January, February and March. Near the end of the treatment, it was determined that more information on students’ feelings about the notebook was needed; therefore, the Interactive Science Notebook Survey (Appendix G) was given to the students in March, at the end of the treatment period, to gain a broader understanding of students’ feelings after having used a science interactive notebook that year.

Data Collection

Before the treatment, the students were given the Feelings on Feedback Survey (Appendix E) to identify their feelings on types of feedback, self-reflection, peer evaluation and the use of interactive science notebooks in the classroom. This was completed through Google Classroom as a Google Form, since the students have their own school-provided Chromebooks and use this software regularly. Pre-treatment scores for the three assignment types came from the first assignment of that type for the year (the first graph, from a pre-assessment test we administered, and the first conclusion written for a lab), with the exception of the notebook score. The notebook pre-assessment score was that of the second notebook collection, since I wanted students to get used to the notebooks first.
Upon completion of the study, student scores on the assignments named above (notebooks, graphs, and lab conclusions) and overall marking period grades from before the treatment (marking period two) and during the treatment (marking period three) were compiled and computed. This gave the project a quantitative measure. Students were also asked to complete the Feelings on Feedback Survey (Appendix E), in which they relayed their feelings on feedback and which types of feedback are most helpful, both pre and post treatment. Additionally, the Feedback and Self-Reflection Interviews (Appendix F) with students were performed in order to gain the students' perspective on self-reflection and feedback, giving the project a qualitative measure. Students from each level of achievement were selected to be interviewed across both classes: two high-level achievers (85%-100% marking period grade), two mid-level achievers (70%-84%), and two low-level achievers (69% and below) were picked from each class to be interviewed three times each, in January (pre-treatment), February (during treatment), and March (post-treatment). In order to evaluate the effectiveness of the Interactive Science Notebook as a forum for students to show growth in their learning, the notebook peer-assessments (Appendix B) and self-reflections served as qualitative data, and the teacher's grades of the Interactive Science Notebooks before, during, and after the treatment served as quantitative data. Additionally, Interactive Science Notebook grades given by the teacher, through peer-assessment, and through self-assessment were compared to see how closely students' peer grading and self-grading matched the teacher's.
It was also determined that a survey would help provide an additional view of how all the students felt about using science interactive notebooks, so the students were given the Interactive Science Notebook Survey (Appendix G) at the end of the treatment period. This survey served as qualitative data, in addition to the question in the Feedback and Self-Reflection Interviews (Appendix F) that referenced science interactive notebooks. Table 1 below is my triangulation matrix, showing the questions to be answered in this study and the data sources used to answer them. This study has been reviewed and approved by the Montana State University Institutional Review Board (Appendix H).

Table 1
Triangulation Matrix

Focus Question: Does self-reflection through interactive science notebooks increase student understanding?

Subquestion 1: What form of formative feedback (teacher feedback, peer feedback, or self-reflection) is most effective?
Data Sources: Pre-survey; Student interview; Post-survey

Subquestion 2: Does the interactive science notebook provide students with a forum to effectively show growth in their learning?
Data Sources: Interactive Notebook Peer Evaluation Form and self-reflection; Teacher Interactive Science Notebook grades; Post Interactive Notebook Survey

Subquestion 3: Does peer assessment help students become more reflective of their own work, resulting in better understanding?
Data Sources: Pre-treatment scores on selected assignments; Student interview; Post-treatment scores on selected assignments

DATA AND ANALYSIS

This classroom research project's intent was to examine whether self-reflection through interactive science notebooks could increase student understanding.
Additional subquestions included: what form of feedback was most effective, does the interactive science notebook provide students with a forum to effectively show growth in their learning, and does peer assessment help students become more reflective of their own work, resulting in better understanding. A variety of data was collected, including student interviews, student attitude surveys, and student pre- and post-treatment scores on their notebooks, graphing assignments, and lab conclusions. The treatment group included 36 freshman biology students in my period two and period three classes.

Effectiveness of Feedback

On the project questions of "What form of formative feedback (teacher feedback, peer feedback, or self-reflection) is most effective?" and "Does peer assessment help students become more reflective of their own work, resulting in better understanding?", students' responses on the Feelings on Feedback Survey showed that more students liked performing peer assessments prior to the treatment, with 57.5% of the students agreeing or strongly agreeing in the pre-assessment versus 44.4% in the post-assessment. The survey also showed that when performing a peer assessment or review, most of the students valued the feedback given by their peers, with agreement increasing after the treatment: 65% of the students agreed or strongly agreed in the pre-assessment and 80.6% in the post-assessment. See Figure 1 for all Feelings on Feedback Pre and Post Survey student responses as they relate to peer feedback.

Figure 1. Feelings on feedback pre and post survey student responses on peer feedback, (N=36).
Looking at data on feedback in general, the Pre and Post Feelings on Feedback Surveys showed that students value feedback and use it to better themselves: 91.7% of the students agreed or strongly agreed in the post-treatment survey that they were "able to take constructive criticism and better myself with it," compared to 87.5% in the pre-survey. Students also valued teacher feedback and "often used the feedback from their teacher to improve their future assignments," with 80% of students responding agree or strongly agree on the pre-survey and 91.7% on the post-survey. See Figure 2 for all Feelings on Feedback Pre and Post Survey student responses as they relate to feedback.

Figure 2. Feelings on feedback pre and post survey student responses on feedback, (N=36).

Rubrics

The Feelings on Feedback Pre-Survey showed that 45% of the students agreed and 10% strongly agreed that they always self-assessed when provided a rubric, while another 40% of students disagreed and 5% strongly disagreed.
However, after treatment, 22.2% of students strongly agreed, 55.6% agreed, 19.4% disagreed, and only 2.8% strongly disagreed. At the same time, slightly fewer students agreed or strongly agreed that they would know whether they had done a good job before handing in an assignment, and fewer students responded after the treatment than before that the grading criteria were clear and that they knew how to achieve higher grades. This is seen in Figure 3 below: a total of 82.5% of students strongly agreed or agreed that they had a clear understanding of the grading criteria and understood how they could achieve higher grades before the study, and only 72.2% strongly agreed or agreed post treatment. Likewise, 77.5% of students pre-treatment agreed or strongly agreed that they were able to tell whether they had done a good job before handing in an assignment, and only 61.1% agreed or strongly agreed with that statement when surveyed post treatment (N=36).

Figure 3. Feelings on feedback pre and post survey student responses on rubrics, (N=36).
Grades and Learning

Finally, looking at data on grades and learning from the Feelings on Feedback Survey, students proved to be most interested in their grades and not as interested in what they were learning. This can be seen in the Feelings on Feedback Pre and Post Survey Student Responses on Grades and Learning graph (Figure 4) below. Although more of the students (66.7%) disagreed or strongly disagreed with the statement, "If an assignment isn't graded I don't have to do it or put in a lot of effort," in the post-treatment survey than in the pre-treatment survey (33.3%), only 61.1% of the students agreed or strongly agreed in the post-treatment survey that they were interested in biology and wanted to learn in the class. The percent of students who wanted to learn and were interested in biology went down in the post-assessment, as 75% of students on the pre-treatment survey had agreed or strongly agreed (N=36).
Also seen in this section of data: 95% of students in the pre-treatment survey agreed or strongly agreed that the first thing they did when an assignment was returned was look at the grade, and in the post-survey, 100% of students strongly agreed (75%) or agreed (25%) with the same statement.

Figure 4. Feelings on feedback pre and post survey student responses on grades and learning, (N=36).

In addition to the surveys, students were interviewed on their feelings about grades. When asked what they do with feedback, one student responded, "It really depends on what the assignment is. If the feedback is on an assignment that I won't have use again I probably don't look at it.
If it is a skill that I will have to use again then sometimes I look at it and then sometimes I look at it when I have to do that skill again, but not till then." Of the students interviewed, 80% shared a similar view: they only looked at feedback when it would apply in the future and when they were currently working on more of the same type of assignment. In addition, when I asked the students to truthfully tell me what they do when they get a graded paper returned to them, 95% of students interviewed admitted that they look at the grade first and then might look at the feedback.

Graphing

Graphing scores went up from the pre- to the post-assessment, with an average score of 67% on the pre-assessment and 96% on the post-assessment (N=36). The spread of the student scores before and after treatment and the mean score are shown in the box and whisker plot (Figure 5): before the treatment, only 25% of the students had obtained a score of 81% or above on their graphing assessments, while after the treatment, 100% of the students obtained a score of 90.5% or above on the post-assessment. The pre-assessment had a mean score of 14.17 points, with a standard deviation of 3.64. The post-assessment mean went up to 20.22, with a standard deviation of 0.59. The difference between pre- and post-assessment scores was extremely significant, t(35) = 10.0672, p < 0.0001. The normalized gain was considered high, with a score of 0.87.

Figure 5. Student pre and post rubric scores on graphing, (N=36).

Lab Conclusions

Lab Conclusion scores went up minimally from the pre- to the post-assessment, with an average score of 63% on the pre-assessment and 71% on the post-assessment (N=31).
The spread of the student scores before and after treatment and the mean score are shown in the box and whisker plot (Figure 6), which showed that before treatment only 25% of students obtained a score of 77.8% or above on their pre-assessment, while after treatment, 50% of the students obtained a score of 75% or more on their post-assessment. The pre-assessment had a mean score of 22.74 points, with a standard deviation of 6.28. The post-assessment mean went up to 26.00, with a standard deviation of 7.47. The difference between pre- and post-assessments was not quite statistically significant, t(30) = 2.0008, p = 0.0545. The normalized gain for this data was considered low, with a score of -0.2.

Figure 6. Student pre and post rubric scores on lab conclusions, (N=31).

Interactive Science Notebooks

Data analysis of the Interactive Science Notebook scores showed that scores went down from the pre- to the post-assessment, with an average score of 76% on the pre-assessment and 72% on the post-assessment (N=32). The spread of the student scores before and after treatment and the mean score are shown in the box and whisker plot (Figure 7), which shows that pre-treatment, 75% of students obtained a score of 75% or more on their notebook grades, while after treatment only 50% of students obtained a score of 75% or more. The pre-notebook evaluation had a mean score of 30.38 points, with a standard deviation of 4.43. The post-notebook evaluation mean dropped to 28.78, with a standard deviation of 6.03. The difference between the pre- and post-notebook evaluations was not significant, t(31) = 1.9397, p = 0.0616. The normalized gain was considered low, with a score of -0.3.

Figure 7. Student pre and post rubric scores on interactive science notebooks, (N=32).
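The pre/post comparisons above combine two computations: a paired t-test on each student's pre- and post-treatment scores, and a normalized gain, g = (post - pre) / (max - pre), the share of the possible improvement that was actually achieved. The sketch below illustrates both on percent scores (so max = 100). The student scores are hypothetical, for illustration only; the study's raw per-student data are not reproduced here. Only the Python standard library is used.

```python
import math
import statistics

def normalized_gain(pre_pct, post_pct):
    """Hake's normalized gain on percent scores: the share of the
    possible improvement (out of 100%) actually achieved."""
    return (post_pct - pre_pct) / (100 - pre_pct)

def paired_t(pre, post):
    """t statistic for a paired (dependent) t-test, df = n - 1."""
    diffs = [b - a for a, b in zip(pre, post)]
    n = len(diffs)
    return statistics.mean(diffs) / (statistics.stdev(diffs) / math.sqrt(n))

# Hypothetical percent scores for five students (illustration only)
pre = [60, 65, 70, 68, 72]
post = [90, 95, 97, 92, 96]

g = normalized_gain(statistics.mean(pre), statistics.mean(post))
t = paired_t(pre, post)
```

A gain above roughly 0.7 is conventionally treated as high, consistent with the 0.87 reported for graphing; the t statistic is then compared against the t distribution with n - 1 degrees of freedom to obtain the p-value.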
On the question of "Does the interactive science notebook provide students with a forum to effectively show growth in their learning?", the data from the Interactive Notebook Survey provided information about whether students had used an interactive notebook before and whether they felt it increased their grades and learning. Analysis of the survey (Figure 8) showed that 51.4% of students had used an interactive notebook before in school and the other 48.6% had not. Most of the students (67.6%) said that they did not think the interactive notebooks had helped them maintain or improve their grades, while 32.4% said the notebooks did help improve or maintain their grades. When asked whether the interactive notebooks in their classes had helped them learn better, 59.5% of students agreed that they did, and 40.5% did not agree. And when asked whether they would like to continue using interactive notebooks in their future classrooms, 18.9% responded yes, 37.8% said no, and the remaining 43.2% responded maybe (N=36).

Figure 8. Science interactive notebook survey response, (N=36).

When interviewed on the topic of notebooks, one student responded, "I like that everything is where I can find it in the notebook. I do find it helpful, but I know I need to put the assignments in there as soon as I get them or I might lose it.
The only thing that I don't always like is the possibility that I don't have something in the notebook and that I won't get a good notebook grade." Another student, when asked their opinion on the interactive science notebooks, responded, "I don't like the notebooks, I always forget to put my work in there and then I lose it and get a bad grade when there is a notebook check. I guess I should make sure I put the assignments in there when I get them, but I forget and then the papers get lost." Of the students interviewed about notebooks, 45% had a positive view of the notebooks, and the other 55% either didn't like the notebooks or admitted to not being very organized when it came to them. When asked what they would do with a paper before we started using notebooks, 60% of students said that they would look at the grade and then throw it out if they didn't need it, 20% said they would put it in their backpack, notebook, folder, etc., and 10% would make sure to put it into the binder with the information it goes with.

Intervention Comparison and Student Test Scores

The mean student score for each skill or assignment area where students received intervention is illustrated in the line graph below (Figure 9), which shows that the graphing skill showed the most gains and the interactive science notebook decreased in average score through the intervention.

Figure 9. Average pre and post intervention student score of the three skills evaluated.

To examine the effectiveness of the three treatment areas (notebooks, graphing, and conclusion writing), an ANOVA test was run on their normalized gains. This gave a better idea of whether there was a significant difference between the means of the three treatment areas. According to the results of the ANOVA test presented in Table 2 below, there were no statistically significant differences between the means of the normalized gains of the treatments.
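A one-way ANOVA partitions the total variation in the normalized gains into a between-groups part and a within-groups part, then compares their mean squares with an F ratio (degrees of freedom k - 1 and N - k, matching the 2 and 96 in Table 2). The from-scratch sketch below uses only the Python standard library and hypothetical gain values, not the study's data.

```python
import statistics

def one_way_anova(groups):
    """One-way ANOVA: returns the F statistic for k groups,
    with df_between = k - 1 and df_within = N - k."""
    values = [x for g in groups for x in g]
    grand_mean = statistics.mean(values)
    k, n = len(groups), len(values)
    # Between-groups sum of squares: spread of group means around the grand mean
    ss_between = sum(len(g) * (statistics.mean(g) - grand_mean) ** 2
                     for g in groups)
    # Within-groups sum of squares: spread of scores around their own group mean
    ss_within = sum((x - statistics.mean(g)) ** 2 for g in groups for x in g)
    ms_between = ss_between / (k - 1)
    ms_within = ss_within / (n - k)
    return ms_between / ms_within

# Hypothetical normalized gains for the three treatment areas (illustration only)
gains = [
    [0.9, 0.85, 0.8],    # graphing
    [0.1, -0.3, 0.0],    # lab conclusions
    [-0.2, -0.4, -0.1],  # notebooks
]
F = one_way_anova(gains)
```

If the calculated F exceeds the critical value at the chosen significance level (3.09 at 5% for df = 2, 96 in Table 2), the group means differ significantly; in this study, the calculated F of about 0.32 fell well below that threshold.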
Table 2
ANOVA Results Comparing the Normalized Gains of the Treatment Areas

Source of Variation   Sum of Squares   d.f.   MS         F (calculated)   F-critical (at 5%)
Between groups        28.08422613      2      0.286574   0.324134705      3.09
Within groups         84.87544919      96     0.884119
Total                 112.9596753      98

Student averages for marking period two (pre-intervention) and marking period three (post-intervention) were collected as additional information. This is useful for seeing general trends in student work during the study period. As seen in Figure 10, half the students' averages went up in the third marking period and half went down. There were a few students whose averages rose or dropped only slightly, one student who had drastic gains, and one who had a drastic decrease.

Figure 10. Marking period 2 & 3 averages for students, (N=36).

INTERPRETATION AND CONCLUSION

As previously stated, the intent of this research was to determine whether peer feedback and self-reflection impact students' abilities to evaluate and improve future assignments. This classroom research project investigated the influence of peer and self-reflection on three areas in particular: graphing, writing lab conclusions, and the organization and preparation of science interactive notebooks. Analysis of the data collected showed the results of this project to be, overall, inconclusive. My primary and secondary questions are answered more thoroughly in this section.

My first research question explored the impact of self-reflection through interactive science notebooks on student understanding of science concepts.
The results of this intervention were inconclusive. According to the data, the students did create better graphs when utilizing peer review than when they did not. Many of the students also wrote better lab conclusions when utilizing peer review; however, peer review did not seem to have a big effect on the quality of the science interactive notebooks. According to Black & Wiliam (1998), formative assessment most helps lower-achieving students and allows us to close the achievement gap between high- and low-achieving students. This was seen when analyzing the results of the classroom research project: not all of the lower-achieving students achieved perfect graphs and lab conclusions, but they saw the most gains, and every one of them increased their scores on the assessments.

The second research question focused on determining whether teacher feedback, peer feedback, or self-reflection was most effective. An analysis of the data shows that each approach had benefits and drawbacks. The Pre and Post Surveys showed that although the students didn't necessarily enjoy the peer assessment and evaluation in class, they did value the feedback they received through them. Many students rolled their eyes when I told them they were doing another peer assessment, and I observed some students just rushing through and giving the same score as the students had scored themselves. In my experience, there will always be some students who don't want to perform the task at hand or give it a hundred percent in class. I have yet to come up with a way to prevent this from happening. With that said, the students seemed to figure out pretty quickly whose feedback was best to follow and take seriously and who was just scoring the same as the last person.
I have to say that one of my worries with peer assessment was always that students wouldn't give correct feedback, or that a student would be hurt if the partner they were paired with didn't put much effort into the peer assessment. After performing so many peer assessments in class, I still have some reservations; however, the students seem to be good judges of how legitimate the feedback is, and they seem to weigh peer assessment feedback accordingly. Overall, peer assessment seemed to have helped student learning and understanding the most. I did see that there is room for teacher feedback and self-reflection as well in the classroom, and the form of formative feedback that works best changes with the student and the assignment. Self-reflection is a useful tool to build students' assessment skills and goes hand in hand with peer assessment. I realized that teacher feedback is important too, especially for students who have a poor understanding of the topic or skill being worked on. Teachers need to keep an eye on what kind of feedback the students are giving each other and can prompt ideas and questions about students' thinking and understanding. So overall, I don't think any one form of formative feedback was the most effective; rather, they all play their own role in the classroom.

The third research question focused on determining whether an interactive science notebook provided students with a forum to show their learning. The analysis of data showed mixed results: notebooks could be beneficial to some students, while to others they were just an annoyance and a so-so grade. This is because students needed to keep up with the work and keep it organized and neat, and not all students have learned those skills through their upbringing and education.
Some students didn't do all of their work, and many of them didn't bother to put together their notebooks until the last minute, including at the beginning of class on the day they were due. More organized students seemed to like having the notebook and thought that it provided a place for feedback and a way to see their growth through the year, as is evident from one student's response in an interview: "I love that the notebooks are so organized, I always know where to find what I am looking for and know that I can go back to my previous papers and see where I went wrong." Another student commented in their interview, "I like having the rubric right there in my notebook, if I'm not sure if I'm graphing the right way I can just go back and look for what you taught us." This shows that the science interactive notebook could be, and was, a forum for some students to show growth and receive feedback.

The answer to the final question, "Does peer assessment help students become more reflective of their own work, resulting in better understanding?", was yes, with a caveat: peer assessment is a skill that students need to practice often, and it is best started at the beginning of the school year. Fluckinger et al. (2010) stated that when students are given feedback early on in a project or chapter, it allows them to change their thinking and improve the overall product or outcome. The data collected on graphing bore this out. When the students received feedback on their graphs, their graphing improved and kept improving through the classroom research project. Many students had to fix a few things when graphing pre-treatment; however, even after the classroom research project concluded, the graphs stayed better overall, with the only issue still presenting a problem being the title of the graph. In addition, not all students will buy into the peer assessment practice. I did find that some students were just trying to get through the peer assessment in class.
These students didn't have as many gains, and the peers they were working with didn't receive as much beneficial feedback. I need to see if there is a way to better involve and engage these students because, as Hatziapostolou & Paraskakis (2010) stated, students need to be engaged in the feedback process for it to be effective.

VALUE

Although the data is inconclusive, I have seen that for many students, self-assessment has helped them hone fairly straightforward "skill" areas, such as graphing and even writing a conclusion. Performing peer assessments helped them practice each skill and gain feedback from the student perspective, which led many of them to use the feedback to better their next assignment. I saw many students looking back into their notebooks for former feedback and rubrics, and I do think that, although many of the students didn't believe the notebook helped their grade (mostly because the notebooks were graded and theirs were not prepared), it did give them a place to store the feedback and to find it again, if they bothered to put together the notebook as required. Students also need to be guided in how to perform peer assessments; they did get confused a few times about where to put their feedback and what they needed to write (the rubric score and why they scored it that way).

The question now becomes: what should I do next year? Do I use self and peer evaluation? The answer is yes. Even though not all of the students benefited from the intervention, many of the students who did put effort into the reflections and evaluations saw some sort of increase in score, meaning an increase in learning and a bettering of their skills. With that in mind, I will continue to use this approach next year; however, I want to think about how I can better get the students to buy into the process so that more of them see a benefit.
Additionally, next year I need to build up the students' self-assessment and peer assessment skills more and make sure that they are confident in what I am asking for. The way I would like to build these skills is to provide some added instruction at the beginning of the year on how to score graphs, notebooks, and conclusions, using samples that all of the students grade together. That way the scoring might be more accurate and more consistent with my expectations. This would be beneficial, since I did have some students who weren't sure whether a paper met certain qualifications. Some students did ask me questions they had about grading with the rubric; however, I am sure that some students didn't speak up about not understanding and just put down a score for the student.

In the end, after the classroom research project concluded, I still see students turning to the rubrics and guides in their interactive science notebooks and commenting on how they want to make sure they did the graph the right way, because they know they didn't always graph correctly (knowledge gained from looking at their assignments multiple times critically). Many of the students now have a better understanding of what is expected and what they can do to better their work in the classroom. It has made many of them more reflective in practice, and the grade isn't the only thing they look at. I too have become more reflective in my ways of teaching and have been asking the students to take a second look at their assignments and find what is wrong. I have found extreme value in allowing the students to find their errors and make corrections instead of just marking something wrong on the paper. With the interactive science notebooks as I use them in class, along with my handy stamper, I have gotten to a point where students show me their work to be approved, with a stamp, or improved so they can receive a stamp.
Believe it or not, once there is a stamp involved, and the students know that the stamp means they received full credit, they put effort into making the revisions and finding where they made errors. The process comes full circle when you later find them asking their peers the same questions you had asked them, to help their peers figure out where they went wrong. It is an amazing thing to watch the students perform peer evaluation without realizing it, even when they complained about performing peer assessments in class.

REFERENCES CITED

Black, P., & Wiliam, D. (1998). Inside the black box: Raising standards through classroom assessment. Phi Delta Kappan, 80(2), 139-147.

Cauley, K. M., & McMillan, J. H. (2010). Formative assessment techniques to support student motivation and achievement. The Clearing House: A Journal of Educational Strategies, Issues and Ideas, 83(1), 1-6.

City Data Hackensack New Jersey. (n.d.). Retrieved April 4, 2016, from http://www.city-data.com/city/Hackensack-New-Jersey.html

Fluckinger, J., Vigil, Y. T. Y., Pasco, R., & Danielson, K. (2010). Formative feedback: Involving students as partners in assessment to enhance learning. College Teaching, 58(4), 136-184.

Hatziapostolou, T., & Paraskakis, I. (2010). Enhancing the impact of formative feedback on student learning through an online feedback system. Electronic Journal of e-Learning, 8(2), 111-122.

Heflebower, T., Hoegh, J. K., & Warrick, P. (2014). A school leader's guide to standards-based grading. IN: Marzano Research.

Lizzio, A., & Wilson, K. (2008). Feedback on assessment: Students' perceptions of quality and effectiveness. Assessment & Evaluation in Higher Education, 33(3), 263-275.

Marcarelli, K. (2010). Teaching science with interactive notebooks. Thousand Oaks, CA: Corwin Press.

Nicol, D. J., & Macfarlane-Dick, D. (2006). Formative assessment and self-regulated learning: A model and seven principles of good feedback practice. Studies in Higher Education, 31(2), 199-218.
O’Connor, K. (2002). How to grade for learning: Linking grades to standards (2nd ed.). Thousand Oaks, CA: Corwin Press.

Pintrich, P. R., & Zusho, A. (2002). Student motivation and self-regulated learning in the college classroom. In J. C. Smart & W. G. Tierney (Eds.), Higher education: Handbook of theory and research (Vol. XVII). New York: Agathon Press.

Stiggins, R. (2005). From formative assessment to assessment for learning: A path to success in standards-based schools. Phi Delta Kappan, 87(4), 324-328.

Stiggins, R., & DuFour, R. (2009). Maximizing the power of formative assessments. Phi Delta Kappan, 90(9), 640-644.

United States Census Bureau. (n.d.). Retrieved March 31, 2016, from http://www.census.gov/quickfacts/table/PST045215/3428680

VanHook, M. S. (2014). Consider this… Standards-based grading. Retrieved April 3, 2016, from http://www.msveducationalnetwork.com/educação/artigos-e-recursos-para-educadores/

Young, J. (2003). Science interactive notebooks in the classroom. Science Scope, 26(4), 44-57.

APPENDICES

APPENDIX A

INTERACTIVE SCIENCE NOTEBOOK GRADING RUBRIC

100%: Notebook contents are neatly completed; all pages are numbered, titled, and dated. Right-side/left-side topics are correct, and contents are organized according to class guidelines. Table of Contents reflects ALL entries to date. Right-side notes go BEYOND basic requirements.

90%: Notebook contents are neatly completed; all pages are numbered, titled, and dated. Right-side/left-side topics are correct, and contents are organized according to class guidelines. Table of Contents reflects all entries to date. Right-side notes largely MEET requirements, and some go beyond.

85%: Notebook contents are MOSTLY NEAT and complete (at least 90%); pages are numbered, titled, and dated. Right-side/left-side topics are correct and organized, with no more than 2 assignments incorrectly placed. Table of Contents reflects 90% of all entries to date. Right-side notes meet requirements.
75%: Notebook contents are legible and complete (at least 80%); pages are numbered, titled, and dated. Right-side/left-side topics are correct and organized, with no more than 4 assignments incorrectly placed. Table of Contents reflects at least 80% of all entries to date. Right-side notes nearly meet minimum requirements.

65%: Notebook contents are sloppy or incomplete (50%); many pages are not numbered, titled, or dated. Right-side/left-side placement is inconsistent, and contents are unorganized, with more than 5 assignments incorrectly placed. Table of Contents shows limited attempts at keeping entries up to date. Right-side contents are incomplete.

55%: Notebook turned in but too incomplete to score well. The majority of pages are missing or incomplete. Right-side contents are incomplete or missing.

0%: Notebook not turned in; no evidence of work done; or notebook has inappropriate content or words written on the cover or pages.

APPENDIX B

INTERACTIVE NOTEBOOK PEER EVALUATION FORM & GRADING

Columns: Date | Pages | Stamps | Rubric Score | Peer Initials | Teacher Initials (followed by seven blank rows for recording entries).

APPENDIX C

GRAPHING RUBRIC

Levels: Minimal (1), Basic (2), Proficient (3). Descriptors for each criterion are listed from lowest to highest level.

Title: There is no title; Title is present but does not describe the data; Title describes the data by including how the independent and dependent variables are related.

Data Points: Not all data points are accurate; Data points are accurate but not connected; Data points are accurate and connected by a line.

Independent and Dependent Variables: Independent and dependent variables are not on the correct axis (Minimal); Independent and dependent variables are on the correct axis (Proficient; no Basic descriptor for this criterion).

X-Axis Label: Axis is not labeled or has an incorrect label; Axis has a correct label but no units provided; Axis is labeled correctly with units present (when applicable).

Y-Axis Label: Axis is not labeled or has an incorrect label; Axis has a correct label but no units provided; Axis is labeled correctly with units present (when applicable).

Even Intervals: Uneven intervals used; One axis has even intervals; Both axes have correct and even intervals.

Key (when applicable): Key is not present but needed; Key is present but incomplete or unreadable; Key is present and accurately displays information.

APPENDIX D

LAB CONCLUSION RUBRIC

Levels: Minimal (1 point), Basic (2 points), Proficient (3 points), Advanced (4 points). Descriptors for each category are listed from lowest to highest level.

Title: Present but not focused; Describes the focus of the lab.

Purpose: Problem unclear; Clear but many details missing; Clear but some details missing; Clear with details listed.

Background: Needs to be relevant to the lab; Only partially relevant; Relevant with some detail; Relevant using proper vocabulary and details.

Hypothesis: Needs to be stated as a hypothesis; Needs to relate to the problem more directly; Directly related to the problem but needs background information; Directly related to the problem and has background information.

Relevant Materials List: Listed but not appropriate for the lab; Listed but needs to be more complete; Listed and complete but needs quantities; Listed, complete, and quantities included.

Independent and Dependent Variables: Both need to be identified; One identified, or both identified but mixed up; Properly identified.

Procedure: Needs a logical sequence; Logical sequence, but needs to prove or disprove and needs to be step by step; Logical sequence, able to prove or disprove, but not step by step; Logical sequence, able to prove or disprove, and step by step.

Data/Observations: No data included; One specific piece of data included; Limited data included from the lab; Includes specific data from the lab to explain results.

Analysis: Needs to explain trends and oddities in the data; Explains the trends and oddities in the data but needs to explain error sources; Explains the trends and oddities in the data and explains sources of error.
Real-Life Application: No explanation of what was learned; Explains what was learned but needs to relate it to your life; Explains how what was learned relates to your life.

Mechanics/Grammar: Many grammatical errors; Few grammatical errors; No grammatical errors; No grammatical errors and includes graphics (such as graphs and data charts).

APPENDIX E

FEELINGS ON FEEDBACK PRE AND POST TREATMENT SURVEY

SA = Strongly Agree, A = Agree, D = Disagree, SD = Strongly Disagree

1. Upon return of a graded assignment or test, the first thing I do is look at the grade. SA A D SD
2. I often use the feedback from my teacher to improve my future assignments. SA A D SD
3. When performing a peer assessment (or review), I always value the feedback given by my peers. SA A D SD
4. When provided with a rubric, I always self-assess using the rubric before handing in the assignment. SA A D SD
5. When I get a graded assignment returned to me, I know what I need to do to improve my work. SA A D SD
6. I want to learn in this class and am interested in biology. SA A D SD
7. Grading criteria are clear to me, and I understand how I can achieve higher grades. SA A D SD
8. Before I hand in an assignment, I can tell if I have done a good job. SA A D SD
9. My peers help me when I need help. SA A D SD
10. Helping others means giving them my work to let them copy the information down. SA A D SD
11. I like performing peer assessments (peer reviews). SA A D SD
12. If I know I can redo an assignment, I still put in full effort and try my best on that assignment. SA A D SD
13. I am able to take constructive criticism and better myself with it. SA A D SD
14. I care about how I do in school only because of grades. SA A D SD
15. If an assignment isn’t graded, I don’t have to do it or put in a lot of effort. SA A D SD

APPENDIX F

FEEDBACK AND SELF-REFLECTION INTERVIEW QUESTIONS

1) When you receive an assignment back, what do you do with it?
2) If a teacher gives you feedback on an assignment, what do you do with this information?
3) What would make you more likely to use feedback given by a teacher?
4) When peer reviewing, do you value the feedback given by your peers?
5) When provided, do you use a rubric to evaluate your own assignment before handing the assignment in?
6) How do you like the interactive science notebooks? Do you find them helpful? Why or why not?
7) Do you have anything else you want to add about your feelings on feedback and self-reflection?

APPENDIX G

INTERACTIVE SCIENCE NOTEBOOK SURVEY

1) I use an interactive notebook in the following classes: a. English b. Math c. Science d. History
2) This is the first time I have used an interactive notebook in school. a. Yes b. No
3) The interactive notebooks have helped me maintain/improve my grades. a. Yes b. No
4) The interactive notebooks in my classes have helped me learn better. a. Agree b. Disagree
5) I would like to continue using interactive notebooks in the future. a. Yes b. No c. Maybe

APPENDIX H

MONTANA STATE UNIVERSITY INSTITUTIONAL REVIEW BOARD APPROVAL