VARIETY OF STRATEGIES USED TO TEACH DATA ANALYSIS AND CONCLUSION WRITING IN FRESHMEN PHYSICS

by Lori Shaaban

A professional paper submitted in partial fulfillment of the requirements for the degree of Master of Science in Science Education

MONTANA STATE UNIVERSITY
Bozeman, Montana
July 2019

© COPYRIGHT by Lori E. Shaaban 2019. All Rights Reserved.

ACKNOWLEDGEMENT

I would like to thank Liberty High School for the wonderful students available to educate and Hillsboro School District (HSD) for supporting me financially through many courses to earn a Master of Science in Science Education (MSSE). My professional learning community is composed of Deka Smith-Menard, Tom Olen, and Laury Rodriguez. We have all worked together in modifying the curriculum for our students and collaborated on modifications for next year. Diana Paterson at Montana State University (MSU) and Kristina Gantt at HSD were invaluable in signing up for courses and getting them paid for, while John Graves and Kate Solberg have been invaluable with advice as I developed my master's thesis. All of the instructors at MSU have presented data analysis in many unique ways that I can bring to the classroom. I would also like to thank Bret Davis, Deka Smith-Menard, and Ron Hellings for helping me present information more clearly. Lastly and most valuably, my husband and children have supported me by doing many more household chores.

TABLE OF CONTENTS

1. INTRODUCTION AND BACKGROUND
2. CONCEPTUAL FRAMEWORK
3. METHODOLOGY
4. DATA ANALYSIS
5. INTERPRETATION AND CONCLUSION
6. VALUE
REFERENCES CITED
APPENDICES
APPENDIX A: Informed Consent
APPENDIX B: Pre-Test (Conclusion)
APPENDIX C: Conclusion Rubrics and Sentence Frames
APPENDIX D: Modified Scaffolding for Conclusion
APPENDIX E: Conclusion Strips
APPENDIX F: L1.4 Circle Lab
APPENDIX G: Matching Game
APPENDIX H: Unit One Test (Conclusion)
APPENDIX I: Preliminary and Post Self-Assessment Survey
APPENDIX J: Check-up
APPENDIX K: Semester Exam (Conclusion)
APPENDIX L: Matching Math and Physics Terms on a Graph
APPENDIX M: Graphing with Pipe Cleaners
APPENDIX N: Graph to Text
APPENDIX O: Unit Five Test on Waves (Conclusion)
APPENDIX P: What Does the Number in the Equation Mean?
APPENDIX Q: Interview Questions

LIST OF TABLES

1. Rubric
2. Triangulation Matrix
3. Percentage of Students Who Showed Increased Performance on Each Assessment

LIST OF FIGURES

1. Distribution of Special Needs and Distribution of Race
2. Conclusion in the Patterns Based Physics Curriculum
3. Example Conclusion in the Patterns Based Physics Curriculum
4. Activities that Had Helped Students Analyze Data the Most
5. Survey on Confidence
6. Survey on Helpfulness to Understand the Physics Concepts
7. Survey on Importance for a Scientist
8. Comparison of Surveys to Performance
9. Preliminary Assessment and Unit One Performance
10. Overall Performance on Conclusion Writing
11. Average Scores on Writing a Claim Over the Year
12. Average Scores on Providing Evidence for a Claim Over the Year
13. Average Scores on Explaining Reasoning for a Claim Over the Year
14. Average Scores on Extending the Lab Over the Year

ABSTRACT

Due to our data-driven society, students should understand how to make sense of graphs and be able to apply them. Educators need to teach students how to analyze data, communicate that understanding, and pose new questions. The Next Generation Science Standards (NGSS) place a heavy importance on analyzing and interpreting data, constructing explanations, and engaging in argument from evidence due to an increasing need for these skills in the labor force. Two classes of freshmen physics learned techniques in reading, analyzing, and interpreting data to understand physics concepts. They were taught how to spot trends in data tables as well as graphs and used www.desmos.com to find line-of-best-fit equations. They tried to understand what the equation represented and why the phenomena occurred. Then, using their equation, they made a prediction and explained the reasons for their confidence in that prediction. Lastly, they thought of new experiments they could do based on this latest information and how businesses could use similar data. This is a modified version of the Claim-Evidence-Reasoning (CER) conclusion used in science classrooms.
Since many of the lab reports were done as a group, the action research assessed in this document was based not on students' experiments but on data provided on students' individual tests. Pre- and post-tests, surveys, interviews, and group discussions were reviewed. It was found that many students began the course with the ability to make a prediction based on an equation. They quickly figured out how to spot patterns in the data to make a claim. However, the most challenging tasks for students were explaining the phenomena and justifying the confidence in their prediction. Students did not say any one component was much harder or more important than another. Students were overconfident in their ability to explain their confidence scientifically throughout the year. In all, students found a variety of activities helpful as they continued to grow throughout the year.

INTRODUCTION AND BACKGROUND

Analyzing data is more important in this generation than ever before. "Seventy three percent of organizations have already or plan to invest in Big Data by 2016" (Marr, 2015, p.1). If one does not know how to read data in this era, they are at a disadvantage in the workplace and at home. Businessmen, engineers, scientists, technicians, and consumers use data daily to make decisions. This is the reason the Next Generation Science Standards (NGSS) encourage students to analyze and interpret data (Science Practices, NGSS, 2013). As a science teacher, it is my responsibility to educate students in making sense of information and how it relates to their lives.

Twenty years ago, many introductory physics labs were prescribed worksheets where the students modified various variables to see how they would affect the system (Robinson, 1997). While students did not have input in what they studied, they looked at various aspects of the problem and noticed trends in the data. Those students were not usually asked to find the mathematical equations, uncertainty, or how to make precise predictions based on the data.

The freshmen physics course at Liberty High School has been based on Patterns Based Physics (Hill, 2013). The program teaches students to find patterns between independent and dependent variables. Then students write full conclusions from the common physics phenomena. They model the data with equations, make predictions, and state their confidence level in their prediction (Hill, 2013). Since NGSS emphasizes student communication, conclusion writing has been an important emphasis in the freshmen course (NGSS, 2013). To analyze more than one scenario during a lab, a few groups will have one value for the control while others have a different value. During the final discussion, students observe trends that are common to all groups as well as how the control affected the result qualitatively before writing their own conclusion (Hill, 2013).

The research conducted did not focus on increasing the use of inquiry in the classroom, since there is a plethora of papers on the subject written over the last thirty years, nor did the study address writing procedures or data collection. Instead, the research focused on methods for teaching quantitative and qualitative data analysis to make predictions and understand the concepts. Data was collected on students' ability to state a claim, provide evidence, understand the equation and physics, make a prediction with confidence, and apply the results throughout the yearlong course.
Near the beginning and again at the end of the year, the data on ability was compared with students' self-reported confidence in and importance of each component of a conclusion. Added to the year-end survey, there was an opportunity to report which activities they felt helped them learn how to analyze data best.

This study took place in Hillsboro, a suburb of Portland, Oregon. There are approximately 1,560 students at Liberty High School in attendance for the 2018-19 school year. It is located in the suburbs next to high-technology facilities and farmland. The population is mixed income with a variety of backgrounds. The study was conducted in two freshmen physics classes where 8% needed extra help with basic material as they were either English Language Learners (ELL), special needs (SpEd), or had a 504 plan. Of all the students, 17% were provided more advanced materials for some work as they were labeled talented and gifted (TAG) by the district. Students who were in the Advancement Via Individual Determination (AVID) program were provided front-row seats and extra guidance due to the lack of an educated family member to assist them in schoolwork. The classes are a racial mix of 52% White, 34% Hispanic, 5% Asian, and another 9% other races. Teamwork is encouraged and taught throughout the course (Figure 1).

Figure 1. Distribution of special needs and distribution of race. The left graph shows the distribution of race while the right shows special needs (N=51).

As far as my background, I can share that I have been in the teaching profession for over eighteen years. Two-thirds of that time was spent teaching Algebra 1 to freshmen who struggled with relating graphs and tables to an application. The other third of my teaching career has been teaching a few other math courses along with freshmen physics, general physics, and AP Physics 1. I have observed through the course of my career that one of the biggest struggles for students has been being able to connect the mathematics and the physics. This is often the case in typical problem solving and data analysis. Thus, I wanted to determine the most challenging components of writing conclusions based on data and what activities best helped students improve. The components reviewed were: claim, evidence, reasoning, and reflection. Students were graded on the claim they made based on the relationship between two variables on a test with data provided. The three ways they were graded on supporting their claim with evidence were the shape of the line of best fit, the data, and the equation. Explaining reasoning was graded using three components as well: an explanation of why the phenomenon occurred, a prediction using this phenomenon, and what affected their confidence level in that prediction. To do this, the most challenging components were identified and activities were developed to assist students in those areas. Their progress was monitored throughout the year and activities were continually added to increase student success.

CONCEPTUAL FRAMEWORK

Importance

Teachers should explore new ways to teach data analysis as it is increasingly important for the population to understand. The educational community tests data analysis with rigor due to its increasing importance in society.
In fact, the American College Testing (ACT) science section is more focused on students' ability to analyze scientific information than on indicating scientific knowledge. In order to be successful on the data analysis portion of the ACT, students need a basic knowledge of the content and an exceptional ability to interpret data (ACT, 2018). The current trend in educating and testing our youth in science is no longer solely about facts, figures, and calculations. The Oregon Department of Education (ODE) recently posted a new template and scoring guide for scientific inquiry designed by the North Clackamas School District. The data and analysis portions require students to revisit the reason for the lab, use scientific knowledge to indicate trends, explain the science behind the results, review data collection for errors, and extend their thinking to new areas of investigation (ODE, 2018). Also, the College Board abandoned the Physics B exam, which tested students' ability to manipulate numbers, in favor of the AP Physics 1 exam. The new exam emphasizes students' ability to make scientific arguments. The sixth science practice of the AP Physics 1 curriculum asks students to compose an argument with three components: claim, evidence, and reasoning (CER) (College Board, 2016).

Data Analysis Defined

A definition of data analysis should be clarified before deciding how to teach the topic. The process of analysis is breaking something large into smaller pieces for better understanding. Therefore, when a scientist takes raw data and turns it into practical information which people can use to make decisions, this is data analysis. This process is done to test hypotheses or to answer general questions (Judd & McClelland, 1989). Data analysis should be looked at through two different lenses. One is quantitative, where the researcher uses numerical values and statistics to compare differences and similarities. The second is qualitative, which describes what is happening in the situation as one would tell a story (Texas Higher Education Coordinating Board, 2017).

To study data analysis and interpretation, the essential components were broken down by many educational committees: the Next Generation Science Standards (NGSS), the American College Testing agency (ACT), the Oregon Department of Education (ODE), and the College Board. To provide the most comprehensive program, one should synthesize all the methods. The NGSS determined the important science and engineering practices students should know after passing each grade level. While the fourth NGSS practice below is named Analyzing and Interpreting Data, it is not the only practice discussing the results of the data. The last five practices of NGSS were included in this study since they have an integral role in students sharing their findings with others. The eight practices of science and engineering that the Framework identifies as essential for all students to learn and describes in detail are:

1. Asking questions (for science) and defining problems (for engineering)
2. Developing and using models
3. Planning and carrying out investigations
4. Analyzing and interpreting data
5. Using mathematics and computational thinking
6. Constructing explanations (for science) and designing solutions (for engineering)
7. Engaging in argument from evidence
8. Obtaining, evaluating, and communicating information (NGSS, 2013, p.1)

The ODE and AP Physics 1 science practices add "refining thoughts and extending the situation" once students understand the data.
This was incorporated into the reflection component of student conclusions in this capstone research as well. The ACT assesses students' ability to read graphs; therefore, ACT practice questions were sprinkled throughout the course, as they provide students with opportunities to use context clues to understand the patterns.

When analyzing data, scientists find, and use, a mathematical model that best represents the graph. With this equation, they explain the science behind the experiment, make a prediction, and possibly optimize a design. To give validity to their prediction, they use the graph and data table as evidence to support this result. Science students use these practices as well and should learn to persuade others by demonstrating their reasoning.

Current Challenges in Physics Classrooms

Physics textbooks have done a good job explaining how to solve problems and explaining content. They even provide many problems involving interpreting graphs. An analysis of 10th- through 12th-grade physics textbooks in Yemen, however, revealed that the books for 10th graders lacked an emphasis on teaching graphical analysis. Instead, they focused on experimenting and defining the terms (Aziz, 2010). Throughout the process of analyzing data and interpreting results, students have two main struggles that need to be taught with or without textbook support: making sense of the data and explaining the results in a scientific, yet meaningful way.

Many textbooks do a poor job explaining how to make sense of data from experiments. Students tend to struggle reading graphs. Even college students struggle to interpret the best-fit line and connect it to scientific concepts (Nixon, 2016). Some of the students in Nixon's study thought the line of best fit was a way to average the values taken to minimize error. Many students generalized the line of best fit as the line indicating the pattern between their independent and dependent variables but did not explain what slope meant in physical terms. When taking a test, only seven percent of students in Nixon's study indicated what the slope represented (including the units) and then explained what the term meant (N=28). They also made the mistake of finding a data point when they should have found a range if they understood the relationship between the variables.

High school students have similar difficulties. Gaea (1990) found three categories students confuse when reading and interpreting graphs: interval/point, slope/height, and iconic. Interval/point confusion is where students find one point instead of an interval (Gaea, 1990). For example, a student asked to find when the velocity is positive may respond with one time rather than a time interval. Also, when given a velocity vs. time graph, students should know the area under the curve represents displacement and the slope at a particular instant represents acceleration. They should also know that a point represents the velocity at that moment. Eshach (2014) includes a few more struggles dealing with reading graphs: comparing two graphs and matching narratives to graphs. Freshmen in our courses are taught all of the above, including maximum and minimum, except for finding the area under the curve.
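As a minimal illustration of these graph-reading distinctions (not taken from the study or its appendices), the sketch below uses an invented velocity-versus-time data set to show that the slope of the curve gives acceleration, the area under it gives displacement, and "when is the velocity positive?" is answered with an interval rather than a single point.

```python
# Hypothetical velocity-vs-time data; the curve and numbers are invented for illustration.
import numpy as np

t = np.linspace(0, 4, 41)        # time (s)
v = 6 * t - 2 * t**2             # velocity (m/s) for an assumed example motion

acceleration = np.gradient(v, t) # slope of v vs. t at each sample (m/s^2)
displacement = np.trapz(v, t)    # area under the curve from 0 s to 4 s (m)
positive = t[v > 0]              # the interval, not a single point, where v > 0

print(f"acceleration at t = 1 s: {acceleration[10]:.1f} m/s^2")
print(f"displacement over 0-4 s: {displacement:.1f} m")
print(f"velocity is positive from about t = {positive.min():.1f} s to t = {positive.max():.1f} s")
```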
A comparison between reading mathematical graphs and kinematic graphs showed that "Mathematics and physics questions requiring the same graph operations may have different appearances, which retard transfer of existing mathematics knowledge to kinematics" (Phage, 2017, p. 209). In 2016, Ivanjek, Susac, Maja, and Andrasevic analyzed students reading and interpreting three domains of graphs: mathematics, physics, and another context. Many students tried to use equations when reading physics-related graphs, even though equations were not required. On the other hand, in many cases they were able to analyze situations in which they had no experience. Having seen an equation for a similar physics graph may have led them to believe it was an analogous situation. An example of this was thinking a line going up represented a hill, or that slope always represented velocity, when the axes did not indicate this. These findings coincide with those of Gaea and Eshach. In addition to these issues, high school freshmen have tended to find the slope of a graph by counting squares instead of reading the values the axes represent. This knowledge is from my 17 years of experience teaching freshmen algebra. Many of the studies discussed were conducted with college freshmen, so it was interesting to see how many similarities they have with high school freshmen.

There are a couple of components of data analysis that tend to get marginalized by students. One is stating the certainty of their results (Bowen & Bartley, 2014). This helps give validity to the claim without overstating it. If the data is spread out, the uncertainty will increase. In the freshmen curriculum at Liberty, certainty in a prediction from the best-fit line is labeled "confidence" (Hill, 2013). When students find a difference in data, they should be able to distinguish whether it is statistically different or substantively different. In other words, students are contemplating the boundaries of their mathematical model.

Teaching Strategies

Much research has been done regarding reading kinematic graphs; however, it is necessary to learn to read all graphs, not just the traditional kinematic graphs. This is analogous to encountering new text. Wolff-Michael Roth explored the relationship between teaching students how to read graphs versus texts and found many parallels. One example is that when a student first reads a text or graph, they notice the topic and structure (Roth, 2002). If they are familiar with various topics and structures, they can look for signs such as conjunction words in a sentence to see cause and effect. In a graph, they see the curve of best fit and then look at the axes to see what the inflections in the graph represent. If they are not familiar with the structure or the topic, they start breaking it up piece by piece. Good readers and scientists do this naturally, but the process of reading each element of a graph and deciphering its meaning should be taught to maximize efficient graph reading. Also, the more a person has seen a style, the easier it becomes to read, as long as they take precautions to verify the vocabulary.

Once students know how to analyze and interpret graphs, they can communicate understanding of the phenomena more clearly. A claim-evidence-reasoning approach in the form of a conclusion is common, while student talk and teacher-led discussions are beneficial as well.
The challenge comes when the students are tired from the lab and are asked to write a CER conclusion; they see the conclusion as daunting busywork. In order to address this in my classes, a few methods were tried last year. One was to write the conclusion the next day; another was to color code an example lab conclusion to increase organizational structure (Figure 2).

Figure 2. Conclusion in the Patterns Based Physics Curriculum.

Skeleton of a Conclusion:
Claim: Clearly state your conclusion.
Evidence: Explain how the data you cite supports your claim.
Mathematical Model with Reasoning: Communicate the mathematical model that behaves the same as the system you investigated. Along with the model you need to describe your reasoning about what the # represents in the real world, why the pattern makes sense, and the generalized equation in words instead of letters.
Prediction: Communicate how the system you investigated will behave for the scenario presented at the beginning of the experiment.
Justification: Explain your thinking for your confidence in using your data to predict the future behavior of the system.
Research Extension Question: Use your experience with this investigation to create a thoughtful and interesting follow-up experiment.

In the conclusion, students state the answer to their original question. This can be an explanation of the reason why certain variables behave the way they do, or an explanation of how the variables are related (Figure 3). Students in Patterns Based Physics typically state the relationship between the x- and y-variables as one of the following: horizontal, proportional, linear, quadratic, or inverse (Hill, 2013). Often students get together to explain the results in a "Board Meeting" before writing the conclusion. This provides collaboration and processing time for each student. This approach does very well at teaching students how to find patterns in the data.

Figure 3. Example conclusion in the Patterns Based Physics Curriculum.

The AP curriculum as well as the Patterns Based Physics curriculum have students state the evidence. The student analyzes the data they collected by using it to back up their claim. The AP Physics 1 Workbook provides an example of a student backing up their claim verbally: "We analyzed the position versus time. As you can see in our graph, the slope of the tangent at various points in the curve does not change uniformly" (College Board, 2016, p. 68). The AP Physics 1 science practices have students spend more time thinking about the mathematical and conceptual physics involved, while the freshmen patterns focus more on the mathematical relationships.

Barstow, Fazio, Schunn, and Ashley (2017) conducted research regarding how to teach scientific writing to psychology majors. They provided computer-aided diagramming of the components for the students' research papers, which was composed of color-coded sections in which the students created a web on the computer to show their train of thought. The results were phenomenal and could be brought to a high school level, as only a sentence or two was on each section. Barstow's students then turned these diagrams into conclusions. Students who used diagramming methods had at least 50% more mentions of population, context, comparisons, sample size, the design, confounds, and evaluation per paper than those who did not use the diagramming method.

Another way to teach data analysis is through analogies, extreme cases, and thought experiments (Stephens & Clement, 2010). Analogies can be used to tie concepts together.
The analogy does break down eventually but helps solidify the concept. Thought experiments are less well defined but can be used as students kinesthetically manipulate objects and imagine the forces.

In conclusion, when analyzing data, students struggle with interpreting physics-related graphs and explaining their results. They need to be specifically taught the signs on a graph and how to read them, as well as when to find the slope or read a single point. Once taught, students need practice creating narratives and comparing graphs. After students have read the graph, they make a claim about the relationship between the variables. Then they cite evidence from the graph or table and provide an equation with explanations. They give a logical reason for their confidence and for the physics involved with the phenomena. To carry out this difficult task, students can make a flow chart diagram, have a group discussion, or use color-coding to organize their thoughts. Lastly, through these processes, students pose more questions that extend their learning beyond the classroom. This leads to quality CER-based conclusions that show their understanding of the phenomena they researched. Unfortunately, I did not discover diagramming with flow charts until mid-year, so that was not included in the research. Instead, many other methods were used to determine which ones had the greatest impact on helping students write scientific conclusions based on data.

METHODOLOGY

The study was conducted over the entire year, as the students were provided with a strong introduction in the first unit and various activities to improve their skills afterwards (Appendix A). While students did many activities throughout the year, the activities explained below emphasized data analysis and conclusion writing the most. This research only analyzed the part of each test where students analyzed a set of data and wrote a conclusion. Students took the Pre-Test the second week of school (Appendix B). The main focus of the first unit was conducting labs, recognizing patterns, and writing conclusions. The Unit One Test was in October along with the preliminary self-assessment survey. There were only a few activities conducted in the second unit on motion before the Check-up in early November. The third and fourth units provided opportunities for students to practice analyzing data, but the emphasis was not on conclusion writing. The Semester Exam in late January was the next time an assessment was provided on graphical analysis and conclusion writing. The fifth unit comes back to data analysis as students learn about wave behavior. In early May, the students took the Unit Five Test and the post self-assessment survey.

The research conducted for this study revealed challenging components of analyzing data and determined how well students responded to instruction in writing conclusions. All freshmen take the course, regardless of ability level. During the first weeks of school, students took a preliminary assessment (Appendix B). The preliminary self-assessment survey showed their understanding of what a conclusion entailed along with their ability to read and explain their thinking. It revealed that their conclusion writing was extremely poor. Therefore, students were provided a rubric for their lab reports with sentence frames, then a general outline of conclusion components (Appendices C and D). Students were expected to write a CERR (Claim-Evidence-Reasoning-Reflection) conclusion on all labs and selected assessments.
The collaborative labs were not used as assessments in the research since the analysis was based on individual ability. Since actual labs were not used, reflecting on errors and on how their ideas had changed since they began the experiment was not included in this research. A rubric for assessing CERR work was created for this research to monitor students' progress throughout the year, where each component has two possible points (Table 1). The data collection methods are in the Triangulation Matrix (Table 2).

Table 1
Rubric

Claim
1 point (example): Pattern is stated. (The graph is linear-proportional.)
2 points (example): Accurate complete sentence describing the claim. (The relationship between acceleration and change in velocity is proportional.)

Evidence: Shape
1 point: One description of shape. (The graph decreases, or there is no y-intercept.)
2 points: At least two descriptive features or higher understanding. (As the acceleration increases the change in velocity increases at a constant rate. When acceleration is zero, the change in velocity is zero.)

Evidence: Data Table
1 point: Notes the pattern in the table. (As … doubles, … halves.)
2 points: Also provides an example from the data. (As … doubles, … halves. For example, when … goes from … (units) to … (units), … decreases by half from … to ….)

Evidence: Equation
1 point: Equation written using physics symbols, not x and y. (v = #a, or "velocity equals # times acceleration.")
2 points: Also explains what the number might mean in the equation using units or conceptual understanding. (v = #a. The number is the same as the constant time it took the object to travel, thus the equation is most likely v = at.)

Reasoning: Why?
1 point: Basic understanding of why. (When … increases, … decreases.)
2 points: More depth and thought shown in explaining the phenomenon, possibly using units.

Reasoning: Prediction
1 point: Used the data table, graph, or equation to make a prediction but forgot the units or did not show work.
2 points: Work is shown on finding a prediction and units are shown.

Reasoning: Confidence
1 point: Used one factor in stating how confident they are in the conclusion. (The prediction is within the data range, or the line of best fit is through the center of all uncertainty boxes, for high confidence.)
2 points: Answer includes both whether or not the prediction is within the range and how well the data fits the curve. Possibly includes an understanding of why data far outside the data collection range will not be modeled by the equation.

Reflection: Extend
1 point: Provides a basic example extending the lab or providing a use outside the classroom. This may be one we mentioned in class.
2 points: Provides a useful or thoughtful lab or use of the data outside the classroom.

Table 2
Triangulation Matrix

Primary question: What are effective methods for teaching data analysis?

Secondary question 1: What are the most challenging components when analyzing data?
Measure ability: Pre-Test, Unit One Test, Check-up, Semester Test, Unit Five Test
Measure confidence: Preliminary Self-Assessment Survey, Post Self-Assessment Survey
Qualitative data: Interviews

Secondary question 2: How have students improved their ability/confidence to provide a scientific claim, scientific evidence, scientific reasoning, and a lab extension or application?
Measure ability: Pre-Test, Unit One Test, Check-up, Semester Test, Unit Five Test
Measure confidence: Preliminary Self-Assessment Survey, Post Self-Assessment Survey
Qualitative data: Interviews

Secondary question 3: Which activities helped students learn to analyze data?
Measure ability: Pre-Test, Unit One Test, Check-up, Semester Test, Unit Five Test
Measure confidence: Post Self-Assessment Survey
Qualitative data: Post-survey interviews
Timeline for Unit One
- www.desmos.com
- Lab 1: Reorder Conclusion; Confidence Table introduced
- Labs 2 & 3: Discussion groups
- Lab 4: Did not conduct the lab, only analyzed data
- Two matching games
- Unit One Test and Preliminary Self-Assessment Survey (October)

To find the equations for the graphs they found in labs, students used the online graphing program www.desmos.com. Most students are concurrently enrolled in Algebra 1; therefore, they find it reassuring to use desmos.com even when checking their equations of proportional lines. Students completed a total of four labs before the Unit One Test, in which they practiced working with situations that compared linear-horizontal, linear-proportional, inverse, and quadratic patterns. After each lab, students collaborated with each other, discussing their results before writing the conclusions. For the first lab, they only reordered slips of paper that had the conclusion written out (Appendix E). This method was chosen so they would think about the parts of the lab without the writing effort that so many first-year students have an aversion to. A confidence table was introduced to explain that higher confidence comes from making a prediction that lies inside the range of the data and from a curve of best fit that passes through the center of the data points. R-values were not discussed, but the level of confidence was a lead-in to more advanced statistical analysis.

For the second and third labs, they had discussion cards that provided generalized sentences like the ones they use in English class. The discussion groups had a manager who made sure all of the students shared their ideas. To accomplish this, each student wrote their name on a piece of paper. They would share an idea and give one paper to the manager, then wait until the rest of the group had shared before sharing again. Once everyone had shared, anyone could express their ideas.

In past years we have noticed many students seemed tired at the end of a lab; therefore, the fourth lab was turned into a data analysis situation where they figured out what the lab was about from the data and focused on writing the conclusion instead of collecting the data themselves (Appendix F). Thinking this might be overwhelming for the lower students, we allowed students to actually conduct the lab if they chose. However, just as predicted, those students did not put much effort into writing the conclusion since they had more work to do in collecting the data, graphing, and writing the conclusion.

Just before the Unit One Test, students played two matching games. The first game showed linear-proportional, linear-horizontal, linear, quadratic, and inverse patterns in various forms on each card (Appendix G). The forms were graph, table, equation, and situation. Pairs of students organized the cards in whatever way they felt was appropriate. They were encouraged to explain their reasoning with their partners as they debated the organizational structure. Afterwards, they graduated to matching piecewise distance versus time situations, tables, graphs, and equations. A t-test was conducted to compare the preliminary assessment and the Unit One Test along with each subsequent assessment (Appendix H).
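The paper does not specify which software or t-test variant was used for these comparisons. As a rough, hedged sketch of the kind of paired comparison described, the snippet below applies a repeated-measures t-test to two invented lists of rubric scores standing in for the same students' results on two assessments.

```python
# Sketch only: the score lists are invented placeholders, not the study's data,
# and the paper does not state which statistical software or test variant was used.
from scipy import stats

pre_test = [2, 4, 3, 5, 1, 6, 4, 3, 2, 5]     # hypothetical total rubric scores (0-16) on the Pre-Test
unit_one = [7, 9, 6, 10, 5, 11, 8, 7, 6, 9]   # the same students' hypothetical Unit One Test scores

t_stat, p_value = stats.ttest_rel(pre_test, unit_one)  # paired t-test on the same students
print(f"t = {t_stat:.2f}, p = {p_value:.4f}")          # p < 0.01 would be reported as significant
```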
The student preliminary self-assessment survey was administered just after the Unit One Test (Appendix I). The survey asked students to rate each component of conclusion writing on confidence, helpfulness, usefulness to scientists, difficulty, and interest. A mark of four represented very confident and useful, and a one represented no ability or usefulness. There was space for students to comment as well.

Timeline for Unit Two (before the Check-up)
- Lecture on graph reading
- Group analysis with discussion cards
- Check-up (early November)

The second unit was on the kinematics of motion. Since students struggled to understand a situation by reading the graphs in Unit One, a lecture was conducted on reading the x- and y-axes, y-intercepts, and slopes, as well as using units to understand the number in the equation generated from a graph. Data gathered during Unit One also revealed that students needed guidance to provide sufficient reasoning for their claims, including using the shape and data to support the general pattern/relationship. They also did not explain what the components of the equation meant. To assist them in their detailed explanations, a game was created where they worked in groups to come up with the best wording for each component of the conclusion. Each group was formed with mixed ability so the struggling students could listen to the wording the more advanced students used. A person from each group read the claim the group had written on a medium-sized whiteboard, then another student shared the answer with the class. Two points were awarded for a claim that explained the situation completely and concisely. One point was awarded for part of the claim. We then moved on to the next component of a conclusion, where a different person read the whiteboard. As the game progressed, the class determined the number of points awarded to each team. All students had a turn sharing what their team wrote on their whiteboard. This group analysis activity allowed students to listen and share using scientific language.

Students enjoyed the competition and were able to demonstrate their learning the next day during a conclusion "Check-up" (Appendix J). The name "Check-up" came from the notion of checking up on the students' progress without the fear of losing points on a test. They looked at a time vs. velocity graph showing a person traveling to the same location using different modes of transportation. Afterwards, they wrote a conclusion to determine the relationship between speed and how long it took to arrive at the same location. This was not a traditional velocity vs. time graph, nor had they done any lab corresponding to the same independent and dependent variables. The students worked on it independently, then talked it over with me as they turned it in. This allowed independent time with each student and the opportunity to give verbal answers in their conclusion.

Timeline for Units Three and Four
- Various labs, not conclusion focused
- Semester Exam (late January)

Between the Check-up and the Semester Exam, little work was done on conclusion writing. Although labs were conducted, few conclusions were written down. One of the Semester Exam questions tested the students' retention on analyzing and interpreting data (Appendix K).
Timeline for Unit Five
- Relating math and physics from a graph
- Creating graphs using pipe cleaners
- Comparing how they read texts and graphs
- Worksheet on using units to understand the coefficient in an equation
- Unit Five Test, Post Self-Assessment Survey, and interviews (early May)

On the Semester Exam, students still seemed to struggle translating the graph reading they learned in mathematics to physics, so right before Unit Five, between energy and waves, they took a team test where they matched points on a damped oscillating trampoline system using math terms and physics terms. This test also gave students an opportunity to use their own phrasing to explain the points to make more connections to the concepts. The result of this activity was a better connection between their physics and mathematics courses and an introduction to wave motion in Unit Five (Appendix L). To understand that high-frequency graphs take more energy to create, students made displacement vs. time graphs of high- and low-frequency waves with pipe cleaners (Appendix M).

Before completing their final assessment of graphical analysis by taking their Unit Five Waves Test, students learned there are many similarities between reading text and reading graphs (Appendix N). In groups, students produced their own strategies for reading graphs, then text. Afterwards, they found strategies that were common to both. Students shared which they found easier to read: graphs or texts. I assigned students to take the situation they found harder and make it into the one they found easier. For example, a student who found it easier to read text read an example graph and turned it into a story, and vice versa. One last effort was made to help students use the units on the axes of graphs to determine the units of the slope and what they mean. The worksheet was overwhelming for most students in the first period, so it was only offered as extra credit in the second period. Then they took the Unit Five Test (Appendix O).

A post self-assessment survey was conducted after the Unit Five Test along with interviews with ten students. On the extension to the survey, students chose and shared the four activities that taught them to analyze data most effectively. To avoid overwhelming the students, activities from the year were selected that seemed to impact their understanding of data analysis the most. Six activities were conducted before the Unit One Test, two before the Check-up, none before the Semester Test, and four before the Unit Five Test. Interviews were conducted to dig deeper into why certain activities helped more, what students were still confused about, how data analysis could be taught even better next year, and what they improved most on in physics this year. Interviews were conducted during a group activity. Students were selected to be interviewed based on the following criteria: variety of ability level and variety of outward enthusiasm in the class, as well as students who had not had enough one-on-one attention in the course. The total number of students interviewed was nine. Interviews each lasted approximately ten minutes and were conducted one-on-one. The questions can be found in Appendix Q.

DATA ANALYSIS

Survey Results

Self-assessment surveys were given after the Unit One and Unit Five Tests, and nine interviews were formally conducted the day after the final survey. While there were a combined 66 students in my two classes, only students present for all surveys and assessments were included in the analysis (N=51).
Not included in the report are questions on the survey that ranked the hardest, easiest, and most important components of a conclusion, as many freshmen did not follow directions on the survey and completed these sections incorrectly. What is included are rankings for each component of a conclusion on confidence, how much the section helped them understand the physics concept, and importance to a scientist. The Post Self-Assessment Survey also included a list of activities where graphical analysis and conclusion writing were central, and students chose the activities they felt were most influential in helping them learn how to analyze data.

There were five types of relevant activities just before the Unit One Test, two activities before the Check-up, none before the Semester Exam, and four before the final Unit Five Test (Figure 4). The activities students stated as helping them the most were in the first unit; this is also where the greatest growth occurred. Note that there were no activities specifically designed to help students with conclusion writing between the Check-up and the Semester Exam, although students conducted some labs and wrote conclusions. Less than 20% of students said the conversation cards and re-ordering a conclusion were among the more helpful activities.

Figure 4. Activities that had helped students analyze data the most. Appendix reference in parentheses.

Students' average confidence in their abilities increased between the Unit One Test and the Waves Test for all categories on the Likert self-assessment surveys (Figure 5). Since the categories all increased similarly, I calculated the overall averages instead of individual components. The mean confidence level after the Unit One Test was 2.7 with a standard deviation of 0.1, while the Unit Five Test confidence was 3.1 with a standard deviation of 0.1.

Figure 5. Survey on confidence. No clue how to write the section = 1. Definitely know how to write the section = 4.

The results were slightly higher for the survey on how much each component of the conclusion helped them understand the physics concept. The arithmetic mean was 2.8 with a standard deviation of 0.1 after the Unit One Test, and 3.1 with a standard deviation of 0.2 after the Unit Five Test (Figure 6). Once again, there were no significant gains in student perspective over the year in one category over another.

Figure 6. Survey on helpfulness to understand the physics concepts. Not at all = 1. It is definitely helpful = 4.

Additionally, students expressed how important each section was to become better scientists. There was even greater gain in appreciation of the importance of the components in a conclusion for a scientist than in the previous surveys on confidence and helpfulness. The students at the Unit One Test rated the importance an average of 2.5 with a standard deviation of 0.05 and at the end a 3.0 with a standard deviation of 0.06 (Figure 7). This is an overall gain of 20%. Over the year, students increased their appreciation for Desmos, understanding the pattern (claim), stating their confidence, and extending the lab the most. These all had gains of more than 24% over the year.
For every survey there was a gain in confidence, usefulness, and appreciation for all categories.

Figure 7. Survey on importance for a scientist. Not at all = 1. It is definitely important = 4.

The last analysis looked into whether or not student confidence, feeling of importance to them, and feeling of importance to scientists correlated with their improvement. There was little correlation between performance gains and student perception (Figure 8). Since the first survey was conducted after the Unit One Test, the Unit One Test was set as the baseline for comparison of the surveyed responses and performance increase. Most notable is the drop in students' ability to explain how confident they were in their prediction scientifically between the Unit One and Unit Five Tests, a drop of 8%.

Figure 8. Comparison of surveys to performance between the Unit One Test and Unit Five Test.

Student interviews echoed student surveys and abilities, as 55% of students interviewed stated the most challenging component of data analysis was understanding "what the missing value means in the equation" (Appendix Q). This ability would change a "writing equation" score from a one to a two. Of those interviewed, 11% said writing the conclusion was confusing for them, and when asked where they improved most this year in physics, one student said, "everything." Others mentioned graphing and paragraph writing. One said, "we did a lot of graphs at the beginning of the year. It confused me so much. Now I see it, and its easy."

General Trends

Five assessments revealed students' ability to write conclusions throughout the year (N=51). They earned a score between zero and two on each component, with a three given only in extreme exceptions (Table 1). At the beginning of the study, the student strengths were stating a claim and making predictions, with average scores of 0.57 and 0.92 respectively (Figure 9). All other areas of data analysis and interpretation had average scores lower than 0.4 at that time. By the Unit Five Test, the student strength was still stating a claim with an average score of 1.47, and student weaknesses were understanding why the phenomena occurred and expressing confidence in their prediction scientifically, with average scores of 0.90 and 0.47 respectively. Extending the data was weak as well, with an average of 1.04 by the Unit Five Test.

Figure 9. Comparing the performance on the Preliminary Assessment (white) and Unit Five Test (black).

Average growth for each main component of data analysis can be seen throughout the year (Figure 10). The performance gains overall appear to be great in the first unit and increase slightly throughout the year. To increase student understanding of expectations, the components of a conclusion were provided on all tests after the preliminary assessment. The average normalized gain between the Preliminary test and their best score was 77%. Additionally, if one looks at the gain between the Unit One Test and the Unit Five Test, the normalized gains were 16% overall.
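The paper does not state the formula behind these normalized gains; a common convention (a Hake-style normalized gain), which is assumed here rather than taken from the text, is

\[
\langle g \rangle = \frac{\text{post score} - \text{pre score}}{\text{maximum possible score} - \text{pre score}} .
\]

Under that assumption, a 77% gain would mean students closed, on average, about three-quarters of the distance between their preliminary score and a perfect score.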
Notably, the Semester Test scores were lower across the board, with a t-test showing the shape, reasoning why, and prediction scores all decreasing for the general population (p<0.01). Some students were still improving in most categories on the final test. The Check-up did not include explaining the reasoning for their prediction or extending the lab. There were a few students who refused to allocate effort to write conclusions on labs or tests but completed the Check-up. It was not only a stand-alone assessment, but shorter, and the grade for the course reflected how hard they tried instead of performance. While there appears to be no change between the overall Check-up and Semester Test performance, the Check-up had a total of 10 possible points and the Semester Test 16.

Figure 10. Overall performance on conclusion writing. Maximum score was 16 except for the Check-up, which had a maximum score of 10.

The table below shows the percentage of students whose scores increased on each assessment in order to reveal the activities that correlate with growth for each component of graphical analysis and conclusion writing. Greater than 20% of students increased in every category on the Unit One Test. From there, student gains fluctuated differently for each assessment and component of the conclusion. A t-test was conducted to see if there was a statistical difference in the performance results for each category; it is discussed under the corresponding category below (Table 3).

Table 3
Percentage of Students Who Showed Increased Performance on Each Assessment

Component               Unit One   Check-up   Sem. Test   Unit Five
Claim                      56         37          12          0
Evidence: Shape            62         62           8*         2
Evidence: Data Table       50         62           6          8
Evidence: Equation         31          8          10         19
Reasoning: Why?            31         48           8*         0
Reasoning: Prediction      21         N/A         19          6
Reasoning: Confidence      35         N/A         19         10
Extend                     48         N/A         25          8

Note. Italic font indicates statistical correlation on a t-test from the prior assessment. * Statistically fewer students improving.

Claim

Some of the early gains on conclusion writing came from stating the claim, with 56% of students increasing their score on the Unit One Test (Table 3). From then on, the gains in stating a claim did not increase as drastically, as many students had already shown mastery of the full two points by the Unit One Test. However, 37% of students showed gains on the Check-up and 12% made gains on the Semester Test. By the Unit Five Test, there were no longer any gains in stating the claim. The t-test found there was a significant difference in students' abilities to state a claim between the preliminary test and the Unit One Test as well as the Check-up (p<0.01). Not only were there fewer students making gains on stating their claim for the last two assessments, but the average score of 1.5 was 11% lower than their average score on the Check-up (Figure 11). Over 50% of students surveyed indicated www.desmos.com as helping them understand how to analyze data better (Figure 4). This program helped students see the pattern and the form of the equation. They did a variety of labs with various patterns before the first unit test as well as graph matching that showed the pattern as an equation, graph, and situation. After the Unit One Test there was a small lecture on reading graphs, but no activity associated with it. While only 19% and 26% of students said that the conversation cards and the group analysis game helped them analyze data better, there was a 55% increase in students' scores as a whole, from an average of 1.1 to 1.7 points on writing a claim, between the Unit One Test and the Check-up.
These were the only activities focused on conclusion writing between the assessments. The Check-up presented a situation the students had not seen before, and one student's claim on the assessment read, "We discovered the relationship between speed and time for all ways of transportation is inverse." This quote was like those made during the group analysis, yet before doing the group analysis activity, a typical student response when asked for the claim was, "The relationship is inverse."

Figure 11. Average scores on writing a claim over the year.

Evidence

Students made great strides in increasing performance on providing evidence for the pattern they claimed using shape, data table, and equation (Table 3). The numbers of students who increased their score on the Unit One Test in using shape and the data table were 62% and 50% respectively. Sixty-two percent of students gained in each of those two categories on the Check-up as well. The gains in providing the proper equation with appropriate symbols or words were lower than the other two, with 31% of students increasing after the first test, then less than 10% growth on subsequent tests. The fewest students increased their score on the Semester Test, with less than 11% for each component. Students averaged a gain of at least 0.8 from the pre-assessment to the Unit Five Test on all three components of supplying evidence (Figure 12). Student scores statistically increased on using the shape to explain the reasoning for their pattern for each assessment except the last two tests (p<0.01). There was a statistical decrease in the number of students improving in using shape to provide evidence on the Semester Exam. The gains in using the data table as evidence are only statistically significant up to the Unit One Test. Interestingly, the significant gains for writing equations came on the first unit test and then the Unit Five Test, where the average scores increased 0.5 and 0.7 respectively (Figure 12). The drop was on the Semester Test, as it had average scores lower than the Check-up in all categories.

Figure 12. Average scores on providing evidence for a claim over the year.

Reasoning

As a whole, students increased their ability to reason throughout the year. Note that they were not asked to make a prediction or share their confidence in their prediction for the Check-up. More than 20% of students increased their reasoning skills on the Unit One Test (Table 3). Forty-eight percent of students increased their ability to explain why the phenomena occurred on the Check-up, with an average score of 1.6 out of two. The Check-up was handed to me personally, and if the reasoning section was not filled out, students had the opportunity to answer this question verbally. Many students, when asked, stated the reason correctly: "Since the person is traveling faster, they will be able to get there in less time." The Check-up did not have as many components of the conclusion either, as they were not asked for a prediction, their confidence, or an extension. Of all the students, 19% increased their ability to predict on the Semester Test. Students started with a general idea of how to make a prediction if they had an equation, with an average of 0.92 points on the preliminary assessment (Figure 13).
To get the prediction correct, the students either had to have an equation or estimate the result from the graph. Estimating from the graph was possible on the Semester Test and the Unit Five Test only. Lastly, most of the gains in explaining confidence occurred on the Unit One Test. This was right after they were provided a table on how to determine their confidence. While the table was not brought out again, students did not lose their ability to share their confidence, nor did they improve. The t-test revealed a significant difference between the Pre-Test and the Unit One Test for explaining why and sharing their confidence only (p<0.01). There was no significant difference between the Pre-Test and Unit One Test in predicting an outcome; recall this was their highest-scoring component on the Pre-Test.

Figure 13. Average scores on explaining reasoning for a claim over the year.

Extend

Since the assessments were not on the actual labs, reflections did not include error analysis, but students did brainstorm a lab extension or a use for someone outside the classroom. There was a statistically significant difference between the Pre-Test and the Unit One Test only (p<0.01). Once students knew through the scaffolding that they were expected to provide an extension, the average score increased to above 0.94 points (Figure 14). Twenty-five percent and 8% of the students increased their scores on the Semester Exam and the Unit Five Test, respectively (Table 3). Yet the average scores dropped on the Semester Test.

Figure 14. Average scores on extending the lab over the year.

INTERPRETATION AND CONCLUSION

Survey Results

Students' feelings about how important, difficult, or useful the various components were did not show any statistical trends with performance, yet all increased as a whole. Both surveys showed students rating their ability to express confidence highly, yet their actual ability was low. This was because the survey was given before the tests were handed back, so they were unaware of their actual abilities. Students' growth was not correlated with attitude, but more likely with how well the component was taught and the level of higher-order thinking involved. The exposure to data analysis throughout the year had a significant effect: students gained more confidence, learned that data analysis helps in understanding physics concepts, and realized that scientists use these same skills in research. The values on all surveys started high, most likely because students had just learned how to think scientifically prior to taking the Preliminary Self-Assessment Survey. The rest of the year was spent honing those skills in new circumstances. One of the main goals of freshmen physics was to learn how to think scientifically, and this study shows progress was made.

General Trends

The first unit in freshmen physics was focused on teaching conclusion writing; thus, it was not surprising to see the greatest gains near the beginning. When activities refined those skills throughout the year, students improved. The Check-up had impressive results, and a few factors may have helped. I told students the assessment was graded not on how well they did but on how hard they tried; my lowest-performing students wrote more on this assessment than on the other conclusion-writing assessments.
If a component was left blank, I asked them verbally for their answers; these verbal answers were correct just over half of the time. The assessment was shorter, as prediction, confidence, and extension were removed, and no other testing was done during this time. All other assessments were part of unit exams. Many students improved their scores from ones to twos by articulating themselves well, just as they had done the day before in the group analysis activity.

An obvious relapse occurred on the Semester Test. For the two units prior, students did not write conclusions but verbally discussed the results of labs in their groups and as a class with little structure. The Semester Exam also had the most questions in addition to this data analysis section. However, revisiting conclusion writing near the end of the year with new activities refreshed their minds on how to analyze data, as seen in the results of the Unit Five Test. These activities helped make analyzing data fun. Many students enjoyed seeing the relationship between analyzing text and analyzing a graph. They also liked creating waves with pipe cleaners to feel the graph and the relationships. Students improved their ability to analyze data and write conclusions over the course of the year in freshmen physics.

Claim

Many students were able to make a basic claim about the pattern in the data before the class began, especially when it was linear. Most of the gains occurred in the first few months of school, as they were asked weekly to identify the pattern given a graph. This skill sits at the lower levels of Bloom's Taxonomy and thus needed more rigor. They improved their pattern recognition throughout the year, along with their ability to state the pattern in a complete sentence, as this was reinforced monthly. The group analysis and conversation card activities seemed to have influenced the higher-quality claim statements on the Check-up, as students had recently had to state the claim in front of the class in a complete sentence. A few students continued to improve on the Semester Exam even though little lab work was done before this time; however, most students slid on the Semester Test. The low grades may have been due to less recent practice on using both variables when expressing relationships, tiredness on the long test, or confusing the claim with the prediction.

Evidence

Students' improvement in sharing their evidence was mixed. While students quickly caught on to identifying the equation based on shape, they would not rewrite that information in the conclusion. This is a valid point, as higher-level physics students would not explain why a graph is inverse based on the shape; they would look at the data table or the equation to differentiate it from an inverse square or possibly a shifted decaying function. Many participants also demonstrated their ability to look at how doubling the independent variable affected the dependent variable. Students practiced this often in the first few weeks of school. Over half of the students, 51%, showed an advanced understanding of using the actual values from the data table with appropriate units at some point on the assessments. During Unit One, students conducted many labs in which their data were entered into Desmos and they moved a slider to draw a line of best fit to find the evidence. The increase in equation writing is most likely due to this, as the students taking Algebra 1 had not encountered graphing by this time; instead, they cover it before the Semester Test.
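As a rough stand-in for the Desmos slider activity described above (Desmos itself was the tool used in class), the sketch below computes a least-squares line of best fit with NumPy, borrowing the cell-phone battery data from the pre-test in Appendix B as sample data; the code is illustrative only.

```python
# Illustrative least-squares fit, analogous to sliding a line of best fit in Desmos.
# Sample data are taken from the cell-phone battery pre-test in Appendix B.
import numpy as np

time_talked = np.array([0.0, 0.5, 1.0, 1.5, 2.0, 4.0])    # independent variable (hours)
battery_life = np.array([5.0, 4.5, 4.0, 3.5, 3.0, 1.0])   # dependent variable (hours)

slope, intercept = np.polyfit(time_talked, battery_life, deg=1)
print(f"battery_life = {slope:.2f} * time_talked + {intercept:.2f}")
# Prints roughly: battery_life = -1.00 * time_talked + 5.00
```

Rewriting that generic y = mx + b result in descriptive symbols, for example B = 5.0 - 1.0t (B and t are my own shorthand, not notation from the curriculum), is exactly the step many students found difficult.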
Unfortunately, by the end of the semester there was no corresponding increase in creating an equation from the graph that could be attributed to their math class. The hardest part of stating evidence for students was not finding the equation, but writing it using physics symbols rather than generic x and y values, as well as understanding the coefficient in the equation with appropriate units. Our current math program for algebra and geometry had students use x and y when graphing instead of occasionally asking them to convert to variables that simulate "real world" concepts. Once they caught on to using physics symbols, they had a hard time moving to the next level of explaining what the coefficient meant. While students were shown throughout the year how the units coincided with the equations, one last effort was made on the day of the survey for students to determine the units and then the variable of the constant in their equation. Since the majority of the first-period class looked at the sheet bewildered, it quickly became an extra-credit assignment. The sheet was helpful to those who were capable, as 25% of students said the worksheet helped (Figure 4). There is a high possibility that students said it helped simply because they had looked at it recently, as less than 25% of students turned it in. The Unit One Test was not a typical physics phenomenon but more common sense; the Check-up was a situation they could rationalize, the Semester Exam was about the universe expanding, and the Unit Five Test used an equation they were already familiar with. Therefore, of all the situations, the universe expanding should have had the lowest results, which it did, but not by much. The more the situation made common sense, the better students could understand the facets of the equation.

Reasoning

The most challenging components of data analysis for the students continue to be providing a reason for the phenomenon and stating their confidence in their prediction. Situations that were more abstract were hard for them to reason through. Students showed improvement in reasoning on the Check-up. This was right after the group analysis and conversation cards. In these groups they were pushed to explain their thoughts more clearly while being provided with scaffolded cards to use. Allowing students to explain their reasoning verbally most likely helped as well; they often said their reason with a questioning tone of voice, as they were hesitant to put it on paper. Regarding confidence, students were told verbally many times to include how well the data fit the claim and where their prediction fell compared to their data set; however, the chart was not shown more than once in class, so their sense of how to share their confidence fell back on how confident they felt instead of a scientific explanation. Their self-assessments of how well they could state their confidence were inaccurate throughout the year. Their growth was low as well. Apparently, I did not offer enough feedback to correct common unscientific answers such as, "I plugged the numbers into the equation, so I must be correct." Once students had the correct equation, they were good at predicting results. Most of the lower scores were due to students not having an equation to work with, the prediction not being easily read from the graph, or units not being included with the prediction. Without the units, they were doing a generalized math problem instead of applying their math to the physical world.
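The scoring issues described above (predicting from an equation, keeping the units, and judging confidence by whether the prediction falls inside the data range) can be illustrated with a short sketch. The model and the three-hour question reuse the Appendix B battery pre-test; the function names are my own, not part of the curriculum.

```python
# Illustrative prediction from a best-fit model, with units and a check on whether the
# prediction interpolates (inside the data range) or extrapolates (outside it).
# The linear model is the one implied by the Appendix B pre-test data table.

def predicted_battery_life(hours_talked):
    # battery life (hours) = 5.0 - 1.0 * time talked (hours)
    return 5.0 - 1.0 * hours_talked

def prediction_context(x, x_data):
    # Higher confidence when the prediction stays inside the measured data range.
    inside = min(x_data) <= x <= max(x_data)
    return "interpolation (higher confidence)" if inside else "extrapolation (lower confidence)"

time_talked_data = [0.0, 0.5, 1.0, 1.5, 2.0, 4.0]  # hours, from the pre-test data table
hours = 3.0                                         # the pre-test asks about three hours of talking

print(f"Predicted battery life: {predicted_battery_life(hours):.1f} hours")
print(f"Prediction is {prediction_context(hours, time_talked_data)}")
```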
Extend

Many students could not come up with a creative extension to the lab or a use for people outside the classroom. Some of this may have been due to not fully understanding what they were trying to do, and some to a lack of experience with industry. There were no formal activities throughout the year that practiced extending the lab.

VALUE

Survey Results

For students to succeed on the various parts of conclusion writing, I did not need to "sell" the importance any more than I did; I needed to help students better comprehend what was expected of them. Students then needed some scaffolding to help them reach those expectations. Components of CERR that require higher-level thinking, such as reasoning through the physics, expressing confidence, and extending their thinking, need more attention than they were given this past year. Since student confidence and student ability did not correlate, students need more detailed feedback on labs and assessments as to their progress toward each goal. As a result, students would know how close they are to meeting expectations in each category.

General Trends

While students improved their conclusion writing significantly in the first unit, they found it daunting. This past year students began with an example conclusion to re-order; next year, we will begin with a flow chart of targeted questions rather than a formal conclusion. The flow chart is the diagramming approach that had phenomenal results in the literature review. In Unit One they will be given an interesting phenomenon to discuss, then a conclusion to grade and improve. Hopefully, this will reduce the intimidation factor, make it more fun, and produce higher-quality analysis. The current format in data analysis narrows the focus too much around the pattern. Next year we will introduce more phenomena to think about and verbalize CERR problem solving as a class and in small groups. As the year progresses, students will write more conclusions instead of fewer as their writing skills improve through their freshmen English classes. With the quality of writing seen on the Check-up, it was apparent students learned from scaffolded discussions with their groups and from the competition to have the best explanations for their analysis. Conversation cards were useful for helping the quieter students and non-native English speakers explain their ideas like scientists; they will be used more throughout the year. However, the generalized cards will be modified to have fewer options to choose from. The reasoning behind their occasional use is twofold: many students found the cards hindered their creative process, yet it was found this year that injecting conclusion-writing skills intermittently throughout the year revitalized their thinking, as it is a spiral approach that builds on the learning done at the beginning of the year. Providing the Check-up as a non-graded assessment resulted in lower-performing students doing much better than on any other assessment, as I told them they would earn an 'A' just for trying. This served as a great form of formative assessment: students get immediate feedback and earn credit for creative thinking even when it is incorrect. The activity that compared analyzing a graph to analyzing text should be done earlier in the year, not in the sixth month as was done this past year. I asked my students when they felt the activity would be most useful, and an overwhelming majority put up two or three fingers, indicating the second or third month of school.
I agree: they need to have some experience reading graphs in physics and text in English before bringing the two together. Therefore, this activity would be best placed after the graph-matching cards (Appendix G), yet before the Unit Two Test on motion. Students found it very rewarding to be allowed to generate a form that was more familiar to them, such as text or a graph, from a form that was more challenging. They were able to form their own connections between the two. Hopefully, this helps the more visual, mathematical students perform better in English class, as well as helping the more literary students understand a graphical representation of the world. Next year we will coordinate with the English Department to use consistent vocabulary and skills in our course. This will also let us know when students are learning the different components of analyzing text so we can best place the activity.

Claim

While students made tremendous gains in the first few months in identifying patterns, not all students were expressing the pattern between two variables in a complete sentence. Students were able to quickly identify claims when asked the simple question, "What is the claim?", but did not explain it thoroughly. I can remedy this with more verbal expectations when communicating as a large group and in small groups, as well as with "multiple sentence frames," as one student suggested in the interviews, "to inspire more creativity." Saying a pattern is inverse does not show as complete an understanding as "The relationship between the frequency and the wavelength on a string is inverse." The numerous labs and lectures helped students identify the pattern, but the activities that forced them to express it in complete sentences to the entire class, such as the group analysis game, took students to the next level and will be done more often. Because conclusion writing is overwhelming for freshmen physics students, a few methods worked well. It needs to be a highly scaffolded process, with the scaffolding slowly released. Group activities where students can share their ideas, then present them to the class and receive feedback, increase students' knowledge of expectations and provide an opportunity to hear from and grow with other students.

Evidence

One of the challenges for students was to provide all three components of evidence for their pattern (shape, data table, and equation) at a high level. Next year we will not use shape as evidence for a pattern in a conclusion, as it is common scientific knowledge. A worksheet will be added in Unit One where students determine the pattern from the data and explain how doubling the input affects the output. They will be able to use the values from the graph or the data table. Then they will have a sentence frame to help write out their reasoning. They can make a prediction, then go to a lab station and test their prediction with confidence. Another short worksheet will provide an opportunity to practice moving from the x's and y's of mathematics to manipulating equations using more descriptive variables, then writing out the equation using symbols, then words. Using units to understand what is happening will be applied early in the year as well. When unit conversions are covered during Unit Four, a simplified version of the "What does the number in the equation mean?" worksheet will be provided to enhance graphical analysis skills (Appendix P).
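As one concrete illustration of pairing the slope with the units on the two axes, consider the cell-phone battery data from the pre-test in Appendix B (the worked example below is mine, not part of the planned worksheet):

\[
\text{slope} = \frac{\Delta(\text{battery life})}{\Delta(\text{time talked})}
= \frac{1.0\ \text{hr} - 5.0\ \text{hr}}{4.0\ \text{hr} - 0.0\ \text{hr}}
= -1.0\ \frac{\text{hours of battery}}{\text{hour talked}}
\]

Reading the units this way tells a student directly what the constant means: roughly one hour of battery life is lost for every hour spent talking.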
I should create an easier activity that correlates the slope with the units of the two axes to help them understand the constant. It will also help them understand the situation, with less math intimidation than the worksheet provided this year.

Reasoning

The most challenging components of analyzing and interpreting data were reasoning through the situation and expressing confidence in a scientific manner. To help students understand the connection between equations and graphs, students related their physics and math terms to describe situations (Appendices F, L, and P). In lieu of Appendix P, the freshmen teachers decided that next year they will use a more accessible worksheet, discussed above, where students practice using the units on the graph to determine what the slope represents. Also, to connect the mathematics, students need to see some examples of graphical data that they put into words, as they did in the text/graph activity. This gets students to comprehend what the axes actually mean. It will be done in the second unit so graphs have more meaning for the students, yet after a month of working with text in their English course. Students grasped the concept that high-energy waves have a higher frequency and a shorter period thanks to the pipe cleaner activity. This also helped drive home that frequency and period are inversely related. Other activities we will use to relate graphs and situations include graph matching with motion detectors as students and objects move toward and away from the sensor. Students will also witness discrepant events, void of equations, to help them understand how to explain phenomena using CERR. One discrepant event is to have students design a robotic mechanism that will kick three balls down the hallway. The energy cannot come from a store-bought motor or human movement. Students see how several types of mechanisms interact differently with the ping pong ball, tennis ball, and soccer ball. This is a lead-in to the motion, conservation of momentum, and impulse they will see later in the year. Meanwhile, they write their questions on the wall and discuss why the tennis ball moved the farthest in most cases and why the heavier apparatuses moved it the most. With more reasoning practice without equations, reasoning will become second nature for students and help them become lifelong learners.

Many students come into freshmen physics able to make a prediction based on an equation. Those who cannot need some lab station practice where they make predictions at their desks from provided data, then test their predictions at the station. Last year all predictions were made from student-generated data, so there was not enough time for about half of the students to improve their prediction scores. They should be assigned tasks where they look for the prediction on the graph and in the table as well as from the equation to ensure they can use all three methods. We will coordinate with the algebra teachers about teaching this skill with similar vocabulary. Their understanding of confidence needs quite a bit of work, as the average score was still 0.5 out of 2 by the end of the year. Students need to realize that a prediction is only as good as the data provided, and that the pattern may change when the prediction lies outside the domain of the data. I did not show them the confidence table often enough last year. Next year I will use it more often and make a game that uses the table. This will be an extension to the prediction activity described above.
To win the game, students will need to recognize that the equations/patterns have limits. Some of the situations line up with the equations, but others do not. As students get these challenging problems wrong, they change their thinking for the next problem; subsequently, they learn throughout the game to think like a scientist, not just a mathematician. By the end, students should know that they can be more confident when the best-fit line goes through the center of the data points and when the prediction is inside the data range. However, if the prediction is outside the data range, they need to use their reasoning abilities to determine how likely it is that the modeled equation is the best predictor of the result.

Extend

It was obvious many students struggled to think outside the box and extend the lab. In subsequent years, students will get more practice by brainstorming ideas for future labs and uses for their experiment on whiteboards in a group competition. I will also demo a lab, and then they will design a new one by varying the independent and dependent variables or controls. Lastly, we will watch a video of a person trying to accomplish a goal and discuss what lab would be beneficial for optimizing the situation.

Final Remarks

The findings in my report are similar to those from others in my literature review. Both Nixon (2016) and I saw students struggle to understand what the slope represented in a proportional pattern. Hopefully, making a connection between dimensional analysis and qualitative observations will help with this next year. Eshach (2014) and I both see the issues surrounding creating a storyline from text. Also, I found a distinct disconnect between students' math abilities and their understanding of the physical concepts, as did Phage (2017).

I am truly looking forward to next year. The goal is for students to have fun debating phenomena using CERR reasoning. A stronger connection between mathematics and physics earlier in the year, along with an increased emphasis on writing, will hopefully provide more structure for the students with lower math skills and bring them all to an appreciation for learning sooner. I will also separate the conclusion from the prediction for more manageable and concise assignments. Students will learn that their skills in English, math, and physics interrelate.

REFERENCES CITED

ACT. (2018, March). Science Test Description for the ACT. Retrieved from https://www.act.org/content/act/en/products-and-services/the-act/test-preparation/description-of-science-test.html

At-A-Glance School and District Profiles. Retrieved December 10, 2018, from https://www.ode.state.or.us/data/reportcard/reports.aspx

Aziz, M., and Zain, A. (2010). The Inclusion of Science Process Skills in Yemeni Secondary School Physics Textbooks. European Journal of Physics Education, 1. Retrieved from https://files.eric.ed.gov/fulltext/EJ1053817.pdf

Barstow, B., Fazio, L., Schunn, C., and Ashley, K. (2017). Experimental Evidence for Diagramming Benefits in Science Writing. Instructional Science, 45, 537-556. DOI: 10.1007/s11251-017-9415-3

Bowen, M., and Bartley, A. (2014). The Basics of Data Literacy. Arlington, VA: NSTA Press.

The College Board, Advanced Placement Program. (2016). AP Physics 1: Workshop Handbook and Resources.

Eshach, H. (2014). The Use of Intuitive Rules in Interpreting Students' Difficulties in Reading and Creating Kinematic Graphs. Canadian Journal of Physics, 92, 1-8. Retrieved from http://www.nrcresearchpress.com/doi/pdf/10.1139/cjp-2013-0369
Gaea, L., Zaslavsky, O., and Stein, M. (1990). Functions, Graphs and Graphing: Tasks, Learning and Teaching. Review of Educational Research, 60(1). Retrieved from http://www.jstor.org/stable/1170224

Hill, B. (2013, March). Patterns Based Physics. The Science Teacher.

Ivanjek, L., Susac, S., Maja, P., and Andrasevic, A. (2016). Student Reasoning About Graphs in Different Contexts. Physical Review Physics Education Research. Zagreb, Croatia.

Judd, C., and McClelland, G. (1989). Data Analysis: A Model Comparison Approach. New York: Harcourt Brace Jovanovich.

Marr, B. (2015, September 30). Big Data: 20 Mind-Boggling Facts Everyone Must Read. Forbes/Tech. Retrieved February 22, 2017, from https://www.forbes.com/sites/bernardmarr/2015/09/30/big-data-20-mind-boggling-facts-everyone-must-read/#23438fd717b1

McGrath, M., and Scanaill, C. (2014). Sensor Technologies: Healthcare, Wellness, and Environmental Applications. Apress Media, LLC.

NGSS Lead States. (2013). Next Generation Science Standards: For States, By States: Appendix F. Washington, DC: National Academies Press.

Nixon, R., Godfrey, T., Mayhew, N., and Wiegert, C. (2016, February). Graduate Student Construction and Interpretation of Graphs in Physics Lab Activities. Physical Review Physics Education Research, 12.

Oregon Department of Education (ODE). (2018). Local Performance Assessment Requirement.

Oregon Department of Education (ODE). (2011). Official Scientific Inquiry Scoring Guide. Retrieved March 2018, from http://www.oregon.gov/ode/educator-resources/essentialskills/ScoringGuides/science_inquiry_hs_eng.pdf

Phage, I., Lemmer, M., and Hitge, M. (2017). Probing Factors Influencing Students' Graph Comprehension Regarding Four Operations in Kinematic Graphs. African Journal of Research in Mathematics, Science and Technology Education, 22(2), 200-210. DOI: 10.1080/18117295.2017.1333751

Robinson, P. (1997). Conceptual Physics, Third Edition: Laboratory Manual. Addison-Wesley Publishing Company, Inc.

Roth, W.-M. (2002). Reading Graphs: Contributions to an Integrative Concept of Literacy. Journal of Curriculum Studies, 34(1), 1-24. DOI: 10.1080/00220270110068885

Stephens, A., and Clement, J. (2010, November). Documenting the Use of Scientific Reasoning Processes by High School Students. Physics Education Research, 6.

Texas Higher Education Coordinating Board. Making Decisions With Data. Retrieved 2017 from http://www.txprofdev.org/apps/datadecisions/node/50.html

APPENDICES

APPENDIX A

INFORMED CONSENT

APPENDIX B

PRE-TEST (CONCLUSION)

Physics Pretest    Name:_____________________ Period:_____________________ Date:_____________________

Useful Resources: on back

1. For the patterns (math class would call these functions) a-c, draw what each would look like in graphical form.
   a. Linear
   b. Quadratic
   c. Inverse

2. Observation – Jesus notices that the longer he talks on his cell phone, the lower his remaining battery life. He wonders, "How does the time talked affect the battery life?" He collects the following data about his phone.

   a. Graph the data set at a high school level, including uncertainty (error bars) and a sketch of a best-fit line.

      Time Talked (hours) +/- 0.1    Battery Life (hours) +/- 0.5
      0.0                            5.0
      0.5                            4.5
      1.0                            4.0
      1.5                            3.5
      2.0                            3.0
      4.0                            1.0

   b. Which of the patterns from question 1 seems to describe your best-fit line? _______________________
   c. What does the y-intercept tell you about the cell phone?
   d. What does the slope of the line mean?
   e. Write the equation that represents your best-fit line. ____________________
   f. Write a full conclusion for the graph. Include a prediction for the battery life if he talked for three hours, as well as your confidence in that prediction. (Claim/evidence/reasoning)
      __________________________________________________________________________
      __________________________________________________________________________
      __________________________________________________________________________
      __________________________________________________________________________
      __________________________________________________________________________
      __________________________________________________________________________

SP1 Developing and Using Models (patterns)     Q's: 1, 2b, 3a
SP4-8 Analysis & Conclusion                    Q's: 2-3
SP8 Communicating Information                  Q's: 4-5
SP5 Mathematical and Computational Thinking    Q's: 17, 18

APPENDIX C

CONCLUSION RUBRIC AND SENTENCE FRAMES

Student Claims, Evidence, Reasoning (CER) Grading Rubric

Write a paragraph in complete sentences; be sure to cover each section. The words in highlighted italics are sentence frames to help you. Unless you have an academic reason for not using your own words, the highest grade you can get using ONLY the sentence frames is a B.

1. Claim: Write one sentence stating the lab results as they relate to the predicted pattern, or stating your scientific opinion as to how the independent and the dependent variables relate to each other. We discovered that for a ______________ there is a ____ relationship between ____ and _____. Be sure that you talk about BOTH of the variables!! (4 points)

2. Evidence: 1-3 sentences. Report the data (graphs, data, trends, or other analysis) that you obtained in the lab. It should be consistent with your claim. Tell how your evidence either supports or does not support your claim.
   A. What specific observations did we see in the lab? Use numbers from your data table and/or graph. My data shows that when I ____________ (what change did you make), _________________ happened. When I doubled the (IV) ___ from __ to ___, the (DV) _____ from ___ to __.
   B. Mathematical Model: If you have an equation, describe in words the relationship between the D.V. and I.V. by describing the equation, symbols, and constant. Our mathematical model was ________. The # represents _____.
   (6 points)

3. Reasoning/Prediction: 3-5 sentences.
   A. Connecting the Lab: How does this lab relate to science as a whole? I believe this happened because ___ (what's the physics?)
   B. Practical Uses / Prediction: As a result of the lab, what might this new knowledge be used for? We can use this information to predict ____. My prediction is ___.
   C. Confidence / Accuracy: Are you confident in your findings? Does your line of best fit touch all of your data?
   (Points: 4, 4, 2)

4. Reflection: Address the following concerns/questions in complete sentences (each 2 points).
   A. How have my ideas changed? Before I did this experiment, I thought ______. In reality, ______. The actual _______________ was _____ and we were off by ____. This is within/outside the uncertainty of _____.
   B. What new EXPERIMENTAL question(s) do I have, and what new things do I have to think about? After doing this activity I am now curious about ______.

5. Extension (Bonus):
   A. Describe one real-life application or connection of this laboratory work.
   B. Propose a question for the next stage of experimentation. The question must have specific but new independent and dependent variables.
      How does this lab tie into TWO concepts about which I have learned in class?
   C. What is your percent error? % error = (Expected outcome − Actual outcome) / Expected outcome
   (3 points)

APPENDIX D

MODIFIED SCAFFOLDING FOR CONCLUSIONS

Claim
Evidence of the claim
   Shape
   Data table
   Equation
Reasoning
   What's the physics?
   Prediction
   Confidence in the prediction
Reflection
   Errors in experiment
   Hypothesis / how ideas have changed
   Another lab that would be interesting
   An example beyond the classroom

APPENDIX E

CONCLUSION STRIPS

2.9 seconds +/- 0.3 and the best-fit line is _________________.

Research extension question: How does the _______ of the pendulum affect the time of 1 swing? ___________________________________________________________.

It makes sense that it is a horizontal line since the larger the angle, the faster it falls and the further it travels in the same amount of time, compared to a small angle, which falls slower for a shorter distance.

Time of One Swing = Same for all Angles. This system of a pendulum swinging can be mathematically modeled as: Time of One Swing = 2.9 seconds, T = 2.9, where 2.9 represents the 2.9 seconds it took the pendulum to complete one swing at any starting angle.

I am very confident that I am correct since the data points match the line of best fit and the prediction is within the data range.

When I doubled the starting angle from 10 to 20 degrees, the time of the swing stayed about the same, from 2.9 to 2.8 seconds.

Using this model, I predict that at a starting angle of 45 degrees, the pendulum will take _____ +/- ____ seconds to make a full swing.

Some sources of error are our reaction time and the protractor rotating when we released the mass. To get more accurate data we could ____________________________.

APPENDIX F

L1.4 CIRCLE LAB

APPENDIX G

MATCHING GAME

APPENDIX H

UNIT ONE TEST (CONCLUSION)

APPENDIX I

SURVEY

APPENDIX J

CONCLUSION "CHECK-UP"

APPENDIX K

SEMESTER EXAM (CONCLUSION)

APPENDIX L

MATCHING MATH AND PHYSICS TERMS ON A GRAPH

APPENDIX M

GRAPHING WITH PIPE CLEANERS

APPENDIX N

GRAPH TO TEXT

APPENDIX O

UNIT FIVE TEST ON WAVES (CONCLUSION)

APPENDIX P

WHAT DOES THE NUMBER IN THE EQUATION MEAN?

APPENDIX Q

INTERVIEW QUESTIONS

Interview Questions    Name____________________________    April 16, 2019

1. Which of the activities helped you the most? Why?
2. Tell me what is still confusing about analyzing graphs.
3. What can I do to help the students next year analyze data better?
4. What did you improve on most this year in physics?