Bell's Hill Elementary
Differentiated Instruction to Support Engagement in Mathematics
Primary Researchers
Molly Bates, Intern, Baylor University
Barry Horst, M.S. Ed, Mentor Teacher, Bell’s Hill Elementary, Waco ISD
Sheri Sowder, M.S. Ed, Supervisor, Baylor University
Rationale/Introduction
Within a second-grade class of students with varying mathematical ability levels, I noticed my students struggle with engagement throughout whole-group mathematics lessons. Students who are gifted in math find the material accessible and do not need extra support, whereas other students need explicit instruction on the methods and reasoning behind the concepts. Both groups disengage when the lesson does not meet their needs. In a study published in the Journal of School Health, elementary students with high teacher support were 89% more likely to feel engaged, whereas low teacher support increased disengagement from 35% to 73% (Klem & Connell, 2004). Providing an opportunity for students to receive help and support from the teacher in a small-group environment can enhance students’ engagement by allowing them to get direct support during instruction.
Question/Wondering
How does small-group, differentiated instruction support engagement and acquisition of skills in mathematics?
Methodology/Results
When students learn math content, they must engage in the lesson activities to construct their understanding. While learning grade-level objectives, proficient students require additional practice that calls on critical thinking and problem-solving skills. Math lessons were taught in small groups to support engagement and knowledge acquisition, providing differentiation and guided practice for all students. Over three weeks, the class completed math stations during instructional time. Each day included four twelve-minute rotations, allowing students to complete a lesson at the teacher's table, independent practice, math technology, and paper activities. Giving students a limited time to complete a variety of practices improved their independent engagement; however, the most significant growth occurred during the small-group lessons at my table.
Completing the lesson with 4-5 students instead of 17 provided many opportunities to support their understanding. Practice problems were differentiated based on each student's ability level, such as regrouping into various place values or requiring multiple steps. Rather than being overwhelmed by a problem they did not have the tools to solve, or determining the answer within seconds, students received practice at their skill level to grow their understanding. Additionally, I provided multi-modal resources to students who required concrete representations of a number to support their independent practice. Many gifted math students were challenged to think critically about mathematics, such as identifying the reasoning behind the formulas and methods taught.
To analyze the data, I calculated the percentage of students who responded during the math lesson and the average number of responses. During whole-group instruction, 64.7% of students responded over ten minutes, with an average of 2.3 responses. In contrast, 100% of students answered questions during small-group instruction over the same time and spoke 3.7 times on average. Not only were all students given an opportunity to answer questions within their capability, but they could also answer more frequently. I also measured students' acquisition of knowledge from the lesson using independent practice scores. After learning a concept in a whole-class setting, students scored an average of 4.11/5 on a worksheet completed independently, and eight students demonstrated mastery of the objective. When utilizing small-group instruction, students scored an average of 4.65/5, with twelve students showing mastery and the rest demonstrating a foundational understanding. Lastly, I surveyed students to determine how they felt about math small groups and whether the format benefited their engagement and learning. Nearly every student stated that they preferred math stations over whole group; they felt more engaged, which helped them understand a concept, with only two students saying the formats were equivalent in one or more categories. Overall, the data showed the benefits of small-group instruction in mathematics in supporting engagement and acquisition of understanding.
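As a rough illustration of how these percentages can be computed, the sketch below tallies responses per student over an observation window. The tally values and class sizes here are hypothetical, chosen only to demonstrate the arithmetic; they are not the study's raw data.

```python
# Hypothetical response tallies per student over a ten-minute window.
# These numbers are illustrative; they are not the study's raw data.
whole_group = [0, 3, 2, 0, 4, 1, 0, 2, 3, 0, 2, 5, 0, 1, 2, 0, 3]  # 17 students
small_group = [3, 4, 2, 5, 4]                                       # 5 students

def participation_rate(tallies):
    """Percentage of students who responded at least once."""
    return 100 * sum(1 for t in tallies if t > 0) / len(tallies)

def mean_responses(tallies):
    """Average number of responses per student."""
    return sum(tallies) / len(tallies)

print(f"whole group: {participation_rate(whole_group):.1f}% responded")
print(f"small group: {participation_rate(small_group):.1f}% responded")
```

With these made-up tallies, 11 of 17 whole-group students respond (64.7%), while every small-group student responds (100%), mirroring the shape of the comparison reported above.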
Implications/Recommendations
Throughout this inquiry, I examined the effect of small-group, differentiated math instruction on a co-ed class of students. My findings strongly suggest that this instruction helps students engage more, acquire knowledge, and enjoy learning. All students demonstrated significant increases in their engagement throughout math lessons and practice and gained content understanding at a faster rate. Because the instruction was individualized to their levels and needs, students could experience meaningful learning to construct their knowledge. Furthermore, students enjoyed completing math in a station format and looked forward to it daily. Their excitement and focus helped support their understanding of the content.
Overall, I believe incorporating differentiated instruction in small groups benefits all students by engaging them with the content and forming holistic knowledge. By providing activities and practice that are planned around students' strengths and areas for support, the curriculum can be adapted to each individual student and support them in achieving the learning goals.
References
Klem, A. M., & Connell, J. P. (2004). Relationships matter: Linking teacher support to student engagement and achievement. Journal of School Health.
Croninger, R. G., & Lee, V. E. (2001). Social capital and dropping out of high school: Benefits to at-risk students of teachers’ support and guidance. Teachers College Record.
Steinberg, A., & Almeida, C. (forthcoming). From the Margins to the Mainstream: Effective Learning Environments for Urban Youth. Boston, MA: Jobs for the Future.
Skinner, E. A., & Belmont, M. J. (1993). Motivation in the classroom: Reciprocal effects of teacher behavior and student engagement across the school year. Journal of Educational Psychology.
Positively Affirmed: How Daily Affirmations Affect Self-Efficacy
Primary Researchers
Kelly Bradford, Intern, Baylor University
Jennifer Garza, Mentor Teacher, Bell’s Hill Elementary, Waco ISD
Jina Clemons, Intern Supervisor, Baylor University
Rationale/Introduction
This year, I have had the opportunity to serve as an intern in a fourth-grade gifted and talented cluster classroom. This experience has provided me with a wealth of knowledge relating to both gifted and high-achieving students. While in the classroom, I have observed one student who is twice-exceptional exhibiting strong reactions related to what could potentially be described as a deficit in self-efficacy. Over the last few weeks, I have witnessed this student become overwhelmed multiple times when approaching tasks that he perceived as too difficult or outside his ability, which leads him to make many negative statements about what he believes he cannot do. While this student operates under the belief that he is not capable of the assignments he is presented with, he is an identified gifted and talented student whose scores demonstrate his impressive ability. His tendency to get overwhelmed by new tasks often hinders his willingness to try them, which further limits his self-efficacy because he does not give himself the chance to prove that he can complete them. As Szente (2007) states, “The more children believe in themselves and in their abilities to be successful, the more they persevere and keep on trying to achieve their set goals” (p. 453). By encouraging him to speak kindness over himself and recognize that he is in fact capable of doing hard things, my goal is to elevate his self-efficacy and build confidence that will help him succeed academically.
Question/Wondering
In what ways do positive, daily self-affirmations affect a fourth grade student’s self-efficacy?
Methodology/Results
For this study, I worked with a fourth grade student who has been identified as twice-exceptional. During the first week of my study, I observed the student by taking anecdotal notes regarding his response to assignments in the classroom. I observed statements such as “I can’t do this”, “I failed an entire question”, and “I don’t think I can do it”. However, I also observed an instance where once he demonstrated to himself that he could complete one problem on his worksheet, his attitude changed completely, and he was excited to come up with multiple answers for the open-ended problem. Observing commentary and behaviors like this led me to conclude that this student was operating with a deficit in self-efficacy, so I created an intervention to encourage him to speak positively over himself by using affirmations. Part of what makes interventions like positive self-affirmations so effective is that they “change students’ mind-sets to help them take greater advantage of available learning opportunities” (Yeager & Walton, 2011, p. 274). For five minutes each morning, I met with him one-on-one and had him read the following positive self-affirmations:
- I am smart.
- I am confident.
- I am capable of doing hard things.
- I will not give up when things are tough.
- I will always try my best.
The sixth affirmation was one that I asked him to come up with each day on his own. I provided him with these sentence stems:
- One thing I like about myself is _______.
- I am ______. / I am good at _____.
- I will ______.
This study took place over the course of four weeks, with a total of thirteen days on which we met to review the daily self-affirmations. During the first and last meetings, I asked the student to complete the Self-Efficacy Survey that I designed as a form of quantitative data for the study. It consisted of 10 questions, each with response options of “No, never”, “Sometimes”, and “Yes, always”, numbered 1 – 3. The maximum number of points a student could score is 30, and the minimum is 10; the closer a student is to 30, the higher their self-efficacy. Based on this scale, 10 – 16 would be considered low self-efficacy, 17 – 23 average self-efficacy, and 24 – 30 high self-efficacy. I also collected qualitative data by writing anecdotal notes. Throughout the observation period, whenever I observed behaviors relating to the study (e.g., he opts to try an activity he would usually be overwhelmed by), I would jot down a note about what happened. Because this depended on a variety of factors (day, time, subject, how he was feeling, etc.), I did not designate specific days to note this data.
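The scoring logic can be sketched as follows. The answer labels and point values are taken from the description above; since the stated bands otherwise overlap at 16, this sketch assumes the middle band runs 17 – 23, and the example responses are hypothetical:

```python
def score_survey(responses):
    """Score a 10-item self-efficacy survey: each response is worth
    1 ("No, never"), 2 ("Sometimes"), or 3 ("Yes, always")."""
    points = {"No, never": 1, "Sometimes": 2, "Yes, always": 3}
    total = sum(points[r] for r in responses)
    if total <= 16:       # 10-16: low self-efficacy
        band = "low"
    elif total <= 23:     # 17-23: average self-efficacy
        band = "average"
    else:                 # 24-30: high self-efficacy
        band = "high"
    return total, band

# A hypothetical pre-test: eight "Sometimes" and two "Yes, always" answers
print(score_survey(["Sometimes"] * 8 + ["Yes, always"] * 2))  # (22, 'average')
```

A total of 22, like the pre-test score reported below, falls in the average band under these cutoffs.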
At first glance, the results of this study do not indicate a great change in this student’s self-efficacy. Looking at the numbers, the student scored a 22/30 on the pre-test and a 21/30 on the post-test, a decrease of one point, or roughly 3% of the scale. However, a closer analysis indicates that there are still many little “wins” in the data. For example, in two categories (math and writing), the student’s self-efficacy increased by one point each. As a result, the student recorded only one “No, never” answer on the post-assessment, compared to two on the pre-assessment. Knowing this student personally, the increase in the writing category is a huge success. My anecdotal notes throughout the study indicated that this student often struggles most with writing, whether during writing itself or in other subjects that required him to write. Given these consistent observations, this student’s reported increase in writing self-efficacy over the four weeks of this study is something to be celebrated! Additionally, six out of ten of his answers stayed consistent across both assessments, which, while not an increase, is still important to note, as they were all “Sometimes” (2 points) or “Yes, always” (3 points) responses. I am, however, unsure why there was a decrease in both reading and social studies self-efficacy, but it is important to remember that self-efficacy depends on much more than positive affirmations. Plenty of other factors, like the time of day, the subject, and the student’s mood, can affect his level of self-efficacy each day. So, while there is a decrease in these two categories, it would not be accurate to conclude that the intervention did not work.
Implications/Recommendations
Based on my study, I would recommend that positive affirmations be utilized in the classroom, not only for students demonstrating low self-efficacy, but for all students. While at first glance, the results do not demonstrate overwhelmingly positive effects, a closer look at each independent aspect of this study reveals that there is at least a slight positive effect on the student’s self-efficacy. However, this study does lead to its own unique set of questions – how would the study be different if the length of the study increased? If the intervention was implemented during a different time of day? If the intervention was implemented more than once a day? If the affirmations were utilized by another student, or the entire class?
Overall, through the implementation of this study, my conclusion is that affirmations are beneficial for students in a multitude of ways, including overall and academic self-efficacy. I think that having the student create his own affirmation was also very beneficial because it allowed the student to be invested in the activity, rather than just repeating the affirmations I created. Regardless of age or grade level, I think affirmations are a helpful tool that teachers can refer to when they see the need for it, as I did in this study.
Reference(s)
Szente, J. (2007). Empowering young children for success in school and in life. Early Childhood Education Journal, 34(6), 449–453. https://doi.org/10.1007/s10643-007-0162-y
Yeager, D. S., & Walton, G. M. (2011). Social-psychological interventions in education: They’re not magic. Review of Educational Research, 81(2), 267–301. https://doi.org/10.3102/0034654311405999
Classroom Participation
Primary Researchers
Maggie Lindsay, Intern, Baylor University
Amanda Martinez, Mentor Teacher, Bell’s Hill Elementary, Waco ISD
Sheri Sowder, MS Ed, Educational Leadership, Professor, Baylor University
Rationale/Introduction
I have been placed in a co-ed second-grade classroom for my intern year. Every day, I observe my students.
I watch as I teach lessons to see who participates and understands, who always does their work, and whether students remain engaged. After a semester of observing my students, I have noticed and learned about their personalities, learning styles, and habits. When contemplating my research, I decided to focus on participation. The same few students participate all day. I want to test whether I can implement a system to increase my students' participation. My research will be implemented in our reading and math blocks. By implementing a participation chart, I hope to motivate all my students to increase their participation during lessons.
Question/Wondering
How will adding a participation tracker with incentives in a second-grade co-ed classroom encourage all students to increase their participation?
Methodology/Results
My research will involve all 18 students in my second-grade class: ten girls and eight boys, seven to eight years old. The majority are Hispanic, along with two African American students. The students in my class come from lower-income homes and are at varying instructional levels. I will introduce the visual I made to help students keep track of their participation and explain that every time they share, they will add a bird to their tree. Once a student has participated twice, they cannot participate again until the whole class has participated and filled up their trees. While I collect the number of times my students participate, my mentor will complete engagement forms as data. I will focus on the math and reading blocks. I chose this method because I want the students to encourage each other to participate; they must work together during lessons to ensure everyone does their part. If the class meets its participation goal every day, the students will receive a whole-class reward at the end of the week. This will help increase their motivation to participate and collect all the birds needed. The methods I will use to collect data are one engagement form a week, counting the questions I ask, and counting the times students participate each day. These data points will show whether there was an increase in students' overall participation and whether the questions I ask allow students to participate successfully.
First, I collected the number of times each student participated in reading and math for two days to establish a baseline. My mentor then conducted the first engagement form, and I counted the number of questions I asked. This information gave me a reliable baseline of data. For my research, I created birds and trees to track the students' participation. When a student participated, I gave them a bird for their tree. This helped the students know whether they could answer again. I made the visual to help my students keep track of their numbers and to assist my visual and kinesthetic learners. When the students got a bird, they placed it on their tree; I used Velcro so the bird would stick. They removed the birds at the end of the lesson, and I collected them to be ready for the next one. This allowed the students to move around and keep track of how many times they had answered.
Then, I started collecting my data over the 14 days of research. My mentor collected one engagement form a week. I counted the number of times each student participated. I counted how many questions I asked to ensure I gave each student an equal opportunity to participate. If the students reached the goal of everyone participating at
least twice during the lessons, they received a whole class reward. The rewards included a dance party, a paper snowball fight, and a basketball dunk contest (using paper and the recycle bin).
Finally, after 14 days of research, I analyzed my data. I first graphed my baseline data. The graph showed that more students participated in reading than in math. The reading participation was evenly spread, but in math, nine students never participated. After realizing this, I chose three other dates to analyze my students' participation. I created graphs for January 29th, February 8th, and February 15th. These dates allowed me to analyze how my students' participation changed. In math on January 23rd, 16 students participated 0-3 times and two participated 6-7 times. By January 29th, five students had participated zero times, nine once, and four twice. In those six days alone, participation increased. On February 8th, all 18 students participated twice. On the last day I analyzed, February 15th, all 18 students again participated twice. Based on this data, I reached my goal of increasing student participation during math lessons. All my students were able to participate twice, which was my goal, and they used teamwork to help each other meet it. Next, I analyzed the reading participation using the same dates. As I mentioned, my reading baseline was much more spread out than math. On January 23rd, ten students participated 0-2 times and eight students participated 4-7 times. By January 29th, all but three students had participated 1-2 times. On February 8th, all my students participated at least twice, and some even reached three times.
Once everyone had participated twice, everyone could participate again; that is how some students participated more than twice. By February 15th, every student had participated 2-4 times. After analyzing this data, I found that my students also met the goal of increasing participation during reading lessons. This showed me that my method was successful, and with my students' hard work, I was able to increase overall classroom participation.
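The cap-and-reset rule described above (at most two responses per student until the whole class has filled its trees, after which everyone may answer again) can be sketched in code. The class names and cap below are illustrative, not tied to the actual roster:

```python
class ParticipationTracker:
    """Models the bird-and-tree rule: each student may respond at most
    `cap` times per round; a new round opens only once every student
    has reached the current cap."""

    def __init__(self, students, cap=2):
        self.cap = cap
        self.counts = {s: 0 for s in students}

    def may_respond(self, student):
        # The limit rises by `cap` each time the slowest student
        # finishes a round, so early finishers wait for the class.
        limit = self.cap * (min(self.counts.values()) // self.cap + 1)
        return self.counts[student] < limit

    def record(self, student):
        if not self.may_respond(student):
            raise ValueError(f"{student} must wait for classmates")
        self.counts[student] += 1

    def goal_met(self):
        """Whole-class goal: everyone responded at least `cap` times."""
        return all(c >= self.cap for c in self.counts.values())
```

Under this rule, once all students have answered twice the effective limit rises to four, which is consistent with every student ending the reading lessons with 2-4 responses.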
After analyzing my engagement forms, I discovered that at least 88 percent engagement allowed my students to participate successfully. On my first form, I recorded 88 percent engagement. All but one of the students observed were on task most of the time. This was not a surprise; I had been brainstorming with my mentor about how to keep this student engaged for some time before this research. This was my baseline engagement, so it gave me a good place to start. On the next form, I recorded 94 percent engagement. This was a great lesson, and my students' participation had already increased. On the third form, I recorded 89 percent engagement. All but one of my students were engaged the whole time. This student is fidgety; he must be moving all the time. He reached his goal of participating twice during the lesson, so I know he was still taking in the information. I started to brainstorm how to help and decided to talk to him. I told him that if he felt he could not pay attention very well, he could stand at the back of the carpet, take a wiggle break, and then rejoin us. This did help him; he does not fidget as much on the carpet now. For my final engagement form, I recorded 93 percent. The students who were off task were only off task for seven percent of the lesson, and they all reached the goal of participating twice as well. After analyzing my engagement forms, I have found that engagement is vital to participation. If students are not engaged in a lesson, it is harder for them to pay attention and participate.
I also realized that the number of questions I asked correlated directly with my students' success. If the students are engaged and I ask enough rigorous questions, they will be more successful in participating. I looked at the four days I analyzed and counted the number of questions I asked. On the first day of data collection, I asked 19 questions in reading and 21 in math. This was not enough for my students to reach the goal of participating twice. Over the following days, I started to ask more questions during the lessons. On January 29th, I asked 22 questions in reading and 17 in math. On February 8th, I asked 38 questions in reading and 36 in math. This number of questions allowed my students to participate twice during both lessons. On February 15th, I asked 42 questions in reading and 36 in math.
Implications/Recommendations
More students will be engaged when a teacher asks enough questions for all students to respond. I found that by using incentives and controlling the number of answers from each student, every student could answer at least two questions in a lesson. Students who usually answered many questions helped their peers become more involved by sharing ideas with them. The visual aspect of birds in the trees kept students on track with their participation. After reflecting, I have realized how important it is to plan for equal opportunity for all students. I will plan a range of questions so I know all my students will be able to participate the best they can. This research will be on my mind when planning my own classroom management. By conducting my research, I learned that it is possible to motivate all students to increase their participation using visuals and incentives. One note I have is that, depending on the lesson, the teacher will need to plan out questions beforehand to help the teacher and the students stay on track. If you do not ask enough questions, the students will not be able to reach their goals, which can affect their motivation during the lesson and independent work. Another wondering I have is whether the increase in participation during lessons affects students' grades and progress.
The Impact of Asking Closing Comprehension Questions
Primary Researchers
Jessica Thompson, Intern, Baylor University
Sarah Tatum, BS Ed, Generalist EC-6 and ESL, Mentor Teacher, Bell’s Hill Elementary, Waco ISD
Sheri Sowder, B.S. Ed., Baylor; M.Ed., Texas Tech; Professor, Baylor University
Rationale/Introduction
My action research will involve six individual students who are Level P readers. I will focus on asking them comprehension questions before and after each reading session. During this study, I will focus on Non-Fiction and Fiction stories. As I ask each student the comprehension questions, I will include a lower-, a middle-, and an upper-level question.
When developing these questions, I am referring to Bloom’s Taxonomy to help me gauge each student’s comprehension level. I will pull out my students one at a time, both in the morning and in the afternoon. This switch in timing may help demonstrate whether there are any subtle differences in their thought processes. While asking each student a question, I will score how well they answered it. The rating system will range from 0-2: if a student does not know the answer to the comprehension question, they will receive a 0; if the student gives me one detail, they will receive a 1; and if the student gives me more than one detail, they will receive a 2. At the end of my research, I will put each student’s scores on a scatter plot and determine how well each student did with Fiction and Non-Fiction stories. I will also compare scores between Non-Fiction and Fiction stories, distinguishing how well students did in the mornings versus the afternoons and across the types of questions asked.
Question/Wondering
How does asking follow-up questions during guided reading instruction impact 2nd grade on-level readers’ abilities to answer higher level questions?
Methodology/Results
I collected my data by creating an Excel sheet organized around the questions asked, so that throughout the weeks of data collection I could record everything I completed. I focused on six on-level readers within my 2nd-grade classroom; the group included both girls and boys. I would pull my specified group of students to a back table in the classroom to read the story I provided them for that particular week. After each student finished reading the story for the day, they would return to their seats and reflect on what they had just read. The next day, I would have prepared questions about the story they had read the day prior. Those questions consisted of both lower-level and upper-level questions. I would ask the students “before reading” questions, give them time to read, then ask them “after reading” questions. On Day 2, I focused on the lower-level questions. Then, depending on their comprehension of the story after answering the questions to the best of their abilities, I would move on to my upper-level questions or redirect them if they had not fully comprehended the story.
While they read the story aloud to me, I made notes of the students’ answers and analyzed my data using the rating system I created. The scale ran from 0-2: 0 = no response, 1 = one detail, 2 = more than one detail. Throughout several weeks of collecting data, I added up the points each student received before and after reading. Once I tallied the scores for each student, I added everyone’s scores together for a before- and after-reading comprehension analysis. I then used the data to calculate whether my students comprehended Fiction or Non-Fiction better. The results directly addressed my initial aim, which was to determine how asking follow-up questions would better support students' comprehension. For example, looking at my data from the “before” and “after” of Non-Fiction text, my students scored a total of 53 points before reading and a higher total of 75 points after reading. Using the difference, I calculated the percentage change: my students' comprehension increased by 41.5%. Then, looking at my data from the “before” and “after” of the Fiction text, my students scored a total of 64 points before reading and an increased total of 82 points after reading, so my students' comprehension rose by 28.1%. After analyzing the data, I can see that my students could read Fiction text with accuracy and understanding, but the students’ growth in comprehension was far greater with Non-Fiction text. I can confidently say the data shows progress in comprehension when asking follow-up questions after students have read Non-Fiction rather than Fiction, even though progress was demonstrated in the latter as well.
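The percentage gains reported above follow from comparing the total points after reading to the totals before reading; a minimal check of that arithmetic:

```python
def percent_gain(before, after):
    """Percentage improvement in total comprehension points,
    relative to the before-reading total."""
    return 100 * (after - before) / before

nonfiction = percent_gain(53, 75)  # about 41.5, matching the 41.5% above
fiction = percent_gain(64, 82)     # 28.125, reported above as 28.1%
```

The relative measure means Non-Fiction shows the larger gain even though the Fiction totals were higher in absolute terms.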
Before beginning my action research study, I predicted that students' comprehension levels would increase more with Fiction text than with Non-Fiction, based solely on students’ preference for Fiction, which I expected to lend them longer attention spans and keep them more engaged. To my surprise, my students' comprehension actually increased more with Non-Fiction than with Fiction. I attribute this result to the fact that the text is informational and tells facts; it is easier for kids to answer the questions because there is a right or wrong answer. From the onset of my research, I observed that my students did not seem confident when answering lower- and upper-level questions. The majority of the time, the students in my study mumbled their answers to the questions being asked. Something I started doing was pulling each student individually for one-on-one conversations instead of the typical group conversation. This raised the confidence level of many of my students when answering these questions and gave them “wait time” to think on their own.
Implications/Recommendations
My action research supported my instructional practice in all subject areas by enhancing my questioning. Something I could implement in my teaching is asking guiding questions that enhance a student’s understanding of the context or text they are reading. Asking the “before” questions, whether in reading or in any other subject, gets kids’ minds thinking. When asking the “after” questions, I could see which students comprehended the material best and which students I needed to work with more diligently. The strength of my study was being able to pull students throughout the day to carry out my research, my data collection, and my theories. A weakness I discovered was the unknown of student absences; an absence would always affect my data. The biggest lesson I learned was the importance of consistent, rigorous questioning.