# South Bosque Elementary

### Math Matters, It’s a Fact

**Primary Researchers**

Abby Holland, Intern, Baylor University

Kathryn Owens, BS in Interdisciplinary Studies, Mentor Teacher, South Bosque Elementary, Midway ISD

Lisa Plemons, PhD, Professor, Baylor University

**Rationale/Introduction**

In a second-grade classroom, I observed that students were struggling with their math lessons because they kept making simple calculation errors. These errors undermined their comprehension of new math concepts because their attention went to recalling math facts rather than to the lesson at hand. I decided to pull a group of four students for explicit fact fluency practice, in hopes that greater automaticity would improve their performance.

**Question/Wondering**

Could students start to understand more complex math concepts with the addition of explicit math fact fluency practice?

**Methodology/Results**

Since the beginning of the semester, I had noticed students making careless mistakes on their math assignments. These mistakes stemmed not from a lack of ability but from simple addition and subtraction errors. I decided to pull a small group of four students of varying abilities to work on math fact fluency, in hopes that both their confidence and their automaticity would increase.

For three weeks, I collected data on two female and two male students of various ethnicities, ages 7 to 8. Originally, my plan was to time these students on a math fact sheet four days a week. However, after the first week, my plans quickly changed due to a lack of enthusiasm and structure. I conducted a pretest to collect baseline data and gave the same test as a posttest three weeks later. One note: spring break fell in the middle of my data collection, so the data were not continuous.

I pulled my small group for 30 minutes, with the intention of doing a warmup, then a fact sheet timed at two minutes, then having students correct any mistakes and finish the rest of the worksheet untimed. To bring in self-monitoring, I had them fill out a chart that encouraged them to beat their score from the previous day. As mentioned earlier, my structure changed once I realized I needed more concrete, hands-on activities to catch my students' attention. I still pulled the group for 30 minutes, but I only tested them on the fact sheet on the first and last day of each week.

When we were not reviewing the sheet, we played addition and subtraction strategy games, speed doubles, and Sum Dogs. The objective of the addition and subtraction strategy game was to be the first team to complete a page of problems; to add competition, students rolled a die and solved only the problems listed under the number they rolled. The speed doubles game turned into a race to see who could solve their doubles facts the quickest. Sum Dogs was the most fun for the students because they competed against each other while being tested on both addition and subtraction fact families. During these games, I was able to make both informal and formal observations of students' approaches to math problems. Later, I asked them about their strategies; most said they already knew the answers, used their fingers, or counted in their heads.

Within the small group, the students were expected to work diligently on their fact fluency. I found that focusing on automaticity for simple math facts improved student work, confidence, and content knowledge. On a two-minute timed pretest, Student A completed 15 problems correctly, Student B completed 33, Student C completed 35, and Student D completed 33. On the posttest, every student correctly completed more problems: Student A increased to 26, Student B to 39, Student C to 42, and Student D to 40.

The students not only completed more problems on a timed test; I also saw improvement in their assignments and their confidence. To gather accurate data, I noted the students' class work, such as exit tickets and other assessments. Although my group usually knew the content being taught, they would fail to make 100s because of simple calculation errors. One week after data collection, I was grading assessment papers and every single student in the group received a 100. The sheets focused on multiplication, requiring students to write both the multiplication equation and the repeated addition equation. I pulled the group once more to discuss their grades and ask about their strategies. These students blew me away by explaining their thinking about repeated addition; they said they were able to complete the problems because of the practice they had received during our small group time. The data show good results, but it also warms a teacher's heart to see students' confidence grow. Student A first came to the group timid and sad because of their weakness in math, which had caused them to dislike the subject.
I asked this student how they felt after the explicit practice, and they said they love math because they are no longer intimidated by the numbers. This student told me multiple times how happy she was to do math because recalling the facts is not as stressful. Even her teacher noted her automaticity, which made her face light up, grinning ear to ear. I learned that knowing math facts can help students learn math concepts more quickly. “Automaticity is an important piece to the success of a student because as the student progresses through different math concepts, the brain will have difficulty processing two things at once. The math facts have to be committed to memory in order to learn more complex concepts” (Baker & Cuevas, 2018). In their study, Baker and Cuevas noted that automaticity with addition helped students understand multiplication better. I found the same to be true and noticed that fewer errors were made.

**Implications/Recommendations**

Although I knew that math concepts stack on one another, meaning you need to understand concept one in order to understand concept ten, I never realized how much automaticity with facts would increase a student's confidence and overall success. Knowing this, we have continued to implement fact fluency within our lessons. Many times, students warm up by practicing their facts, and they also play fact practice games when they finish their station work.

Throughout my inquiry, I quickly recognized the imperfections of my methods. As mentioned earlier, my original idea was to focus on math sheets only, drilling the facts using just pencil and paper. Two days in, I grew tired of this approach and changed to hands-on methods that included competition and self-monitoring. A strength of this study was that it intrigued the students and made them want to come back every day. A weakness was consistency: my students were pulled from the group for varying reasons, causing them to miss a lesson at least once a week. The timing of the data collection was also a hindrance because spring break fell in the middle of it. The students' brains went into vacation mode, and I noticed regression in the data. With all that said, I am thankful we were able to bounce back and see overall improvement.

If I were to do the study again, I would make the data collection period a few weeks longer. I would also pick one strategy, such as touch math or a similar approach, to teach the students every day. This would give us more structure and a mini lesson, and it would create unity within the group because everyone would be using the same strategy. As for next steps, I would want to track math accuracy to see whether handwriting had anything to do with the quality of their work. With more time, I would also focus on students' timeliness.
I found that when students were eager to finish quickly, they would write numbers backwards and squished together, which kept them from reading their own writing. This caused calculation errors because numbers were read incorrectly. Lastly, I wonder how attitude and behavior contributed to the results of this study. Some days students were tired and lacked the enthusiasm to work; when they were off task and distracted, I noticed they did not get as many problems correct.

**Reference(s)**

Baker, A. T., & Cuevas, J. (2018). The importance of automaticity development in mathematics. *Georgia Educational Researcher*, 14(2), Article 2. https://doi.org/10.20429/ger.2018.140202

### Implications of Alternative Seating on Engagement

**Primary Researchers**

Bailey Lewallen, Intern, Baylor University

Jessica Hogg, Mentor Teacher, South Bosque Elementary School, Midway ISD

Lisa Plemons, Intern Supervisor, Baylor University

**Rationale/Introduction**

In my first-grade classroom, my students have three seating options that are rotated once or twice a day: a typical plastic desk chair, a small exercise ball, or a stationary bike with a small desk attached. Students using the desk chair or the ball seat remain at their desks; students using the bike desk sit in a different portion of the room. For the purposes of this research, I will use "seating options" to refer to the ball seat, desk chair, and bike desk. I have observed that some students benefit from all forms of alternative seating, some benefit from one form over the others, and some remain engaged during instructional time only in the typical desk chair. Hulac et al. (2022) imply that “the effectiveness of [these] alternative seating method[s] are mixed” within any given classroom. Because of this observation in my own classroom, I decided to research the engagement levels of a random selection of two boys and two girls over a period of three weeks during my direct-teach math instruction.

**Question/Wondering**

In what ways do three seating options impact the engagement of four first-grade students in my classroom during fifteen minutes of direct teach math instruction?

**Methodology/Results**

During my internship in a first-grade class, I observed that many students prefer various alternative seating options to the traditional desk chairs classrooms are equipped with. Because of this, our classroom rotates through ball seats, bike desks, and traditional desk chairs. Unfortunately, our classroom is supplied with only six ball seats and two bike desks, which means not every student gets a daily opportunity to sit in one of the alternative seating choices. As an intern, I wanted to observe which seating choices would positively or negatively affect the engagement of a small selection of students.

This research homed in on how alternative seating choices affect engagement levels compared to a traditional desk chair. I focused on two female and two male students, ages 6 and 7, observing them over three weeks during the direct-teach portion of my math instruction. Baseline data had been gathered through anecdotal notes across all seating choices since the start of the spring semester. During my three weeks of data collection, I noticed that each student had a different success rate in each seating choice. Student A had to be redirected the most on the bike desk and the least on the ball seat. Student B required far fewer redirections on the ball seat and was least successful in the bike desk and the regular desk chair, requiring the same number of redirections in each. Student C, however, was most successful on the bike desk. Finally, Student D was most engaged on a ball seat. Student C was the only participant who performed best on a bike desk; all other students were most engaged, and required the fewest redirections, on ball seats.

The engagement of my students also showed improvement towards the end of my data collection. I believe this is because my students had gotten more into their routines within our classroom and had more time to readjust to classroom procedures after their winter break.

**Implications/Recommendations**

I found that the best seating choice for my students was not universal. I expected more similarities between my participants than there were. Students A, B, and D were all most engaged on ball seats; however, the number of redirections they required across the seating choices varied greatly. Ultimately, I would allow all students to explore the seating arrangements over multiple days and subject areas. This would let me gauge which seating option was best for each individual student if I were to assign the alternative seats. If I were not assigning seats, I would rotate as a class as we do now, when students arrive in the morning and after lunch, which maximizes students' opportunities to experience the variations in seating. Space and functionality are also key factors to consider when providing alternative seating. For example, my classroom has access to three bike desks, but there is only room for two. Regularly rearranging the space allows for the best possible arrangement and balance of alternative seating options.

Ultimately, I would love to use alternative seating options in my future classrooms as long as I have the means to do so. Students all learn in different ways, and giving them opportunities to evaluate their own learning styles will grow their metacognitive skills and make them better learners.

**Reference(s)**

Hulac, Mickelson, L. R., Briesch, A. M., Maroeca, H., Hartjes, C., Anderson, K., & Ederveen, K. (2022). Stability balls and student on-task behavior. *Journal of Behavioral Education*, 31(3), 543–560. https://doi.org/10.1007/s10864-020-09412-3

### Building Fraction Logic with Manipulatives

**Primary Researchers**

Julia Morrison, Intern, Baylor University

Lindsey Pick, Mentor Teacher, South Bosque Elementary, Midway ISD

Lisa Plemons, Professor, Baylor University

**Rationale/Introduction**

Various forms of data revealed a small group of students' lack of conceptual understanding when comparing fractions. These students resorted to strategies with little foundation in logical understanding. To address the misconceptions, I intervened with a small group of five students in the spring semester of 2023. During this instructional time, I stressed the use of concrete manipulatives to help students form logical thought processes for comparing fractions without physical tools, as outlined in “Using Number Sense to Compare Fractions” (Bray & Abreu-Sanchez, 2010). I collected student data in the form of anecdotal notes, written explanations, and exit tickets to monitor students' comprehension.

**Question/Wondering**

In what ways does using concrete manipulatives affect a small group of five 4th grade students' conceptual understanding of comparing fractions?

**Methodology/Results**

The research group consisted of five Caucasian students, two boys and three girls, whose average age was 10. The group was chosen based on the Learning Inventory of Needs Math screener exam administered at the beginning, middle, and end of the year. After middle-of-year testing, four students were labeled as needing intervention, indicating a lack of mastery of a grade-level concept, and one student was labeled as striving in this skill. This indicated to me that these five students needed further instruction on comparing fractions.

Before beginning instruction, I administered a pre-assessment in which students compared different fractions and explained their reasoning. Students attempted to use the “butterfly” method, a shortcut for comparing fractions; if a student could not explain why the butterfly method worked, they were not allowed to use it. None of the five students could explain why the method worked, leaving them to guess or draw models, which quickly became difficult as the numerators got larger. The five students' pre-assessment scores averaged 68%, with two students scoring 80%, two scoring 70%, and one scoring 40%.

To address this, I met with these five students as a small group for 30 minutes a day for three weeks. I created lesson plans following the strategies outlined in Bray and Abreu-Sanchez's “Using Number Sense to Compare Fractions,” which include using circle fraction tiles to build four logical thinking processes for comparing fractions (2010). In their article, Bray and Abreu-Sanchez explain the value of circle fraction tiles and how they appear to students: when a piece is missing from the whole circle, it is obvious how many pieces are missing and how close the fraction is to ½ or one whole. This contrasts with the manipulative originally used to teach fractions in the fall, which consisted of different-sized pieces of a bar. The bar manipulative makes comparisons less clear because the whole is less apparent; a bar can be as long as students imagine, whereas a circle can only be filled to a fixed extent. I introduced the circle manipulatives to the students and allowed them to explore representing different fractions with them.

To begin addressing comparing fractions, I presented students with word problems involving fractions with the same denominator. After students built these fractions, I asked how we could remember to compare such fractions without the tiles. Students came up with a strategy for fractions with the same denominator: when the pieces are the same size (the same denominator), we can compare how many pieces we have (the numerator). All five students used this strategy successfully on day two of instruction.

The next skill I addressed was comparing fractions with the same numerator, that is, the same number of pieces taken from the whole. Students explored these problems and were led to another thinking strategy: when two fractions have the same number of pieces, we look at the size of the pieces. Students devised an additional reminder: the bigger the denominator, the smaller the pieces, and vice versa. Three of the five students used the strategy successfully on day two of instruction, so I met the next day with only the two who had not grasped it to further their comprehension. After four days of instruction on this skill, students completed an exit ticket (without manipulatives) in which they compared both types of fractions and explained their reasoning. The group average was 88%, with the lowest score being 75%.

I used the remaining time on the more challenging comparisons, where fractions have different numerators and denominators. The next skill we practiced was comparing fractions to ½. I gave students word problems involving comparing a fraction to ½, which they solved with their manipulatives, and then prompted conversation about how thinking of ½ can be useful when comparing fractions. We generated other models representing ½ so that students would recognize fractions equivalent to ½. I administered another exit ticket that day; the results averaged 77%, with two of the students scoring 50%. We continued to practice this skill for the next two days, during which I introduced comparing fractions not only to ½ but also to one whole, incorporating Bray and Abreu-Sanchez's final strategy (2010). I gave students lists of fractions and had them write whether each was closest to 0, ½, or 1 whole. All five students did this correctly without manipulatives, showing they had built enough conceptual understanding to no longer need them.

After the final day of instruction, I administered the same assessment used as the pre-assessment to measure students' overall growth. The students took the assessment one-on-one with me and were prompted to use the strategies we had learned. Four students earned 100% and one earned 80%, raising the average from 68% on the pre-assessment to 96%, a gain of 28 points.

Throughout this process I collected data through exit tickets, composed of abstract problems and short-answer questions that revealed students' conceptual understanding. I also took daily anecdotal notes on how students answered key questions and on their mathematical reasoning. The anecdotal notes and students' short-answer responses show they developed many more comparison strategies than they demonstrated on the pre-assessment.

**Implications/Recommendations**

This study had many strengths, as it was grounded in published research (Bray & Abreu-Sanchez, 2010). I also relied on my daily data and observations to make choices for the next day of instruction, which is of utmost importance when meeting students' needs. In the future, I will definitely use these strategies when teaching comparing fractions and will incorporate the values of this research project into my teaching philosophy as well. This project emphasized student-centered learning experiences that gave students real context and hands-on involvement in building their own understanding.

This study could have been improved with more consistent meetings. Because of many uncontrollable interruptions, there were days my group could not meet, pushing back instruction; students might have retained more knowledge if instruction had not stretched on as long. The instruction could also have been more meaningful as students' initial introduction to comparing fractions. They had already been taught other comparison strategies the previous semester, meaning I was occasionally working against misconceptions.

It would be valuable to conduct a research study in which the whole class uses circle fraction manipulatives as the initial introduction to comparing fractions.

**Reference(s)**

Bray, W. S., & Abreu-Sanchez, L. (2010). Using number sense to compare fractions. *Teaching Children Mathematics*, 17(2). https://doi.org/10.5951/TCM.17.2.0090

### Explicit Instruction of High-Frequency Word Parts

**Primary Researchers**

Emma Newton, Intern, Baylor University

Melissa Ellis, Mentor Teacher, South Bosque Elementary, Midway ISD

Lisa Plemons, M.Ed., Intern Supervisor, Baylor University

**Rationale/Introduction**

While working with a small group of second-grade students with reading disabilities, I noticed they have difficulty remembering high-frequency words (also known as sight words). The students attempt to sound the words out phonetically, but in doing so they cannot pronounce the words correctly, as many high-frequency words do not follow the typical sound-symbol patterns of English. “Heart Word Magic is a complimentary teaching tool to help students learn to read and spell high-frequency words and sight words, particularly those that aren’t very decodable” (Heart Word Magic, 2015). I want to see whether providing these students explicit instruction, using the Heart Word Magic tool as an aid, will result in improved mastery of high-frequency words.

**Question/Wondering**

I would like to know if explicitly teaching two second-grade males and one second-grade female to recognize the regular and irregular parts of various high-frequency words will improve their retention of high-frequency words.

**Methodology/Results**

This action research was conducted with the intent to improve retention of high-frequency words in a small group of three second-grade students with whom I work for 30 minutes each school day. The group consisted of two Caucasian males and one Caucasian female of various socio-economic backgrounds. Two of the students (one male and the female) are diagnosed with a reading disability with the condition of dyslexia, and the other male exhibits dyslexic tendencies.

First, I compiled a list of 100 high-frequency words to use for my research. Before implementing my research procedures, I assessed each student individually on their prior knowledge by having them read each word on the list, noting which words they knew and which they did not. The data showed that one male knew 50 of the words, the other male knew 58, and the female knew 23. I used this information to determine which words the students unanimously did and did not know, which helped me decide which high-frequency words to teach. Next, I began my research procedure by having the students complete a Heart Word Magic worksheet on which they wrote five high-frequency words and the word parts of each, and identified each word's “tricky” part(s) — the part of the word that is not spelled how it sounds, which they needed to “know in their hearts.” The students also colored a heart above the tricky part of each word to mark it.

As I taught the students new high-frequency words, I explicitly taught the regular and irregular word parts, emphasizing the irregular part and explaining why that specific part of the word was “tricky.” For instance, *ai* in the word *said* is the tricky word part because it does not say its typical long sound “ā”; instead, it says the short sound “e.” This explicit instruction greatly supported the students' mastery of the pronunciation and spelling of *said*. In addition, I reviewed already-mastered high-frequency words during each session we worked together. After teaching the students new words, I would have them write high-frequency words on the back of their paper for extra spelling practice, or I would go through flashcards to build automaticity with sight words. This extra reinforcement kept the students from regressing and gave them more confidence when spelling and reading high-frequency words. Eventually, the students would come to our small group already spelling the new words we had learned together.

After implementing explicit instruction, I assessed each student individually on their new knowledge of high-frequency words. One of the males now knows 79 high-frequency words, the other male knows 73, and the female knows 53. Each student learned at least 15 new words in only a few weeks. My research therefore indicated that explicitly teaching the regular and irregular parts of high-frequency words substantially improved the students' retention of them.

**Implications/Recommendations**

I highly recommend explicitly teaching the regular and irregular (“tricky”) parts of each high-frequency word, because this instructional strategy improved the students' retention of words and word parts. It will not only improve students' automaticity with high-frequency words but also improve their reading and writing: automaticity allows students to read more fluently and helps them spell those words correctly when writing. I was extremely pleased with the results of my research because each student learned significantly more high-frequency words. While the students gained much confidence in spelling high-frequency words, they did not gain as much confidence in reading them. One thing I would change about how I conducted my research would be to encourage the students more consistently to read the words with confidence. Overall, this study far surpassed the results I could have imagined.

**Reference(s)**

Heart Word Magic. (2015). Heart Word Magic / Really Great Reading. Really Great Reading. https://www.reallygreatreading.com/heart-word-magic

### Problem Solving with CPA Model

**Primary Researchers**

Elizabeth Perkins, Intern, Baylor University

Lindsey Pick and Amber Brown, Mentor Teachers, South Bosque Elementary, Midway ISD

Lisa Plemons, M.Ed., Intern Supervisor, Baylor University

**Rationale/Introduction**

In my fourth-grade classroom, I have noticed three students who consistently underachieve when solving fourth-grade operations within word problems. A lack of mathematical reasoning and visualization is evident in the students' work. In conversations with the students, it becomes apparent that they have isolated what they think are “keywords” and made illogical presumptions that misrepresent the math narrative. The Concrete-Pictorial-Abstract (CPA) model of instruction has been shown to improve students' abilities with concepts such as math operations and fractions (Putri et al., 2020). This research studies the model's effect on students' ability to apply operations within multi-step word problems.

**Question/Wondering**

How do math interventions utilizing the CPA model affect the ability of three struggling fourth-grade students to solve multi-step word problems?

**Methodology/Results**

In this study, three white female students were identified as needing intervention in multi-step word problems through district benchmarks and progress monitoring assessments. To evaluate the CPA model’s influence on these students’ math problem-solving skills, I met with the small group in 30-minute increments three days a week for three weeks to provide intervention instruction. Each week they utilized a different step of the CPA model.

During the first week, I provided each student with manipulatives that directly related to the word problems the group worked to solve. For instance, a word problem had students calculate the number of tables needed for an event, given the number of people. Provided materials included brown paper circles and rectangles to represent the two types of tables mentioned in the word problem. All three students worked through the word problem individually before the group discussed the processes and thinking. On the fourth day, I individually assessed the students, referred to as Students 1, 2, and 3, using concrete materials to solve a multi-step word problem. Though Students 1, 2, and 3 ended with the correct answer, they each sought support while working through the word problem. None could successfully solve the problem independently despite their access to concrete materials.

In the second week, I introduced pictorial models to represent the operations in word problems. Students used these pictures to work through word problems individually before the group discussed their methods and thinking. After three days of this intervention, the students individually solved a given word problem using pictorial representations. Student 1 found the correct answer using the correct abstract operational methods, but her drawings did not accurately represent the three operations in the word problem. Student 2 accurately represented and solved the first two steps of the word problem but did not attempt the final step. Student 3 accurately represented and solved the first step, but did not accurately depict or solve step two and did not attempt the final step.

To conclude the intervention, the final week reverted to the initial instruction the students had received: abstract computation of multi-step word problems. During the week, students were again given the opportunity to complete each word problem on their own before the group discussed methods and thinking. At the end of the week, I pulled each student to complete a word problem and explain her reasoning. Student 1 correctly identified the operations the word problem required, multiplication and addition, and correctly solved the first step. However, Student 1’s second step did not operate on the correct numbers from the word problem, so she did not find the correct solution. Student 2 correctly identified the addition the problem required but did not identify the multiplication until she explained her process. Student 2 went so far as to describe the entire correct process for solving the word problem but then decided that the incorrect steps shown in her work were the proper solution. Student 3 also identified the correct operations but performed them on the numbers in the order they appeared in the word problem.

The students’ assessment data did not support my intervention strategy. Students showed little growth in their conceptual understanding of given word problems.

**Implications/Recommendations**

Through this study, I found that teaching these students multi-step word problems using the CPA model was ineffective. If students do not have a conceptual understanding of an operation, they will not be able to successfully solve mathematical scenarios that require it. In the future, I plan to employ the CPA model when initially teaching the operations themselves. In analyzing my research results, I wonder whether building a foundational understanding of the concrete construction and visual representation of the four operations would produce more success in students as they solve multi-step word problems.

**Reference(s)**

Putri, H. E., Suwangsih, E., Rahayu, P., Nikawanti, G., Enzelina, E., & Wahyudy, M. A. (2020). Influence of Concrete-Pictorial-Abstract (CPA) Approach on the Enhancement of Primary School Students’ Mathematical Reasoning Ability. Elementary School Forum (Mimbar Sekolah Dasar), 7(1), 119–132.

Purwadi, I. M. A., Sudiarta, I. G. P., & Suparta, I. N. (2019). The Effect of Concrete-Pictorial-Abstract Strategy toward Students’ Mathematical Conceptual Understanding and Mathematical Representation on Fractions. International Journal of Instruction, 12(1), 1113–1126.

### A Token Economy and Visual Timer Approach

**Primary Researchers **

Zoe Wilson, Intern, Baylor University

Kelli Zander, BS Psychology, Mentor Teacher, South Bosque Elementary, Midway ISD

Lisa Plemons, MS Ed, Professor, Baylor University

**Rationale/Introduction**

In my ECSE classroom, I have observed a 4-year-old Hispanic female with autism who can consistently and independently check her visual schedule, take the picture to the center, and attach it in its correct place. However, she then proceeds to a more preferred activity, avoiding the center altogether. Fittipaldi-Wert (2007) “found positive findings for the use of visual supports in acquiring skills, increasing social interactions, and decreasing off-task behaviors.” This research project examines the changes produced by using a token board and visual timer to increase this student’s time and engagement in centers.

**Question/Wondering**

How do a token economy and a visual timer impact the engagement in centers of a 4-year-old Hispanic female with autism?

**Methodology/Results**

To begin my research, and after discussing it with my mentor, I first removed known distractors. These included the sign-in board from which the student repeatedly removed the names, and a blue floor tile she would lie next to and stare at, which I covered. Next, I gathered baseline data for one week to clearly understand my student’s current engagement in centers. Her routine includes eating breakfast and, once finished, checking her schedule for a center icon. She places the general center picture on a black foam board that holds six specific center icons. The student is accustomed to this process but, after placing an icon at its coordinating center, continues on to a preferred activity. I prepared a duration chart to record the start and end times of each center and activity in which my student interacted, along with any additional observations and context. During baseline, my student spent 37.5% of the time given playing in centers. She alternated among three centers over a span of forty-eight minutes during the week: fifteen minutes in books, four and a half minutes in sensory, and three minutes in toys.

To conduct the intervention, I created a penny board and gathered icon pictures of preferred items. I selected items from a preference survey sent to my student’s parents and items she had demonstrated an interest in at school, specifically goldfish, the iPad, and the swing. As she finished her breakfast, I led her to check her visual schedule for centers and presented the penny board, giving her two reward options to work for. She selected a specific center’s icon and placed it at its coordinating center. Once there, I awarded one penny; she received her preselected item upon earning five pennies. I gave additional pennies periodically as she played with toys, initially rewarding every small interaction with the center. As she learned that she would later receive her preferred item, I gradually lengthened the time between pennies.
If she left a center before earning the pennies needed, I led my student back to the center board to choose from the remaining options. Once she had five pennies, I presented the preferred item. When goldfish were the selected reward, I gave her one before continuing centers; when the reward was an activity, I set a visual timer for two minutes. Displaying this timer prepared her for an easier transition, since she could anticipate her time ending. She repeated the process of choosing another center and earning her chosen reward through pennies as time allowed before the class’s beginning circle. Over the two weeks I collected intervention data, she increased her time in centers from 37.5% to 51% of the time given. Compared to the three centers visited at baseline, she interacted with five: books, sensory, blocks, toys, and home living. Across the two weeks, she spent 28 minutes in books, 9 minutes in sensory, 4 minutes in blocks, 12 minutes in toys, and 9.5 minutes in home living.

As stated previously, this project set out to change my student’s pattern of merely matching each center’s coordinating icon and then leaving, so that she would routinely interact in the centers. Based on the data, the percentage of time my 4-year-old Hispanic female spent in centers increased, and she also explored two new centers.

**Implications/Recommendations**

Reflecting on this study, I identified both strengths and weaknesses. The study was strong in that I worked with my student in the mornings after she finished breakfast, while other students also participated in centers. This allowed me to focus on her more than if I were leading a lesson, and it provided opportunities for social interaction with peers: on two occasions, a classmate invited her to play with him. Removing known distractors also allowed me to examine how she interacts with centers. Weaknesses existed as well. One weakness was that the amount of time offered for centers varied daily based on how long she ate. A larger weakness is that both the baseline and intervention data included me leading the student to choose a center, so the process missed the independent component.

This study demonstrated that my student responds well to working for a reward on a penny board, so I intend to implement this in instruction. Two areas that might prove beneficial are whole-group activities and table time. I will add reward options as she expresses interest in other items.

Wonderings also surface as I reflect on this study. One is how decreasing the reward activity’s duration to one minute would affect this student: would that increase the overall time in centers or create frustration? Additionally, I would be curious to see whether removing the student’s top choices, books and toys, would increase time in unfamiliar centers or create resistance to centers altogether.

**Reference(s)**

Fittipaldi-Wert, J. (2007). The use of visual supports for students with autism in inclusive physical education (Order No. 3273355).