Quantitative Evidence

For my class, students take the Next Generation Sunshine State Standards (NGSSS) Civics End-of-Course Exam (EOC). This assessment is a criterion-referenced test covering American government and Civics content. The EOC is offered once at the end of the school year; however, Miami-Dade County Public Schools creates ten "mini-assessments" that are computer-administered throughout the year in preparation for the larger assessment. As students progress through the mini-assessments, teachers can make real-time decisions about instructional focus and can use the results as a basis for review and remediation strategies. The EOC is a useful tool for tracking student reading comprehension, as many of the test items require specific reading skills. This assessment becomes even more useful when teachers invest their students in the data, have them set goals around their scores, and ground them in the process of improvement.

TABLE OF CONTENTS

1. Item Specifications and Sample Questions

2. Student Benchmark Assessment Scores

3. Score Improvement Plan

4. Student Final Scores

1. Item Specifications and Sample Questions

One challenge teachers face in preparing students for the EOC exam is that the state does not release current or previous test items publicly. This means that teachers are responsible for providing aligned materials that will maximize students' potential for success on the exam. One tool the state provides to assist teachers is the item specifications. The document below describes the specific aspects of Civics content that will be tested, including the types of stimuli that will be used to gauge student knowledge. For example, the item for benchmark 1.1 (Enlightenment Thinkers) will use a graphic that asks students to distinguish between the ideas of John Locke and Baron de Montesquieu. While this information is not overly specific, it helps me as a teacher determine the most important information that I need to get across to students. A more detailed look at the item information for each benchmark can be found in the document below:

The document above holds a treasure trove of information that will guide teachers in preparing students for their End-of-Course examination. For each practice question, the state assigns a difficulty level ranging from easy through challenging. Additionally, the specifications describe what percentage of students state-wide would be expected to get that question right. This scale is very helpful as we engage in exam review because it allows me to see where my students stand in relation to their peers. When creating classroom assessments, I am sure to pull from each category of questions to ensure my students' mastery of the content, from recall through analysis.

SAMPLE QUESTION #1: Easy Complexity

The sample question below reflects an "easy complexity" question. As shown in the excerpt above, at least 70% of my students would have been expected to get this question correct in order to remain on pace with their peers across the state. This question was pulled from the state item specifications and included in a teacher-created benchmark assessment to ensure alignment to state standards. The correct answer for the question below is "D" in which the graph measure 148 out of 189 students answering correctly for a total of approximately 78%. As a teacher this tells me a few things: First, that my students as a whole are proficient in this benchmark as they are out-performing their state-wide peers. Second, it reveals any misconceptions that groups of my students have surrounding this benchmark. For example, a potential question that I could ask myself is "why did eleven of my students select Magna Carta for this answer choice? How can I better clarify why this answer is not Magna Carta during remediation?"

SAMPLE QUESTION #2: Average Complexity

The sample question below reflects a "medium complexity" question. As shown in the excerpt above, between 40% and 70% of my students would have been expected to get this question correct in order to remain on pace with their peers across the state. This question was also pulled from the state item specifications and included in a teacher-created benchmark assessment to ensure alignment to state standards. The graphic for this question is the same as in the proceeding question, however the answer choices have been made more rigorous as this assessment was given later in the school year. The correct answer for the question below is "B" in which the graph measure 109 out of 189 students answering correctly for a total of approximately 58%. As a teacher this tells me that my students as a whole are proficient in this benchmark as they are performing on-par with their state-wide peers. Second, it reveals any misconceptions that groups of my students have surrounding this benchmark. Since this question did not call for simple identification, but rather for application, I need to work with groups of students to work on applying what we know to a given situation.

SAMPLE QUESTION #3: Challenging Complexity

The sample question below reflects a "challenging complexity" question. As shown in the excerpt above, below 40% of my students would have been expected to get this question correct in order to remain on pace with their peers across the state. This question was also pulled from the state item specifications and included in a teacher-created benchmark assessment to ensure alignment to state standards. The graphic for this question is the same as in the other two question examples, however the answer choices have been made more rigorous as this assessment was given towards the end of the school year. The correct answer for the question below is "B" in which the graph measure 74 out of 189 students answering correctly for a total of approximately 39%. As a teacher this tells me that my students as a whole are proficient in this benchmark as they are performing on-par with their state-wide peers. Overall I was pleased with this result as my students fell at the higher end of the state designated range however a few red flags were raised with this question in particular. While many students selected the correct answer, a large contingency of students selected A as the correct answer which is incorrect. As the teacher this tells me that I need to review this question explicitly to clear up misconceptions involving voter registration and its relationship with the American Revolution. All in all my students made drastic progress in the span of these three examinations. Had I given my students questions with the distinction of "challenging" on their initial exam then I do not believe that they would score in the proficient range. Since questions were scaffolded over time and students built up skill sets and knew how to work through problems.

2. Student Benchmark Assessment Scores

In order to measure student progress along the way, the Miami-Dade County Public Schools Social Studies Department creates periodic Benchmark Assessments that are meant to mirror the End-of-Course Exam. At the conclusion of each topic, students are given a computer-administered exam featuring ten questions on the previous topic. The questions have been written and screened by district personnel to be aligned with state standards. Teachers do not have direct access to or input in the question creation and review process; however, they are informed by state item specifications on how to best prepare students for assessments. Based on End-of-Course exam scoring standards (which will be reviewed in the next section), the district draws the line for proficiency at around 50% on each benchmark assessment. That is, if a student answers at least five of the ten questions correctly, they are deemed proficient for that benchmark. These assessments are an excellent tool for teachers because they allow us to take stock along the way, measuring student progress and assessing performance. Using this data, teachers can then make informed instructional decisions about how to review benchmarks and re-teach when necessary. Using tools like differentiated instruction, I employed a variety of methods to increase student performance from benchmark to benchmark and get us as close to mastery as possible for our final exam.

The graph on the left portrays student benchmark assessment scores on each of the initial eight benchmark assessments from each of my six Civics classes. On the first few assessments, my classes by and large were falling below proficiency as defined by the district. While there are a variety of reasons for this, I needed to establish a plan to improve student performance across my classes. Towards the latter half of the semester, all of my classes scored above proficiency, with my advanced class scoring a 76.8% average on their final benchmark assessment, 26.8 percentage points above the proficiency line. In all, every class grew at least 15 to 20 percentage points over the course of the semester. My improvement plan is outlined in the next section.
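As an illustration of how I read each round of benchmark data against the district's five-of-ten proficiency line, the short sketch below computes a class average and flags students who need remediation. The student names and scores are hypothetical placeholders, not actual data from my classes.

```python
# A rough sketch of how I track one round of benchmark results against the district's
# proficiency line of at least five correct out of ten questions. The names and scores
# below are hypothetical placeholders, not actual student data.

PROFICIENCY_CUTOFF = 5   # questions correct, out of 10
QUESTIONS_PER_TEST = 10

benchmark_scores = {"Student A": 7, "Student B": 4, "Student C": 9, "Student D": 5}

needs_remediation = [name for name, correct in benchmark_scores.items()
                     if correct < PROFICIENCY_CUTOFF]
class_average = 100 * sum(benchmark_scores.values()) / (QUESTIONS_PER_TEST * len(benchmark_scores))

print(f"Class average: {class_average:.1f}%")     # 62.5%
print(f"Needs remediation: {needs_remediation}")  # ['Student B']
```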

3. Score Improvement Plan

In order to improve student benchmark assessment scores over time and ensure success on the End-of-Course exam, I needed to develop a strategy to assist students. Ultimately I decided on a two-part approach to facilitating student improvement. First, I needed to invest my students in reflecting on their performance and setting goals for improvement. Then, I needed to equip my students with the skills and strategies necessary to reach those rigorous goals.

Part 1: Reflection and Goal-Setting

The form on the left is a student data chat form recommended for use within my social studies department. This form is incredibly useful in investing students in their own progress, as I review the form with each student through individual conferencing at the end of the semester. Students are responsible for recording their data on the form at the end of each assessment. At the completion of the final benchmark assessment, students review their weakest and strongest benchmarks with the teacher in order to determine a plan for remediation in preparation for the End-of-Course exam. Finally, students answer a few reflection questions on their progress thus far.

Once students become acquainted with where they stand in relation to their former scores and those of their peers, they can begin to determine how they want to improve. As the teacher, I invest students in setting tangible goals for themselves; typically I push students to improve by at least 10% on each benchmark assessment. The student in the example on the left improved relatively steadily over the course of the semester, rising as high as a 90% on their final benchmark assessment. One reason for this student's success is that they pushed themselves over the course of each lesson to retain as much information as possible to "meet their goal." In addition to individual student goals, students were rewarded as an entire class in a contest against my other class periods: whichever class period scored the highest average, with all students scoring above proficiency, earned a pizza party. This was an effective motivational strategy because students were motivated not only to focus on themselves but also to assist their classmates in comprehending the material. The contest also manifested itself in students keeping track of one another with regard to classroom management and completion of classwork assignments.
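The goal-setting arithmetic itself is simple; the sketch below shows one way to generate each student's target for the next benchmark assessment from their most recent score. I am reading "improve at least 10%" as ten percentage points, capped at 100%, and the scores shown are hypothetical examples rather than data from the form.

```python
# A minimal sketch of the goal-setting arithmetic from the data chats. "Improve at least 10%"
# is read here as ten percentage points, capped at 100%; the scores are hypothetical examples.

def next_goal(previous_score, improvement=10, ceiling=100):
    """Return the target percentage for the next benchmark assessment."""
    return min(ceiling, previous_score + improvement)

last_scores = {"Student A": 60, "Student B": 95}  # hypothetical percentages
for name, score in last_scores.items():
    print(f"{name}: scored {score}%, next goal {next_goal(score)}%")
```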

Part 2: Test Taking Strategies

Once students were invested in their own progress, it was my job to equip them with the tools necessary to accomplish their goals. Since the End-of-Course exam is a seventy-five-question, multiple-choice exam, I wanted to give my students not only knowledge of Civics content but also test-taking strategies. From decoding questions to interpreting graphs, throughout the year I worked with students to ensure they had a full tool kit at their disposal to practice on their benchmark assessments and ultimately thrive on their End-of-Course exam. The document below describes some of the strategies that I worked on with my students over the course of the year.

4. Student Final Scores

Throughout the year, many of the activities and benchmark assessments for the course build up towards our culminating End-of-Course exam. The exam is taken during the last week of April over a two-day period, and scores are not released until the summer. Over the summer I call home to parents and students to let them know their scores as soon as I receive them. Oftentimes students are very excited to receive their scores (some even going as far as to email me every day!). Students are scored on a scale between 1 and 5, with a 3 or above being proficient:

In order to earn credit for the Civics course, students need to maintain a 2.0 grade point average for the entire year and score "proficient" on their End-of-Course exam. With that in mind, many students take the exam very seriously, as they are eager to move on to the next course. A passing score at level 3 begins at 394 points. Depending on student results in a given year, this typically equates to answering approximately 60% of the exam's questions correctly. Since different questions are assigned different numbers of points based on complexity, it is unclear exactly how much each question is worth. Because the test itself is secured from year to year, the actual questions are not released; however, student scores are reported, as is how individual schools compare with other schools in their district.
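Put as a simple rule, credit for the course comes down to two checks: the year-long GPA and the EOC achievement level. The sketch below encodes that rule using the one cutoff given above (a scale score of 394 for level 3); the GPA values and scale scores in the example calls are hypothetical.

```python
# A small sketch of the course-credit rule described above: a 2.0 GPA for the year and an
# EOC achievement level of 3 or higher. Only the level 3 floor (a scale score of 394) is
# given above, so that is the sole score cutoff used; the example values are hypothetical.

LEVEL_3_SCALE_SCORE = 394
MIN_GPA = 2.0

def earns_civics_credit(gpa, eoc_scale_score):
    """Return True when a student meets both the GPA and EOC proficiency requirements."""
    return gpa >= MIN_GPA and eoc_scale_score >= LEVEL_3_SCALE_SCORE

print(earns_civics_credit(gpa=3.1, eoc_scale_score=401))  # True  (hypothetical passing student)
print(earns_civics_credit(gpa=2.5, eoc_scale_score=380))  # False (below the level 3 cutoff)
```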

The document on the left describes the Civics scores for all schools in Miami-Dade County. The columns in the chart measure what percentage of students per school scored a 3 or above (proficient) on their End-of-Course exam. On page two, Norland Middle School had 57% of students score proficient on the exam. This figure aggregates my classes with those of the two other Civics teachers on campus. From 2015 to 2017, my time at Norland, our scores rose from 47% to 57% proficiency on the Civics EOC.

 

 

On an individual level, 76% of students in my classes demonstrated proficiency on their EOC, well above my school and district averages. A breakdown of each student's scores can be found in the chart below. All in all, my class was commended as one of the most improved in the district.

