Was HB7069 Just Another VAM Scam?

When parents complain about over-testing, local school board members often advise them to write to their legislators. When similar concerns are brought to the Florida Department of Education (FLDOE), parents are told “these are local decisions.” Which is it?

In April 2015, following a year in which parents had loudly expressed their displeasure with the explosion of testing in public schools and the new Florida Standards Assessment (FSA) had a disastrous rollout, HB 7069, “An act relating to education accountability,” was signed into law and immediately went into effect.

HB 7069 promised to (among other things):

  • reduce “state and local assessment requirements, including those commonly associated with progress monitoring.”
  • “eliminate prescriptive remediation and progress monitoring requirements for low-performing students and provide for targeted instructional support in reading in K-3 students.”
  • grant “districts greater flexibility in measuring student performance in courses not associated with statewide, standardized assessments and in evaluating instructional personnel and school administrators”

With such lofty aspirations, why haven’t we seen a general reduction in progress monitoring, remediation, and testing in our classrooms?

Reduction of Progress Monitoring:

Much of the overall testing in the classroom comes in the form of “progress monitoring”: assessments designed to quantify a child’s progress, often used to predict performance on upcoming state testing. I like to call these “the test that tests whether you’re ready to take the test.” Prior to HB7069, progress monitoring was required to assess the reading skills of all K-3 students. Additionally, after third grade, progress monitoring was required for any student scoring a Level 1 or 2 on a state assessment. In many districts, ALL students were progress monitored, regardless of FCAT/FSA score, AND K-3 students were often progress monitored in Math in addition to the required progress monitoring in Reading.

The use of standardized testing in K-3 students is problematic to begin with, ignoring developmental differences amongst our youngest learners and focusing the classroom on structured academic learning rather than more effective and developmentally appropriate play-based learning. In general, standardized tests are considered invalid and unreliable for children under 8 (read more here, here and here).

Section 2 of HB7069 (Section 1002.20, F.S.) deleted the requirement that each elementary school regularly assess the reading ability of each K-3 student. This was good news and could have eliminated almost ALL of the standardized testing in K-3. Districts were still required to monitor students who were not meeting the performance requirements, but surely qualified teachers could have identified those children in need of intervention.

Sadly, our “littles” continue to be tested and retested in ways that are not developmentally appropriate and, more than likely, are harmful. Where are the districts that took the opportunity HB7069 offered and eliminated standardized progress monitoring in their K-3 classrooms? My own county (Monroe) continues to progress monitor all K-3 students in Reading and Math, using computer-based standardized testing (read more about the inappropriateness and unreliability of computer-based testing in young children here).

Instead of requiring progress monitoring for all students (section 1008.25, F.S.), districts were given three options for monitoring students who were “not meeting the school district or state requirements for satisfactory performance” or did not score a Level 3 on a state standardized assessment. Any such student must be covered by one of the following plans:

  • A federally required student plan such as an individual education plan (IEP);
  • A schoolwide system of progress monitoring for all students, except that a student who scores Level 4 or above on the specific subject area statewide assessment may be exempted from participation by the principal; or
  • An individual progress monitoring plan.

Districts could have chosen to individualize the progress monitoring plan, dramatically decreasing the amount of standardized testing for many students. Sadly, many (if not all) districts have chosen the second option: monitor ALL students in Math and Reading, whether they need it or not.

Eliminate Prescriptive Remediation / Intensive Reading:

Prior to HB 7069, the state required that any student scoring a Level 1 or 2 on state assessments be enrolled in remedial courses. This led to the uncomfortable situation of students being enrolled in both Advanced Placement English Literature and Intensive Reading at the same time. The mandated remedial courses also meant many struggling students lost their electives to remediation.

Sections 3 and 4 of HB 7069 (Section 1003.4156, F.S. and 1003.4282, F.S.) eliminated the requirement for middle and high school students scoring Level 1 or 2 on state testing to automatically be enrolled in a remedial course. The decision to provide remedial courses is now supposed to be a local decision.

Section 9 of HB 7069 (section 1008.25, F.S.) discussed the use of “support” as opposed to remediation: “each student who does not achieve a Level 3 (satisfactory) or above on a statewide, standardized assessment must be evaluated to determine the nature of the student’s difficulty, the areas of academic need, and strategies for providing academic support to improve the student’s performance.”

So, mandatory remediation is no longer required and placement in remedial courses is now a district decision. Still, in many districts, students continue to be placed in remedial courses primarily, and sometimes entirely, based on test scores. My county chose to use progress monitoring data to identify students in need of remediation, even, in some cases, when the teacher did not, or would not, recommend the child for such interventions. Why not return these decisions to the classroom teacher, who would surely be better at determining “the nature of the student’s difficulty, the areas of academic need, and strategies for providing academic support to improve the student’s performance” than a test score?

Flexibility in District-Created Final Exams:

Perhaps the biggest outcry over testing during the 2014-15 school year was the unfunded state mandate that required districts to create common final exams for all grades and all subjects, including Kindergarten art class and 1st grade PE, with test results to be used, in part, to evaluate teachers. That “all grades, all subjects” district final mandate came from SB736. Called the “Student Success Act,” SB736 also mandated the use of test scores in teacher evaluations (more about VAM below) and was passed in 2011 as a prerequisite to obtaining Race to the Top funding. SB736 was the first bill Governor Scott signed after taking office.

Section 7 of HB 7069 (Section 1008.22, F.S.) eliminated SB 736’s requirement for “all grades, all subjects” district-created final exams. This should have allowed districts to return to teacher-created final exams in most situations. Sadly, many districts are continuing to use and develop new district final exams in non-state-tested subjects. District-created midterms are not uncommon and some districts have created common 9-week assessments.

Why, when given the opportunity, didn’t districts choose to move towards other methods of measuring student performance? The initial mandate allowed for performance or portfolio assessments, but districts still seem to be moving toward multiple-choice common midterms and finals for many of their middle and high school courses. Why not use teacher-created exams, performance assessments or portfolios? When teachers create the assessment, children are tested on what has been taught. When districts create the assessment, teachers must teach what will be tested and the results become as much an assessment of the teacher as of the student.

Why Is There Still So Much Testing?

HB7069 appears to have returned the responsibility for progress monitoring, remediation and final exams to the districts. School Boards were given the opportunity to dramatically decrease the amount of testing in their districts. They could choose to eliminate standardized testing in K-3. They could choose to individualize progress monitoring for their struggling students and eliminate it for the rest. They could choose teacher-created final exams.

Or could they?

[FLDOE slide from their FOIL presentation on HB 7069]

Near the end of their FOIL presentation on HB 7069, the FLDOE described their “opportunity”: “To reclaim the powerful potential of VAM to support leaders in making data-driven decisions that support student learning and educator growth.” VAM (or Value-Added Model) is a formula designed to evaluate teachers based on student test scores. Rarely are teacher-created assessments or decisions included in “data-driven decisions.”
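
To make the concept concrete, here is a heavily simplified sketch of the kind of covariate-adjustment model that VAM systems are built on (the terms below are illustrative; Florida’s actual formula is far more elaborate):

this year’s score ≈ (score predicted from prior-year scores and student characteristics) + (teacher effect) + (unexplained error)

The “teacher effect” term, essentially the leftover difference between a student’s predicted and actual scores that the model attributes to the teacher, is what gets reported as the VAM score. Everything the model fails to capture, from class composition to a bad testing day, lands in that same leftover, which is the heart of the statistical critique.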

The use of VAM in teacher evaluations is one of education reformers’ favorite ideas, despite the fact that the method has been discredited by multiple studies (here, here, and here) and, frankly, defies common sense. Steven Klees, a professor at the University of Maryland, wrote:

“The bottom line is that regardless of technical sophistication, the use of VAM is never [and, perhaps never will be] ‘accurate, reliable, and valid’ and will never yield ‘rigorously supported inferences’ as expected and desired.”

In this 3-minute video, Stanford Professor Emeritus Edward Haertel further explains the flaws of using Value-Added Models for teacher assessment.

Teachers harmed by their VAM scores are beginning to sue. In May, a judge in New York found in favor of fourth-grade teacher Sheri G. Lederman, calling the state’s use of VAM in her evaluation “arbitrary” and “capricious.” Similar cases are popping up across the country.

Why continue to use such a controversial and discredited method of teacher evaluation? Who benefits? The answer may be as simple as following the money.

The design and implementation of VAM in Florida public schools was contracted to American Institutes for Research, or AIR (the same company that later won the bid to create the new FSA). The initial contract was worth almost $4M, with millions more in annual “maintenance” fees under this long-term contract.

I was, personally, surprised to discover that the “VAM formula” had annual maintenance fees. Floridians are spending massive amounts of taxpayer money, annually, on a flawed formula that has essentially been discredited and has been declared arbitrary and capricious in a court of law. The FLDOE should expect to pay even more tax dollars defending against the VAM lawsuits that are sure to come. Paying vendors excessive amounts of tax dollars for unproven, or disproven, education policies in the name of “Accountability”? Sounds like education “reform” at work…

The AIR representative coordinating the VAM contract, which led to the development of Florida’s Value-Added Model, was Christy Hovanetz, who served as Contract Manager. Prior to working for AIR, Ms. Hovanetz served as the Assistant Deputy Commissioner of the FLDOE. Ms. Hovanetz now works as a Senior Policy Fellow at Jeb Bush’s Foundation for Excellence in Education (FEE). Not surprisingly, the FEE’s Florida affiliate, the Foundation for Florida’s Future (FFF), applauded the initial implementation of VAM, stating “This model will transform Florida’s historic law into a powerful tool to raise the quality of public education and establish the Sunshine State as a national model for teacher quality.”

It should come as no surprise to anyone that Jeb Bush and his Foundations have had unprecedented influence on Florida education policy, apparently without regard to cost. The state’s continued allegiance to VAM is a perfect example of that influence. One often wonders why Florida would continue to promote questionable policies and fund flawed practices, essentially ignoring the research data to the contrary and often defying even common sense (I’m looking at you, and your “Best and Brightest” bill, Erik Fresen). You don’t have to look too far to understand that the policy makers have been more interested in keeping Jeb, and perhaps his investor friends, happy than in doing what was right for the public school children of Florida. Senator Gaetz said as much to Politico reporter Jessica Bakeman:

Reflecting on the bill’s fate during the session that ended in March, Gaetz, a term-limited Niceville Republican, said House Republicans resisted changes because of their loyalty to Bush. But he argued they were wrong to assume Bush would oppose his plan.

  To read the full article, click here.

Who benefits from VAM? Test developers like AIR (and their investors), who profit from the formula’s use as well as from the tests needed to provide the data, and education reformers, like former Gov. Bush, who can use the data to further the narrative that public schools are failing and teachers are to blame. With Gov. Bush back in charge of the FEE, parents will have an even more difficult time being heard.

So, herein lies the problem: If the purpose of progress monitoring, remediation and common finals is to collect data for VAM and other data-driven decisions, and the FLDOE wanted to use this opportunity to “reclaim” VAM’s potential, then HB7069 was a sham. Legislators, bombarded with concerns about over-testing, were convinced to vote for a bill that merely shifted the blame for over-testing from the state to the districts and maintained the data collection requirements needed to sustain the use of VAM. HB7069 made it possible for Tallahassee to say “we have given control back to the districts” while districts understand the truth behind that illusion. State requirements continue to force districts to comply with data-driven accountability mandates, leaving little room for true local control. Under constant threats of decreased funding, districts focus all their efforts on maintaining compliance.

To be fair, HB7069 did eliminate the 11th grade FSA ELA, before it was ever administered, and the mandatory administration of the PERT assessment, previously given in 11th grade. It also capped state and district testing at 5% of total school hours, or a ridiculously high 45 hours per year (45 hours is roughly 5% of the 900 instructional hours Florida requires for grades 4-12), yet no system was set up to monitor that cap and few, if any, districts had to reduce testing to get in under the bar. The bill also reduced the weight of test scores in teachers’ evaluations from 50% to 33%. These components may be admirable, but they do little to lessen the real impact of high stakes testing on our kids.

In the end, parents complained about excessive testing and Tallahassee responded by shifting the blame to the districts, which is ironic given that the bill was titled “An act relating to education accountability.” Who is accountable for the over-testing problems in Florida’s public schools? If you ask Tallahassee, who created the policies, the answer appears to be “not me.”

Parents are tired of laws that promise change but don’t deliver. Ignoring research data and continuing to pay millions for failed programs like VAM is not true accountability. Blaming others for the consequences of your own policies, by passing sham laws like HB7069, makes a mockery of the word “accountability.” Enough is enough. We are AGAIN asking for a full review of the accountability system in Florida.

And this time we suggest listening to someone besides just Jeb!

Algebra 1 EOC: Ridiculously High Stakes

5/22/2016, ACCOUNTABALONEY UPDATE:

At the 2016 Spring FOIL Conference (Florida Organization of Instructional Leaders), the FLDOE gave a presentation on the details contained in HB 7029: School Choice, one of the two long “train bills” from the 2016 session. To give you an idea of the bill’s complexity, the FLDOE summary presented was 135 pages long! It included this good news:

Section 27. Amends s. 1011.61, F.S…to “Delete the provisions requiring an FTE adjustment when a student does not pass an end-of-course exam required to earn a high school diploma. The FTE adjustment was scheduled to begin in the 2016-17 school year.”

In other words, the Algebra 1 Performance Based Funding, discussed in this blog, has been repealed. The other high stakes attached to the Algebra 1 EOC remain in place.


Occasionally, reformers will dismiss concerns regarding the high stakes attached to standardized testing in our schools by comparing these state mandated tests to the test you must pass to get your driver’s license. For example, in 2015, current Senate President Andy Gardiner told an Orange County Legislative Delegation (read about it here): “My driver’s test was high-stakes, because I didn’t want my parents driving me around anymore.” Yes, being driven around by your parents is terrible…

The stakes attached to the Algebra 1 EOC recently got quite a bit higher, so we decided this was a good time to compare the consequences of the current state mandated Algebra 1 End of Course exam (EOC), which we will argue has the highest stakes of all of Florida’s state mandated assessments, with having your parents drive you around when you are 16.

The current Algebra 1 EOC was created by AIR, the same company that created the Florida Standards Assessment (FSA), and was first administered in 2015. Cut scores for the 2015 administration were finally set in January 2016, resulting in an overall FAILURE RATE of 44% (with some schools and districts performing worse than others).

Passing the Algebra 1 EOC is a high school graduation requirement. Students who fail the 3-hour, computer-based assessment must either retake and pass the exam or earn a concordant score on the PERT exam.

A student’s score on the Algebra 1 EOC, like all of Florida’s EOCs, is worth 30% of the student’s course grade. Some Magnet Schools require the completion of Algebra 1 before a student can gain admission to their programs. Students failing Algebra 1 will be ineligible for admission to these highly sought after programs.

Student scores on the Algebra 1 EOC are used to calculate their teacher’s VAM score, which affects their teacher’s effectiveness rating and merit pay. Teachers receiving multiple ineffective ratings may lose their jobs.

Student scores on the Algebra 1 EOC are a major component of the Middle School A-F School Grade calculation, making up at least 11% of the possible points. Middle schools are rated on the percentage of eligible students who pass a high school EOC (most often Algebra 1, though occasionally students will be offered other courses with EOCs) or earn an industry certification.

[FLDOE slide: Middle School Grades calculation]

The number of eligible students includes both the students enrolled in the course AND every 8th grader who scored a 3 or higher on EITHER Reading or Math on their 7th grade state assessment. To be clear, whether a student is deemed eligible to take a high school level Algebra 1 course in 8th grade is determined NOT by performance on any Algebra readiness standards but by the previous year’s FSA Math and Reading scores.
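
To see how this plays out, consider a purely hypothetical middle school (the numbers are invented for illustration): 300 8th graders, 180 of whom scored a Level 3 or higher on the 7th grade FSA in Reading or Math and are therefore “eligible.” Only 60 of those students are actually enrolled in Algebra 1; 50 of them pass the EOC and another 10 students earn an industry certification. The percentage that counts toward the school grade is then

(50 + 10) / 180 ≈ 33%

even though more than 80% of the students who actually took Algebra 1 passed. A school’s points can suffer simply because not every “eligible” student was enrolled in a high school level course.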

[FLDOE slide: determining the number of “eligible” students]

Performance on all of the state mandated Math EOCs, including the Algebra 1 EOC, along with calculated learning gains, also represents 30% of the calculation of High School A-F grades.

[FLDOE slide: High School Grades calculation]

Schools achieving high letter grades receive extra funding, while schools receiving failing grades can be threatened with takeover or closure.

Performance on the Algebra 1 EOC is also calculated into the School District’s overall grade calculation.

[FLDOE slide: District Grades calculation]

As if the stakes described above weren’t high enough, there is now a thing called “Algebra 1 Performance Based Funding”, described in this slide from the DOE:

[FLDOE slide: Algebra 1 Performance Based Funding]

So, yes, you read that right. Beginning in 2016-17, if a student takes an Algebra 1 class and then, at year’s end, fails the mandated EOC, that student’s school must return 1/6 (representing the cost of one out of 6 classes) of the FTE money (this is the money the school receives to educate each student) back to the state. The money (amounting to over $1000 per student who fails the EOC) must be returned after a student has been taught by a teacher, in a classroom, for an entire school year… Schools in low income areas, with lower passing rates, will be hit particularly hard.
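
As a rough back-of-the-envelope check (assuming a base per-student FTE allocation somewhere in the neighborhood of $7,000; the actual amount varies by district and year):

1/6 × $7,000 ≈ $1,167 per student who fails the EOC

which is consistent with the “over $1000 per student” figure above.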

Apparently, schools will not be penalized if the student “subsequently enrolls in a segmented remedial course delivered online.” I called my daughter’s middle school and, at this point, they have been given no direction from the DOE as to what a “segmented remedial course” looks like, but they are scrambling to ensure it will be available to their students by the end of next school year.

So, yes, the stakes attached to the Algebra 1 EOC (high school graduation, 30% of course grade, Magnet School eligibility, teacher’s evaluation and job security, school and district grades and school funding) are ridiculously high and, I think we can all agree, are significantly worse than being dropped off at a high school dance by your parents.

The high stakes attached to the Algebra 1 EOC (and the rest of Florida’s state mandated assessments) are real, they lack common sense and, unless they are addressed, Florida will never have a valid accountability system.

In the meantime, all comparisons to getting your driver’s license are baloney.

 

Addendum: All images are from a FLDOE presentation entitled “School Accountability and State Assessments” at the Fall 2015 FOIL conference and can be found, along with lots of other great information, in their archived materials.

 

SB1360: Questions for Senator Gaetz

Somewhere in Florida, Spring 2017:

Anne is a student at school “A”, a perpetually “A” rated school in a relatively wealthy neighborhood in a high performing district. Her district chose the new SAT for their annual assessment and she scored a respectable 1270, including a 670 in Critical Reading and Writing and a 600 in Math. With those scores, she satisfies both the 10th grade ELA (formerly the 10th grade reading FCAT) and the Algebra 1 EOC graduation requirements AND she is exempt from taking the state mandated End of Course exams (EOCs) in Algebra 1, Geometry, Algebra 2, U.S. History and Biology 1. When she takes those classes, her course grades are determined by her teacher based on her class performance during the school year. She is delighted with the reduced number of state mandated tests she is required to take because it will allow her to focus on her Advanced Placement exams.

Carol attends school “C”, in a less affluent neighborhood in the same district. She scores a 1170 on her SAT (770 Critical Reading and Writing, 400 Math). Her near-perfect 770 in Reading satisfies the State’s 10th grade ELA graduation requirement, but her Math and combined scores mean that she will be required to take the state mandated EOCs in Algebra 1, Geometry, Algebra 2, U.S. History and Biology 1. For each course, her performance on the state mandated EOC will be worth 30% of her course grade. For Carol, little has changed in the amount of testing and she is envious of the students in her class who are not required to take the state EOCs. She will have little time to focus on the AP literature classes that she loves.

Frank is a student in the same district. He has always struggled with math, but he is an amazing artist and a creative writer. He dreams of a career as an illustrator or graphic artist. Because of his school’s concerns regarding his ability to pass the Algebra 1 EOC, he is placed in an Oracle Database certification course, instead of his desired Art electives, with the hope that he can meet the high school graduation requirements that way.

Does this sound like a fair and appropriate system to you? These scenarios are not far-fetched; they are based on the details of the proposed legislation in Florida’s SB1360, commonly known as the Seminole Solution.

This, in essence, is the accountability system that SB 1360, as currently written, describes: It allows students who obtain high enough math and reading scores on a nationally normed assessment to be exempt from the U.S. History and Biology state mandated End of Course exams (EOCs). It allows students who complete FAA Ground School or obtain an Oracle Database certification to be exempt from the high school graduation requirement to pass the Algebra 1 EOC. Fourteen pages of SB1360 (between pages 7 and 21) are devoted to describing the many options for exemptions from state required assessments.

SB1360 is NOT a simple substitution of a nationally normed assessment for the flawed Florida Standards Assessment (FSA).

SB1360 is, also, NOT a bill that allows parental choice in assessments. It’s the district’s choice. Parents are ONLY allowed to choose if their district chooses an alternate assessment and the parents prefer the FSA. In districts that choose the FSA, parents are given NO choice.

Senator Gaetz believes this bill will save “the infrastructure of accountability” in Florida’s education system. We are more skeptical.

We appreciate the attempts to minimize duplicative testing and to shorten the “testing season” with a pencil/paper option. We would have understood if, after the disastrous rollout of the FSA, a decision had been made to move to a different testing system (like, perhaps, the ACT system, which provides grade level testing through college readiness exams). We are not asking for the complete elimination of testing (annual testing is, of course, required by federal law) but would like the state to move towards a system involving appropriate use of test data, implemented fairly for all students and schools. In that regard, SB1360 is a giant step in the wrong direction.

SB1360 passed through the Senate Education PreK-12 committee (watch it here) with barely a discussion, other than to declare how great it was. In his close on the bill (at 1:23), Gaetz states “if there are things about the bill that are a bit rough, that blame belongs with me”, adding that, if the bill moved forward, they would keep working on it to make it better.

We hope Senator Gaetz was being genuine and is planning to make this bill better. As always, “the devil is in the details.” We hope these questions will guide that process:

Scoring:

  • Are the score reports from the SAT, ACT, and PSAT detailed enough to inform instruction? Can educators use the reported data from these tests to make necessary instructional improvements?
  • Is there an alignment study between ACT Aspire, ACT, SAT, PSAT and FSA that demonstrates these assessments can be used interchangeably? How can scores on different tests be used to fairly rank schools and districts or to measure annual learning gains?
  • When a school administers the SAT or ACT to all its students as a part of our Accountability system, who controls those scores? Are they released directly to the students or will they go to the state for review before being released to the public?
  • If a student scores a 1200 (75th percentile) on the new SAT (which is composed of Critical Reading, Writing and Math), they are exempt from both the U.S. History and the Biology state mandated EOCs (page 11). Why do scores on unrelated standardized tests exempt students from History and Science EOCs? Again, these EOCs are mandated to be 30% of a student’s grade. Is it fair to the remainder of the students in the class that these students are exempt from the EOC without demonstrating any proficiency in course content?

State mandated End of Course exams:

  • Students who score high enough on an alternate assessment are exempt from taking certain state mandated EOCs. Pages 7-12 of the bill outline the scores necessary on the ACT, ACT Aspire, PSAT and SAT to satisfy the requirements for the state mandated EOCs (Algebra 1, Algebra 2, Geometry, Biology and U.S. History). Why are there no comparable FSA scores to allow EOC exemption? In a fair system, all students should have the opportunity for EOC exemptions.
  • How is the requirement that state mandated EOCs are worth 30% of a student’s course grade fair, when some students are exempted from the EOC completely? How will exempted students’ course grades be calculated? Will this have an unfair effect on GPA calculation for students across the state?
  • If the top test takers are now exempt from taking the state mandated EOCs, what impact will that have on the passing rates of the EOCs? One would predict that if the top 25-50% of test takers were exempted, and the remaining cohort performed similarly to previous years, then the apparent failure rates would skyrocket (the number of students failing would remain relatively constant while the denominator, the number of students taking the exam, would be dramatically smaller; see the worked example after this list). How will this affect perceived achievement gaps? Will cut scores be adjusted downward to mitigate this? How will that affect the “rigor” of these classes?
  • If the top 25% or more of students are exempted from the Math EOCs because of concordant scores, how will middle school grades, which depend on performance on the Algebra and Geometry EOCs, be calculated?
  • How are learning gains calculated in a class where only some of the students take the EOC?
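
A quick worked example of the denominator effect flagged above, using invented numbers: suppose 100 students take the Algebra 1 EOC and 44 fail, a 44% failure rate. Now exempt the 30 highest scorers and assume the same 44 students fail the following year:

44 / 70 ≈ 63%

The reported failure rate jumps nearly 20 points even though not a single additional student failed.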

Validity:

  • Are the scores referenced in the bill from the old SAT or the new SAT? (Old SAT max score was 2400, new SAT (yet to be administered) max score will be 1600.) We believe this should be clarified in the bill.
  • Has the new SAT been validated? Has it been evaluated for fairness, reliability and validity for Florida’s vulnerable subpopulations of students: ESE, ELL, low income? We believe the answer to this is “not yet.”
  • Why are we considering the use of the new SAT, which has never been given before and has questions regarding its validity? Could this lead to the FSA fiasco all over again, for districts choosing the SAT?
  • Currently, ESE students can receive accommodations during the FSA administration. Are accommodations allowed for the SAT, PSAT, ACT or ACT Aspire? If accommodations are provided for the SAT and ACT, will the resulting scores be usable for college admission? Will a student allowed accommodations on the PSAT be eligible for a National Merit Scholarship?

School grades:

  • How will learning gains be calculated if the administered test changes from year to year within a district?
  • Passage rates on the Biology EOC and the US History EOC are components of the High School Grading Formula. How will this be impacted by student exemptions?
  • How can you honestly and fairly compare middle schools based on their advanced math scores when one school might have all students taking the Algebra 1 EOC, another might have mostly students exempt from the EOC based on ACT Aspire scores, and a third might have all their students take the FAA Ground School Industry Certification, avoiding teaching Algebra in that school altogether?

VAM – Teacher Evaluations:

  • How will VAM be calculated for teachers when only some of the students in the class are required to take the EOC and the rest are exempt due to a test score they achieved before enrolling in the class?

Parental Choice:

  • In SB1360, if a school district chooses to use an alternate assessment (PSAT, ACT, etc), a parent can elect for their child to take the FSA, instead (keep in mind, there is no other reason to administer the FSA, except for annual testing).  However, if a school district chooses to stick with the FSA, parents are NOT allowed to choose an alternate assessment to satisfy their child’s annual testing requirement (even though every school district will still be administering the ACT, PSAT and SAT). Why not allow parents to choose an alternate assessment?

Exemptions:

  • If SAT and ACT scores have been shown to correlate strongly with socioeconomic level, won’t this system of exempting students with high test scores from multiple other tests only minimize testing, on average, for more advantaged students and schools?
  • Why doesn’t passing an AP literature exam count as an exemption for the 10th grade ELA graduation requirement?

Funding:

  • How can the transition from primarily FSA/AIR created tests to the use of FSA, ACT Aspire, ACT, PSAT and SAT, with the number of counties choosing any one test varying every year, be a budget neutral endeavor?
  • Currently, schools receive an extra stipend for students enrolled in AP courses. If a student gains credit for taking an AP course just by passing the AP exam, does the school receive the AP stipend? If a student passes the AP exam without taking the associated course, is their AP exam score calculated into their high school’s school grade?
  • Has there been any consideration to having a three-year pause in the repercussions of our Accountability system (including funding implications), while we transition to this more complicated plan?

Alignment to new ESEA law:

  • We have heard there are 6 other states which use ACT Aspire for their annual testing. Do any of those states assess their 3-8th graders with two different tests?
  • Are there any states which use a smorgasbord of assessments for their annual assessment of high school students?
  • Can federal testing requirements to annually assess every student in math and reading be fulfilled by scores from multiple different tests administered in the same year?

As you can see, we have found quite a few things about this bill that seem “a little bit rough.” If Senator Gaetz is serious about wanting to make this bill better, he has his work cut out for him.

At Accountabaloney, we are not against testing; we want appropriate tests used appropriately. We want an accountability system that can be trusted. SB1360, though well intentioned, will cause chaos in the current, already shaky, accountability system. Districts will scramble to choose the testing scheme they predict will result in the highest school grades. The DOE will be overwhelmed by the requirements to determine comparable scores between a myriad of tests, set cut scores on nationally normed assessments, negotiate and renegotiate contracts with three different testing companies while remaining “budget neutral,” and develop a system to collect, monitor and compare test data from multiple sources. And, perhaps most concerning, high performing schools, where many students will be exempt from state mandated EOCs, will see a decrease in testing, while less advantaged schools, where fewer students will reach the threshold for EOC exemption, will see little or no reduction in testing. Is that fair and/or appropriate?

We hope Senator Gaetz and his team can address these issues. As written, this bill will not shore up the accountability infrastructure but will serve to make a shaky foundation even more unstable.

Get out your hard hats.  Watch for falling debris.