Louisiana Believes – Teacher Erosion Holds Steady

Teachers in high poverty schools are more likely to flee Louisiana schools than teachers in low poverty schools

BATON ROUGE, La. – The Department of Education today released what we are choosing to call an analysis of teacher hiring, retention, and departure data over three prior years. We would like you to believe this contradicts recent assertions that teacher attrition has spiked 27% statewide this year, by ignoring the current year entirely. As the data show, three years before our most draconian policies were implemented, teachers simply fled the profession at a steady rate in anticipation of these changes. Obviously, now that the changes are being implemented, teachers are fleeing much faster. State data show that attrition rates among teachers experienced minimal to no variation over the three years before the policies took effect. In the 2009-10 school year, when VAM was not in place, 11 percent of teachers left the classroom; 12 percent left in both 2010-2011 and 2011-2012. Furthermore and forthwith, to make our argument seem even more nonsensical, new teacher hiring has increased statewide in recent years to compensate for the spontaneous pumpkinification of teachers, which is not considered an exit reason for our calculations.

The data also demonstrate that the state has seen success in retaining and promoting its teachers in wealthier districts and schools; teachers who are leaving the profession have tended to be from schools with higher concentrations of poverty than those who remain. Findings show that among teachers measured through a model called “value-added,” boasting a pace-setting 40% two-year accuracy rate, those who remained in the workforce the following school year were more likely to be teaching at “wealthier schools” than were those who exited. To put it another way, those teachers exiting the workforce were more likely to be teaching in “high poverty schools” than were those who stayed. In addition, recent studies released at various forums friendly to LDOE’s agenda show how no academic harm comes from the early retirement of experienced teachers (while we contend this flight does not happen, we have the data to show that the flight that is happening is meaningless anyway). The reason teacher quality is not important for low performing/high poverty schools is that funding is tied to performance now, so low performing schools get less money than high performing schools – making their task of closing the gap impossible. Eventually these schools will become eligible to be taken over by one of our eager campaign donors, and then we will fund them enough to generate profits and even more campaign contributions.

Superintendent John White said, “It is important that we claim teachers are staying in our classrooms at normal historical rates to keep people from being alarmed. But more important, the data show that we are jettisoning teachers at our low performing schools, making privatization all but assured. Our report also shows that we are driving many of our highest performing teachers out of the profession entirely and into administrative roles. By our own estimates, 27 percent of effective teachers who left the teaching ranks over the past three years did so to accept a promotion to an administrative position. LDE firmly believes students learn best when the best teachers are free from the responsibility of teaching students, and our declining test scores, which I inflated by at least 15 points last year, clearly show that.”

The Teacher Retirement System of Louisiana (TRSL) recently reported a 27% increase in the rate of teacher retirements based on questionable calculations like the numbers of teachers seeking retirements and exiting the profession. LDOE feels it is worth mentioning that “retirement” covers only teachers who can retire, and their numbers do not include spontaneous pumpkinification either! Because their numbers are obviously much more accurate than the LDOE’s make-believe numbers, and clearly show a dramatic increase in teacher retirements, we feel we need to use a lot of complicated words, phrases, and doublespeak to make it seem like we have a rational argument to make. Such employees choose to end their careers for a variety of reasons, many of them financial, some spiritual, some win the lottery, one or two probably join the circus. . . Frankly, we need to hire one of these veteran teachers to write our press releases, because these releases are filled with massively flawed logic and make us sound like uneducated simpletons. (Nevertheless, we shall continue to release meaningless press releases.)

“Take note,” said John White. “I am using vehemence to make up for my lack of salient points! The Department’s data clearly disproves that teacher attrition is peaking. Also note that the number of teaching licenses granted to new teachers in the state actually accelerated over the past several years. Despite what ‘logic’ may tell you, an untested subject I might add, the fact that more veteran teachers are retiring, combined with an acceleration of new teacher licenses, means veteran teachers are staying at the same rate.”

John White continued, “I like to say the same thing many ways to drive home my erroneous points. The data show that since the issuance of teaching licenses is way up, we do not have a statewide shortage of teachers. Pretend for the moment that doesn’t just mean we are hiring more teachers to fill in for all the fleeing ones,” said White. “Equally important, let me distract you by saying the Compass system focuses on identifying, developing, and keeping great teachers. That is a big change, and it is working because I have great data that shows it is working, and even greater interpretive skills to believe it is working.”

“In closing I would also like to insert a few random numbers here for you to consider,” said John White. “13%, 9200, 27% and 52.3. Those are numbers. That is data. Data has meaning to me and it should to you too.”

To read the Department’s report on teacher attrition and hiring, please click Data Report for Superintendent White.

Well, they didn't retire exactly. . .

Louisiana Believes – in Digitizing Children

BATON ROUGE, La. – The Louisiana Department of Education today released updated reports on the progress of technology readiness by messenger pigeon for the sake of irony. ALEC, the American Legislative Exchange Council, instructed Jindal on an excellent plan to save money – by fully digitizing our public school children by 2014-2015. This plan is expected to be even more cost-effective than virtual charter schools, and Louisiana actually expects to be able to make money on this venture by 2015. This will allow Louisiana to provide tax rebates to corporations and individuals making in excess of 10 million dollars a year – to continue with the creation of additional high-paying jobs overseas, and the creation of jobs domestically in the rapidly growing ass-wiping and food-tasting service industries.

State Superintendent John White said, “Every public school child deserves to be digitized. We believe students achieve high standards, so long as we lower the actual standards and re-label them as ‘high.’ Additionally, we have seen that digitizing people is possible from movies such as Max Headroom and The Lawnmower Man – and those movies are pretty old. The Department will continue to support districts in their efforts by providing quality, affordable technology options for digitizing their children and up-to-date information to make certain that 100 percent of our districts are prepared to digitize 110% of their kids. Our ultimate goal is to make certain that our public students are workforce ready. With that goal in mind, we intend to ensure your kids are easily uploadable into industrial machinery or other tools and gadgets that can be found at Harbor Freight or Brookstone.”

Districts and schools have worked to upgrade and enhance the technology available to digitize their children through everyday “classroom devices,” like meat grinders, sewing machines, and stone crushers. 82,754 devices meet the new standards, an increase from 67,038 six months ago. Both of those are big numbers. Districts now only need to purchase an additional 14,913 devices, down from 37,000 in July, which are also big, overly specific numbers meant to impress with our unnecessary preciseness.

Several districts have made notable gains in digitizing readiness:

  • Plaquemines Parish. Only 1 school was digitally ready in July 2012, now all 8 schools meet the recommended standards. Their students are now gainfully employed running elevators and mixing machines. Fancy ones.
  • Concordia. All 10 schools meet the recommended standards, up from only 1 in July 2012. Their students are earmarked for fully electricalized magic eight balls. (No shaking and turning required!)
  • City of Monroe. All 19 schools have been digitized. Their kids now control programmable refrigerators, alarm clocks, and Foreman grills.

Districts will continue to submit data on new devices or upgrades to current technology throughout the process of becoming digitizing ready by the 2014-2015 school year.

Non-public schools are exempted from this ambitious goal. ALEC believes it is important that we groom some of our children to actually purchase all of these new student-implanted devices.

Louisiana Believes – in Digitizing

Louisiana Be-be-be-be-believes

Introducing: Louisiana Believes Anything

BATON ROUGE, La. – In response to the feedback of pro-charter and reform groups, virtual school operators, and testing companies, the Louisiana Department of Education today announced a complete overhaul of its website. The website’s URL has changed to louisianabelievesanything.com, reflecting the state’s comprehensive plan to ensure every student is fleeced for the maximum state funding before they are tracked to prison or an exciting chicken-plucking career. With the change comes a redesign of the entire website with the goal of making navigation through the site easier, by eliminating all useful or historical content. The Department tailored the changes to address concerns that the old site was too revealing and contained accurate information that contradicted pithy press releases like this one. The new site reflects the premise of Louisiana Believes Anything. Based on this discovery, our true bosses expect high profit margins from warehousing students and incessant testing. Empowering charter operators and testing companies to charge exorbitant fees that can be kicked back to fund political campaigns and candidates favorable to this agenda is key to the success of the Louisiana Believes Anything mantra.

“As part of our commitment to providing clear, concise information to help families and educators make informed decisions, we redesigned our website to ensure virtual visitors can find useless information and photos of me with my sleeves rolled up handing out giant fake checks, quickly and efficiently,” said State Superintendent of Education John White. “Rather than an expansive list of programs and regulations, which we have discontinued in favor of hiring more PR folks and lawyers to fend off legal challenges to our draconian policies, the new site focuses on distracting visitors from the lack of support for true student achievement.”

Louisianabelievesanything.com is constructed with stooges, politicians, and the legally blind in mind, specifically the exclusion of anything that contradicts the narrative I’m trying to craft that is unsupported by the actual data being collected and reported. The new website also features a Library that contains some random documents, forms and other information about education in Louisiana that to a casual observer might seem useful. This Library was created with all of Louisiana in mind; realizing that I just asked BESE to stop requiring schools to fund actual libraries or librarians, I thought it would be ironic to create an empty, useless “library” on the Department’s website. In this new “Library,” families can find information about what their old libraries used to contain as well as a coupon for a free smoothie with the purchase of any LouisianaBelievesAnything John White action figure, complete with real sleeves that can be rolled up to wrist, elbow or even ripped off for an effect I call the “Rambo.” We’ve also included extremely summarized data detailing the state’s academic results we want to show, without any context or supporting figures. We originally shied away from showing even this much, but ultimately we were compelled to do so because we accepted a 4 million dollar grant from IES, the Institute of Education Sciences, in return for showing something data-esque-y – and John White action figures don’t buy themselves.

Additionally, information about Department programs and initiatives is now categorized under one of eight headings: Alphabet Soup, Teaching to the Test, You can’t spell Assessment without ASS, Accountability Shmountability, Funding Campaign Contributors, Early Childhood Lost, Shadow Schools, and Coursers and Other Big Horses. The new website also will highlight “Hot Topic” fashions, a Goth-inspired chain store. This was done in part to confuse people a little more, but mostly because they donated 5,000 dollars to Jindal’s election campaign and another unspecified sum to his wife Supriya’s “charity.”

“We encourage everyone to click around our new and charter approved website,” said Superintendent White. “If you have thoughts or suggestions to improve our website, please email us at louisianabelievesanything@la.gov so we can add those items to our list of things never to do.”

To access the Department’s redesigned website, please visit louisianabelievesanything.com.


Excellent essay from edweek on why Value Added is junk science

Probing the Science of Value-Added Evaluation

by R. Barker Bausell, edweek.org January 16th 2013

Value-added teacher evaluation has been extensively criticized and strongly defended, but less frequently examined from a dispassionate scientific perspective. Among the value-added movement’s most fervent advocates is a respected scientific school of thought that believes reliable causal conclusions can be teased out of huge data sets by economists or statisticians using sophisticated statistical models that control for extraneous factors.

Another scientific school of thought, especially prevalent in medical research, holds that the most reliable method for arriving at defensible causal conclusions involves conducting randomized controlled trials, or RCTs, in which (a) individuals are premeasured on an outcome, (b) randomly assigned to receive different treatments, and (c) measured again to ascertain if changes in the outcome differed based upon the treatments received.

The purpose of this brief essay is not to argue the pros and cons of the two approaches, but to frame value-added teacher evaluation from the latter, experimental perspective. For conceptually, what else is an evaluation of perhaps 500 4th grade teachers in a moderate-size urban school district but 500 high-stakes individual experiments? Are not students premeasured, assigned to receive a particular intervention (the teacher), and measured again to see which teachers were the more (or less) efficacious?

Granted, a number of structural differences exist between a medical randomized controlled trial and a districtwide value-added teacher evaluation. Medical trials normally employ only one intervention instead of 500, but the basic logic is the same. Each medical RCT is also privy to its own comparison group, while individual teachers share a common one (consisting of the entire district’s average 4th grade results).

From a methodological perspective, however, both medical and teacher-evaluation trials are designed to generate causal conclusions: namely, that the intervention was statistically superior to the comparison group, statistically inferior, or just the same. But a degree in statistics shouldn’t be required to recognize that an individual medical experiment is designed to produce a more defensible causal conclusion than the collected assortment of 500 teacher-evaluation experiments.

How? Let us count the ways:

• Random assignment is considered the gold standard in medical research because it helps to ensure that the participants in different experimental groups are initially equivalent and therefore have the same propensity to change relative to a specified variable. In controlled clinical trials, the process involves a rigidly prescribed computerized procedure whereby every participant is afforded an equal chance of receiving any given treatment. Public school students cannot be randomly assigned to teachers between schools for logistical reasons and are seldom if ever truly randomly assigned within schools because of (a) individual parent requests for a given teacher; (b) professional judgments regarding which teachers might benefit certain types of students; (c) grouping of classrooms by ability level; and (d) other, often unknown, possibly idiosyncratic reasons. Suffice it to say that no medical trial would ever be published in any reputable journal (or reputable newspaper) which assigned its patients in the haphazard manner in which students are assigned to teachers at the beginning of a school year.

• Medical experiments are designed to purposefully minimize the occurrence of extraneous events that might potentially influence changes on the outcome variable. (In drug trials, for example, it is customary to ensure that only the experimental drug is received by the intervention group, only the placebo is received by the comparison group, and no auxiliary treatments are received by either.) However, no comparable procedural control is attempted in a value-added teacher-evaluation experiment (either for the current year or for prior student performance) so any student assigned to any teacher can receive auxiliary tutoring, be helped at home, team-taught, or subjected to any number of naturally occurring positive or disruptive learning experiences.

• When medical trials are reported in the scientific literature, their statistical analysis involves only the patients assigned to an intervention and its comparison group (which could quite conceivably constitute a comparison between two groups of 30 individuals). This means that statistical significance is computed to facilitate a single causal conclusion based upon a total of 60 observations. The statistical analyses reported for a teacher evaluation, on the other hand, would be reported in terms of all 500 combined experiments, which in this example would constitute a total of 15,000 observations (or 30 students times 500 teachers). The 500 causal conclusions published in the newspaper (or on a school district website), on the other hand, are based upon separate contrasts of 500 “treatment groups” (each composed of changes in outcomes for a single teacher’s 30 students) versus essentially the same “comparison group.”

• Explicit guidelines exist for the reporting of medical experiments, such as the (a) specification of how many observations were lost between the beginning and the end of the experiment (which is seldom done in value-added experiments, but would entail reporting student transfers, dropouts, missing test data, scoring errors, improperly marked test sheets, clerical errors resulting in incorrect class lists, and so forth for each teacher); and (b) whether statistical significance was obtained—which is impractical for each teacher in a value-added experiment since the reporting of so many individual results would violate multiple statistical principles.

Of course, a value-added economist or statistician would claim that these problems can be mitigated through sophisticated analyses that control for extraneous variables such as (a) poverty; (b) school resources; (c) class size; (d) supplemental assistance provided to some students by remedial and special educators (not to mention parents); and (e) a plethora of other confounding factors.

Such assurances do not change the fact, however, that a value-added analysis constitutes a series of personal, high-stakes experiments conducted under extremely uncontrolled conditions and reported quite cavalierly.

Hopefully, most experimentally oriented professionals would consequently argue that experiments such as these (the results of which could potentially result in loss of individual livelihoods) should meet certain methodological standards and be reported with a scientifically acceptable degree of transparency.

And some groups (perhaps even teachers or their representatives) might suggest that the individual objects of these experiments have an absolute right to demand a full accounting of the extent to which these standards were met by insisting that students at least be randomly assigned to teachers within schools. Or that detailed data on extraneous events clearly related to student achievement (such as extra instruction received from all sources other than the classroom teacher, individual mitigating circumstances like student illnesses or disruptive family events, and the number of student test scores available for each teacher) be collected for each student, entered into all resulting value-added analyses, and reported in a transparent manner.

Vol. 32, Issue 17, Pages 22-23, 25

New Orleans Graduation Rate – Miracle . . . or Make-believe?

Recently we’ve seen a rash of publicity coming out of New Orleans and from Leslie Jacobs about the New Orleans turnaround miracle. According to her press releases and op-ed pieces in the Times-Picayune and Baton Rouge Advocate, New Orleans leads both the state and the nation in graduation rate. That would be some wonderful news . . . if it were true. I think this claim needs to be examined a little more closely and not just accepted as fact, and I will explain why.

For many years, New Orleans provided some of the worst data of all of Louisiana’s parishes, even before Katrina. I know this, because I was the one who had the thankless task of reviewing and pointing out obvious omissions – trying to get at least the outright ridiculous data fixed. Many of their schools failed to report discipline actions or attendance, or rather, some of their lowest performing schools were reporting perfect attendance, for every single student, and zero suspensions and expulsions, year after year. We had no audit or enforcement arm with any teeth at LDOE, and this parish was not the only one doing this, so this type of poor data was reluctantly tolerated. Most years, prior to Katrina, they had trouble sending in any data at all, and were always sending data down to the wire come submission time, forcing us to accept what they gave us or delay reporting anything for anyone.

After Katrina, in 2005, New Orleans was wiped out just as the 2005-2006 school year was beginning. Their data systems were under water, as were most of their hard copy student files, as were any residents unlucky enough to end up stranded in the city. This was a very stressful time for everyone, where everyone was forced to evacuate, and in some cases evacuate again after Hurricane Rita ripped another hole in Louisiana in the western part of our state. During this time we had to start making data collections monthly and tracking where all these students were going. Many of them ended up in other states, 48 of them plus D.C. if I recall correctly. Most of our evacuees from New Orleans ended up in Houston and Baton Rouge. Into this vast chaos came the charter schools, and the RSD, Recovery School District, was born.

Slowly the city was emptied of water and the reeking refuse of thousands of rotten refrigerators. Even the air quality in Baton Rouge, from all the mold spores filling the air from a rotting city 60 miles away, was quite poor (not that air quality in Baton Rouge with all of our chemical plants and cars is that great anyway, but it was definitely noticeable to allergy sufferers and asthmatics like myself). Gangs were roaming the streets at night in New Orleans, and in some places during the day, and the National Guard was patrolling the 9th Ward. FEMA was ridiculed, and rightly so, for being largely ineffective and disorganized. Ray Nagin, the Mayor of New Orleans, was making strange speeches about losing his “Chocolate City” as it became apparent that darker-skinned and less affluent residents were having a harder time returning to New Orleans than many paler-hued ones. Most police officers and teachers were unable to return or report to work because their homes were destroyed and housing was very limited (all Orleans teachers were also actually fired and told they could apply for new spots, if they opened up, but many charter operators developed a taste for much less experienced and less expensive TFA recruits). Many people had to commute to New Orleans on buses from Baton Rouge every day, and home every night. A number of my coworkers from the Louisiana Department of Education volunteered, or were volunteered, to supervise children in schools that were opening up. These “schools” were operating as not much more than glorified daycare facilities for the children that remained or were able to return. Not much learning happened in the 2005-2006 school year, but a number of my coworkers were physically assaulted and had their cars vandalized for their trouble. These data could not and should not be compared to anything, and reporting improvement over 2005 is a foolish and disingenuous claim to make.

It was into this chaos the New Orleans charter movement was ruthlessly spawned. People were afraid they had no other choice, and a lot of money eventually started flowing from federal recovery coffers. There were plenty of outstretched hands waiting to receive those funds.

The 2006-2007 school year was also largely lost. An assistant superintendent named Robin Jarvis chose, or was chosen, to try leading the fledgling RSD. A vendor named Tyler-Munis was selected by RSD to collect and report their student data. This collection system was never implemented. Even today, LDE has no idea how many students came and went through RSD’s doors. RSD did not send any discipline or attendance data for any of its schools that year (schools were keeping sketchy records on spreadsheets, or hardcopy, or nothing). The February collection resulted in RSD sending some students 10 or more times with different student ID numbers, sometimes at the same schools, sometimes at different ones, making it impossible to know how many students they had, or who had them, or when. RSD actually gave up and did not send any end-of-year data, so we had no information on graduates or dropouts, transfers in or out. This caused many other districts in the state to have dropouts they did not deserve, because the students that transferred to RSD never had their records transmitted by RSD to the state.

The Orleans Parish school system also reopened, and their data collections and reporting went much more smoothly with a different incarnation of their previous vendor. Orleans Parish kept most of the high performing magnet schools, some of which were the best in the state or nation pre-Katrina. The creation of the RSD system left them with the easiest students to serve, the more affluent and gifted ones, and RSD and the various charters were left with whoever was left over. Dozens of charters came and went over the next few years with varying degrees of compliance in submitting data and varying degrees of success in educating students, although many independent charters did not submit everything they needed to, or were obviously submitting false data. RSD was in charge of ensuring these charter operators were complying with state laws and policy, but they did a piss-poor job. RSD usually ignored our requests to investigate data issues of their own or of the charters they were supervising. It was during this time, 2006-2007 and 2007-2008, that charters started realizing they could be selective with their enrollments, or rather who they did not enroll. We noticed many of these operators were not enrolling very many special education students, claiming they did not have the personnel to address SPED student needs. This was true, this was by design, but this was not legal. These operators could and should have engaged the services of experienced special education teachers and necessary facilities, but those sorts of hires and purchases would have cut into profits (and CEO salaries for non-profits), so many charters did whatever they could to discourage and redirect these students to RSD.

In 2007-2008 Paul Vallas came to town. He brought with him a gaggle of cronies looking to make a quick buck, including his Right Hand Technology Man, Jim Flanagan. From the very first meeting with Flanagan it was obvious he already had a new vendor in mind, PowerSchool. He’d done business with them before, and even though his earlier implementation with them was a failure, he felt they had improved. (That was a big red flag to me. Flanagan actually relayed several projects he’d worked on in St. Louis and other places, all of which were dismal failures by his own reckoning, yet somehow he felt compelled to tell us that, as if failure were a prerequisite to working in New Orleans.) I was trying to encourage my folks to pick a local Louisiana vendor known to us as JPAMS. They had an exemplary track record with us LDE folks and their clients, and already represented more than half the state. (PowerSchool did not even have one client in Louisiana at the time.) After reviewing an embarrassing proposal from PowerSchool and a great one from JPAMS, we had to make our decision. Flanagan gave PowerSchool all 10s on his evaluation for every category, even though they told us they could only meet 1/3 of the RFP requirements, they could not meet our timeline, and they bid 2 to 3 times as much via some weird à la carte proposal. Still, I was told by my other team members to make the selection close, or PowerSchool would challenge the selection, so I went along; in the end I got who I wanted.

It took most of 2007-2008 but eventually JPAMS got RSD mostly straightened out by supplying many of their own personnel to do much of the data entry work for RSD. It wasn’t until 2008-2009 that LDE was finally getting decent data.

This was about the same time we started collecting truancy data. There was quite a bit of variation in how districts were reporting this to us. I came up with a pretty strict mathematical description of how I wanted this calculated, but we had a few districts that told us they would not comply. New Orleans Parish was one of those. Apparently their superintendent did not want to report that indicator as defined, because it would make them look bad to the media. If I recall correctly New Orleans Parish School Board reported a 0.027% truancy rate while average rates ran around 5% – 10% for those doing it correctly – based on the definition at that time. I believe RSD had a rate of around 30-40% and some of the charters had rates as high as 70 or 80%.
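The strict definition I pushed for isn't reproduced here, but the kind of calculation involved can be sketched. This is a hypothetical illustration only: the field names and the five-unexcused-absences threshold are my stand-ins, not the actual LDE definition districts were asked to follow.

```python
# Hypothetical sketch of a district truancy-rate calculation.
# The "unexcused_absences" field and the threshold of 5 are
# illustrative assumptions, not the real LDE definition.

def truancy_rate(students, threshold=5):
    """Percent of enrolled students whose unexcused absences
    meet or exceed the threshold."""
    if not students:
        return 0.0
    truant = sum(1 for s in students if s["unexcused_absences"] >= threshold)
    return 100.0 * truant / len(students)

district = [
    {"id": 1, "unexcused_absences": 0},
    {"id": 2, "unexcused_absences": 7},
    {"id": 3, "unexcused_absences": 2},
    {"id": 4, "unexcused_absences": 12},
]
print(truancy_rate(district))  # 50.0 — two of four students over threshold
```

With a definition this mechanical, a reported 0.027% against peer rates of 5-10% is not a judgment call; it is a sign the inputs were never counted.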

Another neat little trick that was employed in years past is changing the exit codes for students who were listed on “preliminary dropout rosters” to exiting out-of-state or to a non-public school. No documentation is required for this change. We’ve had a few enterprising principals who instructed their staff to “fix” their dropout problems. Changing exit codes to statuses we can’t track – like exits to non-publics (who don’t share student-level data with us) or other states (who also don’t share data with us) – takes care of the “problem.” Only a few schools have been caught doing this in the past; they were very obvious about it, and it was years ago, before the reigns of John White and Paul Pastorek, when LDE actually did a little rudimentary data auditing (interestingly enough, we did this auditing before the data was as important as it is now for school performance scores and tracking State Superintendent progress).
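The rudimentary audit that used to catch this is easy to picture. Here is a hedged sketch, with made-up exit codes and a made-up flagging threshold, of the kind of check that would surface a school re-coding its preliminary dropouts into untrackable exits:

```python
# Illustrative audit sketch: flag schools where a large share of students
# on a preliminary dropout roster were re-coded to exit statuses the state
# cannot verify. The code values and the 50% threshold are stand-ins,
# not the actual LDE exit-code set or audit rule.

UNTRACKABLE = {"OUT_OF_STATE", "NON_PUBLIC"}

def suspicious_schools(roster, recoded, flag_share=0.5):
    """Return schools where more than `flag_share` of preliminary
    dropouts ended up with untrackable exit codes."""
    flagged = []
    for school, student_ids in roster.items():
        if not student_ids:
            continue
        moved = sum(1 for sid in student_ids
                    if recoded.get(sid) in UNTRACKABLE)
        if moved / len(student_ids) > flag_share:
            flagged.append(school)
    return flagged

roster = {"A High": [1, 2, 3, 4], "B High": [5, 6]}
recoded = {1: "OUT_OF_STATE", 2: "NON_PUBLIC", 3: "OUT_OF_STATE", 5: "GED"}
print(suspicious_schools(roster, recoded))  # ['A High']
```

The obvious cases were exactly this obvious: entire rosters flipping to unverifiable codes in one update.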

These are just a few examples of data deception. I don’t want to bore you any more with my overly elaborate data collection horror stories, but suffice it to say I will be impressed if New Orleans is really graduating as many quality graduates as they are claiming. I think it’s worth mentioning that since I posted my blog entries (one and two) on Louisiana’s bogus dropout rate I’ve had confirmation that districts in Louisiana have been rolling over their adult education students for the last 3 years, keeping them from enrolling in grade 9 or becoming dropouts. Around 3 or 4 years ago an enterprising LEA found some language in the NCES (National Center for Education Statistics) dropout definition that permitted LEAs to exclude students from dropout counts if they enrolled in an “LEA monitored” adult education program and were pursuing their GED. I argued against making this change because I felt it could easily be abused, and because very few of these students actually earn GEDs. At most, all this change would do is shift dropouts to another year, if the data was entered correctly.

I ran some historical reports on this data, comparing GED completers to students who became dropouts after leaving school to enroll in adult education programs. I found that of the 7 or 8 thousand students who did this, fewer than 500 obtained GEDs within 2 years. Coincidentally, the average decrease in dropouts over the past 3 or 4 years is about 8 thousand students. (Apparently all of our school districts realized they were monitoring their adult education students working to obtain GEDs, and despite this careful monitoring, were unable to report when those students left those programs.) Students who exit school to an adult education program in 8th grade never enter a 9th grade cohort, and never “drop out” data-wise. They also never graduate and never end up in a denominator. If our graduation rate and enrollment are skyrocketing, why is the number of graduates remaining relatively flat or only showing modest gains?
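To see why those students vanish from the statistics entirely, consider how a cohort graduation rate is computed. A minimal sketch with hypothetical records:

```python
# Sketch of the cohort accounting described above. A student who exits
# to an "adult education" program before entering grade 9 never joins a
# 9th-grade cohort, so they appear in neither the numerator nor the
# denominator of the graduation rate. All records are hypothetical.
def cohort_grad_rate(students):
    """4-year cohort rate: graduates / students who ever entered grade 9."""
    cohort = [s for s in students if s["entered_grade_9"]]
    grads = sum(1 for s in cohort if s["graduated"])
    return 100.0 * grads / len(cohort)

students = [
    {"entered_grade_9": True,  "graduated": True},
    {"entered_grade_9": True,  "graduated": True},
    {"entered_grade_9": True,  "graduated": False},   # a visible dropout
    # exited to adult ed in grade 8 -> invisible to the cohort entirely:
    {"entered_grade_9": False, "graduated": False},
]
print(round(cohort_grad_rate(students), 1))  # 2 of 3 visible -> 66.7, not 50.0
```

The fourth student is neither a graduate nor a dropout in this arithmetic; the reported rate rises simply because the denominator shrank.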

Another nice little tidbit worth noting is that the author of these op-ed pieces, Leslie Jacobs, was also a member of BESE, the state board of education, when New Orleans went all takeover crazy. She is not exactly an unbiased source, since a positive view obviously reflects well on her decision and a negative report would not.

Still, it would be nice to verify this claim with even unreliable data, but the Louisiana Department of Education has made it a policy to deny most data requests, even simple ones. Just this past week they took down all their previously published historical data to make researching their claims that much more challenging.

To give you just one example of what most data requesters and researchers have to deal with . . . I was recently subpoenaed to testify about the existence of data that LDE claimed did not exist and claimed they could not provide to a researcher. The problem is, not only did that data exist, I created it before I left. I was asked to provide it to a different researcher, friendly to LDE, on the condition they only write nice things about them. (I have a copy of the MOU in my possession that says the researcher can only release data and reports that LDE approves of.) I also gave that same student data to a lawyer working with the Jindal office for the BP lawsuit. But other than those two groups, no one else . . . that I know of. Nevertheless, LDE refused to give data to the researcher who contacted me, so I agreed to help.

When the LDE reps and their lawyer saw me in the courtroom, I guess they decided not to perjure themselves that day. Instead, they changed course, stipulated to the existence of the data, and then argued that FERPA did not allow them to provide that data, and that in any event, they were under no obligation to provide data to anyone other than those they chose to share it with . . . and they asked for a continuance to make this new argument. It has now been over a year, and many thousands of dollars in lawyer fees, since the initial request was made for basic enrollment, entry, and exit data for New Orleans schools (data that might indicate, for instance, whether New Orleans schools, and especially charters, have an inordinate number of out-of-state transfers).

How long do you think it would take to get this data on graduates?

And if that’s not enough, LDE has made it a habit of excluding low performing schools from district composite reports so they can report continuous improvement. If those schools were included, RSD’s performance score would actually be declining according to Dr. Barbara Ferguson. As of the last count, 12 schools were excluded from the last calculation. I wonder how many schools might be missing in Leslie’s report. . .

Since LDE only provides summarized data or makes press releases announcing grandiose claims, without proffering any data to support those announcements and claims, I would strongly recommend against believing them. When I worked there, LDE actually produced and provided reports and data to anyone who asked – if we had the data and the work was not excessive. (Even then, the data had significant problems and limitations to its use.) Now LDE has very few people left in the data collection area qualified to review the data, or even understand it, and they have a superintendent who has determined that the data released must only say positive things about his work, even if it means accepting obviously impossible claims or excluding schools from calculations and reports that don’t forward the narrative he is trying to craft.

Until data is freely released and historical data is restored (data that would reveal, for instance, the existence of schools excluded in the future), I would strongly recommend against believing anything they claim. If you could show your claims were legitimate, that you could put people like me in our place, wouldn’t you want to share that data with the world to prove your claims and prove your detractors wrong? What LDE has done instead is remove all previous data from their website, even data contained in previous press releases.

LDE and John White even renamed their website www.louisianabelieves.com. Does that sound like the act of a confident person, or the act of a desperate coward? They really hope you will just believe, but you do so not just at your peril but with our children’s futures at stake.

When was the last time a politician told you to just believe him on blind faith, and that turned out well for you?

John White definitely has a god complex, but Louisiana, you don’t have to worship him.



John White’s Performance Review Press Release – Time Machine Edition

For Immediate Release – January 15th, 2013

BESE Congratulates Superintendent John White on performing an outstanding job

Chas Roemer, BESE President, explained to a packed Louisiana Purchase Room, filled to the brim with charter lobbyists and brown-nosing sycophants, that without John White’s leadership, Louisiana would have followed in the path of so many failures before it, by reporting accurate data.

“John White is a true champion of the Reform movement and Louisiana. He is not afraid to radically change our entire scoring system in order to make it possible for 97% of our high school students to believe they are achieving more than their predecessors. Under his leadership I have no doubt we will reach whatever seemingly random goal he sets for us, and even shatter it!”

“Under John White’s guidance, Louisiana has innovated to become a true leader in manufacturing faux performance gains. Whether it’s raising SPS scores by 15 points, excluding low performing schools from calculations, or excluding shadow schools altogether, White has been instrumental in showing Louisiana how to turn ignorant belief into mesmerizing reality.”

Holly Boffy, BESE cheerleader, exclaimed “Give me a W! Give me an I! Give me a G H T! Goooooooo Wight!” Mrs. Boffy was informed later of the actual spelling of John White’s name (although she remained skeptical it was not the same as the undead bloodsucking traitors of Tolkien lore.)

Students First, a corporate shill organization run and funded by charter organizations, has recognized John White’s outstanding contributions to their cause and given John White and Louisiana a top grade in conforming to their profiteering agenda. This endorsement came at some personal cost to Students First as they had to endorse a State that rated an “F” on achievement according to Education Week magazine, ranking slightly ahead of the absolute worst state, Mississippi.

When asked what his goal was for next year, John White replied, “Mississippi is going down! It’s about time Louisiana was the top of a list for once! We haven’t driven off all our best teachers and replaced them with crappy virtual school simulations and defunded traditional districts just to stay 49th!”

Based on the applause and cheers greeting this statement, most of those assembled appeared to believe John White could achieve this goal.

For Questions about this release, simply attend the BESE meeting at the Claiborne Building in downtown Baton Rouge, January 15th and 16th, 2013, while they review John White’s performance and BESE ignores anyone who raises objections.


The Mainstream Media is Dead

You can’t rely on any mainstream media anymore.

Maybe you never could, but the world seemed more sane and less corrupt when I thought you could.

For me, the realization fully crystallized when I recently had a story pulled, for the second time, on LPB (Louisiana Public Broadcasting).  It was about shadow schools (schools like the ones in Iberville that are not being reported to the state or federal government so small school districts can racially and financially segregate their students.)  I jumped through all the ridiculous hoops and roadblocks they kept throwing in my path and even agreed to do the interview non-anonymously, at no small personal risk to myself and my family (who were all against it.)

I know that if it were my children in those failing schools, being masked as adequate, I would really hope someone would tell me what was going on if they knew.  My children only have one childhood and one chance at an education.  Once that chance is lost, once my children’s formative years are sacrificed so wealthier, whiter kids have a better shot, there is no going back.

However, the editorial staff at LPB told the reporter I was working with, “it will never show on LPB.”

Not because there was anything factually untrue, but because there was “a chance” it might offend Jindal and John White.

Before I got involved with blogging and trying to contact and work with reporters and the media, I was under the false assumption that at least PBS and NPR were immune to meddling; that we had some sort of watchdog group looking out for us if we truly lost our way.

The sad truth is that corporate and political interests control all mainstream media.  If the media you watch or read employs people who depend on a job, those people are afraid to cover anything too controversial or to get so involved that they step on any toes.  And these days, who doesn’t depend on a job?

Sure, we have flavors of news, like TCBY ice cream flavors, but would a Fox News reporter be able to cover a Republican sex or corruption scandal the way they cover made-up ones like Benghazi?  Would Rachel Maddow spend a lot of time covering a possible Planned Parenthood issue or a possible issue with ACORN?  (I’m not saying the issues brought up about Planned Parenthood or ACORN are legit, just that if they were, we probably would not hear about them from MSNBC first, nor in the amount of salacious detail Fox would slather on such topics.)

If your news organization relies on advertising, government funding, or donations for its existence, it is beholden to those interests.  They are beholden to CEOs, shareholders, advertisers, and politicians.  Bobby Jindal wields line item veto power over anything in the budget in Louisiana.  LPB survives, in part, on state funding.  Bobby Jindal has a robust reputation for punishing those who say things he disapproves of, firing staffer after staffer who says anything, even under oath, other than the approved talking points.  I suppose I can understand why LPB would feel they have to appease Jindal and cover local festivals, 3-legged dogs, and large tomato contests rather than anything of substance, but that doesn’t mean I’m happy about it, nor that this is how I think things should be.

Bloggers can be wrong, but at least they are not afraid to be right – which is more than can be said for most reporters and media these days.

I’m sure I would feel differently if my own livelihood depended on not offending anyone, but isn’t that the problem?


A Guest column response, by Herb Bassett, to Superintendent John White’s reply to Deborah Tonguis’s question about the inflated 2012 Louisiana High School Performance Scores. (With notes from yours truly.)

An open response by Herb Bassett (one of the researchers who recently produced a paper on the inflation of Louisiana’s SPS scores, as well as on LDOE’s published intent to radically inflate next year’s scores) to Superintendent John White’s reply to Deborah Tonguis’s question about the inflated 2012 Louisiana High School Performance Scores. On November 29, 2012, Deborah Tonguis sent an e-mail to Superintendent John White asking for an explanation of the unusually high 2012 Louisiana High School SPSs.  In his reply, John White explained that there were three factors that led to the high scores, and that all three were determined in advance of his arrival. The original post that generated this response can be found here: https://crazycrawfish.wordpress.com/2013/01/08/john-white-just-a-leaf-in-the-win/

I was astonished! (Well, actually not really surprised.) I mean astonished to see John White’s assertion that he had no control over the High School SPS inflation in your recent article, John White: “I’m just a leaf in the wind.”

Quite the opposite, of course,  is true.  In June 2012, the Louisiana Register, the official publication of BESE rules, records that — while John White was Superintendent — a key formula for computing the 2012 High School SPSs was changed, adding an average of 4.0 points to the scores.  Who should we assume changed that formula?  Did John White sleep through the BESE meeting where it was passed?

Surely he knew about the change because at the same time he was busy writing (with help from STAND for Children) totally new rules for the 2013 School Assessment System.  Oddly, the very formula he changed in June goes away in his radical new rules for computing the SPSs for 2013.  Why did he even bother to change it?

When I started researching the scores, I thought my math chops (I have 30 college credits of math) would be the most important tool in my kit. But when I discovered BESE Bulletin 111, the official guide to the School Performance Scores, I quickly realized that I had to become a historian.  To understand the SPSs you have to understand the changes, year after year, to Bulletin 111 and how to find them.

Now, back to John White’s assertion that he had no control over the High School SPS inflation.

John White’s words:

The story is, as is said there, that three factors, each determined years in advance of my arrival in this role, let to a significant increase in scores. You can argue whether they are inflated per se or not, of course. But the intent of each factor was a good one at least. The factors are:

  1. The decision to count graduation rate in the high school SPS (the grad rate grew in response by more than four percentage points, increasing schools’ numbers dramatically)
  2. The inclusion of a bonus for all schools with rates over 65 percent (the increase in the rate above was compounded in its impact through this bonus)
  3. The onset of the EOC tests, and schools becoming accustomed to these tests, perhaps faster than anticipated

In each case, the Department made a decision years ago to give points in specific ways….

Having extensively studied the history of BESE Bulletin 111 through the Louisiana Register, I  will grant him points #1 and #3. I have issues with certain characterizations he slips through, but in the end, those things were in place before he arrived.

I do wonder if the order in which he listed the factors was a Freudian slip.  Considering the  assertion that he had no control over the scores, factor #2 is quite aptly numbered.  His word choice of “bonus” also shows that he understands how inflationary it is.

Now for the details and history lesson.

Point # 2 refers to the “cohort graduation rate adjustment factor” (CGRAF) described in Bulletin 111 section 613.  Section 613 has been changed over and over.  This is the history of section 613.

In mid-2010, BESE mapped out the changeover from basing the High School SPSs on the GEE to the EOC in 2012. At that same time they decided to introduce the CGRAF in the 2011 SPSs.  The purpose of the CGRAF was to spread the scores so that the schools with high graduation rates would be scored even  higher and those with low graduation rates would be made even lower.

In September 2010 the CGRAF was set so that in 2011, when it would first be implemented, schools with graduation rates above a 65% threshold would get extra points.  Below 65%, points were taken away.  In 2012 the threshold would be raised to 70%.  (Notice that the CGRAF both giveth and taketh away; it is not really a “bonus” as John White called it.)

Enter the push for privatization and “sweeping reforms”.  Now, just before the introduction of Letter Grades for the schools, it was decided that the CGRAF would benefit too many schools.  The schools needed to fail so that more kids would be eligible for vouchers.

So in August 2011, just before it was implemented for the first time, the CGRAF was changed so that the schools below 65% graduation rate would still be penalized the same, but schools would get a boost – and far less of it – only if their graduation rate was above 80%.

Due to a clerical error, this last-minute rule change did not get published until November 2011, after the 2011 SPSs were released.

In early 2012, when John White became Superintendent, the CGRAF was set so that schools would get the “bonus” only if the graduation rate was over 80%.

Then, inexplicably, in June 2012, the CGRAF was changed back to the original formula with one modification.  Remember how in Sept. 2010 the threshold was set to be 70% in 2012?  That was lowered to 65% by the John White administration.  Now schools got a “bonus” for a graduation rate over 65% instead of having to be over 80% (like the year before) or 70% (as originally intended).
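The practical effect of that threshold shuffle is easy to illustrate. In this sketch the schools and their graduation rates are hypothetical, and the point values are deliberately omitted (only the 4.0-point average and 6.75-point maximum are documented); the eligibility thresholds follow the Bulletin 111 history above:

```python
# Which schools qualify for the graduation-rate "bonus" under each rule?
# Thresholds follow the Bulletin 111 history; schools and rates are
# hypothetical, and point magnitudes are intentionally left out.
def bonus_eligible(grad_rates, threshold):
    """Schools whose graduation rate clears the bonus threshold."""
    return [school for school, rate in grad_rates.items() if rate > threshold]

rates = {"School A": 67.0, "School B": 74.0, "School C": 83.0, "School D": 58.0}

print(bonus_eligible(rates, 80))  # Aug. 2011 rule: ['School C']
print(bonus_eligible(rates, 70))  # originally planned for 2012: ['School B', 'School C']
print(bonus_eligible(rates, 65))  # June 2012 change: ['School A', 'School B', 'School C']
```

Lowering the bar from 80% to 65% triples the number of bonus-eligible schools in this toy example, which is the whole point: the June 2012 change handed extra points to a much wider slice of schools than either prior rule would have.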

Let’s be clear: that gave an average of 4.0 points and up to 6.75 extra points to the SPSs.  Apparently he had declared victory for getting the sweeping reforms through the legislature and wanted to celebrate.

The discrepancy between the GEE and EOC-based scores was evident before he took office.  The 2011 Scores were in and a comparison should have been available.  Did he not see the inflation coming? Why not?  If he saw it coming, why did he choose to inflate the scores even more?

He states that there were three factors put in place before he became Superintendent that he could not change.  Still, he changed #2, the easiest one to manipulate.  The record clearly shows that.  Why didn’t he try to fix the others?

In the end, it appears that the inflated scores are an embarrassment and he is trying to distance himself from them.  Data highlighting the inflation was mislabeled in the public release of scores.  And now John White says he had no control over the inflation, but the official record says he did. He not only took control, but made it worse.

Which will Louisiana Believe?

Herb Bassett


The Full exchange between Deborah Tonguis and Superintendent John White:

Dear Superintendent White,

I had hoped to see some sort of public response from the LDOE by now about the flaws in the statistical analysis of the Louisiana high school SPS scores.

Here is the link to the information I am referring to:


Dr. Mercedes Schneider and I could tell with the naked eye that something was very wrong with the high school scores the day they came out. Attached to this e-mail is a comparison of select Louisiana high school SPS scores from 2001 to the present. It doesn’t take someone with a Ph.D in Statistics (which she does have) to see the statistical impossibility of the huge gains in the last year.

That’s when we went to the math. I told her to send her analysis to you and the BESE board. She did, but to no avail.

How long will you make parents wait before you tell them that their child might be attending a “failing” high school due to an incorrect mathematical formula that inflated all Louisiana high school SPS scores? I, for one, am holding out hope that this was human error and not an attempt to inflate certain charter high school scores. People make mistakes, but then they admit them, correct them and make every attempt to heal the harm. I haven’t seen anyone from the LDOE even acknowledge that the scores are wrong, much less correct them.

Since my attendance at the BESE board meeting in October, I had hoped that you meant what you said about getting teacher leaders like me “on board” with the reforms we are implementing in our state. But I don’t feel like the LDOE is building trust with teachers when we ask for pertinent information about our own campuses and don’t get it. The public taxpayers, whose money you spend every day, deserve transparency from their government. I am disappointed that we are holding teachers to a higher standard than the elected and appointed officials entrusted to govern our educational system.

I tried in good faith to understand how the new legislative policies would drive real reform in our schools. I know, as someone who has spent 30 years of my life as a classroom teacher, that we can do better. I know that you have a heart for service. Let the public know that they can hand over their children to you, and that you will make wise, compassionate decisions on their behalf. The biggest steps in the right direction might be to take a cautious approach, and consider the feedback that employees like me are giving you.


Deborah Hohn Tonguis

John White’s reply:


Thanks as always. I’m sorry we have not been back sooner to Dr. Schneider. We receive hundreds of letters each week, and we try to get responses back as quickly as we can. In this case, I believe we have a response coming today.

To your point, and to her analysis, I think I’ve been as up front about the situation as possible. See this article: http://theadvocate.com/home/4474689-125/high-schools-raise-scores.

The story is, as is said there, that three factors, each determined years in advance of my arrival in this role, let to a significant increase in scores. You can argue whether they are inflated per se or not, of course. But the intent of each factor was a good one at least. The factors are:

1. The decision to count graduation rate in the high school SPS (the grad rate grew in response by more than four percentage points, increasing schools’ numbers dramatically)

2. The inclusion of a bonus for all schools with rates over 65 percent (the increase in the rate above was compounded in its impact through this bonus)

3. The onset of the EOC tests, and schools becoming accustomed to these tests, perhaps faster than anticipated

In each case, the Department made a decision years ago to give points in specific ways. We now see the results of those decisions. In one sense, the decisions achieved exactly what they were supposed to achieve: schools focused on graduation and on EOC tests. On the other hand, according to some, they skewed the results. One way or the other, they were decisions made for valid reasons with the best information the Department had at the time.

My role in specific has been to raise standards away from the system described above. Under my management, the Department brought the ACT and AP into the system, reducing the focus on EOCs alone. I did so precisely out of a desire to raise standards, in anticipation of the Common Core standards taking effect. If anything, I have been criticized for raising the bar too high and for the potential that schools’ scores will be dropped under the new system.

These are all difficult policy determinations. After all, this is just about people saying that something is important and making decisions about what they think the impact of their decision will be. But I think the story of this situation will be as it should: the schools met one challenge resoundingly and we raised the bar such that the next challenge is laid out accordingly.

Thank you again for writing.

John White

Louisiana Department of Education

Twitter @LouisianaSupe

CCF remarks – John White sounds too knowledgeable about the specifics for this oversight (lie) to have been accidental, but I’m sure my audience can make up their own minds about that.  While I would prefer to think he’s an unqualified, intellectually challenged, ignorant tyrant, his own words paint him as willfully deceitful, as someone just pretending to be ignorant of the implications of his own actions.  It makes a certain amount of sense. If you had control of the grading scale that would be used to evaluate your performance, and could give yourself a couple of letter grade bumps that would impact whether you stayed or received a raise, and no one would be the wiser, it would be tempting to go that route, no?

Gates MET value added debunking

Excellent initial rebuttal to the recently released self-serving Gates/MET “value added is great with other measurement methods thrown in!” report.

School Finance 101

Not much time for a thorough review of the most recent release of the Gates MET project, but here are my first cut comments on the major problems with the report. The take home argument of the report seems to be that their proposed teacher evaluation models are sufficiently reliable for prime time use and that the preferred model should include about 33 to 50% test score based statistical modeling of teacher effectiveness coupled with at least two observations on every teacher. They come to this conclusion by analyzing data on 3,000 or so teachers across multiple cities.  They arrive at the 33 to 50% figure, coupled with two observations, by playing a tradeoff game. They find – as one might expect – that prior value added of a teacher is still the best predictor of itself a year later… but that when the weight on observations is increased, the…


Finance 101 rheeport

Excellent analysis of the meaninglessness of being ranked high by Students First. If anything, getting high marks in education progress from this group is like getting a human rights award from Adolf Hitler.

School Finance 101

On Monday, the organization Students First came out with their state policy rankings, just in time to promote their policy agenda in state legislatures across the country. Let’s be clear, Students First’s state policy rankings are based on a list of what Students First thinks states should do. It’s entirely about their political preferences – largely reformy policies – template stuff that has been sweeping the reformiest states over the past few years. I’ll have more to say about these preferred policies at the end of this post.

Others have already pointed out that Students First gave good grades to states like Louisiana and Florida, and crummy grades to states like New Jersey or Massachusetts – but that states like Louisiana have notoriously among the worst school systems – lowest test scores – in the nation – whereas states like New Jersey and Massachusetts have pretty darn good test scores…
