A study released yesterday by the U.S. Department of Education claims that public schools are performing better than private schools. Having worked with different types of schools over the past few years, I found this difficult to take at face value, so I began reading the full 66-page report, titled “Comparing Private Schools and Public Schools Using Hierarchical Linear Modeling.” I did not have to look far to see that the results seem skewed. I certainly do not want to detract from the progress that some public schools have made over the past few years, but the publishers of the report seemed to go out of their way to present a favorable slant for public schools.
For starters, the National Center for Education Statistics (NCES), which performed the study along with several companies, including Pearson Educational Measurement and Westat, used far less data from private schools than from public schools. The report examined data from the 2003 National Assessment of Educational Progress (NAEP) assessments in reading and mathematics. In 2003, over 6,900 public schools but only 530 private schools participated in the grade 4 assessments, and over 5,500 public schools versus just over 550 private schools participated in the grade 8 assessments. This is an unbalanced method of evaluation: the private school sample is too small to support a fair comparison.
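To make that imbalance concrete, here is a quick sketch of what those participation counts imply about the sample composition (the counts are the "over N" figures quoted from the report, so the shares are approximate):

```python
# Approximate school counts from the 2003 NAEP assessments, as cited in the report.
grade4 = {"public": 6900, "private": 530}
grade8 = {"public": 5500, "private": 550}

for label, counts in (("grade 4", grade4), ("grade 8", grade8)):
    # Fraction of the participating schools that were private
    share = counts["private"] / (counts["public"] + counts["private"])
    print(f"{label}: private schools were {share:.1%} of the sample")
    # roughly 7.1% for grade 4 and 9.1% for grade 8
```

In other words, private schools made up less than a tenth of each sample, which is the core of the objection here.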
In the state where I live, public school students take the state’s educational assessment test, while most private schools use the SAT or another privately administered test. We have tried, but for the most part been unable, to compare the scores of public and private schools, because the tests are different and, in some cases, the students learn very different curricula. It is like comparing apples to oranges. This particular study tried to make it look like an apples-to-apples comparison.
Next, the report uses an interesting model to interpret the data. The Executive Summary describes an attempt to level the playing field by adjusting away factors such as economic characteristics, school location, and more. The report notes: “Among the student characteristics considered were gender, race/ethnicity, disability status, and identification as an English language learner. Among the school characteristics considered were school size and location, and composition of the student body and of the teaching staff.” Removing these important factors could certainly change the overall results. I live in a major metropolitan area, and there are tremendous differences between the urban and suburban public schools, and to a smaller degree between the urban and suburban private schools. As a very simplified example, the major city has a graduation rate of 48%, while most of the surrounding suburbs have graduation rates of 80% or above. If we take location out of the mix, anyone could state that the average graduation rate in this metropolitan area is 64% ((80 + 48) / 2 = 64). That figure is not true for either location.
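The arithmetic in that simplified example can be sketched in a couple of lines (the 48% and 80% rates come from the example above; the labels are just illustrative, not data from the report):

```python
# Illustrative graduation rates from the essay's simplified example
rates = {"city": 48.0, "suburbs": 80.0}

# An unweighted average across locations
naive_average = sum(rates.values()) / len(rates)
print(naive_average)  # 64.0 -- a figure that describes neither location
```

The point is that averaging away location produces a number that no actual school in the area resembles, which is exactly the worry about the report's adjustments.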
The report actually states that “The average private school mean reading score was 14.7 points higher than the average public school mean reading score…” In addition: “The average private school mean mathematics score was 7.8 points higher than the average public school mean mathematics score…” And finally: “In grades 4 and 8 for both reading and mathematics, students in private schools achieved at higher levels than students in public schools.”
So how in the world does the headline in Saturday’s New York Times read “Public Schools Perform Near Private Ones in Study”? I think the current administration wants to show that its No Child Left Behind (NCLB) initiative is making headway. The program may well make headway in time, but they should keep all of those pesky unwanted characteristics in their studies to better reflect real life.