Do NAEP Procedures Skew Results?
This item is an addendum to my August 26 piece on America's unimpressive international standing in education. It raises questions about the NAEP that came to light as a result of my posting.
On August 26, I posted a piece in this blog (available for reading herein) titled "True or False: Children Are America's Most Valuable Resource," citing New York Times Op-Ed columnist Charles M. Blow on America's sorry international standing in the field of education. Among other sources, Blow drew his figures from a Broad Foundation report that, in turn, cited figures derived from the National Assessment of Educational Progress (NAEP). The NAEP tests students from around the country, primarily in math, science, reading and writing.
Following the publication of my piece, a dedicated teacher friend of mine sent me an email describing a personal experience with the NAEP, one that has made the teacher deeply skeptical of the numbers the NAEP reports. Briefly, here is the account.
My friend's high school was randomly selected, and one hundred seniors were picked for the assessment. In the teacher's view, the sample size was too small because it represented just one hundred of the thousands of students in that area of the state. The selected students were pulled from their regular classes, told that the test would not affect their grades, and offered a free lunch in return for their cooperation.
A number of students from the lower end of the socio-economic scale told my teacher friend that, since the test didn't count against them, they simply guessed at the answers so they could get out of there and collect the promised lunch.
The more advanced students were upset at being pulled out of their Advanced Placement classes to take a meaningless test just days before a real test that mattered to them. As a result, many of them answered as quickly as possible in order to get back to the classes they were missing.
Needless to say, my teacher friend now takes a skeptical view of any NAEP report showing United States students falling behind their international counterparts. The teacher's real concerns are graduation rates, post-graduation opportunities for undocumented students and inadequate efforts to train technologically wary older teachers.
This is the view of a committed and dedicated teacher that derives from personal experience with the NAEP.
Before giving my views on the issue, I must say that I am not a teaching professional. Nevertheless, I will venture some opinions on the issues my friend has raised. First, some background on the NAEP drawn from a cursory review of articles available on the Internet.
The National Assessment of Educational Progress (NAEP) is part of the U.S. Department of Education and as such is the 800-pound gorilla in the field. As mandated by Congress, its reportedly independent, bipartisan 26-member governing board comprises educators, local and state school officials, state governors and legislators, business representatives and members of the general public. Their job is to assess student achievement and to provide information related to students' learning. The primary focus is on math, science, reading and writing, with periodic assessments of the arts, civics, geography, economics and U.S. history. As an aside, I would guess that "periodically" means "infrequently."
My limited research on the Internet revealed only one article critical of the NAEP, a 2011 article in the Washington Post by Valerie Strauss quoting James Harvey, who wrote "A Nation at Risk." Based on this article, I surmise that the NAEP shrugs off anything critical of its methodology or results.
From my perspective, the sample size cited by my friend may or may not be a valid representation of the larger group of students in the area who were not tested. Since I am not versed in statistics, the sample size seems less significant to me than it did to my friend.
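For readers who want a rough sense of what a sample of one hundred can and cannot tell us, here is a back-of-the-envelope sketch. This is my own illustration of a standard textbook formula, not a description of the NAEP's actual methodology, which weights and aggregates samples from many schools:

```python
import math

def margin_of_error(n, p=0.5, z=1.96):
    """Approximate 95% margin of error for a sample proportion.

    Uses the worst case p = 0.5, which gives the widest interval.
    """
    return z * math.sqrt(p * (1 - p) / n)

# A single school's sample of 100 students: roughly +/- 10 percentage
# points, so one school alone says little with precision.
print(round(margin_of_error(100), 3))    # 0.098

# Pooling many such schools into a national sample of, say, 10,000
# students shrinks the uncertainty to about +/- 1 percentage point.
print(round(margin_of_error(10000), 3))  # 0.01
```

The point of the sketch is that a hundred students is indeed a thin basis for judging one school or region, but a national program that combines many such samples can still report fairly precise figures. That arithmetic, of course, assumes students actually try, which is exactly the condition my friend's account calls into question.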
More serious, in my view, is that the conditions presented to the selected students may have skewed the national results the NAEP derived from the study. Nationally, are all selected students told that the test will not affect their grades, and are they offered a free meal for taking part?
The questions to be asked of the NAEP are: 1) are these conditions still in effect in current surveys, and 2) are these conditions in effect throughout the country? If students across the country respond as my friend's students did, the survey's results must be questioned.
My readers may find that the article on which I relied for my piece rests on figures with flaws that raise questions about the accuracy of the NAEP's findings. On their face, such flaws would not have been apparent to either the Broad Foundation or Mr. Blow, because the NAEP's procedures were neither known to them nor questioned.
As indicated above, any criticism of the NAEP or its methodology is given short shrift by the organization. Indeed, the procedures my friend reported may be a local phenomenon and may or may not skew the overall results. But the NAEP needs to be questioned about what may be a significant issue affecting the accuracy of its findings.
I would appreciate your views and comments.