Humanities Resource Center Online
A PROJECT OF THE AMERICAN ACADEMY OF ARTS AND SCIENCES

     
       
Indicator I-2 Writing Proficiency

Updated (6/25/2010).

See the Note on the Difference between NAEP "Achievement" and "Performance" Levels (below).

In addition to reading, writing is a core humanistic competency measured by the NAEP (see also U.S. history and civics). Students in the fourth, eighth, and 12th grades are evaluated on the basis of essays they write in response to standardized prompts (see examples of prompts and student responses).

In the most recent assessment years (2002 for fourth graders and 2007 for eighth graders), close to 90% of students in these grades demonstrated at least basic writing competency, a figure that reflects modest growth since 1998 (Figure I-2). This rise is attributable mostly to an increase in the share of students demonstrating higher-order writing skills.

[Figure I-2]

Twelfth-grade performance has been somewhat lower and more variable. Seniors’ performance slipped between 1998 and 2002, with the percentage of students scoring at the basic achievement level or better declining from 78% to 74%. But over the next five years that lost ground was more than recovered: in 2007, the percentage of 12th graders scoring at the basic level or higher was 82%. This gain among 12th graders reflected a growing share of students who demonstrated basic writing competence; there was no growth between 2002 and 2007 in the share of high school seniors writing at the proficient level. Since 1998, fewer than one in four soon-to-be high school graduates have been assessed as writing at the proficient level or higher (within the NAEP framework, a proficient writer demonstrates a grasp of writing skills that are essential for success in most walks of life, including the use of transitional elements and the ability to select language appropriate for the intended audience).

Recent cohorts of young people have been more successful than their predecessors in maintaining their writing competence as they transition from middle school to high school. Although the NAEP is not a longitudinal assessment (i.e., it does not follow the same students over time), the spacing of the assessments permits tracking of cohorts’ performance as they move through the educational system. The sample of eighth graders assessed in 1998 was drawn from the same cohort of students as the 2002 sample of 12th graders. Between early and late adolescence, this cohort’s writing ability declined, with 26% failing to demonstrate at least basic competency as high schoolers, compared with only 16% of the sample drawn from this same group when they were in middle school. For the students who started eighth grade in 2002, however, the drop-off in high school does not seem to have been so precipitous. Fifteen percent of this later cohort scored below the basic achievement level in middle school, while a comparable 18% did so in 12th grade (the latter percentage is an approximation of this cohort’s high school performance based on 2007 data; the NAEP was not administered in 2006, the year these students would have been seniors). For both cohorts, the 12th graders who were assessed as part of the NAEP did not include those students who dropped out of school before reaching their senior year. Had these individuals remained in school long enough to be tested, they would presumably have increased the percentage of students demonstrating less than basic achievement in writing.
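
To make the cohort arithmetic above explicit, the short Python sketch below recomputes the change in the share of students scoring below the basic writing level as each cohort moved from eighth to 12th grade, using only the percentages cited in this paragraph; the cohort labels and variable names are illustrative and are not part of any NAEP tool.

    # Percentages of each cohort scoring below the NAEP basic writing level,
    # as reported above (eighth-grade assessment vs. 12th-grade assessment).
    cohorts = {
        "8th grade 1998 / 12th grade 2002": {"grade_8": 16, "grade_12": 26},
        "8th grade 2002 / 12th grade ~2007": {"grade_8": 15, "grade_12": 18},
    }

    for label, below_basic in cohorts.items():
        change = below_basic["grade_12"] - below_basic["grade_8"]
        print(f"{label}: below basic went from {below_basic['grade_8']}% "
              f"to {below_basic['grade_12']}% ({change:+d} percentage points)")

Run as written, this prints a 10-point increase for the earlier cohort and a 3-point increase for the later one, which is the contrast the paragraph draws.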

(The NAEP Data Explorer permits analysis of these assessment data by gender, ethnicity, and a number of other key variables. With this tool one can also obtain results of recent writing assessments for individual states and compare them with student outcomes in other parts of the country. For both an overview of the Explorer and tips for its effective use, see http://nces.ed.gov/nationsreportcard/pdf/naep_nde_final_web.pdf. The Explorer itself is located at http://nces.ed.gov/nationsreportcard/naepdata/.)




Note on the Difference between NAEP "Achievement" and "Performance" Levels

The figures for Indicators I-1 through I-4 display the percentages of students scoring at certain levels on the National Assessment of Educational Progress (NAEP). The NAEP reading examination is scored differently from the other NAEP examinations (such as those in writing, history, and civics). On the latter exams, students are assessed according to grade-specific achievement scales. A student’s level of achievement is judged to be “basic,” “proficient,” or “advanced” depending on his or her score on the appropriate scale. A student scoring at the “advanced” achievement level on the 12th-grade civics exam is demonstrating different skills from those of a fourth grader scoring at the “advanced” level.

In contrast, the NAEP reading assessment uses a single scale, referred to as a performance scale, for 9-, 13-, and 17-year-olds. What constitutes “basic,” “proficient,” and “advanced” performance depends on the age of the examinee. Both a 9-year-old and a 17-year-old may score at Level 250 (able to interrelate ideas and make generalizations). Such a score would constitute an advanced level of performance on the part of the 9-year-old and a basic level of performance on the part of the 17-year-old.
