Testing What Matters Least

What we learned when we took the Praxis Reading Specialist Test

By Maika Yeigh, Andie Cunningham, Ruth Shagoury

Illustration: Michael Duffy

We clutched our printed-out paper tickets in one hand and driver’s licenses in the other, passing both to the grim attendant who directed us to our seats. In case these documents had changed in the 15 seconds it took to walk to our assigned seats, she checked them again and instructed us to leave them on the desk for her continued access. We felt like we were about to embark on some weird plane trip. As it turns out, it was quite a trip—into the bizarre world of standardized testing for adult professionals, in our case, the Praxis test for reading specialists.

We are three professors of literacy in a teacher education program that leads to a reading endorsement. Our graduates are reading specialists, literacy coaches, and district language arts specialists—but only if, in addition to the coursework and practicum experience that make up our National Council for Accreditation of Teacher Education (NCATE) certified reading program, they also pass the Praxis Reading Specialist Test. Like any students approaching a high-stakes test, the adults we work with are anxious about the Praxis, a two-hour, 120-question, multiple-choice gatekeeper test for their chosen positions as literacy leaders.

From their reports, we had become increasingly concerned about what seemed to be the very narrow focus of this professional test. We decided it was time to embark on our own journey into the two-dimensional world of the Praxis test.

We discovered that, at best, the Praxis test measures the least important aspects of professional knowledge. At worst, it reinforces harmful ideas about the profession: that reading specialists and literacy coaches should “know their place,” not buck the system, and certainly not encourage their colleagues to critique the standardized tests that dominate our profession.

A Political Subtext to the Test

The Praxis test we took showed no understanding of the intended use of standardized tests, and even reinforced misuse of large-scale testing results. A telling test item asked what the reading specialist should do with state assessment results. One answer option was for the specialist to put the results into teacher mailboxes. Another choice was to use the results to group students for instruction. There was no answer that implied that standardized assessment scores should be used as just one piece of collected data regarding student learning. As test takers, we were forced to choose which wrong answer seemed the least offensive, or guess what was in the test maker’s mind. How could this be a thoughtful way to evaluate a potential reading specialist’s understanding of the role of standardized testing data?

As we took the Praxis test, a disturbing subtext emerged from the questions. The exam implied that the main function of a reading specialist is to do the bidding of the building administrator and the school district—that the position is not one of decision-making but of implementation and enforcement. Test questions communicated no expectation that reading specialists and literacy coaches should rely on their own understanding of the reading process or look to students in making decisions. One test question scenario, for example, involved a school that scored significantly lower on a standardized reading achievement test than others in the district; the reading specialist was asked by the superintendent to recommend how to restructure the reading program in the low-performing school. As test takers, we were asked what steps would be most important for the reading specialist to take. The choices were all top-down options that assumed following the superintendent’s request without question. We recall one possible answer was to evaluate the adequacy of the licensure and certification of the teachers in the low-performing school; another was to compare the students’ scores to state and national norms. There was no option for educating the superintendent about the needs of the school, the teachers, and the children rather than “fixing” the problem. Once again, there was no answer we were comfortable choosing.

Rather than doing an administrator’s bidding, literacy leaders should support teachers in their daily work—planning, modeling, team-teaching, and providing feedback—creating a community of teachers empowered to discuss children’s needs and plan together. By omitting any reference to such a crucial role, the Praxis test implies a different stance, one we find troubling.

By contrast, the International Reading Association (IRA) recommends that the reading specialist’s role in assessment includes evaluation of the school’s literacy program on a broad basis (2003). Specifically, the reading specialist should focus on individual student strengths and needs and be able to communicate these to various stakeholders and paraprofessionals.

Measuring What Matters Least

A majority of the test items on the Praxis exam fit the definition of “easily measured.” We found a clear preference among the test makers for questions that test terminology: schwa, cloze, digraph, homonym, SQ3R, and Lexile all appeared as test items on the exam. Certainly, these are terms that should be familiar to literacy specialists, but the sheer number of terminology questions inflated their importance. For example, one question went something like this:

You have a student who struggles to read “banana,” “lemon,” and “some.” This child could be helped with a word sort on:

a. digraphs
b. schwas
c. diphthongs
d. morphemes

By breaking reading instruction down into isolated facts for the purposes of a standardized test, the creators of the Praxis test measure what matters least.

In an attempt to see whether a candidate knows what a schwa is, the test elevates the term to the status of a teaching strategy. (For the uninitiated, Macmillan’s dictionary defines schwa as “a vowel sound used in unstressed syllables, for example, the sound of ‘a’ in ‘above.’”) We can think of no reason to teach a young struggling reader to find schwas—nor can we find any research to support such a task. Nonetheless, each of our versions of the test included at least three questions involving the term schwa.

We also found an emphasis on isolated terms that must be some test writer’s pet strategies: “say-it-fast-break-it-down” and “narrow reading” both appeared on the test. These are not mainstream—or particularly useful—activities for working with young readers. But they are easy to measure!

One of the major influences on the field of literacy has been the proficient reader studies (Pearson et al., 1992), which focus on the processes good readers use rather than on a deficit model of “fixing” poor readers. This research has shifted instruction toward an emphasis on comprehension strategies and on what all readers bring to the text. This field-changing body of work stresses a student’s culture and background knowledge, and it was noticeably missing from the Praxis exams we took. Instead, the more easily tested “literal and higher-order” hierarchy appeared in test item after test item.

Not Measuring What Matters Most

The test failed to address or attempt to measure some of the most important standards created by the IRA for the preparation and certification of reading specialists (2003). There are five standards by which our teacher education program is measured in this area: Foundational Knowledge; Instructional Strategies and Curriculum Materials; Assessment, Diagnosis, and Evaluation; Creating a Literate Environment; and Professional Development. Within these five standards are descriptors of well-rounded literacy leaders’ actions, work, and strategies that cannot be measured using a one-dimensional assessment.

Let’s look more closely at the fourth standard as an example:

Creating a Literate Environment: Assist the classroom teacher and paraprofessionals in selecting materials that match the reading levels, interests, and cultural and linguistic background of students. (standard 4.1)

One facet of creating this literate environment is the ability to support teachers and paraprofessionals in the selection of materials that match the needs, backgrounds, and interests of students. This is a complex skill that requires building relationships with colleagues and students, making keen observations and judgments, and exercising a deep multicultural sensitivity.

Supporting family literacy is an important endeavor for many of our inservice teachers, including those who work with families that are not native English speakers. Recently one student, inspired by Jennifer Allen’s book swap project, documented in Becoming a Literacy Leader (2006), wrote a successful grant to purchase picture books written in Spanish and created a mini-library that parents whose native language is Spanish could access. She kicked off her project with a family literacy night at which she showed parents the books, demonstrated the checkout system, and taught parents how to picture-read with their children. She also worked with teachers to support students bringing these texts into their classrooms during the reading block. This inservice teacher obviously meets the standard of “creating a literate environment”—yet it is impossible for a standardized test question to capture the depth of this project.

Instead, the Praxis test questions imply reliance on a basal reader, with the reading specialist cast as the facilitator of its use. The suggestion is that a basal reader is the preferred vehicle for accessible literature. For example, one item asked whether reading specialists should judge textbooks according to curriculum objectives, students’ reading levels, students’ performance on standardized tests, or classroom teachers’ preferences in reading textbook format. We still aren’t sure what the intended “right” answer is. Although textbooks may be part of the reading diet in a classroom, the omission of any questions about reading workshop instruction, the use of book clubs, or how reading specialists might promote trade books within classrooms reinforces the message that purchased reading textbooks are the norm—and the expectation.

These test maker assumptions oppose the standard itself, which states that the reading specialist should support the teacher in finding relevant and interesting reading material. There is a wealth of reading material that matches “the reading levels, interests, and cultural and linguistic backgrounds of students.” How well a potential reading specialist knows appropriate reading material—and can make it accessible to teachers and students—is indeed worth measuring. Not surprisingly, we found no mention on the exam of assessments other than high-stakes standardized tests, even though IRA standard 3.1 states that reading specialists should “compare and contrast, use, interpret, and recommend a wide range of assessment tools and practices” [emphasis added]. Miscue analysis, for example, is a widely accepted assessment tool that the test ignored. Portfolios—another respected method—failed to make the test as well.

But how can 120 multiple-choice questions possibly encompass the complex tapestry that literacy specialists are expected to weave every day? Why break down that complexity into sound bites that distort the real world of schools, classrooms, children, and teachers?

Fundamentally, the Praxis test ignores the role of reading professionals in modeling a love of reading and demonstrating how reading and writing can be used in the world. Readers read for real purposes: critically interrogating the world, questioning assumptions, probing for implied values, and even challenging bogus assessments of achievement—like this test!

A Call to Action

Currently 15 states require the Praxis Reading Specialist Test, and the number is growing each year. Our conclusion is that this exam does nothing to help prepare these critical education specialists. If anything, it miscommunicates to test takers what it means to be a reading professional. Based on our experience, the test directly contradicts the professional standards that we embrace.

Why are we allowing the testing industry to usurp our responsibility to assess and evaluate teachers who have gone through the rigors of an advanced professional program? What is the rationale for lining the pockets of the testing industry? The test costs $85 to take, with a required additional registration fee of $50. And that’s the minimum; if you need your scores expedited or want to view them online before 30 days have passed, there are hefty additional fees. And many teachers take the test more than once before they achieve a passing score.

It isn’t just reading specialists who face these needless and damaging standardized tests. We recently spoke with colleagues at our institution from teacher education, educational leadership, and counseling psychology programs about our Praxis test experience. Chagrined, they realized that they hadn’t thought to take the tests that their students must take to apply for licensure or endorsement. They also realized they didn’t know enough about either the content of the exam or the intersection between their coursework and the required tests.

We believe it is time to take back our professional responsibility. Our next step is to initiate dialogue with colleagues nationwide and craft resolutions through our professional organizations (such as NCTE) to oppose continuation of this testing requirement. And we hope this kind of grassroots effort will have an impact on others who work in our schools and communities. In the long run, we can’t stop the damaging effects of standardized testing on the children until we examine the impact it is having on teachers as well as on the larger culture of schools.

References
Allen, Jennifer. Becoming a Literacy Leader. Portland, Maine: Stenhouse, 2006.

International Reading Association. Standards for Reading Professionals. 2003. www.reading.org/downloads/resources/545standards2003/index.html.

Pearson, David, et al. “Developing Expertise in Reading Comprehension.” In A. Farstrup and J. Samuels, eds., What Research Has to Say About Reading Instruction. Newark, Del.: International Reading Association, 1992.

Maika Yeigh teaches at Willamette University in Salem, Ore. Andie Cunningham and Ruth Shagoury teach at Lewis & Clark College in Portland, Ore.