
Reading Counts! Claims and Counterclaims

Accelerated Reader or Reading Counts!


The purpose of this article on Reading Counts! is threefold:

1. To briefly summarize the basics of the Reading Counts! (RC) independent reading management program.
2. To analyze three key claims made by Houghton Mifflin Harcourt (HMH) regarding the efficacy of the RC (formerly Scholastic Reading Counts!) program and to provide counterclaims from reading researchers, librarians, students, teachers, and Yours Truly.
3. To promote my own reading intervention program at the end of the article, with free teaching resources 🙂


I previously ventured into the deep waters of independent reading management programs a number of years ago with my article, The 18 Reasons Not to Use Accelerated Reader. Accelerated Reader™ is the most popular independent reading management program, with 180,000 book titles (January 2019) assigned a Reading Practice Quiz. RC is the second-place challenger with 45,000. Teacher comments on my article tend to focus more on the abuses of the program and less on the program itself. Many teachers are quite defensive about their use of the AR program. Understandably so. We teachers view our instructional choices as reflections of our professionalism. Curriculum is personal. In anticipation of similar comments to this article on Reading Counts!, I would like to preemptively respond by saying, “I’m sure that you are doing your part to adapt the Reading Counts! program to the needs of your kids, and I respect your professional judgment that you know your students best.” Please don’t shoot the messenger! However, as I re-read “The 18 Reasons Not to Use Accelerated Reader” in preparation for this article, I would have to say that most of the problems in the AR program apply to the RC program, as well. I won’t cover the same ground in this article; instead, I will analyze three of the claims made for the RC program which I see as being more exclusive to it. But first, a brief overview of how the RC program works.

How Reading Counts! Works

  • A school or district pays a school start-up fee of $375.00 and is assigned a sales representative. The RC software management program is licensed for an annual fee of $4.00 per student (a lower price for 2019). The reading placement and monitoring assessment, recently re-named the Reading Inventory (RI), costs an additional $4.00 per student. So, if my math is correct, that’s $8.00 per student, or $4,000.00 every year for a 500-student elementary school. Plus, more money…
  • The school and/or district re-allocate portions of their budgets to purchase books included within the RC program. Currently, RC has about 45,000 titles; unlike with the books in the AR program, the company makes money from each sale, because HMH publishes them! These purchases will necessarily become a recurring, every-year budget item.
  • The HMH sales representative in-services school librarians, teachers, and administrators (lots of online help, as well) on how to implement the RC program. Suggestions as to how to inform and work with parents, along with corresponding resources, are provided. The program resources are relatively easy to use, but time-consuming.
  • The classroom teacher or librarian administers the computer-adaptive Reading Inventory (RI) as a reading placement test to all students participating in the RC program. This test provides a Personal Lexile® score for each student.
  • Teachers use the Student Achievement Manager (SAM) data and management system to generate student and class reports. The reports list the results of the RI as a Personal Lexile® number (level or measure) for each student and a class Lexile average. A higher Lexile number indicates a higher reading ability level.
  • The reports also list the students’ optimal Lexile text readability levels (a numerical range). A text’s Lexile level is determined by its semantic and syntactic degree of difficulty and its sentence length. Once students know their reading levels, they can select books from the Search Book Expert Online within these reading levels. Although RC is a Lexile-based program, this search engine also includes grade-level equivalency and guided reading levels. Additional filters include grade-level interest (K–2, 3–5, 6–8, high school, and high interest/intervention), fiction and non-fiction, subject areas, genre, and curriculum-integrated books. Note that the HMH reading intervention programs, READ 180 Next Generation® and System 44®, include some RC titles for their independent reading rotations.
  • Teachers and students set reading goals in terms of a point system. Each book is assigned a specific point value based upon its length and text complexity. Many teachers establish a monthly points requirement.
  • Once students have finished their books, they take a corresponding quiz on the computer, or the teacher may choose to print the quiz. Although the test bank for each quiz includes 30 items, the default number of questions is 10. The RC authors and sales representatives make much ado about this larger bank of questions compared to that of the AR program. They claim that it is harder for students sitting side-by-side to cheat, because the randomized 10-question draw gives each student a different quiz. This may be true; however, a quick search turned up plenty of RC quiz “cheat sites,” just as with the AR program. Where there’s a will, there’s a way. Students are allowed to examine their incorrect responses, but there is no pay-off for doing so if the quiz re-takes use different questions.
  • If the students achieve a predetermined score (mastery criteria set by the teacher), they receive a “congratulations screen” and an opportunity to rate the book they read on the “Read-o-Meter.” Students can also check their own RC Student Progress Report. Points are awarded based upon the percentage of quiz questions answered correctly. If the students do not achieve mastery, the teacher may require them to read the book again and retest, or may re-visit the students’ RI Lexile level range and the level and content of the book. Because the bank holds 30 questions, students are able to take the 10-question quiz up to three times.
  • Teachers generate reports on students’ quiz scores and track the amount of reading and student test scores. They can also receive alerts when a student has not taken a quiz within a given period.
  • Once individual student point goals (usually set monthly) have been mastered, the student receives a certificate of achievement.
  • The Reading Counts! Educator’s Guide provides plenty of reproducibles to supplement the quizzes, such as reading logs, story charts, book reports, parent letters (in several languages), and guides for teachers to write their own quizzes (if the school library does not have the RC book).
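The quiz mechanics described above (a 30-item bank, a 10-question default, and up to three retakes drawn from different questions) can be sketched in a few lines. This is my own illustration of the arithmetic, not HMH's actual implementation; the function and question bank are hypothetical.

```python
import random

def make_quiz_attempts(question_bank, quiz_size=10, max_attempts=3):
    """Split a book's question bank into disjoint random quizzes.

    With a 30-item bank and a 10-question default, a student can take
    at most three attempts before every question has been seen.
    """
    if quiz_size * max_attempts > len(question_bank):
        raise ValueError("bank too small for that many disjoint attempts")
    # Shuffle once, then deal out non-overlapping quizzes.
    shuffled = random.sample(question_bank, quiz_size * max_attempts)
    return [shuffled[i * quiz_size:(i + 1) * quiz_size]
            for i in range(max_attempts)]

bank = [f"Q{n}" for n in range(1, 31)]   # hypothetical 30-question bank
attempts = make_quiz_attempts(bank)
assert len(attempts) == 3 and all(len(a) == 10 for a in attempts)
# No question repeats across the three attempts:
assert len({q for a in attempts for q in a}) == 30
```

Note that a 30-question bank supports at most three fully disjoint 10-question attempts, which is presumably why the program caps retakes at three.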

Claims and Counterclaims

Claim 1: Students improve their reading more when the complexity of the text they read matches their reading ability. The best test to measure that optimal match or zone of proximal development (Vygotsky, 1978)? The HMH Reading Inventory. Why? The RI is both a criterion-referenced (measured against a fixed goal, such as a Common Core Standard) and a norm-referenced (compared to other students) test. This is important because the test design allows teachers to administer the RI twice more within the school year to monitor progress. The Lexiles, which the RI uses, have improved readability assessments (standard errors of measurement have been minimized, and the amount of comprehension variance that can be explained by text difficulty has been improved). Accelerated Reader’s STAR test doesn’t have those advantages.

Counterclaim: Granted, the RI is state of the art in terms of Lexile levels and matching students to texts, and the ability to administer the test three times per year does provide a valid measure to monitor progress. But the entire design of the RC program begs the question: it assumes what has yet to be proven. As noted reading researcher Dr. Tim Shanahan asserts,

…Lexiles have greatly improved readability assessment … and yet we are in no better shape than before since there are no studies indicating that if you teach students at particular Lexile levels more learning will accrue.

…we have put way too much confidence in an unproven theory. The model of learning underlying that theory is too simplistic. Learning to read is an interaction between a learner, a text, and a teacher. Instructional level theory posits that the text difficulty level relative to the student reading level is the important factor in learning. But that ignores the guidance, support, and scaffolding provided by the teacher. [In doing so, educators] have striven to get kids to levels where they will likely learn best with minimal teacher support.

Matching the right books to readers is simply more complex than the quantitative Lexile approach RC uses. Content, theme, and sophistication of thought, as well as the age and maturity of the reader, are critically important factors to consider when students select books for independent reading. Most would find the following strictly quantitative Lexile measurements, listed in parentheses, to be inappropriate criteria for these grade levels.

  • 2nd Grade: Night – Wiesel (570)
  • 3rd Grade: The Sun Also Rises – Hemingway (610); Twisted – Anderson (680); Incarceron – Fisher (600)
  • 4th Grade: Grapes of Wrath – Steinbeck (680); The Color Purple – Walker (670)
  • 5th Grade: For Whom the Bell Tolls – Hemingway (840); Kite Runner – Hosseini (840); A Farewell to Arms – Hemingway (730); Cat’s Cradle – Vonnegut (790)
  • 6th Grade: As I Lay Dying – Faulkner (870); The Sound and the Fury – Faulkner (870); To Kill a Mockingbird – Lee (870); Fahrenheit 451 – Bradbury (890)
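To make the limitation concrete, here is a minimal sketch of a strictly quantitative, Lexile-only matcher, using a few of the titles and measures listed above. The function and the sample reading range (roughly 420–650L for a hypothetical 2nd grader) are my own illustrative assumptions, not the actual Search Book Expert logic.

```python
# Hypothetical Lexile-only matcher: it recommends any title whose measure
# falls inside the reader's range, ignoring content, theme, and maturity.
BOOKS = {
    "Night (Wiesel)": 570,
    "The Sun Also Rises (Hemingway)": 610,
    "Grapes of Wrath (Steinbeck)": 680,
    "To Kill a Mockingbird (Lee)": 870,
}

def lexile_matches(books, low, high):
    """Return titles whose Lexile measure falls within [low, high]."""
    return sorted(t for t, lexile in books.items() if low <= lexile <= high)

# An assumed 2nd-grade range of roughly 420-650L "matches" Night:
print(lexile_matches(BOOKS, 420, 650))
```

Nothing in a Lexile-only filter knows that Night is a Holocaust memoir; the number alone matches it to a seven-year-old.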

Additionally, the authors of the Common Core State Standards, with their emphases on text complexity, specifically challenge the notion that reading instruction should focus solely on texts at student ability levels. The authors cite research suggesting that with such scaffolds as close reading, even struggling readers can access significantly more complex text than that to which they have traditionally been given access.

“Below are bibliographic citations for the 26 studies referenced in Shanahan (2014) regarding students making gains with more complex text when given appropriate scaffolding. In addition, abstracts and full-text PDFs of all studies are available as well. These references were provided by Shanahan in ‘Building Up To Frustration-Level Text’ in Reading Today Online.”

Furthermore, reading research has repeatedly demonstrated that prior knowledge is an important variable in reading comprehension. When readers have significant prior knowledge of a topic, familiarity with the genre, or experience with the author’s writing style, even high-Lexile texts can be accessible. Prior knowledge, along with scaffolding of relevant content and context, can often trump the quantitative challenges of semantically and syntactically complex text.

Motivation is another significant variable in matching readers to text, one that can override the limitations of the RC Lexile levels. My youngest son was in 4th grade when the last Harry Potter novel, Harry Potter and the Deathly Hallows, came out. Clearly, the quantitative Lexile level of 880 should have prevented his MA reading specialist father (me) from purchasing this “frustration level” book. Instead, I dutifully ignored the quantitative data and waited in line with my fourth grader for the midnight release of this treasured book. My son plowed through the book with a high level of comprehension. By the end of fourth grade, he was reading significantly above grade level, thanks to the motivational influence of J.K. Rowling and the dozens of peers who were concurrently reading and discussing that book during recess.

Others would agree that reader motivation is far more important than instructional reading levels in book selection. From Ricki Ginsberg’s article, “This is my Anti-Lexile, Anti-Reading Level Post” (Ginsberg is Assistant Professor of English Education at Colorado State University):

I’m a 6th grader and when I took a Lexile test for my grade, I got stuck with books I hate so much. We had to search for books in my Lexile. I am so bored of those books. I want to read whatever I want to.

I took my grandson (a few years ago) to his book fair to purchase some books with him. He chose a few, and then we went back to his classroom to get his things, where I met his teacher. She took a look at the books he had chosen, and was excited about, and said, “Oh, I think these are too hard for you. You need to choose ones more at your level.” She didn’t know that I was a teacher, and I didn’t tell her. I almost hit her, but I didn’t do that either. She was the one who pretty much stopped his excitement about reading…

As a librarian, I have fought for years against leveling books. I was supported by my district years ago against AR, but my job as a librarian was shifted to support classroom curriculum instead of supporting reading enjoyment, the reference process, and library skills. Now a new deputy superintendent, whose old district used a Lexile-based reading program, is spending money on a program that is Lexile leveled. While library books are hardly given any budget money, tens of thousands are being spent… The skills that teachers built by learning how to “fit” a book to a student and teaching students to self-select challenging and interesting reading material are being prostituted to paying publishers for poorly written, formulaic books dressed up with attractive level numbers. It is a disservice to our students that ultimately destroys their confidence in becoming independent readers.


Used with permission

Claim 2: RC provides the accountability to ensure that students are reading independently.

At the heart of this powerful program is the practice provided by its quizzes. Unlike other reading assessment programs, no two quizzes in Reading Counts! are the same, struggling readers have the opportunity to retake quizzes, and quiz settings can be customized based on individual students’ needs for extra support or challenge. This quiz quality leads to more accurate and actionable data to keep students on track for success.

[Reading Counts!] automatically generates a quiz that meets each student’s reading needs. Every quiz provides a true formative, curriculum-based assessment. As a computer-based program, RC provides immediate feedback and unique opportunities for mastery. Students can review questions that were incorrectly answered. Because each quiz is drawn from a database of up to 30 questions, students not showing an expected level of mastery can retake quizzes with a different set of questions. Research shows that when students are provided with immediate feedback, they are able to self-correct and make academic progress (Bransford, Goldman & Vye, 1991).

Counterclaim: The reading research is clear that students who read independently are significantly more likely to outperform peers who do not read on their own (Anderson, Wilson & Fielding, 1988), and that those who read more independently score higher on reading tests than those who read less (Juel, 1988; Juel, Griffith, & Gough, 1986; Stanovich, 1986). However, the research does not support the claim of the RC authors and editorial board that the type of accountability the program uses (quizzes) is necessary to achieve optimal reading gains.

Each of the 45,000 RC quizzes includes a test bank of 30 questions. They are primarily recall questions, with some vocabulary questions and a minimal number of inferential questions. Few of the questions are relevant to the big ideas or themes of the corresponding books. In essence, the quizzes are designed to hold students accountable for reading their books.

Some researchers, such as Dr. Stephen Krashen, argue that free voluntary reading, without accountability, produces greater reading gains than independent reading programs with accountability, such as the quizzes in the RC program. You may wish to check out my dialogue with Dr. Krashen on in-class independent reading and accountability. I disagree with Dr. Krashen and support independent reading with accountability.

My take is that we teachers have much better methods to hold students accountable for independent reading, methods which also reinforce effective reading practice. For example, as a middle school teacher, I use online peer book clubs and student-parent discussions with my students. I’ve also taught high school ELA and supervised elementary teachers doing the same. Plenty of accountability and practice, using the motivating social nature of reading. And no in-class independent reading: it’s all homework. I’m no guru, but I’m persistent, and I get 80–90% participation (more the first semester than the last).

I teach students and their parents how to self-select reading, informed by, but not limited to, word recognition measures. However, challenging books need not be the only books students read; reading at multiple levels has clear advantages and reflects real-world reading. I also train students how to discuss their reading with their peers in their online book clubs (one daily post and two comments required, using the SCRIP Comprehension Bookmarks as prompts; the download follows), and I pop in to add my 2 cents. At Back-to-School Night (I require at least one family member to attend, and I arrange infinite make-up sessions until I meet with every parent or guardian), I train adults how to hold 3-minute student-led reading discussions, and parents assign points for their kid’s 5-days-per-week independent reading and discussion. I’m in a high-poverty school with 75% free and reduced lunch, multi-ethnic, multi-language, etc. If you have tricks up your sleeves to hold students accountable for reading that don’t require additional teacher correction or huge amounts of time, please add them to the comments section of this post. At the end of this article, I link to a nicely organized list of articles and free resources for ELA and reading intervention teachers with quite a few more ideas on independent reading.

In the RC program, the SAM management system tracks individual and class quiz scores and also the number of words students have read in each book. If a student doesn’t pass the quiz after three attempts, she or he loses credit for having read the book. This means that the number of words the student has read is not tallied, and the student doesn’t receive a reward certificate as quickly. If it’s the independent reading that reinforces comprehension, vocabulary acquisition, and fluency, why doesn’t the student receive credit for doing so? The bottom line is that students receive positive reinforcement for mastering quizzes, not for reading. Reading is not rewarded; passing the quizzes is.

Claim 3: RC EMPOWERS educators with reports and actionable data at the student, school and district level. As a supplementary reading program, RC REINFORCES comprehension, vocabulary, and fluency skills. 

Counterclaims: The reports do provide information to the teacher regarding who read what, at what Lexile levels, how many pages were read, what quiz scores were achieved, who hasn’t taken a quiz in a while (alerts), and more. Plenty of information about what your students are and are not doing with respect to their independent reading. All interesting information, but information which takes time to input, analyze, and report (whoever says that technology is a time-saver is crazy), and information which RI administrators (like your principal) can access and compare to that of your colleagues. Although not advocated by the authors of the RC program, most teachers do use this data in various ways to provide incentives for participation in terms of rewards and/or grades. Of course, the incentives can become problematic; see my article, The 18 Reasons Not to Use Accelerated Reader, for examples. In short, the SAM reports do provide data collection and management functions (ones which could be done by paper and pencil or a simple Excel® spreadsheet in less time at no cost); however, none of these data informs reading instruction.
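As a concrete illustration of the "paper and pencil or a simple Excel® spreadsheet" point, the same who-read-what tracking takes only a few lines (or one sheet with a handful of columns). The student names, titles, word counts, and field names below are invented for the example; they are not SAM's actual schema.

```python
from statistics import mean

# Minimal reading log: one row per finished book, kept by the teacher.
# All rows here are made-up sample data.
log = [
    {"student": "Amanda", "title": "Hatchet", "words": 42000, "quiz": 90},
    {"student": "Amanda", "title": "Holes",   "words": 47000, "quiz": 80},
    {"student": "Ben",    "title": "Frindle", "words": 16000, "quiz": 70},
]

def student_summary(log, name):
    """Summarize one student's books read, total words, and quiz average."""
    rows = [r for r in log if r["student"] == name]
    return {
        "books": len(rows),
        "words": sum(r["words"] for r in rows),
        "avg_quiz": mean(r["quiz"] for r in rows),
    }

print(student_summary(log, "Amanda"))
```

The point is not that teachers should write code; it is that the bookkeeping SAM charges for is structurally trivial, while the hard part (knowing what to teach next) is not in the data at all.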

Next, let’s take a look at the claim about empowering educators with actionable data. Remember, the two assessments of the RC program are the three-times-per-year, Lexile-based HMH Reading Inventory (used for initial placement and subsequent progress monitoring) and the 45,000 quizzes. To my mind, actionable data should mean teachable data derived from prescriptive assessments that are reliable and valid. Let’s examine whether these two assessments provide information which is teachable.

For example, let’s say the students in your class take the RI during the first week of school. One of your bright students, Amanda, scores an above grade-level Personal Lexile score of 700, while your class average is 550. With the SAM management software, you are able to use that data to match readers to books. However, other than that use (which we’ve already shown to be of questionable value), those initial RI Lexile scores provide no data to inform your reading instruction. On the RI given three months later, Amanda improves to a 750 and her average quiz score rises from 80% to 90%, but your class averages the same 550 Lexile level and has not improved its 70% quiz average.

What does that data indicate? Something appears to be helping Amanda improve her reading, but we have no idea what it is. It could be the RC program; it could be the independent reading itself; it could be the reading instruction you are doing in class, though you may not know exactly which instruction is helping; it could be what her parents are doing at home. Regarding your class’s average Lexile and quiz scores, something appears not to be working. But what is that something, so we can do something about it? We don’t know. You could look at subgroups and find out that your girls have improved more than your boys, or one ethnic group more than another, etc. But how does the Lexile and quiz data inform our instruction? The short answer? It doesn’t. The RI and quizzes provide no information about which reading skills have and have not been mastered by Amanda or by the class as a whole. Neither assessment offers the teacher any specific data regarding what to teach and what not to teach. So why test if it does not provide actionable data?
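The aggregate comparisons described above (class averages, subgroup breakdowns) are trivially easy to compute, which is exactly the point: they tell you that something moved, not what to teach. A sketch with invented names and scores:

```python
from statistics import mean

# Hypothetical fall and winter RI Lexile scores; names and numbers invented.
scores = {
    "Amanda": (700, 750),
    "Ben":    (540, 545),
    "Carla":  (520, 560),
    "Dev":    (560, 550),
}

fall = mean(s[0] for s in scores.values())
winter = mean(s[1] for s in scores.values())
print(f"class average: {fall:.0f} -> {winter:.0f}")

# The report can say *that* the class moved (or didn't); it cannot say
# *which* skill (decoding, vocabulary, inference...) to teach next.
```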

A good question. Of course, teachers have been creating diagnostic and formative assessments for years that do inform their reading instruction in specific sub-skills. Good teachers are more than willing to test when the data pinpoints what needs to be taught and practiced and what does not require repeated instruction. Like many teachers, I’ve developed my own assessments to inform my instruction. I’ve written and field-tested 13 diagnostic reading assessments with recording matrices and audio files, which provide teachable data. I provide them free of charge to help your students, and because some teachers would prefer not to re-invent the wheel by creating their own teaching resources to correspond to each assessment item. Yes, you can buy those instructional resources from Pennington Publishing. Simply click the link and look in the header to download and print the free assessments. Additionally, skim the Articles and Resources to find over 700 articles of interest to the ELA and reading teacher, including a slew of articles on how to create your own no-cost independent reading program that I think does a better job for students than either the Accelerated Reader™ or Reading Counts! program.

Both the Accelerated Reader™ and Reading Counts! program authors are careful to label their independent reading management programs as supplementary programs, as they should. However, as every teacher knows, instructional time is reductive: if you add on this, you have to take away that. Because both programs are designed for in-class and home practice, AR and RC supplant other instruction, almost always reading instruction. Accepting at face value the RC claim that RC REINFORCES comprehension, vocabulary, and fluency skills, my question to teachers would be… Which would help your students improve their reading more? REINFORCING or TEACHING? Feel free to download my SCRIP Comprehension Strategies TEACHING resource at the end of this article as a reward for slogging through this rather long diatribe. I look forward to your comments.


Mark Pennington, MA Reading Specialist, is the author of the comprehensive assessment-based reading intervention curriculum, the Teaching Reading Strategies and Sam and Friends Guided Reading Phonics Books BUNDLE. It is ideal for students reading two or more grade levels below their current grade level, for tiered response-to-intervention programs, and for ESL, ELL, ELD, and special education students. Simple directions, YouTube training videos, and well-crafted activities truly make this an almost no-prep curriculum. It works well as a half-year intensive program or a full-year program, covering phonological awareness, phonics, syllabication, sight words, fluency (with 128 YouTube modeled readings), spelling, vocabulary, and comprehension. The 54 accompanying guided reading phonics books each have comprehension questions, a focus sound-spelling pattern, controlled sight words, a 30-second word fluency, a running record, and cleverly illustrated cartoons by David Rickert to match each entertaining story. These resources provide the best reading intervention program at a price every teacher can afford.

Teaching Reading Strategies and Sam and Friends Guided Reading Phonics Books BUNDLE

Teaching Reading Strategies and Sam and Friends Guided Reading Phonics Books

Check out the quality of these programs with a resource which works for both grade-level and struggling readers to improve internal monitoring of reading: 

These five FREE lessons will help you teach the SCRIP Comprehension Strategies for both grade-level and struggling readers. This FREE download is sent safe and secure to your email, and includes a set of SCRIP Posters and Bookmarks.

Get the SCRIP Comprehension Strategies FREE Resource:


English-language Arts Standards

Common Core State Standards

Standards-based education is now the norm in public and most parochial schools. Having largely captured the focus of the educational reform movement over the last 25 years, standards-based instruction is now the instructional mandate in all 50 states. Although some states have rescinded their adoption of the Common Core State Standards and some, like Texas, never did adopt the Standards, each state has adopted its own set of standards and some have developed their own state assessment systems. Teachers and district administrators continue to align curriculum to the instructional demands of the Common Core English Language Arts Standards.

Although the authors of the Common Core State Standards assert that literacy instruction must be a shared responsibility within the school, the largest burden still falls on the shoulders of ELA teachers. Of the four Reading, Writing, Speaking and Listening, and Language Strands, the Language Strand presents the greatest challenge for many teachers. Most ELA teachers simply have not had the undergraduate or graduate coursework to prepare them to teach the L.1, 2, 3, 4, 5, and 6 Standards in grammar and usage, mechanics, spelling, language application, and vocabulary.

This author, Mark Pennington, has written articles and developed free teaching resources on the Common Core ELA Standards and included these in his Pennington Publishing Blog to support fellow ELA teachers and reading intervention specialists. Mark’s assessment-based teaching resources are available at Pennington Publishing.

This article and resource compilation is jam-packed with FREE resources, lesson plans, and samples from grades 4–high school ELA and reading intervention programs, developed by teacher and author, Mark Pennington. Each of the following 25+ articles has multiple links to research, related articles, and free or paid resources:

Common Core Literalism

The Common Core State Standards were never written to be the Bible for ELA and reading intervention teachers. Read what the Common Core authors have to say and see how a common sense approach to teaching to the Standards can benefit both students and teachers.

FREE Instructional Resources: Syllable Awareness Assessment, 20 Advanced Syllable Rules, 10 English Accent Rules

Response to Intervention and the Common Core

Many teachers have never read the entire Common Core English Language Arts Standards. Sure, they’ve read their own district or state summaries of the Standards, but not the documents themselves. To understand the instructional role of the Standards, teachers must read the appendices, which discuss important reflections and research regarding, for instance, reading intervention.

Grammar and the Common Core

More than any other Strand within the Common Core State Standards, the Language Strand with its focus on direct grammar, mechanics, and vocabulary instruction has been whole-heartedly embraced or intentionally ignored by teachers.

Common Core Instructional Minutes

With all the CCSS mandates, how can an ELA teacher allocate instructional time to be faithful to the Standards, while maintaining some sense of one’s own priorities? This article gets down to the minute-by-minute.

Common Core Academic Language Words

Of course, history, science, and technology teachers need to teach domain-specific academic vocabulary. However, there is a difference between academic language and academic vocabulary. The latter is subject/content specific; the former is not. Reading more challenging expository novels, articles, documents, reports, etc. will certainly help students implicitly learn much academic language; however, academic language word lists coupled with meaningful instruction do have their place. So, which word lists make sense?

Common Core Greek and Latinates

The bulk of the Vocabulary Standards are included in the Language Strand of the Common Core State Standards (CCSS). Greek and Latin affixes (prefixes and suffixes) and roots are key components of five of the grade-level Standards: Grades 4–8. Which Greek and Latin affixes and roots should we teach? How many should we teach? How should we teach them?

Grammar, Mechanics, Spelling, and Vocabulary (Teaching the Language Strand)

Grammar, Mechanics, Spelling, and Vocabulary (Teaching the Language Strand) is part of a comprehensive Grades 4–12 language program, designed to address each Standard in the Language Strand of the Common Core State Standards in 60–90 weekly instructional minutes. This full-year curriculum provides interactive grammar, usage, mechanics, and spelling lessons, a complete spelling patterns program, language application openers, and vocabulary instruction. The program has all the resources to meet the needs of diverse learners. Diagnostic assessments provide the data to enable teachers to individualize instruction with targeted worksheets, each with a formative assessment. Progress monitoring matrices allow teachers to track student progress. Each instructional resource is carefully designed to minimize teacher preparation, correction, and paperwork. Appendices have extensive instructional resources, including the Pennington Manual of Style and downloadable essay comments. A student workbook accompanies this program.

Overview of the Common Core Language Strand

English-language arts teachers have long been accustomed to the four-fold division of our “content” area into Reading, Writing, Listening, and Speaking. These divisions have been widely accepted and promoted by the NCTE, publishers, and other organizations. In a nod to the fearsome foursome, the Common Core State Standards in English Language Arts maintain these divisions (called strands) with two notable revisions: Speaking and Listening are combined, and Language has its own seat at the table.

Common Core Grammar Standards

The Common Core State Standards in English Language Arts are divided into Reading, Writing, Speaking and Listening, and Language strands. The Common Core Grammar Standards are detailed in the Language Strand. It is notable that grammar and mechanics have their own strand, unlike the organization of many of the old state standards, which placed grammar and mechanics instruction solely within the confines of writing or speaking standards.

Of course, the writers of the Common Core use the ambiguous label, Language, to refer to what teachers and parents casually label as grammar and mechanics or conventions. To analyze the content and educational philosophy of the Common Core State Standards Language Strand, it may be helpful to examine What’s Good about the Common Core State Standards Language Strand? as well as What’s Bad about the Common Core State Standards Language Strand? chiefly from the words of the document itself.

How to Teach the Common Core Vocabulary Standards

What most teachers notice after careful reading of the Common Core Vocabulary Standards is the expected breadth, complexity, and depth of instruction across the grade levels. These vocabulary words require direct, deep-level instruction and practice in a variety of contexts to transfer to our students’ long-term memories. So what instructional strategies make sense to teach the Common Core Vocabulary Standards? And what is the right amount of direct, deep-level vocabulary instruction that will faithfully teach the Common Core Vocabulary Standards without consuming inordinate amounts of class time? Following is a weekly instructional plan to teach the L.4, 5, and 6 Vocabulary Standards.

CCSS Language Progressive Skills

The Language Strand has been one of the most controversial components of the COMMON CORE STATE STANDARDS FOR ENGLISH LANGUAGE ARTS & LITERACY IN HISTORY/SOCIAL STUDIES, SCIENCE, AND TECHNICAL SUBJECTS. The Language Progressive Skills document emphasizes the essential grammar, usage, and mechanics skills that need to be reviewed and reinforced year after year.

Common Core Curricular Crossover

The Common Core State Standards (CCSS) produces some interesting curricular crossover. The traditional English-language arts divisions of reading, writing, listening, and speaking have been replaced with four new strands: reading, writing, speaking and listening, and language. The six Standards of the Language Strand borrow a bit from each of the traditional divisions. The inclusion of the Language Strand as its own set of Standards has created some concern in the ELA community.

Spelling Word Lists by Grade Levels

As an MA Reading Specialist and author of quite a few spelling curricula (eight at last count), I’m often asked about spelling word lists by grade levels. Which words are right for which grade levels? Is blank (substitute any word) a third or fourth grade word? Which spelling words are the most important ones to practice? The short answer is…

Common Core Essay Writing Terms

I propose using the CCSS language of instruction for the key writing terms across all subject disciplines in elementary, middle school, and high school. Some of us will have to come down out of our castles and give up pet writing terms that we’ve used for years, and ones that, indeed, may be more accurate than those of the CCSS. But for the sake of collaboration and service to our students, this pedagogical sacrifice is a must.

Common Core Content Area Reading and Writing

Nothing in the new Common Core State Standards (CCSS) has worried English-language arts teachers more than “The Great Shift.” This shift moves the emphasis of reading and writing in K-12 English-language arts (ELA) classrooms from the literature and narrative genres to the informational (to explain) and argumentative (to persuade) genres.

Common Core Language Standards

Teachers are generally quite familiar with the CCSS Reading and Writing Standards, not so with the Language Strand Standards. The Language Strand includes the grammar, usage, mechanics, and vocabulary Standards.

Standards and Accountability

Sometimes we teachers can be our own worst enemies. Check out this article, published in the Answer Sheet of The Washington Post.

Turning Dependent into Independent Readers

The Common Core State Standards for English-language Arts makes a compelling case for not doing business as usual in our ELA classrooms. That business consists of the traditional “sage on the stage” methodology of reading an entire novel or play out loud and parsing paragraphs one at a time. Our new business? Scaffolding just enough reading strategies and content as we act as “guides on the side” to facilitate independent reading. In other words, the days of spoon-feeding have got to go.

Why and How to Teach Complex Text

A growing body of research presents a challenge to current K-12 reading/English-language Arts instruction. In essence, we need to “up” the level of text complexity and provide greater opportunities for independent reading. The Common Core State English-language Arts Standards provide a convincing three-reason argument in support of these changes in instructional practice. Following this rationale, I will share ten instructional implications and address a few possible objections.

Common Core State Writing Standards

The Common Core State Writing Standards have used a rather utilitarian approach to categorize essays into two classifications: argument and informational/explanatory writing. The approach used by the English-language Arts committee was to examine the writing assignments of freshman English college professors and then to define the essay accordingly for the purposes of the Common Core State Writing Standards.

How to Teach the English-language Arts Standards

Every English-language arts teacher shares the same problem: too much to teach and not enough time to teach it. So, where are the magic beans that will allow us to teach all of the have-tos (think ELA Standards) and still have a bit of time to teach the want-tos? Following are a few suggestions to help the clever ELA teacher have her cake and eat it, too.

Should We Teach Standards or Children?

The excesses of the standards-based movement frequently run contrary to the need to differentiate instruction according to the diagnostic needs of children.

More Articles, Free Resources, and Teaching Tips from the Pennington Publishing Blog

Bookmark and check back often for new articles and free ELA/reading resources from Pennington Publishing.


Pennington Publishing’s mission is to provide the finest in assessment-based ELA and reading intervention resources for grades 4 through high school teachers. Mark Pennington is the author of two Standards-aligned programs: Teaching Essay Strategies and Grammar, Mechanics, Spelling, and Vocabulary (Teaching the Language Strand). Mark’s comprehensive Teaching Reading Strategies and the accompanying Sam and Friends Guided Reading Phonics Books help struggling readers significantly improve their reading skills in a full-year or half-year intensive reading intervention program. Make sure to check out Pennington Publishing’s free ELA and reading assessments to help you pinpoint grammar, usage, mechanics, spelling, and reading deficits.

Grammar/Mechanics, Literacy Centers, Reading, Spelling/Vocabulary, Study Skills, Writing

ELA and Reading Assessments Do’s and Don’ts #6

Ah… the final episode of ELA and Reading Assessments Do’s and Don’ts. Will they or won’t they kill off the hero? Of course, in the movies or on television, a final episode may or may not be the last. With the plethora of reunion shows (Roseanne last year and Murphy Brown this year), we all take the word final with a grain of salt. If you’ve missed one of the following got-to-see episodes, check it out after you watch this one.

In case you were up in the lobby for part or all of the previous five episodes, here are the assessment topics covered in Episodes 1-5:

Episode 1

  • DO use comprehensive assessments, not random samples.
  • DON’T assess to assess. Assessment is not the end goal.
  • DO use diagnostic assessments.
  • DON’T assess what you won’t teach.

Episode 2

  • DO analyze data with others (drop your defenses).
  • DON’T assess what you can’t teach.
  • DO steal from others.
  • DON’T assess what you must confess (data is dangerous).

Episode 3

  • DO analyze both data deficits and mastery.
  • DON’T assess what you haven’t taught.
  • DO use instructional resources with embedded assessments.
  • DON’T use instructional resources which don’t teach to data.

Episode 4

  • DO let diagnostic data do the talking.
  • DON’T assume what students do and do not know.
  • DO use objective data.
  • DON’T trust teacher judgment alone.

Episode 5

  • DO think of assessment as instruction.
  • DON’T trust all assessment results.
  • DO make students and parents your assessment partners.
  • DON’T go beyond the scope of your assessments.


ELA and Reading Assessments

Do’s and Don’ts: Assessments

Today’s topics include the following: DO use both diagnostic and formative assessments. DON’T assess to determine a generic problem. DO review mastered material often. DON’T solely assess grade-level Standards.

Kick up your feet (if you’re in one of those new theaters), grab a handful of popcorn, and read on. And make sure to stay until the end to download our FREE reading fluency assessment with recording matrix.

DO use both diagnostic and formative assessments.

Good teaching begins with finding out what students know and don’t know about the concept or skill before instruction begins. So often we assume that students do not know what we plan to teach. We start at the beginning, when a brief diagnostic assessment might better inform our instruction. You wouldn’t hire a contractor to remodel a bathroom without seeing the existing bathroom. Nor would you think much of a contractor who insisted on building a new foundation when the existing foundation was fine and ready to build upon.

When teachers complete a diagnostic assessment and find that 1/3 of their class lacks a certain skill, say commas after nouns of direct address, they have three options: 1. Skip the comma lesson because “most (2/3) have mastered the skill.” 2. Teach the whole class the comma lesson because “some (1/3) don’t know it and it won’t kill the rest of the kids (2/3) to review.” 3. Provide individualized or small group instruction “only for the kids (1/3) who need to master the skill” while the ones who have achieved mastery work on something else. As a fan of assessment-based instruction, I support #3.

However, if we use only diagnostic assessments, we miss out on an essential instructional component: formative assessment. Formative assessment checks students’ understanding of the concept or skill within the context of instruction. Following instructional input and guided practice, a brief formative assessment informs the teacher’s next step in instruction: Move on because they’ve got it. Re-teach to the entire class. Re-teach to those who have not mastered the concept or skill.

Need an example of an effective formative assessment?

Write three sentences: one with a noun of direct address at the beginning, one in the middle, and one at the end of a sentence.

DON’T assess to determine a generic problem.

Let me step on a few toes to illustrate a frequent problem with teacher assessments. Most elementary school teachers administer reading fluency assessments at the beginning of the year. Yes, middle and high school ELA teachers should be doing the same, albeit with silent reading fluencies. However, teachers select (or their district provides) a grade-level passage to read. Teachers dutifully compare student data to research-based grade level norms. Some teachers will re-assess throughout the year with similar grade-level passages and chart growth. All well and good; however, what does this common assessment procedure really tell us and how does it inform our reading instruction? Answer: The fluency assessments only tell us generically that Brenda reads below, Juan reads at, and Cheyenne reads above grade-level fluency norms on a grade-level passage. 

All we really know is that Brenda has a generic problem in reading grade-level passages. What we don’t know (but would like to know to inform our instruction) is the following specific data: Brenda reads at the frustrational level with grade 5 passages, but is instructional at grade 4 and independent at grade 3. That specific data would inform our instruction and pinpoint appropriate reading resources for Brenda’s practice (as well as for Juan and Cheyenne).

Of course, you could follow the initial assessment with other grade-level assessments to get that specificity, but why would you when a single initial assessment can give you not only grade-level data, but also instructional-level data? You’ll love our FREE download!

In other words, if you’re going to assess, you might as well assess efficiently and specifically. Knowing that a student has a problem is okay; knowing exactly what the student’s problem is is much more useful.
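To show how the fluency numbers themselves work, here is a minimal sketch (my own illustration, not part of any Pennington Publishing assessment) of the standard oral reading fluency score, words correct per minute: words read minus errors, divided by the length of the timing.

```python
def words_correct_per_minute(words_read: int, errors: int, minutes: float) -> float:
    """Standard oral reading fluency score: (words read - errors) / minutes."""
    if minutes <= 0:
        raise ValueError("timing must be longer than zero minutes")
    return (words_read - errors) / minutes

# A two-minute timing smooths out the fast first minute that can inflate scores.
print(words_correct_per_minute(words_read=240, errors=8, minutes=2))  # 116.0
```

Comparing that score against norms for passages at several difficulty levels, rather than a single grade-level passage, is what yields the frustrational, instructional, and independent distinctions that actually inform instruction.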

DO review mastered material often.

The Common Core State Standards authors speak often in Appendix A about the cyclical nature of learning. Beyond the normal forgetting cycle, students often require re-teaching. Once mastered, always mastered is not a truism.

Additionally, Summer Brain Drain is an all-too-frequent reality teachers face with a new set of students each year. Frequently, last year’s assessment data, provided by last year’s teacher, may indicate starting points higher than what the students demonstrate on even the same assessments given on Day One. Sometimes the new teacher may suspect that the previous year’s teacher padded results to impress parents and administrators. However, teachers who loop with their students are often surprised by how much re-teaching must be done to get students back up to where they were.

The Test-Teach-Test-Teach-Test model is what assessment-based instruction is all about.

DON’T solely assess grade-level Standards.

I once taught next door to an eighth grade teacher whom the kids adored. He was funny, bright, and cared about his students. He was also glued to the Standards. So much so that he taught only grade-level Standards: irrespective of whether students were ready for the individual Standard; irrespective of whether students were deficient in much more important concepts or skills (such as being able to read); and irrespective of whether students already knew the Standards.

His philosophy was “if every teacher taught the grade-level Standards, no remediation would be required.” He said, “I’m an eighth-grade teacher and I teach the eighth-grade Standards, nothing more and nothing less.”

One day I got up the nerve to ask him, “Wouldn’t it make more sense if your philosophy were ‘if every student learned the grade-level Standards, no remediation would be required’?”

His middle and upper kids did fine, although I suspect they had some significant learning gaps. The lower kids floundered or were transferred into my classes.


I’m Mark Pennington, ELA teacher and reading specialist. Check out my assessment-based ELA and reading intervention resources at Pennington Publishing.


The “Pets” expository fluency passage is leveled in a unique pyramid design: each of its seven paragraphs is written one grade level (Flesch-Kincaid) higher than the last, from the first grade reading level through the seventh. Thus, the reader begins practice at an easier level to build confidence and then moves to more difficult academic language. As the student reads the fluency passage, the teacher will be able to note the reading levels at which the student has a high degree of accuracy and automaticity. Automaticity refers to the ability of the reader to read effortlessly without stumbling or sounding out words. The 383-word passage permits the teacher to assess two-minute reading fluencies (a much better measurement than a one-minute timing).
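For the curious, the Flesch-Kincaid grade-level estimate is a published formula based on average sentence length and average syllables per word. This sketch (my own illustration, with hypothetical counts; the actual leveling of the “Pets” passage was done by the author) shows how a paragraph’s counts map to a grade level:

```python
def flesch_kincaid_grade(total_words: int, total_sentences: int, total_syllables: int) -> float:
    """Flesch-Kincaid grade level: longer sentences and longer words raise the score."""
    words_per_sentence = total_words / total_sentences
    syllables_per_word = total_syllables / total_words
    return 0.39 * words_per_sentence + 11.8 * syllables_per_word - 15.59

# Hypothetical counts for a short paragraph: 100 words, 10 sentences, 130 syllables.
print(round(flesch_kincaid_grade(100, 10, 130), 2))  # 3.65
```

Shorter sentences with one- and two-syllable words pull the score toward the early grades, which is exactly how the opening paragraphs of a pyramid passage stay easy.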

Get the “Pets” Fluency Assessment FREE Resource:

Grammar/Mechanics, Literacy Centers, Reading, Spelling/Vocabulary, Study Skills, Writing

ELA and Reading Assessments Do’s and Don’ts #5

Do’s and Don’ts of ELA and Reading Assessments

Notice how movie theaters have jumped on the rewards bandwagon? Yes, we earn points on our rewards cards toward a free popcorn or soda. I’m all about the rewards, but we now have a desk drawer full of cards.

However, for my Do’s and Don’ts of ELA and Reading Assessments series, you “don’t need no stinkin’ card” (Mel Brooks’s Blazing Saddles) to get our FREE assessments, audio files, progress monitoring matrices, and lessons.

If you’ve missed one of the following got-to-see episodes, check it out after you watch this one.

  1. Episode 1
  • DO use comprehensive assessments, not random samples.
  • DON’T assess to assess. Assessment is not the end goal.
  • DO use diagnostic assessments.
  • DON’T assess what you won’t teach.
  2. Episode 2
  • DO analyze data with others (drop your defenses).
  • DON’T assess what you can’t teach.
  • DO steal from others.
  • DON’T assess what you must confess (data is dangerous).
  3. Episode 3
  • DO analyze both data deficits and mastery.
  • DON’T assess what you haven’t taught.
  • DO use instructional resources with embedded assessments.
  • DON’T use instructional resources which don’t teach to data.
  4. Episode 4
  • DO let diagnostic data do the talking.
  • DON’T assume what students do and do not know.
  • DO use objective data.
  • DON’T trust teacher judgment alone.

Now, sit back in your plushy seat and enjoy the flick. In Episode 5 we are taking a look at the following:

DO think of assessment as instruction. DON’T trust all assessment results. DO make students and parents your assessment partners. DON’T go beyond the scope of your assessments.

Wait ’til you download the featured assessment and matrix. It’s worth the wait.

DO think of assessment as instruction.

So often teachers view assessments as extraneous got-to’s, not as integral instructional components. I’ve heard, “I got into teaching to teach, not to assess” more times than I can count.

I kindly suggest that we should re-orient our thinking. No teacher would want to use an instructional resource that provided inaccurate information. No teacher would want to hand out a worksheet that her students had already completed. No teacher would want to waste time teaching something that her students already had mastered. Yet, teachers do so all the time when they have not assessed what students know and what they don’t know.

Diagnostic and formative assessments inform our instruction. No one would trust a doctor who would write a prescription without a diagnosis. Diagnosis is part of the exam. The same is true for teaching. Assessment is an integral component of instruction.

If they know it, they will show it; if they don’t they won’t. So don’t blow it; make ’em show it.

DON’T trust all assessment results.

Even the best of doctors will suggest a second opinion. This is sound advice for teacher diagnosticians as well. Sometimes it makes sense to use an alternative assessment to double-check what students know and what they don’t know, especially when the results seem inconsistent with other data.

When I was in fifth grade, I was pulled out of class to be tested for the gifted program. The assessment consisted of a timed test of orally delivered questions. After the second or third question, I hit upon a strategy to give me more think time. After each oral question, I asked, “What?” I got the question again and had twice as long to answer. I don’t remember if I qualified for the program, but I do remember being referred to the audiologist for hearing loss.

When in doubt, double-check with a different assessment.

DO make students and parents your assessment partners. 

Test data shouldn’t be secret. Both students and parents need to know what is already known and what needs to be known. Most elementary teachers share some form of data at student-parent-teacher conferences, but secondary teachers rarely do so.

My suggestion is to share both diagnostic and formative assessment data on a regular basis with students and parents. Both are encouraged and motivated by progress. Share progress monitoring matrices with your partners.

DON’T go beyond the scope of your assessments.

Good assessments are limited assessments. They test specific concepts and skills, not general ones. Teachers over-reach when they try to make assessments walk on all fours; in other words, when they make assessments prescribe generalizations or treatments beyond the scope of their application.

For example, a student who fails to correctly punctuate an MLA citation on a unit test may not need further instruction in what and what not to cite. Or a student who does not know when and when not to drop the final silent e when adding a suffix may not need to practice reading silent final e sound-spellings (the former is a spelling skill; the latter is a phonics skill).

Effective assessment-based instruction sticks to the limits of the assessment and does not generalize.

Glad you dropped by to watch Episode 5? Before you re-fill that unlimited re-fills popcorn on your way out, better grab your ticket for the next installment of ELA and Reading Assessments Do’s and Don’ts: Episode 6. This one could sell out! Also get 15 more FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. A 99% score on Rotten Tomatoes! Here’s the preview:

  1. DO use both diagnostic and formative assessments.
  2. DON’T assess to determine a generic problem.
  3. DO review mastered material often.
  4. DON’T solely assess grade-level Standards.


I’m Mark Pennington, ELA teacher and reading specialist. Check out my assessment-based ELA and reading intervention resources at Pennington Publishing.

Get the Consonant Sounds Phonics Assessment, Audio File, and Recording Matrix FREE Resource:

Grammar/Mechanics, Literacy Centers, Reading, Spelling/Vocabulary, Writing

ELA and Reading Assessments Do’s and Don’ts #4

ELA and Reading Assessment Do's and Don'ts

Assessment Do’s and Don’ts

I’ve been using a silly movie theme to weave together a series of articles for my Do’s and Don’ts of ELA and Reading Assessments series. So far I’ve offered these suggestions over the trailer and first three episodes:

    1. Episode 1
  • DO use comprehensive assessments, not random samples.
  • DON’T assess to assess. Assessment is not the end goal.
  • DO use diagnostic assessments.
  • DON’T assess what you won’t teach.
    2. Episode 2
  • DO analyze data with others (drop your defenses).
  • DON’T assess what you can’t teach.
  • DO steal from others.
  • DON’T assess what you must confess (data is dangerous).
    3. Episode 3
  • DO analyze both data deficits and mastery.
  • DON’T assess what you haven’t taught.
  • DO use instructional resources with embedded assessments.
  • DON’T use instructional resources which don’t teach to data.

Permit me to tell a brief anecdote. As a junior in high school, I got my license on my sixteenth birthday. At last, I could take my girlfriend out on a real date! Where to go? The movies, of course. Just one problem.

Friday night was guys’ night. My group of buddies and I always got together on Friday night. When Richard called me up after school to tell me that he would pick me up at 7:00, I quickly lied and told him that I was sick. Of course, I had already called my girlfriend to ask her to go to the movies.

We were munching on popcorn, half-way through the movie, when an obnoxiously loud group of guys entered the theater. Yes… my friends. I slumped down in my seat and told my girlfriend that I needed to see all the credits before leaving. When I assumed my friends had left the theater for their next Friday night adventure, my girlfriend and I slowly made our way up to the lobby.

Richard was the first friend to greet me. Let’s just say I paid dearly for that lie.

This article’s focus?

DO let diagnostic data do the talking. DON’T assume what students do and do not know. DO use objective data. DON’T trust teacher judgment alone.

The FREE assessment download at the end of this article includes a recording matrix and two great lessons… all to convince you to check out my assessment-based ELA and reading program resources at Pennington Publishing.

DO let diagnostic data do the talking.

One of the first lessons new teachers learn is how to answer this student or parent question: “Why did you give me (him or her) a ___ on this essay, test, project, etc.?”

Of course, every veteran teacher knows the proper response (with italics for speech emphasis): “I didn’t give you (him or her) anything. You (he or she) earned it.”

A less snotty and more effective response is to reference the data. Data is objective. Changing the subjective nature of the question into an objective answer is a good teacher self-defense mechanism and gets to the heart of the issue.

Diagnostic data is especially helpful in answering why students are having difficulties in a class. Additionally, the data in and of itself offers a prescription for treatment. Going home from the doctor with a “This should go away by itself in a few weeks” or a “Just not sure what the problem is, but it doesn’t seem too serious” is frustrating. Patients want a prescription to fix the issue. Parents and students can get that prescription with assessment-based instructional resources.

One other application for both new and veteran teachers to note: A teacher approaches her principal with this request: “I need $$$$ to purchase Pennington Publishing’s Grammar, Mechanics, Spelling, and Vocabulary BUNDLE. Our program adoption does not provide the resources I need to teach the CCSS Standards.”

Answer: “Not at this time.”

Instead, let diagnostic data do the talking.

“Look at the diagnostic data on this matrix for my students. They need the resources to teach to these deficits.”

Answer: “Yes (or Maybe)”

DON’T assume what students do and do not know. 

We teachers are certainly not free of presuppositions and bias. As a result, we assume what has yet to be proven. In other words, we beg the question regarding what our students know and don’t know.

“He must be smart, but just lazy.” “His older sister was one of my best students.” “They’re in an honors class; of course they know their parts of speech.” “I have to teach everything as if none of my students knows anything; I assume they are all tabula rasa (blank slates).” “You all had Ms. Peters last year, so we don’t have to teach you the structure of an argumentative essay.”

Effective diagnostic assessments eliminate the assumptions. Regarding diagnostic assessments, I always advise teachers: “If they know it, they can show it; if they don’t, they won’t.”

DO use objective data.

Not all diagnostic assessments are created equal. By design, a random sample assessment is subjective, no matter the form of sampling. Those of you who remember your college statistics class will agree.

Teachers need objective data, not data which suggests problem areas. Teachers need to know the specifics to be able to inform their instruction. For this application, objective means comprehensive.

The “objective” PAARC, SWBAC, or state-constructed CCSS tests may indicate relative student weaknesses in mechanics; however, teachers want to know exactly which comma rules have and have not been mastered. Teachers need that form of¬†objective data.

DON’T trust teacher judgment alone.

After years of teaching, veteran teachers learn to rely on their judgment (as they should). After a few more years of teaching, good teachers learn to distrust their own judgment at points. Experienced teachers look for the counter-intuitive in these complex subjects of study that we call students. What makes them tick? Kids keep our business interesting.

Diagnostic and formative assessments bring out our own errors in judgment and help us experiment to find solutions for what our students need to succeed. Assessments point out discrepancies and point to alternative means of instruction.

For example, a student may score high in reading comprehension on an untimed standards-based assessment. Also, she was in Ms. McGuire’s highest reading group last year. Most teachers would assume that she has no reading problems and should be assigned to an advanced literacy group.

Yet, her diagnostic spelling assessment demonstrates plenty of gaps in spelling patterns. A wise teacher would suspend her initial judgment and do a bit more digging. If that teacher gave the Vowel Sounds Phonics Assessment (our FREE download at the end of this article), the student might demonstrate some relative weaknesses. She may be an excellent sight-word reader who does fine with stories, but one who will fall apart reading an expository article or her science textbook.

Like my dad always told me… Measure twice and cut once.

Thanks for watching Episode 4. Make sure to buy your ticket for the next installment of ELA and Reading Assessments Do’s and Don’ts: Episode 5 before you sneak out of the theater with your girlfriend or boyfriend. Also get 15 more FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. A 94% score on Rotten Tomatoes! Here’s the preview: DO think of assessment as instruction. DON’T trust all assessment results. DO make students and parents your assessment partners. DON’T go beyond the scope of your assessments.


I’m Mark Pennington, ELA teacher and reading specialist. Check out my assessment-based ELA and reading intervention resources at Pennington Publishing.

Get the Vowel Sounds Phonics Assessment with Audio File and Matrix FREE Resource:

Grammar/Mechanics, Literacy Centers, Reading, Spelling/Vocabulary, Writing

ELA and Reading Assessments Do’s and Don’ts #3

ELA and Reading Assessments Do's and Don'ts

Do’s and Don’ts: ELA and Reading Assessments

The thing about movie sequels is that we feel a compulsive necessity to see the next and the next because we’ve seen the first. I’d be interested to know what percentage of movie-goers who saw all three Lord of the Rings movies watched both Hobbit prequels. My guess would be a rather high percentage.

If my theory is correct, I’d also hazard to guess that the critic reviews would not substantially alter that percentage.

Of course my hope is that I’ve hooked you on this article series and the FREE downloads 🙂 of assessments, recording matrices, audio files, and activities in order to entice you to check out my corresponding assessment-based products at Pennington Publishing.

In my Do’s and Don’ts of ELA and Reading Assessments series, I’ve offered these bits of advice so far:

    1. Episode 1
  • DO use comprehensive assessments, not random samples.
  • DON’T assess to assess. Assessment is not the end goal.
  • DO use diagnostic assessments.
  • DON’T assess what you won’t teach.
    2. Episode 2
  • DO analyze data with others (drop your defenses).
  • DON’T assess what you can’t teach.
  • DO steal from others.
  • DON’T assess what you must confess (data is dangerous).

DO analyze both data deficits and mastery.


Kids are fixer-uppers, waiting to be fixed and flipped.

Teachers are fixers. In some sense we view our students as “as is” houses or fixer-uppers, waiting for us to determine what needs repair and updating so that we can flip them in market-ready condition to the next teacher.

Teachers should use diagnostic assessments in this way: almost all students need to catch up while they keep up with grade-level instruction.

However, we miss some of the value of diagnostic assessments when we don’t analyze data to build upon the strengths of individual students. For example, teachers are frequently concerned about the student who has high reading fluency rates, but poor comprehension. Yes, some students are able to read quickly with minimal miscues, but understand and retain little of what they have read. Just weird, right?

Looking only at the diagnostic deficit (lack of comprehension) might lead the teacher to assume that the student is a sight word reader in need of extensive decoding practice to shore up this reading foundation. However, if we look at the relative strength (fluency), we might prescribe a different treatment to build upon that strength. The student may well have some decoding deficits, but if the student is able to recognize the words, it makes sense to use that ability to teach the student how to internally monitor text with self-questioning strategies.

Both relative strengths and weaknesses matter when analyzing student assessment data.
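For teachers who track diagnostic results in a spreadsheet or recording matrix, the analysis above can be sketched in a few lines of code. This is a minimal, hypothetical illustration: the skill names, percentage scores, and 70% cutoff are my own assumptions for the sake of the example, not part of any actual Pennington Publishing assessment.

```python
# Hypothetical diagnostic results: percent mastery per skill, per student.
scores = {
    "Aisha": {"fluency": 95, "comprehension": 55, "decoding": 80},
    "Ben":   {"fluency": 60, "comprehension": 85, "decoding": 70},
}

def strengths_and_deficits(student_scores, cutoff=70):
    """Split one student's skills into relative strengths (at or above
    the cutoff) and deficits (below it), preserving skill order."""
    strengths = [skill for skill, pct in student_scores.items() if pct >= cutoff]
    deficits = [skill for skill, pct in student_scores.items() if pct < cutoff]
    return strengths, deficits

for name, skills in scores.items():
    strengths, deficits = strengths_and_deficits(skills)
    print(f"{name}: build on {strengths}, target {deficits}")
```

The point of the two lists is the point of this section: the deficits list tells you what to re-teach, while the strengths list tells you what to build the re-teaching upon.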

DON’T assess what you haven’t taught.

Teachers love to see progress in their students. Our profession enables us to see a student go from A to B throughout the year with us as the relevant variable. Assessment data does provide us with extrinsic rewards and a self-pat-on-the-back. I love our profession!

But we have to use real data to achieve that self-satisfaction. Otherwise, we are only fooling ourselves. As the new school year begins, countless teachers will administer entry baseline assessments, designed to demonstrate student ignorance. These assessments test what students should know by the end of the year, not what they are expected to know at the beginning of the year. Often the same assessment is administered at the end of the year to determine summative progress and assess a teacher’s program effectiveness.

Resist the temptation to artificially produce a feel-good assessment program such as that. Such a baseline test affords no diagnostic data; it does not inform your instruction. It makes students feel stupid and wastes class time. The year-end summative assessment is too far removed from the baseline to measure the effects of the variables (teacher and program) upon achievement with any degree of accuracy.

Test only what has been taught to see what students have retained and forgotten.

DO use instructional resources with embedded assessments.

In my work as an ELA teacher and reading specialist at the elementary, middle school, high school, and community college levels, I’ve found that most teachers use three types of assessments: 1. They give a few entry-level assessments, but do little if anything with the data. 2. They give unit tests once a month, but do not re-teach or re-test. 3. They give some form of end-of-year or term summative test (the final) with little or no review or re-teaching of the test results.

As you, no doubt, can tell, I don’t see the value in any of the above approaches to assessment. It’s not that these tests are useless; it’s that they tend to be reductive. Teachers give these instead of the tests they should be using to inform their instruction. Diagnostic assessments (as detailed in the previous section) are essential to plan and inform instruction. Also, what’s missing in their assessment plan? Formative assessments.

My take is that the best method of ongoing formative assessment is embedded assessments. By embedded assessments, I mean quick checks for understanding that are included in each lesson. Both the teacher and student need to know whether the skill or concept is understood following instruction, guided practice, and independent practice. For example, in the FREE diagnostic assessment (with audio file), recording matrix, and lessons download at the end of this article, the lesson samples from my Differentiated Spelling Instruction programs are spelling pattern worksheets. These are remedial worksheets which students would complete if the Diagnostic Spelling Assessment indicated specific spelling pattern deficits. Each worksheet includes a writing application at the end, which demonstrates whether the student has or has not mastered the practiced spelling pattern. These are embedded assessments, which the teacher can use to determine whether additional instruction is required.

Use instructional materials which teach and test.

DON’T use instructional resources which don’t teach to data.

The converse of the previous section is also worth stating. To put things simply: Why would a teacher choose to use an instructional resource (a worksheet, a game, software, a lecture, a class discussion, an article, anything) which is not testable in some way? Of course, the assessment need not involve pencil and paper; informed teacher observation can certainly count as assessment of learning.

Let’s use one example to demonstrate an instructional resource which does not teach to data and how that same resource can teach to data: independent reading. This one will step on a few toes.

Instructional Resource: “Everyone take out your independent reading books for Sustained Silent Reading (SSR).” Okay, you may do Drop Everything and Read (DEAR) or Free Voluntary Reading (FVR) or…

Practice:  20 minutes of silent reading

Assessment: None

The instructional resource may or may not be teaching. We don’t know. If the student is reading well at an appropriate challenge level, the student is certainly benefiting from vocabulary acquisition. If the student is daydreaming or pretending to read, SSR is producing no instructional benefit. Following is an alternative use of this instructional resource:

Instructional Resource: “Everyone take out your challenge level independent reading books for Sustained Silent Reading (SSR), your SCRIP Comprehension Strategies Bookmarks, and your pencil for annotations (margin notes).”

Practice with Assessment: Read for 10 minutes, annotating the text. Then do a re-tell with your assigned partner for 1 minute, using the SCRIP Comprehension Strategies Bookmarks as self-questioning prompts. Partners are to complete the re-tell checklist. Repeat after 10 more minutes. The teacher randomly calls on a few readers to repeat their re-tells, plus their partners’ additions, to the entire class. If the checklists and teacher observation of the oral re-tells indicate that the students are missing, say, cause-effect relationships in their reading, the teacher should prepare and present a think-aloud lesson, emphasizing this reading strategy with practice. This practice uses data and informs the teacher’s instruction. Plus, it provides students with a purpose for instruction and holds them accountable for learning.

Thanks for watching Episode 3. Make sure to purchase your ticket for the next installment of ELA and Reading Assessments Do’s and Don’ts: Episode 4 before you walk out of the theater. This episode will sell out fast! Also get 15 more FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. A 98% score on Rotten Tomatoes! Here’s the preview: DO let diagnostic data do the talking. DON’T assume what students do and do not know. DO use objective data. DON’T trust teacher judgment alone.


I’m Mark Pennington, ELA teacher and reading specialist. Check out my assessment-based ELA and reading intervention resources at Pennington Publishing.

Get the Diagnostic Spelling Assessment, Mastery Matrix, and Sample Lessons FREE Resource:

Grammar/Mechanics, Literacy Centers, Reading, Spelling/Vocabulary, Writing

ELA and Reading Assessments Do’s and Don’ts #2


ELA and Reading Assessments

You know how it is with movie sequels; the sequel rarely lives up to the promise of the original movie. However, there are exceptions, and you’re reading one 🙂

In my Do’s and Don’ts of ELA and Reading Assessments series, I began with a trailer to introduce the articles, in which I argued, “DO use comprehensive assessments, not random samples.” I followed with the first episode, in which I elaborated on the following: “DON’T assess to assess. Assessment is not the end goal. DO use diagnostic assessments. DON’T assess what you won’t teach.” Both the trailer and first episode provide some of my 15 FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. Take a look at these later, but you’ve got to read this article first and grab the FREE download.

As an ELA teacher and reading specialist, I believe in the power of ELA and reading assessments. However, as with many educational practices, appropriate use is often coupled with misuse (or even abuse); hence, the Do’s and Don’ts of ELA and Reading Assessments.

DO analyze data with others (drop your defenses).

We teachers love our independence, but it sometimes comes with a cost to our students.

My eighth-grade ELA colleague in the classroom next door has the reputation of being a fine teacher. She serves as our department chair and we’ve taught together for a dozen years. I can tell you all about her two kids and husband. Of course, I spell her once in a while for a bathroom break, but I’ve never seen her teach; nor has she seen me teach. I’ve found this scenario to be quite typical. Our classrooms are our castles. We let down the drawbridges a few times a year for administrative walk-throughs or evaluations, but rarely more than that.

Our department meetings are all business: budget, supply status, pleas to keep the workroom clean, schedules, and novel rotations. We also meet twice per month for grade-level team meetings. Again, more business with some curricular planning and the usual complaint-sharing about students, parents, the district, and administrators. Administrators want us to have common assessments, mainly to ensure consistent instruction. We do, but we get around that requirement by adding on our own assessments and making these the ones that matter. We never analyze student data, except the Common Core annual assessment (and that data is aggregated by grade-level subject, not by individual teacher). Of course, that data is out of date (months old) and so general as to be of minimal use.

At the beginning of the school year I sing the same old song: “Can’t we set aside time at each meeting to look at each other’s student work and learn from each other?” I mean assignments, essays, and unit tests… the stuff that we are now teaching. Everyone agrees we should, but we never have enough time. Why not?

We’re afraid.

What if she finds out that I’m just a mediocre teacher? What if he finds out that I have no clue about how to teach grammar? What if they discover that I really don’t differentiate instruction, though I have a reputation for doing so? Would I be able to or willing to change how I teach? My colleagues aren’t my bosses.

It’s time we take some risks and let the assessment data do the talking. None of us is as good or bad as we think. Everyone has something to contribute and something to learn. We need different perspectives on analyzing data; looking solely at your own data without comparison to others’ may lead to inaccurate judgments and faulty instruction.

Let’s drop our defenses and let our colleagues into our professional lives. Data analysis as a community of professional educators can produce satisfying results and help us grow as professionals.

DON’T assess what you can’t teach.

When teachers sit down and brainstorm what baseline assessments to give at the start of the school year, someone invariably suggests a reading comprehension test and a writing sample. I chime in with a mechanics test. Here’s why my suggestion makes sense and my colleague’s does not.

A mechanics test is teachable: 9 comma rules, 7 capitalization rules, and 16 rules for italics, underlining, quotation marks, and the like. A reading comprehension test and a writing sample are not. Check out my article, Don’t Teach Reading Comprehension, when you have time. Suffice it to say that the latter two tests will not yield the same kind of specific data as, say, that mechanics test. Want to download that mechanics test and progress monitoring matrix? The FREE download is at the end of the article; you can teach to this assessment.

Bottom line? You don’t have time to assess for the sake of assessing. Refuse to assess what will not yield teachable data.

DO steal from others.

Teacher-constructed assessments provide the best tools. Work with colleagues to create diagnostic and formative assessments to measure student achievement, plus quick follow-up assessments designed to re-assess once you re-teach what individual students did not master the first time.

Steal exercises, activities, and worksheets from colleagues that will re-teach. No better compliment can be paid to a fellow teacher than “Would you mind making me a copy of that?”

DON’T assess what you must confess (data is dangerous).

I would add an important cautionary note to sharing assessment data. First, students do have a right to privacy. Be careful to keep data analysis in-house. On my recording matrices I suggest using student identification numbers when posting results in the classroom. Second, ill-informed parents and administrators will sometimes misuse data to make judgments about the teacher rather than the student. Lack of mastered concepts and skills could be used to accuse previous or present teachers of educational malpractice. Some administrators will cite quantitative data on evaluations to comment on lack of progress.

Teachers should be judicious and careful in publicizing data. Most parents and administrators will welcome the information, understand it in its proper context, and recognize the level of your professionalism. Set some department or team-level guidelines for data sharing and test the waters before sharing everything.

To clarify, it’s not the data that is dangerous; it’s the misuse that needs to be avoided.

That’s it for now. Some of you will jump up into the aisle to head to the lobby upon seeing “The End.” Others will relax and let the theater clear out before walking out. Make sure to purchase your ticket for the next installment of ELA and Reading Assessments Do’s and Don’ts: Episode 3 and get 15 more FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. An 87% score on Rotten Tomatoes! Here’s the preview: DO analyze both data deficits and mastery. DON’T assess what you haven’t taught. DO use instructional resources with embedded assessments. DON’T use instructional resources which don’t teach to data.


I’m Mark Pennington, ELA teacher and reading specialist. Check out my assessment-based ELA and reading intervention resources at Pennington Publishing.

Get the Diagnostic Mechanics Assessment with Recording Matrix FREE Resource:

Grammar/Mechanics, Literacy Centers, Reading, Spelling/Vocabulary, Writing

ELA and Reading Assessments Do’s and Don’ts #1

Many movie theaters now opt to sell you specific seats for a show time, rather than the traditional first-come, first-served model. Although you have to pay a premium for this advance purchase option, I think it’s worth every penny. Here’s why: if you time it right, you can show up to your assigned seat right before the start of the movie and skip the annoying previews (usually known as trailers for some reason). According to an editor on Reddit, these trailers (including commercials and warnings to “Please silence your cell phone”) average 15-20 minutes.


ELA and Reading Assessment Do’s and Don’ts: The Movie Trailer

In my Do’s and Don’ts of ELA and Reading Assessments series, I began with a trailer to introduce the articles. This preview, DO use comprehensive assessments, not random samples, focused on why teachers want quick, whole-class, comprehensive assessments which produce specific data regarding what students know and don’t know about a subject, and why normed tests and achievement tests, such as the PARCC, SBAC, and other state CCSS tests, don’t provide that data. As an enticement to read the articles (and check out my Pennington Publishing programs to teach to the assessments), I provided two assessments which meet those criteria: 1. the Alphabetic Awareness Assessment and 2. the Sight Syllables (Greek and Latin prefix and suffix) Assessment. Additionally, the respective downloads include the answers, corresponding matrices, administrative audio files, and ready-to-teach lessons.

But first, let’s take a look at the first three-part episode in the Do’s and Don’ts of ELA and Reading Assessments series: DON’T assess to assess. Assessment is not the end goal. DO use diagnostic assessments. DON’T assess what you won’t teach. Plus, wait ’til you see the FREE download (and a bonus) at the end of this article!

DON’T assess to assess. Assessment is not the end goal.

A number of years ago, our seventh and eighth-grade ELA department gathered over a number of days in the summer to plan a diagnostic assessment and curricular map to teach the CCSS grammar, usage, and mechanics standards L.1, 2, and 3. I was especially pleased with the diagnostic assessment, which covered K-6 standards, and felt that the team was finally ready to help students catch up while they keep up with grade-level standards.

By the end of the first two weeks of instruction, every ELA teacher had dutifully administered, corrected, and recorded the results of the assessment on our progress monitoring matrix. I began developing worksheets to target the diagnostic deficits and formative assessments to determine whether students had mastered these skills and concepts. I placed copies of the worksheets in our “share binder.” My students were excited to see their progress in mastering their deficits while we concurrently worked on grade-level instruction.

At our monthly team meeting, I brought my progress monitoring matrix to brag on my students. “That’s great, Mark.” “Nice work. I don’t know how you do it.” No one else had done anything with the diagnostic data.

Somehow I got up enough courage to ask, “Why did you all administer, correct, and record the diagnostic assessment if you don’t plan on using the data to inform your instruction?”

Responses included, “The principal wants us to give diagnostic assessments.” “The test did give me a feel for what my class did and did not know.” “It shows the students that they don’t know everything.” “It confirms my belief that previous teachers have not done a good job teaching, so I have to teach everything.”

Class time is too valuable to waste. Assessment is not an end in and of itself.

DO use diagnostic assessments.

Let’s face it; we all bring biases into the classroom. We assume that Student A is a fluent reader because she is in an honors class. Of course, Student B must be brilliant just like her older brother. Student C is a teacher’s kid, so she’ll be a solid writer. My assumptions have failed me countless times as I’m sure have yours.

Another piece of baggage teachers carry is generalization. We teach individuals who are in classes. We all talk about a class as if it’s one organism. “That class is a behavioral nightmare.” “That class is so mean to each other.” “It takes me twice as long to teach anything to that class.” “This class had Ms. McGuire last year. She’s our staff Grammar Nazi, so at least the kids will know their parts of speech.” We lump together individuals when we deal with groups. It’s an occupational hazard.

To learn what students know and don’t know, so that we can teach both the class and individual, we have to remove ourselves as variables to eliminate bias and generalizations. Diagnostic assessments do the trick. Wait ’til you download the FREE diagnostic assessment at the end of this article; it transformed my teaching and has been downloaded thousands of times over the years by teachers to inform their instruction.

Additionally, diagnostic assessments force us to teach efficiently. When we learn that half the class has mastered adverbs and half has not, we are forced to figure out how to avoid re-teaching what some students already know (wasting their time) while helping the kids who need to learn. As an aside, many teachers avoid diagnostic assessments because the results require differentiated or individualized instruction. Naivete is bliss. Diagnostic assessments are amazing guilt-producers.

Be an objective teacher, willing to let diagnostic data guide your instruction. Teaching is an art, but it is also a science.

DON’T assess what you won’t teach.

Many teachers begin the school year with a battery of diagnostic assessments. The results look great on paper and do impress administrators and colleagues; however, the only data that is really impressive is the data that you will specifically use to drive instruction. Gathering baseline data is a waste of time if you won’t teach to that data.

I suggest taking a hard look at the diagnostic assessments you gave last year. If you didn’t use the data, don’t do the assessment. Now, this doesn’t mean that you can’t layer on that diagnostic assessment in the spring if you are willing (and have time) to teach to the data. Diagnosis is not restricted to the fall. Teachers begin the school year with high expectations. Don’t bite off more than you can chew at once.

Additionally, more and more teachers are looking critically at the American tradition of unit-ending tests. Specifically, teachers are using unit tests as formative assessments to guide their re-teaching. Rather than prompting a personal pat on the back (if students scored at an 85% average) or a woe-is-me-I’m-a-horrible-teacher-or-my-students-are-just-so-dumb-or-the-test-was-just-too-hard response (if students scored at a 58% average), unit tests can serve an instructional purpose.

Now I know what teachers will be thinking: “We have to cover all these standards; we don’t have time to re-teach.” I’ll address this concern with a simplistic question that more than once has re-prioritized my own teaching. It really is an either-or question: Is teaching or learning more important?

For those who answer learning, don’t add to your admirable burden by assessing what you won’t teach.

That’s it for now. The credits are rolling, but keep reading because the end of the credits may have a few surprises. Purchase your ticket for the next installment of ELA and Reading Assessments Do’s and Don’ts: Episode 2 and get 15 more FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. A 92% score on Rotten Tomatoes! Here’s the preview: DO analyze data with others (drop your defenses). DON’T assess what you can’t teach. DO steal from others. DON’T assess what you must confess (data is dangerous). Check it out HERE!


I’m Mark Pennington, ELA teacher and reading specialist. Check out my assessment-based ELA and reading intervention resources at Pennington Publishing.

Get the Diagnostic Grammar and Usage Assessment with Recording Matrix FREE Resource:

Get the Grammar and Mechanics Grades 4-8 Instructional Scope and Sequence FREE Resource:


Grammar/Mechanics, Literacy Centers, Reading, Spelling/Vocabulary