Archive

Posts Tagged ‘Universal Design for Learning’

ELA and Reading Assessments Do’s and Don’ts #5

Notice how movie theaters have jumped on the rewards bandwagon? Yes, we earn points on our rewards cards toward a free popcorn or soda. I’m all about the rewards, but we now have a desk drawer full of cards.

However, for my Do’s and Don’ts of ELA and Reading Assessments series, you “don’t need no stinkin’ card” (Mel Brooks’s Blazing Saddles) to get our FREE assessments, audio files, progress monitoring matrices, and lessons.

If you’ve missed one of the following must-see episodes, check it out after you watch this one.

  1. DO use comprehensive assessments, not random samples.
  2. DON’T assess to assess. Assessment is not the end goal.
  3. DO use diagnostic assessments.
  4. DON’T assess what you won’t teach.
  5. DO analyze data with others (drop your defenses).
  6. DON’T assess what you can’t teach.
  7. DO steal from others.
  8. DON’T assess what you must confess (data is dangerous).
  9. DO analyze both data deficits and mastery.
  10. DON’T assess what you haven’t taught.
  11. DO use instructional resources with embedded assessments.
  12. DON’T use instructional resources which don’t teach to data.
  13. DO let diagnostic data do the talking.
  14. DON’T assume what students do and do not know.
  15. DO use objective data.
  16. DON’T trust teacher judgment alone.

Now, sit back in your plushy seat and enjoy the flick. In Episode 5 we are taking a look at the following:

DO think of assessment as instruction. DON’T trust all assessment results. DO make students and parents your assessment partners. DON’T go beyond the scope of your assessments.

Wait ’til you download the featured assessment and matrix. It’s worth the wait.

DO think of assessment as instruction. 

So often teachers view assessments as extraneous got-to’s, not as integral instructional components. I’ve heard, “I got into teaching to teach, not to assess” more times than I can count.

I kindly suggest that we should re-orient our thinking. No teacher would want to use an instructional resource that provided inaccurate information. No teacher would want to hand out a worksheet that her students had already completed. No teacher would want to waste time teaching something that her students already had mastered. Yet, teachers do so all the time when they have not assessed what students know and what they don’t know.

Diagnostic and formative assessments inform our instruction. No one would trust a doctor who would write a prescription without a diagnosis. Diagnosis is part of the exam. The same is true for teaching. Assessment is an integral component of instruction.

If they know it, they will show it; if they don’t they won’t. So don’t blow it; make ’em show it.

DON’T trust all assessment results. 

Even the best of doctors will suggest a second opinion. This is sound advice for teacher diagnosticians as well. Sometimes it makes sense to use an alternative assessment to double-check what students know and what they don’t know, especially when the results seem inconsistent with other data.

When I was in fifth grade, I was pulled out of class to be tested for the gifted program. The assessment consisted of a timed test of orally delivered questions. After the second or third question, I hit upon a strategy to give me more think time. After each oral question, I asked, “What?” I got the question again and had twice as long to answer the question. I don’t remember if I qualified for the program, but I do remember being referred to the audiologist for hearing loss.

When in doubt, double-check with a different assessment.

DO make students and parents your assessment partners. 

Test data shouldn’t be secret. Both students and parents need to know what is already known and what needs to be known. Most elementary teachers share some form of data at student-parent-teacher conferences, but secondary teachers rarely do so.

My suggestion is to share both diagnostic and formative assessment data on a regular basis with students and parents. Both are encouraged and motivated by progress. Share progress monitoring matrices with your partners.

DON’T go beyond the scope of your assessments.

Good assessments are limited assessments. They test specific concepts and skills, not general ones. Teachers over-reach when they try to make assessments walk on all fours; in other words, when they make assessments prescribe generalizations or treatments beyond the scope of their applications.

For example, a student who fails to correctly punctuate an MLA citation on a unit test may not need further instruction in what and what not to cite. Or a student who does not know when and when not to drop the final silent e when adding on a suffix may not need to practice reading silent final e sound-spellings (the former is a spelling skill; the latter is a phonics skill).

Effective assessment-based instruction sticks to the limits of the assessment and does not generalize.

Glad you dropped by to watch Episode 5? Before you re-fill that unlimited re-fills popcorn on your way out, better grab your ticket for the next installment of ELA and Reading Assessments Do’s and Don’ts: Episode 6. This one could sell out! Also get 15 more FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. A 99% score on Rotten Tomatoes! Here’s the preview: DO understand that not all assessment data is helpful. DON’T rely solely on teacher observation for assessment. DO review; mastery is not permanent. DON’T just assess Common Core State Standards.

*****

I’m Mark Pennington, ELA teacher and reading specialist. Check out my assessment-based ELA and reading intervention resources at Pennington Publishing.

Get the Consonant Sounds Phonics Assessment, Audio File, and Recording Matrix FREE Resource:

Grammar/Mechanics, Literacy Centers, Reading, Spelling/Vocabulary, Writing

ELA and Reading Assessments Do’s and Don’ts #4

Assessment Do’s and Don’ts

I’ve been using a silly movie theme to weave together a series of articles for my Do’s and Don’ts of ELA and Reading Assessments series. So far I’ve offered these suggestions over the trailer and first three episodes:

  1. DO use comprehensive assessments, not random samples.
  2. DON’T assess to assess. Assessment is not the end goal.
  3. DO use diagnostic assessments.
  4. DON’T assess what you won’t teach.
  5. DO analyze data with others (drop your defenses).
  6. DON’T assess what you can’t teach.
  7. DO steal from others.
  8. DON’T assess what you must confess (data is dangerous).
  9. DO analyze both data deficits and mastery.
  10. DON’T assess what you haven’t taught.
  11. DO use instructional resources with embedded assessments.
  12. DON’T use instructional resources which don’t teach to data.

Permit me to tell a brief anecdote. As a junior in high school, I got my license on my sixteenth birthday. At last, I could take my girlfriend out on a real date! Where to go? The movies, of course. Just one problem.

Friday night was guys’ night. My group of buddies and I always got together on Friday night. When Richard called me up after school to tell me that he would pick me up at 7:00, I quickly lied and told him that I was sick. Of course, I had already called my girlfriend to ask her to go to the movies.

We were munching on popcorn, half-way through the movie, when an obnoxiously loud group of guys entered the theater. Yes… my friends. I slumped down in my seat and told my girlfriend that I needed to see all the credits before leaving. When I assumed my friends had left the theater for their next Friday night adventure, my girlfriend and I slowly made our way up to the lobby.

Richard was the first friend to greet me. Let’s just say I paid dearly for that lie.

This article’s focus?

DO let diagnostic data do the talking. DON’T assume what students do and do not know. DO use objective data. DON’T trust teacher judgment alone.

The FREE assessment download at the end of this article includes a recording matrix and two great lessons… all to convince you to check out my assessment-based ELA and reading program resources at Pennington Publishing.

DO let diagnostic data do the talking.

One of the first lessons new teachers learn is how to answer this student or parent question: “Why did you give me (him or her) a ___ on this essay, test, project, etc.?”

Of course, every veteran teacher knows the proper response (with italics for speech emphasis): “I didn’t give you (him or her) anything. You (he or she) earned it.”

A less snotty and more effective response is to reference the data. Data is objective. Changing the subjective nature of the question into an objective answer is a good teacher self-defense mechanism and gets to the heart of the issue.

Diagnostic data is especially helpful in answering why students are having difficulties in a class. Additionally, the data in and of itself offers a prescription for treatment. Going home from the doctor with a “This should go away by itself in a few weeks” or a “Just not sure what the problem is, but it doesn’t seem too serious” is frustrating. Patients want a prescription to fix the issue. Parents and students can get that prescription with assessment-based instructional resources.

One other application for both new and veteran teachers to note: A teacher approaches her principal with this request: “I need $$$$ to purchase Pennington Publishing’s Grammar, Mechanics, Spelling, and Vocabulary BUNDLE. Our program adoption does not provide the resources I need to teach the CCSS standards.”

Answer: “Not at this time.”

Instead, let diagnostic data do the talking.

“Look at the diagnostic data on this matrix for my students. They need the resources to teach to these deficits.”

Answer: “Yes (or Maybe)”

DON’T assume what students do and do not know. 

We teachers are certainly not free of presuppositions and bias. As a result, we assume what has yet to be proven. In other words, we beg the question regarding what our students know and don’t know.

“He must be smart, but just lazy.” “His older sister was one of my best students.” “They’re in an honors class; of course they know their parts of speech.” “I have to teach everything as if none of my students knows anything; I assume they are all tabula rasa (blank slates).” “You all had Ms. Peters last year, so we don’t have to teach you the structure of an argumentative essay.”

Effective diagnostic assessments eliminate the assumptions. Regarding diagnostic assessments, I always advise teachers: “If they know it, they can show it; if they don’t, they won’t.”

DO use objective data.

Not all diagnostic assessments are created equal. By design, a random sample assessment is subjective, no matter the form of sampling. Those of you who remember your college statistics class will agree.

Teachers need objective data, not data which suggests problem areas. Teachers need to know the specifics to be able to inform their instruction. For this application, objective means comprehensive.

The “objective” PARCC, SBAC, or state-constructed CCSS tests may indicate relative student weaknesses in mechanics; however, teachers want to know exactly which comma rules have and have not been mastered. Teachers need that form of objective data.

DON’T trust teacher judgment alone.

After years of teaching, veteran teachers learn to rely on their judgment (as they should). After a few more years of teaching, good teachers learn to distrust their own judgment at points. Experienced teachers look for the counter-intuitive in these complex subjects of study that we call students. What makes them tick? Kids keep our business interesting.

Diagnostic and formative assessments bring out our own errors in judgment and help us experiment to find solutions for what our students need to succeed. Assessments point out discrepancies and point to alternative means of instruction.

For example, a student may score high in reading comprehension on an untimed standards-based assessment. Also, she was in Ms. McGuire’s highest reading group last year. Most teachers would assume that she has no reading problems and should be assigned to an advanced literacy group.

Yet, her diagnostic spelling assessment demonstrates plenty of gaps in spelling patterns. A wise teacher would suspend her initial judgment and do a bit more digging. If that teacher gave the Vowel Sounds Phonics Assessment (our FREE download at the end of this article), the student might demonstrate some relative weaknesses. She may be an excellent sight-word reader who does fine with stories, but who will fall apart reading an expository article or her science textbook.

Like my dad always told me… Measure twice and cut once.

Thanks for watching Episode 4. Make sure to buy your ticket for the next installment of ELA and Reading Assessments Do’s and Don’ts: Episode 5 before you sneak out of the theater with your girlfriend or boyfriend. Also get 15 more FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. A 94% score on Rotten Tomatoes! Here’s the preview: DO treat assessment as instruction. DON’T trust all assessment results. DO make students and parents your assessment partners. DON’T go beyond the scope of your assessments.

*****

I’m Mark Pennington, ELA teacher and reading specialist. Check out my assessment-based ELA and reading intervention resources at Pennington Publishing.

Get the Vowel Sounds Phonics Assessment with Audio File and Matrix FREE Resource:

Grammar/Mechanics, Literacy Centers, Reading, Spelling/Vocabulary, Writing

ELA and Reading Assessments Do’s and Don’ts #3

Do’s and Don’ts: ELA and Reading Assessments

The thing about movie sequels is that we feel a compulsive necessity to see the next and the next because we’ve seen the first. I’d be interested to know what percentage of movie-goers who saw all three Lord of the Rings movies also watched the Hobbit prequels. My guess would be a rather high percentage.

If my theory is correct, I’d also hazard to guess that the critic reviews would not substantially alter that percentage.

Of course my hope is that I’ve hooked you on this article series and the FREE downloads 🙂 of assessments, recording matrices, audio files, and activities in order to entice you to check out my corresponding assessment-based products at Pennington Publishing.

In my Do’s and Don’ts of ELA and Reading Assessments series, I’ve offered these bits of advice so far:

  1. DO use comprehensive assessments, not random samples.
  2. DON’T assess to assess. Assessment is not the end goal.
  3. DO use diagnostic assessments.
  4. DON’T assess what you won’t teach.
  5. DO analyze data with others (drop your defenses).
  6. DON’T assess what you can’t teach.
  7. DO steal from others.
  8. DON’T assess what you must confess (data is dangerous).

But, wait… there’s more!

DO analyze both data deficits and mastery.

Students are like fixer-upper houses.

Teachers are fixers. In some sense we view our students as “as is” houses or fixer-uppers, waiting for us to determine what needs repair and updating so that we can flip them in market-ready condition to the next teacher.

Teachers should use diagnostic assessments in this way. Almost all students need to catch up while they keep up with grade-level instruction.

However, we miss some of the value of diagnostic assessments when we don’t analyze data to build upon the strengths of individual students. For example, teachers are frequently concerned about the student who has high reading fluency rates, but poor comprehension. Yes, some students are able to read quickly with minimal miscues, but understand and retain little of what they have read. Just weird, right?

Looking only at the diagnostic deficit (lack of comprehension) might lead the teacher to assume that the student is a sight word reader in need of extensive decoding practice to shore up this reading foundation. However, if we look at the relative strength (fluency), we might prescribe a different treatment to build upon that strength. The student may well have some decoding deficits, but if she is able to recognize the words, it makes sense to use that ability to teach her how to internally monitor text with self-questioning strategies.

Both relative strengths and weaknesses matter when analyzing student assessment data.
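For readers who keep their recording matrices in a spreadsheet or script, the deficit-plus-strength reading above can be sketched in a few lines. This is a minimal illustration, not part of any Pennington Publishing resource; the skill names, percent-correct scores, and the 80% cut score are all invented for the example:

```python
# Hypothetical recording-matrix row: one student's percent-correct by skill.
scores = {
    "fluency": 95,
    "decoding": 62,
    "comprehension": 48,
    "spelling_patterns": 55,
}

MASTERY = 80  # illustrative cut score, not a published standard

def analyze(scores, mastery=MASTERY):
    """Split a student's skills into relative strengths and deficits."""
    strengths = sorted(s for s, pct in scores.items() if pct >= mastery)
    deficits = sorted(s for s, pct in scores.items() if pct < mastery)
    return strengths, deficits

strengths, deficits = analyze(scores)
print("Build on:", strengths)    # the relative strengths to teach from
print("Remediate:", deficits)    # the deficits to target with lessons
```

Run per student, a row like this surfaces both the deficits to remediate and the strengths (here, fluency) on which to build instruction.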

DON’T assess what you haven’t taught.

Teachers love to see progress in their students. Our profession enables us to see a student go from A to B throughout the year with us as the relevant variable. Assessment data does provide us with extrinsic rewards and a self-pat-on-the-back. I love our profession!

But we have to use real data to achieve that self-satisfaction. Otherwise, we are only fooling ourselves. As the new school year begins, countless teachers will administer entry baseline assessments, designed to demonstrate student ignorance. These assessments test what students should know by the end of the year, not what they are expected to know at the beginning of the year. Often the same assessment is administered at the end of the year to determine summative progress and assess a teacher’s program effectiveness.

Resist the temptation to artificially produce a feel-good assessment program such as that. Such a baseline test affords no diagnostic data; it does not inform your instruction. It makes students feel stupid and wastes class time. The year-end summative assessment is too far removed from the baseline to measure the effects of the variables (teacher and program) upon achievement with any degree of accuracy.

Test only what has been taught to see what students have retained and forgotten.

DO use instructional resources with embedded assessments.

In my work as an ELA teacher and reading specialist at the elementary, middle school, high school, and community college levels, I’ve found that most teachers use three types of assessments:

  1. They give a few entry-level assessments, but do little if anything with the data.
  2. They give unit tests once a month, but do not re-teach or re-test.
  3. They give some form of end-of-year or term summative test (the final) with little or no review or re-teaching of the test results.

As you, no doubt, can tell, I don’t see the value in any of the above approaches to assessment. It’s not that these tests are useless; it’s that they tend to be reductive. Teachers give these instead of the tests they should be using to inform their instruction. Diagnostic assessments (as detailed in the previous section) are essential to plan and inform instruction. Also, what’s missing in their assessment plan? Formative assessments.

My take is that the best method of on-going formative assessment is with embedded assessments. I use embedded assessments to mean quick checks for understanding that are included in each lesson. Both the teacher and student need to know whether the skill or concept is understood following instruction, guided practice, and independent practice.

For example, in the FREE diagnostic assessment (with audio file), recording matrix, and lessons download at the end of this article, the lesson samples from my Differentiated Spelling Instruction programs are spelling pattern worksheets. These are remedial worksheets which students would complete if the Diagnostic Spelling Assessment indicated specific spelling pattern deficits. Each worksheet includes a writing application at the end, which demonstrates whether the student has or has not mastered the practiced spelling pattern. These are embedded assessments, which the teacher can use to determine whether additional instruction is unnecessary or required.

Use instructional materials which teach and test.

DON’T use instructional resources which don’t teach to data.

The converse of the previous section is also important to bullet point. To put things simply: Why would a teacher choose to use an instructional resource (a worksheet, a game, software, a lecture, a class discussion, an article, anything) which is not testable in some way? Of course, the assessment need not include pencil and paper; informed teacher observation can certainly include assessment of learning.

Let’s use one example to demonstrate an instructional resource which does not teach to data and how that same resource can teach to data: independent reading. This one will step on a few toes.

Instructional Resource: “Everyone take out your independent reading books for Sustained Silent Reading (SSR).” Okay, you may do Drop Everything and Read (DEAR) or Free Voluntary Reading (FVR) or…

Practice:  20 minutes of silent reading

Assessment: None

The instructional resource may or may not be teaching. We don’t know. If the student is reading well at an appropriate challenge level, the student is certainly benefiting from vocabulary acquisition. If the student is daydreaming or pretending to read, SSR is producing no instructional benefit. Following is an alternative use of this instructional resource:

Instructional Resource: “Everyone take out your challenge level independent reading books for Sustained Silent Reading (SSR), your SCRIP Comprehension Strategies Bookmarks, and your pencil for annotations (margin notes).”

Practice with Assessment: Read for 10 minutes, annotating the text. Then do a re-tell with your assigned partner for 1 minute, using the SCRIP Comprehension Strategies Bookmarks as self-questioning prompts. Partners are to complete the re-tell checklist. Repeat after 10 more minutes. The teacher randomly calls on a few readers to repeat their re-tells, plus their partners’ additions, to the entire class. If the checklists and teacher observation of the oral re-tells indicate that the students are missing, say, cause-effect relationships in their reading, the teacher should prepare and present a think-aloud lesson, emphasizing this reading strategy with practice. This practice uses data and informs the teacher’s instruction. Plus, it provides students with a purpose for instruction and holds them accountable for learning.
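Tallying those partner checklists is simple enough to automate. A sketch, assuming each checklist is recorded as the list of elements missing from a student’s re-tell; the element names, the invented data, and the 50% threshold are hypothetical, not part of the SCRIP materials:

```python
from collections import Counter

# Hypothetical data: one list of missed re-tell elements per student.
checklists = [
    ["cause-effect"],
    ["cause-effect", "main-idea"],
    [],                       # a complete re-tell: nothing missed
    ["cause-effect"],
    ["sequence"],
]

def strategies_to_reteach(checklists, threshold=0.5):
    """Return elements missed by at least `threshold` of the class."""
    misses = Counter(m for checklist in checklists for m in checklist)
    n = len(checklists)
    return [elem for elem, count in misses.most_common() if count / n >= threshold]

print(strategies_to_reteach(checklists))  # → ['cause-effect']
```

Any element that clears the threshold (here, cause-effect, missed by 3 of 5 students) becomes the focus of the next think-aloud lesson.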

Thanks for watching Episode 3. Make sure to purchase your ticket for the next installment of ELA and Reading Assessments Do’s and Don’ts: Episode 4 before you walk out of the theater. This episode will sell out fast! Also get 15 more FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. A 98% score on Rotten Tomatoes! Here’s the preview: DO let diagnostic data do the talking. DON’T assume what students do and do not know. DO use objective data. DON’T trust teacher judgment alone.

*****

I’m Mark Pennington, ELA teacher and reading specialist. Check out my assessment-based ELA and reading intervention resources at Pennington Publishing.

Get the Diagnostic Spelling Assessment, Mastery Matrix, and Sample Lessons FREE Resource:

Grammar/Mechanics, Literacy Centers, Reading, Spelling/Vocabulary, Writing

ELA and Reading Assessments Do’s and Don’ts #2

ELA and Reading Assessments

You know how it is with movie sequels; the sequel rarely lives up to the promise of the original movie. However, there are exceptions and you’re reading one 🙂

In my Do’s and Don’ts of ELA and Reading Assessments series, I began with a trailer to introduce the articles, in which I argued, “Do use comprehensive assessments, not random samples.” I followed with the first episode, in which I elaborated on the following: “DON’T assess to assess. Assessment is not the end goal. DO use diagnostic assessments. DON’T assess what you won’t teach.” Both the trailer and first episode provide some of my 15 FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. Take a look at these later, but you’ve got to read this article first and grab the FREE download.

As an ELA teacher and reading specialist, I believe in the power of ELA and reading assessments. However, as with many educational practices, appropriate use is often coupled with misuse (or even abuse); hence, the Do’s and Don’ts of ELA and Reading Assessments.

DO analyze data with others (drop your defenses).

We teachers love our independence, but it sometimes comes with a cost to our students.

My eighth-grade ELA colleague in the classroom next door has the reputation of being a fine teacher. She serves as our department chair and we’ve taught together for a dozen years. I can tell you all about her two kids and husband. Of course, I spell her once in a while for a bathroom break, but I’ve never seen her teach; nor has she seen me teach. I’ve found this scenario to be quite typical. Our classrooms are our castles. We let down the drawbridges a few times a year for administrative walk-throughs or evaluations, but rarely more than that.

Our department meetings are all business: budget, supply status, pleas to keep the workroom clean, schedules, and novel rotations. We also meet twice per month for grade-level team meetings. Again, more business with some curricular planning and the usual complaint-sharing about students, parents, the district, and administrators. Administrators want us to have common assessments, mainly to ensure consistent instruction. We do, but we get around that requirement by adding on our own assessments and making these the ones that matter. We never analyze student data, except the Common Core annual assessment (and that data is aggregated by grade-level subject, not by individual teacher). Of course, that data is out-of-date (months old) and so general as to be of minimal use.

At the beginning of the school year I sing the same old song: “Can’t we set aside time at each meeting to look at each other’s student work and learn from each other?” I mean assignments, essays, and unit tests… the stuff that we are now teaching. Everyone agrees we should, but we never have enough time. Why not?

We’re afraid.

What if she finds out that I’m just a mediocre teacher? What if he finds out that I have no clue about how to teach grammar? What if they discover that I really don’t differentiate instruction, though I have a reputation for doing so? Would I be able to or willing to change how I teach? My colleagues aren’t my bosses.

It’s time we take some risks and let the assessment data do the talking. None of us is as good or bad as we think. Everyone has something to contribute and something to learn. We need different perspectives on analyzing data; looking solely at your own data without comparison to others’ data may lead to inaccurate judgments and faulty instruction.

Let’s drop our defenses and let our colleagues into our professional lives. Data analysis as a community of professional educators can produce satisfying results and help us grow as professionals.

DON’T assess what you can’t teach.

When teachers sit down and brainstorm what baseline assessments to give at the start of the school year, someone invariably suggests a reading comprehension test and a writing sample. I chime in with a mechanics test. Here’s why my suggestion makes sense and my colleague’s does not.

A mechanics test is teachable: 9 comma rules, 7 capitalization rules, and 16 rules for italics, underlining, quotation marks, etc. A reading comprehension test and a writing sample are not. Check out my article, Don’t Teach Reading Comprehension, when you have time. Suffice it to say that the latter two tests will not yield the same kind of specific data as, say, that mechanics test. Want to download that mechanics test and progress monitoring matrix? The FREE download is at the end of the article; you can teach to this assessment.

Bottom line? You don’t have time to assess for the sake of assessing. Refuse to assess what will not yield teachable data.

DO steal from others.

Teacher-constructed assessments provide the best tools. Work with colleagues to create diagnostic and formative assessments to measure student achievement, plus quick follow-up assessments designed to re-assess once you re-teach what individual students did not master the first time.

Steal exercises, activities, and worksheets from colleagues that will re-teach. No better compliment can be paid to a fellow teacher than “Would you mind making me a copy of that?”

DON’T assess what you must confess (data is dangerous).

I would add an important cautionary note to sharing assessment data. First, students do have a right to privacy. Be careful to keep data analysis in-house. On my recording matrices I suggest using student identification numbers when posting results in the classroom. Second, ill-informed parents and administrators will sometimes misuse data to make judgments about the teacher rather than the student. Lack of mastered concepts and skills could be used to accuse previous or present teachers of educational malpractice. Some administrators will cite quantitative data on evaluations to comment on lack of progress.

Teachers should be judicious and careful in publicizing data. Most parents and administrators will welcome the information, understand it in its proper context, and recognize the level of your professionalism. Set some department or team-level guidelines for data sharing and test the waters before sharing everything.

To clarify, it’s not the data that is dangerous; it’s the misuse that needs to be avoided.

That’s it for now. Some of you will jump up into the aisle to head to the lobby upon seeing “The End.” Others will relax and let the theater clear out before walking out. Make sure to purchase your ticket for the next installment of ELA and Reading Assessments Do’s and Don’ts: Episode 3 and get 15 more FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. An 87% score on Rotten Tomatoes! Here’s the preview: DO analyze both data deficits and mastery. DON’T assess what you haven’t taught. DO use instructional resources with embedded assessments. DON’T use instructional resources which don’t teach to data.

*****

I’m Mark Pennington, ELA teacher and reading specialist. Check out my assessment-based ELA and reading intervention resources at Pennington Publishing.

Get the Diagnostic Mechanics Assessment with Recording Matrix FREE Resource:

Grammar/Mechanics, Literacy Centers, Reading, Spelling/Vocabulary, Writing

ELA and Reading Assessments Do’s and Don’ts #1

Many movie theaters are now opting to sell you specific seats for a show time, rather than the traditional first-come, first-served model. Although you have to pay a premium for this advance purchase option, I think it’s worth every penny. Here’s why: If you time it right, you can show up to your assigned seat right before the start of the movie and skip the annoying previews (usually known as trailers for some reason). According to an editor on Reddit, these trailers (including commercials and warnings to “Please silence your cell phone”) average 15-20 minutes.

ELA and Reading Assessment Do’s and Don’ts: The Movie Trailer

In my Do’s and Don’ts of ELA and Reading Assessments series, I began with a trailer to introduce the articles. This preview, Do use comprehensive assessments, not random samples, focused on why teachers want quick, whole-class, comprehensive assessments which produce specific data regarding what students know and what they don’t know about a subject, and why normed tests and achievement tests, such as the PARCC, SBAC, and other state CCSS tests, don’t provide that data. As an enticement to read the articles (and check out my Pennington Publishing programs to teach to the assessments), I provided two assessments which meet those criteria: (1) the Alphabetic Awareness Assessment and (2) the Sight Syllables (Greek and Latin prefix and suffix) Assessment. Additionally, the respective downloads include the answers, corresponding matrices, administrative audio files, and ready-to-teach lessons.

But first, let’s take a look at the first three-part episode in the Do’s and Don’ts of ELA and Reading Assessments series: DON’T assess to assess. Assessment is not the end goal. DO use diagnostic assessments. DON’T assess what you won’t teach. Plus, wait ’til you see the FREE download at the end of this article, along with a bonus.

DON’T assess to assess. Assessment is not the end goal.

A number of years ago, our seventh and eighth-grade ELA department gathered over a number of days in the summer to plan a diagnostic assessment and curricular map to teach the CCSS grammar, usage, and mechanics standards L. 1, 2, and 3. I was especially pleased with the diagnostic assessment, which covered K-6 standards, and felt that the team was finally ready to help students catch up while keeping up with grade-level standards.

By the end of the first two weeks of instruction, every ELA teacher had dutifully administered, corrected, and recorded the results of the assessment on our progress monitoring matrix. I began developing worksheets to target the diagnostic deficits and formative assessments to determine whether students had mastered these skills and concepts. I placed copies of the worksheets in our “share binder.” My students were excited to see their progress in mastering their deficits while we concurrently worked on grade-level instruction.
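To make the progress monitoring matrix idea concrete, here is a loose sketch in Python of the record-then-target workflow described above. The student names, skills, and data structure are hypothetical illustrations, not part of any Pennington Publishing resource:

```python
# Hypothetical progress-monitoring matrix: rows are students, columns are
# assessed skills; True means the diagnostic item was answered correctly.
matrix = {
    "Ana":  {"commas": True,  "apostrophes": False, "capitalization": True},
    "Ben":  {"commas": False, "apostrophes": False, "capitalization": True},
    "Cara": {"commas": True,  "apostrophes": True,  "capitalization": True},
}

def deficits(matrix):
    """Return each student's unmastered skills, to target with worksheets."""
    return {student: sorted(skill for skill, ok in results.items() if not ok)
            for student, results in matrix.items()}

print(deficits(matrix))
```

A teacher (or spreadsheet formula doing the same thing) could then assign each student only the worksheets matching his or her listed deficits, rather than re-teaching every skill to everyone.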

At our monthly team meeting, I brought my progress monitoring matrix to brag on my students. “That’s great, Mark.” “Nice work. I don’t know how you do it.” No one else had done anything with the diagnostic data.

Somehow I got up enough courage to ask, “Why did you all administer, correct, and record the diagnostic assessment if you don’t plan on using the data to inform your instruction?”

Responses included, “The principal wants us to give diagnostic assessments.” “The test did give me a feel for what my class did and did not know.” “It shows the students that they don’t know everything.” “It confirms my belief that previous teachers have not done a good job teaching, so I have to teach everything.”

Class time is too valuable to waste. Assessment is not an end in and of itself.

DO use diagnostic assessments.

Let’s face it; we all bring biases into the classroom. We assume that Student A is a fluent reader because she is in an honors class. Of course, Student B must be brilliant just like her older brother. Student C is a teacher’s kid, so she’ll be a solid writer. My assumptions have failed me countless times as I’m sure have yours.

Another piece of baggage teachers carry is generalization. We teach individuals who are in classes, yet we all talk about a class as if it’s one organism. “That class is a behavioral nightmare.” “That class is so mean to each other.” “It takes me twice as long to teach anything to that class.” “This class had Ms. McGuire last year. She’s our staff Grammar Nazi, so at least the kids will know their parts of speech.” We lump together individuals when we deal with groups. It’s an occupational hazard.

To learn what students know and don’t know, so that we can teach both the class and individual, we have to remove ourselves as variables to eliminate bias and generalizations. Diagnostic assessments do the trick. Wait ’til you download the FREE diagnostic assessment at the end of this article; it transformed my teaching and has been downloaded thousands of times over the years by teachers to inform their instruction.

Additionally, diagnostic assessments force us to teach efficiently. When we learn that half the class has mastered adverbs and half has not, we are forced to figure out how to avoid re-teaching what some students already know (wasting their time) while helping the kids who need to learn. As an aside, many teachers avoid diagnostic assessments because the results require differentiated or individualized instruction. Naivete is bliss. Diagnostic assessments are amazing guilt-producers.
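The half-mastered, half-not situation above amounts to a simple split on diagnostic scores. As a minimal sketch (names, scores, and the mastery cutoff are all assumed for illustration):

```python
# Hypothetical diagnostic results on one skill ("adverbs"): items correct out of 10.
scores = {"Ana": 9, "Ben": 4, "Cara": 10, "Dev": 5}
MASTERY_CUTOFF = 8  # assumed threshold; set to whatever your assessment defines

# Split the class into an enrichment group and a re-teach group.
mastered = sorted(s for s, score in scores.items() if score >= MASTERY_CUTOFF)
reteach = sorted(s for s, score in scores.items() if score < MASTERY_CUTOFF)
print(mastered, reteach)
```

The point of the sketch is the design choice: once the data are recorded, grouping is mechanical, so class time goes to teaching the re-teach group something new rather than re-teaching everyone.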

Be an objective teacher, willing to let diagnostic data guide your instruction. Teaching is an art, but it is also a science.

DON’T assess what you won’t teach.

Many teachers begin the school year with a battery of diagnostic assessments. The results look great on paper and do impress administrators and colleagues; however, the only data that is really impressive is the data that you will specifically use to drive instruction. Gathering baseline data is a waste of time if you won’t teach to that data.

I suggest taking a hard look at the diagnostic assessments you gave last year. If you didn’t use the data, don’t do the assessment. Now, this doesn’t mean that you can’t layer on that diagnostic assessment in the spring if you are willing (and have time) to teach to the data. Diagnosis is not restricted to the fall. Teachers begin the school year with high expectations. Don’t bite off more than you can chew at once.

Additionally, more and more teachers are looking critically at the American tradition of unit-ending tests. Specifically, teachers are using unit tests as formative assessments to guide their re-teaching. Rather than a personal pat on the back (if students scored at an 85% average) or a woe-is-me-I’m-a-horrible-teacher-or-my-students-are-just-so-dumb-or-the-test-was-just-too-hard response (if students scored at a 58% average), unit tests can serve an instructional purpose.

Now I know that teachers will be thinking, “We have to cover all these standards; we don’t have time to re-teach.” I’ll address this concern with a simplistic question that more than once has re-prioritized my own teaching. It really is an either-or question: Is teaching or learning more important?

For those who answer, learning, don’t add to your admirable burden by assessing what you won’t teach.

That’s it for now. The credits are rolling, but keep reading because the end of the credits may have a few surprises. Purchase your ticket for the next installment of ELA and Reading Assessments Do’s and Don’ts: Episode 2 and get 15 more FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. A 92% score on Rotten Tomatoes! Here’s the preview: DO analyze data with others (drop your defenses). DON’T assess what you can’t teach. DO steal from others. DON’T assess what you must confess (data is dangerous).

*****

I’m Mark Pennington, ELA teacher and reading specialist. Check out my assessment-based ELA and reading intervention resources at Pennington Publishing.

Get the Diagnostic Grammar and Usage Assessment with Recording Matrix FREE Resource:

Get the Grammar and Mechanics Grades 4-8 Instructional Scope and Sequence FREE Resource:

 

Grammar/Mechanics, Literacy Centers, Reading, Spelling/Vocabulary

ELA and Reading Assessments Do’s and Don’ts

Do’s and Don’ts of ELA and Reading Assessment

Do’s and Don’ts of Assessment: The Trailer

As an ELA and reading intervention teacher at the elementary, middle school, high school, and community college levels (I know… the grass is always greener :)), I’ve had the opportunity to learn the value of assessment-based instruction. So when a fellow teacher challenged me at a recent professional development workshop on assessment with the following rhetorical question, I answered quickly and moved on to the rest of my presentation.

She asked/stated, “Don’t you think it makes more sense to spend valuable class time teaching, rather than assessing?”

Later, I sat down at the computer to provide a more comprehensive answer. Happens to me all the time. I think of the really good answer, quip, or comeback later when the moment has passed. I came up with 52 solid reasons to support assessment-based instruction.

Now, I doubt if the teacher wanted to hear even my quick answer, let alone my 52-part answer. Don’t worry, you’ll only get the one reason in this article, but the rest will follow.

I’ve opted for a Do’s and Don’ts approach to clearly explain what does and does not “make sense” for ELA and reading assessments, but in classic movie sequel promotion, I’ll provide a cliffhanger to entice viewers to check out the next article. More Do’s and Don’ts probably won’t bring everyone back into the theater and sell more popcorn (Yes, my ELA and reading intervention resources are for sale in the lobby at https://www.penningtonpublishing.com); however, my 15 free ELA and reading assessments, with corresponding matrices, administrative audio files, and ready-to-teach lessons just might do the trick. Tell you what… I’ll kick-start this first episode with two assessment freebies. So, dim the lights because the “coming to a theater or drive-in near you” trailers are over and the feature now begins. Please silence your cell phone.

Do’s and Don’ts of ELA and Reading Assessments

1. DO use comprehensive assessments, not random samples.

As an ELA teacher and reading specialist, I certainly value random sample normed assessments. In fact, one downside of the Common Core State Standards was the replacement of nationally normed assessments. The new PARCC, SBAC, and other state iterations are criterion-referenced (the Standards) achievement tests, not statistically normed tests. For example, we used to be able to report reading comprehension and vocabulary grade-level percentiles for individual students, but no longer.

However, to be honest, the normed assessment data did not inform instruction (and frankly, the CCSS assessments do only marginally better). What both the normed and Standards-based tests provide are random samples of ability or achievement, respectively. In other words, they can accurately state, “Houston, we’ve got a problem.”

However, knowing that there is a problem is of limited value. Back in 1970, the NASA team in Houston worked round the clock to test what would and would not work to help the three Apollo 13 astronauts survive re-entry into the Earth’s atmosphere. Their specific data were informative and applicable to the astronauts. They made it home alive! (Spoiler, if you haven’t seen the movie.)

Identifying the fact that a student has a problem is not helpful data. What teachers want are comprehensive assessments which specifically determine what their kids know and do not know. The Standards-based tests may permit some ability grouping or class placements, but the data do not target instruction. Following are two quick but comprehensive small group or whole class assessments with recording matrices, which pinpoint exactly what each individual student has and has not yet mastered. I’ve included one for Pre-K, grades 1, 2, 3, reading intervention, English-language development, and special education teachers, and one example for grades 4 through adult learners.

Assessment #1: The Alphabet

It may come as a shock to secondary teachers that many older students do not yet know the alphabet. Of course, this comes as no surprise to those who work with struggling English readers. One of the most popular reading intervention programs, Read 180, includes the normed Foundational Reading Assessment. The test provides 10 items designed to measure students’ knowledge of uppercase and lowercase letter names.

Last I checked, the English alphabet has 26 letters. Teachers want to know precisely which uppercase and lowercase letters students can name, identify, match, and sequence and which ones they cannot. A comprehensive alphabetic assessment provides these data. Download it below.
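To illustrate how a 26-letter recording matrix can also set whole-class priorities, here is a loose Python sketch. The students and the letters they missed are hypothetical, and this is not the format of the downloadable assessment, just the underlying idea:

```python
from collections import Counter

# Hypothetical matrix results: per student, the set of lowercase letters
# he or she could NOT yet name on the comprehensive alphabet assessment.
unknown = {
    "Ana": {"b", "d"},
    "Ben": {"b", "d", "q", "p"},
    "Cara": set(),
}

def class_priorities(unknown):
    """Tally how many students miss each letter, most-missed letters first."""
    return Counter(letter for misses in unknown.values() for letter in misses).most_common()

print(class_priorities(unknown))
```

Here the commonly confused mirror-image letters (b, d, p, q) surface at the top of the tally, so small-group instruction can start where the most students need it, while Cara skips letter-name practice entirely.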

Assessment #2: Sight Syllables 

The Standards-based assessments may be able to accurately summarize that a student has not yet mastered sight syllable recognition of the common affixes through random sample test problems. However, from the test results we can’t learn exactly which of the common Greek and Latin prefixes and suffixes a student has and has not yet mastered in terms of syllable recognition. The former doesn’t help the teacher; the latter could transform a teacher’s instruction and student learning. A comprehensive assessment on the research-based, high frequency Greek and Latin prefixes and suffixes provides these data. Download it below.

COMING ATTRACTIONS!

Enough for now. But get your ticket for the next installment of ELA and Reading Assessments Do’s and Don’ts: Episode 1 and get 15 more FREE ELA and reading assessments, corresponding recording matrices, administrative audio files, and ready-to-teach lessons. A 95% score on Rotten Tomatoes! Here’s the preview.

2. DON’T assess to assess. Assessment is not the end goal.

3. DO use diagnostic assessments.

4. DON’T assess what you won’t teach.

*****

I’m Mark Pennington, ELA teacher and reading specialist. Check out my assessment-based ELA and reading intervention resources at Pennington Publishing.

Get the Alphabet Assessment, Matrix, Activity, and Game Cards FREE Resource:

Get the Sight Syllable Greek and Latin Assessment, Matrix, Activity, and Game Cards FREE Resource:

Grammar/Mechanics, Literacy Centers, Reading, Spelling/Vocabulary, Study Skills, Writing

Don’t Teach to Learning Styles and Multiple Intelligences

Learning Styles and Multiple Intelligences

Learning Styles

Most teachers believe in some form of learning styles or multiple intelligences theory. The notion that each child learns differently, so we should adjust instruction accordingly (learning styles), just seems like good old-fashioned common sense. The theory that each child has different innate abilities (multiple intelligences) seems to be confirmed by common experience. But common sense and experience are often untrustworthy and unreliable guides to good teaching. Despite what we’ve thought or heard about why we should be teaching to learning styles and multiple intelligences, these brain-based teaching approaches are simply without merit. Here are five reasons why.

1. We don’t know enough about how the brain works to change the way we teach. What we do know about the brain suggests that catering instruction to specific modalities can be counter-productive. Knowledge is stored in the form of memories, and only 10% of those memories are visual and auditory representations. Meaning-based memories make up the other 90% (Willingham on Learning Styles Don’t Exist–TeacherTube). Those impressive-looking illustrations of the brain on the Universal Design for Learning site and interesting graphic organizers on the multiple intelligences sites hopelessly simplify what we know is a far more complex subject. Daniel T. Willingham, cognitive psychologist and neuroscientist at the University of Virginia, advises districts, schools, and teachers to “save your money” on any brain-based instructional in-services or instructional resources. See Willingham’s excellent YouTube video on the fallacy of brain-based instruction. Another great one is a Ted Talk by Tesia Marshik, Assistant Professor of Psychology at the University of Wisconsin.

2. Research does not support adjusting instruction according to learning styles or multiple intelligences theories. To sum up his extensive meta-analysis of modality research, Willingham states, “…we can say that the possible effects of matching instructional modality to a student’s modality strength have been extensively studied and have yielded no positive evidence. If there was an effect of any consequence, it is extremely likely that we would know it by now (American Educator 1995).” With respect to research on multiple intelligences, “The fundamental criticism of MI theory is the belief by scholars that each of the seven multiple intelligences is in fact a cognitive style rather than a stand-alone construct (Morgan, 1996). Morgan (1996) refers to Gardner’s approach of describing the nature of each intelligence with terms such as abilities, sensitivities, skills and abilities as evidence of the fact that the ‘theory’ is really a matter of semantics rather than new thinking on multiple constructs of intelligence (http://www.indiana.edu/~intell/mitheory.shtml).” Frankly, the essential variables of motivation, preference, teacher perception, and the learning tasks themselves probably cannot ever be isolated in an experimental design, thus prohibiting statistically significant conclusions regarding how students learn best and how teachers should teach.

3. Learning styles and multiple intelligences theories beg the question about how students learn. The assumption is that students learn best by receiving instruction in their strongest modality or intelligence. This may make sense for designated hitters in the American League. Allow me to explain. In the American League, pitchers rarely bat; instead, designated hitters bat for them. The designated hitter does not play in the field. It would make sense for the designated hitter to practice according to his modality strength. Developing kinesthetic expertise in slugging home runs will earn him his multi-millions. But exclusive kinesthetic batting practice will not help him become a better fielder. There is no learning transfer. We certainly don’t want designated hitters in our classrooms. We want students to be complete ballplayers. In fact, it makes more sense to practice our relative weaknesses. Why should kinesthetically adept Johnny continue to make project after project rather than practicing in his areas of relative weakness: oral (auditory, aural) and written (visual) communication?

4. By emphasizing the how of instruction, learning styles and multiple intelligences practitioners lose sight of the what of instruction and tend to force square blocks into round holes. For teaching input to be processed and stored in the memory, that input has to match how the information will be stored. Little of what we teach will be stored as visual or auditory representations. This does not mean that good teaching won’t use the visual or auditory domains, but the focus of almost all of our instruction is meaning-based. We want our students to know stuff. We have to match the how of instruction to the what of instruction, not the reverse. “All students learn more when content drives the choice of modality (Willingham in American Educator 1995).” It should go without saying that if a child has, for example, an auditory processing disability, the how of instruction should be limited in that modality. Similarly, adapting learning tasks to perceived student intelligences is impractical for the vast majority of our teaching standards. A student with musical intelligence still needs meaning-based practice to understand the roles of the executive, legislative, and judicial branches of government.

Magic Elixir for Reading Problems

Snake Oil Cure-All for Reading Problems

5. Although learning styles and multiple intelligences theories seem individual-centered and egalitarian on the surface, the converse is more likely true. The practical applications of these theories tend to pigeon-hole students and assume that nature plays a greater role in learning than does nurture. For example, teachers disproportionately tend to label African-American children, especially boys, as kinesthetic learners, while Asian students are more often classified as visual learners. Being labeled limits options and dissuades effort and exploration. Learning styles and multiple intelligences assessments particularly have this egregious effect. Our students are not stupid. Labeling them as “good at” and “has strengths in” also labels them as “bad at” and “has weaknesses in.” Students “shut down” to learning or “self-limit” their achievement with such labels. If limited to what the students know and don’t yet know, assessment data can be productive. If extended to how students learn, data can be debilitating. Additionally, who is to say that how a student learns remains a constant? Teachers certainly have an important role in nurturing motivation, risk-taking, and exploration. Teachers should be about opening doors, not closing doors.

Unfortunately, the differentiated instruction movement has largely adopted learning style and multiple intelligence theories. Check out why differentiated instruction should be more about the what and less about the how in 23 Myths of Differentiated Instruction. As we move ahead in the Response to Intervention process, this subject of how to best serve students with learning challenges is especially relevant. Readers may also wish to check out the author’s introductory article: Learning Styles Teaching Lacks Common Sense.

When we talk about differentiating instruction for struggling readers, we need to allow the data to drive our instruction. Good assessments can provide the what that must be learned by each student. The how may be small group instruction, guided reading, readers workshop, literacy centers, individual tutoring, and/or direct instruction. A variety of instructional methodologies work well, but they must be informed by data.

Mark Pennington, MA Reading Specialist, is the author of the comprehensive reading intervention curriculum, Teaching Reading Strategies. Designed to significantly increase the reading abilities of students ages eight through adult within one year, the curriculum is decidedly un-canned, is adaptable to various instructional settings, and is simple to use--a perfect choice for Response to Intervention tiered instruction. The program provides multiple-choice diagnostic reading and spelling assessments (many with audio files), phonemic awareness activities, blending and syllabication activities, phonics workshops with formative assessments, 102 spelling pattern worksheets, comprehension worksheets, multi-level fluency passages recorded at three different reading speeds and accessed on YouTube, 644 reading, spelling, and vocabulary game cards, posters, activities, and games.

Also get the accompanying Sam and Friends Guided Reading Phonics Books. These 54 decodable eBooks (includes print-ready and digital display versions) have been designed for older readers with teenage cartoon characters and plots. Each book introduces focus sight words and phonics sound-spellings aligned to the instructional sequence found in Teaching Reading Strategies. Plus, each book has a 30-second word fluency to review previously learned sight words and sound-spelling patterns, five higher-level comprehension questions, and an easy-to-use running record. Your students will love these fun, heart-warming, and comical stories about the adventures of Sam and his friends: Tom, Kit, and Deb. Oh, and also that crazy dog, Pug.

Teaching Reading Strategies and Sam and Friends Guided Reading Phonics Books BUNDLE

Teaching Reading Strategies and Sam and Friends Guided Reading Phonics Books

Or why not get both programs as a discounted BUNDLE? Everything teachers need to teach an assessment-based reading intervention program for struggling readers is found in this comprehensive curriculum. Ideal for students reading two or more grade levels below current grade level, tiered response to intervention programs, ESL, ELL, ELD, and special education students. Simple directions, YouTube training videos, and well-crafted activities truly make this an almost no-prep curriculum. Works well as a half-year intensive program or full-year program.

Reading, Study Skills

Learning Styles Teaching Lacks Common Sense

Learning Styles and Multiple Intelligences

Learning Styles

Different strokes for different folks. What works for you doesn’t necessarily work for me. These sayings appeal to our American ideals of individualism and equality, don’t they? And they certainly seem to apply to how we think we should teach. Our assumption is that we all learn differently so good teachers should adjust instruction to how students learn. Specifically, we assume that some students are better auditory (or aural) learners, some are better visual learners, and some are better kinesthetic learners. Or add additional modalities or intelligences to the list, if you wish. All we need to do to maximize learning is to adjust instruction to fit the modality that best matches the students’ learning styles or intelligences. It just seems like good old-fashioned common sense.

However, common sense is not always a trustworthy or reliable guide. Galileo once challenged Aristotle’s wisdom and the popular consensus of two millennia that objects fall at different rates, depending upon their bulk. Galileo climbed to the top of the leaning tower of Pisa and dropped a tiny musket ball and a huge cannonball at the same time. Defying common sense, those objects reached the ground at the same time. Even today, ask most people whether a nickel or a computer would hit the ground first. Most would still pick the computer.

Leaning Tower of Pisa

Teachers encounter counter-intuitive examples in teaching all the time: a not-so-bright student whose parents both have master’s degrees, a student with high fluency but low comprehension, an administrator who has never taught in a classroom. These anomalies just don’t make sense, but they happen quite frequently. In fact, before recent IDEA legislation, students with demonstrated learning problems could not qualify for special education unless there was an established discrepancy between ability and performance. In other words, unless the student’s learning disability challenged our notions of common sense, the student could not qualify for special education services.

Most teachers will say that they believe in some form of learning style or multiple intelligences theory. Most will say that they attempt to adjust instruction to some degree to how they perceive students learn best. Many use modality assessments to guide their instructional decision-making. This is particularly true within the special education community. Although there probably has been some change, Arter and Jenkins (1979) found that more than 90% of special education teachers believe in modality theory. These assumptions are especially relevant as special education teachers assume lead roles in the expanded Response to Intervention models, especially with respect to the three-tiered instructional model.

But these common sense assumptions are simply wrong for the most part. To understand why, we need to define our terms a bit. When we talk about how our students learn we need to consider three components of the learning process. First, the learner accesses input, that is teaching, through sensory experiences. Next, the learner makes meaning of and connects that new input to existing knowledge and experience. Finally, that learner stores this input into the short and long term memories.

Now, this learning process is not the same as knowledge. Learning (the verb) leads to knowledge (the noun). And knowledge is not how students learn. Knowledge is what students learn. Knowledge is stored in the memory. Knowledge = memory. Memory includes everything and excludes nothing. It even includes learning how to learn. We have no separate data bases.

So how is knowledge (memory) stored in the brain? According to cognitive scientists, 90% of the memory is meaning-based. Only 10% of the memory consists of visual or auditory representations (Willingham 2009). These percentages do reflect what we teach. Most everything we teach is meaning-based. So, shouldn’t we focus our teaching energies on matching how we teach to how the knowledge is stored?

Auditory Memory

Let’s start with the 10%. If knowledge will be stored as an auditory memory, teaching should emphasize this modality. For example, if band students are learning how to tune their instruments, they need to listen to and practice hearing the sound waves, not necessarily see a spectrograph or understand the complexities of how sound is produced. Or if students are learning to read with inflection, they need to hear good models of inflection and mimic those models. Both sound waves and reading inflection knowledge are stored primarily as auditory memories. To tune their instruments, band students will access their auditory memories of wave sounds and apply this knowledge to raising or lowering the pitch of their instruments. To read with inflection, students will recall the rhythm, emphasis, and altered voices of modeled readings and apply this knowledge to reading in front of the class.

Visual Memory

And now the balance of the 10%. If knowledge will be stored as a visual memory, teaching should emphasize this modality. For example, if art students are learning the color spectrum, they need to see and practice the colors with their various hues, not just memorize ROY G BIV (red, orange, yellow, green, blue, indigo, and violet). Or if students are memorizing the locations of the states, they will need to see and practice their shapes, sizes, and relationships to other states on political and/or physical maps. Both colors and the locations of states are stored primarily as visual memories. To draw an apple from memory, art students will access their visually stored memories of various hues of red and/or other colors and apply this knowledge to their watercolor. To pass the map test, students will recall the images of the political and/or physical maps and correctly label the states.

Meaning-Based Memory

And finally to the 90%. These meaning-based memories are stored independent of any modality: “not in terms of whether you saw, heard, or physically interacted with the information” (Willingham 2009). If knowledge will be stored in the memory as meaning, teaching should be designed to emphasize this outcome. For example, if history students are learning the three branches of the federal government and the system of checks and balances, they need to understand the meanings of the terms legislative, executive, and judicial, as well as the specific limitations of and checks on powers that the framers of the Constitution designed to ensure balance and prevent abuse. Good teaching would emphasize both rehearsal and application of this information to ensure understanding. This would, of course, necessitate using the auditory (or aural) modality. It would also certainly be appropriate to use the visual modality by drawing the three-branch tree with each branch representing the divisions of government. However, most of the learning process will necessitate memorizing how, what, where, when, and why facts through meaning-storage strategies and techniques (such as repetition), establishing cognitive connections to prior knowledge and experiences with plenty of appropriate examples, and practicing trial and error feedback through class discussion, reading, and writing. Whew! Complex, meaning-based stuff. On the test, students will not access memories of the teacher’s lecture voice or the teacher’s tree drawing to answer the multiple-choice questions. Students will recall meaning-based memories derived from teaching that appropriately matches the content to be learned.

If 90% of what our students learn is meaning-based, why waste limited planning and instructional time fixating on the 10%? Now that’s good old-fashioned common sense.

A little knowledge is a dangerous thing… especially in education. We teachers tend to bandwagon on many of the latest, greatest teaching trends. Remember those impressive-looking illustrations of the brain on the Universal Design for Learning site from a few years back and the interesting graphic organizers on the multiple intelligences sites? Or the brain-based strategies that were all the rage? We tend to hopelessly simplify what are complex subjects. What we know about the brain is still in its infancy. Daniel T. Willingham, cognitive psychologist and neuroscientist at the University of Virginia, advises districts, schools, and teachers to “save your money” on any brain-based instructional in-services or instructional resources. According to Willingham, meaning-based memories make up 90% of our memory. Visual and auditory memories are a small chunk of the rest. See Willingham’s excellent YouTube video on the fallacy of brain-based instruction. Another great one is a Ted Talk by Tesia Marshik, Assistant Professor of Psychology at the University of Wisconsin.

So, when we talk about differentiated instruction, especially in reading Response to Intervention, let's be careful to focus on what needs to be taught or re-taught, and less on how it needs to be taught.

Mark Pennington, MA Reading Specialist, is the author of the comprehensive reading intervention curriculum, Teaching Reading Strategies. Designed to significantly increase the reading abilities of students ages eight through adult within one year, the curriculum is decidedly un-canned, is adaptable to various instructional settings, and is simple to use: a perfect choice for Response to Intervention tiered instruction. The program provides multiple-choice diagnostic reading and spelling assessments (many with audio files), phonemic awareness activities, blending and syllabication activities, phonics workshops with formative assessments, 102 spelling pattern worksheets, comprehension worksheets, multi-level fluency passages recorded at three different reading speeds and accessed on YouTube, and 644 reading, spelling, and vocabulary game cards, posters, activities, and games.

Also get the accompanying Sam and Friends Guided Reading Phonics Books. These 54 decodable eBooks (which include print-ready and digital display versions) have been designed for older readers, with teenage cartoon characters and plots. Each book introduces focus sight words and phonics sound-spellings aligned to the instructional sequence found in Teaching Reading Strategies. Plus, each book has a 30-second word fluency assessment to review previously learned sight words and sound-spelling patterns, five higher-level comprehension questions, and an easy-to-use running record. Your students will love these fun, heart-warming, and comical stories about the adventures of Sam and his friends: Tom, Kit, and Deb. Oh, and also that crazy dog, Pug.

Teaching Reading Strategies and Sam and Friends Guided Reading Phonics Books BUNDLE


Or why not get both programs as a discounted BUNDLE? Everything teachers need to teach an assessment-based reading intervention program for struggling readers is found in this comprehensive curriculum. Ideal for students reading two or more grade levels below current grade level, tiered response to intervention programs, ESL, ELL, ELD, and special education students. Simple directions, YouTube training videos, and well-crafted activities truly make this an almost no-prep curriculum. Works well as a half-year intensive program or full-year program.

Reading, Study Skills