Sunday, September 27, 2015

The Big Data Derailment


Let me start this article off on the right foot:

I do like data.

I really do.

Not as much as a Triple Chocolate Andy's Frozen Custard, but I do like it. I sit around at home with my cat and play with data for fun. (Frozen custard on the side would make it even better.)

Now that I've admitted that, I'm going to say something bold. Something that might make you curl your toes a little, because it (along with some of my other views on literacy) goes against the current of what's popular in testing, er, teaching these days.

Analyzing STAAR data is derailing literacy instruction in Texas.


Data Driven to Death


Data is good, but it's only good if you know how to read it, and the longer I watch analysis trends, the more I see data being completely misinterpreted all over the state.

We're missing the forest for the trees, folks.

Like I mentioned in a previous article, Reading isn't like the other tests. Reading tests have two components: passages and questions.

If you're only analyzing the questions and using them to draw conclusions about the quality of instruction, student ability, and the overall literacy direction of you or your school, then you could be running your train right off the rails.
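To make that concrete, here's a minimal sketch of the question-only analysis I'm describing. Everything in it is invented for illustration (the file name, the column names, the SE code); the point is what the report leaves out:

```python
import csv
from collections import defaultdict

# Hypothetical item-level results: one row per question per student.
# Columns (all invented for illustration): student_id, se_code, correct
hits = defaultdict(int)
total = defaultdict(int)

with open("staar_item_results.csv", newline="") as f:
    for row in csv.DictReader(f):
        se = row["se_code"]          # e.g. an inferencing standard
        total[se] += 1
        hits[se] += int(row["correct"])

# The classic "SE hotspot" report: lowest-performing standards first.
for se in sorted(total, key=lambda s: hits[s] / total[s]):
    print(f"{se}: {hits[se] / total[se]:.0%} correct")

# Note what this report cannot tell you: whether students missed an
# inference question because they can't infer, or because the passage
# it was attached to was written above their reading level.
```

That last comment is the whole problem: the report is accurate, but it only sees questions, never passages.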

Why are so many schools struggling so hard to catch up to current Reading standards and still failing? There are a lot of reasons, because an entire state of people and schools is vast and diverse, but I'd like to focus on just one aspect today: miscalculations.

Teachers and administrators are reading their SE hotspots correctly and even putting them into perspective by using blueprints and testing frequency charts. However, the advent of testing culture and the pressure for everyone to be 'data driven' all the time have taught districts, schools, and teachers to think about literacy like this:


We study and study and study. We discuss and discuss and discuss. We plan and plan and plan. We do it all for the sake of being data driven! We want to fix those particular SE's! We must! We're doing it for our kids' sakes, our schools' sakes, our own sakes!

But it's not working.

If it were the real answer, for all the work we're putting in over four years' worth of STAAR now, we ought to be seeing more results. Instead, more schools than ever are being classified by the state as "Improvement Required."

Why, oh, why?

Because, like I mentioned, we're missing the forest for the trees. Because of popular trends in educational jargon, we've focused on the tiles: the small pieces of what we think the problem is, based on what a testing company paid by the state (and since fired by the state) told us our problem was.

But here's the thing.

The tiles aren't literacy. They're good for analysis to an extent, but they're not enough.

Why? Because literacy isn't tiles. It isn't isolated concepts or isolated skills. All those little things: character roles, conflict, main idea, inferencing, drawing conclusions, context clues--they all link up into what literacy actually is. If we want to improve, we have to zoom out to the big picture.

Literacy is a mosaic of cognitive skills. 



Those skills that the test doesn't openly examine or quantify are what break our students if we don't address them in the classroom, even when we have no numerical test data to justify their instruction.

If you have a population of students who are not passing STAAR, don't only look at your SE's.

Also consider: 
  • Metacognition (Rigor): Can your students think about thinking?
  • Lexile/Guided Reading Levels: Can your students understand enough of the passage to have a chance of passing? (See the sketch after this list.)
  • Do your students believe in themselves? Do they think they can be good readers?
  • Fluency: Do your students read fast enough to remember what they read?
  • Decoding: Hand in hand with fluency--Do your students need serious intervention?
  • Vocabulary: Do your students know what to do when they hit a word they don't understand?
  • Background Knowledge: What are you doing to broaden the knowledge base of students whose families can't afford to take them places to gain new experiences?
  • Language Transition: What SIOP strategies are you using to help students find success in their second language?
  • Stamina: Are you giving your students time every day to read self-selected texts? Reading is a skill, not content. You don't get better at doing it by learning about it.
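For the Lexile point above, here's the kind of back-of-the-envelope check I mean: before blaming an SE, see how many of your non-passers are simply reading below the level of the passages. Every name, score, and threshold below is invented for illustration:

```python
# Hypothetical data: each student's measured Lexile and the Lexile
# of a released STAAR passage (all values invented for illustration).
students = {"Ana": 520, "Marcus": 780, "Yesenia": 640, "Tran": 900}
passage_lexile = 820  # say, a typical released passage for the grade

# Flag students whose reading level sits well below the passage.
# The 100L cushion is an arbitrary threshold, not an official cutoff.
for name, lexile in students.items():
    gap = passage_lexile - lexile
    if gap > 100:
        print(f"{name}: reads ~{gap}L below the passage -- "
              "no SE reteach will fix that; reading-level support will.")
```

If a check like this flags half your non-passers, your hotspot isn't an SE at all. It's access to the passage itself.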




2 comments:

  1. Well, I don't know much about testing standards and/or practices, but I see your point, and you really put it into perspective with the illustration. My question is: what is the solution? Is it possible for the school system or government (whoever it is that prescribes testing standards) to allow a teacher of a student, or even an associate teacher, to test each student from a qualitative perspective? You know, taking a teacher's word as to the overall skill of an individual student? Great job!
    -Mr. Hoon

  2. The solution is all the strategies I mentioned at the bottom of the page. There is an attempt to measure students on one of the indexes based on personal growth, but overall the testing machine is so tangled up in laws about what data they will and won't use and how to interpret or not interpret it that there is no answer to the problem on that side. The answer is to focus on balanced literacy and not allow education trends, or misinterpretations of trends, to trump good teaching. Good teaching always focuses on the student as a whole and then uses data to supplement, rather than vice versa.
