May 2019
Volume XVI Issue V

 


Meaningful Data We Might Be Missing

Bruce Oliver

Bruce Oliver, the author of Just for the ASKing!, lives in Burke, Virginia. He uses the knowledge, skills, and experience he acquired as a teacher, professional developer, mentor, and middle school principal as he works with school districts across the nation. Bruce has written more than 150 issues of Just for the ASKing!  He is the author of Points to Ponder and co-author of Creating a Culture for Learning: Your Guide to PLCs and More.

At a recent meeting, a group of teachers was directed to jot down the first thing that came to mind when they heard the word “data.” When they were asked to share their responses, the following words and terms came up: scary, time-consuming, helpful, misleading, limiting, too much, burdensome, evidence-based, frustrating, growth, hurtful, overdone, and confusing. The meeting leader then led a full-group discussion during which participants were asked to share the thinking behind their responses. The teachers’ reactions included the following:

  • Some data can be helpful, but we spend too much time talking about data that doesn’t help us improve our teaching.
  • By the time we receive standardized test data results, too much time has passed for us to make good use of it.
  • Teaching is far too complex for its impact to be captured by periodic test scores.
  • I always feel my stomach tense up when our annual tests are administered, thinking about how the results can have such a strong negative impact on my life if they are not good enough.
  • When we are asked to use data to make instructional decisions, it is often confusing to know what specific steps to take.
  • The formative assessment data I gather is very helpful to me because, unlike standardized test results, I can turn right around and use it in a thoughtful manner.
  • I often wonder if there is a better use of my time.
  • If I knew that all the time I devote to data analysis would have the big payoff I hope for, I would be excited to invest the time crunching the numbers.

My next step was to see if some of the thoughts shared by the teachers coincided with articles that appeared in current literature. My Google search surfaced these titles:

  • “Standardized Tests Not Always the Best Indicator of Success”
  • “Beyond Test Scores: The Right Way to Assess Students and Schools”
  • “Are Standardized Tests a Fair Way of Evaluating Students?”
  • “Do Standardized Tests Show an Accurate View of Students’ Abilities?”
  • “Why Standardized Tests Don’t Measure Educational Quality”

Thinking about this exercise, the reactions of these classroom teachers, and recent publications, I was reminded of my data-related personal experiences as a teacher and administrator. I was left with the following questions:

  • Have we made data synonymous with only standardized tests and thus limited ourselves in the use of other information?
  • Are we asking teachers to spend too much time analyzing the wrong data?
  • Are we misusing the data we gather, and should we open our minds to other options?

With the teachers’ comments and my three questions in mind, I began delving into more literature to see if there were better ways to apply the term “data.” Here is what I found.

Troubling Analysis
A group of researchers recently attended a data meeting during which teachers held a discussion about student data. It was evident from the outset that the teachers were frustrated with the time spent discussing the data. During the session, teachers tended to attribute poor student performance to student attitudes and behavior, inattentiveness during class time, ESL status, or lack of support from home. The team members felt that the results would probably be predictable based on these factors; thus, the time spent at meetings did not add much to what they already knew.

The researchers who observed the session determined that the teachers explained the poor performances through a biased lens. The observers felt that the “subjective, negative claims” about students and their academic needs resulted in low expectations and even stereotyping. They further suggested that the teams spend more time analyzing their instructional delivery, and more specifically revising how concepts are taught, to see if doing so would lead to different results. If the teachers were honest with themselves, the result would be useful, unbiased data.

Frank Advice
A recent Education Week article by Denisa Superville drew attention to a rather unconventional procedure carried out by a Des Moines, Iowa, high school. Because the school was experiencing chronic absenteeism and low achievement scores, a group of teachers decided to meet with 100 students to see if any of the problems could be resolved. The students were encouraged to be honest with their remarks since the teachers were genuinely interested in student suggestions, i.e., real data. Among the ideas expressed by the students:

  • Make introductory lectures/remarks shorter
  • Spend more time moving around the room, checking on and interacting with students instead of remaining at their desks
  • Provide more on-the-spot feedback to individuals or groups
  • Present up front the vocabulary that will appear in a PowerPoint
  • Visit neighborhoods to learn more about student life outside the school and include more “real life” ideas in lessons
  • Allow students to serve as co-teachers (other students might be more attentive)
  • Assign more rigorous classroom tasks instead of mundane, uninteresting practices
  • Apply more “social and emotional” learning tools

This effort to improve teaching and learning by “listening to the kids” was a successful venture. As the principal pointed out, “If the relationship was strong and genuine, and they trusted the teacher, and the teacher showed interest in them, they were more apt to go to class and work as hard as they could.” The unconventional data provided by the students was a departure from traditional methodology.

Loving Connections
In her article “Assessment as an Act of Love,” Christina Torres discussed her shift from being “assessment-based” and “data-driven” to the realization that life and day-to-day reality are an entire series of data-informed actions. Torres points out, “As teachers, we’re collecting data the moment our students enter our classrooms: Do lots of students have questions about the homework? Are they calm and present, bouncing off the walls, or do we need to address any hidden tension?” She concluded that data and assessments did not have to be “cold-hearted tools” that focused on student weaknesses. Instead, her observations should help her build “deeper and more loving connections with students.” Rather than seeing data as simply points on a chart or figures on a spreadsheet, teachers should use both content-related data and social-emotional information to best support students. Torres concluded, “Discovering and supporting your students, allowing students to share their strengths, and asking them about their emotional state showed we care what they think and how they feel. Data doesn’t have to reduce students to a number, but the way we treat our students can.” Social-emotional data can provide insights that help a teacher structure learning to meet a different set of students’ needs and improve learning along the way.

Preconceived Notions
Classroom visits by administrators are a necessary and important part of their jobs. In some locations, however, visitors go into the classroom with a slanted idea about what constitutes good teaching. Much of their time is spent watching for what they believe are necessary components of a good lesson. Such practices can be damaging and can lead to erroneous conclusions. Consultant Robyn Jackson sees it this way: “When we go into a classroom with a preconceived notion of what instruction should look like, we often miss really fantastic teaching going on. It’s not about the practices as much as it’s about how those practices help students learn. I’ve seen teachers do everything ‘right’ and students are still disengaged or confused. Focus on the students. What are they doing? How are they learning?”

In follow-up conferences based on such checklists, the observer might spend time pointing out missing components that may or may not have had an impact on learning; they were simply items the observer expected to see. The result is a teacher who leaves the discussion discouraged. When leaders go into classrooms to gather information about student learning, they should do so with an open mind. The instructionally based data they gather then becomes the substance of the conversation with the teacher, and the focus becomes how the teacher structured the lesson and how it resulted in students’ learning.

Extended Learning
For decades, educators have been encouraged to use rubrics as a way of assessing projects, assignments, and other pieces of work.  Typically, when students receive their work back, they see a cumulative number or an indicator of what level their work achieved. As authors Kami Thordarson and Alyssa Gallagher see it, “Assessment procedures in most schools value the product over the process because final products are so much easier to assess.”  When teachers only assess a final product based on a rubric description, they could be ignoring important data related to student learning. Thordarson and Gallagher pose important questions about the limited use of rubrics: 

  • Aren’t students learning even if mistakes are made and the final product doesn’t hit the highest mark on the rubric?
  • Should students have an opportunity to process the feedback from the rubric, iterate their product, and try again?
  • What if we shifted from using rubrics as a “once and done” grade to using rubrics to help students understand how to improve their skills?

Viewing rubrics in a different way allows students to self-assess their skill level, measure their progress in problem solving and perseverance, reflect on their personal growth, or gather supplementary data that will help them set goals for future work. As teachers continue to use rubrics in more creative ways, they will discover data sources that may have previously gone unnoticed.

Anecdotal Evidence
Much of the data analysis that professionals do is concentrated on standardized test scores or teacher-made test grades, i.e., numbers. Another option at the educator’s disposal, one that is often neglected or ignored in our data-crazed world, is anecdotal evidence. Anecdotes are observations or explanations used to illustrate or support a claim. They are also described as “brief, revealing assessments of an individual or incident.”

Imagine a dialogue among teachers in which the focus centers on what teachers saw students do or heard students say that was a significant step in a learning sequence. Or imagine that a student shows the teacher how he was able to solve a problem or reach a solution. These kinds of data, which are not considered scientific data, can be exciting evidence of students’ progress when shared with an administrator, a fellow teacher, or a parent in a conference. When student achievement data is not reduced to an annual test outcome or a number at the top of a test, and instead includes a student’s words or actions, it is indeed an important development. The term “anecdote” is derived from the Greek meaning “unpublished.” Maybe it’s time to give more credence to unpublished data as a real source of information about student learning.

Teacher Hesitation
For a long, long time, I have been a proponent of allowing students to achieve mastery-level learning (80% or above) by giving them second chances to improve on a test or product, often after hearing feedback from their teacher. During workshops, I often share strategies that enable teachers to emphasize learning and not just grading. What I occasionally find are educators who are initially excited but who are hesitant to allow students to have second chances, primarily because the students “will not have a second chance on a standardized test.” Teachers are also reluctant to give students a second chance when districts require periodic assessments that provide data about progress to central office personnel. Again, the teacher’s fear is that students might not do as well on a district-wide test, and the teacher’s credibility about student learning will come into question. Unfortunately, it appears that the annual test score can hamper classroom practices that improve learning, build confidence, and encourage tenacity. An opportunity for a student to resubmit work provides alternative data about the student’s problem-solving skills, the ability to assess where a mistake might have occurred, and the perseverance to improve individual performance. Teachers should be encouraged, not discouraged, to use creative methods to support student growth.

Surprising Connection
As I reviewed the points I made in this issue of Just for the ASKing!, I recalled an earlier issue in which I had addressed data; that issue was titled “DATA…Beyond THE TEST.” I was surprised to see that the publication date was March 2007. What I found most interesting was that the previous issue focused on some of the same ideas I am addressing twelve years later. It also includes additional information that is just as timely today as it was then. It would be worth your time to check it out at https://justaskpublications.com/jfta/data-beyond-the-test/.

Resources and References

Thordarson, Kami and Alyssa Gallagher. “Will Design Thinking Kill the Rubric?” Education Update, March 2019.
www.ascd.org/publications/newsletters/education-update.

Torres, Christina. “Assessment as an Act of Love.” Education Update, February 2019.
https://www.ascd.org/publications/newsletters/education-update/feb19/vol61/num02/Assessment-as-an-Act-of-Love.aspx.

Superville, Denisa. “These Students Are Doing PD With Their Teachers. Their Feedback Is Candid.” Education Week, March 13, 2019.
https://www.edweek.org/ew/articles/2019/03/13/these-students-are-doing-pd-with-their.html.

Permission is granted for reprinting and distribution of this newsletter for non-commercial use only. Please include the following citation on all copies:
Oliver, Bruce. “Meaningful Data We Might Be Missing.”  Just for the ASKing! May 2019. Reproduced with permission of Just ASK Publications & Professional Development. © 2019. All rights reserved. Available at www.justaskpublications.com.