
McREL Blog

Our expert researchers, evaluators, and veteran educators synthesize findings from our research and blend them with best practices gathered from schools and districts around the world. The result is insightful, practical ideas for building student resiliency, closing achievement gaps, implementing retention strategies, prioritizing improvement initiatives, building staff motivation, and interpreting data and its impact, all in support of changing the odds of success for you and your students.

Opening the silos of classrooms with common assessments


I had the good fortune this past school year of working with Bea Underwood Elementary teachers (Garfield County #16, Colorado), helping them create common assessments for their Power Indicators. Throughout the year, a core group of teachers diligently identified the key standards they wanted to assess in common, collaborated with their grade-level teams to create activities and rubrics for assessing students, and began the (sometimes) agonizing process of evaluating student work together so that they all agreed on the type of work that would earn a 1, 2, 3, or 4 on the rubric.

At our year-end meeting, the most poignant statements that the teachers made about the experience were those that talked about the critical conversations this project had spawned. One teacher remarked that one of her team’s biggest “ah-ha” moments was when they realized that they did not yet have a common language to use with students when administering the assessments. Another remarked on the many conversations she had had that year with her team regarding which skills were MOST important to assess in that particular grade. Most agreed that the experience had forced teachers to come out of their classrooms and have more collaborative conversations on student learning with their colleagues.

I believe that this one school is an example of a shift we are seeing in education: no longer are teachers expected or encouraged to do their own thing within the four walls of the classroom. A combination of technology, attention to best practices in other fields, and the use of data to inform instruction is changing education for the better, pushing teachers to explore critical questions such as: “What’s really important?” “What really works?” and “What additional professional development do we need?”

Are there other schools or districts that are embarking on similar journeys? What have been your experiences? How has it impacted the culture of your schools? We would love to hear your thoughts.

Summer brain drain revisited


As part of Atlantic Monthly’s annual “ideas” issue, out this month, Derek Thompson offers up a provocative list of not-so-quick fixes for the nation’s educational system (“10 Crazy Ideas for Fixing Our Education System”). While Thompson’s list mixes solutions old and new, readers might be surprised by the suggestion topping the list: the elimination of summer vacation.

Perhaps his suggestion is a bit extreme, but Thompson’s reasoning has a basis in sound research. Several high-profile studies from the past few years have noted that achievement gaps tend to widen over the summer break. For a good summary of the reasons why, see this article in yesterday’s Washington Post. Middle-class children are more likely to have books in the home and to attend high-quality summer programs, offsetting the loss in reading skills that occurs while students are on vacation. The 2007 study cited in the article found that differences in summer experiences explained two-thirds of the achievement gap between advantaged and disadvantaged 9th graders.

Even high-quality programs are more likely to focus on reading than math, which helps explain why the reading achievement gap is the one most pronounced at summer’s end. Children (and adults) also have more difficulty retaining specific processes than basic concepts over long periods of time; compare solving a quadratic equation with reading a passage for comprehension. As a result, the greatest summer losses across the board are typically in math computation and spelling.

For more information on the effort to promote summer learning, check out the National Center for Summer Learning (www.summerlearning.org) ahead of the July 9th National Summer Learning Day. In addition to research and policy briefs, the site offers suggestions for effective programming, leveraging community partnerships, and professional development options.

Me as we


In 2004, McREL embarked on a new project by creating its first scenarios; that is, possible futures in which we considered what role our organization would play given certain political, economic, technological, and social parameters. Those scenarios became The Future of Schooling: Educating America in 2014. Since then, McREL has worked with other districts and organizations as a thinking partner as they explore their own possible scenarios. McREL’s work with the Ohio 8 Coalition, an alliance of superintendents and teacher union presidents from Ohio’s eight largest metropolitan school districts (Akron, Canton, Cincinnati, Cleveland, Columbus, Dayton, Toledo, and Youngstown), resulted in a thirty-three-minute video describing four possible futures for Ohio’s urban areas.

In the scenario planning process, members of an organization identify the two critical uncertainties that they feel will most affect their work. One of the Ohio 8’s critical uncertainties centered on whether urban areas would be thriving and growing in 2020 or in decline as more people moved to the suburbs. The other focused on whether the policy environment would be prescriptive for students or would allow flexibility in education. Crossing the two uncertainties as axes creates a Cartesian plane with four quadrants, each representing a possible scenario.

[Figure: the Ohio 8 scenario matrix formed by crossing the two critical uncertainties]

All four of these scenarios are fascinating, but I was most energized by the “Me as We” Scenario, in which urban centers are thriving, 21st century communities that have self-organized in order to help students discover and focus their education on their primary strengths and interests. In this scenario, federally-funded universal wifi access and the replacement of NCLB by individual, digital, community-involved learning plans have completely revamped education. Teachers are now seen as learning agents and innovators. High school diplomas have been replaced by a skills-based credentialing system, assessed in part by active and interested community members.

Take a look at either the whole video or just the 5-minute “Me as We” scenario. Could your organization survive in this scenario? How would we need to rethink education? Professional development? Pre-service teacher education?

The tie between mixed-age ability grouping and standards-based grading


When McREL delivers professional development on cooperative learning, we often talk about the tangential topic of ability grouping. The discussion is often fraught with misconceptions and strong opinions. Needless to say, it’s a controversial subject.

Lately, an old form of mixed-age ability grouping has been given a closer look by schools and districts that are moving toward standards-based grading. This form of grouping began in four elementary schools in Joplin, Missouri, in 1954 and is known as the “Joplin Plan” (Cushenbery, 1967). Essentially, forms of the Joplin Plan include careful diagnosis of each student’s proficiency level in a given subject (reading level, math level, etc.), placement in a mixed-age group at a similar proficiency level, and structural changes to the school’s schedule to accommodate teaching multiple levels/groups of different-aged students. The big difference between this type of grouping and what most educators think of as ability grouping is that Joplin Plans are not based on homogeneous aptitudes in a given subject at a uniform age or grade. In other words, a Joplin Plan does not put all of the 4th grade students “good at math” in one group; it groups together students ages 9–11 who are performing at the 4th grade level.

Now consider mixing in standards-based grading. It uses a form of assessment that blends summative and formative data against proficiency criteria for standards describing what every student is expected to know, and scores are set against these benchmarks rather than as a ranking against a norm. The expectation is that every child will become proficient in all areas by the end of a grading period; if they are not, the data show more precisely which areas need improvement and which are at acceptable levels of proficiency. For instance, instead of a “C+” in writing, a student would have a report card with eight different areas of writing, each assessed by rubric score. Combine this with other indicators of proficiency, and students can be grouped more effectively into Joplin Plan structures. Learning structures like these have recently been adopted by Adams County School District 50 in Colorado (see http://www.sbsadams50.org/content and “Adams 50 skips grades, lets students be pacesetters” at http://www.denverpost.com/search/ci_11280071).
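To make the contrast with a single letter grade concrete, here is a minimal sketch in Python of what such a rubric-based record might look like. The eight writing areas, the 4-point scale, and the “3 counts as proficient” cut-off are assumptions made purely for illustration, not any district’s actual rubric.

```python
# Illustrative only: the area names, the 4-point scale, and the proficiency
# cut-off of 3 are assumptions for this sketch, not an actual district rubric.
PROFICIENT_AT = 3  # rubric scores run 1-4; 3 or 4 counts as proficient here

# One student's standards-based writing record: a rubric score per area,
# rather than a single overall letter grade such as "C+".
writing_record = {
    "ideas and content": 4,
    "organization": 3,
    "voice": 3,
    "word choice": 2,
    "sentence fluency": 3,
    "conventions": 2,
    "presentation": 3,
    "writing process": 4,
}

def needs_improvement(record, cut=PROFICIENT_AT):
    """Areas scored below the proficiency cut: the detail a letter grade hides."""
    return [area for area, score in record.items() if score < cut]

def fully_proficient(record, cut=PROFICIENT_AT):
    """True only when every assessed area meets the cut."""
    return not needs_improvement(record, cut)

print(needs_improvement(writing_record))  # ['word choice', 'conventions']
print(fully_proficient(writing_record))   # False
```

The point of the sketch is simply that the record keeps the per-standard detail that a single averaged grade throws away, which is what makes the regrouping described below possible.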

I know what some of you are thinking. You’re thinking that in your experience, ability grouping has not worked and that it hurts the students at the bottom the most. If you are thinking this, you might be right, but dig deeper. Do some searching at http://scholar.google.com and find out more. Think of the psychological dynamics of traditional ability grouping compared to Joplin Plan grouping. The type of ability grouping that many researchers have discounted tends to group same-age students together by aptitude in a given subject. Thus, all of the 4th grade students who are good at math are in one group, and all of the 4th grade students struggling with math are in another. Most of us intuitively know what is going to happen. The teacher will unconsciously have lower expectations for the students in the low group. In general, the students in the low group have poor vocabularies, work habits, and attitudes toward the subject, so intellectual discourse is weak. As students are influenced by their peers, an environment is created in which it becomes “cool” not to work hard, not to know what you are doing, and to dislike the subject. While the upper-level group doesn’t experience the same ill effects, they may become elitist, perfectionist, and intolerant as they compete against the “best and the brightest.”

In a Joplin Plan group, a 9-year-old with keen verbal ability and strong work habits could be in the same class as an 11-year-old who struggles. The 11-year-old may follow the 9-year-old’s example and improve his or her performance. With varied levels of aptitude but a common proficiency level, intellectual discourse tends to promote critical thinking and improve understanding. Maturity and proficiency level determine your group, not age or aptitude: an 11-year-old could be in a 9–11, 10–12, or 11–13 year old grouping, all of which are at the same proficiency level. Hence, if a particular 11-year-old likes to tease 9-year-olds, he or she can be put in the 11–13 grouping. Assessment is ongoing, so students can be regrouped on a quarterly basis if needed, which makes it possible to recover lost time for students who are behind in their proficiency.
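For readers who like the mechanics spelled out, here is a minimal sketch in Python of placement driven by proficiency rather than age. The Student fields, the three-year band width, and the sample roster are assumptions made for illustration; real Joplin-style placement would also weigh maturity, schedules, and teacher judgment.

```python
# Illustrative sketch only: fields, band width, and sample data are assumptions.
from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Student:
    name: str
    age: int
    level: int  # current proficiency level from ongoing assessment,
                # e.g., "performing at the 4th grade level" -> 4

def eligible_bands(age, width=3):
    """Age bands a student could join, e.g., age 11 -> (9, 11), (10, 12), (11, 13)."""
    return [(start, start + width - 1) for start in range(age - width + 1, age + 1)]

def group_by_level(students):
    """Placement is driven by proficiency level, not by age or grade."""
    groups = defaultdict(list)
    for s in students:
        groups[s.level].append(s.name)
    return dict(groups)

roster = [
    Student("Ana", 9, 4),   # strong 9-year-old working at level 4
    Student("Ben", 11, 4),  # 11-year-old working at the same level
    Student("Cal", 10, 3),
]
print(group_by_level(roster))  # {4: ['Ana', 'Ben'], 3: ['Cal']}
print(eligible_bands(11))      # [(9, 11), (10, 12), (11, 13)]
# Re-running group_by_level each quarter with updated levels regroups students
# as their assessed proficiency changes.
```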

My own son is in such a grouping. I volunteered in his group just the other day and was impressed with how skillfully the teacher dealt with students of different ages but similar proficiency. Still, I wonder. This is a complicated topic. Does your school use some form of a Joplin Plan and/or standards-based grading? If so, how does it work in your school? Do you think it improves student achievement for all types of students?

Written by Matt Kuhn.