Category Archives: Classroom Instruction that Works

Using BrainPOP as an Advance Organizer

One of the most effective strategies that teachers can employ when first starting a unit is to use advance organizers to help students activate background knowledge and organize potentially confusing new information. Advance organizers can take many forms, including graphic organizers, skimming, narratives, and simply giving an overview of the content (expository advance organizers). They are given to students in advance of the learning activities to help scaffold their learning. This “front-loading” before new material is presented increases opportunities for student success, as students are able to connect to prior knowledge and organize new information more easily.

For example, a middle school teacher is beginning a unit on forces and motion, with an emphasis on types of bridges and forces that act upon a bridge. He knows that his students have studied some of these concepts before and wants to remind them of their background knowledge on the subject as well as get students personally
interested. In order to do this, the teacher will utilize several types of advance organizers as his kickoff activity.

He begins by creating a graphic organizer to help students organize the new terms and definitions they will be learning. This also serves the dual purpose of helping them to focus on what’s important while not being distracted
by taking lots of notes. In this way, the teacher is modeling teacher-prepared notes. (More about this in an upcoming blog post.)

 

[Image: blank bridge graphic organizer]

The students do several activities to help them fill out their graphic organizer. One is to watch the BrainPOP movie on Bridges. Before or after watching the movie, they might explore the intriguing facts and comic in the related FYI on BrainPOP as another means of piquing student interest and activating prior knowledge. He also uses other online and print resources to help students find the information for which they are looking. In effect, the teacher is using these resources as expository advance organizers, but using a variety of media to engage students and to speak to different learning styles.

As a narrative advance organizer, the teacher shows a video clip of “Galloping Gertie,” the Tacoma Narrows Bridge that collapsed in 1940. Students ask questions such as, “What made the bridge ‘gallop’?” and “Was this a mechanical error or did a natural disaster occur?” In this way, the teacher is again using multimedia to tell a narrative and to get students interested in the subject at hand.

By the end of the activity, the students are more familiar with key vocabulary and concepts that they will need as they progress in their studies of force and motion. They are likely far more engaged than they would have been had they simply been asked to read the opening paragraph of their textbooks. Even better, they can go back and visit the resources again or quiz themselves on basic concepts to self-assess their understanding.

Are you a BrainPOP Educator? Sign up today for BrainPOP Educators, our free professional community, where teachers can find and share innovative lesson plans, graphic organizers, video tutorials, and best classroom practices.

by Elizabeth Hubbell, Educational Technology Consultant at McREL, and Allisyn Levy, Director of BrainPOP Educators

(*Note: this post is the first of a series of collaborative posts between BrainPOP Educators and McREL’s Using Technology with Classroom Instruction that Works. These articles will be cross-posted on the McREL Blog and on BrainPOP Educators Blog.)

Finding motivation to put forth the effort

One of the research-informed strategies from the book Classroom Instruction that Works is Reinforcing Effort. Many students don’t make the connection between effort and success. Often they think that other students achieve because of luck, connections, or innate ability. If children don’t think that they have any of these factors in their favor, they may assume that they have no control over their chances for success. It is up to the community, parents, and educators to make sure children understand that effort can result in achievement. It’s at the core of the American Dream.

McREL is often called to struggling communities where motivation is a big issue. Many of these communities face the most difficult of circumstances, such as high poverty, unemployment, teen pregnancy, drug use, and violence. Yet once in a while we come upon a community that, despite the odds, is succeeding in motivating its students. Some of the schools in these communities were documented in McREL’s Schools that “Beat the Odds” report. When we look closely, we find school leaders who look for ways to motivate students. Sometimes it’s simply finding a way to reward good behavior. For instance, a high school principal in Poplar, Montana recognized that students want a safe place to socialize with friends, so she provides a supervised common area to motivate them to try harder in school. There, students can play the Wii, socialize with friends, or just relax with a book as a reward for their efforts in school.

Increasing student effort boils down to finding out what is important to students and finding ways to use it as motivation. A large-scale example of this is Project Citizen (http://www.civiced.org/index.php?page=introduction). As you can see from the embedded movie below, Project Citizen educators from around the world help students identify an interesting community problem; the students then work together to find solutions. Not every problem gets fixed, but enough do to teach the lesson that effort pays off in success.

Having positive role models also improves intrinsic motivation to succeed. If students see successful people like themselves, they believe that they can do it too. One of the best ways to provide genuine role models is to use the ones already found in your schools. For instance, a school counselor could start a peer-partner program that connects screened and trained upperclassmen from the high school with struggling students in the middle school. Or educators could put together a summer camp that motivates urban boys or girls using successful young students from their own community. An example of this is shown in the video below.

Sometimes you may need to find mentors from higher education to motivate students. In the video below, the Expanding Your Horizons program is shown to motivate girls to pursue STEM fields.

McREL is also doing work in this area. We have begun a three-year project to design and study the effects of “Cosmic Chemistry,” a two-week summer science program that encourages students to enroll in high school chemistry. In what ways have you seen schools in struggling communities beat the odds and motivate students to put forth a strong effort?

by Matt Kuhn

 

Homework and practice have a sister

In our work with districts across the country, we often find that homework and practice is a bone of contention in many schools. Many issues arise when the strategy of homework and practice (H&P) is misapplied. Sometimes H&P is too large a part of students’ total grade, enabling some students to pass a class without really showing that they know the subject. Other times H&P is not differentiated enough, so some students find the work too frustrating while others see it as a waste of their time. Furthermore, the purpose of an H&P activity might not be communicated well to students, or the activity may have little or no purposeful connection to the learning objectives at all. For instance, we have all seen busy work such as word searches assigned as homework for homework’s sake. If the teacher, and more importantly the students, cannot readily tell you why an H&P activity is important, then it probably does not have a good purpose and should not have been assigned in the first place.

But the problem I find most egregious is when there is no opportunity for feedback on H&P activities. When you practice something, you are trying to see what you are doing well and what you need to change. That requires feedback, usually from someone as skilled as or more skilled than you in the subject. This is why master teachers pair homework and practice with its sister strategy, providing feedback.

Providing feedback can be tiered to give students every opportunity to receive the guidance they need to learn. For instance, a teacher could lead students through checking the accuracy of their math homework (whole-group feedback). Then students could pair up and discuss how to solve the three practice problems that were most challenging to them (peer feedback). Next, the teacher could encourage students to revise their work based on the feedback (mastery teaching). Finally, the teacher could collect a random assignment at the end of the week for in-depth feedback (expert feedback).

In any case, students should have the opportunity for meaningful practice that includes criteria-based corrective feedback. This feedback should be both positive and negative, in that it lets students know what they are doing well, what they are not, and how to improve. Students should then have the opportunity to act on this feedback and make corrections. Sports coaches and master teachers know this practice-and-feedback loop very well. Do you have an example of a practice/feedback loop that you use with your students?

By Matt Kuhn – Curriculum & Instruction Consultant – STEM

What does teaching really look like?

I attended Colorado’s Learning 2.0 conference in February of this year, my second time participating in this lively “unconference conference.” The kinds of conversations and connections that are made at this event are, I think, the future of educational gatherings. Howard Pitler and I presented “What Does Teaching Really Look Like?” In this session, we presented data from our Power Walkthrough software. Now that we are nearly two years into this product, we have compiled data from over 27,000 walkthroughs. We are starting to get a picture of what classrooms look like during the school day and what our students are actually doing during their K-12 years.

What we are finding is startling: overwhelmingly, the primary instructional strategies that teachers are using are Practice, Cues & Questions, Nonlinguistic Representation, and Feedback. While these are all very effective strategies, those that engage students in higher-order thinking skills, such as Generating and Testing Hypotheses and Identifying Similarities and Differences, make up only a small fraction of the strategies observed.

Almost 80% of the time, students are either working individually (24%) or are in whole-group instruction (54%). This means that students are in cooperative groups, informal small groups, or pairs only 20% of the time. Considering the social nature of students, especially Millennials, this is unfortunate. Working collaboratively is increasingly becoming a necessary skill for the 21st century workplace, yet students get relatively little time to practice these skills. Teacher-directed question/answer and worksheets are two primary methods of providing evidence of learning. (See Wes Fryer’s post on Worksheets here http://www.speedofcreativity.org/2009/03/27/the-thursday-folder-and-worksheet-measured-learning/.)

The data that is being gathered is starting to paint a picture that we know all too well: due to high stakes testing and curricula that are all too often a mile long and an inch deep, teachers will quickly cue and question students, then give them practice time and feedback to learn the content or skill. There are many solutions to this problem, some more quickly implemented than others: reconstruct curricula so that students get deeper learning experiences with less content, make lecture material readily accessible online so that students come to class with the background knowledge for higher level projects (see Rethinking Homework, Part 1 of 2), and make sure that administrators can provide the support teachers need for collaborative inquiry projects with their students.

Administrators, what other “low-hanging fruit” can you think of that would help teachers to have the skills, time, and resources to make certain that higher-order thinking and project-based learning is happening regularly in classrooms? Teachers, what barriers currently keep you from doing as much collaborative, inquiry-based learning as you would like?

We welcome your comments.

*For a complete article on this topic, we invite you to read this month’s issue of Changing Schools, a free quarterly magazine written and published by McREL.

Elizabeth R. Hubbell is an Educational Technology Consultant in the Curriculum & Instruction department at McREL.

The problem with “problem solving”

There is a big difference between generating and testing hypotheses (problem solving) and doing practice problems. The difference lies in the level of critical thinking required. These two instructional strategies are often confused by instructional leaders, which isn’t surprising, since most textbooks rarely differentiate between the two.

When we train school leaders to conduct walkthroughs with McREL’s Power Walkthrough software, we make sure they can distinguish between the two. Let’s look at an example of this distinction. We might walk into a classroom and see students quietly completing a worksheet that asks them to find and correct ten “problems” containing grammatical errors. These are language arts practice problems. On the other hand, students could be contemplating possible solutions to a community problem such as homelessness in Miami. In this second example, students would be using the problem-solving process to define the problem, analyze and hypothesize multiple solutions, weigh those solutions against each other, and plan for action.

I am not implying that it is not worthwhile for students to do practice problems. Practice is important for building the foundational skills students need. Nonetheless, if we want students to think critically, we cannot be satisfied with practice problems alone. We have to build upon the foundations they lay by giving students opportunities to analyze, evaluate, and create through problem solving. Do you have an innovative example of generating and testing hypotheses through problem solving? If so, share it with us in a reply to this post.

Opening the silos of classrooms with common assessments

I had the good fortune this past school year of working with Bea Underwood Elementary teachers (Garfield County #16, Colorado) in helping them to create common assessments for their Power Indicators. Throughout the year, a core group of teachers diligently worked through identifying key standards that they wanted to commonly assess, collaborated with their grade-level teams to create activities and rubrics for assessing the students, and began the (sometimes) agonizing process of evaluating student work together so that they were all in agreement on the type of work that would earn a 1, 2, 3, or 4 on the rubric.

At our year-end meeting, the most poignant statements that the teachers made about the experience were those that talked about the critical conversations this project had spawned. One teacher remarked that one of her team’s biggest “ah-ha” moments was when they realized that they did not yet have a common language to use with students when administering the assessments. Another remarked on the many conversations she had had that year with her team regarding which skills were MOST important to assess in that particular grade. Most agreed that the experience had forced teachers to come out of their classrooms and have more collaborative conversations on student learning with their colleagues.

I believe that this one school is an example of a shift we are seeing in education: no longer are teachers expected or encouraged to do their own thing within the four walls of the classroom. A combination of technology, looking at best practices in other fields, and using data to inform instruction is positively impacting education in that teachers are exploring critical questions such as: “What’s really important?” “What really works?” and “What additional professional development do we need?”

Are there other schools or districts that are embarking on similar journeys? What have been your experiences? How has it impacted the culture of your schools? We would love to hear your thoughts.

Generating and testing hypotheses is not just for science

I’m right in the middle of facilitating a three-day workshop on Using Technology with Classroom Instruction that Works. We are just about to get to the strategy of Generating and Testing Hypotheses. Of the 30 participants, fewer than a handful have taught science. I can tell that I will need to do my best to show the power of this strategy for all content areas.

Often when we mention the words “hypotheses” and “testing” together, people automatically think we are talking about science. To be fair, we sometimes are, but not nearly as often as people think. Generating and testing hypotheses is just another way of saying “predict, then determine how good your prediction turned out.” It can be used in all sorts of teaching situations. For instance, a language arts teacher might lead students through reading a novel and ask them to predict what actions a character will take next based on what they have read so far. Then, as they read more, they discuss the accuracy of their predictions. Another example is a music teacher who teaches a unit on blues music and then has students create their own simple blues song; creating music involves making many lyrical and melodic predictions and testing them out. A final example is the social studies teacher who asks students a big question like “What would the world be like today if the Nazis had won World War II?” and then has them predict and investigate the feasibility of their predictions in a persuasive essay. Notice how the strategy tends to involve higher-level thinking skills near the upper reaches of Bloom’s Taxonomy? This is why we have to use it beyond just science class.

We could go on and on about more non-science examples, but we would like to hear from you. What non-science examples can you come up with for Generating and Testing Hypotheses?

Setting life-long objectives

I was thrilled to find this article in my ASCD SmartBrief last week on the importance of setting objectives. As one of the strategies highlighted in Classroom Instruction that Works, it’s often the first strategy we talk about during the workshop, and often one of the most important things a teacher can do to engage and motivate his or her students.

This article in particular focused on helping students see how their decisions in school impact their future lives and careers. Students often go through the motions of “going to school” without realizing that decisions they make at age ten, thirteen, or sixteen can hugely impact the options they have available by age eighteen. One question teachers often bemoan is the inevitable, “When are we ever going to use this?” If teachers can help their students understand that learning to problem-solve, work through difficulties, prioritize, and network with others will greatly impact their adult lives, then teachers can help students move beyond their sometimes naive impulse to dismiss specific skills because they may or may not need them. Instead, the experience of learning itself becomes a lifelong skill that can support students in their future endeavors.

Looking at the classroom from the other side

In a recent post on Suite101, Barbara Pytel writes about why students drop out. According to a survey of 500 recent dropouts, here are some of the reasons they decided to leave school:

• 47% said classes were not interesting
• 43% missed too many days and could not catch up
• 45% entered high school poorly prepared by their earlier schooling
• 69% said they were not motivated to work hard
• 35% said they were failing
• 32% said they left to get a job
• 25% left to become parents
• 22% left to take care of a relative
• Two-thirds said they would have tried harder if more had been expected of them.

Looking at these numbers, educators should ask why. Why did 69% of students feel unmotivated? Why did 47% feel classes were not interesting? Why did two-thirds of students feel not enough was expected of them?

Part of the answer might be found by looking inside a typical classroom. McREL has been collecting classroom observation data for over two years. Administrators and others, using McREL’s Power Walkthrough™ software, have been in K-12 classrooms from coast to coast, in 27 states, and have collected data from over 23,000 three-to-five-minute visits. What those data indicate might provide some clues to why some students drop out. Here is a picture of the “typical” classroom experience, as indicated by the Power Walkthrough data.

Students walk into a classroom and are seated in rows of desks for whole-group instruction for the majority (54%) of their day. The teacher stands in front of the room lecturing for just over 20% of the day. When the teacher isn’t lecturing, students are doing worksheets for 16% of their school day. Technology, the world students live in outside of the classroom, is used by teachers in only 22% of all lessons. Students themselves use technology in the classroom only 21% of the time. Students are engaged in kinesthetic activities in just 4% of all observations. Nearly two-thirds of observations (60%) indicate that instruction is at the lowest two levels of Bloom’s Taxonomy. Could this be a reason that two-thirds of dropouts feel not enough is expected of them?

“Yes,” you might say, “but that is high school, not elementary school. Elementary teachers have kids working in small groups and do much more hands-on activities.” Not according to the data. While the overall data indicate 54% of instruction is whole group instruction, the number for primary (K-2) classrooms is 50%. In fact, the data just doesn’t change much at all from primary through high school.

It is time to think about teaching and learning through the eyes of the student. Let’s think about designing our classrooms and our instruction for the benefit of the student, rather than the convenience of the adult. As Marc Prensky writes, “Engage Me, Or Enrage Me.”

The tie between mixed-age ability grouping and standards-based grading

When McREL delivers professional development on cooperative learning, we often talk about the tangential topic of ability grouping. The discussion is often fraught with misconceptions and strong opinions. Needless to say, it’s a controversial subject.

Lately an old form of mixed-age ability grouping has been given a closer look by schools and districts that are moving toward standards-based grading. This old form of ability grouping began in four elementary schools in Joplin, Missouri in 1954 and is known as the “Joplin Plan” (Cushenbery, 1967). Essentially, forms of the Joplin Plan include careful diagnosis of each student’s proficiency level in a given subject (reading level, math level, etc.), placement in a mixed age group at similar proficiency levels, and structural changes to the school’s schedule to accommodate teaching multiple levels/groups of different aged students. The big difference between this type of grouping and what most educators think of as ability grouping is that Joplin Plans are not based on homogeneous aptitudes in a given subject at a uniform age or grade. In other words, it does not group all of the 4th grade students “good at math” in one group. It groups students ages 9-11 that are performing at the 4th grade level together.

Now consider mixing in standards-based grading. It uses a form of assessment that blends summative and formative data based on proficiency criteria for standards of what every student is expected to know; a student’s score is compared to these benchmarks rather than ranked against a norm. It is fully expected that every child will become proficient in all areas by the end of a period. If they are not, the data will show more precisely which areas need improvement and which are at acceptable levels of proficiency. For instance, instead of a “C+” in writing, a student would have a report card with eight different areas of writing assessed by rubric score. Combine this with other indicators of proficiency, and students can be grouped more effectively into Joplin Plan structures. For instance, learning structures like these have recently been adopted by Adams County School District 50 in Colorado (see http://www.sbsadams50.org/content and “Adams 50 skips grades, lets students be pacesetters” at http://www.denverpost.com/search/ci_11280071).

I know what some of you are thinking. You’re thinking that in your experience, ability grouping has not worked and that it hurts the students at the bottom the most. You might be right, but dig deeper. Do some searching at http://scholar.google.com and find out more. Think of the psychological dynamics of traditional ability grouping compared to Joplin Plan grouping. The type of ability grouping that many researchers have discounted tends to group same-age students together by aptitude in a given topic. Thus, all of the 4th grade students who are good at math are in one group and all of the 4th grade students struggling with math are in another. Most of us intuitively know what is going to happen. The teacher will unconsciously hold lower expectations for the students in the low group. In general, the students in the low group have weaker vocabularies, work habits, and attitudes toward the subject, so intellectual discourse is weak. As students are influenced by their peers, an environment is created in which it becomes “cool” to not work hard, not know what you are doing, and dislike the subject. While the upper-level group doesn’t experience the same ill effects, those students may become elitist, perfectionist, and intolerant as they compete against the “best and the brightest.”

In a Joplin Plan group, a 9-year-old with keen verbal ability and strong work habits could be in the same class as an 11-year-old who struggles. The 11-year-old may follow the example of the 9-year-old and improve his or her performance. With varied levels of aptitude but standardized proficiency levels, intellectual discourse tends to reach a level that promotes critical thinking and improves understanding. Maturity and proficiency level determine your group, not age or aptitude. An 11-year-old could be in a 9-11, 10-12, or 11-13 year-old grouping, all of which are at the same proficiency level. Hence, if a particular 11-year-old likes to tease 9-year-olds, he or she can be put in the 11-13 grouping. Assessment is ongoing, so students can be regrouped quarterly if needed. This allows students who are behind in their proficiency to make up lost time.

My own son is in such a grouping. I volunteered in his group just the other day and was impressed with how skillfully the teacher dealt with students of different ages but similar proficiency. Still, I wonder. This is a complicated topic. Does your school use some form of a Joplin Plan and/or standards-based grading? If so, how does it work in your school? Do you think it improves student achievement for all types of students?

Written by Matt Kuhn.