Active Learning Experiment: The First 1.6 Weeks

As mentioned a while back, I'm experimenting with "active learning" techniques in my intro courses this term. Specifically, I'm doing a variant of the "Peer Instruction" method developed by Eric Mazur and others. There are a few complications imposed by our calendar/class schedule, but I'm giving it a shot, and I thought I'd report on what I'm doing and how it's going, for the benefit of readers who are interested, and on the off chance that some of my readers who are in education can give some feedback/tips/whatever.

What I'm doing: The Peer Instruction method is based on shifting the distribution of work around a little. Rather than having students' first exposure to basic new concepts come in a lecture, they read about the basic ideas and formulae in the textbook before class. Class time is spent instead on posing questions and problems that expose common misconceptions about the key ideas, to help students see where their intuition breaks down, and how to approach these problems in a more "expert" manner. A key component of this is splitting the students into groups, and having them discuss the problems with each other, and try to reach a consensus answer among the group, which they then submit, and the collective results form the basis for more discussion.

To this end, I have divided the class into groups of three (which took a good deal more work than I expected, as I will mention below), and started using Poll Everywhere to collect responses via text messages and web forms. The students get reading assignments and a smattering of short conceptual questions for homework before class (the questions are generally taken from the "exercises" embedded in the text to be read, and the assignment is handled through WebAssign, with the due date set at the start of the class period), then I do a quick recap of the main points (generally ~5min) at the start of class, then launch into questions and discussion.

What's Working Well: I had been a little nervous about the idea of asking students to do reading before class and actively participate in class-- in recent years, particularly in the engineering classes, they've shown a tendency toward lumpishness. That aspect of things is going reasonably well, though. Students are doing the WebAssign homework on time, and generally getting most of the questions right. Looking at the WebAssign results right before class has also proven illuminating: on a couple of occasions already, I've spotted areas where their answers demonstrated points of confusion, and added questions for in-class discussion to address those points.

Poll Everywhere has been pretty good, save for the one day when the wireless went out in the class, so I didn't have Internet connectivity on the tablet PC I lecture from (and Flash wasn't installed on the main computer in the front of the room, so I couldn't switch to that). Fortunately, that happened toward the end of class. Students have responded well to the idea, and while we usually don't get 100% response rates, participation has been solid.

The discussions have been pretty good, particularly for the more subtle and complex questions. It's not completely uniform-- some groups don't talk much at all, despite an effort to break those up-- but they're doing the things they're supposed to do. On the occasions when I've asked for both individual and group responses, there's been clear movement in the direction of the right answers after the group discussion, which is nice to see.

Making the groups took some work, especially since I had to do it so quickly, before I really had a good sense of the individual student personalities. I've only had them in assigned groups for a couple of classes, so the dynamics may still take some time to sort themselves out, but as a preliminary matter, they seem ok. I had a bit of a struggle with the gender issue-- some of the materials about how to run such a course specifically mention that it's best to avoid groups where men outnumber women. This posed a problem for one class, as it contained exactly three women, which meant I was either going to have a "you girls play over there" group, or one group with a sub-optimal gender balance (this was solved when another woman added the class a couple days late). I also got a "thank you for doing that" comment from one female student who noticed the pattern (in the other section, where there were more women), which was nice to hear.
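
(For the curious, the constraint itself is easy enough to automate. Here's a minimal sketch in Python -- the roster and gender tallies are made up, and it only encodes the "no group where men outnumber women" rule -- of shuffling a class into threes and rejecting arrangements that violate it. The real assignments involved a lot more judgment than this, of course.)

```python
import random

# Hypothetical roster of (name, gender) pairs -- purely illustrative data.
roster = [
    ("Student A", "F"), ("Student B", "M"), ("Student C", "M"),
    ("Student D", "F"), ("Student E", "M"), ("Student F", "M"),
    ("Student G", "F"), ("Student H", "M"), ("Student I", "M"),
    ("Student J", "F"), ("Student K", "M"), ("Student L", "M"),
]

def group_ok(group):
    """A group passes if it contains no women, or at least as many women as men."""
    women = sum(1 for _, gender in group if gender == "F")
    men = len(group) - women
    return women == 0 or women >= men

def make_groups(roster, size=3, max_tries=1000):
    """Shuffle the roster into groups of `size`, redrawing until every
    group satisfies the no-male-majority rule above."""
    for _ in range(max_tries):
        shuffled = roster[:]
        random.shuffle(shuffled)
        groups = [shuffled[i:i + size] for i in range(0, len(shuffled), size)]
        if all(group_ok(g) for g in groups):
            return groups
    raise RuntimeError("No acceptable grouping found; relax the constraint.")

for i, group in enumerate(make_groups(roster), start=1):
    print(f"Group {i}: " + ", ".join(name for name, _ in group))
```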

What Needs More Work: The big problems with this experiment all have to do with our schedule. We're trying to fit most of a semester course into ten weeks, which means that, for example, we have only one class day allotted for vectors. The active methods are necessarily somewhat slower than just lecturing, and so, five classes in, I'm already half a class behind where I nominally ought to be. This isn't a huge problem-- I can make some of it up later-- but I need to work on pacing the classes a little better.

There are a number of nice features about Poll Everywhere, particularly that it doesn't require students to buy/rent specialized "clickers" that can be lost or forgotten, but works with the cell phones that essentially every middle-class American teenager has at all times. The one downside is that it can be slow to register results. On a few occasions, I've had to spend longer than I would've liked on a given question because it was taking too long for the responses to come in. Again, this is awkward given the need to move fast, and a specialized clicker system would do a better job of providing immediate feedback.

(On yet another hand, though, Poll Everywhere does offer a "free response" option, which means I don't always have to tip them off by breaking a complex problem down into multiple-choice options. Free response seems to be even slower to register than multiple choice, but it may be useful.)

Another thing that needs a little work is that I think I'm still doing more judging of answers than I really ought to. That is, after the groups discuss the problem, and I show the poll results, the discussion tends to involve at most one or two people talking, followed by me affirming that one of them was right, and explaining why the other was wrong. That's not really ideal, but it's a hard habit to break.

The other calendar-imposed bit of awkwardness has to do with problem solving, which hasn't really come into play yet, but is starting to. Very few discussions of peer instruction methods say it explicitly, but when you dig into the details of what they do, you almost always find that they have TA-led recitation sections associated with the Peer Instruction lectures. That means they can spend the entire class period going over conceptual questions, secure in the knowledge that the class will also get an hour or so of detailed working of problems at some point.

We don't have that-- our classes meet for three "lecture" and one "lab" period per week (scare quotes because each section has a single lecture and single lab, so the actual content of those periods is somewhat flexible). There's no extra time I can dedicate to working through example problems, so I need to find a way to embed this within the main class.

This has been the rockiest part of the experiment to date. I've tried including some straightforward calculational questions among the poll items, but these generally fail to engage them very much. The bulk of the time to answer is spent plugging numbers into calculators, and the discussion tends to consist of "I got A, how about you?" I think I need to abandon those, and just leave that sort of material to the WebAssign.

Working through homework- and exam-type problems is another challenge. I'm trying to do this by breaking the problem down into discrete steps that can be polled, thus hopefully getting the students to work through the problem themselves rather than just watching me do algebra on the board. I've only tried this once so far, though, and we ran out of time (embarrassingly, because I got confused about when the class period ended, and thought I had more time left). I'm hoping that walking them through each stage once, and then following it up with an example for them to do in their groups, will get more participation, but the jury's still out on this.

One last awkward aspect of this: I'm teaching two intro course sections, but while both cover basic mechanics, they're two different flavors. One of the two is the first course in our "Integrated Math and Physics" sequence, which is about two-thirds calculus. And the calculus part of the course is very much a traditional lecture, with new concepts defined in class, and definitions written on the board, and all that.

As a result, the (entirely necessary) sales pitch for why I'm doing the active stuff in the first place had the unfortunate effect of appearing to slag off the teaching style employed for the math portion. I tried to make a distinction between the two subjects by saying that the real problem with physics is that the new concepts aren't that unfamiliar, but run up against pre-existing misconceptions, which isn't so much the case for calculus, where the new concepts are completely unfamiliar. The implication is that the active stuff is more important for physics, in order to expose and break down those misconceptions to make room for the correct version of how things work. I think that was a little too subtle, though, which is kind of a shame.

On the other hand, I suppose that will provide a really good test of what students think about the whole business, when it comes to evaluation time. There will be a very clear contrast between the new method and the old, after all...

Anyway, the upshot of all this is that I am cautiously optimistic. The Peer Instruction stuff hasn't been a complete and immediate debacle, though there are some bugs to work out. At least in these early stages, I think this might work out all right, and I'm looking forward to doing more.

The biggest problem of all, though, is that this is a ton of work, and I'm getting crushed. I need to completely re-invent my classes, not only coming up with new PowerPoints, but also creating and embedding the Poll Everywhere questions. It's a huge time sink, in a term when I have an unusually heavy teaching load, and other things going on as well (FutureSibling preparations, production work on How to Teach Relativity to Your Dog, playing with SteelyKid...). I'm going to be really, really happy when this term is over.

Way to go, Chad! I think what you're doing is great. Please stick with it long enough to work out the kinks. I'm sure you can figure out ways to ease the burden in the future.

Interesting reading. As to group formation, I've been experimenting with it for years and this is what I've come up with:

1. Every student has a number next to his/her name on the register, so I use the random number generator at http://www.psychicscience.org/random.aspx (I know, I know -- Psychic Science... but it has a good number generator) to make a random list of numbers daily or weekly. Students come into the class, see their numbers up on the screen, and get into the appropriate groups. Obviously, these numbers can be tweaked to keep some students apart or put others together (see the sketch after this list).

2. I change the groups regularly. In my experience, leaving the same students together makes them too friendly, which interferes with tasks because they get off topic and talk about personal life (of course, in smaller classes it means that at the end of the semester I have a class of people who are reasonably good friends).
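
If you'd rather not rely on the web generator, the same shuffle-and-tweak idea is easy to script. Here's a rough Python sketch (the roster size, group size, and "keep apart" pairs are all invented for illustration): redraw the groups each day, and reject any draw that puts a flagged pair together.

```python
import random

# Illustrative roster: student numbers 1-18 from the register (invented here).
roster = list(range(1, 19))
GROUP_SIZE = 3

# Pairs of student numbers to keep in separate groups (also invented).
keep_apart = [(3, 7), (11, 12)]

def daily_groups(roster, size, keep_apart, max_tries=1000):
    """Reshuffle the roster into groups of `size`, redrawing until no
    'keep apart' pair ends up in the same group."""
    for _ in range(max_tries):
        shuffled = roster[:]
        random.shuffle(shuffled)
        groups = [set(shuffled[i:i + size]) for i in range(0, len(shuffled), size)]
        if not any(a in g and b in g for g in groups for a, b in keep_apart):
            return groups
    raise RuntimeError("Couldn't satisfy the keep-apart list; loosen it.")

# Post this at the start of class; rerun daily or weekly for fresh groups.
for i, group in enumerate(daily_groups(roster, GROUP_SIZE, keep_apart), start=1):
    print(f"Group {i}: {sorted(group)}")
```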

I can understand your pain with creating materials from scratch, and I don't know what you include in your PowerPoint slides, but the new, research-backed trend is that less is more. That means each slide has a minimum of information, and, for my money, this really speeds up making PowerPoint shows (or Prezi shows). See "Slide:ology" by Nancy Duarte or "Presentation Zen" by Garr Reynolds for the how-to of it.

I look forward to reading more on your experiment.

I use colored cards instead of clickers for peer instruction. I find I can still accomplish my goals: keep students engaged, get a rough idea of progress, and have students learn from one another. I don't assign groups, as I haven't seen literature indicating that doing so is a good idea.

I tried the "clicker" method for a term, but I found that the time spent administering it was large, and it was slightly inconvenient for students. The advantages of the clickers are that you can get statistics and use the system to assign grades for the activity, neither of which is a priority for me.

I agree that calculation is generally a bad idea unless it is fairly simple.

Hi Chad,

Great article! Thank you for outlining your methods and the implementation of Poll Everywhere in such detail.

Regarding response time, I suspect you will see an improvement as your students become more familiar with the text message response format. The speed at which some people can text is somewhat amazing. The answers to both Multiple Choice and Free Text (open-ended) polls should appear nearly instantaneously on screen after being submitted. The delay may be an internet connection issue on your side, but I think it's more likely due to server load during "peak hours" on our end. The good news is our engineers have made some huge strides, and I believe this week you will notice marked improvement in the speed of the site and response processing.

I'm glad you're optimistic about your approach and can't wait to hear more as the semester progresses. In terms of workload, I'd be happy to share some tips for speedier poll creation. Send me an email at steve@polleverywhere.com.

Thanks!

Steve
steve@polleverywhere.com

Duane, you may want to try random.org for random number generation. The site can also generate almost any other kind of random selection you might need.

An alternative to slow response with Poll Everywhere (or a backup if something goes down) is to get a 4'x8' sheet of melamine (whiteboard) cut into 2'x2' squares. Each group can write their answer with a dry erase marker and hand it to you, or put it on the chalk tray wrong-side-out until you are ready for a compare/contrast discussion.

This would also work better than texting for problem setup (equations) and drawing free-body diagrams, etc.

By CCPhysicist (not verified) on 21 Sep 2011 #permalink