In the past 3 years, I have used Desmos Activity Builder in a number of different ways in my classroom: to introduce new ideas, as a formative assessment tool, and to allow students to “play” with mathematical ideas through Polygraph and Marbleslides. An activity I developed last year and re-built this year reminded me of two ideas I need to keep on my radar at all times. First, building an effective classroom activity is really, really hard. Second, don’t be stubborn in evaluating an activity – it pays to be brutally honest.
For my 9th grade class last year, I wanted an activity which would cause students to think about variability in data distributions, and introduce standard deviation as a useful measure of variability. You can preview the activity here. Take a few minutes, test drive it, and see if you can suss out the problems.
So, what went wrong? Well, a number of things – but here are the two primary suspects.
- It’s too damn long! It takes way too long to get to a working definition of standard deviation, and by screen 14 students are all over the place. Using student pacing could help remedy some of this, but I found much of the class losing interest by the time we got there.
- I was stubborn! I was looking for a “cute” visual way for students to think of standard deviation as “typical distance from the mean”. In my zeal to hammer this working definition home, I tried to build slick graphs which lost many students.
How to fix it? Last year, the Desmos teaching faculty developed the “Desmos Guide to Building Great Digital Activities”. It’s worth a read (and a re-read) now and then to guide activity construction. In my variability activity, this bullet point from the guide resonated with me: Keep expository screens short, focused, and connected to existing student thinking. On many of the screens, I over-explained things. Students don’t want to read when they are completing a digital activity – they want to investigate, create, and explore. I robbed them of that chance.
Today I tried my new, rebuilt variability activity with 2 classes (slimmed down to 12 screens from 19), and there was a vast improvement in class engagement. There were more opportunities for students to express their ideas regarding comparisons of distributions, and we had plenty of time to pause, recap, discuss and think about next steps. A number of points from the Desmos Guide drove my thinking:
Ask for informal analysis before formal analysis. I kept the “typical distance from the mean” definition of standard deviation, but only in a small way. Students were able to conceptualize standard deviation as a useful measure, and now we can move on to a formal definition. My old activity felt too “sledge-hammer-ish,” and I knew it.
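For readers curious how the informal and formal versions compare, here is a minimal sketch in Python. The informal “typical distance from the mean” is computed as the mean absolute deviation, while the formal (population) standard deviation is the root of the mean squared deviation; the sample data values are invented purely for illustration.

```python
import math

def typical_distance(data):
    """Informal measure: the average absolute distance from the mean."""
    mean = sum(data) / len(data)
    return sum(abs(x - mean) for x in data) / len(data)

def standard_deviation(data):
    """Formal measure: root mean squared deviation (population SD)."""
    mean = sum(data) / len(data)
    return math.sqrt(sum((x - mean) ** 2 for x in data) / len(data))

# Hypothetical class data, mean = 8
scores = [4, 6, 6, 8, 10, 14]
print(typical_distance(scores))    # ≈ 2.67
print(standard_deviation(scores))  # ≈ 3.27
```

The two numbers are close but not equal; squaring weights far-away points more heavily, which is one reason the formal definition deserves its own discussion later.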
Incorporate a variety of verbs and nouns. I provided a number of ways for students to think about variability and distribution comparisons in the early screens, and strove to build different-looking screens. This kept the ideas fresh, and students talked with their partners to assess these differences in different ways.
Create activities that are easy to start and difficult to finish. In the last 3 screens, I ask students to extend their thinking, be brave, and apply new ideas. For those students who got this far (most did), these screens elicited the loudest debates. We ran out of time at the end, but we have some good stuff to build off of tomorrow morning.
I’ve learned that it’s important to be honest about an activity. It’s easy to blame the students when something goes wrong, especially when a class heads away from learning and towards frustration. But performing an activity autopsy, focusing on clear goals, and keeping the design principles in mind helps move an activity forward.