What’s Going On in This Graph

Today the New York Times Learning Network dropped the first “What’s Going On in This Graph?” (WGOITG) of the new school year. This feature started last year as a monthly piece, but now expands to a weekly release. In WGOITG, an infographic from a previous NYT article is shown with the title, and perhaps some other salient details, stripped away – like this week’s graph…

[This week's graph, with title and context removed]

Challenge your students to list some things they notice and wonder about the graph, and visit the NYT August post to discover how teachers use WGOITG in their classrooms. Here are some ideas I have used before with my 9th graders:

  • Have students work in pairs to write a title and lede (brief introduction) to accompany the graph.
  • Ask table groups to develop a short list of bullet-point facts supported by the graph, and share out on note cards.
  • Have students consider how color, sizing, and scaling are used effectively to support the story (note how the size of the arrows plays a role in the graph shown here). This is a wonderful opportunity to think of statistics beyond traditional graphs and measures.

Invite your students to join in the moderated conversation, which drops on Thursday. Have your own favorite way to use WGOITG? Share it in the comments!

 

 


TMC Desmos Day – Stats Session

Today I am in Cleveland for 2018 Twitter Math Camp! It’s the Desmos Pre-Conference Day, and I am facilitating a session on using Desmos in Statistics classes. Below are many of the links and resources I plan to use – even if you are not in Cleveland with us, feel free to borrow from these resources.

Baseball Data Set

Comparing Data Sets and Summary Statistics

Regression Facts (Mean/mean point and slope)

Teaching the meaning of r-squared

“Release the Hounds” – my first attempt at random sampling

Participate as a student, Steal and Share

“Backpack Weights” – thinking about scatterplots (AP Stats)

Participate as a student, Steal and Share

Assembling the Model Solution

I use College Board released AP items often in my Statistics course. The problems are aligned to clearly stated goals, and the solutions provide insight not only into how AP questions are graded, but also give students well-articulated explanations to study. You can visit the College Board Statistics website and explore. Jason Molesky’s website provides helpful guidance on using FRAPPYs (Free-Response AP Practice…Yay!) as a formative assessment tool in AP Stats.

Each free-response solution begins with the “model solution” – the ideal explanation a student would provide for full question credit. It is not unusual for Statistics students to struggle with clear communication, and having students read and dissect the model solution can be helpful in strengthening statistical arguments. A few times this year, I have used the model solution as a formative assessment tool with an activity I call “Assembling the Model Solution”.

Here’s how it works – start with an AP Free-Response question with a narrative aspect. Today, I chose a problem from 2009 which requires students to interpret a P-value:

[The 2009 AP Statistics free-response problem]

The model solution contains a number of non-negotiable elements: a conditional probability, a reference to sample results, and the “extremeness” of results.

[The model solution for the 2009 problem]
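In rough symbols (my paraphrase, not the College Board’s wording, and assuming the two-treatment, two-proportion setup we debated in class), those elements fit together as:

P-value = P( a difference in sample proportions at least as extreme as the one observed | the two treatments are equally effective )

Each piece maps to a non-negotiable element: the conditioning bar is the conditional probability, “sample proportions” is the reference to sample results, and “at least as extreme” is the extremeness.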

Next, I took the model solution and broke it into small, strategic “bites”. At the same time, I added some parallel distractors and a junk phrase or two.

[The solution slips, distractors included]

Then I used a paper cutter to slice the Word document into phrase strips and paper-clipped each set together. All students then received the problem and the slips of paper, with the challenge to assemble the model solution for part (a) of the problem.

 

The conversations were rich, and the teams mostly debated the salient aspects of the problem appropriately. The biggest points of debate and incorrect solutions came from:

  • The difference between “sample” and “population” proportions.
  • The assumption that the treatments are equally effective as the conditional aspect of the P-value.

I have used this strategy a few times now, and continue to tweak how I provide the slips of paper. I’m also looking at digital options, but I like the social aspect of moving the slips of paper. The method is not ideal for everything in AP Stats, but there are a few areas in our curriculum where this fits in nicely:

  • Sampling and experimental design
  • Conclusions for inference procedures
  • Describing distributions.

You can download my file for this activity here.  Enjoy!

  • Credit to Jon Osters and the AP Stats glitterati who rightfully pointed out that my original post spelled “Yay!” incorrectly.

Today, Nothing Worked Well…and That’s OK

My 9th grade class has a quiz on statistics concepts tomorrow – standard deviation, interpreting graphs, outliers and the normal distribution. It’s a real cornucopia of stats ideas! To review, today’s class goal was to collect class-wide data using a fun applet, share using the collaboration space in OneNote, use a website to assess the data, and write our statistical summaries. A fun day filled with stats fairies and pixie dust! Here was the lineup:

  • Collect data using Shapesplosion – an online game (think the old Perfection Game) developed by folks from Grinnell College. The plan was to play with, and without, color. Aside: it’s OK if you disappear for a while to play with this site, it’s super-fun!
  • Share data using the collaboration space on OneNote.
  • Use the artofstat.com web apps to make graphs and produce statistical summaries.

This is what I had in mind…. Here’s what really happened:

  • Shapesplosion didn’t work – it ran fine when I rehearsed the site on my laptop, but it didn’t work for the kids. It was a Flash issue, and stopping to figure this out wasn’t in the cards. After a few minutes of hemming and hawing, I settled upon a far less fun data collection idea: tell me a temperature you deem “cold” when you go outside, and one you deem “hot”. Not nearly as sexy as the time data I wanted…but hey, I needed a data set. At least we had data until…
  • ArtOfStat was glitchy and wasn’t playing nice with copy/paste from OneNote. Kids are getting restless, we haven’t done much stats review, and I am definitely starting to lose my “big” class.

So, what do you do when a lesson goes south, your objective is slowly slipping away, and the kids smell chum in the water?

Remember:

It’s not the kids’ fault when your plans go kaput. You may feel like some yelling is in order, but breathe, calm down, and be honest about what went wrong.

Student learning can’t be compromised because things go south. “There’s no time” is an easy out when we get rushed, but maintaining lesson fidelity is far more important than rushing to get to “stuff”.

Maintain clear expectations. Eventually all of my students were able to do some review, though I had to alter my plan of attack. But stopping class to make sure we were all on the same page and understood the statistical expectations was necessary.

It won’t be the last time stuff goes wrong….roll with it…and laugh along with it.

Cocoa Puffs and Shared Work

Shared work problems! What a magical time to be alive! What wonders does the magic algebra worksheet have for us to enjoy today?

[A typical textbook shared-work problem, starring a pipe organ]

OK….so most shared work problems suck. I apologize to my students aspiring to be pipe organ re-varnishers, but we can do so much better.

This week I used Cocoa Puffs, stopwatches and Desmos to bring some engagement to my rational expressions lessons. To start, each student was provided with a plate holding 30 grams of Cocoa Puffs (including the plate). After my 3-2-1 countdown, students picked Puffs one at a time from the plate and tossed them onto an empty plate. As they completed the task, times were recorded for each student.

After students finished, I had them partner up and consider the question: “if you worked together with your partner on this task, with one plate of Cocoa Puffs, how long would it take you?”

Students asked a number of clarifying questions (yes, there is one plate; yes, you can pick them off the plate together), and partnerships developed a few ideas. We debated the validity of many of them:

  • Many groups took the average of the two times, then divided the result by 2. This seemed reasonable to a number of groups, and led to a discussion of the validity of averaging rates.
  • Some groups attempted to find a rate per gram. This was a good start, but given that groups did not know the mass of the plate (I use Chinet, so it’s bulky!), this introduced some guesswork.

To steer discussion, we focused on one student who took 80 seconds to complete the task. How much of the job did they complete after 40 seconds? After 20?  Can we write a function which depends on time here?  What does it mean? Crossing the bridge from the task time (80 seconds) to the job rate (1/80 per second) is a tricky transit. Using Desmos to show the “job” function lends some clarity.

[Desmos graph of the job-completion function]
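For reference, here’s a minimal sketch of that bridge in Python (the 80-second and 60-second times are hypothetical, just for illustration): convert each solo time to a job rate, add the rates, and invert.

# Shared-work estimate: convert each solo time to a rate, add the rates, invert.
# The times below are made up; swap in your students' data.
def shared_time(t1, t2):
    """Seconds for two workers together, assuming their rates simply add."""
    combined_rate = 1 / t1 + 1 / t2   # jobs per second
    return 1 / combined_rate          # seconds per job

t1, t2 = 80, 60
print(shared_time(t1, t2))            # about 34.3 seconds
print(((t1 + t2) / 2) / 2)            # the "average, then halve" guess: 35 seconds – close, but not equivalent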

From here, many partnerships felt more comfortable with establishing their own estimates.  The next day, teams shared their work and estimates on OneNote, then peer-assessed the communication.  Some of the work was wonderful, well-communicated, and served as a model for the class to emulate.

[Sample student work]

The next day, we listed our calculated shared-work predictions on the board, and tested our estimates. Teams timed each other with cell phone stopwatches, and did not let participants see the clock until the task was complete.

 

[Predicted vs. actual shared times]

Many groups were quite close to their calculated predictions! We discussed why our predictions didn’t quite match the actual times – bumping, variability in mass, general panic – and when error is acceptable. And now we have a firm background in rates and rational functions – time to conquer those pipe organs!

 

I Built a Crappy Digital Activity…Here’s How I Fixed It.

In the past 3 years, I have used Desmos Activity Builder in a number of different ways in my classroom: to introduce new ideas, as a formative assessment tool, and to allow students to “play” with mathematical ideas through Polygraph and Marbleslides. An activity I developed last year and re-built this year reminded me of two ideas I need to keep on my radar at all times. First, building an effective classroom activity is really, really hard. Second, don’t be stubborn when evaluating an activity – it pays to be brutally honest.

For my 9th grade class last year, I wanted an activity which would cause students to think about variability in data distributions, and introduce standard deviation as a useful measure of variability. You can preview the activity here. Take a few minutes, test-drive it, and see if you can suss out the problems.

So, what went wrong? Well, a number of things – but here are the two primary suspects.

  1. It’s too damn long! It takes way too long to get to a working definition of standard deviation, and by screen 14 students are all over the place.  Using student pacing could help remedy some of this, but I found much of the class losing interest by the time we got there.
  2. I was stubborn! I was looking for a “cute” visual way for students to think of standard deviation as “typical distance from the mean”.  In my zeal to hammer this working definition home, I tried to build slick graphs which lost many students.

How to fix it – last year, the Desmos teaching faculty developed the “Desmos Guide to Building Great Digital Activities”. It’s worth a read (and a re-read) now and then to guide activity construction. In my variability activity, this bullet point from the guide resonated with me: Keep expository screens short, focused, and connected to existing student thinking. In many of the screens, I over-explained things. Students don’t want to read when they are completing a digital activity; they want to investigate, create, and explore. I robbed them of that chance.

Today I tried my new, rebuilt variability activity with 2 classes (slimmed down to 12 screens from 19), and there was a vast improvement in class engagement. There were more opportunities for students to express their ideas regarding comparisons of distributions, and we had plenty of time to pause, recap, discuss and think about next steps. A number of points from the Desmos Guide drove my thinking:

Ask for informal analysis before formal analysis. I kept the “typical distance from the mean” definition of standard deviation, but only in a small way. Students were able to conceptualize standard deviation as a useful measure, and we’ll move on to a more formal definition next. My old activity felt too “sledge-hammer-ish” and I knew it.
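For the curious, here’s a quick sketch (hypothetical data, plain Python) of how that informal “typical distance from the mean” idea sits next to the formal standard deviation:

from statistics import mean, pstdev

data = [12, 15, 15, 18, 20]                # hypothetical class data
m = mean(data)                             # 16
distances = [abs(x - m) for x in data]     # distances from the mean: 4, 1, 1, 2, 4
typical_distance = mean(distances)         # mean absolute deviation: 2.4
sd = pstdev(data)                          # population standard deviation: about 2.76
print(m, typical_distance, sd)
# Same spirit – both measure a typical distance from the mean; the SD just squares before averaging.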

Incorporate a variety of verbs and nouns. I provided a number of ways for students to think about variability and distribution comparisons in the early screens, and strove to build different-looking screens.  This kept the ideas fresh, and students talked with their partners to assess these differences in different ways.

Create activities that are easy to start and difficult to finish. In the last 3 screens, I ask students to extend their thinking, be brave, and apply new ideas. For those students who got this far – and most did – these screens elicited the loudest debates. We ran out of time at the end, but we have some good stuff to build off of tomorrow morning.

I’ve learned that it’s important to be honest about an activity. It’s easy to blame the students when something goes wrong, especially when a class heads away from learning and towards frustration. But performing an activity autopsy, focusing on clear goals, and keeping the design principles in mind helps move an activity forward.

 

EdCamp and My Amazing Principals

The first week of district PD – lots of meetings, scads of “sit and get” messages, and every administrator making sure their voice is heard. I suspect what I am describing is not unique to my area. A great opportunity to energize is lost, and the “grind” begins. And I haven’t been too shy about expressing my displeasure through tweets when I am frustrated – I can be kind of a jerk (think of those Snickers ads…better?…better…).

This week my school admin team got it right. And I feel fortunate to work with them. 

For about a year, my school’s admin team had kicked around the concept of a school-wide EdCamp. To be honest, I never thought it would see the light of day…there are just too many other things loaded into the calendar.  So an invitation to work with a team of teachers to organize a high school-wide EdCamp was a true surprise…then the work began.

We planned 3 morning sessions, followed by lunch and prizes.  But beyond the structure of the day, we had a lot of talking-up to do.  Would our teachers, many of whom had never been to an EdCamp, understand the concept?  Would people propose sessions? Could we engage the curmudgeons in our teaching ranks? At our opening faculty meeting, we showed a brief video to help teachers understand the EdCamp concept, then talked it up over the next 2 days.


The morning of the conference, many teachers suggested ideas, asked questions and thought about what they’d like to learn. In the end, we had a nice variety of topics and it felt like there was something for everyone!

[The EdCamp session grid]

As I walked around during the sessions, I was thrilled to see rooms filled with discussions, and teachers from different departments with an opportunity to engage. I know there is no possible way to reach everyone, but I hope it was a day of professional learning for most.

On my end, I offered a session “Activity Builder for the Non-Math Crowd” which seemed to be of use to those assembled. Then later, a session where we just did math – problems from Open Middle, Nrich, Visual Patterns and others let math teachers talk, learn, and think about engaging problems for their classrooms. You can download the problems I shared with this link.

Thanks to the fantastic people I work with for letting me be part of this: Baker, Dennis, Kristina and Melissa.

And a big thanks to the HH admin team: Dennis, Ralph, Tracey and JZ.  I appreciate the opportunity, and promise not to complain again….until the next time…..