I gave my two Algebra classes an assessment on Desmos covering linear functions. The idea started as a joke – I told a student who had found Marbleslides challenging that I’d put some on the test. As I thought about it more, though, I decided that assessing with Desmos would be a great idea.

I used Activity Builder to create the assessment. The intuitive interface made building it straightforward, but I encountered a major unexpected challenge: designing worthwhile questions proved much more difficult. It no longer made sense to ask students to simply graph or write the equation of a line. Instead, I focused on questions that ask students to describe how to graph a line, to explain why an equation’s graph would look a certain way, and to interpret a line’s equation in the context of a problem. This is actually the type of question I always want to emphasize but rarely do.

Consider the question above. If I had asked students simply to graph the line given by its equation, they might have been able to do so without truly understanding the equation of a vertical line, and I never would have known.

Similarly, this word problem went beyond simply asking students to write an equation to making them connect the mathematics to the situation being modeled. Each of these five students wrote a correct equation, but their understandings of the problem clearly differ. So too does their ability to express their reasoning, something the following example shows as well.

Do these students understand the relationship between the graph of a line and its equation? To an extent, they certainly do, but their explanations also reveal some gaps in their understanding. What I find most interesting is how students managed to express their thinking in so many ways. Some used mathematical vocabulary; others didn’t. Some provided precise explanations that anyone could follow; others used ambiguous language that might obscure their meaning. For as much as I think I emphasize communication in my classroom, my students’ responses make me want to spend even more time refining our ability to share our thinking in a clear, concise manner. Perhaps including more problems that call for explanations on each assessment will help me move in that direction.

And that’s probably my favorite part of using Desmos for an assessment. It’s so much easier for students to explain themselves on the computer than it is with pencil and paper. Consider the following responses.

I know my students, and I can say with complete certainty that they would have written much less on a paper-and-pencil test. And I would have missed out on seeing and understanding their thinking. Between this problem and the one shown in the image at the top, I developed a clear picture of what my students know and don’t know about y-intercepts, something that may not have been possible with the way I typically assess.

And, of course, Marbleslides. The incomparable joy of Marbleslides.

I don’t see a lot of students absolutely beaming during tests, but I did this time. That student I mentioned earlier – the one who found Marbleslides so challenging – successfully collected all of the stars on this assessment, and she was so incredibly happy. Seeing her smile made the entire assessment worthwhile.

I suppose it’s worth discussing the nuts and bolts. Grading wasn’t really easier or harder than a pencil-and-paper assessment. It was just different. Take a look at the dashboard below.

It’s easy enough to grade a question when a student gets a check, but everything else required me to take a closer look. Sometimes, as with the following question, that was pretty easy to do.

I can quickly glance through student responses and get a sense of common misconceptions. But with questions that require an explanation (or an input that doesn’t get verified), I have to take the time to look through everyone’s individual work. And that’s totally fine. That’s what grading is usually like, and I think it’s important to see and assess each student individually. Desmos actually made it easier to do this.

As far as actually tallying scores and providing feedback, I had to improvise. I used Google Sheets to create a little rubric. I included a place for a numerical score and a place for brief comments on the individual problems. I also let my brain rest and made Sheets calculate the grades for me. Here are some examples.

I printed these little rubrics and returned them to students. Then, I un-paused the activity and allowed students to look back at their work and correct it if they so desired.

Other miscellaneous thoughts:

- The Ohio AIR test uses the Desmos graphing calculator, so this sort of assessment should help my students prepare. It’s also easy to create AIR-type questions using Desmos.
- There isn’t really a way for students to “turn in” the assessment. I just told them to close the tab and shut down their Chromebook when they finished. This is totally fine; it’s just something I had to tell them about a hundred times.
- It’s relatively challenging to monitor students to make sure they’re not just using Google to search “how to write a linear equation” or using Discord to ask each other questions. I emphasized honesty and integrity at the beginning, and that seemed to do the trick.

If you’re wondering if I’d give another assessment using Desmos, the answer is a resounding yes. I’m actually designing two more assessments (one for Algebra, one for Math 8) right now. And my colleagues have agreed to try using Desmos for one of their assessments!

Thank you to Desmos for being awesome! Thank you to Julie Reulbach and Jonathan Claydon for introducing me to the idea of Desmos assessments! Thank you to my students for making my job wonderful!

*Update: Wow! This post received quite a response on Twitter! Here’s the link for anyone interested: https://teacher.desmos.com/activitybuilder/custom/5bc52d70744e4b427f3ce5a6*