An interview with Jodi Asbell-Clarke about her NSF-funded project to study the development of computational thinking for upper elementary and middle grades students.
What is the big idea of your project?
The Logical Journey of the Zoombinis was the first in a series of three award-winning computational thinking games developed in the mid-1990s. In August 2015, TERC and partners re-launched Zoombinis for tablets and desktops for the commercial market. The Educational Gaming Environments (EdGE) group at TERC is studying how playing Zoombinis can help upper elementary and middle school learners build implicit computational thinking skills that teachers can leverage in formal instruction. Building on prior work creating implicit STEM game-based learning assessments, we are combining video analysis and educational data mining to identify implicit computational thinking that emerges through gameplay (Rowe, Asbell-Clarke & Baker 2015).
EdGE researchers are currently analyzing synchronized screen-activity video and log data from elementary learners, middle school learners, and computer scientists. Building from the ground truth of human-coded videos, we identify systematic, automated ways of predicting implicit computational thinking skills from gameplay behaviors. The videos of players’ gameplay are human-coded for evidence of specific computational thinking skills (e.g., problem decomposition, pattern recognition, algorithmic thinking, abstraction), with a sample double-coded to establish inter-rater reliability. We then distill features from players’ gameplay log data and build detectors (e.g., classification algorithms) to identify each computational thinking skill within the gameplay data.
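To make the detector-building step concrete, here is a minimal sketch in Python of how a classifier might be trained on features distilled from log data, using the human-coded video labels as ground truth. The file name, feature names, and the choice of a scikit-learn logistic regression are illustrative assumptions, not the project's actual pipeline.

```python
# Minimal sketch (illustrative only) of training a "detector" for one
# computational thinking skill from gameplay log features, with human-coded
# video labels as ground truth. File and column names are hypothetical.
import pandas as pd
from sklearn.linear_model import LogisticRegression
from sklearn.metrics import cohen_kappa_score, make_scorer
from sklearn.model_selection import cross_val_score

# Hypothetical table: one row per puzzle attempt, with features distilled
# from log data joined to the human code (1 = evidence of problem
# decomposition observed in the video, 0 = not observed).
df = pd.read_csv("gameplay_features.csv")
feature_cols = ["num_trials", "time_between_moves", "repeated_test_ratio"]
X, y = df[feature_cols], df["problem_decomposition"]

# Train a simple classifier as the skill detector; cross-validated Cohen's
# kappa estimates how well the detector reproduces the human-coded labels.
detector = LogisticRegression(max_iter=1000)
kappa_scores = cross_val_score(detector, X, y, cv=5,
                               scoring=make_scorer(cohen_kappa_score))
print("Mean kappa vs. human codes:", kappa_scores.mean())
```

In practice, a separate detector of this kind would be built and validated for each coded skill (problem decomposition, pattern recognition, algorithmic thinking, abstraction).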
What are you struggling with?
As I mentioned in my CIRCL perspective, we’ve found in previous work that what students do in game play matters, and teachers need to bridge what students are doing to make the implicit knowledge explicit. We’re using similar methods to look at computational thinking in Zoombinis, which is all about computational and logical thinking.
But it’s very early days in computational thinking. We’re two years into this grant, starting up an implementation study, and people still don’t agree on what computational thinking is! It was so much easier with physics, which has been around for hundreds of years. CIRCL has a primer on computational thinking; we know how to talk about it, CSTA knows how to talk about it, Google knows how to talk about it. That’s not the problem. It’s that students and teachers don’t know what we are talking about when we talk about problem decomposition. They might be doing it, but they don’t recognize it. So there is a communication problem.
NSF Project Information
Title: The Full Development Implementation Research Study of a Computational Thinking Game for Upper Elementary and Middle School Learners (Award Details)
Investigators: Jodi Asbell-Clarke, Elizabeth Rowe, Teon Edwards
There are threads of computational thinking in Common Core, but grades 3-8 don’t really have it. We have been in discussion with Shuchi Grover and other computational thinking researchers, and we think they are finding the same thing. Some teachers want to jump to coding without building a conceptual foundation for computational thinking. It’s the analog of wanting to teach science by throwing kids into a lab with equipment and expecting them to learn. No teachers really do that with physics; they talk about the theory and there is preparation.
It’s very early days in computational thinking. All of this is totally understandable. As I said earlier, with game-based learning, teachers need to bridge what they are doing to make the implicit knowledge explicit. We’re trying to apply this with computational thinking, but we don’t have as much to bridge with. It’s becoming more and more of a design-based implementation research study by the day.
Some data science education researchers express similar struggles. Do you see them as similar?
Yes, data science education has similar issues. My understanding from my colleague Andee Rubin is that there were things we could have tapped into in the old math standards, but many have been taken out. They were in curricula that were evolving in the 1990s and 2000s. But those have either stayed on the shelf or, with the evolution of the Common Core, teachers have stopped using them because they no longer fit into their curriculum. That’s my impression. It’s disappointing.
Our own experience is that when we run workshops at CSTA, NSTA, ISTE, and a couple of other conferences, we get rooms of 75 teachers who want Zoombinis, but when we talk about concepts like computational thinking and problem decomposition, we’re not connecting. That’s not to say they are not teaching good stuff; it’s just that we’re not connecting. It’s a researcher-practitioner gap. And it’s because it’s the early days: it’s like talking about scientific inquiry in the ’70s. It’s a way of thinking that’s different from a curriculum standard. But now scientific inquiry is an inherent part of the Science Standards. Computational thinking will get there too.