When I read this article, I chuckled a little. Developers of web content and the sites that host it are getting better and better at collecting data about how people use their tools. Two examples are YouTube’s Insight and Visible Measures. The latter actually displays a graph of when people rewind the scrub bar and re-watch portions of a video. Granted, I think a video needs about 50,000 views for this to work, but it gives a pretty accurate picture of audience engagement with a video.
In the tool I helped design for my dissertation — PrimaryAccess Storyboard — we used something similar to Flash cookies to collect data about how students were using the tool. The tool is built in Flash, and Bill (the programmer) was able to add code that records everything a user does while logged into the application. I could get detailed information about each student: when he or she logged in, whether they logged in after school from a different computer, and every move they made with the mouse or keyboard (I couldn’t see what they wrote and deleted, just the number of characters typed and deleted). I could even see how much time each student spent on task. Of course, those of us who have been teachers know that a bare time-on-task number, such as 52 minutes, does not really say much about what the student was doing. Using the rest of the data, I could see differences in how students used their time. Some experimented with different images, layering them on top of each other and tinkering with the order. Others were just playing with the tool, adding characters, spinning them around, and deleting them.
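To make the kind of logging I am describing concrete, here is a minimal Python sketch (not Bill’s actual ActionScript; the class and field names are my own invention) of an event logger that records only action types, timestamps, and character counts, never the text itself:

```python
import time


class EventLogger:
    """Illustrative per-student action log: what happened and when,
    but never the content of what was written."""

    def __init__(self, student_id):
        self.student_id = student_id
        self.events = []

    def log(self, action, chars=0, timestamp=None):
        # Store the action type and a character count only --
        # the actual text the student typed is never recorded.
        self.events.append({
            "action": action,
            "chars": chars,
            "t": time.time() if timestamp is None else timestamp,
        })

    def summary(self):
        # Roll the raw events up into the kind of totals a teacher might see.
        typed = sum(e["chars"] for e in self.events if e["action"] == "type")
        deleted = sum(e["chars"] for e in self.events if e["action"] == "delete")
        span = (self.events[-1]["t"] - self.events[0]["t"]
                if len(self.events) > 1 else 0)
        return {"typed": typed, "deleted": deleted, "seconds": span}


log = EventLogger("student01")
log.log("type", 12, timestamp=0)
log.log("delete", 5, timestamp=30)
log.log("type", 40, timestamp=90)
print(log.summary())  # → {'typed': 52, 'deleted': 5, 'seconds': 90}
```

The point of the design is the privacy trade-off: counts and timestamps support the analysis without the tool ever storing student writing.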
This approach to data collection was not perfect, and Bill and I are still thinking about ways to make the data more useful to teachers, but it did address a longstanding issue I have had with technology in the classroom: computers, here as in other contexts, make it easy for students to look busy when they really aren’t. As an observer, I watched students deftly switch between their work and Solitaire without the teacher ever noticing. Even when the data said a student stayed on task, I could infer from the rest of it that they weren’t very engaged. And even when students looked engaged, I could see how they were spending their time: some used the tool creatively to make the best product possible, while others played around without putting much thought into their final product.
I see a lot of potential in this kind of data collection. We can learn a lot about how students use different tools and scaffold projects in ways that anticipate those patterns. It also opens the door to studying how teachers make instructional decisions based on the feedback they get from students’ tool use. Given what a teacher knows about how a student is using his or her time, can the teacher make better decisions for directing that student’s effort before the final product is turned in? I think there are a lot of good questions, and hopefully some answers, that will come from this kind of data.