Using data to measure digital literacy
Digital literacy is not a new term, and it's not a buzzword that will fade away in a few years. It is a critical part of teaching learners the skills they need to be successful. Most of us interact with a computer of some sort on a daily basis. From the smartphone in your pocket to the desktop at work, computers are everywhere, and everyone needs to know how to use them to find, evaluate, understand and share information in a variety of formats and from an array of sources.
Teaching learners how to use technology effectively has become an integral part of curricula across the country and abroad. This can be seen in the shift to computer-based assessments and in the shifts in learning standards across the U.S. Most states and school districts, including those that adopted the Common Core Standards, now require learners to be able to use computers not only for research and typing, but also to create and collaborate.
The specific intricacies of digital literacy are constantly changing as technology, and what we can do with it, evolves. In the article "What Is Digital Literacy?", EdWeek does a great job of explaining these nuances and how digital literacy has become a plural concept rather than one general definition.
When a concept is so complex and constantly changing, how do you make sure learners are mastering the skills involved? Can you really assess a learner's ability to comment on a blog or produce writing digitally? In the end, assessing digital literacy comes down to two big questions: "Are learners spending enough time producing learning online?" and "What do these online learning experiences look like?"
Are learners spending enough time producing learning online?
You've heard it a million times already: "practice, practice, practice." If you want learners to be comfortable reading, writing and collaborating online, you need to make sure that these behaviors are a regular part of instruction. The more time learners can spend practicing digital literacy skills in a safe space, the more quickly they will become comfortable and independent digital citizens.
However, whether learning experiences take place online or through the more traditional routes of paper, pencil and discussion is up to the individual teacher. And teachers who are less comfortable with technology may shy away from moving learning online.
When so much of learning takes place behind a closed classroom door, it can be difficult to measure how much time learners are spending working and creating online. For many administrators, the best method they have for measuring access to online learning experiences is distributing surveys to teachers and students.
There are some technologies that can help with measuring exposure to digital learning. Schools that use Google Classroom, for example, can look at how many Docs learners are creating. The rough correlation is that the more Docs learners create, the more work they're doing online. As a data point, however, this is vague. It doesn't tell you much about the content of those Docs, how they're being used or how much time learners actually spend in these digital documents.
Hapara Analytics takes this data collection to the next level. For schools that use G Suite, Analytics provides data around how much time learners spend actively working in Drive, which teachers are using Drive with their learners, and how often. Analytics opens that classroom door so administrators can make sure that all of their learners are spending equitable amounts of time practicing those critical digital literacy skills.
What do these online learning experiences involve?
One of the key distinctions between digital literacy and traditional literacy is that writing produced digitally is often meant to be shared or worked on collaboratively. Learners need to be able to produce this kind of writing in the real world as they email, participate in social media, and collaborate with their future colleagues.
Not only do we want to measure how often learners work online, we want to make sure that these learning experiences are collaborative and social, so they mimic what learners need to be able to do outside of school.
Many administrators trust that the teachers in their schools are providing learners with these collaborative experiences. But again, some teachers are more comfortable using these tools than others, and that leads to some learners getting more exposure than their peers. If we want learners to have equitable experiences with digital collaboration, we need to find a way to measure how much it is (or isn’t) already happening.
Surveys are one option for gathering this data. Administrators can simply ask teachers and learners how much collaboration they do online. However, relying on self-evaluations of digital learning experiences may not lead to the most objective data sets.
Hapara Analytics uses the digital footprints teachers and learners leave in Google Drive to gather data about how often learners collaborate and with whom. Administrators can clearly see how often learners are working in Docs together with their teachers and their peers. With this data, administrators can see where learners are not getting equitable experiences collaborating, and provide teachers with the support and professional development they need to make this important shift in practice.
Gathering objective data in an education setting is difficult, but it is a critical tool for making sure students receive equitable learning experiences. As digital literacy becomes more and more complicated, turning to data will enable you to make better decisions for helping teachers and learners tackle these skills.
Head of Content | Hapara