Now what? How will you know if your learners are learning the tools of the 21st century? What evidence is there that this initiative is not simply swapping one tool for another? What do progress and growth even mean?
In the sections below, we will take you on a Hapara Analytics-driven journey to show you the many shapes and stages of digital adoption. Together we will see different facets of this shift to digital practice and how they align and differ from organization to organization. Now brace yourself for the tornado…
How does adoption of technology spread and grow within your schools and districts? Figure 1 shows a common experience.
For most technologies, the path to adoption begins with a few early adopters who are open to tinkering and experimenting. They might be teachers looking for a new way to deliver a lesson, or they might be learners who found a new way to do their work. Early success stories spread via word of mouth, and a year later principals, superintendents and tech directors are launching pilot programs to test the technology with specific classrooms, grades or schools. Rave reviews and happy teachers and learners make the case for a district-wide rollout.
Your large-scale rollout ensures every teacher and learner has access to these new tools, be they hardware or software. Your early adopters continue to voice positive testimonies, and others offer reassuring anecdotes, but you have a sense that this technology is not being used to its full potential. A plot like Figure 1 confirms this intuition. Years 3 and 4 look more like a gradual progression or a plateau than a jump. One hundred percent adoption is obviously a dream, but why is learner activity only at 50%?
By the end of year 4 it’s obvious that adoption of this new technology ecosystem requires more than just access. It requires directed effort and attention. Over the summer, to prepare for year 5, you create a more focused initiative. You implement professional learning strategies, not just for tools training but to empower your organization with pedagogies and practices conducive to digital teaching and learning. Now, five years in, you see the jump you had hoped for in year 3 and can deem the new initiative a success.
Of course not all experiences are identical. Some districts find ways to accelerate this path and others never escape the plateau.
To better understand how these shapes vary between different schools and districts, we begin with a small sample (n=35) of Hapara Analytics datasets representing a range of populations, from small charter schools with fewer than a hundred learners to large districts with tens of thousands of learners. Under the hood, every customer has a unique story to tell about their deployment and adoption of digital environments like G Suite. With customers at varying stages of adoption and maturity, a direct comparison of the raw data is neither intuitive nor meaningful. Moreover, some customers have several years of data, while others have only a few months.
To make the data suitable for comparison, we will transform them into trajectories. For our purposes, a trajectory is a trendline that predicts year-over-year growth in the percentage of learners actively using G Suite. Trajectories are computed via linear regression, fitting each district’s monthly data to a line with a slope and an intercept. Plotting this collection of models yields the beautiful rainbow of trajectories shown in Figure 2.
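The fit described above can be sketched in a few lines. This is a minimal illustration, assuming each deployment's activity has already been reduced to (years since start, percent of learners active) samples; the data and function names are hypothetical, not Hapara's actual pipeline.

```python
import numpy as np

def fit_trajectory(years, pct_active):
    """Least-squares fit of pct_active = slope * years + intercept."""
    slope, intercept = np.polyfit(years, pct_active, deg=1)
    return slope, intercept

# Hypothetical monthly-ish samples over two years of ramp-up.
years = np.array([0.0, 0.25, 0.5, 0.75, 1.0, 1.25, 1.5, 1.75, 2.0])
pct = np.array([2.0, 8.0, 15.0, 20.0, 30.0, 36.0, 45.0, 52.0, 60.0])

slope, intercept = fit_trajectory(years, pct)  # roughly 29 points per year
```

Fitting one such line per deployment collapses each customer's noisy monthly history into two comparable numbers, which is what makes the side-by-side "rainbow" view possible.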
Cue music please….
Somewhere in the rainbow (of trajectories)
Way up high
Lies digital adoption that you heard of
Once in a seminar
Somewhere in the rainbow (of trajectories)
Chromebooks are new
And the one-to-one that you dare to dream of
Really does come true
Each line in our rainbow of trajectories maps the number of years of Google activity to an expected percentage of learners active in G Suite. The steeper the line, the faster the trajectory. For example, the line on the far left shows a deployment that went from 0 to 80% in less than a year, whereas the line on the far right shows a trajectory that hits 50% of learners only after 10 years. Following the Pareto Principle (aka the 80/20 rule), let’s consider 80% to be a useful, globally desirable threshold for widespread adoption. This cutoff is represented as the black dashed line in Figure 2; its presence makes it visually easy to map a trajectory into years needed to achieve widespread adoption.
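Reading a trajectory against the 80% threshold amounts to solving the fitted line for time. A sketch of that conversion, with illustrative numbers rather than values from the actual dataset:

```python
def years_to_threshold(slope, intercept, threshold=80.0):
    """Years until slope * t + intercept reaches the threshold.

    slope is in percentage points per year, intercept in percentage points.
    """
    if slope <= 0:
        return float("inf")  # a flat or declining line never gets there
    return max((threshold - intercept) / slope, 0.0)

# A far-left-style trajectory: ~90 points/year reaches 80% in under a year.
fast = years_to_threshold(slope=90.0, intercept=0.0)
# A far-right-style trajectory: 5 points/year hits 50% at year 10,
# and would need 16 years to reach the 80% threshold.
slow = years_to_threshold(slope=5.0, intercept=0.0)
```

Applying this to every fitted line turns the rainbow into a single distribution of years-to-adoption, which is what the next section summarizes.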
Now that we have a collection of trajectory models, we have a basis for quantifying and identifying the different experiences of districts and schools moving to cloud-based learning environments. As mentioned above, we can now compute years to widespread adoption for every deployment in our sample, which in turn lets us characterize how that time varies.
To visualize the distribution of years to 80%, we use a graph known as a boxplot. As shown both on the Emerald City (above) and in Figure 3, a boxplot graphically depicts the quartiles of the data using a box and whiskers. The box spans the middle 50% of the data, with its left and right edges marking the 25th and 75th percentiles. The whiskers extend to the most extreme points not considered outliers.
The pink box shows time to 80% activity for deployments with populations greater than 5,000 learners, the blue box for deployments with fewer than 5,000 learners, and the purple box for all deployments.
Looking at Figure 3, we see that the median time to 80% activity is close to 4.5 years. To better contextualize these times, we have broken down the data into smaller districts/schools and larger districts. Smaller districts on average have faster trajectories, with a median of just under 4 years, while larger districts, unsurprisingly, require almost two additional years to reach widespread adoption.
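The quartile summary behind a boxplot like Figure 3 is straightforward to compute. The years-to-80% values below are made up for demonstration; the real figure is drawn from the n=35 Hapara sample.

```python
import statistics

# Hypothetical years-to-80% values by deployment size.
small = [1.0, 2.5, 3.2, 3.8, 4.0, 4.6, 5.5]   # < 5000 learners
large = [3.6, 4.9, 5.5, 6.1, 6.4, 7.3, 9.0]   # > 5000 learners

def five_number_summary(xs):
    """Min, Q1, median, Q3, max: the ingredients of a box and whiskers."""
    q1, q2, q3 = statistics.quantiles(xs, n=4)  # quartile cut points
    return min(xs), q1, q2, q3, max(xs)

summary_small = five_number_summary(small)  # box edges at Q1=2.5, Q3=4.6
summary_large = five_number_summary(large)  # box edges at Q1=4.9, Q3=7.3
```

The box in the plot runs from the second to the fourth of these five numbers, with the median drawn inside it, so differences between the small- and large-district distributions are visible at a glance.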
At this point we have a sense of time to digital adoption, and we can see that it can take anywhere from 1 to 9 years. But how does this relate to our original shape in Figure 1? What does the average path look like? Who are these unicorn districts that reach 100% adoption in a single year? To find the answer, just follow the Yellow Hued Road in Figure 4.
This graph simplifies the Rainbow of Trajectories to a zone representing the middle 50% of districts. The dashed line in the middle depicts the median trajectory. The line at the bottom shows a trajectory at the 25th percentile (one that takes longer to ramp up than the median), while the line at the top shows a trajectory at the 75th percentile (one that is faster to ramp-up than the median).
The Yellow Hued Road now gives us some context for looking at our real adoption data. Figure 5 shows the median adoption curve for this dataset. As in our first figure, we see many of the bumps associated with the on-ramp to digital adoption, with semester peaks and summer valleys showing growth year over year. Moreover, we see how this progression traces the dashed median trajectory, giving us a better sense of what a steady 4.5-year march from 0% to a sustained 80% looks like.
What about these larger districts in the third and fourth quartiles? Looking at Figure 6, we see the trend just skirts the edge of the Yellow Hued Road. Visually, the actual trajectory looks much steeper than the projection. This can largely be attributed to our simple linear model and the flat, near-zero span in the first years of adoption: launching and following through on a large-scale digital initiative requires a lot of up-front investigation and extra time to address the needs of multiple schools, grade levels and diverse demographics.
Finally, how can an initiative go from 0 to nearly 100% in just one year? Figure 7 looks as if the technology director put on some ruby slippers, clicked her heels and pushed out 1:1 devices and G Suite to all her learners at once. In reality, this kind of trajectory requires heavy up-front investment in infrastructure and training. More importantly, it would not happen without first getting all of the teachers and learners on board with such a paradigm shift. Though the deployment in Figure 7 represents a student body of hundreds of learners, as opposed to the thousands in larger districts, the pace of adoption is still very impressive.
Each and every organization has a story to tell, one that is the result of several complicated, overlapping and often conflicting factors. Objective measures and longitudinal analytics allow us to peel back the curtain and see where the miracles and magic are happening in our organizations. Armed with data, we can conduct retrospectives like the one shown above, and, looking forward, data provides a useful mechanism for gauging and evaluating success. Most importantly, data demystifies the miracles and provides the context to understand what factors are driving adoption of digital teaching and learning practices in your district.
Of course the story does not end here. Widespread access and usage of tools is only one of many steps toward making the shift to digital teaching and learning. Sustaining adoption requires hard work and ongoing commitment to communication, professional learning, and funding. Looking beyond adoption, success expands to include using these tools to enable learners and teachers to do what was previously not possible, such as real-time collaboration, formative feedback, and creation of new digital artifacts.