Schools are awash in technology in a way never before seen, thanks to the mad dash toward digital that was prompted by the pandemic a little more than two years ago.
But how well that technology works to improve outcomes for kids—or when it works, for whom, and under what conditions—remains a mystery to, well, everyone. That’s mostly because the research and evaluation necessary to find out hasn’t been conducted. And it hasn’t been conducted because, at least so far, there’s been very little incentive for education technology providers to prove their products do what they say they do.
It may well be that many of the 9,000 or so edtech products on the market work just as intended. Some could even be “transforming” education, as promised. Without evidence, though, we simply can’t know.
That may be changing. With enough tech flooding schools in recent years to reach critical mass, and enough kids who have fallen behind academically during the pandemic to raise the alarm, school district leaders are asking more questions about the evidence behind edtech products. And companies, in turn, are beginning to work out the answers.
A Winning Strategy
Irina Fine is seeing this play out in real time. The longtime classroom educator is co-founder and chief content officer of Bamboo Learning, a company that launched in 2018 with a voice-enabled literacy application and began piloting the technology in schools earlier this year.
“From the founding of the company and also being a lifelong educator, I knew we wanted to have a product informed by research and by focus groups,” she says. “It was always important to base our product design on research and user feedback.”
Prior to January, Bamboo had hosted its voice-enabled app on the Amazon Alexa platform. Then schools began requesting the company make its technology available on iPads, too.
“As soon as we shifted our strategy to schools, we said right away: we need research, we need evidence, we need validation,” Fine says.
Bamboo Learning began working with LearnPlatform, a company that helps districts manage their edtech products, in January to show that its product “demonstrates rationale,” the baseline tier of showing evidence, as defined by the federal Every Student Succeeds Act (ESSA).
To be certified as ESSA Level IV (demonstrates rationale), a company must show a logic model and have plans underway to study the effects of the product. It is not a high bar.
Working with LearnPlatform, which earlier this year rolled out its evidence-as-a-service subscription model to evaluate edtech companies, Bamboo was certified ESSA Level IV in February.
From there, the company began pursuit of ESSA Level III, or “promising evidence,” which requires at least one “well-designed and well-implemented correlational study with statistical controls.” Bamboo conducted its pilot study at a charter elementary school in Oklahoma City throughout March and April. The students involved in the study used the Bamboo Learning iPad application for five to 10 minutes each morning for six weeks.
The outcomes of that study, which were published June 17, showed that Bamboo Learning’s pilot program satisfied ESSA Level III requirements, allowing the company to earn Level III certification. The study showed that the students who regularly used Bamboo’s application demonstrated improved reading and listening comprehension skills as well as high levels of engagement.
As a next step, Fine says, Bamboo hopes to reach ESSA Level II, or “moderate evidence,” which requires a study with a 300-student sample size.
For Fine and her co-founder Ian Freed, this path of ticking off ESSA tiers was a no-brainer. She has spent enough years in the classroom to know better than to waste teachers’ time with a product that isn’t needed or wanted and doesn’t work. But it’s more than just a moral obligation. Showing evidence—or at least making the effort to prove efficacy—is giving Bamboo Learning a leg up with school districts.
This spring, the company was one of 200 vendors that responded to a northeastern school district’s request for proposals. Bamboo was one of only eight companies selected to present to the district’s nine-person decision-making committee. When asked to share materials in advance, Bamboo’s leaders sent the logic model from their ESSA Level IV certification and came prepared to discuss their product design, research and expected learning outcomes from the pilot study. Out of the initial pool of 200 providers, Bamboo was awarded the contract for the district’s 12,000 K-5 students.
Karl Rectanus, CEO of LearnPlatform, which provided third-party validation for Bamboo’s ESSA Level IV and Level III studies, insists that victory for Bamboo was not a coincidence.
“They’re winning,” he says of Bamboo. “We’re not saying it’s just because of that evidence, but … the return on that investment [in validation] is much higher than it was previously because districts and states are saying, ‘Yeah, we want to see evidence and we are much more likely to purchase because of it.’”
Fine, too, sees an appetite among district leaders for companies to show evidence.
“I think the expectation on the part of educators is there. But there is no habit or practice to offer it on the part of companies,” she explains. “School leadership has to drive that requirement: ‘Unless you have x, y and z, we can’t evaluate you.’ Are there enough products that are validated by research to allow that to happen? Maybe not yet.”
In fact, she has been surprised to learn how few companies have ESSA validation or are pursuing it. “It’s not as common as I would like,” she says.
The Incentive Problem
The fact is most companies don’t pursue independent, rigorous research of their products because they don’t have to.
Bart Epstein, CEO of the Edtech Evidence Exchange and a champion for better regulation and oversight of the industry, says that some edtech providers realize they can get away with a colorful, well-packaged case study and call it “evidence.” So, they figure, why bother spending the time and money on something more involved?
“More and more companies are ready for the question about efficacy and research, and that’s a step in the right direction,” Epstein says, “but there’s a world of difference between someone having an independent, third-party, government-funded gold standard efficacy study showing how a product performs in a similar environment, and on the other end of the spectrum something written by a marketing department that uses vaguely academic-flavored language that is meaningless.”
One of the great flaws in the edtech industry is there are few, if any, barriers to entry, and no governing body is holding companies accountable for their claims the way the Food and Drug Administration does with drug companies before they bring a product to market, Epstein says. “Tomorrow, you and I could go out, hire a superintendent, launch a company, and make $10 million, without showing any efficacy,” he explains.
So when a district leader asks for proof of efficacy, and a company hands in a document whose contents check all the boxes—a sigma sign, a sample size, key findings—that is typically seen as good enough, even if it’s no more than a dressed-up anecdote from one teacher at one school. Most educators, meanwhile, don’t have the time to comb through research or the expertise to discern rigor from rubbish. “It’s so easy to game the system,” Epstein adds.
“In a world in which school districts are not pressured or strongly incentivized to select the product that is most efficacious, we see that decisions about what to purchase are far more often made on usability, personal relationships, features, and not on evidence,” he says. “As long as schools are left on their own to try to choose between different products, it’s very unlikely that they are going to be able to consistently choose the product that is ‘better.’”
As a result, folks in the industry—well-intentioned though they may be—have been incentivized not to invest millions in a high-quality research study, but to spend that money beefing up their sales and marketing teams, sending people to conferences and trade shows, and sourcing new potential customers.
“We are definitely moving in the right direction, but we’re moving very slowly,” Epstein says. “I would love to see a world in which the companies who do real research get rewarded and prioritized and make more sales.”
A Better Way?
Rectanus at LearnPlatform thinks he might be part of the solution. Historically, rigorous research has cost companies somewhere in the six- to seven-figure range. But his company’s new evidence-as-a-service model is making third-party evaluation available to edtech providers at a fraction of the cost and in a fraction of the time—a few weeks, instead of 18 to 36 months. It is also, Rectanus notes, delivered to inquiring districts in a much more accessible, digestible format.
His goal is to convince the education market that this endeavor is within reach. Most companies do believe they have a good product, after all. They trust it works. They just aren’t sure it’s feasible to prove that, with all the costs associated with conducting research.
“Ultimately, any district should be able to ask, ‘Do you have evidence for a solution in a context like mine?’ If the answer is yes or no, they should also be able to say, ‘Are you willing to document evidence with us, in our context? In a way that meets our requirements, allows us to use federal funding, and make decisions for our students?’” Rectanus explains.
These questions are becoming increasingly common, Rectanus says.
And for Carmen Alvarez, early childhood director at Harlingen Consolidated Independent School District in Texas, getting answers to those questions is essential.
Harlingen is a high-poverty district of 18,000 students near the Mexico border. Early in the pandemic, the district started using an adaptive, game-based math program called My Math Academy with its pre-K students. Sensing that the program was a boon for the district—the kids loved it, and their math skills seemed to be improving—Alvarez agreed to work with Age of Learning, the company that makes My Math Academy, to participate in a research study of the program at Harlingen.
Their findings matched the anecdotal evidence: 98 percent of pre-K students in the Title I district who used My Math Academy consistently were “on track” in math by the end of the school year, based on state-administered assessments, compared to about 77 percent of students who did not use it regularly.
Now, more than 5,000 students from pre-K through third grade at Harlingen are using the program. And My Math Academy has since earned ESSA Level I certification, the highest ESSA tier for demonstrating improved student learning outcomes.
“Having that outside stamp is very important,” Alvarez says of the ESSA certification. “It’s important when we’re evaluating so many programs.”
When the pandemic began, she explains, she and her colleague were “bombarded” with pitches and programs and all sorts of materials from edtech companies looking to secure a new customer. “For me, I just have to know what I’m presenting to my assistant superintendent and superintendent for elementary education, to my school board,” she explains. “I want to have that stamp of approval so we know it’s great, we know it works. We want to put best practice in front of our teachers and students, and being able to say [it has been validated] carries a lot.”
A Piecemeal Push for Proof
The shift in the industry remains slow-moving and piecemeal, but it is real.
Sunil Gunderia, chief innovation officer at Age of Learning, thinks that the influx of technology in schools during the pandemic played a large part. But so did the fact that the American Rescue Plan’s Elementary and Secondary School Emergency Relief (ESSER) funds specifically mention the need for districts to use “evidence-based” interventions and approaches. (Rectanus notes that the ESSER funding uses the term “evidence-based interventions” 17 times but does not offer specifics on how to prove it.)
Gunderia and his colleagues at Age of Learning have spent a considerable amount of money conducting efficacy research and earning ESSA certifications, in part because they want to know that the products they are putting in front of children actually work, but also because he thinks the industry is moving in a direction that will soon demand such research be presented at the outset.
“We want to win because our product works better than any other product, and we prove that through efficacy testing,” he says. “We believe we’re going to win in the long run, so we view the [research] investment as worth it. Student outcomes will align with the company’s success—we sincerely believe that.”
That is already being borne out in companies’ internal conversations, Rectanus says.
“It used to be a tradeoff—investing in personnel versus a research trial. But what we’re finding, as we talk to providers, is that it’s the sales and marketing team that is going to the product team to say, ‘Can we have evidence as a service?’” Rectanus says. “Sales is hearing it in the market: ‘We just lost this RFP to an organization that says they have evidence.’”
Epstein, for his part, remains wary of undeserved optimism. For the industry to change in a meaningful way, it needs more than individuals expressing interest. It needs an overseer and a regulator.
“Everything is anecdotal,” he says. “It’s natural that given the pandemic, and a huge increase in spending, and the increased media attention on the issues, and some nonprofits working on it, there’s more realization that we need that evidence.”
He hopes a more meaningful movement is within reach, “one that’s organized and is demanding more evidence and getting it and knowing what to do with it and being able to use it.”