Artificial Intelligence in K-12 Schools: Are educators and ed-tech developers overlooking the biggest risk? (And biggest opportunity?)
Whether we like it or not, the AI wave is coming and it's a global event...
Although Ed-tech developers may have big ideas, many educators have big concerns...Past lessons suggest that new and better tools won't necessarily bring meaningful and sustained school improvements.
In this article I consider whether a critical "mindset disconnect" between B2B technology developers and educators might hold at least one key to avoiding the biggest risks and unlocking the biggest long-term rewards and opportunities as schools purchase innovative AI solutions.
The McKinsey Global Institute has projected that an astonishing 70% of companies worldwide will have adopted at least one type of AI technology by 2030, a wave of innovation that the same forecast predicts will generate some $13 trillion in economic activity (Bughin et al.).
But AI isn't only about economic activity; it's about creating unprecedented opportunities for strategic action, growth, reform, and responsiveness in virtually every sector of human endeavor: problem-solving...medicine...forecasting...(re-)distributing or accumulating...regulating...predicting and preventing...innovating (and accelerating the speed of innovation)...teaching...learning...producing...These modes of action define much of individual and social activity, and all of them can be improved, if not revolutionized, not only by the application of human intelligence (a range-bound resource) but by the possession of more data, and by how accurate, timely, comprehensive, and granular that data can be made. Because AI will revolutionize this last part, the data part, it will have dramatic impacts on all the other human endeavors and on the policy and governance structures that direct them.
Because of this game-changing power, and because AI disruption is taking us into uncharted territory, the consensus today is to see the penetration of AI into K-12 education in terms of a fairly stark dichotomy of infinite solutions vs. worrisome risks and unintended consequences:
On one side of the dichotomy are the possibilities for solutions and efficiencies:
__ reduction of time-consuming tasks
__ administrative efficiencies and consolidations
__ rapid scaling of adaptive, responsive, differentiated, and individualized learning solutions
__ immense scaling and centralizing efficiencies
__ diagnostic efficiencies
__ automated natural-language communication and translation
__ real-time, disaggregated formative assessment data and summative assessment data and analytic tools
In the opposing column are any number of risks, threats, and unintended consequences:
__ privacy risks
__ child harassment and abuse
__ turning children into screen zombies
__ reducing learning to endless sequential skill-acquisition tasks devoid of meaning and purpose for the learner
__ failing to engage learners with sufficiently complex higher-order learning challenges
__ social-emotional deprivations
__ unequal access to ed-tech resources and, by extension, to equal opportunity
__ implicit bias
__ loss of diverse perspectives resulting from centralized design and from deployment of vastly-scaled-and-integrated AI platforms
The disruptive aspect of change and the speed and scale at which modern digital technologies transform practice are no doubt among the reasons that the anticipation of AI tools in K-12 schools evokes both exuberant optimism in some camps and lots of red flags in others.
Perspective and one's relation to technology disruption also play a role...
Ed-tech developers are typically inclined to focus on (and rewarded for) imaginative and quantum-leap-type innovation dreaming and prototyping. Theirs is an endless world of possibilities and one focused on overcoming technological barriers to further innovation.
Educators (and parents and child advocates) tend to be immersed in quite different habits of mind and are traditionally rewarded for different kinds of judgment and for coping with very right-now, day-to-day tasks (something AI developers want to help change!)...Educators must also be keenly aware of the "liabilities" that lurk around all of their decisions, actions, speech, innovation and learning experiments, and must regularly conform action to any number of federal, state, and local education laws and regulations. In addition, they must negotiate these demands in the context of daily informal and formal relational and value-laden interactions with children and their communities, in any range of social, racial, demographic, economic, and political settings.
The default is to see this dichotomy in rather static terms:
ed-tech developers as dreamers and optimistic actors...
and
educators as risk-focused, risk-averse, and sometimes too small-minded actors...
Part and parcel of this way of thinking is to see most obstacles to innovation largely if not exclusively in terms of dysfunction in school organizations. Indeed, some people have been so sure of this inherent aversion to or incapacity for change and innovation in school organizations that whole charter systems were erected as a workaround. Hence, as an opinion piece in Forbes suggested, "Education might be a bit slower to the adoption of artificial intelligence and machine learning, but the changes are beginning and will continue" (Marr).
This perspective, the idea that change will come presumably despite schools, corners the educator. The implicit message it betrays about public perceptions and assumptions is that innovation offers all these great solutions and efficiencies, but that students will have to wait longer than others to reap the benefits because schools are slow to innovate. In some cases, the implicit messaging follows this logic to its endpoint: with the advent of tools this powerful, it's hard even to know what meaningful tasks are left to assign to teachers or to the role of human teaching. One ed-tech reporter, although I think he genuinely sought to express his belief that human teachers will still have critical roles to play alongside AI, could only muster a platitude in this regard: "there's no replacing the human aspect of our teachers..." (Lynch). (It is never made clear what that "human aspect" refers to.)
When educators allow themselves to fall into this same mindset they may fail to question it and instead plunge headlong into the underlying fallacy: seeing change through this lens they cling to being a voice for protecting children and sounding alarm bells, rather than attending to the hard and urgent work that AI will never solve on its own: that of achieving complex and dynamic education reform and improvement goals and that of living up to and sustaining intangible and ever-elusive humanistic and social aspirations.
To the extent that educators and parents lose sight of this and lose sight of their relevance in even a highly automated learning environment, the fallacy now becomes a self-fulfilling prophecy as well. But what is really needed is more dreaming and more intellectual (not necessarily technological) risk taking as well as aspirational creativity and experimentation on the side of the school and educators (and parents with them).
"The presence of risks does not need to remove our optimism; instead it can be a force for developing a more mature goal" (Piech and Einstein).
The risks, however, are indeed real...
Given the sensitive and highly regulated nature of educators' relational interactions with young children, educators typically bear a risk burden far greater than that of the ed-tech innovator. But before saying more about educators, let's look at the other side of the fallacy: the notion that ed-tech holds all the promise and delivers all the solutions, while the risks fall primarily to educators. In fact, giving in to this fallacy has risks and costs for ed-tech, and for society at large, too.
Let's take for example the Gates Foundation's 2013 inBloom technology initiative in New York City schools and see what it reveals about ed-tech mindsets and ed-tech risks and failures (even when the "tools" work...).
This initiative, supported by funds from the Gates Foundation, was intended to deliver more timely and useful data for monitoring student learning, data that would provide a logical and needed foundation for more effective, responsive, and targeted instructional planning. It aligned with education advocates' understanding that effective instructional design requires access to timely data about student learning and progress (what educators refer to as formative assessment data).
Our Mission: inform and involve each student and teacher with data and tools designed to personalize learning
—from the home page banner of the inBloom website
(source: Haimson)
Funded in 2011 and implemented in New York City schools in 2013, the initiative was abandoned by the school system in 2014...Not exactly a model of sustainability.
The initiative as a whole reflected an intervention mindset that some reform thinkers argue is flawed: a one-to-one attribution mindset. It's a traditional way of thinking in the context of technology development: you identify a "problem" and come up with a technology that delivers a "solution." In theory, the better the technology is defined by and adapted to the parameters of "the problem," the better the deliverable will be...
A number of educational researchers who study organizational dynamics and innovation- and change-oriented leadership strategies in school settings, however, have cautioned against this type of linear thinking and implementation mindset in schools. It's one thing to automate attendance record keeping, or to transform the mechanical logistics of testing and test scoring...
To transform learning cultures and long-term student outcomes is far more complex...
It really isn't that schools are inherently "fatal" to innovation or technology adoption efforts. Rather, deeper innovation in school culture, and in day-to-day attitudes and practices relating to teaching and learning, requires the concurrent development of organizational routines, mechanisms, and interventions. These must align (and regularly adjust and re-align) multiple dynamic practices; external change forces (which are always shifting); diverse individual, group, societal, political, and cultural constituencies and interests; and myriad logistical and regulatory constraints and compliance mandates. And all of this must happen at the same time, within an overarching framework and hierarchy of coherent goals, needs, challenges, and objectives negotiated actively and robustly across numerous channels, teams and factions, individuals, standing policy frameworks, and so forth, within larger networks of individual school sites that operate within a larger system (typically a school district, but also a county, a state, and so on). All of these complexities require re-negotiation at various junctures as school systems seek to consolidate ideas, professional learning, and social pressures into new aspirational goals and to implement new practices or adopt new learning targets.
It can all sound abstract and needlessly complex, until it's not. For example, when an education advocate who was also an inBloom insider published a post-mortem of the failed initiative in Forbes, he pointed out that some reasonable concerns about student privacy attributable to the data platform were amplified and whipped up further by external pressures of the moment, namely a social and media climate in which fears about privacy were already running high (Horn).
The technological or design issues in the inBloom initiative may or may not have been insurmountable. The lack of skilled adaptive leadership in negotiating the adoption process and "processing agendas of issues" (to borrow loosely from Fullan's writings on change-oriented leadership skills) means it is hard to know whether data security as such was really a fatal technological flaw in this setting or just a road bump that was poorly managed, both in implementing the technology and in negotiating the concerns and interests of distinct constituencies. What those reporting on these events do seem to make clear is that the larger external concerns about privacy, combined with the utter failure to involve stakeholders (parents especially, in this case) and to elicit and address their "agendas and issues" during the design, adoption, and implementation stages of the initiative, did prove fatal.
According to some reports, the CEO of inBloom, instead of taking any responsibility for the lack of participatory mechanisms, blamed safety advocates for sabotaging the effort. This is a perfect example of how public discourse shapes opinion: because schools are tasked with coordinating and actually responding to the divergent interests and constituencies they are intertwined with, they get cast as opponents of change and innovation when, having been shut out of the design and decision-making process, they take on the role of proponents of responsible innovation. In all fairness, the inverse can also be true: educators sometimes choose to insulate themselves from external change pressures and networks, including innovation technologies, presumably to make life easier or more manageable, to avoid triggering complex change dynamics within their organization...perhaps even to be rewarded by the systems they work in for maintaining harmony and stability...
A textbook case of "we all lose..."
The zero impact on learning and the revolt of parents mean that, arguably, schools and students were the biggest losers in the wake of the inBloom failure. But as this fallacy played itself out, and as the ed-tech contingent cast itself as the driver of innovation and blamed educators and parents for exaggerated concerns about privacy risks, there were consequences for ed-tech aspirations and for technology applications in schools as well: the initiative barely survived one year once actual implementation work began.
As is not uncommon, other collateral damage follows in the wake of such debacles.
Time, money, and effort are squandered (in some of these initiatives it's lots of money). Likewise, professional morale in the school system probably took a hit too, and instead of building foundations for iterative learning, for machine learning in a K-12 setting, and for broader collaborations between a growing circle of stakeholders, folks returned to square one. The only deliverables were scathing editorials and acrimony within the school community and between ed-tech leaders and the school clients. And, in case you're thinking the inBloom initiative was a one-off in this respect, consider that one news reporter commenting on inBloom quipped that "public education is a legendary graveyard" for these kinds of "ambitious philanthropic plans" (Hall and Callahan).
In theory, one might hope that educators could have learned something from this failed ed-tech and K-12 alliance. What we can presume is that educators didn't learn much or benefit much with regard to formative assessment data, the very aspect of instructional practice the technology platform was designed to support and modernize.
However, I think it's worth asking what still might be learned...After all, don't we teach students that we can all learn from our mistakes?
Whatever cautionary warnings educators bring to the table as the AI wave approaches, it probably matters little as things stand, and "change" will be inevitable and will, yes, bring both risks and opportunities of the kinds listed above.
But perhaps the biggest risk is educators marginalizing themselves from (or being marginalized from) the innovation and design aspirations and processes that will shape this and subsequent waves of accelerating innovation, such that past flawed implementation initiatives are repeated. Some vendor will come and sell a solution with many task efficiencies to the district technology officer, distracting educators from the changes that are really needed to reap the benefits of these tools in ways that change student experiences and achievements by statistically significant and qualitatively meaningful measures. Teachers will be left to clean up the mess, students will suffer the unintended consequences, and educators and the school's most critical learning objectives and cultural values will remain largely disconnected from the transactional decisions that may in fact define or redefine learning, or merely "determine" learning experiences and outcomes, with the school fitting itself to the parameters and functionalities of the new platforms...
But how does this frame the way we measure the "change" potential of ed-tech tools? Is there any meaningful relation between a tool's "capabilities" and "features" as such and attempts to measure their value, in terms of deepening learning experiences and for the larger social benefits that accrue from transformative changes in teaching and learning that are truly socially adaptive and responsive?
Michael Fullan, arguably today's most prolific, informed, pragmatic, and hands-on reform researcher, thinker, and practitioner, has argued repeatedly that meaningful learning and likewise meaningful change in education settings will be driven only marginally by "tools," "efficiencies," or "solutions."
Meaningful change and change that actually adds "meaning" to educators' own work and to students' learning experiences involves not just "what tool" or "what efficiency" but more complex development of workplace culture, deeper organizational capacity building, and more skillful coordination of complex organizational dynamics (internal pressures and external forces) on behalf of leaders and other skilled change agents across school systems. This is not merely a value judgment with regard to the limited role "tools" and "solutions" will play. While it is rooted in certain value-driven visions of education and its core mission and social role, it is also an empirical finding--a conclusion based on engaged study of and active participation in co-leading school change efforts. If I were to translate Fullan's assertion on this point into more blunt language, it might go like this:
'Listen, school leaders and technology vendors: stop thinking that seriously impacting learning outcomes will depend on whatever bells and whistles the newest ed-tech program, solution, or platform comes with. In fact, all of these carefully developed tools will play at MOST a 25% role in effective change, and any tool or solution, no matter how unique or powerful or well designed, will have its potential impact significantly or even entirely negated if the complexities of managing school change and school change dynamics are concurrently misunderstood or mishandled...'
Again, those aren't Fullan's words, that's my rendition of how I translate what Fullan is trying to yell from the rooftops...
This is a really powerful finding for affirming the essential role educators have in actually getting our nation's schools to really improve. And...it is a perhaps sobering reminder for ed-tech folks--despite all the dreams, platforms, and big promises...
It's not that ed-tech innovations can't realize these dreams, per se; it's that, in the end, the hopes attached to such dreams can be easily and utterly negated by the complex dynamics at work in school settings, where linear approaches to "problems" and "solutions" simply don't obtain. When new technology tools are adopted, ed-tech simply moves on to the next wave of inventions, rarely looking in the rear-view mirror to learn from this "negation of innovations" phenomenon. Society finds that schools are changed but never changing (plus ça change, plus c'est la même chose, as they say in France): education itself (real student learning and flourishing) still doesn't change much for the better, and the default response is to blame schools and complain again about the need to fix them.
I hope readers may find, from their own perspective and roles "inside" or "outside" education systems, some useful takeaway in these reflections...
The takeaway I want to propose today is that, the many "risks and unintended consequences" associated with artificial intelligence notwithstanding, the biggest risk for us all is falling prey to misleading assumptions about educators being irrelevant or redundant as the wave of change crashes over schools (or educators letting themselves be fooled into thinking, deterministically, that they are irrelevant and redundant...).
Conversely, the biggest reward may not have anything to do with AI functionalities. Instead the biggest rewards might actually flow from educators acquiring enhanced capacity for driving powerful learning outcomes by becoming design partners and mutual, active school change agents in the context of technological innovation.
Being an educator vs. being a tech developer…we need to recognize that this is a deceptively reductive duality compared to the reality educators and ed-tech developers operate in. Big tech has only recently started learning that design and development are no longer "laboratory processes" (as at Bell Labs, for example, back in the day) but are intrinsically tied to data that comes from user habits, end-user contexts, user needs, users' elective choices and decision-making processes, and a complex ecology of IoT dynamics. But more than that, most ed-tech developers also need to stop seeing themselves as merely developers or engineers or wild dreamers. What are arguably the most reliable insights into school dynamics and change leadership suggest that "tools" and "technologies" have no intrinsic power or value with regard to meaningful learning and school change outcomes independent of other complex organizational factors and implementation dynamics.
Educators already possess, or are best positioned to quickly acquire and understand, these factors and dynamics.
Educators are best equipped to understand the organizational and leadership competencies required to make the adoption of new tools and solutions truly meaningful for society, such that ed tech innovations actually help drive effective, sustained, and iterative improvements to schools and to students' greater mastery of satisfying, relevant, and socially meaningful learning objectives.
Some educators step up to this challenge; some retreat from it...
Many educators and reform thinkers are aware of the need for new leadership competencies, but more educators, and more education partners (B2B businesses, technology dreamers and developers, civic groups...), need to step into this complexity and engage with it constructively.
When educators add to their expertise, skill, and practice in fostering organizational capacity, and when they engage more intentionally in becoming active and inquisitive learners (including embracing technology learning and dreaming), embodying the learning values they apply as teachers, they will play a central role in defining and building the schools of the future, no matter how fast the shape of schools and the logistics of teaching shift, evolve, and change. They just need to be in for riding the wave, after they first get out in front of it!
When ed-tech developers engage in ongoing and mutually-informing partnerships with educators, they can help educators grow a more adaptive, innovation-oriented mindset. At the same time, these partnerships will help ed-tech developers and their philanthropic sponsors learn from educators about the "getting-the-job-done challenges" they are tasked with (to borrow Michael Horn's phrasing):
"inBloom always talked about the need for its service, but the bulk of its potential customers did not have the motivation to adopt it--a big reason why needs-based product development lacks the power of and is fundamentally different from a jobs-to-be-done understanding of the world" (Horn).
These kinds of jobs-to-be-done challenges unfold in real time, in complex organizational settings, and in the context of dynamic and shifting conditions (diverse stakeholders and competing interests...evolving organizational beliefs, attitudes, norms, and skill levels...competing system-wide priorities and shifting external pressures...).
Indeed, ed-tech developers and vendors may find that simply by partnering more deeply with educators, engaging in more sustained, deliberative, and meaningful design-, innovation-, and resource-focused interactions with administrators, teachers, specialists, and resource managers, and by growing their understanding of school governance conditions and dynamics, they can significantly improve both the early and the iterative stages of design learning and planning for the "tools" they dream up, build, and update.
Breaking out of professional silos for the common aim of educating all of our children, and breaking out of the mindsets that implicitly pit educators and innovators on opposing sides of distorting and limiting dichotomies, does not, of course, address or allow us to anticipate all of the "risks and unintended consequences" that come with disruptive change. But it might provide a critical, formative, and very constructive influence, not only in terms of risk mitigation but in terms of seeing learning outcomes advance more meaningfully as new tools and solutions flood the horizon.
If you are an educator reading this, you might want to think about next steps in terms of getting in front of the present technology wave. Don't assume you can know now what that entails or where any such intention will lead, but let your own curiosity, aptitudes, and vocational ideals guide you. For starters, try reading Matthew Lynch's "5 examples of artificial intelligence in the classroom" (see Sources below). Next, try piloting some tools and apps, and start having discussions with colleagues that segue into envisioning the tools, apps, and functionalities that would bring what you know about deep learning and expert teaching to life (or that perhaps inspire new ideas).
If you are a K-12 B2B ed-tech developer wondering where to start, try building deeper and longer-term engagements and partnerships with education foundations that partner directly with specific school districts or school sites, or directly engage district or school-site managers. Identify a few school systems or locales that represent distinct stages of school reform and innovation challenges and accomplishments, and that will give your enterprise insight into the challenges unique to serving different demographic groups. Assign each team member to learn about school governance and education policy concerns and flash points within these distinct school settings. Engage teachers and school managers at all stages of design and implementation discussions when feasible. Encourage enterprise team members to spend most of their initial time with school leaders just observing and listening (with an eye toward building mutually informing and deliberative partnerships and networks over time). This work is not just about closing a deal; it is about building larger alliances around well-coordinated work that is guided by meaningful educational goals and aspirations for student learning and student fulfillment, informed by realistic understandings of school system complexities, and responsive to hard-to-anticipate but rapidly changing vocational demands and to evolving social needs, aspirations, crises, and challenges.
Sources:
Bughin, Jacques, et al. "Notes from the AI frontier: modeling the impact of AI on the world economy." McKinsey Global Institute, 4 September 2018, mckinsey.com/…/notes-from-the-ai-frontier-modeling-the-impact-of-ai-on-the-world-economy. Accessed 3 June 2020.
Fullan, Michael. Change Forces: probing the depth of educational reform. Falmer Press, 1993.
Fullan, Michael. The New Meaning of Educational Change. 5th ed. Teachers College Press, 2016.
Haimson, Leonie. "Student education data collecting initiative inBloom puts sensitive information at risk" (Op-ed). NY Daily News, 15 March 2013, https://www.nydailynews.com/new-york/inbloom-education-data-cloud-jeopardizes-lives-new-york-students-article-1.1288189. Accessed 3 June 2020.
Horn, Michael B. "inBloom's collapse offers lessons for innovation in education." Forbes, 4 December 2014, https://www.forbes.com/sites/michaelhorn/2014/12/04/inblooms-collapse-offers-lessons-for-innovation-in-education/. Accessed 3 June 2020.
Koperniak, Stefanie. "Bringing artificial intelligence and MIT to middle school classrooms." MIT News, 30 December 2019, http://news.mit.edu/2019/bringing-artificial-intelligence-and-mit-middle-school-classrooms-1230. Accessed 2 June 2020.
Lynch, Matthew. "5 examples of artificial intelligence in the classroom." The Tech Advocate, 22 August 2017, https://www.thetechedvocate.org/5-examples-artificial-intelligence-classroom/. Accessed 2 June 2020.
Marr, Bernard. "How is AI used in education--real world examples of today and a peek into the future." Forbes, 25 July 2018, https://www.forbes.com/sites/bernardmarr/2018/07/25/how-is-ai-used-in-education-real-world-examples-of-today-and-a-peek-into-the-future/. Accessed 2 June 2020.
McCambridge, Ruth. "Gates’ $100M philanthropic venture inBloom dies after parents say 'no way'." Nonprofit Quarterly, 22 April 2014, https://nonprofitquarterly.org/gates-100m-philanthropic-venture-inbloom-dies-after-parents-say-no-way/. Accessed 3 June 2020.
Piech, Chris, and Lisa Eisenstein. "A vision of AI for joyful education: here's how we can avert the dangers and maximize the benefits of this powerful but still emerging technology" (blog editorial). Scientific American, 26 February 2020, https://blogs.scientificamerican.com/observations/a-vision-of-ai-for-joyful-education/.
Rauf, David. "Artificial intelligence in k-12 education: unintended consequences lurk, report warns." Education Week, 5 May 2020, http://blogs.edweek.org/edweek/DigitalEducation/2020/05/ai_report_blog.html. Accessed 3 June 2020.
Ravitch, Diane. The Death and Life of the Great American School System: how testing and choice are undermining education. Basic Books, 2016.