Lessons for Education Providers From Other Sectors on AI Adoption and Innovation


Education companies seeking to integrate artificial intelligence into teaching and learning can draw insights from other industries that are wrestling with strategies to maximize the technology’s potential.

Some applications of the tech, and challenges in using it in innovative ways, look quite different in other sectors, but there are clear lessons that can be applied to the process of developing products for schools.

Liz Ngonzi serves on the board of directors for the American Society for AI, a nonprofit organization focused on promoting conversations about using artificial intelligence in ways that benefit society. Through that work, as well as her background as a professor and the founder and CEO of a social-impact nonprofit, Ngonzi has trained more than 7,000 professionals across six continents in generative AI.

About This Analyst

Liz Ngonzi serves on the board of directors of the American Society for AI and chairs its Ethics & Responsible AI Committee. As Founder and CEO of The International Social Impact Institute and a New York University adjunct assistant professor, she has trained more than 7,000 professionals across six continents in generative AI implementation and advised several tech companies on how to optimize their offerings for education and social impact. She is the former CEO of Afrika Tikkun USA, a role in which she supported 20,000 South African children and youth.

EdWeek Market Brief recently spoke to Ngonzi about AI’s implications and lessons that the education industry can learn from applications in other sectors of the economy.

Ngonzi discusses the challenges created by human-AI collaboration, the need for a mindset shift, and new iterations of AI already emerging in other industries that the education space may eventually see as well.

How does K-12 education compare to other industries when it comes to AI adoption?

Everyone thinks everyone else has figured it out. Education is behind, but everyone [else is] trying to figure it out, too. It’s hard coming out of COVID. It’s hard navigating all of the different variables over the last few years, and now, we talk about bringing in this new technology and all that comes with it.

Everyone thinks that we’re starting from scratch, but we’re not. This is just the next evolution of technology that we have available to us. A lot of what we’ve already seen, a lot of policies like GDPR [a sweeping data-privacy regulation adopted by the European Union] or other frameworks that we use to adopt new technology, we can actually leverage those. I don’t think it’s that big of a chasm in terms of trying to come on board with these new technologies, understanding that we already have frameworks in place that we can adopt from the public and private sectors.

What industries do you see leading AI right now?

In the financial sector, we’re seeing AI transform everything from fraud detection to customer service and product development. Healthcare is particularly noteworthy, with AI revolutionizing everything from medical imaging analysis to patient care. AI systems can now detect early-stage conditions such as brain tumors and Alzheimer’s with remarkable precision.

I’ve personally opted for AI-reviewed mammograms over the last couple of years, recognizing their ability to detect patterns and identify cancer earlier and more accurately than the human eye, while still maintaining human validation of results. What’s particularly exciting is how AI is streamlining healthcare operations by automating administrative processes, allowing medical practitioners to focus more on patient care.

By 2025, we expect to see about 55 percent of companies using AI, with an additional 45 percent exploring implementation. This widespread adoption across sectors indicates that AI is no longer just a competitive advantage – it’s becoming a fundamental business necessity.

What’s the most important thing for education companies to consider as they explore AI?

What’s so important is that organizations understand the guardrails they need to put in place as they’re starting to implement AI. Looking at generative AI, do you use tools that are already in the market? Or do you train your own models? What can you afford to do? What kind of risk profile do you have as an organization? That’s the technical part of it.

It’s also important to understand that you have people who are going to have to be reskilled and some people who are going to be left behind. We’ve seen that in every digital revolution we’ve had. There are also the people who are able to really embrace this and move forward with it. So, it’s very important for vendors, or the school systems themselves, to make that investment in retooling their people’s skills.

How does the application of AI reframe the questions that education organizations need to ask themselves?

It speaks to what is education [supposed to accomplish]? What are we educating for when someone can go and find information with AI, and AI can do some of the basic functions that you’d be training someone to be able to do in school? What do we want a graduate to look like when they come out? Are we going to focus more on critical thinking? Are we going to look at more creativity and entrepreneurship, and other aspects that are harder to replicate using AI?

School leaders and district officials still largely approach AI with wariness. What can vendors do to assuage some of these concerns?

We have to have a really good PR campaign that helps to communicate about AI in a way that people can understand. We’ve been taught to fear AI first and foremost. So it’s about being able to speak about the benefits of it and addressing the challenges that educators have.

The other part is conveying it at a human level. A lot of times, when you read about AI, it’s communicated at sort of a PhD level, or it’s communicated in ways that make it feel inaccessible to everybody – yet it is accessible. As someone who’s programmed many systems, I can tell you, using generative AI is super simple. Whether you can communicate that is what will determine your ability to be effective. Helping demystify it and providing an easy way in for people to start using it opens up the possibility for them to adopt it more readily.

Are there common pitfalls you see from those developing AI, whether within education or not?

I’m surrounded by plenty of brilliant people who know how to build these systems, and I’ve met them in a lot of different contexts. They’re often really excited about the features and functions and the coolness of the latest and greatest, but their connection to the actual user is oftentimes limited. They’re creating this technology because they’re excited about creating the technology — but then they think about the user [only] later.

We need to be a lot more human-centered when we’re building these systems. You’re really bringing in the subject-matter experts, the domain expertise, to help inform how this technology is being built, so that you have better adoption, and so that you not only address the challenges users have and maybe improve the processes they’re growing out of, but maybe even imagine better ones. It’s very important to bring those folks into the conversation more.

How do companies bridge that divide and develop AI products that are more “human-centered”?

In this new age, K-12 education must change. If we’re developing tools, and we’re rethinking processes, then we need to have the technology companies working with the folks who are actually in the classroom, the folks who are administrators, even at the school board level.

We need to work together on this because this isn’t going to be solved by just the educators. It’s going to have to be in partnership with the folks who are creating the tools and enabling environments for us to actually improve the way that we deliver education and what we’re teaching.

What’s an exciting AI-driven innovation that you’ve seen at work in other industries?

Folks are focused on generative AI, but now agentic AI is where things are headed. That’s a big trend for 2025.

When we talk about generative AI, it’s being able to do a specific task. You’re interacting with that particular tool, and you’re prompting it. You determine the output that’s going to ultimately be shared with your audience. With agentic AI, it’s working on autopilot on multiple processes. You could automate, for example, a self-paced course, where students would interact completely on their own with this particular platform.

What other applications do you see for agentic AI?

You could have it quiz the student in advance to understand what the baseline understanding is for them and then customize the course content accordingly. Then the student goes through it, it’s graded, and there’s some sort of assessment at the end that summarizes where the student is and where they should go moving forward.

Agentic AI can completely cover that whole process. What’s important to differentiate between [this model] and something that’s just automated is that [agentic AI] can train on the students’ performance and generate its own content to be able to come up with insights into what you want the course to look like. So, what happens is, you remove the human from the loop with agentic AI, which is great in some ways — if you trust it. But then…
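The pipeline Ngonzi describes (baseline quiz, customized content, automated assessment, next-step recommendation) can be illustrated with a toy script. Everything below is a hypothetical sketch: the function names are invented, and the generative steps that a real agentic system would perform are stubbed out with fixed rules.

```python
# Hypothetical sketch of an agentic self-paced course pipeline.
# A real system would call a generative model at each step; here the
# "model" decisions are replaced with simple rules for illustration.

def baseline_quiz(student_answers, answer_key):
    """Score the diagnostic quiz to estimate the student's starting level."""
    correct = sum(a == k for a, k in zip(student_answers, answer_key))
    return correct / len(answer_key)

def customize_content(baseline):
    """Pick a difficulty tier from the baseline score (stand-in for a
    generative step that would author the material itself)."""
    if baseline < 0.4:
        return "remedial"
    if baseline < 0.8:
        return "core"
    return "advanced"

def run_course(student_answers, answer_key, final_score):
    """Run the whole pipeline end to end, with no human in the loop."""
    baseline = baseline_quiz(student_answers, answer_key)
    tier = customize_content(baseline)
    next_step = "advance" if final_score >= 0.7 else f"review {tier} material"
    return {"baseline": baseline, "tier": tier, "next_step": next_step}

print(run_course(["a", "c", "b", "d"], ["a", "b", "b", "d"], final_score=0.65))
```

The point of the sketch is the hand-off: each stage’s output feeds the next with no human checkpoint, which is exactly the “remove the human from the loop” trade-off Ngonzi flags.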



This article was originally published by marketbrief.edweek.org.
