A young girl using a laptop illustrated with holographic internet data and symbols.

Is early childhood education ready for AI?

May 15, 2024
metamorworks // Shutterstock

This story was produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education, and reviewed and distributed by Stacker Media.


Interest in artificial intelligence has surged among K-12 and college educators, who are looking at ways it can be used to support both students and teachers. But in the early childhood arena, those discussions are still in the beginning stages. The Hechinger Report asked Isabelle Hau, executive director of the Stanford Accelerator for Learning, to share her perspective on the potential benefits and challenges of AI in early learning. The conversation below has been edited for length and clarity.

Interest in AI has obviously surged over the past couple of years in K-12, for both teachers and students. With early childhood, the use of AI may be a little less obvious. Have you noticed that trend in early childhood classrooms — are teachers interested in using AI or teaching about it?

Hau: I'm observing some activity in a few areas. One is interest in novel forms of assessment, an area that has been a big pain point for early childhood teachers for a long time, because observational assessments take a long time. There are some innovations starting to materialize that make assessments less visible, or maybe, at some point, invisible. So there is discussion around how to leverage, for example, computer vision or some form of voice input in classrooms, or gamified approaches that are AI-based.

Are there any specific ways you're seeing AI technology emerge in early childhood classrooms?

Hau: At Stanford, we have one super interesting project that is not necessarily in a classroom but could be used in a classroom context. It's a tool my colleague, Dr. Philip Fisher, has developed called FIND that looks at child-adult interactions by taking video of those interactions. It is very expensive for humans to look at those videos and analyze the special moments in those interactions. Now, artificial intelligence is able to at least take a first pass at those interactions in a much more efficient manner. FIND is now an application for early childhood educators; initially, it was mostly for parents.

Two of my colleagues, one in the school of medicine and one at the school of education, have partnered to build Google Glass-based devices that children who have challenges recognizing emotions can wear. Based on the advances that are happening with AI, especially in the area of image recognition, the glasses help young children detect emotions from the adults or other young people they are interacting with. The feedback, especially from parents and families of young children, is quite moving, because for the first time, some of those young kids are able to actually recognize the emotions of the people they love.

Others have been working on language. Language is a complicated topic because, in the U.S., we have more and more children who speak multiple languages. As a teacher, it's very complicated. Maybe you're bilingual or trilingual at best, but if you have a child who speaks Vietnamese and a child who speaks Mandarin or Spanish, you can't speak all of those languages as a teacher. So how do we effectively support those children, who have huge potential to thrive, when they may not be proficient in English when they arrive in the classroom? Language is a really interesting use case for AI.

When you look up AI tools or products for early educators online, a lot comes up. Is there anything you would be cautious about?

Hau: While I'm excited about the potential, there are lots of risks. And here we are speaking about little ones, so the risks are heightened. I'm excited about the potential for those technologies to support adults – I have a lot of questions about exposing young children to them.

For adults, the area that's very confusing right now is privacy. No teacher should enter any student information that's identifiable into any of those systems, especially if they are part of a district, without district approval.

That information should be highly private and is not meant to go into a system that seems innocuous but is, in fact, sharing information publicly. There are huge risks associated with that: the feeling of intimacy with a system, when that intimacy doesn't exist. It's a public place.

And then one concern is bias. We've done some research at Stanford on bias in those systems, and we have shown that systems right now are biased against multilingual learners. I can see that myself, as a non-native English speaker. When I use those systems, especially when I use voice, they always mess up my voice and accent. These biases exist, and we need to be very mindful that they do. Biases exist everywhere, but they certainly do exist in [AI] systems, and we have proven this in multiple ways. And then I also have huge concerns about equity, because right now some AI systems are paid and some are free.

Are there any other ways you could see AI used to fill a need in early childhood?

Hau: Right now, a lot of parents are struggling to find care. You have people who are providing care – it could be center-based, it could be home-based, a nanny, preschool, Head Start; you have all these different types. And then you have families. The connection between the two is a mess right now. Of course it's a mess because we don't have enough funding, we don't have enough slots, but generally, it's a mess. This is an area where, over time, I'm hoping there will be better solutions powered by technology.

If I want to dine tonight at a restaurant in Palo Alto, booking a table is really easy. Why don't we have this for early childhood? 'I'm a low-income parent living in X, and I'm looking for care in French, and I need hours from 8 to 5,' or whatever it is. It would be really nice to have [technology] support for the millions of parents who are trying to find solutions like this. And right now, it doesn't exist.

Do you have any tips for teachers who want to learn more about AI programs to use in class?

Hau: For safety, in particular, I really like the framework the EdSAFE AI Alliance has put together. It's mostly oriented toward K-12, but I think a lot of their recommendations on when it is OK to use AI and when it is not are very clear and very teacher-friendly. There are some great resources at other organizations, like TeachAI or AI for Education, that I really like. At Stanford, we partner with those organizations because we feel this is an effort that needs to be collaborative, where research needs to be at the table. We need to build coalitions for effective, safe, and equitable use of those technologies.
