Thursday, October 30, 2025

Powerful testimony from parents Christine Tadler, Jim Baker, and Liat Olenick of Climate Families NYC, on why DOE should not be using AI in the classroom

Below is the testimony of Christine Tadler, the parent of a Brooklyn second grader, and of Jim Baker, a parent of three NYC public school students, both of whom spoke at the Panel for Educational Policy meeting last night in opposition to the use of AI in the classroom.

Also below is the written testimony of Liat Olenick, a NYC teacher and a leader of the group Climate Families NYC, that was sent to PEP members before the meeting.

Liat recently published an excellent op-ed in the Daily News, calling for a two-year moratorium on the use of AI in schools. Thankfully, all three AI contracts, for EPS/Amira, Lumi and Kiddo, were voted down by the PEP last night. But the organizing work of parents and teachers to oppose the powerful forces pushing AI must continue.

___

My name is Christine Tadler, I’m the parent of a 2nd grader in District 15, and I’m here to urge the panel to reject the proposed EPS contract, as well as the expanded use of AI in the classroom in general.

It feels rather dystopian to be standing here, urging a decision-making body not to turn my child’s literacy education over to a robotic avatar, but perhaps not surprising.

For decades, we’ve invited Big Tech’s untested products into our classrooms based on the promise that they would revolutionize learning.

But to quote Jessica Grose of The NY Times, “Companies never had to prove that devices or software helped students learn before those devices had wormed their way into America’s public schools.”

But learning outcomes haven’t improved. We’ve seen declining test scores, drops in overall literacy, and alarming trends in college readiness and performance.

And now we find ourselves here again, considering whether or not to buy what Big Tech is selling, except this time it’s AI.

At the present moment, AI poses a clear threat to student learning and wellbeing. There is insufficient evidence for student use of AI to support genuine learning gains, despite a massive marketing push to position these products as essential to students’ future livelihoods.  

With regard to student safety:

Apps like EPS Reading Assistant introduce gross privacy violations, recording children’s voices and guzzling student data to train their own models and feed to advertisers, all without the knowledge or consent of parents.

Meanwhile, young people using chatbot “companions” are vulnerable to psychological and emotional addiction, yet we consider normalizing the use of chatbots as human substitutes in the name of educational “progress.” AI “relationships” continue to trigger mental health crises, human relationship breakdowns, and in the worst cases, attempted and completed suicides.

Given these realities, it’s alarming that we’re even considering introducing this fast-evolving and dangerously opaque category of tech into our classrooms.

With regard to efficacy:

Nowhere on earth are young readers thriving because they’ve been isolated and made to read to a glitchy robot. Never has removing the human element led to improved learning outcomes.

These chatbot tutors “listen”, but children don’t just need to be heard—they need to be understood. A teacher doesn’t just correct pronunciation. They notice emotion, attention, confidence and joy. Children thrive when education is rooted in relationships and human connection, not robotic correction.

In closing, I urge you again to reject this contract, reject AI mania, and prioritize student safety above all. Thank you.

---------- 

Dear Members of the Panel for Educational Policy, 

I am unable to attend the PEP in person this evening due to the time and location. However, I wanted to share the testimony I submitted, and note that I am submitting it as a leader in a parents’ climate organization, Climate Families NYC, which represents more than 2,400 parents across the city, the vast majority of whom are current public school parents. Here’s my testimony.

As a parent of a public school student, a teacher, and a climate organizer, I am horrified that the PEP is considering three contracts worth millions of dollars for new AI products in classrooms: Amira, Lumi and Kiddo. Forcing untested, predatory AI into classrooms is the last thing New York City schools should be doing.

Here’s why:

AI is torching the planet, and with it, our kids' futures. In fact, the Center for Biological Diversity just released a report this week showing how the data centers powering AI are undermining our national and international climate goals. AI uses a LOT of energy (and water). While we don’t know exactly how much carbon a particular use generates, because tech companies are keeping it a secret, we do know that AI is massively driving up energy demand, giving the fossil fuel industry a new justification for building out fossil fuel infrastructure and delaying the green energy transition.

Global scientists have made clear that we must cease new fossil fuel production and phase out existing fossil fuel use to prevent a truly catastrophic future marked by mass death, displacement, and famine. New fossil fuel infrastructure, or old infrastructure staying online or being repowered to meet increased demand, isn’t just locking in this hellish future for our kids; it’s also poisoning the air they breathe NOW, as xAI is doing in Memphis. This tech is driving us to our deaths, and it’s deeply immoral to foist it recklessly on our children.

Any education system that purports to care about children’s futures needs to categorically reject all contracts for energy-intensive AI products in schools. Of course, the industry as a whole is the problem, and school contracts are just a small part of demand; but the largest school system in the country taking a stance can make a difference, while also educating stakeholders on the ties between the climate crisis and AI.

There are other very compelling reasons for rejecting AI contracts in NYC schools, particularly when it comes to students using AI products. For one thing, the point of school is learning, and the evidence suggests that AI is doing the opposite, eroding critical thinking skills as well as writing skills. It’s still too early to know the long-term impacts of AI use on children’s brains, but common sense says it is profoundly irresponsible to introduce new corporate technology into classrooms without fully understanding its cognitive and pedagogical impacts on children. As a parent, I’m outraged by the idea of our school system giving tech companies free rein to experiment on our children in order to train their AI models or mine their personal data.

Bringing AI into schools is also a privacy disaster. Teachers and parents have reported “programs” being used in which kindergarteners (including immigrant children) are video recorded by an AI program, without parental consent. In an era when companies like Palantir are using biometric data from AI to help ICE track down and violently detain immigrants, allowing this tech into our diverse, multicultural classrooms endangers our students, their families, and school workers. Our children’s personal, biometric data should be shared with corporations only with parental consent, if at all, never by default.

The money for these untested, unproven, energy-intensive and dangerous AI products could be better used to hire more teachers, upgrade decaying school buildings, or provide wrap-around care services to children who need them. The PEP must reject these three contracts. Our kids deserve better than being the subjects of a surveillance experiment that will leave them a world on fire. --

Liat Olenick,  Climate Families NYC

---------

“I don’t think I have ever been this burnt out and demoralized, and part of this is because institutions are not taking our concerns about the pedagogical harms of widespread and uncritical adoption of generative AI and LLMs in higher education seriously.”

This is a quote from a NYC public school teacher.  They need our support tonight and in the future.  

Let me enumerate this evening my concerns about this contract, which would put AI in DOE public schools:

  1. Privacy: I echo the concerns of my fellow speakers. Can kids consent? Are parents meaningfully consenting? What about data sharing?
  2. Assessment exhaustion: Students already suffer from too much testing, and this product calls for 30 minutes of use a week (the same guidance as for i-Ready, which is already, frankly, too much).
  3. Developmentally inappropriate, period: An increasing body of research says that kids can’t meaningfully distinguish between chatbots and people, and some of these interactions lead to romantic relationships. A Guardian article published this week points out that researchers have identified 16 cases in the media this year of individuals developing symptoms of psychosis, losing touch with reality, in the context of ChatGPT use. In addition, there is the now well-known case of a 16-year-old who died by suicide after discussing his plans extensively with ChatGPT, which encouraged those plans. If this is Sam Altman’s idea of “being careful with mental health issues,” that’s not good enough. We needn’t dance on the bleeding edge of this technology.
  4. Weak evidence of efficacy: EPS’s best evidence shows only a modest effect size after an incredible amount of usage. Students who used it for 20-29 sessions did much worse than with a human tutor, and students who used it over 30 times did only marginally better. This is not a good return on investment.

Indeed, schools that have implemented it are told to have students use it 30 minutes a day. This is an astounding abdication of responsibility for learning to a machine.

Moreover: New Jersey’s state literacy working group says Amira has trouble with students who speak accented English, and this is the most linguistically diverse city in the country. Amira is available in English and Spanish; what about the other 154 languages spoken by English language learners in the DOE?

Invest in people, in teachers and in students, not in unproven technology. Let the students read whole books. Let teachers, not machines, teach. And let the students, not the machines, do the learning. Don’t approve this contract!

Thank you. Jim Baker

