🧠 How ChatGPT Impacts Student Learning—and What Parents Should Know

Will AI vaporize the brains of our kids?

I recently stumbled across a fascinating Edutopia article titled "How AI Vaporizes Long-Term Learning." It highlights research involving 1,000 students who used AI tools—like ChatGPT or a specially designed AI tutor—to prepare for a math exam.

In the study, students either learned using traditional methods (like notes and textbooks) or had access to AI-powered tools, including a basic version of ChatGPT (GPT Base) and a more structured tutor (GPT Tutor).

Why did this catch my attention? Because, let's be honest, AI is at the center of every educational conversation right now. While no one has all the answers, districts, educators, and parents are all doing their best to create thoughtful, responsible frameworks.

Last month, I attended an IB workshop that offered guidance for teachers on integrating AI into student support for IB essays, the Personal Project, and so on. So I decided to review the original research behind the Edutopia article to see how it could inform the advice we offer to educators—and the conversations we have with parents and caregivers. I know how urgently schools and families are grappling with this. Every district is trying to build the plane while flying it—and no one has the full map yet.

šŸ” What the Research Found

The original research cited in the Edutopia article is titled "Generative AI Can Harm Learning." At first glance, the results look exciting:

Students using GPT Base performed 48% better on practice problems.
Those using GPT Tutor saw an even bigger boost—127% better than the control group.

Sounds amazing, right? But here comes the plot twist:

🚫 When AI was removed during the final exam, GPT Base students scored 17% worse than students who never used AI at all.
🚫 Even GPT Tutor users didn't outperform the traditional learners.

In other words, AI boosted short-term results—but didn’t lead to lasting learning.

āš ļø What Went Wrong? A Few Key Patterns

🩹 Overreliance on AI as a Crutch

Instead of solving problems themselves, many students—especially those using GPT Base—simply asked the AI for answers. When the training wheels came off, their performance suffered.

It’s a classic case of ā€œHelp me… but don’t make me think too hard.ā€

🎭 False Confidence, Real Gaps

Many students used AI to get answers rather than engage with problem-solving. Instead of thinking through problems, students in the GPT Base group often copied answers directly from ChatGPT, and when the AI was taken away, their test scores dropped 17% compared to students who never used it.

What's more, surveys showed that students believed they learned more with AI. This gap between perceived understanding and actual learning is dangerous: it builds overconfidence, not competence. Even GPT Tutor users reported feeling confident, despite not performing any better than the control group.

āŒ AI Isn’t Always Right

In the study, GPT Base provided correct answers only 51% of the time; in other words, nearly half of its math answers were wrong.

  • Students who blindly trusted AI responses without checking for mistakes ended up reinforcing incorrect knowledge. Especially in math, where precision matters, ChatGPT proved... let's say, a bit overconfident.

  • Many students, particularly those struggling academically, trusted those wrong answers without checking them.

So not only did they get incorrect answers—they also reinforced misunderstandings.

📉 Weaker Students Lean on AI More—But the Skills Don't Stick

Weaker students initially benefited the most from AI, especially when grappling with tricky content. But when AI was removed?

💥 The wheels came off.

Without AI’s support, these students struggled the most, revealing that the concepts hadn’t really sunk in. It’s like practicing with a tutor who holds your pencil—and then being asked to draw the picture on your own.

AI temporarily closed the gap, but it didn’t build long-term skills. And unfortunately, the most vulnerable learners paid the biggest price.

šŸ” AI Doesn’t Permanently Reduce the Skill Gap

Researchers used the Herfindahl–Hirschman Index (HHI) to measure grade disparities. During AI-supported practice, weaker students caught up. But by the final exam, the gap reappeared.
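To make the HHI concrete: it sums the squared shares of a total, so an even distribution scores low and a concentrated one scores high. Here is a minimal sketch in Python with made-up scores for illustration (the paper's exact computation may differ in detail):

```python
def hhi(scores):
    """Herfindahl-Hirschman Index: sum of squared shares of the total.

    Ranges from 1/n (all n values equal) up to 1 (one value holds
    everything), so a lower HHI means a more even distribution.
    """
    total = sum(scores)
    return sum((s / total) ** 2 for s in scores)

# Hypothetical exam scores: tightly clustered vs. widely spread.
even = [80, 82, 78, 81]
uneven = [95, 40, 90, 35]

print(hhi(even) < hhi(uneven))  # prints True: the even class has a lower HHI
```

In the study's terms, AI-supported practice pushed the class toward the "even" case, but the final exam looked like the "uneven" case again.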

💡 The takeaway: AI didn't permanently strengthen learning—it just provided temporary scaffolding. Even GPT Tutor, with its hint-based approach, didn't improve final test scores. While it reduced overreliance, it still didn't lead to sustained skill development.

🧠 Why This Matters: Learning How to Learn

These findings underscore something essential: we must teach students how to learn, not just how to get answers. Learning isn't just about completing the task—it's about building the thinking muscles that help kids become curious, independent problem-solvers. That's the skill they'll need in a future filled with evolving tech, new challenges, and complex decisions.

And it all starts with a question.

💡 Curiosity Sparked the Greatest Discoveries

Throughout history, some of the most groundbreaking discoveries began not with an answer—but with a question.

🧬 Rosalind Franklin asked:
"What is the structure of DNA?"
Her X-ray imaging revealed the double helix—changing science forever.

💻 Steve Jobs asked:
"Why can’t computers be beautiful and easy to use?"
That question gave us the Macintosh, the iPhone, and a whole new era of design.

🌌 And of course, Einstein reminded us:
"Never stop questioning."

AI is here to stay—but how we teach students to use it makes all the difference. We don’t need to resist AI—we just need to stay intentional. Let’s guide students to become curious, thoughtful learners, not just faster answer-seekers.

Because at the end of the day, a sharp question beats a quick answer every time.

💡 Parent | Caregiver Coaching Notes

Here’s how you can support your child in developing stronger thinking habits—even when AI is part of the learning process:

🎯 Embrace the Question, Not Just the Answer

Whether they’re using ChatGPT or solving a math problem the old-fashioned way, remind your child that struggle is part of the learning process.

Try the "3 Whys" Strategy

Here’s a simple, powerful questioning routine to encourage reflection and reasoning (far away from screens):

1. Why do you think that’s true?
2. Why is that important?
3. Why might someone else think differently?

Use it at the dinner table, in a journal, or after homework—it's a great way to build metacognition. This strategy is adapted from a thinking routine used by educators at Project Zero, a Harvard University initiative.

The original research article:
Bastani, Hamsa, Osbert Bastani, Alp Sungu, Haosen Ge, Özge Kabakcı, and Rei Mariman. 2024. "Generative AI Can Harm Learning." University of Pennsylvania.


I would love your feedback
