How ChatGPT Is Influencing Your Brain
"Thinking is the hardest work there is, which is probably why so few engage in it."
Recently, I came across a study on the impact of using ChatGPT, and the effects might be bigger than you think.
The Hidden Cost of ChatGPT
ChatGPT is a powerful tool that can answer many of our questions. It has made accessing information easier than ever before, but this convenience comes at a cost you should be aware of.
A few months ago, a group of researchers conducted a study on how using tools like ChatGPT influences our thinking and our performance when completing tasks such as writing an essay.
They recruited N = 54 students from five different universities, including MIT and Harvard, and divided them into three groups of n = 18 each.
Group 1 was not allowed to use any external tools and had to rely solely on their own thinking.
Group 2 was allowed to use the internet and search engines, but not large language models (LLMs) such as ChatGPT.
Group 3 was allowed to use ChatGPT (GPT-4o) for support.
The groups stayed the same across three sessions, so each participant wrote three essays on assigned topics with the same type of support every time.
The Results
The results showed that the groups differed significantly in their neural connectivity patterns (p < .001). Brain connectivity decreased as reliance on external tools increased.
Participants in the ChatGPT group showed markedly weaker neural connectivity and under-engagement of key networks, including those responsible for memory recall, attention regulation and error monitoring.
By contrast, the brain-only group showed stronger activity in memory-related areas and regions involved in visual processing.
The researchers also tested how well participants could quote from their own essays and how much ownership they felt over their work.
More than 80% of the brain-only and search-engine groups could correctly quote their own essays, whereas over 80% of the ChatGPT group could not. This pattern was mirrored in the subjective feeling of ownership, which was highest in the brain-only group and lowest in the ChatGPT group.
The essays themselves differed in structure and style as well. While the brain-only group used more personal examples and original ideas, the essays written with the aid of ChatGPT were more homogeneous. One plausible explanation is that ChatGPT generated most of the content and the users mainly curated it.
Cognitive load, the amount of mental effort required to process information, hold it in working memory and complete a task, differed between the groups too. The brain-only group showed the highest cognitive load because they had to generate the ideas themselves. The ChatGPT group showed much lower cognitive load, as participants outsourced a large part of the effort to the AI.
The search-engine group sat in between. Participants in this group still reported a strong feeling of ownership and showed brain activity consistent with processing and integrating new information. Their cognitive load was moderate.
Overall, the ChatGPT group performed worse than both the brain-only and the search-engine group on all levels: neural, linguistic and in the scores their essays received.
In a fourth session, however, the researchers changed the rules. They removed ChatGPT access from the former AI group and turned them into a brain-only group, while the original brain-only and search-engine groups were now allowed to use ChatGPT.
Participants from the former brain-only group, who now used ChatGPT, stayed neurally active and could remember their essays better. In this case, ChatGPT functioned as a tool rather than a crutch.
But participants from the former AI group, who suddenly had no support, maintained their reduced, weakly connected brain activity and continued to struggle to cite their own work.
This effect was not just momentary; it resembled a form of cognitive debt.
That cognitive debt was the price paid for the short-term convenience of outsourcing thinking to AI. When the tool was taken away, the brain was less prepared to do the heavy lifting on its own.
So, How Should You Use AI?
The big question remains: is using ChatGPT making us dumb? The short answer is: it depends.
If you introduce ChatGPT too early and rely on it too heavily, the integration of information can be disrupted because too much of the cognitive processing is outsourced to AI. Holding back on AI at the beginning can support memory formation and help you reactivate your knowledge when you use AI later on.
If AI is introduced later, metacognitive engagement is higher because the already integrated knowledge can then be compared with the answers of the AI. This allows you to reflect on what you already know and where it needs refinement.
So the order and purpose of using these tools matter if you want to use them efficiently and to your advantage. If you use AI too early, your thinking becomes more superficial and biased; you risk importing its blind spots into your own decisions and undermining your own independent thinking.
AI should be treated as augmented intelligence, not as a replacement. It can speed you up, but it can never take over responsibility for what you think, write and decide. Humans need to stay in the loop and work with AI, not be replaced by it.
So it's not about avoiding AI. It's about avoiding the trap of letting it think instead of you.
Train your brain first, then let AI challenge and extend your thinking!
Reflection starts with dialogue.
If you’d like to share a thought or question, you can write to me at contact@lucalbrecht.com
Thinking from Scratch
by Luc Albrecht
Exploring how we think, decide and create clarity