Is AI Going To Rot Our Brains?

By Layne A. Gritti, DO, Adult, Addiction, and Perinatal Psychiatrist

I have a strong interest in Artificial Intelligence (AI) technology for reasons many people share: it's a very cool new technology that's easy to use and makes so much more knowledge accessible. It helps me refine my writing (like in this blog post) to get my point across more effectively. It takes notes for me during appointments, allowing me to be more present with the patient I'm speaking with. It helps me search every available medical journal at once to answer a specific patient question. It helped me find a photographer for specific dates, times, and prices so I didn’t have to search every local website (this saved me hours of my life!). It generates images that help me visualize how new curtains will look in my house. It can even do a color analysis!

But as with any transformative technology (think of the radio, TV, computer, calculator, internet), there's always concern about what is lost when we rely on a new tool to do work we used to do in our brains. These concerns range widely, from cognitive function to environmental impact to the loss of genuineness when everything online (or every email) is overly polished.

The Cursive Writing Parallel

When was the last time you... did long division on paper? ...went to the library to look up the answer to a question? ...wrote a letter to a friend?

I always think of cursive as the perfect example here. While cursive has made a comeback in recent years, with an increasing number of states requiring it to be taught, there was a significant period when it was largely abandoned. The 2010 Common Core Standards removed cursive requirements, and many schools stopped teaching it entirely.

This is not just about nostalgia. Research has demonstrated that learning cursive handwriting activates specific neural pathways and brain regions that typing simply doesn't engage. Studies using brain imaging show that cursive writing activates areas of the brain involved in thinking, language, and working memory to a greater extent than keyboard typing. Cursive writing also helps develop "functional specialization" in the brain: the capacity of brain regions to work together efficiently, integrating sensation, movement control, and thinking.

Is this change just a natural part of the evolutionary process, or have we lost something vital?

Will young brains that never learn cursive be unable to reach their full potential?

The Science Behind AI and Cognition

Recent research reveals a complex picture of AI's impact on our cognitive abilities. A study from MIT led by research scientist Nataliya Kosmyna found concerning evidence that excessive reliance on AI tools can lead to cognitive atrophy: a decline in core skills such as cognitive effort, metacognitive engagement (the awareness of one’s own thought processes and the patterns behind them), and analytical thinking, along with increased digital fatigue, loneliness, and fewer interpersonal relationships. This follows the brain's "use it or lose it" principle: when we don't regularly exercise certain cognitive muscles, they weaken.

The concern is particularly acute for developing brains. Children and adolescents who rely heavily on AI for schoolwork may miss crucial opportunities to develop independent reasoning and problem-solving skills.

However, the relationship isn't simply linear. Evidence shows that moderate AI use doesn't significantly impact critical thinking, but excessive reliance leads to diminishing cognitive returns. The key word here is balance.

Cognitive Offloading: Tool or Crutch?

Cognitive offloading occurs when we externalize mental processes to technology. This isn't necessarily a bad thing. GPS navigation or calculators can free up mental resources for more complex tasks. But when AI tools consistently handle tasks that require deep thinking, we risk cognitive atrophy.

Consider the implications: when individuals consistently turn to AI for routine cognitive tasks, they miss opportunities to practice and refine analytical abilities. The result could be a population that is efficient at simple tasks in the short term but less capable of complex ones over time. This is the trade-off that too much technological innovation makes me think about.

The Neuroplasticity Factor

Here's where the story gets more hopeful, because the human body is amazing. The human brain has a remarkable ability to adapt, called neuroplasticity, which means we're not locked into cognitive decline. Our brains can form new neural pathways and modify existing ones throughout our lives. To simplify: you absolutely can teach an old dog new tricks! The key is intentional engagement.

When we use AI as a complement rather than a replacement for human cognitive skills, we can actually enhance our abilities. Interactive games and brain-training activities have been found to improve memory, attention, and problem-solving skills, particularly in older adults.

Evolution or Loss?

So should we all ditch our smartphones and return to flip phones or snail mail? If only it were so easy to choose one or the other! Every technological advancement has required us to make trade-offs. We've gained unprecedented access to information and computational power, but we may have lost some capacity for sustained attention and deep reflection.

The question isn't whether we're evolving—we clearly are. The question is whether we're evolving in directions that serve our long-term wellbeing and cognitive health.

Finding the Balance

The research points to several strategies for maintaining cognitive health in an AI-enhanced world:

  • Intentional Use: Rather than reflexively turning to AI for every task, we can be more deliberate about when and how we use these tools. Use AI to enhance your capabilities, not replace them entirely.

  • Cognitive Cross-Training: Just as physical fitness requires varied exercise, cognitive health benefits from diverse mental activities. Continue engaging in tasks that require deep thinking, problem-solving, and creativity.

  • Educational Awareness: Understanding how AI affects cognition can help us make better choices about when to engage our own mental processes versus when to leverage AI assistance.

  • Moderate Integration: The sweet spot appears to be moderate AI use that supports rather than supplants human thinking.

The Path Forward

We're not simply developing new skills and evolving alongside our culture and technology. We are being shaped by these tools in ways we're only beginning to understand. The key insight is that AI itself isn't the problem. The problem lies in how we choose to integrate it into our cognitive lives.

Just as we learned to navigate previous technological transitions, we can learn to use AI in ways that enhance rather than diminish our cognitive abilities. This requires intentionality, awareness, and a commitment to maintaining the mental skills that make us uniquely human.

The future isn't about choosing between human intelligence and artificial intelligence; it’s about finding the right balance between the two. That balance starts with recognizing that how we use these powerful tools matters far more than the tools themselves.
