Breaking the AI Barrier: Reframing AI as an Accessibility Tool for Diverse Learners
- Eric Hayes

For many of our learners—whether neurodivergent students, multilingual learners, or those who need more structured support—education isn’t just about mastering content. It’s about navigating barriers, many of which extend beyond the classroom and beyond our control. As educators who support these students, we are tasked with removing those barriers so they can access their education in a way that works for them.
But there’s another, often overlooked, barrier—a newer challenge rooted not in access, training, or ability, but in perception.
The Unseen Barrier: Fear of AI
AI has the potential to be one of the most powerful accessibility tools available to our students. But how has it been introduced to them? In many cases, through fear.
Concerns about cheating, misinformation, and ethical use are valid. But in the rush to protect students, we may have unintentionally created a mindset that AI is inherently bad. That it’s untrustworthy. That using it is cheating.
I’ve seen this firsthand. Over the past two years, I’ve introduced AI to my students in ethical, responsible ways—demonstrating how it can support learning, not replace it. Still, I often hear the same reactions:
“Oh no, I can’t use it. That’s cheating. You can’t trust AI.”
The barrier had already been built before they even had the chance to engage with the tool. I could push past that mindset for some students and show them what was possible. But for others, that door was already closed.
I had a student with autism who also struggled with anxiety, and something as simple as writing an email was a significant challenge. AI could have helped him structure his thoughts, reduce his stress, and advocate for himself. But he wouldn’t even consider it because he had been told that using AI was wrong.
How Educators May Be (Unknowingly) Reinforcing the Barrier
This is the challenge we now face, especially as educators supporting neurodivergent learners and multilingual learners. Even as AI improves and safeguards evolve, many students have already been conditioned to reject it.
And if we’re honest, some of that mindset may have come from us.
Have we dismissed AI outright—not because it has no value, but because we don’t yet fully understand its role in education? The rapid pace of change understandably brings anxiety. Many educators fear losing control or that their students will rely too heavily on AI rather than learning critical skills.
However, if that fear leads us to reject AI entirely, we risk doing our students a disservice.
Here’s the reality: Some universities already expect students to use AI. Some workplaces do, too. Imagine a student taught that AI is cheating—only to arrive at university and be expected to use it. What message does that send?
Modeling Thoughtful, Ethical Use
This isn’t about throwing caution aside. AI should be approached thoughtfully, with ethical considerations at the forefront. But keeping an open mind rather than shutting AI down ensures we don’t contribute to the very barrier we’re trying to dismantle.
So how do we ensure students see AI for what it truly is—a tool, not a threat?
We start by modeling its use ourselves.
That doesn’t mean blindly promoting AI or ignoring valid concerns. Instead, it means incorporating AI into our teaching in ways that demonstrate both its strengths and its flaws. We can show students how AI can:
- Read text aloud for those with processing challenges
- Suggest ways to structure writing
- Simplify complex academic language
- Help draft ideas they can then revise and make their own
We can also demonstrate how to recognize when AI gets things wrong. Most AI tools display disclaimers about potential inaccuracies. Instead of skipping past those moments, we can pause and ask students:
- Why might this be incorrect?
- How can we verify this information?
- What should we do when AI gives a flawed or biased response?
These are powerful learning moments—ones that build digital literacy, critical thinking, and ethical awareness.
Empowerment Through Access
By actively engaging with AI in this way, we empower students to see it as a support, not a shortcut. We give them space to ask questions, interact with the technology safely, and develop the discernment they’ll need as AI becomes an expected part of higher education and the workforce.
For neurodivergent students and multilingual learners, this is especially critical. These learners are already navigating multiple barriers to access. AI can serve as a bridge, helping them process information, express their ideas, and advocate for themselves in ways that weren’t previously available.
But if we don’t take the time to engage with it ourselves, if we don’t model how to use AI ethically and thoughtfully, we risk reinforcing yet another barrier. One that could follow them far beyond our classrooms.
Our role is not to remove tools that could empower students simply because they’re unfamiliar or evolving. Our role is to foster independence, equip students with discernment, and guide them as they learn to navigate the digital tools that will shape their futures.
A Barrier We Can Remove
The barriers will always exist.
But this one?
This is a barrier we can remove.