“It Thinks for Them”: What Italian Educators Are Saying About Youth, AI, and Autonomy

Artificial intelligence is no longer an abstract concept in the Italian classroom. From generative platforms like ChatGPT to algorithm-driven learning tools, AI has become a daily companion for students—and a growing concern for educators.

As part of the Erasmus+ project YouthGovAI, ALFA Liguria convened a national focus group with a diverse panel of educational stakeholders, including teachers, guidance professionals, youth workers, and STEM experts. The aim was clear: to gather insights from those who work directly with young people and are witnessing firsthand how AI is reshaping not only learning practices, but also critical thinking, civic awareness, and social development.

The reflections that emerged were far from techno-enthusiastic. While participants acknowledged the potential of AI to support personalized learning, bridge access gaps, and streamline educational tasks, the discussion quickly moved to deeper questions—about autonomy, ethics, and the future of youth agency in an increasingly automated world.

One of the most frequently raised concerns was the progressive erosion of critical thinking. Educators reported a growing tendency among students to rely uncritically on generative AI tools, bypassing reflection in favor of fast, automated responses. Several spoke of an “outsourcing of cognition,” where learners, though digitally fluent, are less inclined to question sources, verify claims, or engage in analytical reasoning. As one youth worker put it, “It’s not that they can’t think. It’s that they don’t need to—because the machine does it faster.”

Another key issue was the invisibility of AI systems in students’ lives. Despite high levels of interaction with AI-powered platforms—whether through content recommendations, search engines, or learning apps—young people often do not recognize these systems as AI. This, educators warned, prevents them from understanding the logic, biases, or intentions embedded in the tools they use daily, and leaves them vulnerable to manipulation, misinformation, or uncritical dependence.

Participants also expressed unease about job insecurity and the shifting meaning of work in an AI-driven economy. While these are not new themes, educators highlighted a growing sense of anxiety among students about whether their skills—and even their aspirations—will remain relevant in the near future. STEM experts raised particular concerns about the risk of “disempowering automation,” where the very tools designed to assist may inadvertently limit young people’s ability to explore, create, or innovate independently.

But perhaps the most urgent concern raised was that of democratic exclusion. Educators agreed that AI is being developed and deployed in ways that largely exclude youth voices, both in policymaking and in institutional settings. They lamented the absence of age-appropriate educational materials on AI ethics, digital rights, and algorithmic governance, noting that most AI education in Italy is still focused on technical training—if offered at all—rather than on cultivating civic and ethical awareness.

This mismatch between exposure and empowerment was described as a “structural deficit of inclusion.” While students interact with AI constantly, they rarely encounter structured opportunities to reflect on what it is, how it works, or what it means for their futures. As one teacher put it: “We teach them to use the tools, but we’re not teaching them to govern them—or to question them.”

The call that emerged from the focus group was clear: AI education must be reimagined as civic education. This means embedding discussions about power, rights, responsibility, and ethical design into curricula at all levels, from secondary schools to vocational training centers. It means creating participatory spaces—forums, workshops, simulations—where young people can critically engage with AI systems, and begin to see themselves not as passive users, but as active contributors to technological futures.

The YouthGovAI project embraces this vision. By working with educators across Europe, it aims to co-develop teaching materials, training modules, and policy recommendations that make AI literacy both critical and accessible. Because shaping the future of AI means more than training the next generation of coders. It means raising a generation of citizens—capable of understanding, questioning, and participating in the digital systems that increasingly shape their lives.
