Although artificial intelligence offers clear advantages in managing data and improving education, its growing presence in journalism, particularly in reporting on government, raises critical concerns about media literacy and democratic accountability.
Fulcrum Fellow Jared Tucker examines how efficiency frequently comes at the cost of deep thinking.
Being seated in the rear of a classroom offers a glimpse into modern university experiences.
The rows of computer screens facing you show students shopping online, carefully taking notes, and, as a matter of routine, using artificial intelligence.
The use of AI among college students has become common: a global survey by the Global Education Council found that 86% of students regularly use AI in their studies. Although large language models (LLMs) can help students structure notes or master material more efficiently, their adverse impact on critical thinking has prompted teachers to look for ways to limit their use while still recognizing their benefits.
“All emerging technology, regardless of its form, goes through a panic cycle, but this represents a broad, forward-looking perspective on how learning and life will evolve,” said Caley Cook, the Journalism and Public Interest Communication Coordinator at the University of Washington. “This is having, and will continue to have, effects on decision-making and critical thinking for everyone. It is highly harmful for students to delegate critical thinking in a university setting where they are meant to develop their ability to think independently.”
A 2025 study reported by Phys.org found a strong link between heavy use of AI technologies and low critical thinking scores. As students depend more on AI to finish assignments, they may lose the very benefits these tools are meant to bring to their learning.
“Obtaining a degree is about thinking like a scholar, and that’s a skill,” Cook said. “You need to apply it repeatedly, make mistakes, and keep trying to strengthen that skill.”
But if AI has the potential to take away students’ learning opportunities, why do they continue to use it?
Certainly, higher education is demanding. Students must balance coursework, assignments, personal relationships, and jobs. Because large language models can deliver instant answers, they help relieve the intense pressures of college study.
“AI is so fast and simple that now, many students don’t have the time to sit and study for hours each day,” said Tzuriel Jennings, a sophomore at the University of Washington. “Many individuals use it to quickly rush through an assignment when they don’t have much time or motivation to complete it.”
Beyond easing that burden, however, LLMs have been transformative in bridging language barriers, supporting student research, and organizing data.
“Artificial intelligence can definitely assist with studying, particularly when determining what and how to study while trying to cram,” Jennings said. “I utilize it to help me develop flashcards or study guides, saving me hours of planning before I start studying.”
This is where the hard questions about AI arise. It is a resource that can advance students’ education but may also stunt their development of critical thinking. How can an instructor prohibit the uses that undermine students’ capacity to learn while permitting the uses that support it?
“If AI can perform the task in a beneficial, efficient, and time-saving manner, then it’s a positive application of AI,” Cook said. “The element you overlook when you avoid challenges is the mistaken belief that learning is simple—that there’s a clear right or wrong answer and that reaching the correct solution happens quickly.”
The same issue has surfaced in journalism, an industry already threatened by artificial intelligence. Established and trusted outlets like the Associated Press have begun using AI to write articles and cover news. Is avoiding routine tasks worth the risk in the only industry in the United States that is explicitly protected by the Constitution?
Unfortunately, AI is still young and constantly changing, which complicates any effort to restrict it. Students can easily circumvent safeguards written into assignment prompts, and a return to traditional paper-based work is not always practical, pushing many instructors to adapt their teaching instead.
“Individuals who are approaching this with seriousness have shifted towards increased group activities, oral evaluations, in-class tasks, and considering how students engage with assessments and apply their learning both inside and outside the classroom,” Cook said. “I modify my courses annually to align with current circumstances.”
Students would agree: transparency is the solution.
“I believe [professors] ought to be transparent with their students regarding their expectations about AI,” Jennings said. “While some students will inevitably use it, fostering a clear and open dialogue about AI can be beneficial.”
However, as large language models continue to ingest new data, improve, and change how they write, their potential to harm students’ learning only grows. In this distinctive challenge, today’s remedies will not be sufficient for tomorrow’s problems.

Jared Tucker is a second-year student at the University of Washington – Seattle, majoring in Journalism and Public Interest Communication and minoring in History. He is part of the cohort program with the Fulcrum Fellowship.
