JMS 430 Story Project

Rise of artificial intelligence: opportunity or challenge in a changing world?
How journalists, educators and readers are navigating the promise and uncertainty of AI
The newsroom is a sacred place that has been in a constant state of development since its early days — a space built on collaboration, diversity, trust and a drive to serve the public interest. For years, newsrooms operated in a largely uniform manner — report, write, edit and publish — but recently that typical routine has begun to change.
Over the decades, every wave of new technology has pushed newsrooms to evolve. From shifting to digital platforms to adapting to the rise of social media, each change has reshaped how news is produced, but none has sparked as much of a debate as artificial intelligence.
Today, that debate has become urgent as AI transforms how information is gathered, written and distributed. Newsrooms across the country are racing to adapt. Some see AI as a powerful tool for reporting and reaching audiences quickly, while others worry it threatens accuracy, ethics and jobs. From established media outlets to college classrooms, journalists and educators alike are grappling with how to integrate this technology responsibly.
Helping readers understand artificial intelligence in journalism
Navigating AI is not a challenge journalists face alone; the issue extends far beyond the newsroom. Everyday readers are also seeking to better understand this emerging technology and what it means for them.
Trusting News, a journalism support organization, recently conducted a study that found 94% of people believe journalists should be fully transparent and honest about their use of AI.
Lynn Walsh, assistant director of Trusting News, explained that readers are widely skeptical of AI use in general, but are also extremely curious. They want to educate themselves and learn more about the evolving tool.
“[People] want to know how it works,” Walsh said. “They want to know how to spot it, how to use it and how it's being used around them.”
She explained that Trusting News has gathered feedback from its audience and, in turn, has shifted its focus to one main question: How can we help educate our readers?
“I see it as an opportunity to potentially rebuild trust and relationships with audiences and their communities through giving them something that they need and want, which is AI literacy,” Walsh said.
Rather than assuming what readers want to know about artificial intelligence based on typical concerns — such as its environmental impact or the spread of disinformation — Trusting News has partnered with 10 newsrooms across the country to share a survey with their audiences aiming to better understand what the public truly needs.
The survey centered on questions such as:
What do you want to know about the technology?
What are you worried about?
What are you curious about?
What format would be most helpful to learn from: Social media? A YouTube video?
These surveys will close in a few weeks, and the results gathered will help shape the type of content and formatting the newsrooms will create and share about AI, according to Walsh.
Educating audiences on artificial intelligence isn’t solely for the purpose of newsroom gains; rather, it’s a process that connects readers and professionals during a time of uncertainty, confusion and, frankly, fear.
“We want to help people,” Walsh said. “Let’s be that example of one responsible use, but then also be a partner in this with them so they can understand it, and bring them along with us as we learn too.”
Maintaining ethics amidst change
In the uncharted environment artificial intelligence has created for the world of journalism, news organizations are struggling to find their footing as they manage this new technology.
Many argue that the main challenge of integrating AI into the newsroom lies in ethics. Maintaining a strong, trusting relationship with readers is what keeps the art of writing alive, but how should a newsroom determine what this means for a tool that is so new?
Bella Ross, a reporter for Voice of San Diego, said her newsroom, among countless others, has begun the journey of making sense of ethics when it comes to artificial intelligence.
“We’re hoping to create a general outline or options to bring to the newsroom as a whole,” Ross said. “Something like a version where you’re not allowed to use AI to write, but you’re allowed to use it as a companion, or you’re only allowed to use it later in the editorial process rather than earlier when you’re writing.”
Ross explained that although the process of creating ethical guidelines is underway, Voice of San Diego does not yet have a timeline for when they will be finalized due to other newsroom priorities and time constraints.
While highlighting her personal beliefs surrounding AI, Ross emphasized the importance of authentic, passionate storytelling.
“Writing is one of the most uniquely human things we have,” she said. “Humans want to create and appreciate art, and express ourselves through words and visuals … It just feels wrong to use AI in my writing.”
While skeptical, she noted that AI does bring some advantages to the table when it comes to the more tedious tasks in a newsroom, such as scheduling meetings or developing short-form social media content.
“In theory, if I could spend less time doing little jobs that aren’t super important, I would love that, but in my particular role I feel like there aren’t a lot of ways I could use AI that don't completely undermine the parts of my job that I love,” Ross said.
While she believes AI use should be limited, others argue that it exists for a reason, and that ignoring these tools would be a missed opportunity.
Alicia Shapiro, chief marketing officer and founding member of AiNews.com, said she believes artificial intelligence is a key tool for story creation and uses the technology for the site's content on a daily basis.
Shapiro, alongside Michael Mazzotta, founded AiNews.com in the hopes of creating a site that makes AI-generated news stories understandable, approachable and accessible to people around the world.
When it comes to story writing, Shapiro utilizes AI in a unique manner compared to traditional news outlets, using it to find reputable sources, develop story outlines, draft the content itself and assist in the editorial process.
She added that AI allows her to maximize her workday in ways she now considers indispensable.
“I can usually get between three to four articles done in one day, and spend about an hour writing on each,” Shapiro said.
Although her use of artificial intelligence has become essential, she noted that there are several aspects of the technology that fall short of human capabilities.
“AI has mastered the skills, but what do we bring to the table that they don't?” Shapiro questioned. “It’s the human experience; it’s your personal views; it’s how you think. These are the things that fuel a story and separate a computer from a writer.”
As AI continues to advance, there’s no telling where its abilities will land, but it’s important to remember that the technology is here to help us, not replace us.
“It’s a thinking partner really,” Shapiro said. “It helps me think and work through things, but we can’t rely on it to do everything.”
Artificial intelligence in the classroom
While newsrooms work to define their relationship with AI, educators are facing their own set of challenges. Across campuses, teachers are rethinking how students learn, collaborate and demonstrate their skills in an age of instant information.
Lourdes Chacón, assistant professor in the School of Journalism and Media Studies at San Diego State University, explained that even established education departments at SDSU are attempting to alter their curricula in an effort to manage the challenges AI brings to learning environments.
Faculty groups from different JMS majors — such as advertising, journalism and media studies — have been exploring how AI can be used in their classrooms in a constructive way that acknowledges its benefits but also addresses the risks it poses to developing students’ writing and analytical skills, according to Chacón.
“We have to train our students in a way where they know how to use these tools instead of teaching them skills that can be easily replaced by AI,” she said.
In order to adapt to these changes, Chacón has had to weigh her options when it comes to how much she wants to incorporate AI in her classroom and consider what kind of impact this will have on students.
“The university has been very supportive in terms of giving us access to tools like Gemini, ChatGPT and Microsoft Copilot, but also in creating spaces for us to be trained and have productive discussions about these tools,” Chacón explained.
SDSU has encouraged its faculty to participate in mini AI certification courses in order to help teachers update their syllabi, and it has also kept educators up to date on the best approaches to the technology in an educational setting, according to Chacón.
Over the past year, Chacón has begun to incorporate AI into different class assignments to test the waters and familiarize students with the process, ultimately embracing the technology more than she had expected.
“The way I phrase it in my syllabi is to approach the technology as a creative collaborator that is going to support you in your endeavors, and maybe even challenge you in a constructive, productive way,” she said.
Even with the addition of AI tools in her classroom, Chacón warned that we must remain cautious in the face of this ever-emerging technology.
“We have to be very alert,” she said. “It’s a burden, but we have to keep up with what’s out there for our own good.”
Photo: Advanced journalism students present their semester final projects for professor Lourdes Chacón’s class (Photo by Lily Fadell)