This is how ChatGPT and other artificial intelligence writing tools operate, according to a webinar I attended Wednesday with a panel of international experts.
“Tools such as ChatGPT are a wake-up call for newsrooms about the rewards and risks of artificial intelligence capabilities,” stated the description of the webinar, which was hosted by The Associated Press.
ChatGPT and other generative AI tools should be a wake-up call for all of us. If you’re not familiar with it, you will be eventually. ChatGPT is a chatbot developed by OpenAI. It was launched last fall and can essentially write anything that users prompt it to write. Instantly.
For example, would you like to see a four-page essay on William Shakespeare’s works? Done. How about a business proposal for a new software program? Just give it a second. Or maybe a social media post to market your business? No problem. It’s all about AI’s ever-growing capacity for memorization and regurgitation.
“It’s becoming the new Google,” the CEO of an engineering firm told me.
He uses it regularly for social media posts, internal emails to his staff and just about anything else for writing purposes. He wondered whether ChatGPT and other generative AI tools will someday replace what a professional writer or news journalist does daily.
“Let’s find out,” the CEO suggested.
He asked me the focus of an upcoming column. I told him about the debate over whether obesity is a disease or a lifestyle choice. (Saturday is World Obesity Day, and I’m intrigued by the differing and opinionated perspectives on obesity.)
Within seconds, ChatGPT spit out a themed 100-word essay on clinical obesity, citing the very debate I’m talking about. With another click, the app expanded the essay to 500 words, then 800. (The average length of my column is 900 words.)
Each version was impressive. But it was missing something. It was missing a voice, a heart, a soul. Describe it however you wish. For my job as a columnist, it also was missing a perspective and context in relation to you, our readers. I’ve depended on this crucial aspect throughout my writing career. I’ll continue to rely on this with the emergence of ChatGPT.
Although it’s hailed as a revolutionary technological breakthrough, it’s causing serious concerns for teachers and educators whose students can use it in place of their own writing and the critical thinking needed to develop it. My colleague, Will Skipworth, recently examined this topic with insights from Region educators.
“One major risk associated with ChatGPT is plagiarism. Students may use the tool to generate text or responses that are then passed off as their own work,” Hammond Superintendent Scott Miller wrote to his staff last month. His lengthy email was written entirely by ChatGPT.
The news world has similar concerns that were explored by Wednesday’s webinar.
“We explain the technology behind these tools, how newsrooms might take advantage of them and what to look out for as the industry begins to grapple with the emerging potential around Generative AI,” the outline stated.
Nicholas Diakopoulos, one of the expert panelists, is an associate professor of communication studies and computer science at Northwestern University.
“Top of mind for many journalists is the accuracy issue,” he said. “In my evaluation, I found that almost half of the responses had some kind of accuracy issue.”
Other issues include privacy, copyright, plagiarism, confidentiality, social biases and terms of service for third-party users.
How should ChatGPT be used for journalistic needs? Would these kinds of tools blur the lines for readers? Or is it similar to using a calculator for math? The questions are endless. The answers are a work in progress.
“As we know, these are just statistical text generators,” Diakopoulos said. “They don’t have feelings. They don’t have intents and motivations. They don’t take initiative. They don’t have big abstract ideas. They just work one word at a time.”
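Diakopoulos’ point — “they just work one word at a time” — can be illustrated with a toy example. The sketch below is not how ChatGPT actually works (it uses massive neural networks, and the word frequencies here are made up for illustration), but the word-by-word loop is the same basic idea: look at what came before, pick a statistically likely next word, repeat.

```python
import random

# Made-up word-pair frequencies standing in for what a real model
# learns from vast amounts of text. Each entry maps a word to the
# words observed to follow it, with counts.
bigram_counts = {
    "the": {"cat": 3, "dog": 1},
    "cat": {"sat": 2, "ran": 1},
    "dog": {"ran": 2},
    "sat": {"down": 1},
    "ran": {"home": 1},
}

def generate(start, max_words=5, seed=0):
    """Generate text one word at a time, weighting each choice
    by how often that word followed the previous one."""
    random.seed(seed)
    words = [start]
    while len(words) < max_words:
        choices = bigram_counts.get(words[-1])
        if not choices:  # no known continuation; stop
            break
        nxt = random.choices(list(choices), weights=list(choices.values()))[0]
        words.append(nxt)
    return " ".join(words)

print(generate("the"))
```

There are no feelings, intents or abstract ideas anywhere in that loop — only counting and picking, one word at a time, which is Diakopoulos’ point at a vastly smaller scale.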
Miranda Marcus, head of BBC News Labs, said the impact of ChatGPT and similar programs goes far beyond journalism, reaching into metadata, coding applications, copyright management and questions of responsible use.
Hank Sims, an editor at Lost Coast Communications in California, has been using generative AI tools for months.
“We’re not replacing people,” he said. “We’re using it to do things that weren’t possible for us to do before.”
Edward Tian, a 22-year-old Princeton University student and journalist, created GPTZero, a free and publicly available app that can detect whether a bot or a human is the author of a piece of text. His viral tool has since attracted millions of users.
“Generative AI is here to stay, but we need to use it responsibly,” he said.
In journalism, most technology has been adopted and utilized from the “back end,” he pointed out, including social media marketing and search engine optimization techniques. Generative AI will follow a similar route.
However, in turn, these AI tools are already tapping into journalists’ writings and regurgitating excerpts for public usage. Professional writers shouldn’t be surprised if their work, in some form, pops up for other users.
When it comes to using such tools as ChatGPT for professional writing and journalism, “this is where it gets a little dicey,” Diakopoulos said. “You’re definitely going to want to have humans integrated in the loop before publication.”
I’m one of these humans. I write with indelibly human qualities that ChatGPT and other chatbots, with their instant collection of factoids, don’t possess, as he said. Emotion. Opinion. Feelings. Abstract ideas. Most importantly, authenticity.
Columns such as this one may someday be able to be fully generated by artificial intelligence. But not today. This writing is all me. Word by word.
Contact Jerry at [email protected]. Watch his “She Said, He Said” podcast. Find him on Facebook. Opinions are those of the writer.