The main ideas in this article were processed by ChatGPT based on my notes from the course at Craig Newmark J-School in New York. The first draft of the article was later revised by a human (it’s me, hi!), adding a personal tone and voice, and ideas that the original version left out.
And links. I’ve added so many links for additional context.
This exercise was done less out of laziness than to show that artificial intelligence (AI) is getting extremely good at delivering decent, presentable work, but there is still a need for edits from a human who adds some sugar and spice, differentiating the final product from the vast amount of bland content out there and making it somewhat unique.
When AI took off as a buzzword, like the metaverse or blockchain before it, I was initially on the fence. When AI-generated ‘magic avatars’ flooded everyone’s feed, I voiced my concerns very loudly over what that meant for artists’ work and who was profiting from unreferenced work.
Seven months later, out of cynicism or simply survival instinct, I’m trying to embrace this change. AI is here to challenge every job that involves creative work or data processing, and we had better adapt cautiously.
There is a clear need to legislate it, to ensure transparency about the datasets used, and to see that original creative work is fairly recognized and compensated, but there is also an urgency to welcome this inevitable shift.
The importance of embracing change
AI has become a driving force in various industries, and the journalism sector is no exception. With the rise of AI-driven technologies, news organizations are experiencing a significant transformation in the way they gather, produce, and deliver information to the public.
Given the fast pace of advancements in AI technology, it is important to embrace change while being selective in adopting new technologies.
News organizations can ensure that they stay on the right track by carefully evaluating all the AI solutions available and saying no to those that may compromise ethics, quality, or credibility.
Let’s get started
Adopting AI technology in the newsroom requires careful planning and execution. A strategic approach is essential. Newsrooms must identify the specific areas where AI can enhance efficiency, such as automating data-driven stories, optimizing social media scheduling, or summarizing public meetings. Focusing on these needs is the right place to start.
The impact of AI in the news industry
Revolutionizing information delivery: The advent of AI-powered tools like ChatGPT has transformed information delivery, making it more accessible and engaging. Similar to how Windows revolutionized basic programming with its user-friendly interface, AI technologies can make complex data more understandable to the masses.
Fact-checking and ethical concerns: As AI-generated content becomes more prevalent, concerns arise about fact-checking and ensuring content accuracy. There are additional ethical considerations surrounding the use of AI for image recognition, personalization, and data mining, as well as the potential biases ingrained in training data.
Generative AI and its challenges
Generative AI is capable of producing text, images, or other media in response to prompts, powered by large language models (LLMs) like the one behind ChatGPT, the most popular tool right now.
It has become a game-changer in content creation. However, it also poses several challenges. The black-box nature of training data and the opacity of outputs raise questions about transparency and accountability. Legal and intellectual property issues also come into play, as AI-generated content may inadvertently infringe on copyrights.
Addressing pain points in newsrooms
AI offers solutions to various pain points faced by newsrooms, such as automating weather forecasts, police blotter summaries, and email sorting. The Local News AI initiative by the Associated Press, producing open-source projects, aims to address some of these challenges and make AI tools accessible to all newsrooms.
Building an ethical and trusted AI framework
To overcome the challenges of AI for news and journalism, it is vital to establish an ethical framework. News organizations must focus on building their own LLMs with licensed, trustworthy content. Relying on third-party tools should be minimized, while embracing experimentation and encouraging standards evolution.
Not everyone is Bloomberg Media, and not every news organization can afford such a great investment in technology, but they present an interesting example. Bloomberg’s audience trusts that the information they receive is reliable and accurate; they make important decisions based on it, and some even pay $24,000 every year for access to the Bloomberg Terminal. One thing ChatGPT cannot fully guarantee just yet is reliability and accuracy, so what did Bloomberg do? Develop their own model: BloombergGPT.
With a lower budget, there’s Alex, MTL Blog’s new AI-powered concierge.
Future possibilities and jobs
AI’s transformative potential in journalism extends beyond content creation and could result in the revamp of some roles and the disappearance of others.
Archivists could become AI’s new best friends, organizing vast archives of past content for better research, which would allow AI assistants to provide better background information on topics or people based on archival data.
On the other hand, if your specialization is search engine optimization (SEO), you may want to start considering a career change. The future of search traffic may see a shift as summaries of articles become more prevalent, upending established SEO practices.
The camera lies
A lot is said about how generative AI can easily produce text content, but a separate chapter needs to be dedicated to how good it is at creating visual content (image and video), adding full new layers to an already complex debate over photo manipulation online.
At the same time, cameras (especially on smartphones) come with increasingly efficient AI features to identify objects and scenes and to optimize camera settings. Turns out, the camera does lie.
There are ethical questions about the inclusion of face recognition in AI tools, and with increasingly accurate technologies for deepfakes (realistic fake videos and photos created using AI to superimpose or replace someone’s face, voice, and likeness), it becomes increasingly hard to tell which images are a factual representation of reality.
Standards like EXIF (supported on platforms like Flickr) can tell you a lot about an image: the camera that was used, its settings, whether it was edited with software… But as soon as you take a screenshot, all that metadata is gone.
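The screenshot problem can be demonstrated in a few lines of Python. This is a minimal sketch, assuming the third-party Pillow imaging library is installed; the camera make "ExampleCam" and the in-memory images are made up for illustration:

```python
# Minimal sketch (assumes the Pillow library is installed) showing that
# EXIF metadata survives a normal save/reload of a JPEG, but is lost when
# only the pixels are copied, which is what a screenshot effectively does.
import io

from PIL import Image

MAKE_TAG = 271  # standard EXIF tag number for the camera "Make"

# Create a tiny image and attach a camera make, as a real camera would.
original = Image.new("RGB", (8, 8), "gray")
exif = original.getexif()
exif[MAKE_TAG] = "ExampleCam"

buf = io.BytesIO()
original.save(buf, format="JPEG", exif=exif.tobytes())
buf.seek(0)

# Reloading the JPEG keeps the metadata...
reloaded = Image.open(buf)
print(reloaded.getexif().get(MAKE_TAG))  # ExampleCam

# ...but a "screenshot" copies only the pixels, so the metadata is gone.
screenshot = Image.new("RGB", reloaded.size)
screenshot.putdata(list(reloaded.getdata()))
print(screenshot.getexif().get(MAKE_TAG))  # None
```

This is why provenance schemes that live only in metadata are fragile: one pixel-level copy and the trail disappears.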
It’s not an easy endeavor, but there is a need for an industry-wide initiative across news organizations to certify whether images have been completely or partly generated by AI tools. It is necessary to retain the audience’s trust that what we are seeing is factual.
Are we in a Black Mirror episode?
AI is reshaping the landscape of news and journalism. While it offers tremendous benefits, it also presents ethical and operational challenges that need careful consideration.
By adopting AI strategically, fostering transparency, and prioritizing ethical practices, the news industry can leverage AI’s potential to revolutionize journalism for the better.
But it can all also go dystopian, welcome-to-a-Black-Mirror-episode wrong. News content doesn’t exist in a vacuum, and I feel it would be wrong to write an article about AI in the summer of 2023 and not talk about the double strike of actors and writers in the United States, which has generative AI at the center of its labour disputes.
In solidarity with their demands, and wishing them nothing but the best in their negotiations, I’m sorry to be the downer predicting that generative AI will be part of the TV and film industry shortly.
The way I see it, like we have haute cuisine restaurants and fast food restaurants, we will have movies and TV as we know them today, and then mass-produced low-effort content made by AI. Then, it will be up to the audience to choose what we spend our time and money on.
And the same applies to news content: audiences have the power to choose news sources that rely on and trust the human craft of journalism to deliver news accurately, perhaps with the support of AI as an additional tool (we no longer use ink and typewriters, after all), or to misinform ourselves with AI-powered ‘news’ outlets producing something that resembles news but is full of errors and inaccuracies, mass-produced in content farms. Media literacy is key.
The journey towards an AI-driven future might not be easy, especially when stakeholders have differing attitudes towards change. However, with the right balance of urgency and careful planning, news organizations can navigate the avocado curve of change and embrace the transformative potential of AI in media and journalism.
AI technology presents extraordinary opportunities to make the process of producing and distributing news more efficient. But it also offers appalling possibilities to impoverish democratic societies with manufactured, unverified and mass-produced news content.
In conclusion, news producers face an ethical choice in how they use AI technology, while audiences need to critically and consciously consume and evaluate news content.
Befriend ChatGPT
You can probably tell from this article that my relationship with this popular chatbot has changed substantially since I came back from New York. I use it very often now, to assist me in more tasks than I’m comfortable admitting in the public forum that is this article.
If you want to give ChatGPT a chance, here are some best practices:
- Use active language: ask to summarize a story, build a headline…
- AI can take you 80% of the way; the rest should be human
- Experiment and encourage experimentation
- Define your standards, knowing they will evolve
In my experience, it also helps tremendously to start a new ‘chat’ for each project you are working on. Don’t be afraid to clarify what you are looking for (e.g. “write a single sentence”) or to regenerate a response if you are not happy with the initial result.
Other tools and resources
These are other tools that journalists can explore to leverage AI effectively:
- Google Bard
- Perplexity.ai
- GPTZero
- Google Pinpoint
- Legitimate
- OpenAI examples
- AP’s Local News AI
- CBC’s AI guidelines
If you know of other tools, contact me to add them to the list.
Could you identify the parts of the article generated by ChatGPT and what I wrote or edited?
This article is the second in a series where I’m collecting my thoughts after attending the 3-day Transformation Boost course at Craig Newmark Graduate School of Journalism at CUNY, in New York City, in July 2023. This is an open conversation, share your views in the comments or contact me.