Generative AI tools are exciting but have limitations. We don’t believe there is any room for unreliability in our journalism. Given our goal is to always act in the best interests of the public – without fear or favour – we will tackle the challenges Generative AI creates, including trust in media, protection of copyright and bias.
1News has an Editorial AI working group to consider how we respond to the evolving challenges Generative AI creates. We’re continuously looking at what other media organisations (local and international) are doing in this space too.
We think Generative AI is best treated like a news source – that means before using it, we will apply the same journalistic processes we do for any other source, including monitoring it for potential bias. AI should only ever be a tool for journalists, and it must meet the standards required of all tools our journalists use. As with all information we gather, if our journalists have any doubt about AI-generated material, we will not use it.
Generative AI offers huge potential to save time, which could mean more opportunities to focus on journalism, and we’re open to that. However, human oversight will be an essential step in producing any content that involves Generative AI, and we will never rely solely on AI-generated research. If we include significant elements generated by AI in a piece of work, we will let you know. 1News will also always consider the rights of artists and rights holders when using Generative AI.
Trust is the foundation of 1News’ relationship with our audiences. We are committed to bringing you the latest, most relevant news and current affairs content, and to doing so in an open and transparent manner.
How 1News will use Generative AI:
Transparency is key – should we ever generate an image or text solely using Generative AI, we will let you know.
If we use tools that utilise Generative AI, the final product will always be checked by humans.
If we use Generative AI for creative purposes, we will make it clear that it is an artist’s impression.