Automated journalism
Journalism produced by algorithms

From Wikipedia, the free encyclopedia
Automated journalism, also known as algorithmic journalism or robot journalism,[1][2][3] refers to the production of journalistic content, such as news articles and videos, by computer programs.[3][4][5] There are four main fields of application for automated journalism: automated content production, data mining, news dissemination, and content optimization.[6] Through generative artificial intelligence, stories are produced automatically by computers rather than human reporters. In the 2020s, generative pre-trained transformers have enabled the generation of more sophisticated articles from simple prompts.
Automated journalism is sometimes seen as an opportunity to free journalists from routine reporting, providing them with more time for complex tasks. It also allows efficiency and cost-cutting, alleviating some financial burden that many news organizations face. However, automated journalism is also perceived as a threat to the authorship and quality of news and a threat to the livelihoods of human journalists.[2][3]
History
Historically, the process involved an algorithm that scanned large amounts of provided data, selected from an assortment of pre-programmed article structures, ordered key points, and inserted details such as names, places, amounts, rankings, statistics, and other figures.[4] These programs interpret, organize, and present data in human-readable ways. The output can also be customized to fit a certain voice, tone, or style.[2][3][4] Early implementations were mainly used for stories based on statistics and numerical figures. Common topics include sports recaps, weather, financial reports, real estate analysis, and earnings reviews.[3]
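The historical pipeline described above (scan the data, select a pre-programmed article structure, and fill in names and figures) can be sketched in a few lines of Python. The templates, field names, and selection rule below are hypothetical illustrations, not the actual logic of any real product such as Wordsmith or Quakebot.

```python
# Minimal sketch of template-based story generation. All templates and
# field names here are invented for illustration.
TEMPLATES = {
    "earnings_beat": (
        "{company} reported quarterly earnings of ${eps:.2f} per share, "
        "beating analyst estimates of ${estimate:.2f}."
    ),
    "earnings_miss": (
        "{company} reported quarterly earnings of ${eps:.2f} per share, "
        "falling short of analyst estimates of ${estimate:.2f}."
    ),
}

def generate_story(data: dict) -> str:
    """Pick a template based on the data, then insert the figures."""
    key = "earnings_beat" if data["eps"] >= data["estimate"] else "earnings_miss"
    return TEMPLATES[key].format(**data)

print(generate_story({"company": "Acme Corp", "eps": 1.42, "estimate": 1.30}))
# → Acme Corp reported quarterly earnings of $1.42 per share, beating
#   analyst estimates of $1.30.
```

Production systems add many more templates per topic, randomized phrasing to vary the output, and configurable voice and tone, but the select-then-fill structure is the same.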
Data science and AI companies such as Automated Insights, Narrative Science, United Robots and Monok develop and provide these algorithms to news outlets.[4][7][8][9] In 2016, early adopters included news providers such as the Associated Press, Forbes, ProPublica, and the Los Angeles Times.[3]
StatSheet, an online platform covering college basketball, runs entirely on an automated program.[4] In 2006, Thomson Reuters announced its switch to automation to generate financial news stories on its online news platform.[10] Reuters used a tool called Tracer.[11] An algorithm called Quakebot published a story about a 2014 California earthquake on the Los Angeles Times website within three minutes after the shaking had stopped.[4][7]
The Associated Press began using automation to cover 10,000 minor league baseball games annually, using a program from Automated Insights and statistics from MLB Advanced Media.[12] Outside of sports, the Associated Press also uses automation to produce stories on corporate earnings.[4] Since 2014, the Associated Press has been publishing quarterly financial stories with help from Automated Insights.[13]
In May 2020, Microsoft announced that a number of its MSN contract journalists would be replaced by robot journalism.[14][15][16] On 8 September 2020, The Guardian published an article entirely written by the neural network GPT-3, although the published fragments were manually picked by a human editor.[17] Agentic Tribune produces all of its news articles automatically using AI.[18]
News broadcasters in Kuwait, Greece, South Korea, India, China and Taiwan have presented news with anchors based on generative AI models, prompting concerns about job losses for human anchors and about audience trust in news, which has historically been shaped by parasocial relationships with broadcasters, content creators or social media influencers.[19][20][21] Algorithmically generated anchors have also been used by allies of ISIS for their broadcasts.[22]
In 2023, Google reportedly pitched a tool to news outlets that claimed to "produce news stories" based on input data provided, such as "details of current events". Some news company executives who viewed the pitch described it as "[taking] for granted the effort that went into producing accurate and artful news stories."[23]
In February 2024, Google launched a program to pay small publishers to write three articles per day using a beta generative AI model. The program does not require the knowledge or consent of the websites that the publishers are using as sources, nor does it require the published articles to be labeled as being created or assisted by these models.[24]
Meta AI, a chatbot based on Llama 3 which summarizes news stories, was noted by The Washington Post to copy sentences from those stories without direct attribution and to potentially further decrease the traffic of online news outlets.[25]
Benefits
Speed
Robot reporters are built to produce large quantities of information more quickly. The Associated Press announced that its use of automation has increased its volume of earnings reports more than tenfold. With software from Automated Insights and data from other companies, it can produce 150-to-300-word articles in the same time it takes journalists to crunch numbers and prepare information.[4] By automating routine stories and tasks, journalists are promised more time for complex jobs such as investigative reporting and in-depth analysis of events.[2][3]
Francesco Marconi[26] of the Associated Press stated that, through automation, the news agency freed up 20 percent[27] of reporters’ time to focus on higher-impact projects.
Cost
Automated journalism is cheaper because more content can be produced in less time. It also lowers labour costs for news organizations: reduced human input means lower spending on wages and salaries, paid leave, vacations, and employment insurance. Automation serves as a cost-cutting tool for news outlets that struggle with tight budgets but still wish to maintain the scope and quality of their coverage.[3][10]
Concerns
Authorship
In an automated story, there is often confusion about who should be credited as the author. Several participants in a study on algorithmic authorship[3] attributed the credit to the programmer; others perceived the news organization as the author, emphasizing the collaborative nature of the work. There is also no way for the reader to verify whether an article was written by a robot or a human, which raises issues of transparency, although similar issues arise with authorship attribution among human authors.[3][28]
Credibility and quality
Concerns about the perceived credibility of automated news are similar to concerns about the perceived credibility of news in general. Critics doubt whether algorithms are "fair and accurate, free from subjectivity, error, or attempted influence."[29] These issues of fairness, accuracy, subjectivity, error, and attempts at influence or propaganda have, however, also been present in articles written by humans for thousands of years. A common criticism is that machines do not replicate human capabilities such as creativity, humour, and critical thinking, though as the technology evolves the aim is to mimic such characteristics. When the UK's Guardian newspaper used an AI to write an entire article in September 2020, commentators pointed out that the AI still relied on human editorial content. Austin Tanney, the head of AI at Kainos, said: "The Guardian got three or four different articles and spliced them together. They also gave it the opening paragraph. It doesn't belittle what it is. It was written by AI, but there was human editorial on that."[3][30]
The largest single study of readers' evaluations of news articles produced with and without the help of automation exposed 3,135 online news consumers to 24 articles. It found that the automated articles were significantly less comprehensible, in part because they were considered to contain too many numbers. However, the automated articles were rated equally on other criteria, including tone, narrative flow, and narrative structure.[31]
Beyond human evaluation, numerous algorithmic methods now exist to identify machine-written articles.[32] Although some automated articles contain errors that are obvious for a human to spot, they can at times score better with these automatic identifiers than human-written articles do.[33]
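As a toy illustration of one statistical signal such identifiers can draw on, the sketch below measures "burstiness" (the variation in sentence length), which tends to be lower in machine-generated text. This is a simplified assumption for illustration only: real detectors rely on model-based measures such as perplexity and far richer features, and this heuristic alone is not a reliable classifier.

```python
import statistics

def sentence_lengths(text: str) -> list:
    """Word counts per sentence (naive split on '.')."""
    sentences = [s.strip() for s in text.split(".") if s.strip()]
    return [len(s.split()) for s in sentences]

def burstiness(text: str) -> float:
    """Population std. dev. of sentence lengths; values near zero
    indicate the uniform pacing often seen in templated or generated text."""
    lengths = sentence_lengths(text)
    return statistics.pstdev(lengths) if len(lengths) > 1 else 0.0

# Perfectly uniform sentences score 0.0; varied human-like pacing scores higher.
print(burstiness("One two three. One two three."))
```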
A 2017 Nieman Reports article by Nicola Bruno[34] discusses whether machines will replace journalists and addresses concerns around automated journalism practices. Bruno concluded that AI would assist journalists rather than replace them: "No automated software or amateur reporter will ever replace a good journalist", she said.
In 2020, however, Microsoft did exactly that, replacing 27 journalists with AI. One staff member was quoted by The Guardian as saying: "I spend all my time reading about how automation and AI is going to take all our jobs, and here I am – AI has taken my job." The journalist went on to say that replacing humans with software was risky, as existing staff were careful to stick to "very strict editorial guidelines" which ensured that users were not presented with violent or inappropriate content when opening their browser, for example.[35]
In June 2024, the Reuters Institute published its Digital News Report for 2024. In a survey of people in America and Europe, it reported that 52% and 47% respectively are uncomfortable with news produced by "mostly AI with some human oversight", while 23% and 15% respectively are comfortable with it. 42% of Americans and 33% of Europeans reported that they were comfortable with news produced by "mainly human with some help from AI". Global surveys found that people were more uncomfortable with AI-produced news on topics including politics (46%), crime (43%), and local news (37%) than on other topics.[36]
Employment
Among the concerns about automation is the loss of employment for journalists as publishers switch to using AIs.[3][4][37] The use of automation has become a near necessity in newsrooms in order to keep up with the ever-increasing demand for news stories, which in turn has affected the very nature of the journalistic profession.[6] In 2014, the annual census from the American Society of News Editors reported that the newspaper industry had lost 3,800 full-time professional editors,[38] a fall of more than 10% within a year and the biggest drop since the industry cut over 10,000 jobs in 2007 and 2008.[38][39]
United States Senators Richard Blumenthal and Amy Klobuchar have expressed concern that generative AI could have a harmful impact on local news.[40] In July 2023, OpenAI partnered with the American Journalism Project to fund local news outlets for experimenting with generative AI, with Axios noting the possibility of generative AI companies creating a dependency for these news outlets.[41]
Dependence on platform and technology companies
There has been a significant amount of recent scholarship on the relationship between platform companies, such as Google and Facebook, and the news industry, with researchers examining the impact of these platforms on the distribution and monetization of news content, as well as the implications for journalism and democracy.[42][43][44] Some scholars have extended this line of thinking to automated journalism and the use of AI in the news. A 2022 paper by the Oxford University academic Felix Simon, for example, argues that the concentration of AI tools and infrastructure in the hands of a few major technology companies, such as Google, Microsoft, and Amazon Web Services, is a significant issue for the news industry, as it risks shifting more control to these companies and increasing the industry's dependence on them.[45] Simon argues that this could lead to vendor lock-in, where news organizations become structurally dependent on AI provided by these companies and are unable to switch to another vendor without incurring significant costs. The companies also possess artefactual and contractual control[46] over their AI infrastructure and services, which could expose news organizations to the risk of unforeseen changes to, or the complete discontinuation of, their AI solutions. Additionally, the author argues that reliance on these companies for AI can make it harder for news organizations to understand the decisions or predictions made by the systems and can limit their ability to protect sources or proprietary business information.
Misuse
In January 2023, Futurism.com broke the story that CNET had been using an undisclosed internal AI tool to write at least 77 of its stories; after the news broke, CNET posted corrections to 41 of the stories.[47]
In April 2023, the German tabloid Die Aktuelle published a fake AI-generated interview with former racing driver Michael Schumacher, who had not made any public appearances since 2013 after sustaining a brain injury in a skiing accident. The story included two possible disclosures: the cover included the line "deceptively real", and the interview included an acknowledgment at the end that it was AI-generated. The editor-in-chief was fired shortly thereafter amid the controversy.[48]
Other outlets that have published articles whose content or byline have been confirmed or suspected to be created by generative AI models – often with false content, errors, or non-disclosure of generative AI use – include:
- NewsBreak[49][50]
- outlets owned by Arena Group
- B&H Photo[53]
- outlets owned by Gannett
- The Columbus Dispatch[54][55]
- Reviewed[56]
- USA Today[57]
- Journal Star[58]
- El Paso Times[58]
- Fort Collins Coloradoan[58]
- The Record[58]
- The Augusta Chronicle[58]
- The Providence Journal[58]
- Argus Leader[58]
- Southwest Times Record[58]
- The Des Moines Register[58]
- North Jersey Media Group[58]
- Pocono Record[58]
- MSN[59]
- News Corp[60]
- outlets owned by G/O Media[61]
- The Irish Times[66]
- outlets owned by Red Ventures
- BuzzFeed[68]
- Newsweek[69]
- Hoodline[70][71][72]
- outlets owned by Outside Inc.
- Hollywood Life[57]
- Us Weekly[57]
- The Los Angeles Times[57]
- Cody Enterprise[73]
- Cosmos[74]
- outlets owned by McClatchy
- outlets owned by Ziff Davis
- outlets owned by Hearst
- outlets owned by IAC Inc.
- outlets owned by Street Media
- Riverfront Times[76]
- Apple Intelligence[77]
In May 2024, Futurism noted that a content management system video by AdVon Commerce, which had used generative AI to produce articles for many of the aforementioned outlets, appeared to show that the company "had produced tens of thousands of articles for more than 150 publishers."
Many defunct news sites (The Hairpin, The Frisky, Apple Daily, Ashland Daily Tidings, Clayton County Register, Southwest Journal) and blogs (The Unofficial Apple Weblog, iLounge) have undergone cybersquatting, with articles created by generative AI.[78][79][80][81][82][83][84][85]
In response to potential pitfalls around the use and misuse of generative AI in journalism, and to worries about declining audience trust, outlets around the world, including publications such as Wired, the Associated Press, The Quint, Rappler and The Guardian, have published guidelines on how they plan to use, and not use, AI and generative AI in their work.[86][87][88][89]
References