By Freddy Mayhew
A recent Press Gazette poll asking readers whether AI robots were a threat or an opportunity to journalism found that a majority (69 per cent) of more than 1,200 voters saw AI as a threat.
But while what’s known as “artificial general intelligence” – machines with intelligence akin to or exceeding that of humans – does not yet exist and may never be fully realised, AI tools are already in use in the news industry today.
These tools help in the gathering, production and distribution of information. They fall broadly under the definition of “machine learning”, a subset of AI in which computers learn from data to perform specific tasks, improving as they go without explicit human instruction.
Both Facebook and Google depend on AI to enhance the user experience, showing readers more of what they like or predicting search queries, for example. Even spellcheckers are a type of AI.
Within the news industry AI tools are already used to personalise newsfeeds, scan social media for stories, moderate reader comments or process huge volumes of data to aid investigations.
Natural language processing tools can produce written transcripts from audio or video recordings, while natural language generators can write simple stories based on datasets – both are types of AI tool.
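To give a sense of how this kind of data-to-text generation typically works, here is a minimal sketch in Python. The template, field names and figures are invented for illustration and do not reflect any particular vendor’s system – a journalist writes the template once, and software fills it for each row of data:

```python
# A minimal sketch of template-based natural language generation, the
# technique behind data-to-text tools. The template, field names and
# example figures are hypothetical, not any real system's.
TEMPLATE = ("House prices in {area} {direction} by {change:.1f} per cent "
            "in {month}, new figures show.")

def generate_story(row):
    """Fill the human-written template with one row of data."""
    direction = "rose" if row["change"] >= 0 else "fell"
    return TEMPLATE.format(area=row["area"], direction=direction,
                           change=abs(row["change"]), month=row["month"])

rows = [
    {"area": "Leeds", "change": 2.3, "month": "June"},
    {"area": "Bristol", "change": -1.1, "month": "June"},
]
stories = [generate_story(r) for r in rows]
```

The editorial judgement – which dataset, which angle, which wording – stays with the journalist; the software only repeats that judgement at scale.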
A survey of journalists working with AI at 71 news organisations from 32 countries, carried out by LSE and think-tank Polis, found that just under half of respondents said they used AI for newsgathering, two-thirds said they used it for production and just over half for distribution.
Just over one-third of respondents to the survey, published in November last year, claimed to have an active AI strategy, while 44 per cent said AI was already having an impact on their news organisations.
A majority of respondents (68 per cent) said they had started adopting AI technology in their newsrooms to make journalists’ work more efficient.
AI journalism tools in use at newsrooms
The Times cut digital subscriber churn in half last year using AI software dubbed JAMES (Journey Automated Messaging for Higher Engagement), which created newsletters tailored to readers’ interests.
Bloomberg’s Cyborg AI tool helps reporters to be first with the headlines on financial market movements in a competitive environment.
UK news agency PA’s Radar (Reporters and Data and Robots) service, which launched in 2017, uses software to produce localised data stories at scale and speed: up to 30,000 a month for its local news clients.
Radar editor Joseph Hook told Press Gazette that the process remains “in the hands of a journalist” who finds the data, chooses the angle and writes the template that will be the foundation for all other iterations of a story.
“I think it doesn’t quite fit into how people assume AI is going to work in journalism,” said Hook, adding: “It doesn’t do all the legwork for you… it’s using software and AI to scale up what we are putting out.”
Hook said the service “fills the skill gap that a lot of journalists have where they aren’t comfortable with numbers and data” and reporters “can be confident that we have got those numbers right”.
He added: “It also allows journalists to play to their strengths more. It might be that they take our stories in full, but a lot of the time we see journalists use it as a base and go out and find local case studies.”
Looking ahead, Hook said: “My instinct is that, over time, this will start to take on more of the less enjoyable repetitive work… and allow journalists to go out and do the human element of the work.”
But he added that journalists “have to continue to be integral to journalism, because there is a certain amount of context and understanding and engagement with the readership that only humans can have”.
“I think that will always be key,” he noted, adding: “There will always be a level of risk in leaving AI or a computer completely free to choose stories and data because often data can contain anomalies.”
Microsoft made headlines recently after it turned over curation of news stories on its website and app to AI software, resulting in mass redundancies – including 27 jobs at PA. One affected journalist told Press Gazette: “My job’s been replaced by a robot. It doesn’t feel good.”
The AI quickly made an error, however, when it accompanied a story about a Little Mix star’s experience of racism with an image of her bandmate.
Microsoft later said the error had been the result of a new feature where its AI software would select an alternative image for its homepage snippets. The tech company declined to speak with Press Gazette for this piece.
Although it could be read as a cautionary tale against handing full control over news decisions to an algorithm, in this instance the AI is only repeating a mistake made many times before by human journalists.
AI use in journalism is ‘important and growing’
At Reuters, AI technology is seen as “already important and growing”, but humans still have the final say – for now, according to global editor for media news strategy Jane Barrett. “We don’t let the machine talk directly to our clients yet,” she said.
Barrett said AI helps journalists by “taking away the drudgery” of tasks such as stock market reports and inputting sport match results. The machine does it faster and in multiple languages, saving time and resources.
“I’m delighted to give the boring jobs to the robots,” she said, adding: “It will free journalists up from the rote stuff to do the value-added stuff which is much more fun frankly.”
Reuters newsrooms use an AI tool called News Tracer to spot breaking news on Twitter and rank potential stories by newsworthiness. Lynx Insight is used to identify trends in the publisher’s vast financial datasets and suggest stories to journalists. It can even be used to write sentences.
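For a rough sense of what “ranking by newsworthiness” can mean in practice, here is a deliberately simplified sketch. The features and weights below are invented for illustration; a production tool such as News Tracer weighs far richer signals, including source history, clustering of independent reports and verification checks:

```python
# A toy illustration of scoring social media posts for newsworthiness.
# Features and weights are invented; this is not News Tracer's method.
def newsworthiness(post):
    score = 0.0
    score += min(post["shares"], 1000) / 1000          # virality, capped
    score += 0.5 if post["author_verified"] else 0.0   # crude credibility proxy
    score += 0.3 * len(post["independent_sources"])    # corroboration
    return score

posts = [
    {"id": 1, "shares": 40,  "author_verified": False, "independent_sources": []},
    {"id": 2, "shares": 900, "author_verified": True,  "independent_sources": ["a", "b"]},
]
# Highest-scoring posts surface first for a human journalist to assess.
ranked = sorted(posts, key=newsworthiness, reverse=True)
```

Even in this toy version, the output is a ranked queue for a journalist to check, not a publishing decision.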
Opta provides Reuters journalists with sports data feeds, which can then be matched with metadata from images taken by the agency’s own photographers to stitch together video match summaries.
Reuters has even experimented with so-called “deepfakes” – or “synthetic media”, as the technology is known when applied legitimately – to create a virtual presenter based on one of its own journalists.
Understanding how such technology works will allow journalists to spot fakes, and AI is expected to play as much of a role in fighting disinformation as it does in proliferating it.
“I think AI is seen as the monster under the bed, but actually it’s already one of the tools that we are happy to use. It’s just working out how to redeploy those tools to better serve the problems that we have,” Barrett said.
She said AI tools such as facial recognition could soon be used to search through news archives and pull out material and images relevant to a news story being written by a reporter today.
“I think we will get to the stage where we don’t need a human journalist, but it will be up to each publication as to what they are happy to put through, and there will be regular testing,” said Barrett.
This could see the return of now-redundant print journalism roles, such as a copy desk manned by a human editor who looks through stories written by robots for readability and relevance.
AI is ‘making the journalism better’
Media professor and Polis director Charlie Beckett, who heads up LSE’s Journalism AI project, echoes the view that AI is not there to replace journalists, but to help them, in what’s known as “augmented journalism”.
“It’s there to help journalists connect with customers, discover stories, write articles,” he said, adding: “These tools are seen as supplementing the journalists’ work and journalistic values.”
Beckett dismissed fears of a robot takeover, saying they “don’t exist anyway” and adding that “if your job can be replaced by an algorithm you have got to ask yourself why you were doing the job in the first place”.
“The 2001 Space Odyssey robot taking over – I don’t think we need to worry about it. I would be more worried about Facebook founder Mark Zuckerberg taking over,” he noted.
Beckett elaborated: “The work being done with the technology is really pushing the boundaries. It’s making the journalism better, not just cheaper or more efficient. It’s about saying: ‘Look, we have got this great product, how can we make sure that people connect to it?’”
The huge amount of data generated about the spread of the Covid-19 pandemic has allowed newsrooms to use AI to keep track of it in a way that a team of human journalists would have struggled to match.
Beckett said AI tools have given journalists “the time to do the human stories” on Covid-19, away from the number crunching, and that those stories are “better when done by humans rather than machines”.
Hook, meanwhile, said that his Radar team had ramped up its content on Covid-19 figures, doubling output to local newsrooms in recent weeks. The average number of views per article has nearly doubled in the past month.
“The opportunity is there because of the amount of data being published,” said Hook, adding: “There has never been a data story that everyone has engaged with to this extent. We are interested to see whether this helps to drive interest in other data stories.”
Fears that newsrooms will fall behind on AI
In AI, as in other parts of the news media, the role of the technology companies is hugely influential. Advanced AI software is not realistically going to be developed by a newsbrand, but by a technology firm.
Both Beckett and Barrett said they feared journalism falling behind on this new trend and failing to shape its development.
“It’s clear that the world is going to be increasingly dependent on technology companies. That’s in every field, including news media,” said Beckett.
He said he hoped the news media “gets its act together so that it retains some control over the technology” and understands how it works.
“I think AI is going to affect pretty much every industry and it will take away jobs in every industry and I don’t think journalism will be immune from that. My hope is actually that journalism doesn’t fall behind,” noted Barrett.
She added: “We are short of cash as an industry. AI engineers are very sought after – most of them are being snapped up by Google or Facebook and the like.”
It’s not a robot takeover that journalists should fear, but failing to spot the direction of travel and being left behind.
Article published courtesy of WAN-IFRA Executive News Service.