There’s an old saying: ‘Too much caution means not enough risk’. So for news publishers, getting the balance right between embracing AI and not allowing AI-generated content to damage their integrity and erode audience trust is, quite rightly, the subject of considerable thought and debate.
Mixing ‘real’ content with AI-generated synthetic media on output will undoubtedly have a devastating effect on a news publisher’s integrity and the audience’s trust in that publisher. The argument that labelling ‘synthetic media’ as AI generated will preserve audience trust is simply denial on the publisher’s part. A 2017 study by Deloitte found that 91% of consumers accept terms and conditions without reading them when downloading apps or software. So who reads the small print on an image or video when they’re engrossed in the article itself? No one.
There’s also the very real concern that AI ‘hallucinates’, or in other words, like that drunk bloke who’s always down the pub, ‘makes stuff up’ and generates inaccurate information. And it’s fair to say that in an attempt to be helpful and AI-mazing, it aggregates content from different sources without any deep understanding of how information is really linked in the real world.
I’ve been delivering AI News & Media Production training for broadcasters, digital publishers, news channels and content agencies for about a year now, and I’ve been asked many times whether newsrooms will one day be run entirely by AI, from idea creation and news-gathering to programme broadcast and every step in between.
Well, the honest answer to that is yes. It has basically happened already at one or two organisations.

Is it a success? Well, it works, and once you know how, it’s reasonably easy to implement and set up. Is it what audiences want? No.
As well as delivering AI media training, I also deliver broadcast documentary training for broadcasters and publishers all over the world. And, spoiler alert, the essence of that course is that all good storytelling is about relating everything to the ‘human condition’: find ‘that thing’ in the story that makes the audience care. I’ve developed an algorithm you can apply to fast-track you to that ‘essence’, and it works every time.
So surely you can give that same algorithm to AI and it’ll be able to do the same?
No.
We can ‘run’ that algorithm in our heads, processing and discriminating what’s relevant, because we’re human. AI cannot.
AI cannot generate stories or content that connect with and affect people in their hearts. That’s what powerful storytelling is all about. That’s what audiences want. It can’t do that, because it can’t feel. It may have been taught about what makes us feel, but it can’t feel. Ignore this and, as an organisation, you’ll die.
So should we forget about using AI in news and media production? Well, it would seem so, yes. But if you do, as an organisation, you’ll die.
AI is not about generating your output. If you want a connection with your audience, only humans can do that. But like a power drill compared to a hand-crank drill, AI as an individual production tool (or tools) can supercharge your workflow. And if you don’t embrace that aspect of AI, your competitors will, and whilst you’re busy banging on about how much you care about your audiences, as an organisation, you will die.
Embracing AI is not some binary process: do we do it, or do we not?
Any media or content production workflow can be supercharged by the careful integration of the right AI tools at, potentially, each and every stage of the production process, perhaps initially working alongside conventional production processes. It’s about working smarter, creating better content, working more efficiently, and using AI to actually serve and connect with your audiences better. In election campaigns, political parties are using AI to implement what is known as ‘micro-targeting’. The ethics of that are a whole other conversation, but as ‘publishers’ we must learn how to serve our audiences better. And the key to that IS AI.
Rejecting AI outright in news and media production until we understand it better is a mistake. AI is not about creating synthetic output; it’s about helping us create our human content better and getting it to the audience more efficiently. We need to understand the difference between AI-generated synthetic media and AI production tools that can supercharge the whole production workflow.
They say the biggest risk is not taking any risks! With the right information and guidance, AI is no risk at all!
Dean Arnett is a Video Production Specialist, AI Consultant and Documentary Producer. He is known as ‘The Human AI Guy’. He provides AI Consultancy for News and Media Publishers, as well as delivering a number of ‘AI in News & Media Production’ training courses.
Find out more at www.ai.deanarnett.com