As you read this article, you may be consciously or subconsciously wondering whether it is computer-generated text. While I promise you it is the work of yours truly, I cannot blame you for doubting it. This is one way technology is reshaping content and the media, but it is not the only way.
Within the Department of Artificial Intelligence at the University of Malta, we recognise this shifting landscape and its implications. For this reason, we are dedicating several research efforts to tools that can help humanity produce and consume media content, including news.
The easy way out would be to work on AI models that can create content and ride the current hype of generative AI.
Instead, we are harnessing the positive power of generative AI and modern machine learning to produce byproducts that help humans – journalists and readers alike. Our mission is to create technology that makes journalism more sustainable and fair. The following principles guide us.
Principle 1: Empowerment without replacement
We believe the fear that AI will replace humans serves speculation and hype more than reality. Like any other technology that reshaped its moment, AI will reshape what we do today. It is up to us to choose how, and we are choosing sustainability and fairness.
Put aside the hyped-up versions and ideas you may have about AI. Its real value lies in approximating patterns from data.
This means it can instantly process a large volume of written text, say 10 books, and then present outlines, trends, similarities or an analysis. It can also approximate new insights or perspectives based on the processed data. This is one way of empowering journalists: AI is not there to write their articles, but it can help them work through the ever-growing volume of information. This frees their human brain power to carve out relevant, tailored perspectives for their audience and to serve their role as the fourth pillar of democracy.
Principle 2: Transparency and explainability
There is no free lunch in life, and the same goes for AI. While approximations are helpful at scale, they are still approximations at the end of the day. This means that the output might not always be accurate, and in some cases, it also carries forward any bias found in the data analysed. Society needs news that is more accurate and less biased.
This is one of the main challenges when we use AI in journalism or news analysis. It is not acceptable to excuse AI for such output or treat it as a black box and wait for its output with fingers crossed.
As scientists, it is our responsibility to devise ways of making AI models more open, explainable and interpretable, and this is a priority in our work.
Principle 3: Human in the loop
Forget computers and AI for a moment. Whenever we make a decision, we split the process into two parts. First, we analyse the situation and develop alternative options or predictions. Second, we judge which is the best way forward and conclude. The value of AI lies in the first part: providing insights, predictions and alternatives.
As things stand today, judgement should be left for humans to make, assuming they have good information upon which to base their judgement. This mitigates most of the risks of AI while also addressing the issue of accountability.
This translates to a situation where AI would provide journalists and readers with an analysis of the content and leave conclusions for humans to make. It is not a good idea to delegate the key part of our decision-making process to a machine that only knows how to approximate it.
Principle 4: Media literacy through AI
We live in an age where AI poses a risk to how information is spread and communicated. It should also be employed to counter these risks. Media literacy and the human ability to intuitively and accurately understand what is being communicated are becoming more critical for the survival of democracy. In our specialised research projects, we are developing AI tools that assist readers and journalists in strengthening critical thinking and independent judgement.
What is the future of journalism?
While I will not make any predictions, I will stick to the obvious: significant transformations will occur, AI is here to stay, and it will only improve. The transformation ranges from how we consume the media, to how news stories are built, to the business models that will allow newsrooms to survive. It is and will be a bumpy ride for all who mean well.
Journalism has a sustainable future in the age of AI, one built on collaboration between AI researchers, journalists, and the public. While this article focuses on journalism, these principles and thoughts apply to any area or industry.
The future is not there for us to speculate about, but to build together.
Those who wish to be part of this research can express their interest here: www.ainewsanalysis.org.
The project 'Exploring Visual Bias in News Content using Explainable AI' (NBxAI) is financed by Xjenza Malta, for and on behalf of the Foundation for Science and Technology, through the FUSION: R&I Research Excellence Programme.
Dylan Seychell is a lecturer in the Department of Artificial Intelligence.