Journalism’s Quiet Revolution

As Pakistan’s media landscape undergoes a profound technological transformation, the integration of AI into journalism offers both exciting opportunities and serious risks. On one hand, AI provides tools that streamline routine newsroom tasks, automate transcription and translation, facilitate cross-lingual and multilingual reporting, and support data-driven investigative journalism that was once beyond imagination. AI can serve as a catalyst, enhancing the performance of smaller media organisations and enabling them to compete with larger outlets through improved efficiency and scale.
But the risks run as deep as the rewards. Unchecked AI-generated content can erode public trust by spreading misinformation and deepfakes. In a country as diverse and politically dynamic as Pakistan, there is a real danger of algorithmic bias, where AI systems replicate and reinforce societal discrimination. Moreover, excessive reliance on automated tools may marginalise the human editorial judgment that is the foundation of ethical journalism.
The dilemma I call ‘bias, bots and bylines’ must be addressed as rapidly as the technology is transforming journalism. Bias seeps in and is reinforced when algorithms trained on skewed datasets begin deciding for you what news you see and what you don’t. Bots saturate digital spaces with synthetic content that mimics real reporting, making it increasingly difficult to distinguish fact from fiction. And bylines, the human voices of journalism, can be overshadowed or replaced by automated systems that lack contextual understanding, accountability and empathy. This dilemma is not just technical; it is existential. If media organisations allow bots to dominate the news industry in the age of fake news and deepfakes without an ethical framework for AI usage, we risk losing what is fundamental to good investigative and ethical journalism. If journalists fail to interrogate the biases embedded in datasets and AI systems, we risk reinforcing the discrimination and inequalities that ethical journalism exists to expose. And if we neglect human editorial oversight and the journalist’s byline, we risk turning newsrooms into content factories rather than ethical spaces of truth, accountability and inquiry. This is why I believe a policy framework for newsrooms and journalists is essential to define how AI can and should be used.
This dilemma of ‘bias, bots and bylines’ is not confined to Pakistan; it reverberates in newsrooms globally. Countries facing similar challenges have begun developing ethical frameworks for AI use to ensure that it enhances journalism rather than endangering journalists.
Nordic newsrooms have been among the earliest adopters of AI, incorporating new tools to enhance the efficiency of their staff. At the same time, they pioneered a ‘human-in-the-loop’ model, in which AI assists with transcription and metadata tagging but final editorial decisions remain in human hands, ensuring oversight without compromising journalistic integrity. Similarly, the Universidad de Santiago de Compostela in Spain has collaborated with media organisations to examine algorithmic biases and promote inclusive datasets, highlighting the need for accountability, impartiality and transparency in how AI tools make decisions.
The United States provides another useful example. The Poynter Institute, a prominent journalism think tank, has created AI ethics guidelines that emphasise editorial oversight, public trust and transparent labelling of synthetic content; several American newsrooms now use these guidelines to train journalists in responsible AI use. In Europe, the European Union’s AI Act seeks to regulate high-risk applications to prevent discrimination and enforce accountability. These global practices underscore the urgency of Pakistan framing its own guidelines for AI in journalism.
In Karachi, I recently had the privilege of joining a powerful three-day convening hosted by Media Matters for Democracy, supported by the European Union, where we witnessed a transformative moment for journalism in Pakistan. The gathering brought together some of the country’s most respected newsroom directors, editors and media academics to co-create a standards framework for integrating artificial intelligence into our newsrooms.
This initiative, led by Asad Baig, Founder of Media Matters for Democracy, was not only timely but visionary. It marked a significant step towards ethical and innovative journalism in the age of AI, at a moment when technology is rapidly reshaping how we verify, cover and report the news, and how we engage with our audiences. The three-day dialogue was rich and multilayered, and a step towards joining global best practices. We explored how AI can help newsrooms work more efficiently, enhance fact-checking and even support investigative journalism by combing through and analysing vast datasets. But we also discussed the risks: algorithmic bias, misinformation and fake news, editorial opacity and the erosion of human judgment.
With renowned senior journalists in attendance, the convening became a platform for rigorous debate and meaningful consensus. Together, we laid the groundwork for a framework that will guide Pakistani newsrooms in integrating AI responsibly and ethically. It demands human oversight, editorial transparency and inclusive data practices. It is a reminder that AI is a tool, not a replacement for judgment. And it centres journalists, not algorithms, in the newsroom.
I believe that alongside a standards framework, there is an urgent need for capacity building in the ethical use of AI across Pakistan’s media sector. Many newsrooms, especially smaller or regional ones, lack the technical resources and expertise to engage with AI and integrate it into their work meaningfully and ethically. This transformation in journalism must be driven by ethics and responsibility, not just speed and efficiency. A standards framework is a significant first step, but it must be supported by capacity building for journalists and media professionals across the country; without such support, AI risks being used inappropriately in ways that undermine journalism. We need to invest in training, infrastructure and ethical literacy. Journalists need to know not only how AI is used, but also how it can be questioned, challenged and deployed responsibly. Editorial independence must be protected. Human judgment must be prioritised. And the principles of truth, accountability, impartiality and public service must remain non-negotiable.
To continue this journey, we will host a Sahafi Summit on AI in journalism with Media Matters for Democracy this November at the Department of Media & Development Communication, Punjab University, Lahore. The summit will ignite a dialogue around the ethical use of AI in Pakistan’s journalism. It is not just a conference; it is a commitment to provide journalists, technologists, academics and policy thinkers with a platform to explore the intersection of media, AI and responsibility. I believe the future of journalism in Pakistan depends on the ability of journalists to use AI without giving up their souls. Let us imagine a media ecosystem in which technology augments human wisdom rather than distorting it; one in which biases are challenged, bots are regulated, and bylines remain to amplify the public’s voice and speak truth to power.
Dr. Ayesha Ashfaq
The writer is the Chairperson and Associate Professor at the Department of Media & Development Communication, University of the Punjab, Lahore.