Artificial intelligence has entered journalism at precisely the moment when newsrooms are losing people, not gaining tools. Across Pakistan, and across much of the global media industry, layoffs have become a steady, almost background condition of the profession. Reporters are let go, copy desks are thinned, senior editors take early exits, and entire beats quietly disappear, not because the need for reporting has diminished but because the business model that once sustained it has fractured. Similar patterns are visible in media houses from South Asia to North America and Europe, where successive rounds of job cuts have been justified by falling revenues, declining print circulation, and the migration of advertising to platforms that do not produce journalism themselves. Into this vacuum steps AI, often presented as a neutral technological upgrade, but increasingly functioning as a way to maintain output while human capacity contracts. The danger lies not in the technology, but in the timing: AI is being adopted not into healthy newsrooms, but into hollowing ones.
The appeal is obvious. Generative systems can write clean copy, summarise press briefings, translate statements, generate headlines, produce scripts, and even deliver news through synthetic presenters, all at a speed and scale no shrinking newsroom can match. For managements under financial pressure, this looks like resilience. For journalists watching colleagues disappear from payrolls, it feels more like substitution. Journalism’s historical strength has never been volume or velocity; it has been judgment — the layered, human process of verification, contextualisation, scepticism, and correction. Those layers are labour-intensive by design. When newsrooms lose subeditors, researchers, senior reporters, and fact-checkers, they lose the internal friction that catches mistakes and challenges weak claims. AI does not replace that friction; it smooths over it, producing text that reads well even when the reporting behind it has quietly diminished.
This is where misinformation becomes less an external threat and more an internal risk. Generative AI does not need to invent falsehoods to cause harm. It can simply reproduce existing inaccuracies, partial truths, and misleading frames with greater polish and reach. It excels at coherence, not truth. In an understaffed newsroom, where fewer people have the time or mandate to interrogate claims, AI-assisted production can normalise error through repetition. What once might have been caught by an experienced editor or a beat reporter with institutional memory now moves quickly from draft to publication because it sounds right. Over time, this creates a media environment filled with fluent but fragile journalism — content that looks authoritative but rests on thinner verification.
Pakistan’s media industry is encountering this shift in real time. Experiments that lean heavily on automation, such as Saga Digital, are often discussed as innovation stories, but they are better understood as signals of where the industry is being pushed. These models show how a small team, supported by AI tools and synthetic production pipelines, can generate a steady flow of news-like material even as traditional newsroom roles shrink. The point is not to single out any one initiative, but to recognise the broader logic at work: fewer journalists, more output, less visible reporting labour. When such approaches scale in an environment already marked by layoffs, the risk is that journalism begins to resemble content manufacturing — efficient, continuous, and increasingly detached from the slow work of reporting.
This shift also complicates accountability. Journalism has historically relied on visible human responsibility — bylines, editors’ notes, corrections — to sustain trust even when errors occur. As AI becomes embedded in everyday production, those signals blur. When content is assembled through prompts, templates, and machine-generated drafts, responsibility diffuses. Mistakes feel systemic rather than owned, and corrections feel procedural rather than corrective. For audiences already sceptical of media, this opacity deepens distrust. The problem is not that AI is being used, but that it is often being used quietly, without a parallel conversation about standards, disclosure, and limits.
There are, however, other ways of integrating AI that do not depend on reducing journalism to automation. The approach taken by Al Jazeera through its internally developed AI system, commonly referred to as The Core, offers a useful contrast. Rather than positioning itself as an AI newsroom, the organisation has framed AI as infrastructure — a set of tools to assist journalists with research, data analysis, translation, and workflow efficiency, while keeping editorial judgment human. The emphasis is not on replacing reporters or editors, but on strengthening their capacity in an information environment that is already overwhelming. This model recognises that technology can thicken journalism’s back-end processes without thinning its front line.
The larger pattern remains troubling. Media houses facing economic pressure are cutting people first and adding technology later, hoping the latter will compensate for the former. But journalism does not scale cleanly in reverse. Once institutional memory is lost, once beats are abandoned, once editorial scepticism is weakened, no amount of automated fluency can restore what has been lost. The result is an industry that publishes more while knowing less, speaks faster while listening less, and corrects itself more slowly because fewer people are left to notice what went wrong.
The future of journalism after AI is therefore not a question of whether machines can write. They already can. It is a question of whether media institutions are willing to protect the human labour that gives journalism its weight. In an era of layoffs and automation, the real scarcity is not content but credibility. If AI is used to support reporting — to help journalists see patterns, process information, and work more effectively — it can strengthen the profession. If it is used to paper over the loss of journalists themselves, it will produce a great deal of fluent text and very little sturdy journalism.