‘Confirmed: Pakistan confirms the loss of two JF-17 jets. Following a series of escalations, the Pakistan military has confirmed that two JF-17 jets were shot down by the Indian defence system, along with a drone and an Airborne Warning and Control System (AWACS)’, declared ChatGPT with terrifying confidence. This was in response to a prompt I typed in as part of a training module meant to demonstrate how dangerously plausible a hallucination can sound when dressed in the language of breaking news.
I work in media development and technology integration for journalism, so conceptualising and experimenting with AI-based solutions for newsrooms is quite literally my day job. I have seen my fair share of hallucinating language models churning out confident nonsense. But this one hit differently. The confidence in the language wasn’t just casual; it was newsroom-ready: polished, structured and dripping with the kind of authority that editors crave in a breaking news brief. If I weren’t the one who prompted it, I could have easily mistaken it for an actual wire update.
This is one of the biggest arguments used against the role of AI in journalism: that it cannot be trusted. It fabricates facts, hallucinates crises and distorts reality with the cold charm of certainty. But here’s what gets left out of that critique: the problem isn’t that the machine is hallucinating. It’s that journalists aren’t in the room, training it not to. The absence of credible, contextual, well-produced journalism from our part of the world leaves a vacuum that machines fill with whatever noisy digital content they can find. If we’re not producing the data, we don’t get to complain about how it’s interpreted.
Which is exactly why we need journalists, not just hovering around this space, but right in the thick of it. Not just embedded, but leading the transformation from within. Whether we like it or not, AI is no longer some distant concept on the tech horizon. It’s already here, crawling through our archives, scraping the web and subtly rewriting how history is recorded and retrieved. We can’t afford to sit on the sidelines as spectators or treat it like a passing trend. If journalists don’t step in now to shape how these systems understand truth, credibility, and context, then someone else will.
And they won’t be doing it in service of the public interest. Journalism isn’t just compatible with AI – it’s essential to keeping it honest.
This is especially urgent because journalism, as we’ve known it, is already losing ground. The digital transformation we were once so excited about has turned hostile. The algorithm doesn’t reward truth; it rewards engagement. User-generated content, some of it misleading, much of it unchecked, is drowning out the professional work of newsrooms. Study after study shows that journalism is rapidly becoming irrelevant on the platforms that now dominate public discourse.
And yet, AI could be our way back in. Not the problem, but the potential fix, if we play it right.
Importantly, when I say AI, I’m not talking about off-the-shelf tools like ChatGPT or some plug-and-play chatbot that churns out headlines. I mean newsroom-owned, newsroom-shaped AI. Custom, context-aware, indigenous systems that understand our language, our beats, our editorial values. Enterprises that treat journalism not as a side application but as their core mission. If newsrooms in Pakistan want to survive this next wave, they need to invest in building their own AI capabilities, designed to serve journalism, not just mimic it.
Because the promise isn’t just theoretical. It’s real, and it’s massive. AI can revolutionise knowledge management, reviving decades of archives that are currently buried in PDFs or forgotten hard drives, and making them searchable, sortable and story-ready. It can help connect the dots between past coverage and present developments, giving reporters a bird’s-eye view of the bigger picture.
In content design, AI can speed up visual creation, automate multilingual formatting, and adapt stories to different platforms and audiences. Syndication doesn’t have to be one-size-fits-all anymore; it can be tailored, localised and responsive. Strategy and editorial planning can be driven by data, not instinct. And for newsrooms bleeding audience attention in a never-ending scroll, AI can optimise content delivery to reach the right people at the right time with the right tone.
This transformation, however, will work only if it’s not about replacing journalists but unburdening them, giving them an edge to survive and compete in a world of algorithms. Taking away the grunt work so they can focus on the real work: reporting, investigating and making sense of chaos. The window for doing this the right way is small. But if we get it right, the reward isn’t just survival; it is relevance, power and narrative control. Because in the age of AI, if we’re not part of the dataset, we won’t be part of the discourse. And if journalism doesn’t shape the future of information, someone else will – and they won’t ask for our input.
This is precisely why I’m particularly optimistic about Sahafat.AI, a new initiative by Media Matters for Democracy designed to help Pakistani newsrooms integrate AI meaningfully into their workflows. Rather than relying on generic tools, Sahafat.AI focuses on developing custom, newsroom-specific AI systems built in collaboration with journalists to support information management, editorial planning, multilingual publishing, research automation and content workflows. It reflects a shift from passive adoption to purposeful innovation, grounded in the realities of Pakistani journalism.
I don’t want to end on a note of fear-mongering – but the danger ahead is undeniable. The India–Pakistan example I shared earlier is not the point of the story but simply a symptom of a broader structural reality in the age of large language models. These systems are not malicious. They are indifferent. They don’t distinguish between truth and noise, nuance and narrative dominance. They learn from what’s available, and right now, Pakistan isn’t producing nearly enough credible, AI-readable journalism to shape how we are represented in these systems. The consequence isn’t just occasional factual errors. It could also be systemic erasure.
In a world increasingly shaped by algorithmic synthesis, the cost of inaction will not be theoretical. It will be reputational, political and historical. If we fail to populate the digital ecosystem with our own verified, contextualised and consistently published journalism, then we are handing over the keys to our narrative, our politics, our history, our identity, to whoever shows up more in the dataset. The threat isn’t that AI will lie about us. The threat is that it will forget we were here at all.
On the bright side, we still have the chance to shape these systems from within, to build our own, and to ensure that the voices of Pakistani journalism remain part of the global digital record. But that window is closing fast.
The writer is the founder and executive director of Media Matters for Democracy. He tweets/posts @asadbeyg