Did You Write This?
Authorship, AI, and the Ethics of Knowing

Rafiq al-Bunduqia watches — not to catch you out, but to make sure you’re there.
1. Introduction: The Em Dash and the Machines
There was a time when the biggest red flag in student writing was a mysterious shift in font — or the sudden appearance of a semicolon. Now? It’s the em dash. That long, theatrical punctuation mark — so beloved by stylists, poets, and late-night philosophers — is now being flagged as a possible sign of AI authorship.
As someone who has long preferred the em dash over the “weakling” en dash, I can only laugh. But beneath the punctuation paranoia lies a deeper, more pressing question — one that has followed me for some time:
What does it mean to write — and to take responsibility — in the age of intelligent machines?
Two days ago, I revisited this very theme in a public reflection I posted to LinkedIn under the voice of Rafiq al-Bunduqia:
“Don’t hand over your compass.”
AI can help you draft faster, structure better, even summarise long texts.
But it can’t replace your intention, your ethics, or your lived knowledge.
Use it — yes. But stay in charge.
Check. Review. Bring your full self.
Because tools may change — but responsibility doesn’t.
The question isn’t simply whether AI was used — the deeper issue is:
Did you show up? Is this yours?
That very concern was echoed in April 2025 by writer and academic Shafinaaz Hassim, who put a sharp and timely question to her peers:
“Friends in the academy, publishing industry, etc — with rising concerns on content creation, what are the ways to ascertain whether students or writers have relied on AI…?”
Her question invited a rich and thoughtful set of responses — not moral panic, but reflection on pedagogy, ethics, and the changing terrain of writing itself.
This blog is part of that unfolding conversation. Not to police authorship, but to deepen it. To move the focus from how something was written to who stands behind it — and why that still matters.
2. A Shift in Focus: From Origin to Ownership
In a recent LinkedIn post, journalist and academic Reg Rumney described the quiet dilemma he faced while marking student work:
How do you tell whether a piece of writing has been shaped — or even generated — by AI?
Is it the rhythm? The over-polished syntax? The sudden arrival of the elegant em dash?
But then, he had a moment of clarity. The real question isn’t whether AI was used — it’s whether the writing stands on its own.
Is it well-sourced? Are the claims grounded? Does it reveal engagement with the material and the reader?
He writes:
“I don’t need to know whether A.I. was used in the production of any writing. What I need to determine is whether the writing is properly sourced and justifies the claims it makes… If any writers have used A.I. along the way, so be it. They own the output.”
This is a quiet revolution in how we might think about authorship. It places the focus not on origin — but on ownership. Not how the writing was produced, but whether the writer is present, attentive, and accountable.
We’ve seen this sleight of hand before. Long before ChatGPT, many writers (and yes, some journalists) would open with “Economists say…” or “Experts agree…” — without ever citing who those economists or experts were. The problem here wasn’t AI. It was vagueness disguised as authority.
Rumney points out that even AI tools that cite sources — like Perplexity — often do so too broadly: a whole IMF report is not a source. A page number, a quotation, a date — that’s sourcing. That’s what makes information traceable, and ideas accountable.
And so the challenge before us is this: don’t outsource your integrity.
If you quote someone — say who. If you argue a point — ground it. If you make a claim — own it. Whether you used an AI model, a spellchecker, or the notes from your last lecture is beside the point. The question is:
Do you understand what you’ve written — and can you stand by it?
That’s what distinguishes a writer from a copyist. And that, in the end, is the difference between literacy and knowledge.
🫖 Rafiq al-Bunduqia says:
“It’s not where the words came from — it’s whether you meant them.
AI can dress the table, sure. But did you cook the food? Did you taste it?
Don’t serve your guests borrowed meals and call it hospitality.”
3. Classroom Practices: Witnessing the Process
While much of the conversation around AI writing has turned toward detection tools — apps, software, “AI sniffers” — some of the most grounded wisdom comes from teachers who choose a different path: staying close to the writing process itself.
Siphiliselwe Siphili Makhanya, a teacher and writer, offered a refreshingly practical and humane approach in response to Shafinaaz Hassim’s question. Her method? Make the process visible. Watch the work unfold.
“I try to make the kids start their writing process in the classroom so I can see a first draft taking shape. I also ask them to add me as an editor to their Google Doc — you can view the different versions with timestamps and figure out if they actually sat there and typed or if they copied and pasted.”
This isn’t about surveillance. It’s about presence. A teacher witnessing the formation of thought — not just the final result. That alone often reveals more than any detection algorithm ever could.
But Siphiliselwe also adds a caution that speaks volumes:
“Be careful though — people with autism can sound like AI in their writing patterns.”
That one line is worth pausing over. In our rush to detect machine-generated language, we may end up punishing neurodivergent writers for simply having a different rhythm, structure, or affect. The tools that claim to sniff out artificial intelligence are often not trained to recognise human variation.
Instead of rejecting AI, Siphiliselwe teaches students how to use it responsibly. In her classroom, AI isn’t the writer — it’s a tool for critique, for grammar checks, for scaffolding clarity. She adds:
“We’re also discussing how it actually hurts you to become overly reliant on AI… you never gain the skills, and what skills you do have will atrophy without use.”
Emma Arogundade shares a complementary strategy. Rather than banning AI, she integrates it into reflective pedagogy. In one exercise, she gives students a short ChatGPT-generated answer to a standard exam question — and then asks them to critique it using class readings and personal insight.
“Then I ask them to reflect on what they’ve learned,” she writes.
What’s happening here isn’t about “catching” students. It’s about creating a relationship with their own thinking. It’s about making the invisible choices of writing — source, structure, argument — visible again.
These educators are not outsourcing authorship to software. They are recovering authorship through presence. Slow, steady, relational presence. And that may be the best safeguard of all.
4. The Deeper Wound: When We Outsource Thought
There’s something seductive about smooth writing. It reads well. It flows. It sounds finished. But fluency is not the same as understanding. And what worries many of us — far more than whether students used AI — is the deeper erosion we begin to see:
the outsourcing of thought itself.
When students (or journalists, or policy writers) start depending on machines to explain things they don’t understand, to connect dots they never truly explored, or to mask gaps in knowledge with polished generalities, we’re not just talking about AI anymore.
We’re talking about the quiet death of inquiry.
I’ve read pieces that were grammatically flawless — structurally sound, even “academic” in tone — but hollow inside. There’s no felt tension, no intellectual wrestling, no friction of mind against material. It’s like looking at a beautifully decorated cake that collapses when touched. No weight. No substance.
And this isn’t new. Even before AI, students were tempted by shortcuts — Wikipedia patches, essay mills, pre-written templates. But the difference now is that AI makes the shortcut feel authentic. It mirrors their tone. It fills in their arguments. It “sounds like” them — better than they sound like themselves.
That’s the real danger: not that AI steals the writer’s voice, but that it gives back a voice that feels plausible — without the effort of becoming it.
If we’re honest, most people won’t say, “It wasn’t me, it was ChatGPT.”
But many will turn in work that sounds more assured than they feel. They’ll submit sentences they don’t quite understand. They’ll quote things they haven’t really digested. They’ll adopt a style without earning the insight.
And when that happens often enough, something slowly fades:
- The courage to grapple with complexity.
- The patience to revise what’s unclear.
- The humility to not know something — yet.
That’s the deeper wound. Not just a question of authorship, but of learning. When we outsource our thinking too early, too often, we never develop the calluses of real understanding.
🫖 Rafiq al-Bunduqia says:
“The saddest thing isn’t that a machine wrote it —
it’s that the human reading it didn’t notice.
Because when thought is absent, even beauty is a costume.
And costumes don’t bleed.”
5. A Call to Ethical Writing Pedagogy
So where does this leave us — teachers, editors, writers, mentors — in the age of intelligent machines?
The temptation is to double down on control: tougher detection tools, stricter rubrics, more suspicion. But that approach misunderstands the moment. The real challenge is not technological. It’s formational. We are not merely assessing output — we are shaping people.
This is the deeper work of pedagogy: not to police writing, but to nurture writers. And that means helping them ask not just, “Did I write this?” — but “Do I stand by this?”
Here are a few modest suggestions — drawn from conversations with thoughtful educators and shaped by years of writing and mentoring:
1. Make the process visible.
Let students draft in real time. Let them annotate their own thinking. Use collaborative tools that allow for version tracking, not for punishment, but for conversation.
2. Shift the focus from fluency to friction.
Reward uncertainty. Praise revision. Ask not just for answers, but for process notes: What did you struggle with? What changed your mind?
3. Use AI transparently — and critically.
Give students a ChatGPT response and ask: What’s missing? What’s misleading? Could you write this better — not slicker, but truer?
4. Reinforce responsibility, not just originality.
Ask: Where did this idea come from? Did you cite the source? Do you understand the claim?
5. Honour the writer’s presence.
What matters is that the writer shows up. In risk. In reflection. In integrity. That’s the real goal.
🫖 Rafiq al-Bunduqia says:
“Don’t fear the tools. Fear forgetting yourself.
If the words are yours — broken, blooming, becoming —
then no one can take them from you.
You’re not being tested on perfection.
You’re being invited to show up.”
6. Closing: “It Was Me. I Wrote This.”
In the end, the question “Did you write this?” isn’t a trap. It’s a mirror. It asks the writer to pause — not in fear, but in presence. To stand beside their words and say,
“Yes. This is mine.”
That doesn’t mean the writing is flawless. Or that AI wasn’t used along the way to brainstorm, rephrase, or clarify. It doesn’t mean the grammar sparkled or the structure landed. But it means the mind and conscience behind the words are awake. It means the work is inhabited — not just delivered.
This is what writing has always asked of us: not just fluency, but presence.
Not just polish, but integrity.
Not just production, but reflection.
So maybe the final test is not whether a text was written by a human or a machine. Maybe the test is simpler — and deeper:
“Did someone care enough to mean what they wrote?”
“Can they say, with some trembling or pride — ‘It was me.’”
That’s the writer’s task. That’s the learner’s journey. That’s the invitation still open — even in the age of algorithms.
🫖 Rafiq al-Bunduqia says:
“You don’t have to sound perfect.
You just have to be there —
in the ink, in the pause, in the part you nearly deleted.
That’s where the real writing lives.
That’s how we know it was you.”
Acknowledgements
With deep gratitude to Shafinaaz Hassim, whose question opened the door to this reflection, and to Reg Rumney, whose response offered a compass point: judge the output, not the method.
Thanks also to Siphiliselwe Siphili Makhanya and Emma Arogundade for generously sharing their classroom wisdom — rooted, ethical, and alive to the human behind the words.
And to Rafiq al-Bunduqia — for watching from the edge of the screen, always asking, “Did you show up?”

