By avoiding sounding like AI, you are avoiding sounding like a Human.
You're letting AI change you — are you okay with that?
I can’t get it out of my head: agents and editors are actually telling their authors to remove metaphor and simile from their writing because it ‘sounds like AI’. This is exactly what I was worried about. Let’s dig in.
I’m an author, content strategist, and anthropologist actively researching the AI Revolution. By removing the em-dash, by trying to change the way that you speak so that you don’t sound like AI, you are, in fact, reducing how much you sound like a human.
AI was trained on human literature, and not just any literature: AI was trained on the best literature humanity has to offer.
So, for example, when AI makes use of an em-dash and everybody freaks out, we end up discouraging good grammar. I described the impending consequences in my previous post, and now here we are.
Now the writing controversy has deepened. Honestly, folks, we have an all-out crisis, and no one is talking about it.
First it was the em-dash
Then came the Shy Girl controversy
Now agents and editors are starting to tell their authors to remove metaphor and simile from their writing because it sounds too AI-ish (my TikTok hot take, here)
This isn’t a post on whether or not AI should be used. This is a post exploring why we feel shame for using AI, especially if you’re a writer.
I don’t know what happened behind the scenes of the Shy Girl situation. I don’t know if the author used AI or not. Honestly, I’m not sure if we should care.
I want to know why editing with AI, which we have been doing for years via tools like Grammarly, by the way, is suddenly shameful. This is where the high-culture revolution needs to be shaped — by people like us.
Consider this: stockbrokers have been using some form of AI and prediction models for decades. They make tons of money doing it. Do they go home and feel shame for using tech to predict the markets instead of their good ol’ grey matter?
Then why is it that writers, researchers, and business developers feel shame for using AI?
Is it because writing and the creative arts are supposed to result in some kind of pure human essence? Who decided that was the only valid distillation?
Look, I personally will not use AI in any of my professional writing. My books are my own. My essays are my own. My poetry is my own. I believe that nothing can truly replicate the human experience and I find the struggle to describe it illuminating. That is, I value the struggle as much as the result.
But I also think that it’s okay to double check that my grammar is correct. It was okay to have Microsoft Word do it. Why isn’t it okay for AI to do it?
I’m not asking you to change your behaviour or to handle AI one way or the other. What I’m asking you to do is decide what you want, and when you decide, to stand by your decision.
The troublesome part is that we feel shame over a decision that is, in some form, inevitable: engaging with AI in some aspect of our lives and work.
When we say it’s not human work, and thus isn’t worth money, what we’re really saying is that the work of engaging with AI, of developing an idea and then prompting the model well enough to generate something of value, is not real work.
But that standard only applies to some of us, because I can go online right now and find hundreds of job listings with the title “AI Prompt Writer”.
So why can I get paid as an AI prompt writer but the author of Shy Girl had her book taken away from her and her name besmirched because there were some supposed signs of AI in her writing (the sources being incredibly dubious)?
By now you’re probably unsure of what my stance is, and that’s okay, because my purpose here is not to tell you what I think or what to do.
I’m telling you to stand by your decision, whichever one you make, instead of feeling shame or imposing it on others.
We are all working to stay on the cutting edge, and we are all working to do so in a way that aligns with our core values.
Now, if you don’t know what your core values are, I invite you to join me next week in a workshop to hammer that out in a way that makes you unshakable whenever a massive industry or culture change occurs.
The rest is up to you.
Stay grounded,
Veronica

All of this! We definitely have to stop policing and outing people. If you don't like a book, don't read it. I saw a post the other day saying you can tell something is written by AI because it has illogical errors in the narrative that don't make sense. Well, I was just re-reading my work in progress, which came from my human brain, and there is a character in chapter 13 engaging with people who very clearly went somewhere else entirely in chapter 12.
There's a big difference between asking a model to output an edited version of the writing and using AI or other tools like Grammarly to suggest edits. The former will always imprint statistical biases and signature patterns that are almost impossible to remove under standard LLM sampling (I'm working on alternative samplers that might diminish it).