My Thoughts on AI in Writing
Artificial intelligence used to be the stuff of science fiction. Whenever we heard those words, or even came across any mention of “AI,” we imagined a utopian world where sentient robots carried out tedious or even dangerous tasks while humans were able to pursue more meaningful paths.
However, movies, books, and video games blasted us with stories of the dangers of AI. Some people worried that computers gaining sentience would lead to our downfall, whether via a supercomputer launching every nuclear missile across the world or a robotic army enslaving the human race.
Well, it’s now 2025 and fortunately nothing of the sort has happened, despite just how much artificial intelligence has advanced in recent years. That being said, there are concerns regarding how “intelligent” artificial intelligence has become, especially in the realm of art.
Over the past few years, many AI-powered large language models (LLMs) have emerged, from OpenAI’s ChatGPT to Google’s Gemini. These programs have made their way into all sorts of products, from summarizing social media comment sections to providing condensed search results on popular search engines.
Plenty of other companies have integrated AI and LLMs into their business models. AI can be used to develop full sales strategies, generate pages of written copy, analyze marketing data, and more. It can even create images and entire videos, complete with sound! From a certain point of view, it’s incredible to see how far we’ve come technologically. But many people, myself included, are wary and apprehensive about what this means for the future.
There is a larger conversation to be had about AI replacing a sizable portion of the human workforce. Certain programs can drive taxis, sort and deliver packages, and even analyze medical X-rays. But since I am a “creator,” I want to focus on AI in the art world – specifically, AI in writing.
Admittedly, I use AI (ChatGPT specifically) for certain tasks. I’ve used it to help explain a concept to me that I had trouble understanding, I’m using it to develop a social media marketing plan for an upcoming book launch, and I’ve used it to assist with managing a budget. There’s one really cool feature in one of my school textbooks where AI can take a chunk of text and generate a quick little refresher quiz to assess how well I know the material. In these aspects, I think AI is a fantastic tool.
But there are problems when it comes to writing. I once had AI edit and critique a short story I wrote. It gave some positive feedback and said there were no edits to be made. This may be obvious, but it felt very… disingenuous. Many LLMs default to giving “positive” responses unless the user specifically configures them to be more “fair” or “unbiased.” At the end of the day, the user controls how the AI reacts, which, to me, means it’s not really artificial intelligence (again, I’m specifically referring to commercially available AI programs like ChatGPT or Gemini).
I sent the short story to several beta readers who also enjoyed it, and I found myself more welcoming of their feedback because it felt genuine and “real.” From then on, I refused to use AI in my writing endeavors – it strictly remained a tool to help me out with other tedious tasks.
As I’ve discovered, not every writer shares my perspective. I’m part of several writer groups on Facebook and have seen many authors state that they’re using AI to churn out their books from now on. They feed samples of their writing into an LLM and can then produce stories in minutes, like a cheaper, faster ghostwriter. Some take it a step further and have AI generate cover images for their books. I left one group I had been part of for almost a decade because the founders/admins actively encouraged using AI to rapid-release books.
I hate to sound like a pretentious cynic (like people who talk about how the Fast and Furious movies are “ruining cinema”), but I have to say it: using AI in this manner is destroying art in so many ways.
First, AI imagery and content are not original. These programs use existing content from the internet to produce their material. This is tantamount to plagiarism in my eyes. Some LLMs have content policies in place to minimize IP violations, but there are easy workarounds if someone were committed enough. There was recently a scandal about an indie author who had an LLM rewrite a section of their book in the style of another more prolific author (I am not going to link it because I do not want to draw attention to their books).
Next, and this may sound a little cheesy, AI content is soulless. I know I am not alone in this, but it’s incredibly easy to spot generated written content. If it’s meant to be an informative article or blog post, then I don’t mind the AI as long as the information is correct (I promise I am not using AI for this post). However, if I’m reading a book or an editorial column and I get a whiff of AI, then I stop right there. Don’t get me started on how obvious AI photos and videos are (though they are still an impressive display of advanced technology, so I have to give some credit).
Another point regards rapid releases. Personally, I can’t do rapid releases of my books. I just don’t have that kind of drive. Other authors are able to churn out high-quality books by the boatload. Meanwhile, I’ve been working on the third chapter in The Storyworld Saga for close to a decade.
But using AI to do rapid releases doesn’t sit well with me. AI content is drivel, and flooding the market with said drivel only hurts authors. Self-publishing already gets a bad reputation because it allows people to push their work out into the world with ease. This is great for people who struggled to get an agent (like myself), but it also means plenty of poorly written, poorly edited books reach the market, frustrating readers and making them hesitant to choose indie authors in the future. In my opinion, using AI to fully write and edit books will only contribute to that growing problem.
Finally, and this may sound a little gatekeep-y, but it’s based on personal experiences: AI is making people respect creators less.
I did a book event a year or two ago, and someone walked by and said something along the lines of, “Why would I buy a book when I can just use AI to make one?” I tried to argue that it wouldn’t be as special, and he kinda chuckled and walked away (thankfully, because I did not feel like dealing with that). I also saw an Instagram reel (I can’t remember what it was about), but when I read the comments, plenty of parents were talking about how they no longer buy children’s books because they can simply ask ChatGPT to write a story for them in seconds.
Writing a story, drawing a picture, playing a song – any type of art requires two things: talent and passion. AI and LLMs lack both of those factors, and they’ll never develop them, no matter how advanced they become (I don’t care what Steven Spielberg tells us).
All these programs do is produce content, and art is not content. It hurts to see people treat it as such. We are not just “sitting around, doodling” or “writing our little fairy tales.” We are working hard, sometimes spending hours late into the night, to create something that we hope will make an impact on the world. Sure, it’d be nice to make a living from it, but 99.999999% of creators are creating art not just because they want money, but because they want to express an idea to the world and evoke a genuine feeling from their audience.
As far as I’m concerned, AI will never be able to do that.
The post My Thoughts on AI in Writing appeared first on Alessandro Reale.


