Is AI Writing A DEADLY Drug?!

Reading Time: 5 minutes

The use of AI when writing can not only be addictive…it can be deadly.  Asking a chatbot to brainstorm ideas for you when you hit a roadblock can make things easier, but is that good?

No.

The beauty of creative writing is making it original and true to you.  When you turn to AI to come up with ideas for you, you slowly but surely become addicted.  This reliance kills your ability to be an independent writer producing original, creative work.  Let’s talk.

Here’s A Quick Look at What We’ll Uncover:

  • How reliance on AI can worsen your ability to write independently.
  • How what you see when you look at a strawberry is amazing.
  • How AI becomes a dangerous and unnecessary crutch.

How Can AI Be A DEADLY Drug?

Well, how about we first look at how the Australian Government’s Department of Health, Disability and Ageing defines a drug (I thought that would be an interesting source).

“Drugs are substances that change a person’s mental or physical state.  They can affect the way your brain works, how you feel and behave, your understanding and your senses.  This makes them unpredictable and dangerous, especially for young people.”

How about we dissect this definition?

Drugs can change your mental state.

That change in your mental state is made worse once you become reliant.  So let’s look at what happens when you become reliant on AI for your writing.

I’ve never personally used a chatbot, but I’ve discussed them with friends and colleagues who use them.  It’s relatively straightforward: you type something into the chatbot like “I need a synopsis for a show titled Across State Lines.”  That’s the title of a TV pilot I wrote.

I don’t know what the chatbot would spit out, but I can promise you that even with 100 attempts, it wouldn’t get the plot right.  Why?  Because the titles of shows and films are typically creative in and of themselves.

You wouldn’t re-title The Avengers to Six Superheroes Assemble to Take On an Alien and His Army in NYC because that’s a horrendous title.  Titles aren’t meant to be descriptive; they’re meant to be distinct.  A unique title is what engages an audience: it should represent the story accurately, but not explain it.

When you turn to AI to come up with a synopsis for your story, or its title, it will undoubtedly land you in boring waters.  If you come up with a synopsis for a story you wrote, it’ll be unique by every measure.  When you turn to AI, it’ll change your story and hand you a version of the same synopsis it would give everyone else.

Your mental state will go from functioning with creativity to functioning like a cliché.  No one wants to be a cliché, so why would you want your writing to be one?

Drugs can affect your ability to understand.

Being able to understand things, to grasp and recognize them, is one of the most beautiful things you can do.  I’m sure that sounds a little theatrical, but it’s true!

You look at a strawberry and what do you see?

You understand it’s edible.

You grasp that it’s a fruit.

You recognize it’s the color red.

AI can do the same thing.  But do you need to consult AI every time you see a strawberry to find out whether it’s a red, edible fruit?  Unless you’re colorblind like my friend Jackson, who thinks the Wicked Witch of the West is grey (she’s green), you should see red.

Don’t turn to AI like a drug to function.  The moment you do, you risk becoming reliant and losing the ability to understand independently.

Drugs can be dangerous.

AI isn’t going anywhere, and that’s okay.  Like a drug, it has a purpose, and it can be safe when prescribed and used properly.  But also like a drug, it can be dangerous when abused.

Chatbots can answer almost 80% of routine questions.  But the key issue isn’t whether they can; it’s that they’re being used for routine questions at all.

So what makes it dangerous?  Its ability to become an unnecessary crutch.

Turning to AI to define a word is perfectly safe. But turning to AI every single time to define the exact same word…is not safe.

Why?  Because you’ve registered that you can turn to a chatbot to define the word instead of registering the definition in your own memory.  You need to retain, not rely.

So how does that make AI a drug?  Well, it’s simple: you can replace the word drug with AI in each of the italicized sentences.

AI can change your mental state.

AI can affect your ability to understand.

AI can be dangerous.

It may be my opinion.  It’s subjective, sure, but look at the reasoning; at the risk of sounding pompous, I’d argue it’s solid.

Let’s face it…

AI can be a dangerous drug when applied to writing.  It has the ability to alter the way in which you understand what you see and read.  Even more dangerous is its ability to remodel your mental state and kill your artistic ability.

AI is DEADLY to your creativity.

And remember, what you see when you look at a strawberry is awesome.

Want to learn more about my mission?

Check out the website!

SOURCES:
* Canva was used to create the images used in this blog *
Australian Government’s Department of Health, Disability and Ageing definition of a drug

contact a real writer

We’ve entered a brand yet bland new world with AI writing, but if you’re interested in original and creative writing by a real person, fill out the contact form today! I’ll be in touch within 48 hours to schedule your free consultation!

No artificial "intelligence" was used in the writing of this site.

© 2026 Nick Stat Writing. All rights reserved.