Thursday, August 7, 2025

Chatter

So here's the big thing that is wrong with using AI to write your book.

The book is the writing.

This works for visual imagination, too. Hell, in that case we can go right down to models of the human visual system. Know about the blind spot? It's there, but you can't see it. That's because vision is an illusion. You aren't seeing this 3D world in high detail, straight lines and everything. That's constructed for you in your mind.

Or, rather, the illusion of it is. You think you see the world in high detail because whenever anything catches your attention, your eyes flick to it in that ceaseless motion they are always making. Your mind maintains a sense of the rest of the visual field and gives you the emotional impression that it is all there, even though everything outside the center of your vision and your moment's attention is actually in much lower detail.

It is like a dream. When you experience it, you think it is all there in detail. You also think the story makes sense and that's what I am winding back to.

Because anyone who actually practices a craft realizes that while the basic shapes, and that impression of it all making sense, are there at the outermost, most zoomed-out level of perceiving it, the experience of the book or movie or artwork is the encountering of details that agree with and support that impression.

And these details aren't in the writer's head. They aren't in the artist's visual imagination, no matter how good. Because the human brain isn't big enough to hold it all.


That artist above had a concept of the character that informed every stage. She didn't have to draw the shoes before she knew what kind of shoes that character would wear. But at the same time, she didn't know how those socks folded or how many laces or any of that, because those details didn't matter.

In many cases they unfold from the underlying conception in a logical way, or can be reconstructed from basic principles. The artist doesn't need to invent the concept of "shoe" just to finish a drawing. She can also start that drawing knowing that shoes exist, that she has drawn shoes before, and that she can look up a reference if all that fails. It is, to borrow the math joke, a problem for which "a solution exists."

But the specifics of that shoe support the original idea of the character, and the execution is unique to that artist in many ways, and the combination is what makes this her drawing.

At the very best, if you ask ChatGPT to write your novel for you, it is working only from that first gestural drawing. None of the input the artist adds along the way is there.

And that's the best case. The AI operates not with deductive logic but statistically; it will add the kind of shoes that are more likely to be added in similar circumstances. This is a place where an artist could say, "ah, but he might have penny loafers with tassels, and that could add a little flair that isn't otherwise visible in his dress." The AI can't make those kinds of decisions.

It can give the illusion of making them, because it will make some decisions and some of those will be low probability. But even outside the "death of the artist" argument, since there's no connectivity here, the details won't support each other.

More artist talk. See the gestural drawing that comes first in the series, and how it captures how the character is standing? Now look closer. The drape of the clothing follows how that clothing would have to move when a person assumes that position. The line-work points toward and subtly accents the underlying line of action.

Look at an AI image and the line is broken. Because there never was a line; any of the parts that remain are borrowed chunks from similar poses and similar choices made by similar artists that may or may not resemble each other in this specific aspect.

Every single line of dialogue in a novel is doing something. Every choice of a word in a description is doing something. It isn't a "gaunt" stony outcrop because that's a synonym for bare; it is because the writer wanted you to be thinking of sunken cheeks. Of hunger, perhaps, thus helping to establish that this is a place bare and inhospitable. Or it is "gaunt" because they'd used "barren" in the previous sentence and those two sound too much alike. Or it is "gaunt" because three paragraphs down there's going to be a little joke with it.

Again, the AI can do this. Not through intention, but because it is in the training data, and the patterns are familiar, and some other writer once made a similar choice even if for different reasons. So it can come up, and it can convince, create that illusion of mind, the way a dream can appear to have a rational plot at the moment you are dreaming it.

But none of this is the choice made by the person who asked AI to write their book.

No. You didn't find the cheat code to make art. You didn't find a way to skip the boring part -- because the part of it that is your book isn't there in the idea, in the outline, in the prompt.

It didn't exist. It never existed. You've got an illusion of this wonderful book that just needs someone to put the words down for you. No. You don't. That is the blind spot speaking, the dream speaking so compellingly. The book in your mind doesn't exist yet. And it will never exist.

Unless you write it.
