Writing with AI

Well, it applies to artists whose works are still under copyright, so both living artists and the estates of recently deceased artists.

I think something like that idea, analogous to the licensing for music samples, is probably inevitable in other fields as well.

How will they know that the works are AI-generated? And what constitutes being AI-generated? If you used AI to edit your book, would that count as AI-generated?

They can charge whatever they want, but they can’t stop someone else from redistributing it for free. Effectively the piracy problem that plagues copyrightable works, but without the various legal tools used to limit it.

That’s not what I had in mind. Those images are “published” (made public) under specific terms, so potential customers can find them, preview them, or create mockups with them before buying. In this form (as a preview), they are specifically not intended to be used in a commercial product themselves.

So I could just wait for others to “write” books and then say they’re mine.
That’s even better than GPT!
(Just joking)

Obviously that’s a likely debate for future litigation and/or legislation. Just how much human contribution is needed to make something copyrightable?

This suggests that a lot of people are not going to admit using AI. And it would be hard for anyone to tell if they did. I am specifically talking about text, not images. If someone who can’t draw suddenly published art, that would be a no-brainer.

Admin note: I’ve moved the entire thread to the And Now For That Latte forum. It doesn’t have much of anything to do with Scrivener.

I don’t think that’s accurate. As I mentioned above, works that are not protected by copyright are in the public domain. There are many companies making a lot of money selling IP in the public domain, not the least of which are Bible publishers.

Also, many book publishers take PD works and transform them by adding forewords, annotations and analyses to create a new edition of the PD work that is eligible for copyright. The same could be done to AI generated works. Just running an AI book through your word processor and reworking it makes it eligible for copyright AFAIK.

I think that many authors will prompt an AI bot to write their book and they will simply claim it as their own. Since there is no infringement and no injured party, the odds of being punished for claiming Public Domain material as your own seem quite low.

The legality of it all will come down to court rulings about the threshold of transformation. How much is required to activate copyright eligibility? Until we have bright-line rulings, it’s the Wild West out here.

Ever wonder why new translations of the Bible are coming out all of the time? Some of it’s scholarship, sure, but a new translation is a new copyrightable work.

The potentially injured party(ies) would be the author(s) of the works used in the training set. The definition of “derivative work” is about to get a judicial beating.

Goofing around

Truly impressive.

Something tells me that the free Bing version is not quite what I should be stressed about…

That conclusion is a stretch. If you told AI to create a work in the style of Stephen King and it produced a reasonable facsimile of Firestarter, that would not be public domain. It would at least be eligible for a copyright infringement action.

The proper term for what you’re describing is derivative work. In the US at least, a new edition of a derivative work is only eligible for copyright in the newly authored portions, such as a newly written foreword. As for reworking, just no. Here is what US copyright law (17 USC 103(b)) says about the above:

“The copyright in a compilation or derivative work extends only to the material contributed by the author of such work, as distinguished from the preexisting material employed in the work, and does not imply any exclusive right in the preexisting material. The copyright in such work is independent of, and does not affect or enlarge the scope, duration, ownership, or subsistence of, any copyright protection in the preexisting material.”

I think you know that’s a poor example. In that case, you’d be deliberately asking the AI to plagiarize.

If, on the other hand, you asked the AI to write a novel like Firestarter, written in the style of Stephen King, but give the child the psychic power to bring frost like the girl in Disney’s Frozen, you’d end up with a book that felt like an unoriginal crib of King, but it would not be plagiarism. It would also be unprotectable by copyright unless a person transformed it substantively.

Yes, as I mentioned above, the process of taking a PD book and augmenting it in order to make it protectable is by definition creating a derivative work. When you do that, the PD book with your augmentations becomes protectable, and therefore a stronger monetization asset. The new book you made can’t be copied and sold without your permission. The underlying PD work can be.

Example: PRIDE AND PREJUDICE AND ZOMBIES is an augmentation of PRIDE AND PREJUDICE. It is a derivative work, but since the underlying work is in the public domain, there is no rightsholder to license or restrict your use of the work. The copyright of PRIDE AND PREJUDICE AND ZOMBIES prevents anyone from making and selling copies of the book, or making a zombie movie out of it without the new author’s permission. It doesn’t prevent anyone from publishing PRIDE AND PREJUDICE or making a movie from it.

As I said before, I think some writers will have the AI write entire books for them and will claim them as their own. Since there is no injured party, there is no tort. There might be an argument for a claim of fraud, but we might have to rethink that as “authorship” is redefined.

An AI in these examples requires human input to write a book. Someone has to type “write a horror novel in the style of Stephen King about a little girl who has the psychic power to freeze things because of a secret government experiment conducted on her parents while they were in college.” The argument will be made that the human input qualifies as authorship.

For now, it’s a grey area that will require interpretation by the courts.

The Writers Guild of America made a statement that AI-generated text was like text generated in Final Draft. The AI is just a machine to facilitate the work of the author. I think when the dust settles, this will be the prevailing opinion, with some kind of financial compromise to soften the fiscal impact of everyone being able to write a book in a day.

As I mentioned above, I think the cause for action on behalf of the authors will be against the AI companies on the basis of exploitation without license, not the AI users creating new works.

My argument was that using an AI to create a book probably doesn’t incur a lot of legal liability, because there’s no injured party. If there is an injury, it was caused in the training stage, not in the usage stage.

A little disagreement with @Orpheus - I don’t think talent is king; I think that hard work allied to talent is royal.
I teach people to write stories - a small class on Zoom - and one of the things I teach is show-don’t-tell, because people starting to write tend to be narrative-heavy and point-of-view-light; the ability to have a version of a story automatically translated in this way is useful. Whether you keep the machine’s revisions is another matter; but having the machine revise the story like this can show you a new way of looking at it; you can print it off and then rewrite it again in a different way. You may return to the third-person-objective distant narrative, but you’ll do it informed by that different view that the machine tool has given you.

As usual in such cases, the injured party is likely to go after everyone they can think of. They might not win, of course. Certainly if I were a publisher I would stay very far away from such works.

this is an end. I do think so. I think this is the blue room alluring between what humans do and what humans do together. This tool adds

The above is unedited text-to-speech output. This kind of thing and the bots are worse than we are, for now, but the day will come soon when the bots are better, or close enough that the difference is not noticeable.

It will end the writing business as it is. It will devalue writing.

Devaluing writing cascades across society.

I have been listening to people talk all day about whether AI is sentient, and it seems it will be soon.

A skilled AI author will crash things for us. There isn’t enough care to do things differently.

People who think AI is near-sentient tend to have a very simplistic understanding of what intelligence is. You won’t hear a lot of neuroscientists saying that.

As for AI “devaluing writing,” as I said upthread, it’s not like writing is a scarce good now.

Calling statistical pattern recognition machines “artificial intelligence” was a mistake. Perhaps it was done to get attention, because in truth such networks have no trace of intelligence, or even a consciousness-like state, not even close. But they are excellent machines for statistically processing large amounts of data and relating the pieces to one another.
It is often said that the routines of an “AI” are modeled on the human brain, which is the second fallacy, because even the functioning of the brain is understood only in small areas. We still do not know how intelligence works in the brain, so any attempt to replicate it with an AI is, for now, still doomed to failure.

This. Machine intelligence, when and if it comes, may not look like human intelligence. But I’m pretty sure it won’t look like “needs to analyze a large fraction of the internet to write a coherent paragraph,” either.
