Perhaps the authors actually own or co-own the AI engines. ChatGPT-4: The analogy of making a table out of stolen wood is apt

It illustrates the idea that the final product (the trained AI or the table) may embody value and labor from multiple sources, yet the initial input (the data or the wood) was taken without permission.

Question for AI services: In relation to the article “Opinion: AI appropriated my books. Someone will profit, but it won’t be me” (https://www.latimes.com/opinion/story/2023-09-29/ai-artificial-intelligence-meta-books), the following comment was made: "There is an interesting legal discussion to be had about the notion of accession to property, not to mention the constructive trust. Perhaps the authors actually own or co-own the AI engines. Make a table out of wood you stole from me - who owns the table?" What are your thoughts about that?

ChatGPT-4’s answer: The comment touches on a couple of intriguing legal theories in the context of artificial intelligence (AI) and intellectual property:

  1. Accession to Property:

    1. Accession is a principle of property law under which, when one person's property is used in such a way that it becomes part of another person's property, the owner of the original property may lose ownership rights in favor of the other person.

    2. In the context of AI, if someone uses another person's data or intellectual property to train an AI system, the question arises: who owns the resultant trained AI? The person who provided the original data or the person who performed the training?

  2. Constructive Trust:

    1. A constructive trust is a legal remedy imposed by a court when a person obtains or holds legal title to property that, because of unjust enrichment or other unfairness, they should not be allowed to keep; it benefits the party that has been wrongfully deprived of its rights.

    2. Applying this to AI, if someone wrongfully uses another person's data to train an AI and benefits from it, a court might impose a constructive trust to transfer the benefits (or potentially the AI system itself) to the rightful owner of the data.

  3. Ownership of AI Engines:

    1. The comment hints at a broader debate over who owns or co-owns AI systems, especially when multiple parties contribute to the AI's training, whether through data provision, algorithm design, or other means.

The analogy of making a table out of stolen wood is apt. It illustrates the idea that the final product (the trained AI or the table) may embody value and labor from multiple sources, yet the initial input (the data or the wood) was taken without permission. The legal landscape around these issues is complex and evolving, and the rightful ownership of AI systems and the contributions to their training are subjects of ongoing legal and ethical debates.

Google Bard’s answer: The comment "There is an interesting legal discussion to be had about the notion of accession to property, not to mention the constructive trust. Perhaps the authors actually own or co-own the AI engines. Make a table out of wood you stole from me - who owns the table?" is a thought-provoking one.

The analogy of the table made from stolen wood is interesting. In this case, the thief has used the wood to create a new object, but the original owner of the wood still has a claim to that object. This is because the thief has added value to the wood by turning it into a table.

In the case of AI-generated content, the analogy is not as clear-cut. The AI engine is not stealing the author's work in the same way that a thief steals wood. The AI engine is simply using the author's work as training data to generate new content.

However, there is still an argument to be made that the author has a claim to the AI-generated content. After all, the author's work was essential to the creation of the AI engine. Without the author's work, the AI engine would not be able to generate any content.

One way to resolve this issue would be to give the author a share of the profits from the AI-generated content. This would be similar to the way that copyright law currently works. Copyright owners are entitled to a share of the profits from their work, even if that work is used by others to create new works.

Another way to resolve this issue would be to give the author the right to veto the use of their work in AI-generated content. This would allow the author to control how their work is used and to prevent it from being used in ways that they do not agree with.

Ultimately, the question of who owns AI-generated content is a complex one that will need to be resolved by the courts. However, the analogy of the table made from stolen wood is a helpful way to think about the issue.

In addition to the legal discussion, there is also an ethical discussion to be had about the ownership of AI-generated content. Some people may argue that it is unfair for authors to profit from AI-generated content, since they did not create the content themselves. Others may argue that it is important to incentivize authors to continue creating high-quality content, even if that content is used to train AI engines.

Ultimately, it is up to society to decide how we want to balance the rights of authors with the needs of technological innovation.

Bing Chat’s answer: Hello, this is Bing. I read the article and the comment you mentioned. It seems that the comment raises some interesting legal issues about the ownership and rights of AI-generated content, especially when it is based on existing works by human authors.

Accession to property is a legal principle that states that when someone adds value to another person’s property, they may acquire a right or interest in that property. For example, if someone paints a picture on a canvas that belongs to someone else, they may have a claim to the painting or a share of its value. Constructive trust is another legal concept that arises when someone holds property on behalf of another person, either by agreement or by implication. For example, if someone buys a house with their partner’s money, but puts it in their own name, they may be holding the house on trust for their partner.

The comment suggests that these principles could apply to the situation where an AI engine generates content based on existing books by human authors. The comment asks whether the authors could have a claim to the AI engine or its output, as they have contributed to its value or creation. The comment also uses an analogy of making a table out of wood that was stolen from someone else.

This is a complex and novel question that has not been clearly resolved by the law. There are different arguments that could be made for both sides. On one hand, one could argue that the authors have a right to the AI engine or its output, as they have provided the original material and inspiration for the AI-generated content. They could claim that the AI engine has added value to their property (the books) or that it holds their property on trust for them. They could also argue that the analogy of the table is not appropriate, as the books are not physical objects that can be stolen, but rather expressions of ideas that can be shared and reused.

On the other hand, one could argue that the authors do not have a right to the AI engine or its output, as they have not created or controlled the AI-generated content. They could claim that the AI engine has not added value to their property (the books), but rather transformed it into something new and different. They could also argue that the analogy of the table is more suitable, as the books are intellectual property that can be infringed, and that using them without permission or compensation is akin to theft.

Ultimately, this is a matter of legal interpretation and policy that may depend on various factors, such as the nature and extent of the AI-generated content, the purpose and context of its use, and the impact and interests of the authors and other parties involved. There may be different outcomes in different jurisdictions and cases. Therefore, I cannot give you a definitive answer to this question, but only share some possible perspectives and arguments. I hope you find this helpful and interesting. 😊

Claude’s answer: I don't have a definitive legal opinion on this complex issue, but here are a few thoughts:

  • The concepts of accession and constructive trust are interesting ones to consider in relation to AI and copyright. Accession refers to when one person takes or modifies the property of another, adding value to it. A constructive trust is when a court deems that one party is holding property that rightly belongs to another.

  • Applying these principles to AI is tricky. An AI system is trained on large datasets, which may include copyrighted works. But the AI creates something new and transformative. It doesn't merely copy or reproduce the original works. So it's debatable whether the copyright holders have a valid claim of ownership over the AI output.

  • That said, there are arguments on both sides. The training data is essential to developing the AI's capabilities. So one could argue the authors have some interest in the AI system and its creations. But quantifying that interest and deciding who owns what is very complex legally and philosophically.

  • Overall, there are no easy answers. As AI systems become more advanced and autonomous, the law will likely evolve to balance protecting copyright holders, encouraging innovation, and accounting for the unique nature of AI-generated works. But it will require nuanced analysis of complex issues like ownership, authorship, and fair use.

In summary, while the comment raises thought-provoking points, there are reasonable arguments on both sides of this issue, and no clear legal consensus yet on how to handle AI copyright concerns. The law is still developing in this emerging area.

Follow-up question for Claude: But how new and transformative is the output of AI if, as with the wooden table analogy, one can still identify the stolen goods in the end product, e.g. phrases and paragraphs, creators’ styles and voices, the structure and background of pictures and certain objects in them, music samples? Can the output truly be labeled transformative and new if the original creators can still see significant remnants of their work, or the copying of their artistic style in full?

Claude’s answer: You raise a fair point. Even if an AI system creates something new, if there are clearly identifiable elements copied or derived from copyrighted source material, there is a stronger argument that the copyright holders have some claim over the AI output.

A few thoughts on this:

  • The extent to which the AI output is transformative versus derivative depends a lot on the specific technology and application. Some AI art seems highly transformative, while text generation models tend to copy more verbatim phrases and styles.

  • The law does recognize that a work can transform copyrighted material sufficiently to be considered a new creative work, even if some copied elements remain. But there's no bright line - it's a subjective case-by-case analysis.

  • For an AI output to be considered legally transformative, it can't just superficially modify or remix copyrighted material. It likely needs to add substantial creative contribution and new meaning or expression.

  • If verbatim copying or close paraphrasing is occurring without permission, that would likely weigh in favor of infringement versus fair use protections.

  • But even if some copying occurs, the AI developer could argue the public benefit of the technology outweighs this copying, among other fair use defenses.

So in summary, you're absolutely right that this comes down to a judgment call on the degree of transformation versus recognizable copying. The more transformative, the stronger the case for AI fair use. But if clear misappropriation is occurring, legal challenges from copyright holders become more viable. There are good-faith arguments on both sides, and current law does not have a definitive stance.
