A rising movement of artists and authors is suing tech companies for training AI on their work without credit or payment

  • Lil' Bobby Tables@programming.dev · 1 year ago

    Sorry if this comes off as offensive, but this isn’t new news, and I think we all need a reality check here. I’ll also be forward: the artifice of doubt being cast on this makes me pretty angry, and it is a threat to every one of us.

    So, OK. Speaking as a thirty-year programmer, a neurobiology minor, and a hobby animator, you know what these objections sound like? They sound like somebody who just learned to program, with JavaScript, at the top of that initial peak. You know the one: when you feel like you can do anything, and present as such. You’ve never had to worry about stack overflows or memory exceptions or, quite possibly, even fundamental networking utilities, but you got a web page to do something which is, in fact, legitimately cool, and you’re proud of yourself—as you friggin’ should be. However, you don’t know how much you don’t know, and you’re far further from the top than you suspect.

    Until you learn something like C, and really walk away from it with a sore ass and a sense of perspective, you don’t know what you don’t know. This isn’t because we aren’t all rooting for you; it’s just an expected rite of passage, the point when you really begin to learn. We’re totally rooting for you; we used to be you! And this rite of passage holds for both visual art and programming, two fields that otherwise have nothing to do with each other. As far as I can see this case is open and shut, and yet we’re still mulling over the details.

    Artists don’t copy, they analyze. It’s the difference between reading all of the answers on a Stack Exchange site, considering them, discussing them, putting them in the context of your own life, and applying them in an organized and personal fashion; versus simply copying the code verbatim and jamming things in until it arguably works, which is literally what an LLM neural network is designed to do. We’ve all seen this with extremely questionable code output by GPT-3 (and, yes, GPT-4; it still happens, just not as frequently).

    The art neural networks are the same—an artist charges for that inspiration, because it was a lot of physical and emotional toil for them. It was a lot of feedback and self-critique. “Prompt engineering” is neither art nor, if we’re honest with ourselves, engineering.

    I maintain that this is basically a slightly obfuscated recap of the Napster trials from twenty years ago, which, ironically, fell back on almost the same “fair use” defense. It’s going to falter just as hard, as there’s a massive difference between showing smaller, low-resolution images on a search page, and using meaningful elements of those images to produce brand new, and competing, works, which happen to be flawed but look the same to someone who browses JPEGs on Google like a channel surfer.

    The last thing I’m going to say is that we need to stop describing LLMs as “AI”. “AI” doesn’t mean anything; it could refer to a neural network, machine learning, A* pathfinding, collective intelligence, or any number of other things, the colloquial definition being technology that “performs a function which was previously only possible for a human”. And come on, once upon a time you could describe a hammer that way. AI is not a formal industry term, it’s marketing flak. LLMs are effectively a database of connections browsed with the use of a neural network.

    You want to keep using Midjourney and Stable Diffusion? Great! Go for it. You want to use ChatGPT to help you understand some multivariate calculus? Go nuts, I do it all the time, most mathematicians are terrible at articulating. However, they should, without question, have to pay for the art that they used, or cease using it if the sale won’t be completed. Any other outcome is absolutely going to lead to an economic collapse.

    • archomrade [he/him]@midwest.social · 1 year ago

      However, they should, without question, have to pay for the art that they used, or cease using it if the sale won’t be completed. Any other outcome is absolutely going to lead to an economic collapse.

      This is the part that drives me crazy: the conclusion doesn’t follow from the premise. Just because machine learning poses an economic threat to artists/writers/etc. does not mean that how the models are trained is somehow unethical. It undoubtedly does pose a threat, but not because they’re “stealing” work; rather, because our economic system is fucked. I would challenge anyone to try applying the same copyright legal argument to the more common training sets built from Reddit, Twitter, and other online forum text data, which does not have the same copyright protections and isn’t identifiable in the ML outputs. Machine learning applications have the potential to liberate humanity from millions of hours of meaningless work; why should we be whining about it just because “they took our jobs!”?

      Just as with the Napster trials, I think our economic system and industry ought to adapt to the new technology, not cling to legal precedent to protect themselves from change. Employment should not be a prerequisite to a standard of living, full stop. If some new technology comes along and replaces the labor of a couple million people, our reaction shouldn’t be to stop the progress, but to ensure those people put out of work can still afford to live without creating more meaningless work to fill their time.

      • tabular@lemmy.world · 1 year ago

        With a universal basic income, artists would be free to choose to make art for fun instead of survival. Given enough job destruction in transport, a UBI-like solution may be mandatory, as there are simply not enough jobs for humans generally.

      • 33KK@lemmy.blahaj.zone · 1 year ago

        Most models are trained unethically, relying on weird claims that humans learn the “same way” as large models do. A human looks at a few references when drawing a specific thing (you need to know how it looks to draw it, lol); a large model more or less averages and weights billions of images scraped from the internet with no regard to their licenses.

        • archomrade [he/him]@midwest.social · 1 year ago

          I don’t think I said “humans learn the same way”, but I do think it helps to understand how ML algorithms work in comparison with existing examples of copyright infringement (i.e. photocopies, duplicated files on a hard drive, word-for-word or pixel-for-pixel duplications, etc.). ML models don’t duplicate or photocopy training data; they “weight” (or, to use your word choice, “average”) the data against a node structure. Other, more subjective copyright infringements are decided on a case-by-case basis, where an artist or entity has produced an “original” work that leans too heavily on a copyrighted work. It is clear that ML models aren’t a straightforward duplication. If you asked one to reproduce an existing image, it wouldn’t be able to recreate it exactly, because that data isn’t stored in the model, only approximate instructions on how to reproduce it. It might be able to get close, especially if that example is well represented in the data set, but the image would be fundamentally “new” in the sense that it has not been copied pixel by pixel from an original, only recreated through averaging.
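
          To make the “weights, not copies” point concrete, here is a minimal sketch; a toy curve fit, not any production model, but the same principle: a handful of parameters are tuned against many training examples and can only ever approximate them, never return them verbatim.

            import numpy as np

            rng = np.random.default_rng(1)
            x = rng.uniform(-1, 1, size=200)
            y = np.sin(3 * x) + 0.1 * rng.normal(size=200)  # 200 "training examples"

            # "Training" distills 200 examples into just 6 weights;
            # the examples themselves are not stored anywhere in the model.
            coeffs = np.polyfit(x, y, deg=5)
            reconstruction = np.polyval(coeffs, x)

            # Reconstruction is close but never exact: lossy by construction.
            print(f"mean reconstruction error: {np.abs(reconstruction - y).mean():.3f}")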

          If our concern is that AI could literally reproduce existing creative work and pass it off as original, then we should pursue legal action against those uses. But to claim that the model itself is an illegal duplication of copyrighted work is ridiculous. If our true concern is (as I think it is) that the use of ML algorithms may supplant the need for paid artists or writers, then I would suggest we rethink how we structure compensation for labor and not simply place barriers to AI deployment. Even if we were to reach some compensation agreement for the use of copyrighted material in the training of AI data, that wouldn’t prevent the elimination of artistic labor; it would only solidify AI as an elite, expensive tool owned by a handful of companies that can afford the cost. It would consolidate our economy further, not democratize it.

          In my opinion, copyright law is already just a band-aid to a broader issue of labor relations, and the issue of AI training data is just a drastic expansion of that same wound.

          • 33KK@lemmy.blahaj.zone · 1 year ago

            My concern is that billions of works are being used for training with no consent and no regard to the license, and “the model learns” is not an excuse. If someone saved some of my content for personal use, sure, I don’t mind that at all; but a huge-scale, for-profit scraping operation downloading all the content it physically can? Fuck off. I just blocked all the crawlers from ever accessing my websites (well, Google and Bing literally refuse to index my stuff properly anyway, so fuck them too; none of them even managed to read the sitemap properly, and it was definitely valid).
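
            For anyone who wants to do the same, a minimal robots.txt sketch along these lines is the usual first step. The user-agent tokens below are the publicly documented ones for OpenAI and Common Crawl; note that robots.txt is honored only voluntarily, so genuinely misbehaved crawlers still need server-side blocking on top.

              # robots.txt: ask AI-training crawlers to stay out
              User-agent: GPTBot        # OpenAI's crawler
              Disallow: /

              User-agent: CCBot         # Common Crawl, a common training source
              Disallow: /

              # Everyone else (regular search indexing) remains allowed
              User-agent: *
              Disallow: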

  • goetzit@lemmy.world · 1 year ago

    I never understand this argument. If I go to an art museum, look at all the works, and create an art piece inspired by what I saw, no problem. If I go to an art museum, look at the works, and create a computer program that can create an art piece based on what I saw, that is somehow different? Because of the single step of abstraction that was taken by making some software to do it?

    • CheeseNoodle@lemmy.world · 1 year ago

      I think the difference is in the actual implementation. AI has been caught many times outputting near-direct copies of other people’s work, or output similar enough that if a human had made it, it would be plagiarism, and that is the crux of the problem. If I read someone’s book and then write a slightly altered version and try to pass it off as my own work, that’s plagiarism; if I feed someone’s book into a language model and then have it write a slightly altered version, that’s somehow different and allowed.

      • goetzit@lemmy.world · 1 year ago

        Okay, I can understand that. But why is that being turned into “the creator of any work an AI looks at needs to be compensated” instead of holding AI companies accountable for plagiarized works?

        I totally understand fining an AI company that produces a copy of Starry Night. But if it makes a painting similar in style to Starry Night that wouldn’t normally be considered a plagiarized work if a human did it, do we still consider that an issue?

        • CheeseNoodle@lemmy.world · 1 year ago

          From an existing legal perspective (giving some Reddit-tier legal advice here), I’m pretty sure there’s nothing legally wrong with AI art so long as it’s not straight-up plagiarism. However, there is another argument that’s likely going to need settling at some point, and I’ll do my best to summarise it.

          Humans learn from other people’s work, but then eventually develop their own style and become net producers of ‘data’ (data being pictures, books, whatever we’re training the AI on). Current AI never does this: it can effectively only remix other people’s work, and thus needs to constantly scrape other people’s work in order to expand its repertoire. It is never a net producer of ‘data’. For current AI this is effectively proven by the fact that using AI output as training data can actually make the AI worse, because it compounds existing flaws and ‘AI hallucinations’.
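
          That feedback failure is easy to demonstrate in miniature. Here is a minimal sketch (a toy statistical model standing in for any generative system, with all specifics invented for illustration): each generation is fit only to samples drawn from the previous generation, and the spread of what it can produce tends to decay.

            import numpy as np

            rng = np.random.default_rng(0)
            human_data = rng.normal(loc=0.0, scale=1.0, size=1000)  # original works

            # Generation 0 is fit to human-made data...
            mu, sigma = human_data.mean(), human_data.std()

            # ...but each later generation trains only on the previous one's output.
            for gen in range(1, 31):
                synthetic = rng.normal(mu, sigma, size=50)  # a small scrape of AI output
                mu, sigma = synthetic.mean(), synthetic.std()
                if gen % 10 == 0:
                    print(f"generation {gen}: spread = {sigma:.3f}")  # tends toward 0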

          This means human artists initially rely on others but ultimately create value from their own effort; AI, on the other hand (for now), must continuously rely on the work of others in order to produce value. Or to put it even more simply: the AI industry is entirely reliant on the work of human artists, but gives them no credit or remuneration.

    • Lil' Bobby Tables@programming.dev · 1 year ago

      So you can walk into an art museum, look at a Rembrandt, a Da Vinci, and a Carlo, go home to a canvas, and jot up a fusion of the three?

      Wow. You’re good.

  • oracle33@lemmy.ml · 1 year ago

    While I recognize that AI art is quite obviously derivative (and considering that ML pattern matching requires much more input, there’s an argument that it’s even more derivative), I really struggle to grasp how humans learning to be creative aren’t doing exactly the same thing, and what makes that ok (except, of course, that we’ve already decided it’s ok).

    Maybe it’s just less obvious and auditable?

    • inspxtr@lemmy.world · 1 year ago

      I believe that with humans, the limits on our capacity to know, create, and learn, and the limited contexts in which we apply such knowledge and skills, may actually be better for creativity and relatability; knowing everything may not always be optimal, especially when it is something about subjective experience. Such limitations may also protect creators from certain copyright claims: one idea can come from many independent creators, and can be implemented in ways broadly similar or vastly different. And usually we, as humans, develop a work ethic of attributing the inspirations for our work. There are others who steal ideas without attribution as well, but that’s where laws come in to settle it.

      On the side of tech companies using creators’ work for training, AI gen tech is learning at a vastly different scale, slurping up their work without attributing them. If we’re talking about the mechanics of creativity, AI gen tech seems to have been given a huge advantage already. Plus, artists and creators learn and create their work, usually within some context, sometimes with meaning. Excluding commercial works, I’m not entirely sure the products AI gen tech creates carry such specificity. Maybe they do, with some interpretation?

      Anyway, I think the larger debate here is about compensation and attribution. How is it fair for big companies with a lot of money to take creators’ work without paying or attributing them (or doing so only minimally), while those companies then use these technologies to make more money?

      EDIT: replace AI with gen(erative) tech

      • Ath47@lemmy.world · 1 year ago

        How is it fair for big companies with a lot of money to take creators’ work without paying or attributing them (or doing so only minimally), while those companies then use these technologies to make more money?

        Because those works were put online, at a publicly accessible location, and not behind a paywall or subscription. If literally anyone on the planet can see your work just by typing a URL into their browser, then you have essentially allowed them to learn from it. Also, it’s not like there are copies of those works stored away in some database somewhere; they were merely looked at for a few seconds each while a bunch of numbers went up and down in a neural network. There is absolutely not enough data kept to reproduce the original work.
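
        The capacity argument also checks out as back-of-envelope arithmetic. A rough sketch, using assumed order-of-magnitude figures (very roughly Stable Diffusion v1 scale; the exact counts are illustrative, not quoted specs):

          # How many bytes of model capacity exist per training image?
          params = 1.0e9          # ~1 billion parameters (assumed)
          bytes_per_param = 2     # fp16 weights
          images = 2.0e9          # ~2 billion training images (assumed)

          model_bytes = params * bytes_per_param
          print(f"{model_bytes / images:.1f} bytes of capacity per image")  # ~1.0
          # About one byte per image: far too little for verbatim storage,
          # though heavily duplicated images can still be partially memorized.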

        Besides, if OpenAI (or other companies in the same business) had to pay a million people for the rights to use their work to train an AI model, how much do you think they’d be able to pay? A few dollars? Why bother seeking that kind of compensation at all?

    • bane_killgrind@lemmy.ml · 1 year ago

      It’s not being creative. It’s generating a statistically likely facsimile from a separate set of input parameters. It’s sampling: keeping the same pattern of beats even if the order of the notes changes.
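
      For a mechanical picture of what “statistically likely facsimile” means, here is a minimal sketch: a toy bigram sampler, vastly simpler than any real model but the same idea in spirit. It tallies which token follows which, then emits a new sequence that reuses the tallied patterns in a new order.

        import random
        from collections import defaultdict

        corpus = "the cat sat on the mat and the cat ate the rat".split()

        # Build the "database of connections": which word follows which.
        following = defaultdict(list)
        for a, b in zip(corpus, corpus[1:]):
            following[a].append(b)

        # Sample a statistically likely sequence: new order, same patterns.
        word, out = "the", ["the"]
        for _ in range(8):
            word = random.choice(following.get(word, corpus))
            out.append(word)
        print(" ".join(out))  # e.g. "the cat ate the mat and the cat"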

      • admiralteal@kbin.social · 1 year ago

        Because I don’t think I can post it enough: let’s forget the term AI. Let’s call them Systematic Approaches to Learning Algorithms and Machine Inferences (SALAMI).

        So much confusion and prejudice is thrown into this discussion by the mere fact that they’re called AIs. I don’t believe they are intelligent any more than I believe a calculator is.

        And even if they are, AIs don’t have the needs that humans do. So we still must value the work of humans more highly than the work of AIs.

        • inspxtr@lemmy.world · 1 year ago

          I agree with you on both points. I fixed the text in my comment from AI to generative tech, mostly because I honestly don’t fully have a good grasp on what exactly can be considered intelligence.

          But your second point, I think, is more important, at least to me. We can have debates on what AI/AGI or whatever is, but the thing that matters right now, and in the years (even months) to come, is that we as humans have multiple needs.

          We need to work, and some of our work requires generating something (code, art, blueprints, writing) that may be replaceable by these technologies really soon. Such work takes years, even decades, of training and experience, especially domain-knowledge experience that is invaluable for things like necessary human interaction, communication, and bias detection and resolution. Yet within a couple of years, if all of that effort gets replaced by a bot (one that might have more unintended consequences, but cuts costs) instead of augmented and assisted, many of us would struggle to make a living, while the companies that build these tools profit and benefit from it.

      • Peaces@infosec.pub (OP) · 1 year ago

        Though, at what point does sampling become coherentism, in the philosophical sense? In the end, whether an AI performs “coherently” is all that matters. I think we are amazed at ChatGPT now because of the quality of its 2021-era LLM, but that value will degrade, or become less “coherent”, over time; i.e., model collapse.

    • Fonchote@lemmy.world · 1 year ago

      I agree with you; the only caveat here is that the artists mentioned say that their books were illegally obtained, which is a valid argument. I don’t see how training an AI on publicly available information is any different from a human reading/seeing said information and learning from it. But that same human pirating a book is illegal.

      The additional complexity here is laws that were written, and are enforced, by people who don’t fully grasp this technology. If this were traditional copied code, then yes, it could be a copyright issue, but the models should be trained on enough data to create genuinely derivative works.

      • 14th_cylon@lemm.ee · 1 year ago

        I don’t see how training an AI on publicly available information is any different from a human reading/seeing said information and learning from it.

        Well, the difference is that humans are a quite well self-regulated system: as new artists are created by learning from the old ones, the old ones die, so the total number stays about the same. The new artists also have to eat, so they won’t undercut others in the industry (at least not beyond a certain point), and they cannot scale their services to the point where one artist would serve all the customers and every other artist would starve. That’s how human civilization has worked since its dawn.

        I hope I don’t need to describe how AI is different.

        • Skua@kbin.social · 1 year ago

          I’m not sure this argument really addresses the point. If some human artist did become so phenomenally efficient at creating art that they could match the output of the likes of Midjourney as it is today, I don’t think anybody would be complaining that they learned their craft by looking at other artists’ work. If they wouldn’t, it’s clearly not the scale of the output alone that’s the issue here.

          It’s also not reasonable to describe the art market as an infinitely and inherently self-regulating one just because artists die. Technology has severely disrupted it before. The demand for calligraphers certainly took quite a hit when the printing press was invented. The camera presumably displaced a substantial amount of the portrait market. Modern digital art tools like Photoshop facilitate an enormously increased output from a given number of artists.

    • 14th_cylon@lemm.ee · 1 year ago

      what makes that ok

      Well, the difference is that humans are a quite well self-regulated system: as new artists are created by learning from the old ones, the old ones die, so the total number stays about the same. The new artists also have to eat, so they won’t undercut others in the industry (at least not beyond a certain point), and they cannot scale their services to the point where one artist would serve all the customers and every other artist would starve. That’s how human civilization has worked since its dawn.

      I hope I don’t need to describe how AI is different.

    • TheHighRoad@lemmy.world · 1 year ago

      AI is an existential threat to so many. I see it much like an established worker sabotaging a talented up-and-comer to protect their own position.

  • Nioxic@lemmy.world · 1 year ago

    If I look at art and then try to make art…

    Do I also have to pay the artist?

    • Shardikprime@lemmy.world · 1 year ago

      I mean, museums do have free days, but you have to pay for regular visits.

      Same with art galleries.

  • fidodo@lemmy.world · 1 year ago

    While I do think it’s technically possible, and the right thing to do, to determine what original works were used in a piece of derivative art and pay royalties, the reality of the situation is that those payments would be a small fraction of what artists make now, since the whole point of generative art is to be able to produce derivative works for a fraction of the cost. Unless the demand for art increases proportionately with the decreased cost, which it can’t, compensation will decrease; and as more art goes into the public domain, the compensation will decrease further.
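
    As a toy illustration of why those royalties stay small (every number below is invented purely to show the shape of the argument, not a forecast):

      # Hypothetical: prices collapse far faster than demand can grow.
      price_drop = 100        # generated works cost ~100x less (assumed)
      demand_growth = 5       # cheaper art grows demand, but not 100x (assumed)
      royalty_share = 0.10    # share of generation revenue paid out (assumed)

      old_income = 1.0        # normalize today's artist income to 1
      new_market = old_income / price_drop * demand_growth
      royalties = new_market * royalty_share
      print(f"royalty income vs. today: {royalties / old_income:.1%}")  # 0.5%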

    This will not save artists, and they need a backup plan. I think the future of art will probably be that artists become more like art directors or designers who are responsible for having good taste rather than producing the original work, but even that will see greatly diminished demand as generative AI handles most basic needs.

  • gimmedat@lemm.ee · 1 year ago

    Current artists learned their art from the previous generation’s artists. Are they also supposed to pay?