• ubergeek@lemmy.today · 1 day ago

    No AI company has ever made its own content to train its models: they all took what others created, remixed it, and presented it as something new.

    This AI model did the same thing.

    AI lost its job to AI.

    • chiliedogg@lemmy.world · 21 hours ago

      Yes, but that doesn’t mean it is more efficient, which is what the whole thing is about.

      Let’s pretend we’re not talking about AI, but tuna fishing. OpenTuna is sending hundreds of ships to the ocean to go fishing. It’s extremely expensive, but it gets results.

      If another fish distributor showed up out of nowhere selling tuna for 1/10 the price, that would be amazing. But if you found out they could sell it so cheap because they were stealing the fish from OpenTuna's warehouses, you wouldn't conclude that the secret to catching fish going forward is theft and stop building boats.

        • chiliedogg@lemmy.world · 10 hours ago

          So what happens when there are no fish left to steal from OpenTuna and there are no more boats?

          Information doesn’t stop being created, and AI models need to be constantly retrained and updated with new information. One of the biggest issues with GPT-3.5 was its 2021 knowledge cutoff.

          Let’s pretend you’re building a legal analysis AI tool that scrapes the web for information on local, state, and federal law in the US. If your model were from January 2008 and never updated, then gay marriage wouldn’t be legal in the US, the ACA wouldn’t exist, Super PACs would be illegal, the Consumer Financial Protection Bureau wouldn’t exist, zoning ordinances in pretty much every city would be out of date, and openly carrying a handgun in Texas would get you jail time.

          It would essentially be a useless tool, and copying that old training data wouldn’t make a better product no matter how cheap it was to do.

          • ubergeek@lemmy.today · 10 hours ago

            Once tuna runs out, and we run out of boats?

            Maybe we then stop destroying the tuna population?

            Or, to bring this back to the point: the environment will be better off once the AI bubble collapses.

              • ubergeek@lemmy.today · 7 hours ago

                It’s actually very much the conversation. The quicker the race to the bottom happens, the quicker this entire bubble bursts, and the quicker we stop torching the planet for imaginary profits.

                • chiliedogg@lemmy.world · 7 hours ago

                  That’s your opinion/agenda, not a legitimate argument in the conversation about AI efficiency. The discussion is on how best to achieve a goal, and you’re saying that it shouldn’t be achieved. Even if you’re right, you’re still going off on a separate tangent.

                  You’re the vegan who butts in on the conversation about how best to sear a steak and says meat is murder. You’re welcome to your opinion on meat and you may even be right, but it is of absolutely no value or interest to the people talking about methods for cooking meat.

                  • ubergeek@lemmy.today · 7 hours ago

                    That’s your opinion/agenda, not a legitimate argument in the conversation about AI efficiency

                    I’m not arguing about the “efficiency” of it. I’m stating that OpenAI did the exact same thing they’re complaining DeepSeek did: steal others’ work, remix it, and then claim it as their own.

                    And to reply to your “tuna fishing” analogy: I would be fully OK with people stealing the loads of tuna to hasten the collapse of the entire industry.

                    It’s you who is getting into the weeds about this, not I.