Google’s DeepMind unit is unveiling today a new method it says can invisibly and permanently label images that have been generated by artificial intelligence.
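DeepMind has not published the embedding details of its method, but the general idea of an "invisible" image label can be sketched with a toy least-significant-bit scheme like the one below. This is purely illustrative and is not DeepMind's technique; real schemes are built to survive crops, resizes, and re-encodes that would wipe this one out.

```python
# Toy illustration of an invisible image watermark (NOT DeepMind's method;
# their embedding details are not public). It hides a repeating bit pattern
# in the least significant bit of each pixel value.
import numpy as np

def embed(image: np.ndarray, payload_bits: np.ndarray) -> np.ndarray:
    """Overwrite the LSB of every byte with the repeating payload."""
    flat = image.flatten()
    bits = np.resize(payload_bits, flat.shape)           # repeat payload to fill the image
    return ((flat & 0xFE) | bits).reshape(image.shape)   # clear LSB, then set it

def detect(image: np.ndarray, payload_bits: np.ndarray, threshold: float = 0.9) -> bool:
    """Report whether enough LSBs match the expected payload."""
    flat = image.flatten()
    bits = np.resize(payload_bits, flat.shape)
    return np.mean((flat & 1) == bits) >= threshold

rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(64, 64, 3), dtype=np.uint8)
payload = rng.integers(0, 2, size=256, dtype=np.uint8)

marked = embed(img, payload)
print(detect(marked, payload))  # True: every LSB matches
print(detect(img, payload))     # False: unmarked pixels match ~50% by chance
```

The point of the toy is only to show that the mark changes nothing a human can see while remaining machine-checkable; production schemes embed the signal far more robustly.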

  • Puzzle_Sluts_4Ever@lemmy.world · 1 year ago

    That’s great.

    It is basically the exact same situation as DRM in video games. A truly dedicated person can find a workaround (actually… more on that shortly). But the vast majority of people aren’t going to put in any more effort than it takes to search “Generic Barney Game No CD”.

    And… stuff like Denuvo has consistently demonstrated itself to be something that only a very limited number of people can crack. Part of that is a general lack of interest, but part of it is the same as it was with StarForce and even activation-model SecuROM back in the day: shit is hard, and you need to put in the time and effort to know how to recognize a call.

    Admittedly, the difference there is that people aren’t paying for video game cracks, whereas there would be a market for “unlocked” LLMs. But there is also strong demand for people who know how to really code those models and make them sing, so… it becomes a question of whether it is worth running a dark web site and getting paid in crypto versus just working for Google.

    So yeah, maybe some of the open source LLMs will have teams of people who find every single call to anything that might be a watermark AND debug whether removing them impacts the final product. But the percentage of people who will be able to run their own LLM will get increasingly small as things become more and more complex and computationally/data intensive, so maybe large state-backed organizations will be doing this.

    But with sufficient watermarking/DRM/content tracing, the odds of someone asking DALL-E 2 for a realistic picture of Biden having an orgy with the entire cast of Sex Education and it not being identified as a fake fairly quickly are… pretty much at the same level as people not realizing that someone photoshopped a person’s head onto some porn. Idiots will believe it. Everyone else will just see a quick Twitter post debunking it and move on with their lives.
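    The "find every single call" scenario above is easiest to picture with a hypothetical pipeline. Every name in the sketch below is made up for illustration; the point is that a bolt-on watermarking step is one call site you can delete, while a mark baked into the model weights or sampling loop has no single call to rip out.

    ```python
    # Hypothetical sketch of stripping a bolt-on watermark from an open-source
    # generation pipeline. All names are invented; real pipelines differ.
    import numpy as np

    def generate_latents(prompt: str) -> np.ndarray:
        # stand-in for the sampling / diffusion loop
        rng = np.random.default_rng(abs(hash(prompt)) % 2**32)
        return rng.standard_normal((64, 64))

    def decode_image(latents: np.ndarray) -> np.ndarray:
        # stand-in for the decoder that turns latents into pixels
        return np.clip(latents * 32 + 128, 0, 255).astype(np.uint8)

    def apply_watermark(image: np.ndarray) -> np.ndarray:
        # bolt-on post-processing: a single call site, trivial to locate and remove
        return (image & 0xFE) | 1

    def generate(prompt: str, watermark: bool = True) -> np.ndarray:
        image = decode_image(generate_latents(prompt))
        if watermark:            # a forked repo just flips this to False
            image = apply_watermark(image)
        return image
    ```

    If the mark were instead woven into the decoder's weights or the sampling procedure itself, removing it would mean retraining or fine-tuning and then checking that the outputs still look right, which is exactly the "debug whether removing them impacts the final product" work described above.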