(Deleted; no longer relevant)

  • cll7793@lemmy.worldOP
    3 years ago

They are requesting something beyond watermarking. Yes, it is good to have a robot tell you when it has made a film. What is particularly concerning is that the witnesses want the government to keep track of every prompt and output ever made, so that any output can eventually be traced back to its origin. That means every open source model would somehow have to encode some form of signature, much like the hidden yellow dots printers put on every sheet.

There is a huge difference between a watermark stating "this is AI generated" and hidden encodings, much like a backdoor, that let them trace any publicly released AI image, video, or perhaps even text output back to a specific model, or worse, DRM-required "yellow dot" injection.

I know researchers have already looked into encoding hidden, near-undetectable patterns in text output, so extending this to everything else is not far-fetched.
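For anyone curious how a hidden pattern in text can even work: here is a minimal toy sketch in the spirit of the "green list" watermarking schemes researchers have proposed for language models. Everything here (the fake vocabulary, the green-list fraction, the fixed start token) is an illustration I made up, not any real model's scheme; the core idea is just that the previous token deterministically selects a "green" half of the vocabulary, the generator prefers green tokens, and a detector counts how often each token lands in its predecessor's green list.

```python
import hashlib
import random

# Toy vocabulary and parameters -- made-up illustration values,
# not taken from any real watermarking deployment.
VOCAB = [f"tok{i}" for i in range(1000)]
GREEN_FRACTION = 0.5


def green_list(prev_token: str) -> set:
    """Deterministically derive the 'green' half of the vocabulary
    from a hash of the previous token."""
    seed = int(hashlib.sha256(prev_token.encode()).hexdigest(), 16)
    rng = random.Random(seed)
    shuffled = VOCAB[:]
    rng.shuffle(shuffled)
    return set(shuffled[: int(len(VOCAB) * GREEN_FRACTION)])


def generate_watermarked(length: int, seed: int = 0) -> list:
    """Generate a token sequence that always picks from the green list.
    A real model would only *bias* toward green tokens, not hard-restrict."""
    rng = random.Random(seed)
    tokens = ["tok0"]  # arbitrary fixed start token for the sketch
    for _ in range(length):
        greens = green_list(tokens[-1])
        tokens.append(rng.choice(sorted(greens)))
    return tokens


def green_fraction(tokens: list) -> float:
    """Detection: fraction of tokens inside the previous token's green list.
    ~1.0 for watermarked text here; ~0.5 for unwatermarked text."""
    hits = sum(
        1 for prev, cur in zip(tokens, tokens[1:]) if cur in green_list(prev)
    )
    return hits / (len(tokens) - 1)
```

The important point for this discussion: a human reader sees nothing unusual, since every token is still a plausible word choice, yet anyone holding the hashing scheme can statistically prove which generator produced the text. That is exactly the "undetectable to humans, traceable to the keyholder" property being debated.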

Also, if the encodings are not detectable by humans, then they have failed the original purpose of making AI generated content recognizable.