• glibg10b
    link
    fedilink
    151 year ago

    Right. I don’t know how the hell someone managed to reveal their OpenAI key to the LLM itself

    • I don’t think it gave him the openAI key, he just had the ability to send as many hijacked (not game related) prompts as he wanted through the game on the devs’ dime.

      • @computergeek125@lemmy.world
        link
        fedilink
        English
        -11 year ago

        Which, now given the ability to inject arbitrary code, you could conceivably now write code to list every variable it had access to.

        • The text prompt in the game might also be vulnerable to arbitrary code injection, but that wouldn’t really have anything to do with the prompt injection being used here. Everything being done is within the confines of chatGPT which wouldn’t need or have access to any of the game’s code.