The EU parliament accepted a last-minute amendment mandating age verification for pornographic (whatever that is) content online, punishable with up to a one-year prison sentence.

This was rolled into a directive concerning CSAM. Because adults accessing porn need to be de-anonymised to prevent child exploitation?

Some press releases: (1), (2), (3)

  • iii@mander.xyzOP · 5 months ago

    How does one “follow the tokens” then?

    We don’t know what they do with the information, as it’s closed source.

    Assuming it’s based on this EU prototype:

    They don’t know why the token was requested, but they do know who requested it, and where and when.

    So they gather the logs of A, the token provider. Is the target present? Then they have his token, and they can see where and when it was used. Did you have a fun time yesterday evening, on your phone at home, on websites B, C and D?

    Next, if they want even more detail, they gather the logs of B and look for the token. That way they can pinpoint the exact search terms, categories, watch time, etc.
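
    To make the “follow the tokens” step concrete, here is a minimal sketch in Python. Everything in it, the log fields, the token format, and the assumption that the same token string shows up verbatim in both the provider’s and the website’s logs, is hypothetical; it only illustrates the join described above, not the actual prototype.

    ```python
    from collections import defaultdict

    # Hypothetical log of token provider A: who requested a token, and when/where.
    provider_logs = [
        {"user": "target@example.org", "token": "tok_7f3a", "issued": "2024-05-01T21:02", "ip": "home"},
        {"user": "someone_else",       "token": "tok_91bc", "issued": "2024-05-01T21:05", "ip": "cafe"},
    ]

    # Hypothetical log of website B: which token was presented, and what it was used for.
    site_b_logs = [
        {"token": "tok_7f3a", "time": "2024-05-01T21:04", "search": "category X", "watch_minutes": 17},
        {"token": "tok_7f3a", "time": "2024-05-01T21:30", "search": "category Y", "watch_minutes": 5},
    ]

    def follow_the_token(target_user, provider_logs, site_logs):
        # Step 1: the provider's logs reveal which tokens the target requested.
        tokens = {e["token"] for e in provider_logs if e["user"] == target_user}
        # Step 2: those tokens index straight into the site's logs,
        # recovering search terms, categories, watch time, and timestamps.
        activity = defaultdict(list)
        for entry in site_logs:
            if entry["token"] in tokens:
                activity[entry["token"]].append(entry)
        return dict(activity)

    print(follow_the_token("target@example.org", provider_logs, site_b_logs))
    ```

    The point is that a single shared identifier is all it takes: once the token links the two logs, the provider’s “who” column and the site’s “what” column merge into one profile.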

    In summary: centralizing the de-anonymisation this way makes mass surveillance easier than if the data stayed scattered across individual sites, some of them in foreign jurisdictions.

    It also shifts the conversation away from the best solution: don’t deanonymise in the first place.