• fonix232@fedia.io · 20 points · 19 hours ago

    No, your honour, we did not use the porn we downloaded to train our AI, at least not directly.

    The people working on training the AI? Oh, they definitely had a cheeky wank here and there after a stressful day! This just shows how committed Meta is to the wellbeing of our employees.

  • Jack@slrpnk.net · 51 points · 22 hours ago

    “we didn’t download any porn, and the one we downloaded was for personal use”

    This is fucking ridiculous…

  • megopie@beehaw.org · 6 points · edited · 15 hours ago

    As has been made very clear, it is not actually possible to prevent these models from regurgitating any information they’ve been trained on, no matter how fancy the system prompt. So, if there is NSFW content in the training data, users will always be able to access it, no matter how “compliant” the company is with restrictions on NSFW content by way of system prompts.

    They can have their cake and eat it too: many users will prefer the models because of their ability to do porn stuff, and they will not be held legally liable for that since they’ve done everything they possibly could.

    So long as no one proves that they did in fact intentionally train the models on a shit ton of porn …

  • Jul (they/she)@piefed.blahaj.zone · 23 points · 22 hours ago

    I mean, probably true, but that shouldn’t stop them from being liable if ordinary individuals are liable for the same actions. Sure, punish your employees, but if a parent is responsible for paying for a child’s pirating, then an employer should be just as responsible for paying for an employee’s pirating.

  • floquant@lemmy.dbzer0.com · 6 points · 18 hours ago

    What’s their angle? US companies can seemingly get a pass on copyright infringement if it’s for training AI. Why say that?