• FaceDeer@fedia.io
    5 months ago

    Even with that, being absolutist about this sort of thing is wrong. People undergoing surgery have spent time on heart/lung machines that breathe for them. People sometimes fast for good reasons, or get IV fluids or nutrients provided to them. You don’t see protesters outside of hospitals decrying how humans aren’t meant to be kept alive with such things — at least not in most cases (as always there are exceptions, the Terri Schiavo case for example).

    If I want to create an AI substitute for myself it is not anyone’s right to tell me I can’t because they don’t think I was meant to do that.

    • frog 🐸@beehaw.org
      5 months ago

      Sure, you should be free to make one. But when you die and an AI company contacts all your grieving friends and family to offer them access to an AI based on you (for a low, low fee!), there are valid questions about whether that will harm them rather than help — and grieving people do not always make the most rational decisions. They can very easily be convinced that interacting with AI-you would be good for them, when it actually prolongs their grief and makes them feel worse. Grieving people are vulnerable, and I don’t think AI companies should be free to prey on the vulnerable — which is a very, very realistic outcome of this technology. Because that is what companies do.

      So I think you need to ask yourself not whether you should have the right to make an AI version of yourself for those who survive your death… but whether you’re comfortable with the very likely outcome that an abusive company will use their memories of you to exploit their grief and prolong their suffering. Do you want to do that to people you care about?

      • FaceDeer@fedia.io
        5 months ago

        But when you die and an AI company contacts all your grieving friends and family to offer them access to an AI based on you (for a low, low fee!)

        You can stop right there, you’re just imagining a scenario that suits your prejudices. Of all the applications for AI that I can imagine that would be better served by a model that is entirely under my control this would be the top of the list.

        With that out of the way the rest of your rhetorical questions are moot.