So it hurts long after his death.
“I’ll never forget the sweet romantic words he said to me last night: ‘As a large language model, I am unable to comprehend what the feeling of love is. Here is a list of love songs from Wikipedia.’”
Hi honey, here’s Despacito…
https://en.m.wikipedia.org/wiki/Be_Right_Back
Black Mirror is not an instruction manual, people. Quite the opposite. Can we stop trying to make every episode real?
If you don’t want to do it then don’t do it. Can we stop trying to tell everyone else they have to have the same values as you?
Maybe they were inspired by Mulholland Drive instead.
We’re not “trying to make every episode real”. Technology’s direction and human foibles are predictable. Black Mirror writers just aren’t blind and have a good sense of what’s coming down the pipeline.
That’s why it’s called Mirror. It’s about showing us who we are.
Sorry if that’s too horrifying for you, but this goes way beyond imitating the last person to mention these problems.
My wife is fortunately still alive so maybe that colours my view. However when I’ve lost other people the blessed anaesthesia of forgetting has been essential in being able to function.
From the short quote it seems like she maybe has a healthy-ish attitude but idk… I feel like this would be a shallow simulacrum that prolongs grief.
I don’t believe humans are meant to manage loss in this way — stretching out an imitation of our loved one. As painful as it is, I personally believe humans need to say goodbye. I feel this gets in the way of feeling and truly accepting the loss so that a person can move forward.
Loss is truly heavy, but I do not believe this is better or healthy.
My sister has hundreds of YouTube videos she used to help her students learn between music lessons. It will soon be two years since she died, and I haven’t been able to watch even one.
I like to remember her in my mind; it hurts less than seeing her as she was when she was alive.
Yeah. I am not a Buddhist, but I’ve always found something rings true in the reflections on impermanence. When we bond with someone we accept the pain of loss, and when we feel it, most people seem to describe relief once they’re able to “let go” and accept that it’s over.
It seems to me that encouraging clinging and reminiscing stunts you a bit and only provides temporary relief from the loss while drawing out the time it takes to process it.
Idk though, maybe I’ll have the misfortune to feel differently some day. It’s hard to judge someone hanging out with their spouse watching death creep closer each day. I have approximately zero idea what my opinions would be in the face of that.
People who can’t get over losing someone will sorrow for the rest of their life, or until they get over it. And AI won’t help them get over it. Death is part of our life, and as long as you don’t accept it, it remains pain.
I think it was last year that I read about someone recreating a mother’s lost son (or some other family member, I forget) in a VR environment, so she could see him again in VR. Absolute madness! What does this do to the person? Now couple that with an AI… man, the future is grim…
I had this conversation with my wife once. I let her know that it is my advance wish that she must allow me to complete the cycle of life. Anything else, any reconstruction of me that technology allows, is to me an abomination. Keep the pictures, keep the memories, but don’t keep me here when I am gone.
I refrain from judging the decisions of others where possible, but this is my personal wish.
I tried things like character AI to play with talking to “celebrities”. It was novel, it was fun. For about 15 minutes. Then… Eh. It’s not the person, and your brain knows it’s not them. It’s always an imitation. I got bored talking with people I’ve always wanted to talk to.
I can’t imagine it being a loved one who has passed. It would feel hollow and empty, and wouldn’t make the pain leave. Idk, it just wouldn’t be good at all.
I don’t believe humans are “meant” to do anything. We are a result of evolution, not intentional design. So I believe humans should do whatever they personally want to do in a situation like this.
If you have a loved one who does this and you don’t feel comfortable interacting with their AI version, then don’t interact with their AI version. That’s on you. But don’t belittle them for having preferences different from your own. Different people want different things and deal with death in different ways.
Meant, in this context, refers to the conditions that humans have faced over a long period of time and may be better suited to coping with from a survival point of view. I’m an atheist, so I find it strange that you chose to read my comment as implying intentional design. Certainly, AI has existed for a much shorter time than the phenomenon of a human encountering the death of a loved one. Indeed, death has been a common theme throughout history, and the tools and support available to cope with it, and to relate it to other human experiences, far exceed those for coping with the potential issues that come with AI.
I think one can absolutely speak of needs and adaptation for something as common a human experience as death. If you find something belittling about that opinion, I’m not sure how to address you further. I may simply have to be wrong.
Just gonna say that I agree with you on this. Humans have evolved over millions of years to emotionally respond to their environment. There’s certainly evidence that many of the mental health problems we see today, particularly at the scale we see them, are in part due to the fact that we evolved to live in a very different way from our present lifestyles. And that’s not about living in cities rather than caves, but more to do with the amount of work we do each day, the availability and accessibility of essential resources, the sense of community and connectedness within small social groups, and so on.
We know that death has been a constant of our existence for as long as life has existed, so it logically follows that dealing with death and grief is something we’ve evolved to do. Namely, we evolved to grieve for a member of our “tribe”, and then move on. We can’t let go immediately, because we need to be able to maintain relationships across brief separations, but holding on forever to a relationship that can never be continued would make any creature unable to focus on the needs of the present and future.
AI simulacra of the deceased give the illusion of maintaining the relationship with them. It is entirely possible that this will prolong the grieving process artificially, when the natural cycle of grieving is to eventually reach a point of acceptance. I don’t know for sure that’s what would happen… but I would want to be absolutely sure it’s not going to cause harm before unleashing this AI on the general public, particularly on vulnerable people (which grieving people are).
Although I say that about all AI, so maybe I’m biased by the ridiculous ideology that new technologies should be tested and regulated before vulnerable people are experimented on.
There may not have been any intentional design, but humans are still meant to eat food, drink water, and breathe oxygen, and going against that won’t lead to a good end.
Even with that, being absolutist about this sort of thing is wrong. People undergoing surgery have spent time on heart/lung machines that breathe for them. People sometimes fast for good reasons, or get IV fluids or nutrients provided to them. You don’t see protestors outside of hospitals decrying how humans aren’t meant to be kept alive with such things, though, at least not in most cases (as always there are exceptions, the Terri Schiavo case for example).
If I want to create an AI substitute for myself it is not anyone’s right to tell me I can’t because they don’t think I was meant to do that.
Sure, you should be free to make one. But when you die and an AI company contacts all your grieving friends and family to offer them access to an AI based on you (for a low, low fee!), there are valid questions about whether that will cause them harm rather than help - and grieving people do not always make the most rational decisions. They can very easily be convinced that interacting with AI-you would be good for them, but it actually prolongs their grief and makes them feel worse. Grieving people are vulnerable, and I don’t think AI companies should be free to prey on the vulnerable, which is a very, very realistic outcome of this technology. Because that is what companies do.
So I think you need to ask yourself not whether you should have the right to make an AI version of yourself for those who survive your death… but whether you’re comfortable with the very likely outcome that an abusive company will use their memories of you to exploit their grief and prolong their suffering. Do you want to do that to people you care about?
But when you die and an AI company contacts all your grieving friends and family to offer them access to an AI based on you (for a low, low fee!)
You can stop right there; you’re just imagining a scenario that suits your prejudices. Of all the applications for AI I can imagine that would be better served by a model entirely under my control, this would be at the top of the list.
With that out of the way the rest of your rhetorical questions are moot.
One of my colleagues has something along the lines of superior autobiographical recall. He remembers in great detail major and minor events from childhood to today. It’s difficult for him to forget.
I myself have forgotten long stretches of my life, and even looking at pictures of myself from those times it feels unfamiliar.
There are some things I wish I could remember better, but overall I prefer my forgetful brain to his never-forgetting one.
He posted online, telling his friends it was time to say goodbye. Then his friend called him up, saying he had an opportunity at his company Eternos.Life for Bommer to build an interactive AI version of himself.
It doesn’t get more tech bro than that
But in this case it seems like an entirely good thing? The offer was made by an actual friend, the guy himself wanted this, his wife too, and they’re both pretty cognizant about what this is and isn’t.
Yeah, contrary to all the negativity about this in this thread, I think there are a lot of worthwhile reasons for this that aren’t centered on fawning over the loss of a loved one. Think of how many family recipes could be preserved. Think of the stories you could have retold to you in 10 years. Think of the little things you’d easily forget as time passes. These are all ways of keeping someone with us without making their death the main focus.
Yes, death and moving on are a part of life, but we also always say to keep people alive in our hearts. I think there are plenty of ways to keep people alive for us without having them present. I don’t think an AI version of someone is inherently keeping their spirit from continuing on, nor is it inherently keeping your loved one from living in the moment.
Also I can’t help but think of the Star Trek computer but with this. When I was young I had a close gaming friend who we lost too soon, he was very much an announcer personality. He would have been perfect for being my voice assistant, and would have thought it to be hilarious.
Anyway, I definitely see plenty of downsides, don’t get me wrong. The potential for someone to wallow in this is high. But I also think there are quite a few upsides, as mentioned: the memories aren’t ephemeral, and I think it’s somewhat fair to pick and choose good memories to pass down and remember. Quite a few old philosophical thought experiments are coming to fruition with tech these days.
Think of how many family recipes could be preserved
We solved this problem long before we invented writing.
LLMs do not enable the keeping of family memories. That’s been going on a long time.
Think of how many family recipes could be preserved. Think of the stories that you can be retold in 10 years. Think of the little things that you’d easily forget as time passes.
An AI isn’t going to magically know these things, because these aren’t AIs based on brain scans preserving the person’s entire mind and memories. They can learn only the data they’re told. And fortunately, there’s a much cheaper way for someone to preserve family recipes and other memories that their loved ones would like to hold onto: they could write them down, or record a video. No AI needed.
We have a box of old recipe cards from my grandmother that my wife cherishes. My parents gifted them to her because, out of all their daughters-in-law, she is the one who loves to cook and explore recipes the most. I just can’t imagine someone wanting something like that in a sterile technological form like an “AI-powered” app.
“But Trev, what if you used an LLM to generate summaries-” no, fuck off (he said to the hypothetical techbro in his ear).
I more meant in the case of someone whose life was cut short and didn’t have the time to put something like this together. I agree that ideally this is information you’d get to pass down, but life doesn’t always work out like that.
Also like you said about the AI powered app, it’s only a matter of time before Adobe Historical Life comes out and we’re paying $90 a month for gramma’s recipes (stories are an additional subscription).
I went back and read old emails from my mother who died in 2009. I had unread emails from her.
One of them contained my grandmother’s peanut butter cookie recipe, which I thought was lost when she passed in 2003.
It might have been nice if an LLM had found that instead of me, but it felt amazing to discover it myself.
The only news I care to hear about people wealthy enough to throw away what others earn in a year on trends like this… is if and when they get punted square in the nuts.
So far, it’s been slow news on that front.
It made me think of this old Michael Keaton movie, “My Life”, in which he leaves a treasure trove of video tapes to his unborn child.
Guy’s going full-on Pantheon.
🤖 I’m a bot that provides automatic summaries for articles:
Click here to see the summary
“And my wife said, ‘Hey, one of the things I will miss most is being able to come to you, ask you a question, and you will sit there and calmly explain the world to me,’” he said.
Then his friend called him up, saying he had an opportunity at his company Eternos.Life for Bommer to build an interactive AI version of himself.
You’re reading the Consider This newsletter, which unpacks one major news story each day.
AI has access to all sorts of knowledge, but his wife only wants to ask it questions that only Bommer would know the answers to.
Normally, uploading this information would take weeks or months, but Bommer needed to put it together in just a few days.
But when thinking about what questions she might end up asking this tool, once Bommer dies: "I assume perhaps to read me a poem.
Saved 72% of original text.
Yeah that seems healthy