Depends, are they sentient? If they are conscious beings, yeah I think it would be unethical to mass murder them
Intelligence isn't the important factor there - consciousness is. Does it feel like something to be those entities in the simulation? If yes, then I'd argue that ending the simulation is like killing a person painlessly in their sleep.
I personally don't think ending the simulation is even the most troubling part. We could unintentionally create a simulation that's effectively a hell and then populate it with entities that have subjective experiences we don't realize exist. The only thing worse than ending a life is creating one just for it to suffer through its entire existence.
We could unintentionally create a simulation that's effectively a hell and then populate it with entities that have subjective experiences we don't realize exist. The only thing worse than ending a life is creating one just for it to suffer through its entire existence.
And this is basically the plot of the TV series Severance. Has me wondering how they intend to address it.
Didn't scientists train brain cells to exclusively play Doom? It's like their whole consciousness is stuck in a video game version of hell, through a brain-in-a-vat experience.
Not really. It's not nearly enough cells to have any kind of consciousness as we know it. A few neurons learning to play a game is a far cry from tying a being into a simulation of hell.
The only thing worse than ending a life is creating one just for it to suffer through its entire existence.
Antinatalism entered the chat
Or maybe just well reasoned morality?
Somewhere in a box in your childhood home, a Tamagotchi is slowly dying...
Slowly? Those things would 'die' in under 24 hours!
Just turn down the simulation speed real low and run it at one tick per 20 years, then you can technically keep it going without such great expense. The people inside won't notice the difference.
"Now playing human music, on Earth Radio"
If you take the limit of that, you'll realize that people won't notice if you turn it off either.
The simulacrants wouldn't realize the simulation was ever not running.
Kurzgesagt made a video about how in a dying universe (from heat death) civilizations that uploaded their consciousness into a simulation could live forever, by intermittently running the simulation and pausing it for greater and greater amounts of time as expendable energy in the universe diminishes. The consciousness would not perceive the time the simulation isn't running and to them things just go on and go on for eternity.
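The pausing-and-resuming idea above can be made concrete with a toy sketch (all names here are hypothetical, and this assumes a simple tick-based simulation loop): the simulated state only ever observes its own tick counter, so any host-side pause between ticks, however long, is invisible from the inside.

```python
import time

def run_simulation(ticks, pause_between_ticks=0.0):
    """Advance a toy simulated clock tick by tick.

    The simulated state can only observe its internal tick counter,
    so a wall-clock pause between ticks (however long) leaves no
    trace in its history.
    """
    state = {"internal_time": 0}
    for _ in range(ticks):
        state["internal_time"] += 1          # the only "time" visible inside
        time.sleep(pause_between_ticks)      # host-side pause: unobservable within
    return state

# Two runs with very different host-side pauses produce identical
# internal histories:
fast = run_simulation(5, pause_between_ticks=0.0)
slow = run_simulation(5, pause_between_ticks=0.01)
assert fast == slow  # the simulated clock cannot tell the difference
```

The point of the sketch is just that the inhabitants' notion of time is the tick count, not the host's wall clock, which is why ever-longer pauses (or slowing the tick rate to one tick per 20 years) would go unnoticed inside.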
If this is a way for our simulation creator to decide to pull the plug without guilt, I guess just go ahead and do it. I was holding out hope that this was all real, but it has been getting more clear that it's not.
The old question, right? Does a simulated (or rather emulated) brain actually think and feel? Or does the computer just output what the brain would do if it were alive?
I think, therefore I am; but I can't prove that I'm self-aware and not just "pretending" to be. Because you are a human being like me, I assume that you are too. But that assumption breaks down when you are not a physical organism but software running on a computer.
Yes because then your car battery won't start
There is a TV film, I don't know the title, about this topic.
The plot was:
a group of scientists made a living simulation, and they would go into the simulation to apply fixes and keep it running. One day, one of the scientists was killed, and he left a message in the simulation for his coworkers. The message was: "take a road and follow no direction." A guy inside the simulation followed the instruction and discovered that he was in a simulation, but the message was actually meant for the scientists, who were in a simulation too.
If someone can find the movie, that would be great.
I'd say that whether or not it's in a simulation doesn't matter. If the beings you created were recognizable as people (human or otherwise) then they have rights and you'd be trampling those rights if you ended their existence. The creation of such life should not be done without an appropriate sense of responsibility.
then they have rights
Why? I'm not trolling, I just really think it's interesting where people think "rights" come from. Some people think they come from God. Which is great, because in this scenario we are God. So anything we do is ethical because we did it.
I contend they come from States. Because I notice that rights are different in different States. And I don't think a god would obey jurisdiction.
Another way of saying this is that the beings themselves have to recognize and demand rights. Because a state is just people deciding things after all.
So where do the rights come from? Are they a legal/social construct, or somehow inherent in the universe? Some third thing I didn't think of?
People forget how scary the real world is. We are the only creatures to create this concept of rights. You think that grizzly bear cares about your rights? Got some news for you....
And shit, even we don't respect other people's right to exist.
:: gestures very very briefly to.... EVERYTHING going on right now::
You think the asteroid that ended 90+% of life on earth cared about the dinosaur's rights?
All that being said, I wouldn't be able to pull the plug.
You can "simulate" life inside your brain, too.

[Alt text: this is Bob. Bob is a figment of your imagination. When you leave, Bob will leave too. "Don't leave" says Bob]
The Bob in your head is intelligent, it can communicate in English. Is it unethical to stop thinking about Bob? Was it unethical of me to show you this picture, creating a "Bob" in your head? Is any story unethical to tell?
No, because if you don't use the kill switch you get The Ring happening.
Is that "the ring" cannon or are you just yes anding?
"cannon"?
Uhh, like part of the story.
A cannon is a thing that shoots big lumps of exploding metal tens of kilometers into the distance.
Canon is the thing with the plots and stories. Also, a camera.
This is a tough question, I think to answer it you have to know if those simulated beings have actual consciousness / sapience or if that is just simulated.
Depends who they've elected as leader.
The fun thing about ethics is that not everyone shares the same rules. Personally, I would probably say it is. (Though is it worse than what we do to cows? Or what we do to other humans in war?) However, others may say they aren't real, only an illusion manufactured by the simulation, so it's fine. There are other arguments I'm sure someone could make too. It's up to you to decide what your ethics are, not others. There is no universal code of ethics, just as there is no universal morality.
If you set the simulation to end before it has begun, do you dodge the question of ethics?
If you are a human, the human ethic of not killing "alive" stuff still applies to you, no?
Thinking more into rules of ethics, if those simulated beings came up with their own morals like "don't try calculating all digits of pi in large groups because it causes lag" that would not really apply to you.
Basically different beings have different rules of ethics IMO and you can't simply end the simulation more so because you are a human than anything.
The answer could change in the same exact scenario if you were some kind of eldritch being instead of a human.
How do you define intelligence? In any case, I think it's irrelevant. What's relevant is whether the beings are self-aware, or whether they exist having notions and concepts of fear of death. For this reason, I deem it unethical to slaughter, for instance, animals in a setting in which their peers are aware of the moment of their death. Seeing, for instance, a cow agonize over a peer being shot to death in front of them is heart-wrenching. For this reason, my answer to your question is "yes". Yes I eat meat. :3
You got me curious, you seem to feel some way about slaughtering animals, but that doesn't seem to translate into your actions being aligned with your feelings.
Would you care to talk about it?
I'll be honest I have reduced my meat/dairy/egg consumption significantly, but every once in a while I'm not the one cooking at home and I don't really feel able to go on a side quest while hungry.
One of the Minds in Iain M. Banks' last novel, The Hydrogen Sonata, faces and addresses exactly this problem. Much is at stake, so it's a meaningful discussion.