Thinking there's something after death seems to make people lose sight of this world and fail to see the beauty in it, IMO. When I hear religious people ask this question, I think their god(s) must feel insulted. Doesn't really answer your full question, but those are my thoughts.
To enjoy, to the fullest, the chemical pleasures that life has to offer.
Everything happens after you die. Who told you nothing does?
Ngl, this type of post on Reddit used to make me depressed as a kid, and I'd make them too. I don't want to see them; there's no point in thinking about this. That's why people either don't, or spend all their time on religion.
If one's life were just loneliness, failed relationships, and soul-draining work, it might appear pointless.
Maybe there are lots of other things to do?
The meaning of life is to search for the meaning of life.
So the billionaires can live their best lives; they need peons to serve them.
I'm pretty confident that there's an afterlife.
I speak from my own research into related phenomena.
The afterlife is basically the dreamworld, but more so.