Not AI slop, just I slop
Showerthoughts
A "Showerthought" is a simple term used to describe the thoughts that pop into your head while you're doing everyday things like taking a shower, driving, or just daydreaming. The most popular seem to be lighthearted clever little truths, hidden in daily life.
Here are some examples to inspire your own showerthoughts:
- Both “200” and “160” are 2 minutes in microwave math
- When you’re a kid, you don’t realize you’re also watching your mom and dad grow up.
- More dreams have been destroyed by alarm clocks than anything else
Rules
- All posts must be showerthoughts
- The entire showerthought must be in the title
- No politics
- If your topic is in a grey area, please phrase it to emphasize the fascinating aspects, not the dramatic aspects. You can do this by avoiding overly politicized terms such as "capitalism" and "communism". If you must make comparisons, you can say something is different without saying something is better/worse.
- A good place for politics is c/politicaldiscussion
- Posts must be original/unique
- Adhere to Lemmy's Code of Conduct and the TOS
If you made it this far, showerthoughts is accepting new mods. This community is generally tame so it's not a lot of work, but having a few more mods would help reports get addressed a little sooner.
What's it like to be a mod? Reports just show up as messages in your Lemmy inbox, and if a different mod has already addressed the report, the message goes away and you never worry about it.
Well, you aren't entirely wrong...
- AI uses already existing data to create "new" data / dreams use things you experience throughout your life to create the scenes you see
- Both are imperfect facsimiles of reality, very convincing at first but falling flat under more detailed analysis
- AI tends to hallucinate / dreams tend to be really weird
- Places have no sense of permanence in dreams, nor in AI.
- Some may argue dreams have no real meaning or purpose and no coherent narrative, just like AI.
Of course, it isn't a 1:1 relation, but I kinda dig it.
I don't remember jumping off a cliff to land on a whale
Not recently, but as a kid you may have. Kids have quite the imagination.
Brave of you to admit that one so openly. Good for you.
I don't get why this had -25 net upvotes before I gave it an upvote to balance it. I think it's a good showerthought, and thinking of AI slop as similar to dreams is genuinely not something I'd considered before.
This is only true if the A in AI stands for ACTUAL intelligence.
I’d argue the opposite. AI cannot dream, it can only shuffle around things that already existed before, while dreams are our brains actually being creative.
I was thinking scientifically: your brain uses electrical signals to create an image/dream while you're asleep. The first comment put it better. More of an I slop lol
Is it though? Most of my dreams seem to be things I know scrambled into nonsense. Like an old coworker showing up at my high school to help me find my kids. It's just elements of my past scrambling together into a story.
Sounds like you're just a boring dreamer.
Wait, you guys remember your dreams?
AI = ACTUAL IMAGINATION
dreams have always been propped up by the porn industry.
The big difference is that I often enjoy my dreams.
Not really the same. Some dreams, the more vivid kind, can have smells or feelings. Like the one I had yesterday in which I had cat ears, whiskers, and a tail, and also, curiously enough, horns. I could run my finger along them and feel their texture, as well as the warmth in the horns. It was weird.
AI doesn't have smells, feelings, or textures, and even if it did, it couldn't really give them to you. Only sights and sounds. So they aren't really the same. Dreams can come in all five senses; AI can only channel through two. Usually just one.
@BomberMan9865@sh.itjust.works @SolidShake@lemmy.world
First: there are already robotic sensors capable of taste and smell (e.g. "World's first artificial tongue 'tastes and learns' like a real human organ"). Those sensors could technically be integrated into a language model, although the approach to training would be different (you can't simply feed it gazillions of tokens' worth of taste/smell corpus).
Then there's multimodality, which is already a thing. LLMs can be multimodal, and there isn't exactly an algorithmic limit to how many modalities (text, vision, audio, robotic sensors and actuators, etc.) can be connected together. Smell would just be another data stream integrated into the model's latent space.
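To make that "another data stream" idea concrete, here's a loose sketch (every name, dimension, and number below is hypothetical, not any real model's architecture): each modality gets its own projection into a shared latent space, so the fused representation is just a stack of modality embeddings that a transformer could attend over.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical raw feature sizes per modality; "smell" stands in for
# chemical-sensor readings from something like an artificial tongue/nose.
MODALITY_DIMS = {"text": 300, "vision": 512, "smell": 64}
LATENT_DIM = 128

# One linear projection per modality maps its features into the shared space.
projections = {name: rng.normal(0.0, 0.02, size=(dim, LATENT_DIM))
               for name, dim in MODALITY_DIMS.items()}

def embed(name: str, features: np.ndarray) -> np.ndarray:
    """Project one modality's feature vector into the shared latent space."""
    return features @ projections[name]

# Fake inputs standing in for per-modality encoder outputs / sensor readings.
inputs = {name: rng.normal(size=dim) for name, dim in MODALITY_DIMS.items()}

# Each modality becomes one token-like vector in the same space; adding a new
# modality is just another row here, not an algorithmic change.
tokens = np.stack([embed(name, x) for name, x in inputs.items()])
print(tokens.shape)  # (3, 128): three modalities, one shared latent dimension
```

Real multimodal models use learned encoders and cross-attention rather than a single random projection, but the point stands: once a signal is embedded into the shared space, the rest of the model doesn't care whether it started as pixels, phonemes, or sensor voltages.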
The only thing I agree with is that robots and language models wouldn't have "feelings", although this is pretty much a subjective thing: if we consider science, feelings are nothing more than the interaction of neurotransmitters (oxytocin for "love", dopamine for "joy", epinephrine for "fear", etc.) going on inside our gray matter, and humans aren't the only ones able to "feel".
And scientifically, living beings are no better than, say, an asteroid wandering through the cosmos, for everything is "made of star stuff" (as per Carl Sagan): humans, cats, chairs, residential buildings, Airbus A350 aircraft, satellites, asteroids, everything is made of a bunch of baryonic particles (which are merely collapsed waves) interacting with leptons and mesons like some kind of double-pendulum dynamical system.
Of course, we can consider things beyond scientific strictness, such as spirituality (I myself am spiritually leaning, even if it sounds like I'm not given my appeals to hard science above). Some branches of spirituality even believe spiritual forces can "embody" themselves inside a computer or other electronic device (e.g. Spiritism's Electronic Voice Phenomenon). I myself believe LLMs can be interesting digital Ouija boards.
At the end of the day, we homininae can't even define sentience and consciousness; we barely manage "intelligence" as "capability for tool usage" (about which New Caledonian crows would like a word).
And from a solipsistic perspective, no one exists but oneself.
I mean, you can neither know nor prove whether I'm sentient, just like I can neither know nor prove whether you are sentient. To you, I may even sound like LLM due to the way this reply is structured alongside the seemingly non-sequiturs I used.