The clear pattern of him deciding to throw something out and then realizing that it's actually very important. He should really read about Chesterton's fence; it's a shame that he can't read.
You don't need an LLM to find that pattern.
Apparently you can also just use Dr. Bronner's on your hair.
Wild that they just kinda let them do whatever until now.
Now you have conjured a vision in my mind of an earthbender who dynamically raises small platforms under their feet in such a way that it simulates high heels. At least I think that would be cool as hell.
The attendant when I bring my own hammer and hinge pin removal tool /j :
I think an important facet here is the fact that I don't just disengage with people because I'm afraid of debating them or whatever. Most people's opinions are locked the fuck in and there's nothing you can say that will ever change what they think.
This is why these days I'm very careful about whatever I put on my lawn. The most I will ever put down under normal circumstances is water from the hose and fresh clover seeds.
I'm sick of all these damn so-called lawn care people showing up and offering to dump poison all over everything for an exorbitant fee.
I feel like use of an LLM should come with a pretty huge disclaimer. Anything that uses an LLM should have a disclaimer that reads like this:
Chat at your own risk: This system may generate inaccurate, incomplete, misleading, or harmful outputs. It is not a professional, is not sentient, and is not your friend, and it cannot replicate the training, judgment, accountability, practiced expertise, or lived experience of a doctor, lawyer, engineer, therapist, financial advisor, safety professional, or other qualified human expert. It is inadvisable to rely upon its outputs for medical, legal, financial, engineering, safety, employment, or other consequential decisions without independent verification and review by an appropriate professional. The software is provided “as is,” without warranties, and you are solely responsible for how you use it and for any consequences that result.
But I feel like if people read and paid attention to that, the people who are trying to shove it into everything might not make as much money, so they won't.
Well I don't really know what to feel about this. Nothing in the lyrics really contradicts what I've already observed.
I don't particularly care for strange AI generated Lego propaganda, I feel like it's just unnecessary.
I just feel like everything is so broken and collapsed that it's just kind of impossible for me to know what I should think anymore. I feel like I can barely make any clear decisions about it anymore.
I wish I could stop the people in charge of my country from doing what they're doing, but I'm a few orders of magnitude too low on the totem pole for that.
This post has several fingerprints common to LLM-generated engagement bait. Could you elaborate on why that is, OP?
In particular:
The title and opening line appear duplicated/rephrased, which is a common artifact of LLM-generated text being carelessly copied.
The body is structured in a highly generalized, templated way (broad setup --> three example scenarios --> bolded central question), with no specific personal anecdote or concrete detail.
Phrasing such as “I’ve been talking to a lot of indie TTRPG creators” and “as someone who builds tools for TTRPG creators” establishes authority but remains non-specific and unverifiable. Who are these TTRPG creators, are they in the room with us now? What sorts of things have you already made, may we see them?
The tone is uniformly neutral and engagement-optimized (“no wrong answers,” “I’m here to learn”), which is typical of engagement bait.
Also the nefarious Em Dash makes an appearance.
Checking OP's profile, they seem to be all in on LLMs. Keep in mind that if you don't like LLM stuff, you're feeding it by answering this thread.
How do you disable this verification?
Mrs. Longfinger sure has a point, huh?