peanuts4life

joined 2 years ago
[–] peanuts4life@lemmy.blahaj.zone 13 points 6 hours ago

I usually give up or find a free alternative. Typically, if something is available at a good price I won't bother trying to get it free to begin with.

[–] peanuts4life@lemmy.blahaj.zone 4 points 7 hours ago* (last edited 7 hours ago)

It's been a while since I've listened to him, but I felt like he really struck an authentic vein with young men, especially queer kids. I wouldn't say his lyrics are genius, but his music was refreshing for popular hip-hop. I think a lot of his fans value the feelings he gives them over most other rappers in the space, even when their technical ability surpasses his.

Also his music production team sounds great, imo.

[–] peanuts4life@lemmy.blahaj.zone 6 points 11 hours ago

I don't remember the institution, but I remember reading a paper on a simulated trading environment with several ai agents that didn't know about each other. The LLMs were pretty conservative with profits and deliberately bought and sold in predictable ways. They all ended up "colluding" with each other by deliberately not competing.

[–] peanuts4life@lemmy.blahaj.zone 1 points 1 day ago (2 children)

I don't really think that this is a very productive approach to the issue of ai 'consciousness.' Anthropic has demonstrated that several LLMs have a rudimentary ability to reflect on their internal state during inference. They are an undeniably interesting, literate technology that we don't fully understand, being developed at an increasingly rapid rate.

It's not that I think LLMs are conscious, but I do see why a person might come to that conclusion. Calling them crazy, dumb, or unimaginative is kind of insulting. They are interacting with an alien sort of intelligence engineered to keep their attention.

It's especially annoying when a lot of critics in the ai space are so smug about it. Many of those critics don't like LLMs for legitimate reasons regarding their effect on employment, the environment, ai slop, art, etc. But these valid issues are biases unrelated to ai 'consciousness.' If a lay person comes in with an unbiased (not good, just unbiased) perspective, they just see a very difficult-to-understand, literate computer program that seems to have destroyed the Turing test. And they get insulted by people for making a naive, but somewhat reasonable, assumption that it is conscious.

How do I view the entire image?

[–] peanuts4life@lemmy.blahaj.zone 2 points 2 weeks ago (1 children)

What is the image source?

How could I say no to a face like that?

[–] peanuts4life@lemmy.blahaj.zone 6 points 2 months ago

I'm no floor expert, but I did put flooring like this into a room. The instructions we got were to leave a small gap all around the room, since temperature fluctuations will cause the floor to expand, and if there is no space to expand into, it will buckle. Baseboard trim that I installed later disguises the gap.

I think gluing it down is not the answer. It may just cause it to buckle in the middle, where it is harder to reach. If it's like the stuff we used, you can score it with a straightedge and a knife and then cut it.

[–] peanuts4life@lemmy.blahaj.zone 2 points 2 months ago

Sweaty gamer meme. I think the government is supposed to be the gamer? https://www.youtube.com/watch?v=Jb9Ebe_rA8M&t=0

[–] peanuts4life@lemmy.blahaj.zone 4 points 2 months ago

I understand the intended sentiment, but a gatekeeper is someone who acts as an obstacle to something that the subject wants to obtain.

To say a surgeon is a "cultural gatekeeper" implies that they are doling out culture--or ethnicity, in this case--in an exclusive way to those who seek it. This would make sense if the subject of the surgery were altering their appearance to that of an ethnicity other than their own, but that would be contrary to the point of the article.

It's emblematic of the purple prose that LLMs engage in. It is unlikely that a human writer with the excellent grasp of English demonstrated elsewhere in the article would make such a mistake.

I should also point out that the article lacks any specificity whatsoever. While it does cite general facts about surgery, a specific clinic, and descriptions of ethnicities, the article fails to demonstrate that this trend even exists. It does not contain a single image demonstrating the trend. It does not cite the origin of the trend. It does not cite any central figures, celebrities, or even specific surgeons or clinics that engage in this practice. Nor does it explain how the practice would even manifest. For example, how are these features highlighted? It only gives examples of whitewashing, not how this ethnically sensitive surgery would actually appear.

It reads a lot more like a Google Gemini search result, or an article prompted through ChatGPT, and the subject of the article sounds like it was fabricated by the user.

[–] peanuts4life@lemmy.blahaj.zone 6 points 2 months ago (2 children)

"Surgeons are more than technicians, they are cultural gatekeepers." I'm pretty sure this was written by an LLM. This phase in particular literally contradicts it's intended meaning for the sake of Pliny phrasing.

[–] peanuts4life@lemmy.blahaj.zone 3 points 2 months ago

I kinda agree. While I do want these LLM companies to be more private in terms of data retention, I think it's naive to say that a company selling artificial intelligence to hundreds of millions of users should be totally indifferent in the face of LLM-induced psychosis and suicide. Especially when the technology only gets more hazardous as it becomes more capable.
