diz

joined 2 years ago
[–] diz@awful.systems 7 points 3 weeks ago* (last edited 3 weeks ago) (1 children)

Even to the extent that they are "prompting it wrong" it's still on the AI companies for calling this shit "AI". LLMs fundamentally do not even attempt to do cognitive work (the way a chess engine does by iterating over possible moves).

Also, LLM tools do not exist. All you can get is a sales demo for the company stock (the actual product being sold), built to impress how close to AGI the company is. You have to creatively misuse these things to get any value out of them.

The closest they get to tools is "AI coding", but even then, these things plagiarize code you don't even want plagiarized (because it's MIT licensed and you'd rather keep up with upstream fixes).

[–] diz@awful.systems 5 points 3 weeks ago* (last edited 3 weeks ago)

But just hear me out: if you delete your old emails, you won’t be roped into paying for extra space, and Microsoft or Google will have a little less money to buy water with!

Switch to Linux and avoid using any Microsoft products to conserve even more water.

[–] diz@awful.systems 3 points 1 month ago* (last edited 1 month ago)

Well yeah but the new age ones overthink everything. Edit: I suspect you could probably find one of them spelling it out.

[–] diz@awful.systems 16 points 1 month ago (6 children)

The problem is that to start breaking encryption you need quantum computing with a bunch of qubits as originally defined and not "our lawyer signed off on the claim that we have 1000 qubits".

[–] diz@awful.systems 4 points 1 month ago

I wonder if the weird tags are even strictly necessary, or if a sufficiently strongly worded and repetitive message would suffice.

[–] diz@awful.systems 4 points 1 month ago* (last edited 1 month ago) (2 children)

Embryo selection may just be the eugenicist's equivalent of greenwashing.

Eugenicists doing IVF is kind of funny, since it is a procedure that circumvents natural selection quite a bit, especially for the guys. It's what, something like a billion to one for the sperm?

If they're doing IVF, being into eugenics, they need someone to tell them that they aren't "worsening the species", and the embryo selection provides just that.

edit: The worst part would be if people who don't need IVF start doing IVF with embryo selection, expecting some sort of benefit for the offspring. With the American tendency to sell people unnecessary treatments and procedures, I can totally see that happening.

[–] diz@awful.systems 6 points 1 month ago* (last edited 1 month ago)

I think I have a real example. Non hierarchical (or, at least, less hierarchical) arrangements. Anarchy is equated with chaos.

We ascribe a hierarchy to anything in nature: ants and other hymenoptera and termites have supposed "queens", parent wolves are "alphas", and so on. Fictional ant-like aliens have brain bugs, or cerebrates, or the like. Even the fucking zombies infected with a variant of the rabies virus get alphas somehow.

Every effort has gone into twisting every view of reality and every fiction to align with the ideology.

[–] diz@awful.systems 9 points 1 month ago* (last edited 1 month ago)

I think it's a mixture of it being cosplay and these folks being extreme believers in capitalism, in the inevitability of it and impossibility of any alternative. They are all successful grifters, and they didn't get there through some scheming and clever deception, they got there through sincere beliefs that aligned with the party line.

They don't believe that anything can actually be done about this progression towards doom, just as much as they don't properly believe in the doom.

[–] diz@awful.systems 1 point 1 month ago

So it got them so upset presumably because they thought it mocked the basilisk incident, I guess with Roko as Laurentius and Yudkowsky as the other guy?

[–] diz@awful.systems 8 points 1 month ago* (last edited 1 month ago)

I’d say it's a combo of them feeling entitled to plagiarise people’s work and fundamentally not respecting the work of others (a point OpenAI’s Studio Ghibli abomination machine demonstrated at humanity’s expense).

It's fucking disgusting how they denigrate the very work on which they built their fucking business. I think it's a mixture of the two, though: they want it plagiarized so that it looks like their bot is doing more coding than it is actually capable of.

On a wider front, I expect this AI bubble’s gonna cripple the popularity of FOSS licenses - the expectation of properly credited work was a major aspect of the current FOSS ecosystem, that expectation has been kneecapped by the automated plagiarism machines, and programmers are likely gonna be much stingier with sharing their work because of it.

Oh absolutely. My current project is sitting in a private git repo, hosted on a VPS. And no fucking way will I share it under anything less than GPL3.

We need a license with specific AI verbiage. Forbidding training outright won't work (they just claim fair use).

I was thinking of adding a requirement that the license header must not be removed unless the tool includes a specific string ("This code was adapted from libsomeshit_6.23") in the comments, for the purpose of propagating security fixes and supporting a consulting market for the authors. In the US they do own the judges, but in the rest of the world the minuscule alleged benefit of not attributing would be weighed against harm to their customers (security fixes not propagated) and harm to the authors (missing out on consulting gigs).

edit: perhaps even an explainer that the authors see non-attribution as fundamentally fraudulent against the user of the coding tool: the authors of libsomeshit routinely publish security fixes, and the user of the coding tool, who has been defrauded into believing that the code was created de novo by the coding tool, is likely to suffer harm when hackers turn those published security fixes into exploits against unpatched copies (which wouldn't be possible if the code had in fact been created de novo).
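To make the attribution-string idea concrete, here's a toy checker. The string format and the library name are just the illustration from above, not any established convention:

```python
import re

# Hypothetical: matches attribution strings of the form proposed above,
# e.g. "This code was adapted from libsomeshit_6.23". Purely a sketch.
ATTRIBUTION_RE = re.compile(r"This code was adapted from (\w+)_(\d+)\.(\d+)")

def find_attributions(source: str) -> list[tuple[str, int, int]]:
    """Return (library, major, minor) for every attribution comment found."""
    return [(m.group(1), int(m.group(2)), int(m.group(3)))
            for m in ATTRIBUTION_RE.finditer(source)]
```

A downstream tool could then diff those `(library, version)` pairs against upstream release notes.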

[–] diz@awful.systems 11 points 1 month ago

I think provenance has value outside copyright... here's a hypothetical scenario:

libsomeshit is licensed under MIT-0. It does not even need attribution. Version 3.0 introduced a security exploit. It has been fixed in version 6.23 and widely reported.

A plagiaristic LLM with a training cutoff before 6.23 can just shit out the exploit in question, even though it has already been fixed.

A less plagiaristic LLM could RAG in the current version of libsomeshit, perhaps avoid introducing the exploit, and update the BOM with a reference to "libsomeshit 6.23", so that when version 6.934 fixes some other big bad exploit, an automated tool could raise an alarm.

Better yet, it could actually add a proper dependency instead of cut and pasting things.

And it would not need to store libsomeshit inside its weights (which is extremely expensive) at the same fidelity. It just needs to be able to shit out a vector database's key.
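The "automated tool could raise an alarm" step might look roughly like this (a toy sketch; a real tool would parse a proper SPDX or CycloneDX BOM, and all the names and versions here are the made-up ones from the scenario):

```python
# Hypothetical: known security-fix versions, keyed by library name.
# "libsomeshit" and 6.934 are illustrative, from the scenario above.
KNOWN_FIXES = {
    "libsomeshit": (6, 934),  # earliest version containing the fix
}

def parse_version(v: str) -> tuple[int, int]:
    """Parse "major.minor" into a comparable tuple."""
    major, minor = v.split(".")
    return (int(major), int(minor))

def stale_entries(bom: list[dict]) -> list[str]:
    """Names of BOM entries whose recorded version predates a known fix."""
    return [e["name"] for e in bom
            if e["name"] in KNOWN_FIXES
            and parse_version(e["version"]) < KNOWN_FIXES[e["name"]]]
```

A BOM entry recording "libsomeshit 6.23" would trip the alarm once the 6.934 fix lands in the table.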

I think the market right now is far too distorted by idiots with money trying to build the robot god. Code plagiarism is an integral part of it, because it makes the LLM appear closer to singularity (it can write code for itself! it is gonna recursively self-improve!).

[–] diz@awful.systems 13 points 1 month ago* (last edited 1 month ago) (4 children)

In case of code, what I find the most infuriating is that they didn't even need to plagiarize. Much of open source code is permissively enough licensed, requiring only attribution.

Anthropic plagiarizes it when they prompt their tool to claim that it wrote the code from some sort of general knowledge, that it "just learned from all the implementations", blah blah blah, to make their tool look more impressive.

I don't need that; in fact, it would be vastly superior to just "steal" from one particularly good implementation that has a compatible license you can simply comply with. (And better yet, to avoid copying the code and find a library if at all possible.) Why in the fuck even do copyright laundering on code that is under MIT or a similar license? The authors literally tell you that you can just use it.
