istewart

joined 1 year ago
[–] istewart@awful.systems 13 points 1 month ago

I would say that the in-group jargon is more of a retention tactic than an attraction tactic, although it can become one for people who are desperately looking for an ordered view of the world. Certainly I've seen it a lot in recovering Scientologists, who describe how that edifice of jargon, colloquialisms, and redefined words shaped their worldview and how they related to other people. In this case, if you've been nodding along for a while and want to continue to be one of the cool guys, how could you not glomarize? Peek coolly out from beneath your fedora and neither confirm nor deny?

I will agree that the ratsphere has softer boundaries and is not particularly competently managed as a cult. As you also allude to, there isn't a clear induction ritual or psychological turning point, just a mass of material that you're supposed to absorb and internalize over a necessarily lengthy stretch of time. Hence the most clearly identifiable cults are splinter groups.

[–] istewart@awful.systems 14 points 1 month ago (8 children)

I understand where he probably got the neologism "glomarize" from (https://en.wikipedia.org/wiki/Glomar_Explorer), but his willingness to beat you in the face with it until you accept it is a big part of what makes his writing style so off-putting. And, uh, this level of enthusiasm for specialized jargon continues to fail to overcome the cult allegations.

[–] istewart@awful.systems 10 points 1 month ago (1 children)

Two cases that are a bit less clear, but still show some structural similarities: Peter Thiel, Sam Altman.

A bit less clear because these two have a bit more money to throw around, and damn, would it be good if they started throwing some in our direction again.

[–] istewart@awful.systems 3 points 1 month ago

Probably ought to apply real bleach should you discover one languishing nonfunctionally in the back of a Goodwill a couple years from now - the form factor invites some unsanitary possibilities (as the comment below has already pointed out).

[–] istewart@awful.systems 10 points 1 month ago (1 children)

"Agentic" is meant to seem sci-fi, but I can't help but think it's terminal business-speak. It's the clearest statement yet of the attempted redesign of the computer from a personal device to a distinct entity separate from oneself. One is no longer a user or administrator, one is instead passively waiting for "agents" to complete a task on one's behalf. This model is imposed from the top down, to be the strongest reinforcement yet of the all-important moat around the big vendors' cloud businesses. Once you're in deep with "agents," your workflows will probably be so hopelessly tangled, vendor-specific, and non-debuggable/non-reimplementable that migrating them to another vendor would be a nightmare task orders of magnitude beyond any database or CRM migration. If your workflows even get any work done anymore at all.

[–] istewart@awful.systems 3 points 1 month ago (1 children)

Oh joy, I can perform a threat display by twirling it around my head like a bolo. I think I will get the pink or bright yellow one

[–] istewart@awful.systems 6 points 1 month ago

ok, cool. when does he start selling off all the super limited-edition anime waifu merch? asking for a friend

[–] istewart@awful.systems 7 points 1 month ago

‘Genetic engineering to merge with machines’ is both a stream of words with negative meaning

Iron-compatible osteoblasts that build bio-steel! Synapses with silicon, no, make that graphene neurotransmitter filters in the gap! C'mon, Sam, hire me and we can technobabble so much harder than this!

I must insist on cash payment, though. No stock options. And I prefer to be paid weekly.

[–] istewart@awful.systems 6 points 1 month ago

Occasionally I feel that Altman may be plugged into something that’s even dumber and more under the radar than vanilla rationalism.

I think he exists in the tension between rationalism/transhumanism and what he can get away with selling to the public, and that necessarily means his schtick appears dumber and more incoherent. He's essentially got two major groups he's trying to manipulate simultaneously: true believers and those who have yet to be persuaded. As he runs out of hype on the public-facing side, it's suddenly a desperate scramble to keep the true believers who make up the bulk of his workforce on board. Hence his pivot to marketing his latest and by far most important product: publicly traded shares in OpenAI.

Apropos of nothing, L. Ron Hubbard died in a dingy trailer in Creston. Ever been to Creston? It's a long ways from Hollywood.

[–] istewart@awful.systems 4 points 1 month ago

you know, if those ASML folks in dutchland weren't quite so busy what with their EUV lasers and all that, we might not be in quite this same pickle right now.

[–] istewart@awful.systems 6 points 1 month ago* (last edited 1 month ago)

Broke: Wyoming gold mine

Woke: Wyoming bison ranch

Bespoke: Wyoming AI lab

[–] istewart@awful.systems 2 points 1 month ago (1 children)

All three of the major Japanese manufacturers (Casio, Seiko, Citizen) have solar-powered radio-sync models, but so far Casio has been the best in my experience and has the widest range of models. The Casios tend to have an auto-DST setting that relies on an internal calendar as well as the time signal. I have a chonky Seiko solar-atomic pilot's watch (with rotary slide-rule bezel!), but it doesn't have auto-DST, so I have to bounce it back and forth between time zones. It also doesn't seem to be as adept at receiving the WWVB signal as my Casios; it needs to be next to a window, while the Casios don't seem to care as long as there's not too much building mass to the east. I haven't had a chance to try a Citizen yet, but they now have solar-atomic moon-phase watches, which is tempting.
