jray4559

joined 1 year ago
[–] jray4559@lemmy.fmhy.ml 14 points 1 year ago

You can use the Web Archive; if those posts are saved, you'll see them without ever touching a Reddit server. You can't comment or give awards or anything, but you're not locked out completely.

Or you can use a 3rd-party app. The team behind YouTube ReVanced has patched most of the major (Android) ones to still function afterwards. https://web.archive.org/web/20230705025421/https://www.reddit.com/r/AfterVanced/comments/14m3wgo/reddit_apps_for_which_patches_andor_patched/

[–] jray4559@lemmy.fmhy.ml 1 point 1 year ago

Centralization. Given enough time, Meta will extend the fediverse protocol with their own specific enhancements, locking most of us onto their version of the platform through sheer convenience. (Think of RCS as an example: it's technically an open standard, but Google has effective control of it thanks to special features.)

You might say, "Well, if they're a better platform, so what? Why not go there?" But the problem is that they can then cut off the rest of the fediverse as "outdated," effectively privatizing everybody's communities.

[–] jray4559@lemmy.fmhy.ml 11 points 1 year ago (1 children)

Ah, yes, I was waiting for the Taliban to tell me which one of these two online clusterfucks to start using.

I'm surprised they even care about Twitter, wouldn't they want to be underground?

[–] jray4559@lemmy.fmhy.ml 6 points 1 year ago* (last edited 1 year ago)

Yep, especially for young people in the United States.

People have become so attached to "one place for everything" that something as simple as blue/green bubbles (and the associated service differences) is enough to isolate people.

The messiah of maximal convenience makes differentiation harder and harder for everyone.

(And for everyone who says "Everyone here just uses Telegram/Whatsapp/QQ/Line, not a problem over here!", guess what, you're still probably gonna be exclusionary to other services just like the blue/green folks. It just doesn't happen to affect you.)

[–] jray4559@lemmy.fmhy.ml 1 point 1 year ago (1 children)

I hope to god you are right. What would truly be a revolution is if these models could somehow be transitioned to run on CPUs rather than GPUs without completely tanking performance. Then we can start talking about running them on phones and laptops.

But I don't know how much more you can squeeze out of the LLM stone. I'm surprised that what we got was essentially a brute-forcing of concepts with massive catalogs of data, rather than something more hand-crafted and built from scratch. Maybe there is another way to go about it? God I hope so, so OSS can use it before the big guys convince governments to drop the hammer.

[–] jray4559@lemmy.fmhy.ml 2 points 1 year ago (2 children)

Good. Hell, once the user-level instance blocking feature comes out, give people a list and instructions to block those political instances, porn ones, etc. I'd always rather it be at the user level than have some overarching force telling me what I can and can't view. Freedom to make the decision for yourself should be a large priority.

[–] jray4559@lemmy.fmhy.ml 1 point 1 year ago (8 children)

I would have preferred a no-defederation policy (except for maybe Threads, due to EEE fears), but as long as it's not swung around constantly, I'm okay with it.

The two that are blocked are political circlejerks that I personally dislike anyway. I'm hopeful there won't be more.

[–] jray4559@lemmy.fmhy.ml 8 points 1 year ago* (last edited 1 year ago) (6 children)

And I disagree with it too. It's not because of how good the models are in technical terms; the corporate juggernauts are only just ahead of OSS on that front. It's server space, and the money to acquire it, that is the moat.

An average internet user will not install the Vicunas and the Pygmalions and the LLaMAs of the LLM space. Why?

For one, the field is too complicated to get into, but, more importantly, a lot of people simply can't.

Even the lowest-complexity models require a PC plus a graphics card with a fairly beefy amount of VRAM (6GB at the bare minimum), and the ones that can go toe-to-toe with ChatGPT are barely runnable on even the most monstrous of cards. No one is gonna shell out 1500 bucks for a 4090 just so they can run Vicuna-30B.
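To put rough numbers on that: a back-of-the-envelope VRAM estimate is just parameter count times bytes per parameter, plus some overhead for activations and the KV cache. The overhead factor and quantization levels below are illustrative assumptions, not exact figures for any specific model:

```python
# Rough VRAM estimate for running an LLM locally.
# The 1.2x overhead factor is an assumption covering activations/KV cache.

def vram_gb(n_params_billions: float, bytes_per_param: float,
            overhead: float = 1.2) -> float:
    """Approximate GPU memory needed, in GB."""
    return n_params_billions * bytes_per_param * overhead

for name, params in [("7B", 7), ("13B", 13), ("30B", 30)]:
    fp16 = vram_gb(params, 2.0)   # full 16-bit weights
    q4 = vram_gb(params, 0.5)     # 4-bit quantized weights
    print(f"{name}: ~{fp16:.0f} GB at fp16, ~{q4:.0f} GB at 4-bit")
```

Even generously quantized, a 30B model wants far more memory than a typical consumer card has, which is the point: the hardware floor alone keeps average users on hosted services.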

They're gonna use online, free-to-use, no-BS, no-technical-jargon LLM services. All the major companies know that.

ChatGPT and services like it have set the expectation: "just type it in, get an amazing response in seconds, no matter where you are."

OSS can't beat that, at least not right now. And until it can, the 99% will be in Silicon Valley's jaws.

[–] jray4559@lemmy.fmhy.ml 1 point 1 year ago

Because an ad or a subscription is more obvious.

Algorithms are harder to prove and don't interrupt the flow of content, so fewer people get pissed, which means fewer people leave, and they can charge higher rates to advertisers.

[–] jray4559@lemmy.fmhy.ml 14 points 1 year ago (1 children)

Most lurkers view the Twitter/TikTok reposts Reddit is full of, which have not slowed down, or the ""advice"" subreddits, which haven't either.

Some of the content that people like us enjoy has moved away, but the people willing to chase it are a very small minority.

[–] jray4559@lemmy.fmhy.ml 64 points 1 year ago (20 children)

This gets made back by September.

95% of people who use Reddit use the official app or website, and they won't notice a single thing except the occasional stray John Oliver meme.

Not enough hobby communities left.
