[-] Hackworth@lemmy.world 1 points 16 hours ago* (last edited 16 hours ago)

What if we sell kids their comfort zones? Oh! Or better yet, rent them.

[-] Hackworth@lemmy.world 3 points 19 hours ago* (last edited 19 hours ago)

Honestly, I'd get on board with just about any time from 2000 to 2010. The enshittification of the internet and social-media-driven comment culture didn't start in earnest until smartphones took off.

[-] Hackworth@lemmy.world 6 points 19 hours ago

An armageddon half full kinda guy, I see. I like that.

[-] Hackworth@lemmy.world 2 points 20 hours ago

The year 2000 was peak human technology. It's been downhill in every way since, until generative AI - which is f'in amazing. But let's be real, the future belongs to the bots.

[-] Hackworth@lemmy.world 1 points 20 hours ago* (last edited 20 hours ago)

I appreciate one of the most concise explanations of that perspective I've ever read! This is actually the one I'd like to believe, but not the one I do. I disagree with the idea that "both sides are the same," but I won't go so far as to imagine Democrats are truly concerned with integrity to the degree that they'd sacrifice strategy for it. I'm afraid they're just people, and people are all fucking stupid in their own way. It's just that some are fucking stupid and malicious.

[-] Hackworth@lemmy.world 6 points 20 hours ago

You believe it's the Dems who do the sabotaging, and that they are compromising to...themselves? Interesting.

[-] Hackworth@lemmy.world 8 points 21 hours ago

Do you not believe they're preparing an amendment, or do you not believe it will pass?

[-] Hackworth@lemmy.world 25 points 22 hours ago

The tax situation for U.S. territories is complex and varies between them. Here's a brief overview:

Puerto Rico and U.S. Virgin Islands:

Residents generally do not pay federal income tax on local income. They do pay federal payroll taxes (Social Security and Medicare). They have their own tax systems.

Guam and Northern Mariana Islands:

Have a "mirror" tax system that parallels the U.S. federal tax system. Residents pay taxes to the territorial government instead of to the U.S. federal government.

American Samoa:

Has its own tax system. Residents generally do not pay federal income taxes.

However, there are exceptions:

Federal employees in these territories generally pay federal income tax. Residents with income sources from the U.S. mainland may need to pay federal taxes on that income. Some residents may need to file U.S. tax returns under certain circumstances.

It's important to note that while residents of these territories may not pay federal income taxes in many cases, they also don't have full representation in Congress and can't vote in U.S. presidential elections (though they may participate in primary elections).

[-] Hackworth@lemmy.world 138 points 1 day ago

On the way out of the office, I told my new coworker "Have a good Independence Day," and he responded without missing a beat, "You too, it's the last one!" lol, fuk

[-] Hackworth@lemmy.world 9 points 1 day ago

Mildreds out here, tryin ta take 'Mildred' back, and y'all throwin shade like people ain't have to live with dumbass fuckin names like Mildred. Leave Mildreds alone! They've got enough to deal with. Wait, they chose Mildred, like as an adult? In the present? Pfft.

[-] Hackworth@lemmy.world 3 points 1 day ago

Claude, re-write my post.

Sure thing, boss!

Generative AI fascinates me, and I use it frequently for both work and leisure. This photo, however, is the first to make me feel a twinge of sadness, as it reminds me that I now have to question the authenticity of everything I see.

Then I recall that even traditional photographs often undergo extensive post-processing, which somewhat alleviates my concern. To distract myself, I create an AI-generated video of the river flowing, momentarily forgetting my worries.

However, I can't help but notice that many of these AI generations are of poor quality. Generative AI is simultaneously wonderful and troubling - a double-edged sword of technology.

Despite these mixed feelings, I must admit this is a beautiful photo.

[-] Hackworth@lemmy.world 4 points 1 day ago* (last edited 1 day ago)

I like to think about the spacefaring AI (or cyborgs, if we're lucky) that will inevitably do this stuff in our stead, assuming we don't strangle them in the cradle.

76 points, submitted 1 month ago* (last edited 1 month ago) by Hackworth@lemmy.world to c/technology@lemmy.world

Also see: Worldwide Federated Training of Language Models

Claude's Summary:

The two papers, "Worldwide Federated Training of Language Models" by Iacob et al. and "The Future of Large Language Model Pre-training is Federated" by Sani et al., both propose using federated learning (FL) as a new paradigm for pre-training large language models (LLMs). The main ideas are:

  1. FL allows leveraging more data and compute resources from multiple organizations around the world, while keeping the data decentralized and private. This can enable training larger LLMs on more diverse data compared to centralized training.

  2. FL relaxes synchronization requirements and reduces communication overheads compared to data-parallel distributed training, making it feasible for geographically distributed participants with varying hardware and connectivity.

  3. The papers present systems and algorithms for enabling efficient federated pre-training of LLMs at billion-parameter scales. Key techniques include allowing participants to modulate their amount of local training based on resource constraints, and partially personalizing models to clusters of participants with related data.

  4. Experimental results show federated LLM pre-training can match or exceed centralized training performance, with the performance gap narrowing as model size increases to billions of parameters. Larger federated models also converge faster and are more robust.

  5. Challenges include data and hardware heterogeneity across participants. The papers propose techniques like adaptive aggregation and load balancing to mitigate these issues.

In summary, the papers argue federated learning is a promising new direction for democratizing LLM pre-training by allowing many more organizations to collaboratively train large models on their combined data and compute resources. Let me know if you would like me to expand on any part of the summary or papers in more detail.
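
Since the summary stays fairly abstract, here's a minimal FedAvg-style sketch of the core idea: each participant runs a different amount of local training based on its resources, and the server aggregates the results weighted by local data size. This is a toy NumPy illustration under my own assumptions, not the actual systems from Iacob et al. or Sani et al.; `local_train`, `federated_round`, and the synthetic linear-regression clients are made up for the example.

```python
# Toy FedAvg-style round: variable local steps per participant,
# server aggregation weighted by local dataset size.
import numpy as np

def local_train(weights, data, steps, lr=0.01):
    """Hypothetical local update: a few SGD steps of linear regression on (X, y)."""
    X, y = data
    w = weights.copy()
    for _ in range(steps):
        grad = X.T @ (X @ w - y) / len(y)  # gradient of mean squared error
        w -= lr * grad
    return w

def federated_round(global_w, participants):
    """One communication round: broadcast, heterogeneous local training, weighted average."""
    updates, sizes = [], []
    for data, steps in participants:        # each participant chooses its own step budget
        updates.append(local_train(global_w, data, steps))
        sizes.append(len(data[1]))
    sizes = np.array(sizes, dtype=float)
    return np.average(updates, axis=0, weights=sizes / sizes.sum())

# Synthetic example: three "organizations" with different data sizes and compute budgets.
rng = np.random.default_rng(0)
true_w = np.array([2.0, -1.0])

def make_client(n):
    X = rng.normal(size=(n, 2))
    return (X, X @ true_w + 0.1 * rng.normal(size=n))

participants = [(make_client(200), 50), (make_client(50), 10), (make_client(400), 80)]
w = np.zeros(2)
for _ in range(20):
    w = federated_round(w, participants)
print("recovered weights:", w)  # should approach [2, -1]
```

The papers' actual contribution is doing something like this at billion-parameter scale with partial model personalization, adaptive aggregation, and load balancing; the sketch only shows the basic broadcast/local-train/aggregate loop.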

297 points, Good job! (lemmy.world)
