BranBucket

joined 2 years ago
[–] BranBucket@lemmy.world 1 points 7 hours ago* (last edited 7 hours ago)

Bailouts happen when the overall economic effect of letting a company fail would be more painful than using tax dollars to save it.

The number of employees at the institution in trouble is a factor, as it was with some of the banks in 2008, but think of all the wild deals AI companies are making for memory, hard drives, and data center construction. Each of those adds to the bottom line of other companies and their suppliers and subcontractors. If those contracts aren't honored, many of those smaller companies lower down on the food chain are going to suffer, possibly go out of business, putting their employees out of work. And that's not even considering the effect it'll have on hedge funds and other financial institutions that hold stock in AI companies.

If it seems insane for AI companies to commit to huge purchases and massive projects before they have a clear revenue stream, it's probably because it is, but if they can inflate the bubble enough they effectively hold both the US and global economy hostage. Since the US government clearly favors business over people, and so many tech companies have swung right to appease the orange lump currently in office, they stand a pretty good chance of getting bailed out and later subsidized in the event of a crash.

[–] BranBucket@lemmy.world 13 points 21 hours ago

Algorithm-based, ad-supported social media is a public health crisis that damages people of all ages. It should be destroyed. At that point we don't have to worry about its effect on kids, or about them using VPNs to circumvent age restrictions.

Seems like a more effective solution to me.

[–] BranBucket@lemmy.world 7 points 1 day ago* (last edited 1 day ago)

Gamergate was mostly fabricated and an intentional trial run for recruiting younger generations into movements that would transition to outright fascism. Pizzagate and Q-anon conspiracies were intentionally started to reduce the fallout from the Epstein files... both of these operations worked far, far better than the people behind them expected.

EDIT: I almost forgot: one reason many Democrats keep moving right and away from their base, despite all the evidence that this is costing them votes, is that their post-mortems on Gamergate and Q-anon revealed how wildly effective these operations were, and they think they can win with the same play at some point in the future.

There is an ongoing conspiracy to erode the public education system in the US to perpetuate and increase the advantages the children of the wealthy already have. Contrary to what most people believe, the introduction of more technology in the classroom is just one part of this effort, and only leads to "better outcomes" because those outcomes are defined as being a more efficient drone from sector 7-C.

The continued effort to represent LLMs as something closer to general AI is so that at some point an improved version can be presented as having "reasoned" that humanity will prosper if we perpetuate the current system and give tech billionaires more money. The main reason it's failing at present is that many people have noticed a larger-than-normal number of folks are not, in fact, prospering, and feel threatened by what is basically three high-powered grammar checkers in a trench coat. If the economy picks up before the AI bubble bursts, this may yet succeed.

[–] BranBucket@lemmy.world 8 points 2 days ago (2 children)

My assumption is that the goal of most modern tech giants is to become "too big to fail" and be bailed out like many banks were post 2008.

[–] BranBucket@lemmy.world 1 points 2 days ago

I feel like the relevant text for this moment would be The Subprime Attention Crisis by Tim Hwang.

AI was just a blip on the radar when it was written, but I felt he did a good job illustrating how tech bubbles form and are propped up, using examples from the 2008 housing crisis.

[–] BranBucket@lemmy.world 3 points 4 days ago* (last edited 4 days ago) (1 children)

In light of everything else that's going on it's a distinct possibility.

But I commented because the tweet and title of this post seemed to imply that good food for troops can only mean war. That's simply not the case. There are a number of reasons you'd want to improve morale with a (comparatively) nice meal. War is only one of them.

[–] BranBucket@lemmy.world 3 points 4 days ago

When smart home thermostats and light switches were still a new thing, I used to talk about "Jurassic Park tech": so preoccupied with whether or not they could that nobody stopped to think if they should. And that's even more the case with AI.

At some point I think this gets to be like S. M. Stirling's Emberverse, where modern tech stops working and people who know how to make traditional wooden bows become an extremely valuable resource. Except it'll be having some old-timer on hand who's able to handle logistics with just a spreadsheet, a Rolodex, and a calendar that's going to make or break companies.

[–] BranBucket@lemmy.world 35 points 5 days ago* (last edited 5 days ago) (11 children)

I'm not saying this isn't a sign of something, but Surf and Turf nights are fairly common at overseas bases. Sometimes once a week.

EDIT: Pie is pretty much nightly anywhere with a real kitchen.

[–] BranBucket@lemmy.world 10 points 6 days ago* (last edited 6 days ago) (1 children)

Helps if your worldview is shaped by people who are continually telling you that he's a figure of manliness, while constantly playing off your fear and insecurity for profit and political gain.

EDIT: It's also a very unrealistic standard of what it means to be a man, almost a caricature. It's a masculine ideal rooted in myths of the "common-sense everyman hero" whose wisdom is more valuable than others' knowledge, whose instincts are more accurate than others' intellect, and whose cunning can overcome others' skill. He's a man whose anger and will can overcome insurmountable odds, and whose "rough-around-the-edges" personality is more attractive than practiced social graces. There's no need for growth or change, because our hero needs nothing more than the innate qualities he already possesses to thrive, and because of that, failure is never his fault.

It's the aesthetic of romanticized cowboys, gangsters, renegade cops, and retired spec-ops: a domesticated version of 1980s gritty action-hero masculinity adapted for group membership. It's always framed as bucking authority and going against the grain, even though conformity is required to be one of the "good guys." Being low on the totem pole lets you claim the virtue of being a simple man with a simple life, or the trusted sergeant "who really makes things happen around here," while never meaning you're incapable of rising to any occasion, because you have "guts." Washboard abs or hard work matter only when they can be used to show how weak and ineffective your opponents are. What's really important is being able to dismiss, demean, deny, and destroy anything that doesn't conform to the right way of doing things, to gain the accolades you're due and save yourself the embarrassment of admitting your hero fantasies aren't true.

Again, it's ridiculous to apply this ideal to the bloated orange, but it's an image he cultivates. His personal mythology is that he embodies that ideal and has gained massive success in every endeavor because of it. It allows his failings to be used as evidence that he's "just like us," rather than as examples of his overall lack of redeeming qualities.

But I wouldn't expect it to ever really make sense, because at its core it's just a bullshit justification for getting whatever they want with as little effort as possible.

[–] BranBucket@lemmy.world 4 points 6 days ago

As a philosophical stance, I feel humans should use tools, not the other way around. AI is a tool that uses those who attempt to use it.

AI "art" as most people understand it perverts the natural relationship between artist and medium. It inverts it, using the human to give it the one thing it cannot generate, an idea, then produces an approximation of "art." A satisfying result with an AI-generated image demonstrates a lack of vision on the part of the user, who was likely never really clear on what they wanted, not the power of the generative model.

Asking AI for answers or for an overview of a subject seems harmless, but it can't be trusted to understand the unique context and needs of each user, or to highlight which details are truly pertinent in that place or time. Again, it inverts the relationship between human and information, even if what has been generated is factually correct. It over-simplifies relationships and concepts in ways that are dangerous at a time when nuance has been systematically stripped from public discourse for decades. We need information to decide how to act in a given context; AI seems to attempt to change our understanding of that context to match the information it provides.

It's necessary to accept that you don't have complete control over the world around you, but that doesn't mean we should accept a lack of control over our own understanding of that world.

[–] BranBucket@lemmy.world 34 points 6 days ago (8 children)

Fear.

They rally around figures who look strong because they're afraid of losing social status, xenophobic out of racism, or insecure over economic concerns. As always, everything is projection with these dipshits. They let fear drive them into the sort of groupthink mob they accuse liberals of being.


SMP Selle TRK medium. Super comfy. Best decision I've made since buying the bike.
