OmnipotentEntity

joined 2 years ago
[–] OmnipotentEntity@beehaw.org 1 points 1 week ago

My understanding is that the Gobi Desert has historically been expanding due to desertification of the surrounding grasslands, and the project to plant trees in this area was to halt or reverse this process. In other words, the ecological destruction was already occurring.

But double check me.

[–] OmnipotentEntity@beehaw.org 0 points 1 week ago (1 children)

AI is something to be genuinely worried about. LLMs are a bubble, yes. They're not great at what they've been shoehorned into, yes. They're a disaster from a security standpoint because of this, yes.

But LLMs are not the totality of deep learning or computer AI, and may not even be the future. We have our artificial inanity bots that do the spicy text autocomplete right now, but that's not going to be forever, no matter what Microsoft, Google, and NVidia want you to think.

Artificial intelligence is a problem that has a solution. I cannot tell you how long it will take, but I would be very surprised if I wasn't personally around to see it, which means we need to begin preparing for it now. We need to build systems and structures now that don't carry the inevitable logic that if the riff raff are unnecessary for those in power, then they should be kept docile, ignorant, powerless, and ignored at best, and purged at worst.

[–] OmnipotentEntity@beehaw.org 1 points 2 weeks ago (1 children)

I think the point is the Democratic party is excluding themselves, and trying to go through the Democratic party apparatus to counter the Republicans is doomed. The only way to stop them is to sidestep the Dems.

[–] OmnipotentEntity@beehaw.org 2 points 3 months ago (1 children)

Oh man, I completely didn't think about maintenance. Yeah, a data center will typically have several hard drives swapped per day. You'd have to have life support and a staff up there, as well as frequent resupply trips.

[–] OmnipotentEntity@beehaw.org 5 points 3 months ago

It turns out that, for the values we are talking about here, it actually more or less does! A lemon has a pH of around 2.5, while "Flow" has an advertised pH of 8.1. This means that to neutralize 1 L of this water you need approximately 0.4 mL of lemon juice, or about 8 drops / half a gram. It's hard to tell how much a "spritz" is intended to be, but a single lemon contains about 60 mL of juice, so this represents about 0.67% of the total juice inside.

It's a surprising consequence of using a logarithmic scale for pH.
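If you want to check the arithmetic yourself, here's a minimal sketch. It assumes the lemon juice acts like a strong acid at pH 2.5 and the water like a strong base at pH 8.1 (real buffering would shift the numbers somewhat), and takes ~0.05 mL per drop:

```python
# Back-of-envelope acid-base neutralization, assuming ideal strong
# acid/base behavior (no buffering) and a ~0.05 mL drop size.

water_ph = 8.1
lemon_ph = 2.5

oh_conc = 10 ** -(14 - water_ph)  # mol/L of OH- in the alkaline water
h_conc = 10 ** -lemon_ph          # mol/L of H+ in lemon juice

# moles of OH- in 1 L of water, divided by the juice's H+ concentration
juice_litres = (oh_conc * 1.0) / h_conc
juice_ml = juice_litres * 1000
drops = juice_ml / 0.05

print(f"{juice_ml:.2f} mL of lemon juice (~{drops:.0f} drops)")
# -> 0.40 mL of lemon juice (~8 drops)
```

The counterintuitive part is that pH 8.1 is only about 1.3 orders of magnitude away from neutral, while pH 2.5 is 4.5 orders away, so a tiny volume of the acid overwhelms a whole litre of the base.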

[–] OmnipotentEntity@beehaw.org 3 points 3 months ago (3 children)

I am a bit late to this party, but I thought I'd piggyback on your comment to halfway address it using math.

We want to run data centers cool. This means keeping the center itself as close to 20°C as possible.

If we lose our convection and conduction then our satellite can only radiate away heat. The formula governing a black body radiator is P = σAT^4. We will neglect radiation received, though this is not actually a negligible amount.

If we set T = 294 K (roughly 20°C), then we have P/A = σT^4 ≈ 423.6 W/m^2.

According to an article I found on the Register from this April:

According to Google, the larger of the two offered pods will consume roughly 10 megawatts under full load.

This would imply a radiator surface area of at least 23,600 m^2, or about 5.8 acres.
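The sizing above can be sketched in a few lines. This is the same idealized model as the comment: a perfect black body (emissivity 1), no absorbed sunlight or Earthshine, and the ~10 MW pod figure quoted from Google:

```python
# Stefan-Boltzmann radiator sizing for a ~10 MW orbital data center pod.
# Idealized: emissivity = 1, no incoming radiation, single-sided area.

SIGMA = 5.670e-8  # Stefan-Boltzmann constant, W/(m^2 K^4)
T = 294.0         # ~20 C in kelvin, rounded as in the comment
P = 10e6          # pod power draw under full load, W

flux = SIGMA * T ** 4           # W/m^2 radiated per unit area
area_m2 = P / flux              # minimum radiating area
area_acres = area_m2 / 4046.86  # 1 acre = 4046.86 m^2

print(f"{flux:.1f} W/m^2 -> {area_m2:.0f} m^2 ({area_acres:.1f} acres)")
```

Note the fourth-power dependence on temperature: letting the electronics run hotter shrinks the radiator dramatically, which is one reason real spacecraft radiators run much warmer than 20°C.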

I don't know how large, physically, such a pod would be. But looking at the satellite view of a Google data center in Ohio that I could find, the total footprint of one of the large buildings of their data centers is ballpark in that range. I don't know how many "pods" that building contains.

So it's not completely outside of the realm of possibility. It's probably something that can be engineered with some care, despite my earlier misgivings. But putting things in orbit is very expensive, and latency is also a big factor. I can't think of any particular practical advantages to putting this stuff into orbit other than getting them out of the jurisdiction of governments. (Not counting the hype and stock song and dance from simply announcing you're going to set a few billion dollars on fire to put AI into space.)

[–] OmnipotentEntity@beehaw.org 3 points 4 months ago (3 children)

Is... that a reference to Frankie and Johnny's???

Um... Hello, fellow person who grew up in or around New Orleans in the 1990s. How are things? Where did you wind up after Katrina? I have no idea how to handle someone just randomly referencing commercials I had completely forgotten about.

[–] OmnipotentEntity@beehaw.org 22 points 4 months ago

Am I going crazy? One toothbrush for every person. I'm a monogamuggle, and I don't even share a toothbrush regularly with my wife. If you insist on the counter being tidy, put them in a drawer. Also prevents aerosolized fecal matter from getting on the toothbrush head.

[–] OmnipotentEntity@beehaw.org 2 points 5 months ago (3 children)

That's a deep cut reference. 👏

 

Abstract:

Hallucination has been widely recognized to be a significant drawback for large language models (LLMs). There have been many works that attempt to reduce the extent of hallucination. These efforts have mostly been empirical so far, which cannot answer the fundamental question whether it can be completely eliminated. In this paper, we formalize the problem and show that it is impossible to eliminate hallucination in LLMs. Specifically, we define a formal world where hallucination is defined as inconsistencies between a computable LLM and a computable ground truth function. By employing results from learning theory, we show that LLMs cannot learn all of the computable functions and will therefore always hallucinate. Since the formal world is a part of the real world which is much more complicated, hallucinations are also inevitable for real world LLMs. Furthermore, for real world LLMs constrained by provable time complexity, we describe the hallucination-prone tasks and empirically validate our claims. Finally, using the formal world framework, we discuss the possible mechanisms and efficacies of existing hallucination mitigators as well as the practical implications on the safe deployment of LLMs.

 

You might know the game under the name Star Control 2. It's a wonderful game that involves wandering around deep space, meeting aliens, and navigating a sprawling galaxy while trying to save the people of Earth, who are being kept under a planetary shield.

 

Subverting Betteridge's law of headlines. Yes.

 

Sometimes, because I am ancient, I automatically type in www. before I type in beehaw.org into my address bar. It would be nice and comfy to have a www CNAME record pointing at beehaw.org instead of the name just completely failing to resolve.

1
submitted 2 years ago* (last edited 2 years ago) by OmnipotentEntity@beehaw.org to c/gaming@beehaw.org
 

the Logitech F710 is a solid controller to get if you’re on a tight budget, but perhaps not exactly the type of equipment you want to stake your life on. [...] Reviewers on sites like Amazon frequently mention issues with the wireless device's connection.

The reporter, who followed an expedition of the Titan from the launch ship, wrote that “it seems like this submersible has elements of MacGyver jerry-riggedness.”
