They started building the machine in 2055 and didn't have time to invent anything else. Turns out focus is pretty important for fast R&D...
Sekoia
This looks like Wafrn, which has a feature to replace the word "AI" with whatever you want. Probably what was done here!
If you're lucky estrogen gives you that too! I got a little bit + a bit of boobs (not nearly done yet, luckily tho)
I wasn't jacked (at all), but estrogen completely melted my muscle mass, it's great
As someone who wore men's jeans and now wears women's... even women's jeans with pockets have ridiculously small pockets compared to men's. It's really not just a matter of choice, because a majority of the choices just don't exist.
I can't tell what this is trying to say. Cus they had a fuckton of those videos already?
I would agree, but it's been less than a year since he was shot, so the jury's still out on which way his legacy will go.
They automatically get your approximate location from your IP, and some websites do need your precise location.
They don't need it, but google chrome sure gets it!
What the others said is true, but:
- Exercise is just good in general, but also burning fat will burn fat from the "non-fem" areas, which won't get replenished
- I heard salmon is good? Because of the fat? Not 100% sure
- I've got 70A/B (EU sizing) in 10 months, so you've got more than me there :P but anyway breast growth is just genetics + nutrition + time, don't focus too much on it
- I found painted nails help and are fun? I like black because it's the default "alt" color, so ppl can see me as a guy and still not be weirded out too much
To Cheney? That's wild, thanks
I understand the waterboarding, but what's with the shotgun?
Yes! The way those physics models are created is so cool. The article somewhat explains it, but it's mostly a fluff-piece for things unrelated to genAI. More in-depth:
The physically accurate simulation is great but slow. So we create a neural network (they come in a huge variety of shapes), show it a snapshot of the physics, and have it guess what things will look like in, say, 1ms. We train it to improve on billions of examples like this, and eventually its guesses become "good enough" in most cases. By doing those 1ms steps in a loop, we get a full simulation. And because we chose the network's shape, we can pick one that's quite fast to compute, and now we have a less accurate but much faster simulation.
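A toy sketch of that train-a-stepper-then-loop idea (the "physics" here is just exponential decay, and the "network" is a single learned coefficient; all numbers are made up for illustration):

```python
# Toy sketch, NOT a real physics engine: learn a 1-step surrogate
# for exponential decay dx/dt = -K*x, then roll it out in a loop.
import random

random.seed(0)
K, DT = 0.5, 0.001  # decay rate and 1ms step size (assumed values)

def true_step(x):
    # "Expensive" ground-truth simulator: one exact 1ms update
    return x * (1 - K * DT)

# Surrogate: a single learned weight w, so x_next ≈ w * x.
w = 1.0
lr = 0.1
for _ in range(10_000):          # stand-in for "billions" of examples
    x = random.uniform(0.0, 10.0)
    target = true_step(x)        # ask the slow simulator for the answer
    pred = w * x                 # the surrogate's guess
    # Nudge w to make the guess better (normalized gradient step)
    w -= lr * 2 * (pred - target) * x / (x * x + 1e-9)

# Now roll the cheap surrogate out for a full simulation.
x = 1.0
for _ in range(1000):            # 1000 steps * 1ms = 1 second
    x = w * x
# x ends up close to the exact solution exp(-K * 1.0)
```

A real surrogate replaces the single weight with a deep network and the decay law with a fluid or cloth solver, but the structure (train on one-step pairs, then iterate) is the same.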
The really cool thing is that sometimes, these models are better than the more expensive physics simulation, probably because real physics is logical and logical things are easier to learn.
We've done things like this for ages. One way to improve them is to feed them multiple time steps at once. Unfortunately, they kinda suck at seeing connections over time, so this gets expensive. Luckily, transformers were invented! That's a neural network shape that is really good at seeing connections along one dimension, like time, while still being pretty cheap and really easy to run in parallel (which is how you go fast nowadays).
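The core trick inside a transformer is self-attention. A minimal sketch with NumPy (sizes and weights are arbitrary here; real models learn the projection matrices):

```python
# Minimal self-attention over a sequence, where the "one dimension"
# is time. Every time step scores every other step directly, and the
# matrix products below parallelize well on GPUs.
import numpy as np

rng = np.random.default_rng(0)
T, d = 8, 4                      # 8 time steps, 4 features per step
x = rng.normal(size=(T, d))      # e.g. 8 snapshots of a simulation

# Learned projections (random stand-ins here) turn each step into a
# query, key, and value vector.
Wq, Wk, Wv = (rng.normal(size=(d, d)) for _ in range(3))
Q, K, V = x @ Wq, x @ Wk, x @ Wv

# Each step scores every other step, the scores are softmaxed into
# weights, and the values get mixed accordingly — all in parallel.
scores = Q @ K.T / np.sqrt(d)    # (T, T): step-to-step affinities
weights = np.exp(scores - scores.max(axis=1, keepdims=True))
weights /= weights.sum(axis=1, keepdims=True)
out = weights @ V                # (T, d): each step now "sees" all others
```

Note the `(T, T)` score matrix: that's why long sequences used to be expensive, and why scaling attention to huge contexts was such a big deal.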
With a bunch of extra wiring, transformers also become GPT, i.e. text-based AIs. That's why those suddenly got way better: they went from being able to see connections with words maybe 3-4 steps back to, recently, a literal million. And that's basically the only relationship this has with "AI" in the chatbot sense.