look at the depth of this grifting
a whole One (1!) H100! in space!
note how it mentions absolutely fucking nothing about the supporting cast: nothing about storage and networking, about interface capabilities, about what kind of programmatic runtimes you could have! none of it. just gonna yeet a sat into space, problem solved! space DCs!
compute! in space! "what do you mean 'compute what'? compute!" I hear, as the jackass rapidly packs up their briefcase and starts edging towards the door. who needs to care about getting data to and from such a device? it'll run Gemma![0] magic!
from TFA: "SAR, in particular, generates lots of data — about 10 gigabytes per second, according to Johnston — so in-space inference would be especially beneficial when creating these maps."
scan-time "inference", like you'd definitely know every parameter you'd want to query and every result you'd want to have, first-time, at scan! there's a fucking reason this shit gets turned into datasets, and that the tooling around processing it is as extensive as it is.
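and the headline number argues against itself. a back-of-envelope sketch (Python; the 10 GB/s is the article's own figure, the downlink rate and the 100:1 reduction factor below are my assumed ballparks, since the article gives none):

```python
# Back-of-envelope: can one sat even move SAR data around?
# The 10 GB/s figure is from the article; the downlink rate is an
# ASSUMED ballpark for a high-end sustained link, not any real spec.

SAR_RATE_GBPS = 10 * 8   # 10 gigabytes/s of raw SAR -> 80 Gbit/s
DOWNLINK_GBPS = 2        # assumed sustained downlink, Gbit/s

# how many seconds of downlink per second of scanning
backlog_ratio = SAR_RATE_GBPS / DOWNLINK_GBPS
print(f"every 1 s of scanning needs ~{backlog_ratio:.0f} s of downlink")

# even granting an aggressive (assumed) 100:1 on-board "inference"
# reduction, you still have to move models up and results down
reduced_gbps = SAR_RATE_GBPS / 100
print(f"after 100:1 on-board reduction: {reduced_gbps:.1f} Gbit/s to move")
```

pick whatever downlink number you like; the point is that the ratio is the whole ballgame, and the article doesn't even mention it.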
and, again, this leaves aside all the other practical problems. of which there are many. even just the following ones should make you wince: launch, maintenance, power, heat dissipation (vacuum is an insulator: radiative cooling only!), repair, (usable) lifetime, radiation. and that's before even touching on the nuances in those, or going further down the list
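on the heat point specifically, the physics is a one-liner. a sketch (Python; the emissivity and radiator temperature are my assumptions, the 700 W is the published H100 SXM TDP, and this charitably ignores solar/Earth heat load and all the plumbing mass):

```python
# Rough radiator sizing for ONE H100 in vacuum: radiation is the only
# way to dump heat. Pure Stefan-Boltzmann, best-case assumptions.

SIGMA = 5.670e-8   # Stefan-Boltzmann constant, W/(m^2 K^4)
EPSILON = 0.9      # assumed radiator emissivity
T_RAD = 300.0      # assumed radiator temp, K (the GPU must run hotter)
P_GPU = 700.0      # H100 SXM board power, W (published TDP)

flux = EPSILON * SIGMA * T_RAD**4   # W/m^2, one-sided emission
area = P_GPU / flux
print(f"radiating flux: {flux:.0f} W/m^2")
print(f"one-sided radiator for one H100: {area:.1f} m^2")
```

roughly a couple of square metres of deployable radiator, per GPU, before you cool anything else on the bus. now scale that to a "space DC".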
good god.
I guess the one good bit here is that it isn't the "we're gonna micromachine them in orbit!" bullshit fantasy, but I bet that's not far behind
[0] - "multimodal and wide language support" so literally a local LLM, but that means it needs... input... and... responses... which again goes back to all those pesky "interaction" and "network" and "storage" questions.
you type “DHH said” twice in one post like he’s anyone to listen to, amazing
(pro-tip: he ain’t)