this post was submitted on 19 Sep 2025
477 points (98.8% liked)

Fuck AI


"We did it, Patrick! We made a technological breakthrough!"

A place for all those who loathe AI to discuss things, post articles, and ridicule the AI hype. Proud supporter of working people. And proud booer of SXSW 2024.

[–] Denjin@feddit.uk 123 points 3 days ago (3 children)

Even if this worked perfectly, ignoring the fact that it's clearly set up for the camera to recognise certain things and is in no way a genuine demonstration, what is the point of this? By the time it's even responded to his first "Hey Meta" he could have typed "korean steak sauce recipe" into his search engine of choice and got back several dozen decent results in seconds.

What is the problem that these LLMs and chatbots are the solution for? It's like they're all desperately trying to market some fancy new type of barely functioning legs to everyone when we already have legs, and arms, and cars, and bicycles.

[–] voodooattack@lemmy.world 51 points 3 days ago (2 children)

That’s not the point. He’s planning to harvest data about the environment in your home (what products you have around you, which brands you prefer, etc.) for better ad targeting and whatnot.

I guarantee that a lot of people will use it; and not because it does a great job, but out of curiosity, peer pressure, or abject laziness.

[–] Coyote_sly@lemmy.world 7 points 2 days ago (2 children)

Well sure, but you can't harvest data using your one-legged mech suit if everyone keeps riding around on their perfectly functional bikes.

LLMs just don't have a real use case for most people. That's the core issue here, and it isn't one that's going to get solved anytime soon.

[–] fishy@lemmy.today 3 points 2 days ago

Yup, their only real use in my daily life would be as a search result summary tool. Unfortunately, in my experience they've been a net negative in that area and get hidden/ignored, because they give the most common answer, which is rarely the specific answer I'm searching for.

[–] voodooattack@lemmy.world 3 points 2 days ago* (last edited 2 days ago) (2 children)

I’m pretty sure lots of people use Google Home Assistant or Amazon’s Alexa. Now slap a camera on it, hook it up to an LLM, and you’re all set. That’s what Meta is about to do.

[–] foenkyfjutschah@programming.dev 4 points 2 days ago (1 children)

Object recognition and classification isn't an LLM.

[–] elfin8er@lemmy.world 3 points 2 days ago

No, but it can certainly augment LLMs.

[–] JcbAzPx@lemmy.world 3 points 2 days ago (1 children)

Meta isn't really a widget seller, though. Alexa and the Google thing only have the user base they do because they jam it into every device they sell.

[–] voodooattack@lemmy.world 3 points 2 days ago

It’s not a huge leap for them though. They could do it. They already sell VR gear that’s relatively popular, and they have the resources and reach to market anything. If it allows them to spy on people to this level, watch them subsidise any device they sell to hell and back.

Even acquiring a company that already sells consumer electronics/appliances and subsidising their products isn’t impossible for them. Like I said, they have the resources.

[–] ideonek@piefed.social 3 points 2 days ago

Which will still not be enough to make ads work. It's so much effort and abuse for nothing.

[–] someacnt@sh.itjust.works 5 points 2 days ago

To be fair, it is helpful for grunt, repetitive work that's only slightly different from the established formula. For me, it solved a quick math-olympiad-esque problem that came up in research.

But it is never conscious or creative.

[–] Alcoholicorn@mander.xyz 0 points 3 days ago (4 children)

The idea is the AI would automatically look at what you have and come up with something, substituting ingredients for what you have as necessary.

“korean steak sauce recipe” into his search engine of choice and got back several dozen decent results in seconds.

Have you tried looking for a recipe in the last 10 years or so? 10 pages of fluff, with a recipe at the bottom that the author cobbled together from other recipes and guesswork and made exactly once, if that.

[–] zbyte64@awful.systems 2 points 1 day ago

The same pressures that made internet search suck apply to AI doing search as well.

[–] greybeard@feddit.online 11 points 3 days ago (1 children)

You know what would be more useful than this AI and wouldn't cost billions of dollars? If Facebook made a simple recipe site that didn't have all that fluff. Instead of having AI try to come up with it on the fly, you could just have premade recipes. Wouldn't that be grand? Oh wait, that wouldn't give the AI the opportunity to recommend a specific brand of soy sauce, or suggest you buy your spices from Doordash with a 10% discount coupon!

[–] Denjin@feddit.uk 13 points 3 days ago (2 children)

The idea is the AI would automatically look at what you have and come up with something, substituting ingredients for what you have as necessary.

But as in the staged example they've put together here, that requires you to find and lay out all your ingredients first, so you've already done 90% of the work. Are my AI glasses going to be able to scan all my cupboards and fridge and pantry for things first and then go from there? It's a bad solution to a problem that doesn't really exist.

Have you tried looking for a recipe in the last 10 years or so? 10 pages of fluff, with a recipe at the bottom that the author cobbled together from other recipes and guesswork and made exactly once, if that.

Yes, all the time, I even looked at recipes for "Korean style steak sauce" to prove my point and got probably a dozen decent ones straight away, including a couple from very well recognised sites like BBC Good Food.

And where exactly do you suppose the LLM has scraped together all its information from for what you can substitute canola oil or sesame seeds for? Those exact recipes you could just have searched for.

[–] 4am@lemmy.zip 4 points 2 days ago

Are my AI glasses going to be able to scan all my cupboards and fridge and pantry for things first and then go from there?

No, the idea is that you wear the glasses all the time and it scans and knows the contents of your house constantly. What brands you like, what products you own, what activities you do and when, what conversations you have, whether or not you like the government, whether or not you own a computer with an ad blocker, what the best entry points are for a SWAT team from ICE, etc.

So it would already know what ingredients you have for making a Korean inspired BBQ sauce, and as long as you were well-mannered and compliant in your home and all the public spaces you visit at all times, it wouldn’t direct you to the re-education camp for further analysis on what radicalized you out of their control.

I don’t understand why you people are so negative about AI and the motivations of these large tech companies creating it on here! Think of how much easier life would be.

[–] MiddleAgesModem@lemmy.world -5 points 2 days ago (1 children)

And where exactly do you suppose the LLM has scraped together all its information from for what you can substitute canola oil or sesame seeds for? Those exact recipes you could just have searched for.

Who cares? It's still more convenient.

[–] Denjin@feddit.uk 6 points 2 days ago

LOL it demonstrably is not.