[-] dojan@lemmy.world 199 points 4 months ago

I mean, these kinds of "AI companions" are grifts anyway. They won't take off because they're a solution looking for a problem. They aren't as affordable as the entry-level HomePod/Amazon Echo/Google Home units, so they can't be bought as a "why not, it's a speaker anyway" type of thing. They don't have any secondary functionality you don't already have in your phone.

And if that's not enough, you can bet your cute arse that Apple and Google are both working on bringing LLM functions into their assistants, basically making these units obsolete.

The moment that these companies decide that they can't afford to pay for servers and API subscriptions anymore, the service will die and you'll end up with a colourful brick. Don't buy these things, they're unfinished and will die within a year or two.

[-] PseudorandomNoise@lemmy.world 62 points 4 months ago

The ultimate issue is exactly what you said; phones exist. I’m not carrying another voice assistant around when both Siri and Google Assistant can be installed on my phone.

Based on MKBHD’s review this whole product category definitely screams “solution in search of a problem”

[-] GamingChairModel@lemmy.world 21 points 4 months ago

Like, I can imagine a world where a smart watch replaces my phone for day to day stuff, but that's because I'm in that weird space where I prefer a laptop for almost anything serious, but still appreciate the convenience and functionality of remaining connected wherever I am, even if I'm on the move.

But another device I need to keep in my pocket? What's the point?

[-] AIhasUse@lemmy.world 11 points 4 months ago

Rabbit has a SIM slot. I think the idea is that once its software gets better, it will be able to be a replacement for a phone for people who just want to quickly do simple things. Its battery seems to be pretty rubbish, though, and for now, the software is not nearly good enough.

[-] Thatuserguy@lemmy.world 23 points 4 months ago

But you can literally buy a cheap Android phone for less than this device that does everything it does (and everything it might do some day), maybe even better. Why buy a strange and unfamiliar form factor when most people are comfortable with a smartphone already? They can just choose not to interact with anything other than the assistant if they really want to, and still be better off.

[-] AIhasUse@lemmy.world 0 points 4 months ago

I agree, fairly gimmicky, but I do like the idea of being able to press a single button to ask a quick question. I like my Meta glasses for the same reason, but they need some improvement, and quite frankly, I'd like them a whole lot more if they were from someone other than Meta. Also, I like the smallness of it. If I could get away with carrying just a tiny box, sometimes I'd do that. The software on it needs to get much better, so hopefully they stick with it.

[-] Rinox@feddit.it 1 points 4 months ago

On Pixel (but probably also other phones) you can press and hold the power button to summon the assistant. Set ChatGPT or whatever as your assistant and you have a Rabbit equivalent with one-button summon.

[-] AIhasUse@lemmy.world 1 points 4 months ago

Great point! Here are the Samsung instructions for this:

  1. Download ChatGPT from the Play Store (make sure it's by OpenAI and not a scam app). Set it up and make sure you have access to the voice feature.
  2. Download Good Lock from the Galaxy Store (NOT the Play Store).
  3. In the Good Lock app, under the "Life Up" section, download the "RegiStar" module.
  4. Open the RegiStar module, tap the "Side key press and hold action" setting, and turn it on.
  5. In the options underneath, choose "Open app", scroll to the ChatGPT app in the list, tap the settings icon next to its name, then tap "Voice".

Now you should be able to long-press the side button to directly access the ChatGPT voice assistant.

[-] ColeSloth@discuss.tchncs.de 11 points 4 months ago

The Rabbit is also just an Android APK. You could literally install it on a cheap phone if you'd like. It's beyond useless.

What someone needs to do is put something similar into something all cutesy like a Furby and sell it for kids. Just a $100, Wi-Fi-only, PG-rated thing that can do some fun stuff. It wouldn't change the world, but it could turn an actual profit for a few years and not feel like a rip-off.

[-] Rinox@feddit.it 1 points 4 months ago

Good luck making an AI you are 100% sure is PG rated.

Btw, someone already put ChatGPT + Whisper in a kid's plushie/toy; saw it on an old WAN Show. The lag is tremendous, though.

[-] Cheradenine@sh.itjust.works 8 points 4 months ago

But it's just an Android app in a dedicated device that reviews say has a shit interface and battery.

Run it on a cheap phone that does more for less.

[-] Petter1@lemm.ee 5 points 4 months ago

The battery part is fixed now 😂 they were able to give that thing 5x the battery life through a software update

Makes me wonder what they were doing in the background prior to this update

[-] yggstyle@lemmy.world 3 points 4 months ago

Former crypto company... Power drain... I feel like there's an answer here...

[-] Petter1@lemm.ee 1 points 4 months ago

😂that would be hell of a scam

[-] AIhasUse@lemmy.world -1 points 4 months ago

That's amazing, I hadn't heard about the battery fix!

[-] Imgonnatrythis@sh.itjust.works 5 points 4 months ago

Yeah, build this into a watch or earbud that I already have on my person for other reasons but that gives me hands-free access to a decent AI when I don't have my phone on me, and I might have some interest.

[-] JackGreenEarth@lemm.ee 1 points 4 months ago

What phone is that that supports both Siri and Google Assistant on the same device?

[-] PseudorandomNoise@lemmy.world 3 points 4 months ago

iPhones only, basically. Google Assistant is available through an app, but that's still more convenient than buying a $200 device

[-] Potatos_are_not_friends@lemmy.world 34 points 4 months ago

Absolutely a grift.

The CEO is a fucking joke. This is their bio on linkedin.

Serial Entrepreneur, semi - Pro Lamborghini Super Trofeo racer, music producer, car and vintage synth collector.

[-] RobotZap10000@feddit.nl 38 points 4 months ago

The résumé of someone who has never done an honest day's work in their life.

[-] Knock_Knock_Lemmy_In@lemmy.world 4 points 4 months ago

Only the first item is related to business, and even that implies repeated failure.

[-] Reddfugee42@lemmy.world 11 points 4 months ago

Solutions looking for problems are a mainstay in multiple industries, from materials science to chemistry. It's not necessarily a bad idea.

[-] Hamartiogonic@sopuli.xyz 2 points 4 months ago

In the early days of laser development, it was seen as a solution seeking a problem. A few decades later, it actually turned out to be really handy, but it would have been tough to sell this idea to anyone before that. Imagine how hard it is to find funding for research that solves a problem that doesn’t exist.

[-] Aqarius@lemmy.world 1 points 4 months ago

In development and science, sure. But this is a finished product on the market.

[-] Reddfugee42@lemmy.world 1 points 4 months ago

The principle is the same. "Let's hope someone finds this useful." It's always a crapshoot.

[-] helpImTrappedOnline@lemmy.world 7 points 4 months ago

They're a solution looking to solve a problem that already has a well-established, better solution. The modern smartphone and voice assistants have been around for 14+ years....

For all these AI devices can currently accomplish, our budget $200 phones can do an immeasurable amount more.

If anything, they should be focusing on the voice assistant aspect - "Hey Google, add the nearest gas station to my trip." "Here's a list of gas stations (I know you're driving, but please review this list and select one using the tiny select button)." {presses button} "Please enable location data analytics to continue."

[-] TimeSquirrel@kbin.social 5 points 4 months ago* (last edited 4 months ago)

I think there's already a way to forward Google Home requests directly to ChatGPT, I might be wrong though.

[-] dojan@lemmy.world 3 points 4 months ago

That wouldn't surprise me. I think there's a Siri Shortcut for integrating with ChatGPT. It's not the most elegant of solutions, but it works well enough. I'm quite sure that this year we'll see whatever Google and Apple have cooked up in terms of machine learning integration into their operating systems. Likely a flagship feature of the new Pixel phones, and definitely a significant Siri update on iPhone, probably along with some gimmicky feature to sell the new 16 Pros.

At that point, who is going to care about these devices?

[-] cley_faye@lemmy.world 5 points 4 months ago

In addition to being able to run the exact same thing on that phone you already have, too.

Their device doesn't have any hardware specific to its purpose. Even if Google and Apple don't bring any improvements to their own solutions, soon enough someone is bound to provide an "assistant AI app" with a subscription, proxying OpenAI requests and using the touchscreen, camera, microphone and speaker that are already there instead of making you buy a new set of them.
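That hypothetical "assistant app" really is mostly thin plumbing. A minimal Python sketch of the request-building half, where the endpoint URL and model name are placeholders I made up, not anything a real product ships:

```python
import json

# Placeholder endpoint; any OpenAI-compatible provider looks roughly like this.
API_URL = "https://api.example.com/v1/chat/completions"

def build_chat_request(user_text, history=None, model="example-model"):
    """Assemble the JSON body an assistant app would forward to the provider.

    The phone already supplies the touchscreen, camera, mic and speaker;
    the app itself is little more than this request/response plumbing.
    """
    messages = list(history or [])
    messages.append({"role": "user", "content": user_text})
    return {"model": model, "messages": messages, "temperature": 0.7}

body = build_chat_request("What's on my calendar today?")
payload = json.dumps(body)  # this payload is all that actually leaves the device
```

Everything else the R1 does on top of this is UI.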

[-] FlyingSquid@lemmy.world 6 points 4 months ago

The "AI" in the R1 is utter shit. Wired eviscerated it in a review.

https://www.wired.com/review/rabbit-r1/

[-] ChaoticNeutralCzech@feddit.de 2 points 4 months ago

It is somewhat OK considering it's a free app.

[-] FlyingSquid@lemmy.world 2 points 4 months ago

You could say the same about Siri, which is also utter shit.

[-] ChaoticNeutralCzech@feddit.de 2 points 4 months ago

And yet, for both you are supposed to pay for an overpriced device. You can at least pirate the R1 app.

[-] Knock_Knock_Lemmy_In@lemmy.world 2 points 4 months ago

I think there may be a market for an LLM that is executed locally and privately incorporates personal data.

[-] cley_faye@lemmy.world 2 points 4 months ago

Yes, there is. And yes, it would be huge. I know a lot of people who are staying away from all this as long as the privacy issues are not resolved (there are other issues, but at this point, the cat is out of the bag).

But running large models locally requires a ton of resources. It may become a reality in the future, but in the meantime, allowing more, smaller providers to offer a service (and a self-hosted option for corporations/enthusiasts) is far better in terms of resource usage. And it's already a thing; what needs work now is improving UI and integrations.

In fact, very far from the "impressive" world of generated text and pictures, using LLMs and integrations (or whatever it's called) to create a sort of documentation index that you can query with natural language is a very interesting tool that can be useful to a lot of people, both individually and in corporate environments. And some projects are already looking that way.
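The documentation-index idea fits in a few lines of plain Python. This toy uses bag-of-words cosine similarity instead of a real embedding model, purely to show the retrieve-by-similarity shape; the documents and queries are made up:

```python
from collections import Counter
import math

DOCS = {
    "backup": "To back up your data, run the export tool nightly and store copies offsite.",
    "login": "Reset a forgotten password from the login screen via the recovery email link.",
    "api": "The REST API authenticates with a bearer token passed in the Authorization header.",
}

def tokens(text):
    # Crude tokenizer: lowercase words with trailing punctuation stripped.
    return [w.strip(".,?!").lower() for w in text.split()]

def cosine(a, b):
    ca, cb = Counter(a), Counter(b)
    dot = sum(ca[w] * cb[w] for w in ca)
    na = math.sqrt(sum(v * v for v in ca.values()))
    nb = math.sqrt(sum(v * v for v in cb.values()))
    return dot / (na * nb) if na and nb else 0.0

def retrieve(question):
    # Rank docs by similarity to the question; a real system would use embeddings.
    return max(DOCS, key=lambda k: cosine(tokens(question), tokens(DOCS[k])))

best = retrieve("How do I reset my password?")  # picks the "login" doc
```

A real pipeline would then hand the retrieved doc plus the question to the model, which is what keeps the model from having to memorize your data.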

I'm not holding my breath for portable, good, customized large models (if only for the economics of energy consumption), but moving away from "everything goes to a third-party service provider" is a great goal.

this post was submitted on 04 May 2024
532 points (95.1% liked)
