118 points
submitted 1 year ago by L4s@lemmy.world to c/technology@lemmy.world

The cofounder of Google's AI division DeepMind says everybody will have their own AI-powered 'chief of staff' over the next five years::Mustafa Suleyman said AI will "intimately know your personal information" and be able to serve you 24-7.

[-] eager_eagle@lemmy.world 11 points 1 year ago* (last edited 1 year ago)

I do wonder why Cortana, Siri, Alexa, and Google Assistant are lagging so far behind these LLMs. I personally don't use them for much beyond setting timers, but it's annoying to even consider using them and then realize they're nowhere near as usable or helpful as ChatGPT.

[-] Elohim@lemm.ee 9 points 1 year ago

The big thing holding Apple back with Siri is that they aim to have all their AI-driven features processed on the user's hardware, for security/privacy. So they don't just need the software component; they also need hardware inside each individual phone that's capable of running it.

[-] eager_eagle@lemmy.world 7 points 1 year ago* (last edited 1 year ago)

eh... sounds like privacy theater to me. Only the audio transcription may be processed on the device.

> In all cases, transcripts of your interactions will be sent to Apple to process your requests

src

They might aim to have a full-blown LLM on the device, but with these limitations it'll never be as good as the others.

[-] GBU_28@lemm.ee 2 points 1 year ago* (last edited 1 year ago)

Many teams are currently working on striking the right balance between fine-tuning and model size. Most aren't considering phones yet, but rather PCs running off-network.

It is entirely possible to have an LLM run "closed loop", but obviously Google and Apple want to be in that loop.
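For illustration, "closed loop" basically just means local inference: a small quantized model running entirely on your own hardware with no network calls. A minimal sketch using llama-cpp-python (the model path and prompt format here are placeholders, not a specific recommendation):

```python
# Minimal "closed loop" assistant sketch: a small quantized model running
# entirely on local hardware, with nothing sent over the network.
# Assumes llama-cpp-python is installed and a GGUF model file exists at the
# (hypothetical) path below.
from llama_cpp import Llama

llm = Llama(model_path="./models/assistant-7b-q4.gguf", n_ctx=2048)

def ask(prompt: str) -> str:
    # Prompt in, completion out; everything stays on the device.
    result = llm(f"User: {prompt}\nAssistant:", max_tokens=128, stop=["User:"])
    return result["choices"][0]["text"].strip()

print(ask("Set a timer for ten minutes."))
```

The catch is exactly the trade-off mentioned above: the smaller and more heavily quantized the model, the more modest the hardware it fits on, and the less capable it tends to be.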

[-] GBU_28@lemm.ee 6 points 1 year ago

For "impressive" general reasoning and conversation these LLM currently require pretty beefy hardware. You're either lugging a GPU around or calling to an API.

[-] elfin8er@lemm.ee 2 points 1 year ago

Aren't these current personal assistants already relying on API calls for their responses?

[-] GBU_28@lemm.ee 2 points 1 year ago

Like Siri? Yes, though my point was about the hardware needed for LLMs specifically.

[-] WiseMoth@lemmy.world 4 points 1 year ago

I know Apple's developing their own LLM, which will hopefully be used in Siri. There's no guarantee, but I can't imagine it would be too hard to add Bard into Google Assistant. Cortana, on the other hand, was canceled by Microsoft and is being replaced by Bing Chat. I believe Amazon is also winding down Alexa development.

[-] theterrasque@infosec.pub 1 points 1 year ago

They're working on it, but it takes time. Especially making it reliable.

The current crop of LLMs will happily say or do nonsensical or even dangerous things.
