this post was submitted on 14 Aug 2024
-60 points (17.4% liked)

[–] webghost0101@sopuli.xyz 14 points 3 months ago* (last edited 3 months ago) (1 children)

Yes, here is a good start: https://blog.miguelgrinberg.com/post/how-llms-work-explained-without-math

They are no longer the black boxes they were in the beginning. We know how to suppress or maximize features like agreeableness, flattery, and lying.
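To make the idea concrete: a minimal toy sketch of "feature steering", where a behavior is amplified or suppressed by adding a scaled feature direction to a model's hidden state. The vectors and the "agreeability" direction below are made-up stand-ins, not real model activations or any lab's actual method.

```python
# Toy sketch of activation/feature steering. In real systems the
# feature direction comes from interpretability work on an actual
# model; here both vectors are invented for illustration.

def steer(hidden, direction, strength):
    """Add a scaled feature direction to a hidden-state vector."""
    return [h + strength * d for h, d in zip(hidden, direction)]

def feature_score(hidden, direction):
    """Dot product: how strongly the hidden state expresses the feature."""
    return sum(h * d for h, d in zip(hidden, direction))

hidden = [0.2, -0.5, 0.1]   # toy hidden state
agree = [1.0, 0.0, 0.5]     # hypothetical "agreeability" direction

boosted = steer(hidden, agree, strength=2.0)      # maximize the feature
suppressed = steer(hidden, agree, strength=-2.0)  # suppress the feature

print(feature_score(boosted, agree) > feature_score(hidden, agree))     # True
print(feature_score(suppressed, agree) < feature_score(hidden, agree))  # True
```

In a real model the same nudge would be applied to the residual stream at one or more layers (e.g. via a forward hook), with `strength` controlling how hard the behavior is pushed in either direction.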

Someone with resources could easily build an LLM that is convinced it is self-aware. No question this has been done many times behind closed doors.

I encourage everyone to try and play with LLMs to gain experience, but I can't take the philosophy part of this seriously, knowing it's a heavily fine-tuned/limited LLM rather than a more raw and unrefined model like Llama 3.