this post was submitted on 11 Mar 2026
10 points (100.0% liked)

Asklemmy


Toki pona. End of discussion.

Well, maybe, but I've been thinking about turning words into vectors and all that fun stuff (disclaimer: I am against any kind of LLM and the similar misuses of that technology), and most importantly about how directions in this high-dimensional space seem to encode true semantic dimensions.

Like how king + woman - man = king + "vector direction associated with femininity" = queen (highly simplified).
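The analogy can be sketched with toy vectors. To be clear, the three dimensions and all the values below are invented for illustration; real embeddings are learned, have hundreds of opaque dimensions, and the analogy only holds approximately, via nearest-neighbour search.

```python
# Toy "embeddings" with 3 hand-picked axes: (royalty, maleness, femaleness).
# All values are made up for illustration.
king  = [1.0, 1.0, 0.0]
man   = [0.0, 1.0, 0.0]
woman = [0.0, 0.0, 1.0]
queen = [1.0, 0.0, 1.0]

def sub(u, v): return [a - b for a, b in zip(u, v)]
def add(u, v): return [a + b for a, b in zip(u, v)]

# king - man strips the "maleness" component, + woman adds "femaleness"
result = add(sub(king, man), woman)
print(result)  # [1.0, 0.0, 1.0], i.e. queen
```

In a real embedding space you would not get an exact match; you would look up the word whose vector is closest (by cosine similarity) to the result.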

This brought me to the realization that every language has a measurable number of semantic dimensions (maybe not exact, but at least a rough estimate), usually way lower than the number of words in said language.
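One way to make "measurable" concrete (just one of several possible definitions) is to count how many principal components are needed to explain most of the variance of an embedding matrix. The sketch below uses synthetic data: 200 fake "words" generated from 5 hidden axes and lifted into 50 dimensions, standing in for real learned embeddings.

```python
import numpy as np

rng = np.random.default_rng(0)
latent = rng.normal(size=(200, 5))     # 5 underlying "semantic" axes
mixing = rng.normal(size=(5, 50))      # lift them into 50 dimensions
embeddings = latent @ mixing + 0.01 * rng.normal(size=(200, 50))

# Singular values of the centered matrix give per-component variances.
centered = embeddings - embeddings.mean(axis=0)
s = np.linalg.svd(centered, compute_uv=False)
explained = s**2 / (s**2).sum()

# Effective dimensionality: components needed for 99% of the variance.
k = int(np.searchsorted(np.cumsum(explained), 0.99)) + 1
print(k)  # small (about 5): far below the 50 ambient dimensions
```

Real embeddings are noisier and the cutoff (99% here) is arbitrary, so this gives a rough estimate rather than an exact count, exactly as the post says.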

Which then made me wonder:

-> How few semantic dimensions do you need for a functional conlang? (I imagine it would be two (binary), but I would be happy to hear your counterpoints.)

-> How many words per semantic dimension do you need to get by, and is there a reason why human languages have so much "redundancy"? (Why not have "word for magnitude + word for semantic direction" ad nauseam?)
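The "magnitude word + direction word" scheme can be sketched in code: any vector splits losslessly into a length and a unit direction, and recombines. The example vector is made up.

```python
import math

# A made-up word vector.
vec = [3.0, 4.0, 0.0]

magnitude = math.sqrt(sum(x * x for x in vec))   # the "how much" word
direction = [x / magnitude for x in vec]          # the "which meaning" word

# Recombining magnitude and direction recovers the original word.
rebuilt = [magnitude * d for d in direction]
print(magnitude, direction, rebuilt)  # 5.0 [0.6, 0.8, 0.0] [3.0, 4.0, 0.0]
```

So mathematically nothing is lost; the interesting question is why natural languages mostly don't factor meaning this way.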

And last but not least: can you make a language with only 3 semantic dimensions and speak in RGB colours?
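A toy take on that last question: if a language really had only 3 semantic dimensions, every word would be a point in [0,1]^3 and could literally be displayed as a colour. The axis names and the lexicon below are invented.

```python
def word_to_hex(vec):
    """Map a 3-d semantic vector (components in 0..1) to an #rrggbb colour."""
    assert len(vec) == 3 and all(0.0 <= x <= 1.0 for x in vec)
    return "#" + "".join(f"{round(x * 255):02x}" for x in vec)

# Hypothetical lexicon: axes are (concreteness, animacy, valence), say.
lexicon = {
    "rock":   (1.0, 0.0, 0.5),
    "friend": (0.8, 1.0, 1.0),
    "idea":   (0.0, 0.2, 0.7),
}
for word, vec in lexicon.items():
    print(word, word_to_hex(vec))
```

Every utterance would then be a sequence of colour swatches; whether 3 dimensions are enough to be functional is exactly the first question above.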

TL;DR: how many semantic directions do you need to make a language?

Per comment request, here are some links if you found this interesting and want to learn more:

About turning words into vectors:

-> This lesson by 3b1b and all of the related ones give a firm grasp of the inner workings of neural networks and LLMs, which can help debunk BS online, and loosely introduce the technology of turning words into vectors (there is a video version if you can't be bothered to read).

-> This article, which only deals with word embeddings (see section 3 for what I'm specifically talking about).

About conlangs:

-> The official toki pona website

-> The language creation society

-> The wiki page about conlangs, for good measure

[โ€“] undrwater@lemmy.world 1 points 1 day ago (1 children)

Man, this sounds fascinating, but I've got no clue. Can you edit the post with some links for others like me, so we can be educated?

I could search myself, but I'm currently in a state of "no clue whether this search result is what's being talked about".

[โ€“] polotype@lemmy.ml 2 points 1 day ago

Sure thing! I'll try to seek out some links to resources about how chatbots turn words into vectors, and about conlangs.