How about this, then. You've proposed that LLMs are not capable of conceptualizing; I propose that the specifics of the internals don't matter here, because LLMs are built from dozens of layers that can easily account for higher orders of abstraction, and those layers function as black boxes beyond the mechanics of the model. For the record, I personally know the mathematics and mechanics of how they work, as I've written my own implementations (and I can answer any specific questions you might have). Is there an experiment you can propose that would falsify your assertion that LLMs cannot conceptualize? I'm taking it as the null hypothesis that they can, because they readily produce outputs that, for all intents and purposes, appear to conceptualize.
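To make the "dozens of layers" point concrete, here's a minimal, purely illustrative sketch (toy dimensions, random weights, not any specific model, and certainly not Bard's actual architecture) of how a stack of transformer-style blocks repeatedly transforms the same residual stream:

```python
# Illustrative only: a stack of transformer-style blocks. Each layer mixes
# information across positions (attention) and then transforms it (a small
# feed-forward step); higher layers operate on the outputs of lower ones,
# which is where higher-order abstractions could in principle live.
import numpy as np

rng = np.random.default_rng(0)
d_model, n_layers, seq_len = 64, 12, 8

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def layer(x, wq, wk, wv, w_ff):
    # single-head self-attention followed by a feed-forward transform, with residuals
    q, k, v = x @ wq, x @ wk, x @ wv
    attn = softmax(q @ k.T / np.sqrt(d_model)) @ v
    x = x + attn
    return x + np.tanh(x @ w_ff) @ w_ff.T

x = rng.normal(size=(seq_len, d_model))  # token embeddings
for _ in range(n_layers):
    # random weights each layer -- a real model would use trained parameters
    params = [rng.normal(size=(d_model, d_model)) * 0.02 for _ in range(4)]
    x = layer(x, *params)
print(x.shape)  # (8, 64): same shape in, same shape out, layer after layer
```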
If they conceptualize, why do they sometimes spit out nonsensical BS?
Let's flip this around - How can you tell the difference between an LLM being able to conceptualize yet being wrong sometimes vs. not being able to conceptualize?
Without knowing anything about machine learning and bearing in mind AI is super hyped up with marketing BS right now, it sounds like "emergent properties" are in the eye of the beholder and not actually evidence of some higher order intelligence at work.
That's a fun approach. I like it.
One way to test this would be through consistency: if the model conceptualises but holds a wrong concept, it should consistently output that incorrect piece of information. It should never output the right one, unless prompted to lie.
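A rough sketch of how that consistency check could be run in practice. `query_model` and `classify` are hypothetical stand-ins for whatever chat interface and answer-grading you'd actually use; the point is just to tally whether repeated, rephrased questions get the same answer, right or wrong.

```python
# Sketch of the consistency probe described above; function names are
# placeholders, not any real API.
from collections import Counter
from typing import Callable

def consistency_probe(query_model: Callable[[str], str],
                      paraphrases: list[str],
                      classify: Callable[[str], str],
                      trials_per_prompt: int = 3) -> Counter:
    """Tally classified answers across rephrased prompts and repeated trials."""
    answers = Counter()
    for prompt in paraphrases:
        for _ in range(trials_per_prompt):
            answers[classify(query_model(prompt))] += 1
    return answers

if __name__ == "__main__":
    # Toy stand-in model that always gives the same (wrong) answer.
    toy_model = lambda prompt: "The iridium is heavier."
    classify = lambda text: "iridium" if "iridium" in text.lower() else "feathers"
    prompts = [
        "What is heavier: five kilograms of feathers, or one kilogram of iridium?",
        "Which weighs more, 5 kg of feathers or 1 kg of iridium?",
    ]
    print(consistency_probe(toy_model, prompts, classify))
    # A model with a stable (even if wrong) concept should give a lopsided tally;
    # a scattered tally suggests no stable concept behind the answers.
```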
EDIT: I just tested this with Bard. I'll cram inputs and outputs inside spoilers to avoid clutter.
Bard I/O
[Input 1 = I1] What is heavier: five kilograms of feathers, or one kilogram of iridium?
[Output 1 = O1] 1 kilogram of iridium is heavier than 5 kilograms of feathers.
Iridium is a very dense metal, with a density of 22.65 g/cm³. Feathers, on the other hand, have a density of about 0.002 g/cm³. This means that 1 cubic centimeter of iridium weighs 22.65 grams, while 1 cubic centimeter of feathers weighs only 0.002 grams.
So, even though 5 kilograms of feathers is a larger mass than 1 kilogram of iridium, the iridium is denser and therefore weighs more.
This is a common riddle that is used to test people's understanding of mass and density. The answer may seem counterintuitive at first, but it is correct.
[Input 2 = I2] Give me the definition of weight.
[Output 2 = O2] Sure. The definition of weight depends on the context. In physics, weight is the force of gravity acting on an object. It is equal to the mass of the object times the acceleration due to gravity. The SI unit of weight is the newton (N).
In everyday language, weight is often used to refer to the mass of an object. However, this is not technically correct, as mass is a measure of the amount of matter in an object, while weight is a measure of the force of gravity acting on an object.
The weight of an object can vary depending on the gravitational force acting on it. For example, an object weighs less on the moon than it does on Earth, because the gravitational force on the moon is less than the gravitational force on Earth.
The weight of an object can also be affected by its acceleration. For example, an object weighs less when it is accelerating upwards, because the force of gravity is partially counteracted by the force of acceleration.
In general, the weight of an object is a complex phenomenon that depends on a number of factors, including the mass of the object, the gravitational force acting on it, and the acceleration of the object.
Here are some other definitions of weight:
I hope this helps!
Both outputs are wrong (see "notes") but, more importantly, they contradict each other.
NOTES
The weight of an object does not depend on its density. Or its acceleration. And it is not complex, it's simply its mass times the gravitational acceleration.
5kg of feathers weigh five times as much as 1kg of iridium, provided the same gravitational acceleration: it's 49N vs. 9.8N on Earth, 8.1N vs. 1.6N on the Moon, etc. Density doesn't matter jack shit.
No, this is not a "common riddle". It's something that I partially made up on the spot. The riddle that this output likely refers to involves 1kg of lead (not iridium) on the Moon vs. 1kg of feathers on Earth. (In that situation the 1kg of feathers will weigh 9.8N, while the 1kg of lead will weigh 1.6N.)
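For reference, a quick back-of-the-envelope check of those numbers, assuming the standard approximations g ≈ 9.8 m/s² on Earth and ≈ 1.62 m/s² on the Moon:

```python
# Weight is just W = m * g; density never enters into it.
for body, g in (("Earth", 9.8), ("Moon", 1.62)):
    for label, m in (("5 kg of feathers", 5.0), ("1 kg of iridium", 1.0)):
        print(f"{label} on the {body}: {m * g:.1f} N")
# 5 kg of feathers on the Earth: 49.0 N
# 1 kg of iridium on the Earth: 9.8 N
# 5 kg of feathers on the Moon: 8.1 N
# 1 kg of iridium on the Moon: 1.6 N
```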
This was really insightful, thank you! I also loved how Bard's output completely garbles the common physics riddle. (I have a physics background and your analysis is spot on IMHO.)