this post was submitted on 03 Sep 2024
1577 points (97.8% liked)
Hello fellow human. I also learn by having information shoveled to me without regard to my agency.
@zbyte64 With everything you see, you are scraping data from your environment, whether you want to or not
How does a child learn what pain is? How does a teenager learn what heartbreak is? It’s certainly not because they made the decision to find that out themselves
I bring up agency and I get a response that exemplifies exactly what I mean.
Raising a child well requires someone who is able to engage in the child's own theory of mind. If you just treat a child as an information sponge they will need more therapy than usual. A good parent takes interest in their child's ability to exercise agency.
@zbyte64 you’re getting away from the original conversation
Then I guess my original point of agency being an essential element in human learning had nothing to do with your conversation about how AI learns like humans. Carry on.
@zbyte64 we’re saying the same thing
It’s a matter of scale, not process
I'm literally saying (an aspect of) process matters, how are we saying the same thing?
@zbyte64 from what I understand, you’re referring to the process at scale—the amount of information the AI can take in is inhuman—which I’m not disagreeing with
None of which is relevant to my original point: the scale of their operations, which has already been weighed countless times in copyright law
My point is that the scale at which they operate and their intention to profit are the basis for their infringement; how they’re doing it would be largely irrelevant in a copyright case
I don't understand how when I say "agency" or "an aspect of the process" one would think I'm talking about the volume of information and not the quality.
@zbyte64 1) In no way is quality a part of that equation, and 2) in what other contexts is quality ever a part of the equation? I mean, I can go look at some Monets and paint some shitty water lilies; is that somehow problematic?
If we're using your paintings as training data for a Monet copy, then it could be.
Are we even talking about AI if we're saying data quality doesn't matter?
@zbyte64 data quality, again, was out of the scope of what I was talking about originally
Which, again, was that legal precedent suggests the *how* is largely irrelevant in copyright cases; courts mostly focus on the *why* and the *scale of the operation*
I’m not getting sued for copyright infringement by the NYT for using inspect element to delete content and read behind their paywall; OpenAI is
I was narrowly taking issue with the comparison to how humans learn, I really don't care about copyrights.
@zbyte64 where am I wrong? The process is effectively the same: you get a set of training data (a textbook) and a set of validation data (a test) and voila, I’m trained
To learn how to draw an image of a thing, you look at the thing a lot (training data) and try sketching it out (validation data) until it’s right
How the data is acquired is irrelevant: I can pirate the textbook or trespass to find a particular flower, but that doesn’t mean I’m learning differently than someone who paid for it
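The textbook/test analogy above is essentially a train/validate split. A minimal toy sketch of that loop, assuming a made-up "model" that just memorizes the mean of its training labels (all names here are illustrative, not any real system's code):

```python
import random

def train_validate(examples, holdout_fraction=0.2, seed=0):
    """Split examples into train/validation sets, fit a trivial model,
    and measure its error on data it never saw."""
    rng = random.Random(seed)
    shuffled = examples[:]
    rng.shuffle(shuffled)
    split = int(len(shuffled) * (1 - holdout_fraction))
    train, validation = shuffled[:split], shuffled[split:]
    # "Training": memorize the mean of the training labels.
    model = sum(y for _, y in train) / len(train)
    # "Validation": average absolute error on held-out examples.
    error = sum(abs(y - model) for _, y in validation) / len(validation)
    return model, error

data = [(x, 2 * x) for x in range(10)]  # toy (input, label) pairs
model, val_error = train_validate(data)
```

The held-out set plays the role of the test in the analogy: it only tells you how well the "studying" generalized, it is never studied from directly.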
Do we assume everything read in a textbook is correct? When we get feedback on drawing, do we accept the feedback as always correct and applicable? We filter and groom data for the AI so it doesn't need to learn these things.
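The "filter and groom" step mentioned above can be sketched as a simple quality gate that drops records before they ever reach a model. The specific checks and field names here are hypothetical examples, not any real pipeline:

```python
def is_clean(record):
    """Hypothetical quality checks applied before training."""
    text = record.get("text", "")
    return (
        bool(text.strip())                          # not empty
        and len(text) < 10_000                      # not absurdly long
        and record.get("label") in {"cat", "dog"}   # known label only
    )

raw = [
    {"text": "a small cat", "label": "cat"},
    {"text": "", "label": "dog"},        # empty text: filtered out
    {"text": "a dog", "label": "??"},    # unknown label: filtered out
]
groomed = [r for r in raw if is_clean(r)]
# groomed keeps only the first record
```

A human learner has to do this filtering themselves (doubting the textbook, rejecting bad feedback); here it is done for the model ahead of time.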