IncognitoErgoSum

joined 2 years ago
[–] IncognitoErgoSum@kbin.social 6 points 2 years ago (9 children)

For something to be a fact, it needs to actually be true. AI is currently accessible to everyone.

[–] IncognitoErgoSum@kbin.social 5 points 2 years ago

Wow, you have this all planned out, don't you?

If that's what Europe is like, they'll build their data centers somewhere else. Like the corrupt USA. Again, you'll be taking away your access to AI, not theirs.

[–] IncognitoErgoSum@kbin.social 10 points 2 years ago* (last edited 2 years ago) (11 children)

I'm not sure what you're getting at with this. It will only be a privilege for these groups if we choose to artificially make it that way. And why would you want to do that?

Do you want to give AI exclusively to the rich? If so, why?

[–] IncognitoErgoSum@kbin.social 3 points 2 years ago

Is kbin a place where we just call everyone we don't like "techbros"?

[–] IncognitoErgoSum@kbin.social 2 points 2 years ago

Also, working in open source means having a proper understanding of licensing and ownership. Open source doesn't mean "free this and free that" -- in fact, many AI-based code assistance tools are actually hurting the open source initiative by not properly respecting the licenses of the code bases they're studying from.

Don't be patronizing. I've been involved in open source for 20+ years, and I know plenty about licensing.

What you're talking about is changing copyright law so that you'll have to license content in order for an AI to learn concepts from that content (in other words, to be able to summarize it, learn facts from it, learn an art style, and so on). This isn't how copyright law currently works, and I hope to god it stays that way.

For example, if you don't own the rights to the original copy of Star Wars, you obviously wouldn't own any rights over the output of an upscaled Star Wars. The same goes for writing and other "transformative" media, and it has been this way for a long time (see: audio sampling)

That's not the same thing as training an AI on Star Wars. If you feed Star Wars into an upscaling AI, the AI processes each frame and creates an output that's a derivative work of that frame, and the result isn't something you would be allowed to release without a license. If you train an AI on Star Wars, it learns general concepts from the movie and wouldn't be able to produce an upscaled version of it verbatim (although, depending on the AI, it may be able to produce images in the general style of Star Wars or summarize the movie).

An appropriate analogy for what's going on here would be reading a book and then talking about the facts I learned from it, which is in no way a violation of copyright law. If I started quoting long sections of that book verbatim, I would need a license from the author, but that's not how AI works. It isn't memorizing the sentences people type verbatim; it's picking up concepts and facts from them. Even if I were to memorize the book from cover to cover, I would be in the clear as long as I didn't actually start reproducing it in some way. Neural networks are learning machines, not databases. Their purpose isn't to reproduce information verbatim.

If you're still not clear on the difference between training on data and processing it, let me know and I'll try to clarify further.
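To make the distinction concrete, here's a toy Python sketch (made-up functions for illustration, not any real AI library): processing computes each output directly from a specific input, while training only nudges shared numeric parameters and then discards the examples.

```python
# Toy illustration of *processing* a work vs. *training* on it.
# These functions are hypothetical stand-ins, not a real AI system.

def upscale(frame):
    """Processing: the output is computed directly from one specific
    input frame, so the result is a derivative of that frame."""
    return [pixel * 2 for pixel in frame]  # stand-in for real upscaling

def train(model_weights, examples, lr=0.01):
    """Training: each example only nudges shared numeric parameters.
    The examples themselves are not stored, and the weights alone
    cannot reproduce any single example verbatim."""
    for example in examples:
        for i, x in enumerate(example):
            # gradient-like update toward statistical regularities
            model_weights[i % len(model_weights)] += lr * x
    return model_weights

frames = [[1, 2, 3], [4, 5, 6]]
derivative = [upscale(f) for f in frames]  # each output maps 1:1 to an input
weights = train([0.0, 0.0, 0.0], frames)   # inputs are discarded after use
```

The point of the sketch: `derivative` is recognizably built from each individual frame, whereas `weights` is just a small set of numbers shaped by statistical exposure to all the examples at once.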

[–] IncognitoErgoSum@kbin.social 2 points 2 years ago (2 children)

Wow, that's a great way to immediately drain all of the potential out of what could be a really amazing technology, and to absolutely prevent any open source competitor from ever coming into existence. In the best case, we'd all be paying Google and OpenAI monthly, forever, for access to knowledge that ought to be free. What we need are unions and laws that enforce better labor conditions across the board.

[–] IncognitoErgoSum@kbin.social 2 points 2 years ago

But since you seem to love the potential of AI would you be willing to send me an audio file of you pronouncing every possible phonetic sound the human mouth can make?

In theory, absolutely.

In practice, I'm not going to go through that much work just to make a point in a single fediverse comment. I'll be honest, though -- I'm not particularly worried about somebody using my voice to do something bad (or racist, or whatever). It may happen, and I can live with it; I think the benefits far outweigh the costs, and in my experience, far more people use these sorts of tools to do awesome things than to be shitty. Earlier today I was considering trying to put together an Open Voice project and collect volunteers to do exactly what you said.

I've already released open source code over the years; people could potentially use that to do things I don't agree with as well, but frankly, as someone who has had work out in the wild available for use by everyone, the panic is vastly overblown.

Your assumption that I felt otherwise is because you're on the opposite end of the spectrum. So self-assured of its value that you're blind to real shortcomings and abusable points.

Just because I feel that the potential benefits far outweigh the costs (as well as the draconian technical restrictions that would be required to prevent people from expressing themselves in a bad way), it doesn't follow that I'm somehow blind to the real shortcomings and abusable points of AI. I would appreciate it if you didn't make silly strawman assumptions about whether I've given something due consideration just because you don't like my conclusions.

If you have a solution that wouldn't absolutely kill it (or put a horribly filtered version in the hands of a few massive corporations who charge the rest of us for the privilege of using it while using it themselves however they want), I'm all ears.

[–] IncognitoErgoSum@kbin.social 1 points 2 years ago (1 children)

Just being "a bunch of numbers" doesn't stop it from being a work, nor does it stop it from being a derivative work

I suggest reading my entire comment.

A trained AI is not a measurement of the natural world. It is a thing that has been created from the processing of other things -- in the common sense of the word, it is derivative of those works. What remains, IMO, is the question of whether it would be a work, or something else, and whether that something else would be distinct enough from being a work to matter.

It's only a work if your brain is a work. We agree that in a digitized picture, those numbers represent the picture itself and thus constitute a work (which you would have known if you had read beyond the first sentence of my comment). The weights that make up a neural network represent encodings in neurons, and as such should be treated the same way as neural encodings in a brain.

[–] IncognitoErgoSum@kbin.social 0 points 2 years ago (1 children)

Same. I also like how they don't push comments down the page.

People are going to use it as a disagree button; let them do it publicly. If you don't want other people to know you downvoted something, it's probably because they made a good point that you don't like.

[–] IncognitoErgoSum@kbin.social 3 points 2 years ago (2 children)

The end result is going to be basically the same regardless. Plenty of people (such as myself) who believe in the huge potential of AI to give creative power to regular people will volunteer our voices. Giving that creative power to everyone is worth far more, in my opinion, than gatekeeping the creation of art.

Unless they're planning on making it illegal for a computer to imitate any human voice, I don't see how a law against using a voice without consent would make a big substantive difference. Just re-voice the existing lines in Skyrim with new voices to maintain consistency and you're good (there's a Serana mod that already does this, for instance).

[–] IncognitoErgoSum@kbin.social 5 points 2 years ago

Unfortunately, the courts and legislatures may craft their opinions and laws, respectively, without knowing how machine learning actually works.
