submitted 1 year ago* (last edited 1 year ago) by pexavc@lemmy.world to c/opensource@lemmy.ml

Other samples:

Android: https://github.com/nipunru/nsfw-detector-android

Flutter (BSD-3): https://github.com/ahsanalidev/flutter_nsfw

Keras (MIT): https://github.com/bhky/opennsfw2

I feel it's a good idea for those building native clients for Lemmy to implement projects like these and run offline inference on feed content for the time being, to cover content that isn't marked NSFW but should be.
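For example, with the opennsfw2 package linked above, a client-side check could look like the sketch below (the image path and threshold are illustrative choices, not part of the library):

```python
import opennsfw2 as n2

# Score a downloaded feed image; opennsfw2 fetches its pretrained
# weights automatically on first use.
image_path = "feed_image.jpg"  # illustrative local path
nsfw_probability = n2.predict_image(image_path)

# The cutoff is an arbitrary, user-tunable choice (not from the library).
# Posts scoring above it could be blurred or auto-tagged NSFW client-side.
NSFW_THRESHOLD = 0.8
if nsfw_probability >= NSFW_THRESHOLD:
    print(f"Likely NSFW ({nsfw_probability:.2f}): blur and tag this post")
```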

What does everyone think about enforcing further censorship on the client side, especially in open-source clients, as long as it pertains to this type of content?

Edit:

There's also this, but it takes a bit more effort to implement properly, and it provides a hash that can be used for reporting needs: https://github.com/AsuharietYgvar/AppleNeuralHash2ONNX
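Roughly, the flow there is: convert Apple's NeuralHash model to ONNX per that repo's instructions, then run it with onnxruntime and binarize the output through the extracted seed matrix into a 96-bit hex hash. A minimal sketch of the inference step, assuming you've already produced the model and seed files by following the repo (file names and the 128-byte header skip mirror the repo's example script, but treat the details as illustrative):

```python
import numpy as np
import onnxruntime
from PIL import Image

# Load the converted NeuralHash model (produced per the repo's steps).
session = onnxruntime.InferenceSession("model.onnx")

# Load the extracted 96x128 seed matrix, skipping its 128-byte header.
seed = np.frombuffer(open("neuralhash_seed.dat", "rb").read()[128:],
                     dtype=np.float32).reshape(96, 128)

# NeuralHash expects a 360x360 RGB image scaled to [-1, 1], NCHW layout.
image = Image.open("photo.jpg").convert("RGB").resize((360, 360))
arr = np.asarray(image, dtype=np.float32) / 255.0 * 2.0 - 1.0
arr = arr.transpose(2, 0, 1).reshape(1, 3, 360, 360)

# Run inference, project through the seed matrix, and binarize the
# result into 96 bits, hex-encoded for reporting/matching purposes.
embedding = session.run(None, {session.get_inputs()[0].name: arr})[0]
bits = "".join("1" if v >= 0 else "0" for v in seed.dot(embedding.flatten()))
print("{:0{}x}".format(int(bits, 2), len(bits) // 4))
```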

Python package (MIT): https://pypi.org/project/opennsfw-standalone/

library_napper@monyet.cc 13 points 1 year ago
WhoRoger@lemmy.world 5 points 1 year ago

I wish there were such detectors for other triggering stuff, like gore, or creepy insects, or any visual based phobia. Everyone just freaks out about porn.

pexavc@lemmy.world 2 points 1 year ago (last edited 1 year ago)

Actually, I'm looking at this exact thing: compiling them into an open-source package for Swift. Just finished NSFW. But everything you mentioned should be in a “ModerationKit” as well, allowing users to toggle based on their needs.

WhoRoger@lemmy.world 2 points 1 year ago

Sounds good.

JuxtaposedJaguar@lemmy.ml 3 points 1 year ago

To be fair, most non-porn "NSFW" is probably "NSFL". So NSFW in its exclusive usage is almost entirely porn.

Sethayy@sh.itjust.works 3 points 1 year ago

Though some people consider nudity art, without anything sexual about it.

janAkali@lemmy.one 2 points 1 year ago (last edited 1 year ago)

In many cultures around the world nudity in itself isn't considered inappropriate or sexual.

BrikoX@lemmy.zip 3 points 1 year ago (last edited 1 year ago)

2 of them are licensed under BSD-3, so not open source. And the 3rd one uses Firebase, so no thanks.

Edit: BSD-3 is open source. I confused it with BSD-4. My bad.

wildbus8979@sh.itjust.works 9 points 1 year ago (last edited 1 year ago)

How is BSD-3 not open source? I think you're confusing "Free/Libre" with Open Source. BSD-3/MIT licenses are absolutely open source; GPL is Free/Libre and Open Source (FLOSS).

danielquinn@lemmy.ca 3 points 1 year ago

It's an interesting idea, and given the direction some economies are moving (looking at you, EU and UK), something like this is likely going to feature whether we like it or not. The question for me, however, is: what is the nature of the training data? What some places consider "porn" (Saudi Arabia, the Vatican, the US) is just people's bodies in more civilised places. Facebook's classic "free the nipple" campaign is an excellent example here: why should anyone trust that this software's opinion aligns with their own?

pexavc@lemmy.world 1 points 1 year ago

Yeah, I've been thinking of this exact scenario: how to create solutions around anything that might “filter” while respecting everyone's worldview. I feel the best approach, if filters are to be implemented, is that they should never be an end-all-be-all and should always be “a toggle”, ultimately respecting the user's freedom of choice while providing quality tools to use effectively when needed.
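As a rough sketch of that toggle-first idea (hypothetical names, just illustrating per-category opt-in filters with user-tunable thresholds):

```python
from dataclasses import dataclass, field

@dataclass
class FilterSettings:
    enabled: bool = False    # every filter is opt-in, never forced on
    threshold: float = 0.8   # score above which content gets hidden/blurred

@dataclass
class ModerationConfig:
    filters: dict = field(default_factory=lambda: {
        "nsfw": FilterSettings(),
        "gore": FilterSettings(),
    })

    def should_hide(self, category: str, score: float) -> bool:
        s = self.filters.get(category)
        return bool(s and s.enabled and score >= s.threshold)

# Example: the user opts in to NSFW filtering only.
config = ModerationConfig()
config.filters["nsfw"].enabled = True
print(config.should_hide("nsfw", 0.93))  # True: hide/blur
print(config.should_hide("gore", 0.93))  # False: toggle is off
```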
