technology


On the road to fully automated luxury gay space communism.

Spreading Linux propaganda since 2020

Hexbear Code-Op (hexbear.net)
submitted 9 months ago* (last edited 9 months ago) by RedWizard@hexbear.net to c/technology@hexbear.net
Where to find the Code-Op

Wow, thanks for the stickies! Love all the activity in this thread. I love our coding comrades!


Hey fellow Hexbearions! I have no idea what I'm doing! However, out of the conversations in the comments of this little thing I posted the other day, I've created an org on GitHub that I think we can use to share, highlight, and collaborate on code and projects from comrades here and abroad.

  • I know we have several bots that float around this instance, and I've always wondered who maintains them and where their code is hosted. It would be cool to keep a fork of those bots in this org, for example.
  • I've already added a fork of @WhyEssEff@hexbear.net's Emoji repo as another example.
  • The projects don't need to be Hexbear or Lemmy related, either. I've moved my aPC-Json repo into the org just as an example, and I intend to use the code written by @invalidusernamelol@hexbear.net to play around with adding ICS files to the repo (see the sketch after this list).
  • We have numerous comrades looking at mainlining some flavor of Linux and bailing on Windows; maybe we could create some collaborative documentation that helps onboard the Linux-curious.
  • I've been thinking a lot recently about leftist communication online and building community spaces, which will ultimately intersect with self-hosting. Documenting various tools and providing Docker Compose files to easily get people off and running could be useful.
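
For the ICS idea above, here's a minimal sketch of what generating a calendar file could look like, using only Python's standard library. Everything specific in it (the event details, the PRODID, the UID domain) is made up for illustration; @invalidusernamelol@hexbear.net's actual code may look nothing like this:

```python
from datetime import datetime, timedelta, timezone

def make_ics(summary: str, start: datetime, hours: int = 1) -> str:
    """Return a single-event iCalendar (RFC 5545) document as a string."""
    fmt = "%Y%m%dT%H%M%SZ"  # UTC timestamps, e.g. 20250115T180000Z
    end = start + timedelta(hours=hours)
    return "\r\n".join([          # RFC 5545 mandates CRLF line endings
        "BEGIN:VCALENDAR",
        "VERSION:2.0",
        "PRODID:-//hexbear-code-op//example//EN",  # hypothetical identifier
        "BEGIN:VEVENT",
        f"UID:{start.strftime(fmt)}-example@hexbear.net",  # made-up UID domain
        f"DTSTAMP:{datetime.now(timezone.utc).strftime(fmt)}",
        f"DTSTART:{start.strftime(fmt)}",
        f"DTEND:{end.strftime(fmt)}",
        f"SUMMARY:{summary}",
        "END:VEVENT",
        "END:VCALENDAR",
        "",                       # trailing CRLF
    ])

# Example: a one-hour event; newline="" keeps the CRLFs intact on write.
event = make_ics("Code-Op sync", datetime(2025, 1, 15, 18, tzinfo=timezone.utc))
with open("event.ics", "w", newline="") as f:
    f.write(event)
```

Any calendar app that understands RFC 5545 should be able to import the resulting `event.ics`.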

I don't know a lot about GitHub Orgs, so I should get on that, I guess. That said, I'm open to all suggestions and input on how best to use this space I've created.

Also, I made (what I think is) a neat emblem for the whole thing.

Todos

  • Mirror repos to both GitHub and Codeberg (a rough sketch follows this list)
  • Create process for adding new repos to the mirror process
  • Create a more detailed profile README on GitHub.
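
For the mirroring todo, one low-effort approach is to give each local clone a second push URL so that a single `git push` updates both forges. A rough sketch, assuming clones live under `repos/` and using hypothetical org names on both sites:

```python
import subprocess
from pathlib import Path

# Hypothetical org names; neither is confirmed by this post.
GITHUB_ORG = "hexbear-code-op"
CODEBERG_ORG = "hexbear-code-op"

def add_mirror_push(repo_dir: Path) -> None:
    """Configure `origin` so a single `git push` updates both forges."""
    name = repo_dir.name
    for url in (
        f"https://github.com/{GITHUB_ORG}/{name}.git",
        f"https://codeberg.org/{CODEBERG_ORG}/{name}.git",
    ):
        # The first --add --push replaces the implicit push URL, so
        # GitHub is re-added explicitly before Codeberg is added.
        subprocess.run(
            ["git", "remote", "set-url", "--add", "--push", "origin", url],
            cwd=repo_dir, check=True,
        )

# Assumes local clones of the org's repos live under ./repos/.
for repo in Path("repos").iterdir():
    if (repo / ".git").is_dir():
        add_mirror_push(repo)
```

The catch is that this only mirrors when someone pushes from a configured clone; Codeberg (Forgejo) can also pull-mirror a GitHub repo on a schedule, which may be the more robust option.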

Done


  • ~~Recover from whatever this sickness is the dang kids gave me from daycare.~~
I'm currently running a dual boot with Linux Mint and Windows 11 (recently switched from Windows 10).

Long term, I want to move fully to Linux, but here's the catch: I'm considering running a Linux–Linux dual boot instead, basically Mint alongside something like Fedora. The main purpose of this machine is gaming.

So far, I've tested around 40 games on Mint. About 37 worked basically out of the box. For two of them, a friend helped me get things running, and the last one only worked after I switched to older NVIDIA drivers. Overall, I'm pretty happy with the results.

I'm also planning to move to an AMD GPU in the future, since I've heard they tend to be less hassle on Linux than NVIDIA cards.

My plan is to give each Linux distro its own 1 TB SSD. So the question is: is this a bad idea overall? I like Mint, but I also want to try out other distros over a longer period, and I really like the flexibility that dual booting gives me.

Would Mint and Fedora be a good pairing for mostly gaming and a bit of browsing, or would you recommend something other than Fedora? It's going well so far on Mint. One reason I'm considering a Linux dual boot is that I could run Mint with older drivers and Fedora with cutting-edge drivers, and that way hopefully get maximum gaming performance. (That was at least my idea as a novice.)

Let's have a bit of a discussion; any and all input is welcome. Yes, I was the person who asked about dual booting Windows and Linux in the past.

A large department store sold someone an Apple gift card that had likely been stolen or already redeemed. Apple responded by permanently closing his Apple account, bricking his devices and causing him to lose access to 20 years of saved media in iCloud.

Edit: in the FAQ section he says that he has backups, so that's good. The main damage is that all of his Apple devices are tied to the deleted account and won't work anymore.

Microsoft has cut its sales targets for its agentic AI software after struggling to find buyers interested in using it. In some cases, targets have been slashed by up to 50%, suggesting Microsoft overestimated the potential of its new AI tools. Indeed, compared with ChatGPT and Google's Gemini, Copilot is falling behind, raising concerns about Microsoft's substantial AI investment.

Petulance aside, tests from earlier this year found that AI agents failed to complete tasks up to 70% of the time, making them almost entirely useless as a workforce replacement tool. At best, they're a way for skilled employees to be more productive and save time on low-level tasks, but those tasks were already being handed off to lower-level employees. Having an AI do them and fail most of the time isn't exactly a winning alternative.

Other AI companies are simply doing better, too. Windows Central reports that OpenAI's ChatGPT commands over 61% of the market, and Google's Gemini now sits less than a percentage point behind Microsoft's Copilot at 14%. That's after 12% growth over the last quarter, too, suggesting Gemini is well on its way to becoming the real second-place alternative to ChatGPT.

A Developer Accidentally Found CSAM in AI Data. Google Banned Him For It

Google suspended a mobile app developer’s accounts after he uploaded AI training data to his Google Drive. Unbeknownst to him, the widely used dataset, which is cited in a number of academic papers and distributed via an academic file sharing site, contained child sexual abuse material. The developer reported the dataset to a child safety organization, which eventually resulted in the dataset’s removal, but he says Google’s response has been “devastating.”

A message from Google said his account “has content that involves a child being sexually abused or exploited. This is a severe violation of Google's policies and might be illegal.”

The incident shows how AI training data, which is collected by indiscriminately scraping the internet, can impact people who use it without realizing it contains illegal images. The incident also shows how hard it is to identify harmful images in training data composed of millions of images, which in this case were only discovered accidentally by a lone developer who tripped Google’s automated moderation tools.

💡 Have you discovered harmful materials in AI training data? I would love to hear from you. Using a non-work device, you can message me securely on Signal at @emanuel.404. Otherwise, send me an email at emanuel@404media.co.

In October, I wrote about the NudeNet dataset, which contains more than 700,000 images scraped from the internet, and which is used to train AI image classifiers to automatically detect nudity. The Canadian Centre for Child Protection (C3P) said it found more than 120 images of identified or known victims of CSAM in the dataset, including nearly 70 images focused on the genital or anal area of children who are confirmed or appear to be pre-pubescent. “In some cases, images depicting sexual or abusive acts involving children and teenagers such as fellatio or penile-vaginal penetration,” C3P said.

In October, Lloyd Richardson, C3P's director of technology, told me that the organization decided to investigate the NudeNet training data after getting a tip from an individual via its cyber tipline that it might contain CSAM. After I published that story, a developer named Mark Russo contacted me to say that he’s the individual who tipped C3P, but that he’s still suffering the consequences of his discovery.

Russo, an independent developer, told me he was working on an on-device NSFW image detector. The app runs and classifies images entirely locally, so the content stays private. To benchmark his tool, Russo used NudeNet, a publicly available dataset that’s cited in a number of academic papers about content moderation. Russo unzipped the dataset into his Google Drive. Shortly after, his Google account was suspended for “inappropriate material.”

On July 31, Russo lost access to all the services associated with his Google account, including his Gmail account of 14 years; Firebase, the platform that serves as the backend for his apps; AdMob, the mobile app monetization platform; and Google Cloud.

“This wasn’t just disruptive — it was devastating. I rely on these tools to develop, monitor, and maintain my apps,” Russo wrote on his personal blog. “With no access, I’m flying blind.”

Russo filed an appeal of Google’s decision the same day, explaining that the images came from NudeNet, which he believed was a reputable research dataset with only adult content. Google acknowledged the appeal, but upheld its suspension, and rejected a second appeal as well. He is still locked out of his Google account and the Google services associated with it.

Russo also contacted the National Center for Missing & Exploited Children (NCMEC) and C3P. C3P investigated the dataset, found CSAM, and notified Academic Torrents, where the NudeNet dataset was hosted, which removed it.

As C3P noted at the time, NudeNet was cited or used by more than 250 academic works. A non-exhaustive review of those academic projects found that 134 made use of the NudeNet dataset and 29 relied on the NudeNet classifier or model. But Russo is the only developer we know of who was banned for using it, and the only one who reported it to an organization whose investigation led to the dataset’s removal.

After I reached out for comment, Google investigated Russo’s account again and reinstated it.

“Google is committed to fighting the spread of CSAM and we have robust protections against the dissemination of this type of content,” a Google spokesperson told me in an email. “In this case, while CSAM was detected in the user account, the review should have determined that the user's upload was non-malicious. The account in question has been reinstated, and we are committed to continuously improving our processes.”

“I understand I’m just an independent developer—the kind of person Google doesn’t care about,” Russo told me. “But that’s exactly why this story matters. It’s not just about me losing access; it’s about how the same systems that claim to fight abuse are silencing legitimate research and innovation through opaque automation [...] I tried to do the right thing — and I was punished.”
