this post was submitted on 12 Nov 2025
415 points (98.4% liked)

iiiiiiitttttttttttt

1320 readers
171 users here now

you know the computer thing is it plugged in?

A community for memes and posts about tech and IT related rage.

founded 6 months ago
top 28 comments
[–] Randelung@lemmy.world 5 points 1 day ago

compress ze file

[–] Evotech@lemmy.world 4 points 1 day ago

Me compressing binary data

[–] menas@lemmy.wtf 11 points 1 day ago (2 children)

haha. I spent so much time looking for an efficient compression algorithm for audio and movies ... until I finally understood they are already compressed X) However, it allowed me to discover zstd, which is incredible with text:

 35M test.log
801K test.7z
  32 test.log.7z
  46 test.log.tar.bz2
  45 test.log.tar.gz
 108 test.log.tar.xz
  22 test.log.tar.zst
2,1M test.tar.bz2
8,1M test.tar.gz
724K test.tar.xz
1,1M test.tar.zst
8,1M test.zip
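A rough way to reproduce the comparison (the log below is synthetic, not the original 35 MB file, and the zstd level is my choice):

```shell
# A repetitive log like this is the best case for any codec
# ("test.log" is generated here; the original was a real 35 MB log)
seq 1 200000 | sed 's/^/2025-11-12 INFO worker heartbeat ok, request id /' > test.log

gzip -k9 test.log                                      # -> test.log.gz
xz   -k9 test.log                                      # -> test.log.xz
command -v zstd >/dev/null && zstd -19 -k -q test.log  # -> test.log.zst, if installed

ls -l test.log test.log.gz test.log.xz
```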
[–] sobchak@programming.dev 6 points 1 day ago

Yeah, I found zst recently as well. Quite a bit faster than xz/7z which is what I previously used (or gz when I just needed something fast).

[–] UnrepentantAlgebra@lemmy.world 10 points 1 day ago (2 children)

Am I reading that correctly that test.log.7z compressed 35 MB of text into 32 bytes?

[–] menas@lemmy.wtf 2 points 16 hours ago

I had to check. No, for some reason some archives failed. Not 7z though, but its size is ~800 kB:

35M test.log
804K test.7z    
8,1M test.tar.gz
8,1M test.zip
724K test.tar.xz
2,1M test.tar.bz2
1,1M test.tar.zst
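A ~30-50 byte "archive" of a 35 MB file is a red flag that the archiver wrote nothing but headers. A quick sketch of how to verify an archive before trusting its size (filenames here are made up):

```shell
# Build a known-good archive, then check it the way the failed ones
# should have been checked
printf 'one real line\n' > sample.log
tar czf sample.tar.gz sample.log

gzip -t sample.tar.gz && echo "stream intact"
tar -tzf sample.tar.gz        # lists members; no output would mean an empty archive
```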
[–] rbn@sopuli.xyz 7 points 1 day ago (1 children)

And .tar.zst into 22!?

What was that log file? Millions of identical characters?

[–] Tangent5280@lemmy.world 1 points 1 hour ago

Email written only with the letter 'P'
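Joking aside, a file of one repeated character really is the degenerate best case. A sketch with gzip, since it's installed everywhere (xz or zstd would get down to tens of bytes, matching the 22-byte .zst above):

```shell
# 35 MB of the letter 'P' collapses to a few tens of kilobytes with
# gzip alone; stronger codecs go far smaller still
head -c 35000000 /dev/zero | tr '\0' 'P' > p.log
gzip -k9 p.log
wc -c p.log p.log.gz
```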

[–] Scubus@sh.itjust.works 9 points 1 day ago (2 children)

I don't remember him crushing a tank, is this an edit or did they release another Watchmen movie?

[–] SacralPlexus@lemmy.world 16 points 1 day ago (1 children)

It’s from the 2009 movie, during the montage where he is on Mars and is recounting to himself how he got his powers and his relationship with Janey. Timestamp is 1:10:08.

[–] Scubus@sh.itjust.works 6 points 1 day ago

Ah, thanks. Figured it was just an edit of the climax of the movie. You know the one

[–] golden_zealot@lemmy.ml 4 points 1 day ago

I believe it was a short clip shown when they went into Dr. Manhattan's backstory.

[–] addie@feddit.uk 16 points 1 day ago (2 children)

Well, yeah. The real advantage is only having a single file to transfer, which makes e.g. SFTP a lot less annoying at the command line.

Lossless compression works by storing redundant information more efficiently. If you've got 50 GB in a directory, it's going to be mostly pictures and videos, because that would be an incredible amount of text or source code. Those are already stored with lossy compression, so there's just not much more you can squeeze out.

I suppose you might have 50 GB of logs, especially if you've a logserver for your network? But most modern logging stores in a binary format, since it's quicker to search and manipulate, and doesn't use up such a crazy amount of disk space.
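The single-file advantage in sketch form (directory contents, host, and paths are all placeholders):

```shell
# Bundle a directory into one archive so the transfer is a single stream
mkdir -p photos
printf 'fake jpeg 1' > photos/a.jpg
printf 'fake jpeg 2' > photos/b.jpg
tar cf photos.tar photos/          # plain tar: just concatenates, very fast

# Streaming variant, skipping the intermediate file entirely:
# tar cf - photos/ | ssh user@nas 'tar xf - -C /srv/backup'
```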

[–] webhead@sh.itjust.works 7 points 1 day ago

I actually just switched a backup script of mine from a tar.gz to a plain tar file. It's a little bigger, but the process is so much faster that I don't care about the small extra bit of compression (100 GB vs 120 GB transferred over a 1 Gbit connection). The whole reason I do this is, like you said, that transferring one file over the Internet is a billion times faster than transferring many, BUT you don't need the gzip step just for that.
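This is easy to demonstrate: on data that is already compressed, the gzip step buys almost nothing (the random bytes below stand in for already-compressed media):

```shell
# Plain tar vs tar.gz on incompressible input
mkdir -p media && head -c 1000000 /dev/urandom > media/video.bin
tar cf  media.tar    media/        # archive only
tar czf media.tar.gz media/        # archive + gzip: slower, barely smaller
wc -c media.tar media.tar.gz
```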

[–] aev_software@programming.dev 4 points 1 day ago (1 children)

Meh, that barely fits an empty MS Word doc...

/s

[–] Honytawk@feddit.nl 1 points 15 hours ago

Can we transcribe it to a QR code?

[–] xoggy@programming.dev 69 points 2 days ago (1 children)

Turns out we're already doing a lot of compression at the file and filesystem level.

[–] Opisek@piefed.blahaj.zone 30 points 2 days ago (2 children)

Not necessarily. For example, you can't really compress encrypted files. You can certainly try, but the result will likely be what the meme portrays.
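A sketch of why, using /dev/urandom as a stand-in for ciphertext (good encryption output is statistically indistinguishable from random bytes):

```shell
# No patterns means nothing for the compressor to exploit
head -c 1000000 /dev/urandom > cipher.bin
gzip -k9 cipher.bin
wc -c cipher.bin cipher.bin.gz     # the .gz is no smaller, usually a bit larger
```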

[–] Randelung@lemmy.world 28 points 2 days ago (1 children)

Turns out pseudo-random byte streams don't really repeat that often.

We just need to develop an algorithm to compress random byte streams, easy

[–] WolfLink@sh.itjust.works 6 points 1 day ago (2 children)

Media files are always thoroughly compressed (except in certain settings like professional video and audio work).

[–] rainwall@piefed.social 4 points 1 day ago (1 children)

Media files can benefit from a codec change. Going from H.264 to H.265/HEVC can net a 30-50% reduction in size with almost no quality loss.

The only trade-off is increased CPU usage if the client doesn't have hardware H.265 support, plus the time it takes to do the transcoding.
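As a sketch, the kind of re-encode meant here (filenames and the CRF value are placeholders, not from the thread; tune CRF to taste):

```shell
# H.264 input to H.265/HEVC at similar visual quality, audio untouched
ffmpeg -i input_h264.mp4 -c:v libx265 -crf 26 -preset medium -c:a copy output_h265.mp4
```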

[–] ulterno@programming.dev 2 points 1 day ago

And then comes AV1, with crazily varying quality/compression for different source materials.

[–] Opisek@piefed.blahaj.zone 1 points 1 day ago (1 children)

I don't see how that relates to encryption.

[–] WolfLink@sh.itjust.works 1 points 1 day ago

It’s more relevant to the previous comment as an example of how we are doing a lot of compression at the filesystem level.

The files that are typically largest are already quite thoroughly compressed.

[–] MrLLM@ani.social 21 points 2 days ago* (last edited 2 days ago)

Plot twist: your file manager reports directories in GB, but file sizes converted to GiB while still labelled as GB


[–] dontsayaword@piefed.social 16 points 2 days ago
[–] your_good_buddy@lemmy.world 13 points 2 days ago

"Look what they need to mimic a fraction of our compression ratio." - tar cJf said to tar cjf
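For anyone who missed the joke: the capital letter picks the codec (assumes xz and bzip2 are installed alongside tar):

```shell
# Capital J selects xz (the stronger "compression ratio" side of the joke),
# lowercase j selects bzip2
printf 'same bytes, different codecs\n' > note.txt
tar cJf note.tar.xz  note.txt
tar cjf note.tar.bz2 note.txt
```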