this post was submitted on 03 Sep 2025
35 points (100.0% liked)

Linux


Ncdu takes ages to run on my system. It's only 500GB+ of storage space, but it takes roughly an hour (probably a bit less) to finish scanning. Is there any alternative that either constantly monitors my files, so it always knows the sizes when I want to navigate, or is significantly faster than ncdu?

top 21 comments
[–] Emma_Gold_Man@lemmy.dbzer0.com 41 points 2 weeks ago* (last edited 2 weeks ago)

Advice from a long-time sysadmin: You're probably asking the wrong question. ncdu is an efficient tool, so the right question is why it's taking so long to complete, which probably points to an underlying issue with your setup. There are three likely answers:

  1. This drive is used on a server specifically to store very large numbers of very small files. This probably isn't the case, as you'd already know that and be looking at it in smaller chunks.
  2. You have a network mount set up. Use the -x option to ncdu to restrict your search to a single filesystem, or --exclude to exclude the network mount and your problem will be solved (along with the traffic spike on your LAN).
  3. You have a single directory with a large number of small files that never get cleared, such as an e-mail dead-letter folder or a program creating temp files outside the temp directories. Once a certain number of files is reached, accessing the directory slows down dramatically. The following command will find it for you (reminder: make sure you understand what a command does before copying it into a terminal, DOUBLY so if it is run as root or has a sudo in it). Note that this will probably take several times as long to run as ncdu, because it does several manipulations in series rather than in parallel.

sudo find $(grep '^/' /etc/fstab | awk '{print $2}') -xdev -type f -exec dirname {} \; | sort | uniq -c | sort -nr | head

Explanation: This command doesn't give an exact file count, but it's good enough for our purposes.

sudo find # run find as root

$( ... ) # Run this in a subshell - it's the list of mount points we want to search

grep '^/' /etc/fstab # Get the list of non-special local filesystems that the system knows how to mount (ignores many edge-cases)

awk '{print $2}' # We only want the second column - where those filesystems are mounted

-xdev # tell find not to cross filesystem boundaries

-type f # We want to count files

-exec dirname {} \; # Ignore the file name, just list the directory once for each file in it

sort | uniq -c # Count how many times each directory is listed (how many files it has)

sort -nr # Order by count descending

head # Only list the top 10

If they are temp files or otherwise not needed, delete them. If they're important, figure out how to break it into subdirectories based on first letter, hash, or whatever other method the software creating them supports.

[–] TechnoCat@lemmy.ml 12 points 2 weeks ago* (last edited 2 weeks ago) (1 children)

Taking an hour doesn't sound right. Is it a disk or solid state? Do you have an unusual amount of directory hierarchy?

If you have a disk, does it report any SMART errors?

Which filesystem are you using?
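For the SMART question, a quick sketch using smartmontools; `/dev/sda` is an assumption, substitute your actual drive:

```shell
# Overall health verdict, plus the attributes that usually flag a dying disk.
# /dev/sda is a placeholder for your actual drive.
sudo smartctl -H /dev/sda
sudo smartctl -A /dev/sda | grep -Ei 'reallocated|pending|uncorrect'
```

Nonzero reallocated or pending sector counts are a common cause of reads slowing to a crawl.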

[–] stsquad@lemmy.ml 9 points 2 weeks ago* (last edited 2 weeks ago)

Yeah, I don't think this is an ncdu issue; something is broken with the OP's system.

[–] why0y@lemmy.ml 8 points 2 weeks ago (2 children)

du-dust is a high-performance disk usage tool written in Rust. It scans terabytes in seconds.

[–] Wolfram@lemmy.world 1 points 2 weeks ago

Also use dust. Great for visualizing directory trees and seeing where all the bigger files lie.

[–] Mozart409@lemmy.world 1 points 2 weeks ago

I can confirm that.

[–] brownmustardminion@lemmy.ml 4 points 2 weeks ago

I'd like to know as well, but it seems strange for ncdu to take that long. I scan through terabytes within a few seconds.

[–] Morphit@feddit.uk 4 points 2 weeks ago

If your filesystem is btrfs, then use btdu. It doesn't get confused by snapshots, and it shows you the current best estimates while it's still in the process of sampling.

[–] MangoPenguin@lemmy.blahaj.zone 4 points 2 weeks ago* (last edited 2 weeks ago) (2 children)

An hour is crazy, something definitely isn't right.

That said, ncdu is still pretty slow; large scans can take several minutes if there are lots of small files.

I wish there was a WizTree equivalent for Linux that just loaded the MFT nearly instantly instead of scanning everything.

[–] xycu@programming.dev 1 points 2 weeks ago

MFT is specific to NTFS

[–] meekah@lemmy.world 0 points 2 weeks ago (1 children)
[–] MangoPenguin@lemmy.blahaj.zone 2 points 2 weeks ago

Not a clue how tbh, I'm not much of a programmer.

[–] Magister@lemmy.world 3 points 2 weeks ago

I'm using baobab here, it scans my 500GB in a few seconds

https://apps.gnome.org/Baobab/

[–] Strit@lemmy.linuxuserspace.show 2 points 2 weeks ago

There is Filelight in Plasma, but it's only fast because it has access to Baloo, Plasma's file index. I use ncdu extensively though. Lots of small files and folders take a long time, but if it's big files and few folders it's near instant.

[–] fratermus@lemmy.sdf.org 2 points 2 weeks ago

Ncdu

I learn something new every day. I've been running du -a | sort -rn | head like some kind of animal. ncdu runs very fast on my systems and shows me what I want to see. Thanks!

[–] LuisMascio@lemmy.zip 2 points 2 weeks ago* (last edited 2 weeks ago)

Gdu is faster

[–] balsoft@lemmy.ml 1 points 2 weeks ago

Are you using ncdu or ncdu_2? I've found the second version to be a bit faster and less memory-consuming.

[–] thenose@lemmy.world 1 points 2 weeks ago

Dua works pretty great for me

dua i is the command I use for an interactive session. I use it on my 4TB drive and it analyses everything in a few seconds, biggest directories first.

[–] Comexs@lemmy.zip 1 points 2 weeks ago* (last edited 2 weeks ago)
[–] furrowsofar@beehaw.org 0 points 2 weeks ago

Is there a reason not to just use du? Or use either and just look at certain trees? Or just get a bigger drive so it doesn't matter?
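For the "just look at certain trees" approach, a plain-du sketch (GNU du and sort assumed; the path is an example):

```shell
# Summarize a single tree, two levels deep, largest entries first.
# -x stays on one filesystem; -h / sort -rh give human-readable sizes.
du -xh --max-depth=2 /home 2>/dev/null | sort -rh | head
```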

[–] Sxan@piefed.zip -4 points 2 weeks ago

I'll echo everyone else: there are several good tools, but ncdu isn't bad. Pathological cases, already described, will cause every tool issues, because no filesystem provides any sort of rolled-up, constantly updated, per-directory sum of nodes in the FS tree - at least, none I'm aware of. And it would have to be done at the FS level; any tool watching every directory node in your tree to constantly update subtree sizes would eventually cause other performance issues.

It does sound as if you're having

  • filesystem issues, e.g. corruption
  • network issues, e.g. you have remote shares mounted which are being included in the scan (Gnome mounts user remotes in ~/.local somewhere, IIRC)
  • hardware issues, e.g. your disk is going bad
  • pathological filesystem layout, e.g. some directories containing thousands of inodes

It's almost certainly one of those: two of which you can thank ncdu for bringing to your attention, one which is easily bypassed with a flag, and the last maybe just needing cleanup or exclusion.