On Windows you have a GUI that makes it easy to spot large folders taking up space. Most Linux servers are headless (i.e. no GUI desktop), because a graphical interface would consume precious resources better spent on your web server software. Linux systems still accumulate large folders of files you are not using, and they need cleaning up. This is especially true on a VPS or dedicated server with SSD drives, which are very fast but also expensive per GB, so you normally get less space. It is critical to keep your server clean so you are not spending money storing unnecessary files!
I have used this technique multiple times on Codeable since Dan Dulaney, a fellow Codeable expert from WP Tech Guides, shared this time-saving tip with me.
This tutorial shows you how to use ncdu to identify large folders taking up space on your system; an alternative technique using tree and du is outlined as well.
Using ncdu to Identify Large Folders
Install ncdu on Debian or Ubuntu with this command
sudo apt-get install ncdu
On CentOS 6.8 and greater
yum install epel-release
yum install ncdu
Enter the root directory (not the root user’s home directory /root)
cd /
Execute ncdu
ncdu
After ncdu scans your entire file system you will be shown the interface below.
Using ncdu you get an awesome drill-down menu. Folders are sorted by size in descending order, from biggest to smallest.
The idea is that you enter the largest folder first, which takes you to a similar menu showing its subfolders, again sorted from biggest to smallest.
Drilling down this way, you can see which of those subfolders are taking up the most space.
--- / --------------------------------------------------------------------------
14.4 GiB [##########] /var
10.9 GiB [####### ] /root
1.3 GiB [ ] /usr
1.0 GiB [ ] /lib
1.0 GiB [ ] swapfile
197.7 MiB [ ] /boot
96.2 MiB [ ] /tmp
40.2 MiB [ ] /run
15.6 MiB [ ] /bin
12.9 MiB [ ] /sbin
7.3 MiB [ ] /etc
e 16.0 KiB [ ] /lost+found
12.0 KiB [ ] /media
4.0 KiB [ ] /lib64
e 4.0 KiB [ ] /srv
e 4.0 KiB [ ] /snap
e 4.0 KiB [ ] /opt
e 4.0 KiB [ ] /mnt
e 4.0 KiB [ ] /home
. 0.0 B [ ] /proc
0.0 B [ ] /sys
Total disk usage: 29.0 GiB Apparent size: 29.0 GiB Items: 449055
I use the arrow keys to choose folders and, when I am done, press q to quit and then remove the unnecessary files.
Here are the keys for controlling ncdu; you should be able to get by with the arrow keys to navigate and q to quit.
up, k        Move cursor up
down, j      Move cursor down
right/enter  Open selected directory
left, <, h   Open parent directory
n            Sort by name (ascending/descending)
s            Sort by size (ascending/descending)
C            Sort by items (ascending/descending)
d            Delete selected file or directory
t            Toggle dirs before files when sorting
g            Show percentage and/or graph

This is by far the fastest way I have learned to identify large folders on Linux systems.
If you know you have a super large folder that you need to keep, you can tell ncdu to exclude that folder from scanning to save time.
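For example, assuming your large folder is named backups (a placeholder name for illustration), a sketch of an exclusion looks like this; ncdu's -o flag additionally saves the scan to a file so you can re-open it later with ncdu -f instead of rescanning:

```shell
# Skip a known-large folder during the scan and save the results to a file.
# "backups" is a placeholder folder name; adjust it to your own setup.
# -0 suppresses scan feedback, -x stays on one filesystem, -o exports results.
if command -v ncdu >/dev/null 2>&1; then
    ncdu -0 -x --exclude=backups -o /tmp/ncdu-scan /
    # Browse the saved results later without rescanning:
    # ncdu -f /tmp/ncdu-scan
else
    echo "ncdu is not installed"
fi
```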
ncdu --exclude=foldername

Tree and du
On Ubuntu and Debian systems
sudo apt-get install tree
On CentOS
yum install tree
You can run this command inside the folder you know or suspect to be large
tree --du -d -shaC | grep -Ev '( *[^ ]* ){2}\[' | more
Press space to jump to the next page and q to quit displaying additional pages.
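On a deep directory tree the full listing can be overwhelming. tree's -L flag limits how many levels it descends; this variant is a sketch using the same grep filter as above:

```shell
# Same listing, but only two levels deep (-L is tree's depth-limit flag)
if command -v tree >/dev/null 2>&1; then
    tree --du -d -shaC -L 2 | grep -Ev '( *[^ ]* ){2}\[' | more
else
    echo "tree is not installed"
fi
```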
Pure du solution
When you don’t have root access, plain du may be all you can use.
du --human-readable --max-depth=1 | sort --human-numeric-sort --reverse
Its output gives a quick overview
20G ./wp-content
16M ./wp-includes
7.8M ./wp-admin
Here is another that shows more depth
du --human-readable --max-depth=2 | sort --human-numeric-sort --reverse
Its output has more detail
20G ./wp-content/uploads
20G ./wp-content
20G .
34M ./wp-content/themes
16M ./wp-includes
7.8M ./wp-admin
5.6M ./wp-includes/js
2.4M ./wp-admin/css
2.3M ./wp-admin/includes
1.6M ./wp-admin/js
1.0M ./wp-includes/ID3
688K ./wp-includes/css
508K ./wp-includes/SimplePie
480K ./wp-content/houzez-new
464K ./wp-admin/images
436K ./wp-includes/rest-api
344K ./wp-includes/Requests
344K ./wp-includes/images
280K ./wp-includes/certificates
256K ./wp-content/plugins
208K ./wp-includes/customize
204K ./wp-admin/network
196K ./wp-includes/fonts
144K ./wp-includes/widgets
88K ./wp-includes/Text
64K ./wp-includes/random_compat
60K ./wp-includes/IXR
56K ./wp-includes/pomo
40K ./wp-includes/theme-compat
36K ./wp-admin/user
16K ./wp-content/ai1wm-backups
12K ./wp-admin/maint
4.0K ./wp-content/upgrade
Here are some useful commands that trim the . and include a total
# Quick overview (1 level)
du -h --max-depth=1 . 2>/dev/null | sort -hr
# Detailed view (2 levels)
du -h --max-depth=2 . 2>/dev/null | sort -hr | head -20
# With totals
du -hc --max-depth=2 . 2>/dev/null | sort -hr
Here is my one-liner for listing directories and their subdirectories one level deep, in descending order, on systems without ncdu or tree, using du only
for FOLDER in $(find . -maxdepth 1 -type d ! -name '.' 2>/dev/null | sort); do SIZE=$(du -sh "$FOLDER" 2>/dev/null | cut -f1); echo ""; echo "$(tput setaf 2)$(tput bold)$FOLDER$(tput sgr0) $(tput setaf 3)[$SIZE]$(tput sgr0)"; du -h --max-depth=1 "$FOLDER" 2>/dev/null | sort -hr | tail -n +2 | sed "s#$FOLDER/##g" | awk '{printf " ├─ %-50s %s\n", $2, $1}'; done
This will get you the following output
./.tmb [1.1M]
./.well-known [28K]
├─ acme-challenge 24K
./wp-admin [11M]
├─ includes 3.2M
├─ css 2.9M
├─ js 2.2M
├─ images 640K
├─ network 216K
├─ user 44K
├─ maint 12K
./wp-content [855M]
├─ updraft 509M
├─ plugins 176M
├─ uploads 102M
├─ cache 53M
├─ themes 12M
├─ ewww 4.6M
├─ mu-plugins 32K
├─ upgrade-temp-backup 8.0K
├─ aocritcss 8.0K
├─ upgrade 4.0K
./wp-includes [56M]
├─ js 32M
├─ blocks 4.1M
├─ css 3.8M
├─ sodium_compat 1.7M
├─ ID3 1.2M
├─ SimplePie 1.1M
├─ rest-api 1.1M
├─ html-api 552K
├─ Requests 456K
├─ images 364K
├─ fonts 344K
├─ customize 252K
├─ PHPMailer 236K
├─ certificates 228K
├─ widgets 192K
├─ block-supports 180K
├─ Text 92K
├─ sitemaps 76K
├─ pomo 68K
├─ interactivity-api 64K
├─ style-engine 60K
├─ IXR 60K
├─ assets 60K
├─ theme-compat 44K
├─ l10n 44K
├─ block-patterns 32K
├─ block-bindings 12K
├─ php-compat 8.0K
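If you prefer a readable script over a one-liner, the same loop can be written out with comments. This is a sketch using only coreutils; the function name dir_sizes is my own choice, and it assumes directory names without spaces:

```shell
#!/bin/sh
# Readable version of the one-liner above: for each directory one level
# below the given path, print its total size, then its own subdirectories
# sorted largest first. Assumes no spaces in directory names.
dir_sizes() {
    for FOLDER in $(find "$1" -maxdepth 1 -type d ! -path "$1" | sort); do
        SIZE=$(du -sh "$FOLDER" 2>/dev/null | cut -f1)
        printf '\n%s [%s]\n' "$FOLDER" "$SIZE"
        # List subfolders, drop the parent's own total line (tail -n +2),
        # strip the parent prefix, and indent with a tree branch
        du -h --max-depth=1 "$FOLDER" 2>/dev/null | sort -hr | tail -n +2 |
            sed "s#$FOLDER/##g" |
            awk '{printf " ├─ %-50s %s\n", $2, $1}'
    done
}

dir_sizes .
```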
Here is one last variation
du -k -c --max-depth=1 | grep -v \\.$ | sort -rn| while read p; do echo ""; k=`echo $p|awk '{print $2}'` ; if [ $k == "total" ] ; then echo ""; else echo $k ; du -k --max-depth=1 $k |sort -rn | while read p1; do echo $p1| awk ' BEGIN { split("KB,MB,GB,TB", Units, ","); } { u = 1; while ($1 >= 1024) { $1 = $1 / 1024; u += 1 ; } $2 = sprintf("\t\t%60.60s\t\t\t%9.1f%s", $2,$1, Units[u]); print $2}' ; done; fi ; done; du -k -c --max-depth=1| grep -i "total$"|awk '{$1 = sprintf("\t\t%60.60s\t\t\t%9.1f MB",$2,$1/1024); print $1}';
Enjoy 🙂
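A final tip: large folders are often dominated by a handful of huge files. Once you have located a heavy directory, find can list the individual offenders. This is a sketch using standard GNU find and sort; the 100M threshold is just an example value:

```shell
# List files over 100 MB under the current directory, biggest first
find . -type f -size +100M -exec du -h {} + 2>/dev/null | sort -hr | head -20
```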
Sources
Combine the Best of tree and du
Unix Sort Command
du for subdirectories
Changing Color Output in bash
