Batch Optimize JPG Lossy Linux Command Line with jpeg-recompress

Optimizing your images can feel like black magic sometimes. The safest JPG compression is lossless, meaning no quality is lost (guide), but lossy compression delivers far superior space savings. This guide will show you how to batch optimize JPG images using lossy compression with jpeg-recompress from jpeg-archive on Linux.

In this guide I am batch optimizing 37,000 images on an Ubuntu 16.04 Linux VPS from Vultr.

Back up your images first before attempting any optimizations!

If you want a batch of test images, you can find some here.

Potential Savings Demo

I chose this sunset image; the original was 6.3 MB.

Optimized, it is roughly 1.3 MB, about 21% of its original size!

I performed this lossy compression with jpeg-recompress:

jpeg-recompress --quality medium --method smallfry --min 40 hires.jpg hires.jpg

jpeg-recompress runs several test compressions and analyzes whether the quality difference is noticeable before choosing an appropriate size.

Overall 5073 kilobytes were saved 🙂

Metadata size is 20kb
smallfry at q=67 (40 - 95): 95.344597
smallfry at q=81 (68 - 95): 99.582062
smallfry at q=88 (82 - 95): 103.033356
smallfry at q=84 (82 - 87): 100.918098
smallfry at q=86 (85 - 87): 101.932304
Final optimized smallfry at q=87: 100.375671
New size is 21% of original (saved 5073 kb)

Let’s get started.

Checking Initial Size

Check the initial size of your image folder

du -sh foldername

For example, before optimization the folder was 18 GB

18G     uploads
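If you want a finer-grained picture of where the space is going, du can also break the size down per subfolder. This is a self-contained sketch: the demo-uploads directory and dummy files below only exist so the example runs anywhere; on a real server you would simply run `du -h --max-depth=1 uploads`.

```shell
# Create a throwaway folder with dummy "images" of known sizes.
mkdir -p demo-uploads/2016 demo-uploads/2017
head -c 1048576 /dev/zero > demo-uploads/2016/a.jpg   # 1 MiB dummy file
head -c 2097152 /dev/zero > demo-uploads/2017/b.jpg   # 2 MiB dummy file

# Show the size of each immediate subfolder plus the total.
du -h --max-depth=1 demo-uploads
```

The heaviest year or month folders are usually where lossy recompression pays off the most.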

Optimize JPG Lossy Linux with jpeg-recompress

Install the mozjpeg build dependencies first:

sudo apt-get update
sudo apt-get install build-essential autoconf pkg-config nasm libtool git gettext libjpeg-dev -y

Build mozjpeg. The latest tar.gz can be found here; substitute its URL into the wget line below.

cd /tmp
wget -O mozjpeg.tar.gz <mozjpeg-release-tarball-url>
tar -xf mozjpeg.tar.gz
cd mozjpeg
autoreconf -fiv
./configure --with-jpeg8 --prefix=/usr
sudo make install

Install jpeg-recompress with these commands; make sure you have the bzip2 package installed.

sudo apt-get install bzip2
cd /tmp
wget -O jpeg-archive.tar.bz2 <jpeg-archive-tarball-url>
tar -xf jpeg-archive.tar.bz2
sudo cp jpeg-recompress /usr/bin/jpeg-recompress
sudo chmod 755 /usr/bin/jpeg-recompress

Now that the required dependencies are installed, we can start automating the lossy compression of thousands of JPEG files.

Using jpeg-archive to Lossy Compress JPEGs

First we are going to run some tests on the images to verify the quality is OK, then batch optimize.

The basic command for lossy JPEG compression specifies a --quality; other levels such as high and veryhigh also exist:

jpeg-recompress --quality medium --method smallfry image.jpg output.jpg


Metadata size is 0kb
smallfry at q=77 (60 - 95): 98.259766
smallfry at q=86 (78 - 95): 103.205048
smallfry at q=81 (78 - 85): 99.176430
smallfry at q=83 (82 - 85): 100.903015
smallfry at q=84 (84 - 85): 101.598312
Final optimized smallfry at q=85: 99.523018
New size is 71% of original (saved 78 kb)

You can also specify the minimum quality allowed with --min, and jpeg-recompress will do comparisons to ensure the best compression ratio without losing quality.

This means that if you set --min to 40, jpeg-recompress will never drop below quality 40 while testing quality levels and analyzing the results.

I had good results with the smallfry method so I am using that here.

jpeg-recompress --quality high --method smallfry --min 60 image.jpg image.jpg

Compressing an image to 12% of its original size without noticeable quality loss is pretty awesome.

Metadata size is 56kb
smallfry at q=77 (60 - 95): 102.386467
smallfry at q=68 (60 - 76): 101.549133
smallfry at q=72 (69 - 76): 101.127449
smallfry at q=74 (73 - 76): 101.716339
smallfry at q=75 (75 - 76): 101.718018
Final optimized smallfry at q=76: 100.795944
New size is 12% of original (saved 2228 kb)

If you want to be extra accurate (but slower), you can add the --accurate flag:

jpeg-recompress --quality high --accurate --method smallfry --min 60 image.jpg image.jpg

If you have thousands of images, it's a good idea to use screen to run the optimizations.

Screen will ensure the batch command keeps on running even if your SSH session is terminated.

sudo apt-get install screen

Create a new screen session:

screen
To batch optimize all of your JPGs to lossy versions in a specific folder use the command below.

The find command does recursive searching so all subfolders with images will be processed.
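Before launching the batch, it can help to sanity-check how many files the find expression will actually match. This is a hedged sketch: the demo directory and stand-in files below exist only so the example is runnable; point find at your real folder in practice.

```shell
# Create a few stand-in files so the example runs anywhere.
mkdir -p demo/sub
touch demo/a.jpg demo/b.JPG demo/sub/c.jpg demo/notes.txt

# -iname matches case-insensitively, so b.JPG is counted too.
find demo -type f -iname '*.jpg' | wc -l
# → 3
```

If the count looks wrong (far too many or too few files), fix the path or pattern before kicking off hours of recompression.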

Make sure to adjust the --quality and --min flags based on your previous compression trials.

find /var/www/ -type f -iname '*.jpg' -exec jpeg-recompress --quality medium --min 60 --method smallfry {} {} \;

Detach the screen by pressing Ctrl+A, then D (detach).

You can use the top command and look for jpeg-recompress processes once in a while.
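pgrep is a quicker way to check for a running process by name than scanning top. A minimal sketch follows; since a tutorial snippet can't assume a batch is actually running, sleep stands in for jpeg-recompress, which would show up the same way during a real run.

```shell
# Start a stand-in long-running process in the background.
sleep 30 &

# List matching processes with their full command lines.
pgrep -a sleep

# Clean up the stand-in process.
kill "$!"
```

During a real batch you would run `pgrep -a jpeg-recompress`; no output means the batch has finished (or died).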

Reattach the screen like so

screen -r

Now you can strip EXIF data as well for additional savings.

Stripping EXIF Data with exiftool

exiftool is written in Perl and will strip the metadata out of your images:

sudo apt install exiftool

Removing EXIF data from a single image is done with the command below.

By default a backup is created with _original appended to the file name.

If you just want to overwrite the images without backups, use the -overwrite_original flag:

exiftool -overwrite_original -all= image.jpg

This command will remove all of the EXIF data from all JPGs and replace the originals:

find /var/www/ -type f -iname "*.jpg" -exec exiftool -overwrite_original -all= {} \;

Now it is time to check how much space was saved.

Check Space Savings

Recheck your image folder size

du -sh foldername

Hopefully the savings were significant:

11G     uploads

That is almost a 40% saving without any noticeable quality loss in the JPEG images.
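To put an exact number on the saving, a quick awk one-liner can compute the percentage from the before and after sizes (18 GB and 11 GB, the figures from this run):

```shell
# Percentage saved = (before - after) * 100 / before.
before=18
after=11
awk -v b="$before" -v a="$after" 'BEGIN { printf "saved %.1f%%\n", (b - a) * 100 / b }'
# → saved 38.9%
```

For a more precise figure, substitute byte counts from `du -sb foldername` instead of rounded gigabyte values.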

