Saving a bit of bandwidth
I usually try to shoot my photos with my Sony system camera. But even if you're using your phone, photos easily end up at a couple of MB per file. That is not practical for a website, or to be more specific, for this blog. I also do not want to manually convert every image I use down to a reasonable size.
So, why not do that automatically? I wanted to have the following process:
- Create a folder for the post
- Select photos and dump them into a folder called raw_images
- Rename and order the photos
- Call the script
The script should therefore create a folder called images, containing all the images under the same names, but compressed. Further, the blog is stored in my private git forge, and I do not want to store the bigger originals there. A simple gitignore entry for raw_images prevents those directories from being added to the repository.
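Assuming the posts live under src/content/posts as in the script below, the ignore rule is a single line; a pattern without a leading slash matches at any depth, so one entry in the repository's .gitignore covers every post:

```
# keep the uncompressed originals out of the repository
raw_images/
```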
The Script
The script itself is very basic and uses the ImageMagick package. We iterate through all directories in our posts directory. For each post, we check whether there is a directory called raw_images. If not, there is nothing to do. If there is, we create a corresponding images directory if none exists yet. Then, for all files in raw_images, we convert them with magick to fit within 1600x1200 at a quality of 80.
ImageMagick adheres to the aspect ratio of the input file: 1600x1200 is treated as a bounding box, and the image is scaled to fit inside it. You can force the exact size with -resize 1600x1200!, but I do not want stretched images. ;D The effect of the quality setting depends on the input file type. I'm using jpg files, so it is hardcoded for this use case. After some quick tests, 80 seemed like a good trade-off between image quality and file size. See the ImageMagick documentation for more information on that.
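To see what the bounding-box geometry does to the dimensions, here is a small, hypothetical helper in plain shell arithmetic that mimics ImageMagick's fit-within behavior. It is not part of the script, just an illustration; integer division may be off by a pixel compared to ImageMagick's rounding.

```shell
#!/usr/bin/env sh
# fit_box WIDTH HEIGHT BOX_W BOX_H
# prints the dimensions that a default -resize BOX_WxBOX_H produces:
# the image is scaled to fit inside the box, keeping its aspect ratio
fit_box() {
    w=$1; h=$2; box_w=$3; box_h=$4
    if [ $((w * box_h)) -ge $((h * box_w)) ]; then
        # width is the limiting dimension
        echo "${box_w}x$((h * box_w / w))"
    else
        # height is the limiting dimension
        echo "$((w * box_h / h))x${box_h}"
    fi
}

fit_box 4000 3000 1600 1200   # 4:3 landscape -> 1600x1200
fit_box 6000 2000 1600 1200   # panorama     -> 1600x533
fit_box 3000 4000 1600 1200   # portrait     -> 900x1200
```

Only an image whose aspect ratio matches the box exactly ends up at the full 1600x1200; everything else keeps its shape and shrinks to fit.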
#!/usr/bin/env zsh

CONTENT_PATH=$PWD/src/content/posts

for PROJECT_PATH in ${CONTENT_PATH}/*; do
    if [[ ! -d $PROJECT_PATH ]]; then
        continue
    fi
    POST_NAME=$(basename "$PROJECT_PATH")
    echo "Got post: $POST_NAME"
    IMAGE_PATH=$PROJECT_PATH/images
    RAW_IMAGE_PATH=$PROJECT_PATH/raw_images
    if [[ ! -d $RAW_IMAGE_PATH ]]; then
        # if the post has no raw_images folder, there is nothing to do
        echo "Post has no raw_images path, skipping"
        continue
    fi
    if [[ ! -d $IMAGE_PATH ]]; then
        # if the post has raw_images but no images directory yet, create it
        mkdir -p "$IMAGE_PATH"
    fi
    # convert all images that have no converted copy yet
    for RAW_IMAGE in "$RAW_IMAGE_PATH"/*; do
        IMAGE_NAME=$(basename "$RAW_IMAGE")
        OUTPUT_FILE=$IMAGE_PATH/$IMAGE_NAME
        if [[ -f $OUTPUT_FILE ]]; then
            echo "$IMAGE_NAME: converted output file already exists, skipping"
            continue
        fi
        magick "$RAW_IMAGE" -resize 1600x1200 -quality 80 "$OUTPUT_FILE"
        FILE_SIZE=$(du -h --apparent-size "$RAW_IMAGE" | cut -f1)
        NEW_FILE_SIZE=$(du -h --apparent-size "$OUTPUT_FILE" | cut -f1)
        echo "$IMAGE_NAME: $FILE_SIZE -> $NEW_FILE_SIZE"
    done
done
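The du lines only print human-readable sizes. If you would rather see the relative saving, a tiny hypothetical helper (not part of the script above) can turn two byte counts into a percentage; on Linux, raw byte counts could come from stat -c %s "$FILE".

```shell
#!/usr/bin/env sh
# percent_saved OLD_BYTES NEW_BYTES
# integer percentage of space saved by the conversion
percent_saved() {
    echo "$((100 - $2 * 100 / $1))"
}

# e.g. a 3 MB raw photo compressed down to 600 kB
percent_saved 3000000 600000
```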
Bloggyness
That should reduce the time I spend converting all those photos manually. Let's see if it increases my output here ;D Writing blog posts is harder than I anticipated: you have to find time to do something cool and then also write about it. This post is a test to see if "lower effort" posts are also fun. It feels good to write these lines and publish them! Thanks for reading and have a lovely day!
Last modified on 2025-03-27