does anyone have recommendations for how they downsize or optimize images on their wiki/site? i'd like to run all images through a downscaling and maybe dithering script before linking? not sure what other people do / use, and while i'm tempted to write my own approach, i wanna learn more first


@metasyn imagemagick will do what you want given sufficient patience

@metasyn I’m using imagemagick as it’s the most widely available app on servers. Here’s the bash script:
Note that this script creates multiple sizes and image formats; that’s because I use the HTML source element to always serve the most compressed image at the correct size:
If you want an example of dithering with imagemagick:
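The script itself isn't linked here, but a minimal sketch of the multi-size idea (widths, filenames, and quality setting are my own assumptions, not the original script's) could look like:

```shell
#!/bin/sh
# Sketch: render a source image at several widths so an HTML <source>/srcset
# can serve the smallest file that fits. Sizes and names are placeholders.
src="photo.jpg"

# Create a small sample input just so this sketch is runnable on its own.
convert -size 1600x900 gradient:blue-white "$src"

for w in 320 640 1280; do
  # JPEG at this width; the same loop could emit webp etc. per format.
  convert "$src" -resize "${w}x" -strip -quality 75 "photo-${w}.jpg"
done
```

Each output then becomes one candidate in the `srcset` of a `<source>` element.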

@metasyn imagemagick! Here’s an example:

convert -strip car.jpg -resize 350x350 -define jpeg:extent=32kb car-compressed.jpg

@alexw @metasyn I use this exact same thing except with an ordered dither as well. I think I got the idea from you actually on flounder. Gives an example like this:
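For reference, an ordered dither bolts onto the same kind of convert pipeline; something like this (the o8x8 threshold map is just one choice, and the filenames are placeholders):

```shell
# Sketch: downscale, convert to grayscale, then apply an 8x8 ordered dither.
convert -size 400x300 gradient:black-white photo.png   # sample input for the sketch
convert photo.png -resize 350x350 -colorspace gray -ordered-dither o8x8 photo-dithered.png
```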

@ols @metasyn very cool! I think dithering often compresses images in a cooler-looking way than jpg, but less efficiently

@metasyn Do you know Low Tech Magazine? Their solar version uses dithering and several smart tricks to serve maximum information with minimum bandwidth.

@metasyn yeah, just play with ImageMagick:

convert -auto-orient a.jpg \
-resize 400 \
-colorspace gray \
-posterize 24 +dither \
-quality 70 \
-strip out.jpg

-strip removes all metadata, which has a big effect on images that are only a few kB

@metasyn I made a dithering tool using a JS library of which I've now forgotten the name. I run images through it first, and then I run the dithered images through Trimage for compression. I make no claims that this is the best approach; but it ain't been broke, so I ain't never fixed it. :)

@metasyn long time ago I wrote a very basic gallery where I'd simply upload a folder of pictures, and when that gallery folder was first visited, PHP's ImageMagick module would generate thumbnails into a special folder. So, yeah, the PHP ImageMagick module, I think.
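Not the original PHP, but the same thumbnail pass can be sketched with ImageMagick's mogrify from a shell (folder and file names are assumptions):

```shell
# Sketch: generate thumbnails for every JPEG in a gallery folder
# into a .thumbs/ subfolder, like the PHP gallery described above.
mkdir -p gallery/.thumbs
convert -size 800x600 gradient:green-black gallery/pic1.jpg   # sample input
mogrify -path gallery/.thumbs -thumbnail 200x200 -strip gallery/*.jpg
```

mogrify writes each thumbnail under the same filename in the -path directory, so the gallery page just swaps the folder in the img src.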

@metasyn there’s also ‘lid’, a python script that does similar things to the other solutions proposed:

@chotrin interesting! I'll have to read through their source and see what they're up to

@chotrin spent some time poking around gopher:// and came across this post:


which led me to:

which seems pretty neat. I wonder if there is a way I can make the (nearly) exact same content available on gemini, gopher, _and_ http 🤔

@metasyn oh man, that’s seriously goals if you do get it going. I’m thinking I’d like to do something similar, but with a git or fossil repo as the content source...

@chotrin yeah just an idea still... But right now my core wiki content is all plain text, and replacing images with a braille img or just removing them and dumping all that in a different directory would be simple... I mostly want to learn how (if?) I can write middleware that accepts everything at the same root host and then routes based on protocol 🤔

@metasyn oooo fun! Something SCGI-esque, but to multiple daemons! That’s awesome

@metasyn When I downsample images, I usually use a couple of techniques. Not really automated, but that's not the point.

First I crop manually, then resize to the desired resolution, then -- if I'm really looking to optimize -- I tend to do one of the following:

- Line art? Convert to PNG (if not already there), then run optipng -o7 on it
- Pictures for the web? Convert to webp, and/or:

Final steps would be to strip metadata and do spot checks.
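Put together, that manual workflow might look something like this as a one-off shell session (crop geometry, sizes, and filenames are made up for the sketch; the optipng pass is commented out since it may not be installed):

```shell
# Sketch of the manual workflow above: crop, resize, strip, spot-check.
convert -size 1000x800 gradient:white-black original.png          # sample input
convert original.png -crop 800x600+100+100 +repage -resize 640x cropped.png
# For line art: optipng -o7 cropped.png
convert cropped.png -strip final.png
identify -format "%f %wx%h %b\n" final.png                        # spot check
```

The identify line is the "spot check": it prints the final filename, dimensions, and byte size so you can sanity-check the result before linking it.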

