The 'jpeg-archive' tool, if anyone was wondering, appears from the source code not to produce an archive file of any kind, just a directory of lossily compressed files. It does no deduplication at the file or image level, stores no hash sums, creates no forward error correction information, does not shard or store the data across multiple storage services under the LOCKSS principle, and otherwise doesn't do anything relevant to long-term archival storage. The title of the submission and of the tool are more aspirational than descriptive.
Yup, it just re-compresses the photos to save some space, potentially sacrificing the quality of the originals. Nothing to do with backing up files securely. It's also not something any photographer would do to their photos, but I guess it can be useful when you have tons of holiday snapshots where maximum quality, or the ability to edit raw photos in the future, is not a concern.
I imagine this tool was built to serve a need to store lots of images online, and that there are wider use cases for image libraries and e-commerce stores where lots of product images need to be served efficiently.
Sadly it is not 'jpeg-and-png', and you do need to keep some images in PNG because you cannot have transparency in JPEG.
Much can be done with 4:2:0 chroma subsampling and the optimiser tools Google recommend. In the PNG world I like pngquant for some images but not all. It is also possible to prepare images in ImageMagick to suit an optimiser.
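For what it's worth, the manual version of that preparation looks roughly like this (a sketch only, assuming ImageMagick's convert and pngquant are installed and on PATH; the file names and quality numbers are placeholders, not recommendations):

    # Rough sketch: force 4:2:0 chroma subsampling with ImageMagick and
    # quantise a PNG with pngquant. Assumes both tools are installed;
    # file names and quality numbers are placeholders.
    import subprocess

    # Re-encode a JPEG with 4:2:0 subsampling, stripped metadata, quality 85.
    subprocess.run(
        ["convert", "photo.jpg",
         "-sampling-factor", "4:2:0", "-strip", "-quality", "85",
         "photo-420.jpg"],
        check=True,
    )

    # Reduce a PNG to an optimised 8-bit palette within a quality range.
    subprocess.run(
        ["pngquant", "--quality=65-80", "--output", "logo-small.png", "logo.png"],
        check=True,
    )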
However, storage is cheap and there are better ways. Google's PageSpeed module for NGINX and Apache works wonders at serving optimised images. It abstracts the problem away from the code, so there is no need to worry about image sizes over the web, whether in pixels or in bytes.
I don't think this tool is about providing a professional archiving solution, just about efficiently processing an online image library so that you have an extremely practical archive regardless of what compression ratio the images were originally uploaded at.
The use of all cores to quickly process all images was instructive. However, in real life it is more useful to have a cron job that slowly brings all images up to the desired standard in the background, rather than one that uses every core at once.
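A minimal sketch of that kind of background pass (not how jpeg-archive itself works; it assumes the jpeg-recompress binary from jpeg-archive is on PATH, and the image path is a placeholder):

    # Walk an image directory and recompress one file at a time at the lowest
    # CPU priority, so a nightly cron job never competes with real work.
    # Assumes jpeg-recompress (from jpeg-archive) is installed; paths are made up.
    import os
    import subprocess
    from pathlib import Path

    os.nice(19)  # drop to the lowest scheduling priority

    for jpg in sorted(Path("/var/www/images").rglob("*.jpg")):
        tmp = jpg.with_name(jpg.name + ".tmp")
        # One image at a time, one core at a time.
        result = subprocess.run(["jpeg-recompress", str(jpg), str(tmp)])
        if result.returncode == 0:
            tmp.replace(jpg)
        elif tmp.exists():
            tmp.unlink()

Scheduled from crontab (say, "0 3 * * * python3 /usr/local/bin/recompress-pass.py"), it slowly converges the whole library without ever pegging the machine.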
I would like to see some comparisons against more commonly used JPEG optimisers.
What layer of the filesystem it should be handled at can be debated (a tar with separate PAR2s? a single archive split into multiple ZFEC shares?), but that sort of thing is precisely what a long-term archiving tool should be handling for you. I don't want to mess with par2create by hand any more than I want to try ImageMagick on multiple settings to get an acceptable tradeoff.
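For reference, the by-hand routine in question is roughly this (a sketch only, assuming the par2 command-line tool is installed; the directory name and the 10% redundancy figure are arbitrary):

    # Pack a photo directory into a tar and generate PAR2 recovery data for it,
    # so a limited amount of future bit rot can be repaired.
    # Assumes the 'par2' CLI is installed; names and redundancy are placeholders.
    import subprocess
    import tarfile

    with tarfile.open("photos-2015.tar", "w") as tar:
        tar.add("photos-2015")

    # Recovery files covering roughly 10% of the archive.
    subprocess.run(["par2", "create", "-r10", "photos-2015.tar"], check=True)

    # Later: par2 verify photos-2015.tar.par2   /   par2 repair photos-2015.tar.par2

It works, but having to remember to redo it every time the archive changes is exactly the chore a proper archiving tool should take off your hands.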