I suspect you could use JPEG as a lossless format; you'd just have to write your own custom encoder.
In some sense, JPEG is just a really weird programming language: a JPEG file is a program that gets interpreted by a JPEG decoder. Most normal JPEG encoders try to find a short 'program' in this language that the decoder interprets to recreate a particular image reasonably well.
But there's no reason you couldn't ask for a perfect reproduction.
As far as I know, JPEG decoders are lossless and deterministic.
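To make that concrete, here's a pure-Python sketch of the one step that makes JPEG lossy: quantizing DCT coefficients. This is not real JPEG (it uses a 1-D 8-point DCT and a single uniform quantization step instead of the full 8x8 pipeline with per-frequency tables), but it shows the core idea: with a coarse step the round trip loses information, while a custom encoder could pick quantized coefficients whose reconstruction error is tiny.

```python
import math

def dct(block):
    # Orthonormal 8-point DCT-II, the transform JPEG applies along each
    # row and column of an 8x8 block.
    n = len(block)
    out = []
    for k in range(n):
        s = sum(x * math.cos(math.pi * (i + 0.5) * k / n)
                for i, x in enumerate(block))
        scale = math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)
        out.append(scale * s)
    return out

def idct(coeffs):
    # Exact inverse (DCT-III) of the orthonormal DCT above.
    n = len(coeffs)
    out = []
    for i in range(n):
        s = 0.0
        for k, c in enumerate(coeffs):
            scale = math.sqrt(1 / n) if k == 0 else math.sqrt(2 / n)
            s += scale * c * math.cos(math.pi * (i + 0.5) * k / n)
        out.append(s)
    return out

def roundtrip_error(samples, q):
    # Quantize every coefficient with a uniform step q (a real JPEG uses
    # a per-frequency table), then decode and measure the worst-case
    # per-sample error.
    quantized = [round(c / q) * q for c in dct(samples)]
    restored = idct(quantized)
    return max(abs(a - b) for a, b in zip(samples, restored))

# One row of the classic example block from the JPEG standard's literature.
samples = [52, 55, 61, 66, 70, 61, 64, 73]

# Coarse step (typical lossy setting): noticeable reconstruction error.
print(roundtrip_error(samples, q=16))
# Step 1 (an "all-ones quantization table"): because the transform is
# orthonormal, rounding each coefficient by at most 0.5 bounds the pixel
# error by sqrt(2) -- small, but not automatically zero, which is why a
# truly lossless encoder would still have to search for coefficient
# choices that round back to the exact input.
print(roundtrip_error(samples, q=1))
```

The point of the sketch: quantization is where information dies, and the standard leaves the quantization tables (and the coefficient choices) up to the encoder, which is the knob a custom lossless-minded encoder would turn.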
I asked this question on https://stackoverflow.com/questions/71201691/does-jpeg-allow... to get something definitive.

Of course, your point still stands that JPEG is not a good file format for storing UI elements, even if in theory you could torture the JPEG standard enough to make this barely work.