> Unused memory is wasted memory. 77% is basically caches + private process memory + shared memory.
In simplified overviews, Windows counts file system caches (standby memory) as free (or rather "available") memory, so if 77% of 32 GB is meant literally, it still sounds rather high.
> The deep tube lines will all get variations on the 2024 stock
I think that's a bit optimistic. Right now, the only thing that has confirmed orders is the replacement for the 1973 stock (Piccadilly line). The same order also has options for further trains which would cover the Bakerloo, Central and Waterloo & City lines, but somebody still needs to come up with the money for it.
For the Bakerloo line trains (1972 stock) that's probably going to happen, since those trains really are getting long in the tooth now, but for the Central/W&C line stock (1992 stock) there's currently a refurbishment programme underway, so depending on how that goes, those trains will probably continue running for a while further.
That still leaves the Northern and Jubilee lines (1995 and 1996 stock respectively), whose replacement trains, whenever they might happen, will probably need a new tender – it could be that whatever train gets selected then will be a close relative to the 2024 stock, but I don't think it's automatically a given.
And the Victoria line – that one only got new trains in 2009, so those will continue for quite a while further and will be the last ones due for replacement on the deep tubes.
That's a good point, especially about the Victoria line: by the time a 2009 stock train reaches replacement age, the "new tube for London" design will probably look pretty archaic, and budgets are always too tight to replace trains early. Who knows, by then TfL might actually have a "driverless train" plan for these lines that makes sense.
> I've never actually seen 230 V, the supposed standard, in real life.
I've just measured the voltage in a socket in my home (Germany) and the multimeter says 231 V. (And it's nighttime, so there's no solar generation from houses in the neighbourhood potentially distorting the local grid.)
With Android that's definitely not the case. Supporting older phones might get harder over time, because you can't use any new APIs introduced in more recent OS releases, or you always have to provide some fallback code path, and occasionally (at least if you want to publish on the Play Store) you're forced to use the new APIs, so you can't avoid the complexity of supporting both old and new APIs.
Plus if you're using any dependencies, you're also bound by whatever minimum API version your dependencies require. (Even Google's support library: on the one hand it does try to somewhat smooth over the API differences between various OS releases and make your life easier, but eventually it, too, drops old Android API versions – conservatively, but it does happen.)
But – if you're prepared to somehow work around all that, there's no hard cut-off, and a modern Android toolchain will still happily produce APKs that would run on by now very old phones, too.
For example, I took an old app that had originally been developed during the Android 2.x era (around 2010/11) in order to fix a few annoyances and add some features. Since I haven't done any kind of radical overhaul of the original code so far, the resulting app happily runs on modern phones (albeit with a somewhat dated look and feel), but also on the oldest emulator image I could still get to work on my computer (Android 2.3.3 / API 10, from February 2011).
The other factor would be driving away potential users – even when giving away an app for free, some people might derive satisfaction from knowing that other people find it useful and are actually using it, too.
> And every time a car makes a turn, pedestrians automatically have priority. Which creates an implicit zebra crossing.
Only for turning traffic, though – i.e. as a pedestrian you still need to yield to traffic coming from the side street. There was some talk of having pedestrians participate more fully in right-of-way rules, too – i.e. if the side street has a yield/stop sign, traffic coming from it would also have to yield to crossing pedestrians – but so far that idea hasn't gone anywhere.
Also, if your software for whatever reason is using the original libjpeg in its modern (post-classic version 6b) incarnation [1]: from version 7 onwards, the new (and still current) maintainer switched the algorithm for chroma up-/downsampling from classic pixel interpolation to DCT-based scaling, claiming it's mathematically more elegant and (apart from the unavoidable information loss on the first downscaling) perfectly reversible [2].
The problem with that approach, however, is that DCT scaling is block-based: with classic 4:2:0 subsampling, each 16x16 chroma block in the original image is now individually downscaled to 8x8 and, perhaps more importantly, later individually upscaled back to 16x16 on decompression.
Compared to classic image resizing algorithms (bilinear scaling or whatever), this block-based upscaling can and does introduce additional visual artefacts at the block boundaries, which, while somewhat subtle, are still large enough to be borderline visible even without pixel-peeping. ([3] notes that the visual differences between libjpeg 6b/turbo and libjpeg 7-9 on image decompression are indeed of a borderline visible magnitude.)
I stumbled across this detail after finally upgrading my image editing software [4] from the old freebie version I'd been using for years (it was included with a computer magazine at some point) to its current incarnation, which came with a libjpeg version upgrade under the hood. Not long afterwards I noticed that for quite a few images the new version introduced some additional blockiness when decoding JPEGs (subsequently exacerbated by some of the post-processing steps I was applying to those images), and then I somehow stumbled across this article [3], which noted the change in chroma scaling and provided the crucial clue to the riddle.
Thankfully, the developers of that image editor were (and still are) very friendly and responsive, and actually agreed to switch out the JPEG library for libjpeg-turbo, which resolved the issue. Luckily, few other programs and operating systems seem to use modern libjpeg anyway, usually preferring libjpeg-turbo or something else that sticks with regular image scaling algorithms for chroma up-/downsampling.
[1] Instead of libjpeg-turbo or whatever else is around these days.
[2] Which might be true in theory, but I tried de- and recompressing images in a loop with both libjpeg 6b and 9e, and didn't find a significant difference in the number of iterations required until the image converged to a stable compression result.
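As a toy illustration of what "DCT-based scaling" means (and of the reversibility claim in [2]), here's a 1-D sketch in Python. This is not libjpeg's actual code, just the same idea on a single row: downscale a 16-sample block by truncating its DCT spectrum to 8 coefficients, upscale by zero-padding the spectrum back out. After the first lossy downscale, the down→up→down round trip reproduces the downscaled block exactly; the catch is that each block is scaled with no knowledge of its neighbours, which is where the boundary artefacts described above come from.

```python
import math

def dct2(x):
    # Unnormalized DCT-II: X[k] = sum_n x[n] * cos(pi*(n+0.5)*k/N)
    N = len(x)
    return [sum(x[n] * math.cos(math.pi * (n + 0.5) * k / N) for n in range(N))
            for k in range(N)]

def idct2(X):
    # Matching inverse (DCT-III): x[n] = (X[0]/2 + sum_{k>=1} X[k]*cos(...)) * 2/N
    N = len(X)
    return [(X[0] / 2 + sum(X[k] * math.cos(math.pi * (n + 0.5) * k / N)
                            for k in range(1, N))) * 2 / N for n in range(N)]

def dct_downscale(block16):
    # Keep only the 8 lowest-frequency coefficients, rescale, invert at size 8.
    X = dct2(block16)
    return idct2([c * 8 / 16 for c in X[:8]])

def dct_upscale(block8):
    # Zero-pad the spectrum back to 16 coefficients, rescale, invert at size 16.
    X = dct2(block8)
    return idct2([c * 16 / 8 for c in X] + [0.0] * 8)

ramp = [float(i) for i in range(16)]
down = dct_downscale(ramp)          # first downscale: information is lost here
up = dct_upscale(down)              # block-local upscale, ignores neighbour blocks
down_again = dct_downscale(up)      # ...but this reproduces `down` exactly
```

So the "perfectly reversible" part checks out in this model – it just doesn't say anything about how the upscaled block meets the block next to it, which is what you actually see on screen.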
I poked at the app, which surprisingly enough isn't even obfuscated, and as far as I can tell, it mainly relies on Play Integrity's verdict. I didn't investigate in detail though, so I don't know for sure whether that's really all or whether they're also running some additional custom checks, and I also don't know which integrity level they're requiring.
Sarcasm aside, it depends on whether your employer has configured Entra to allow classic TOTP (in which case Microsoft will try to push its own app as the default option, but you can in fact use anything that supports TOTP if you insist), or has set the option to only allow Microsoft's proprietary 2FA, which only works with the Microsoft app.
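For reference, "classic TOTP" as in RFC 6238 is simple enough to sketch in a few lines of Python – this is the generic algorithm, not anything Microsoft- or Entra-specific:

```python
import base64
import hashlib
import hmac
import struct
import time

def totp(secret_b32, for_time=None, step=30, digits=6):
    """RFC 6238 TOTP: HMAC-SHA1 over the current 30-second time counter."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int((time.time() if for_time is None else for_time) // step)
    mac = hmac.new(key, struct.pack(">Q", counter), hashlib.sha1).digest()
    offset = mac[-1] & 0x0F  # dynamic truncation per RFC 4226
    code = struct.unpack(">I", mac[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)
```

Any authenticator app that implements this produces the same codes from the same shared secret, which is exactly why the choice of app doesn't matter as long as the tenant is configured to allow plain TOTP.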
> That is an issue with the capabilities the os exposes to you. The answer to every security issue not "add a backdoor".
Problem is, I strongly suspect we'd still be having the same discussion even if we were talking about "allow the user direct access to all files" instead of "allow the user full root rights".
Because while some of those missing capabilities are "simply" a matter of it being too much effort to provide a dedicated capability for each and every niche use case (though that again raises the question of whether you'd rather fail open, i.e. provide root as the ultimate fallback, or fail closed), with file access I suspect this was very much an intentional design decision.