That is pretty awesome! When I joined the Java effort in '92 (it was called Oak at the time), the group I was with was looking at writing a full OS in Java. The idea was that by getting down to just the minimal set of things that needed to be "machine code" (aka native methods), you could reduce the attack surface of an embedded OS. (Originally Java was targeted to run in things like TVs and other appliances.) We were, of course, working in C rather than Rust for the native methods. A JVM in Rust, though, adds a solid level of memory safety to the entire process.
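To illustrate the "minimal native surface" idea, here's a sketch of my own (not from the original Oak project): the OS exposes only a handful of native entry points, and everything above them stays in memory-safe Java. The `readPort` primitive and the library name are hypothetical.

```java
public class NativeSurfaceSketch {
    // Hypothetical native primitive; a real build would load a JNI library
    // via System.loadLibrary("hwio") implemented in C (or Rust).
    private static native int readPort(int port);

    // Everything above the native boundary is plain, memory-safe Java.
    public static int readTwice(int port) {
        return readPort(port) + readPort(port);
    }

    public static void main(String[] args) {
        try {
            readTwice(0x60);
            System.out.println("native library present");
        } catch (UnsatisfiedLinkError e) {
            // No JNI library is linked on a desktop JVM; the failure marks
            // exactly where the unsafe boundary would sit.
            System.out.println("no native library linked");
        }
    }
}
```

The smaller that native list is, the less code has to be audited the hard way.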
IMO, Android kind of achieves that... kind of. It has lots of OS logic written in Java (or Kotlin), but mixes in lots of system services written in native code at the same time, interconnected by the famous (or infamous?) Binder IPC.
Compiling to native isn't exactly black magic and works just as well.
The JVM wastes cycles on machinery like classes, which isn't necessary at all. Meanwhile, Rust has proven that you can guarantee things like memory safety at compile time.
Compiling to native on whatever random chip is always a recipe for hassles and random incompatibilities which need work to unravel/fix. Having a spec that chip manufacturers can implement (and that can be trivially tested) and/or a JVM that can be exercised and ported to a chip once and validated removes the vast majority of the integration work.
Does it always make sense? No. But clearly a bunch of folks have found it valuable in many niches.
You can justify every project that has succeeded against any odds by saying "but it did succeed". It's true, but imo not a very good way of judging when a technology was a good fit or not.
Is it also too much to ask that someone explain how they think a technology which has been chosen by market participants and is used widely and (apparently) successfully by them does not actually match the real constraints of the market or participants?
And perhaps proposes a concrete alternative that matches those constraints better?
Don't forget to include things like long term support, developer time, interoperability, etc.
> Is it also too much to ask that someone explain how they think a technology which has been chosen by market participants and is used widely and (apparently) successfully by them does not actually match the real constraints of the market or participants?
Kind of, yes. Instead of evaluating based on an understanding of the technology, you're saying "In this universe it was chosen for a project, the project was successful; show me the universes where the alternative decision was made".
I'm saying that if you think something is crap and doesn't meet customer needs, at least propose a concrete alternative you believe is better so someone can respond meaningfully! Or state concretely which needs are not being met!
Currently, we have one example of something that all evidence leads us to believe fits the universe as it exists, at least in that specific niche.
If you think it doesn't, how doesn't it? Or if you're saying there is something better, are you saying that is hand-written assembly? Or Turbo Pascal? Or Ada? Or some as-yet-undesigned system?
I'm not asking for an alternative universe. I'm asking you to support your statement with enough details it can be assessed in the current universe.
Them: People are using Java for this because there aren't alternatives
You: If it is the convenient alternative then it is a good choice
Me: That is a bad justification for something being a good choice
You: What criteria then?
Me: An understanding of the engineering principles / domain
Am I accurately summarizing this conversation so far? This isn't about alternatives, or what else they should have done. Something can be a bad fit and the right choice.
As an example, I could say "Java dominates that section due to historical artifacts of business, not technology. Java is a bad fit for this type of work otherwise because of the complexity involved in implementing a Java VM in hardware". I can then also say "Java is the only real choice because of those historical artifacts so I have to recommend that you use it unless you're willing to build your own hardware from scratch".
I actually don't have to propose any alternatives at all, hopefully you can see that - we can just evaluate Java as a language (complex VM, assumes a heavy runtime) against the constraints (custom hardware, low energy) and see that the fit is weak. Obviously people overcame that and made it work, and because of that Java is the obvious choice for this technology.
From an engineering perspective, I don't think it's fair to say something CAN be a bad fit and yet a good choice? At least not without acknowledging it was the best choice available, and therefore not likely actually a bad fit?
In that scenario, it's literally the best possible fit.
That you don't have any viable better alternative at hand may be further evidence of that? (and I don't mean from a standards basis 'well, it's locked into financial rules now, so gov't intervention'). I mean, what else was going to work considering all the factors involved? What else could work better, considering the factors involved?
JavaCard is in fact so widely used and implemented (SIM cards, bank cards, health cards, passports, etc.) that it probably has literally tens of billions of devices manufactured using it (3.5 billion claimed as of 2010 - https://www.oracle.com/technical-resources/articles/javase/j...), in essentially every high-value, target-rich environment niche you can think of, and at extremely low costs. Literally sub-cent per item.
And with very high environmental stresses (like debit cards getting sat on, left in hot cars, run over, dropped in puddles, jammed into random dirty readers over and over again, etc.), those devices keep working.
And everyone from random countries gov'ts to random financial firms to telcos have managed to implement what they need in it without too much difficulty, and a minimum number of security issues. Which is frankly astonishing if you've ever dealt with folks like that.
So love or hate Java, or JavaCard, from a stylistic perspective - any perceived complexity of implementing a Java VM in hardware has had no practical economic effect, nor slowed down implementation meaningfully.
It's fit for purpose.
Probably also ugly and feels gross using them sometimes, but a lot of fit for purpose stuff is until you've experienced the alternatives. Hopefully you never have to fix a sewage lift station pump, or clear a clogged sewer line, or clean out a transmission after it's burned out.
Each of these has literally hundreds of years of specialized knowledge and expertise behind their often boring looking facades. They're all amazingly complex if you learn about them. And they're all better than throwing sewage in the street, or carrying everything on horseback. And they're beautiful in their own way when you appreciate why they are how they are.
Even if they're not shiny and flashy, there is beauty in them, because they work well.
And they're still amazing engineering marvels, necessary for our lives as we know them and based on the actual engineering principles involved and the problem domain.
> I don't think it's fair to say something CAN be a bad fit and yet a good choice?
So we fundamentally disagree.
> has had no practical economic effect, or slowed down implementation meaningfully.
This goes back to my "to prove me wrong you have to show me alternate universes where other options had that investment made under the same circumstances".
Hardly. That would require an assertion like 'any perceived complexity for implementing a Java VM in hardware did not exist because it was rolled out in the most economic way possible given non-technical constraints, and it also did not slow down the implementation beyond the minimum absolutely necessary for any technical solution possible.'.
Notice the difference?
To propose alternative solutions based on practical economic effect just requires a reasonable degree of comparison to projects of similar scope at other times, reports of difficulty from various vendors, and comparisons of end price for this solution, end price for other solutions of similar scope, to the overall scope of the solution and value it brings. None of which requires perfect alternative universe A/B testing to come to some reasonable analysis.
If another solution could be done for half the price (say $0.005 per unit, instead of $0.01 per unit) but the perceived value for vendors is $1/unit - then it's hard to say there is any practical economic effect going either way. Neither solution would block profitability or value. That said, they could easily be compared and better/worse solutions could also be determined or tradeoffs analyzed based on that data, also without perfect alternate universe A/B testing to come to some reasonable analysis. Industry does this all the time at scale, including projected costs of implementation of various solutions.
If the solution was rolled out within a timeframe considered useful/expected for this kind of solution, then it also didn't slow down implementation meaningfully - as in it didn't block it, or add serious delay. If there is another solution which could have been done in half the time, that's cool. But it wasn't required. Identifying such an alternative, if one exists, could be done if you have any data, without having to do an alternative universe A/B test. Though since the proof is in the pudding, to REALLY be sure maybe it would. But that's hardly what I've been referring to or asking for, clearly.
Doesn't mean there wouldn't have been better solutions, and proposing them as alternatives can easily be done without parallel universes! In fact, chances are they have already been implemented somewhere in another niche, so there is adequate data to do so.
It's actually very well suited to low-level, extremely low-power embedded systems. The toolkit and developer experience targeting these platforms is actually pretty good.
A 32-bit 4 MHz processor with ~64 KB of NVRAM, all running off of an induction charge!
Also every SIM card, which is potentially quite a bit less benign - the baseband chip is a completely separate processor which the SoC can't see into, and the SIM card can snoop messages, send commands, and generally act like a secure enclave. It is (was?) used for a couple of banking systems for feature phones, like M-PESA, where the app can run as a feature-phone app using the menu toolkits the phone provides.
I think the eSIM idea is probably a net security benefit; Apple has levered the carriers out of a fairly dangerous tool.
Recently I was thinking about what a program written to take advantage of Optane's persistent-memory model would have looked like; if you use it like RAM, it's forever going to be slow, shitty RAM. JavaCard seems to be the closest hit for that, in some ways. Maybe some of the higher-tier JavaCards have GC.
I guess at that point it's basically a JVM application state snapshot, which is the same thing, so maybe not any better.
Java Card is a very different language that's only superficially similar to Java. It strips out most of what makes Java Java - no garbage collection by default, no String, no floats - and in practice makes you store everything in byte arrays and shovel them around with Util.arrayCopy. It's basically a less convenient C, with its complete lack of structs.
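That byte-array style can be imitated in plain Java. A real applet would extend `javacard.framework.Applet` and use `javacard.framework.Util.arrayCopy`; this desktop-runnable sketch (my own toy example, with `System.arraycopy` standing in) just shows the flavor:

```java
// Plain-Java imitation of Java Card style: no object graph for the data
// model, everything lives in preallocated byte arrays.
public class ToyPurse {
    // Persistent state is just a fixed buffer, allocated once at install time.
    private static final byte[] balance = new byte[2]; // big-endian short

    static short getBalance() {
        return (short) (((balance[0] & 0xFF) << 8) | (balance[1] & 0xFF));
    }

    static void credit(short amount) {
        short updated = (short) (getBalance() + amount);
        balance[0] = (byte) (updated >> 8);
        balance[1] = (byte) updated;
    }

    // "Process an APDU": copy the response into the caller's buffer, the way
    // card code shovels bytes with arrayCopy instead of returning objects.
    static short readBalance(byte[] apduBuffer, short offset) {
        System.arraycopy(balance, 0, apduBuffer, offset, balance.length);
        return (short) balance.length;
    }

    public static void main(String[] args) {
        credit((short) 300);
        byte[] buf = new byte[32];
        short len = readBalance(buf, (short) 0);
        System.out.println(len + " bytes, balance=" + getBalance());
    }
}
```

Everything is `short`s and fixed buffers because that's roughly what the card gives you to work with.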
After reading your comment I was surprised to find out that credit card chips have any processing capabilities whatsoever, which they apparently do, though at least according to GPT-4, they are far too basic to run Java / a JVM?
Google appears to be significantly more useful than GPT-4 here. [1] is the third result for me for the query "credit card jvm". [2] is the second result and gives a direct (and more importantly, actually correct) answer. That post links to the Oracle documentation for Java Cards [3] which is the fourth result.
All of this is just as easy as, if not easier than, using ChatGPT. It's unclear that such a tool even serves this purpose (retrieval of basic facts) adequately, so it should probably be avoided in the future.
Fair enough, there is a "Java Card". I'm not convinced that Java is running on any of the credit cards in my or your wallet today, though I'm not willing to bet on it.
It's running on many e-Passports and e-ID cards. I can't find the documentation for my e-ID card, which runs on Java, but the chips are quite common.
Visa became the first large payment company to license JavaCard. Visa mandated JavaCard for all of Visa’s smartcard payment cards. Later, MasterCard acquired Mondex, and Peter Hill joined as their CTO, licensed JavaCard, and ported the Mondex payment platform to JavaCard.
That's pretty neat. It sounds like Java Card (or at least some other cards) actually "boot up" by way of inductive coupling, i.e., via the "contactless" card readers where you just hold your card in proximity to the reader thing. Did not know that; I assumed it was just reading a key via NFC or something.
The whole reason there’s a thing called a chip in the card is that it actually does computing (indeed they’re called smart cards) and that it does the sort of computing (cryptographic challenge-response) that makes these cards much more secure than oldschool magnetic stripe cards.
Even fully passive NFC tags contain logic that needs power to talk NFC back to the reader, there’s no such thing as just reading data via NFC.
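The general shape of that challenge-response exchange can be sketched in a few lines. To be clear, this is my own illustration, not the actual EMV protocol (which derives session keys and builds an ARQC over transaction data); HMAC stands in for the card's cryptogram:

```java
import javax.crypto.Mac;
import javax.crypto.spec.SecretKeySpec;
import java.security.SecureRandom;
import java.util.Arrays;

// Terminal sends an unpredictable number; the card answers with a MAC
// computed under a key that never leaves the chip.
public class ChallengeResponseSketch {
    static byte[] cardRespond(byte[] cardKey, byte[] challenge) throws Exception {
        Mac mac = Mac.getInstance("HmacSHA256");
        mac.init(new SecretKeySpec(cardKey, "HmacSHA256"));
        return mac.doFinal(challenge); // only the MAC crosses the air gap
    }

    static boolean issuerVerify(byte[] cardKey, byte[] challenge, byte[] response) throws Exception {
        // The issuer holds the same per-card key and recomputes the answer.
        return Arrays.equals(cardRespond(cardKey, challenge), response);
    }

    public static void main(String[] args) throws Exception {
        byte[] key = "per-card-secret-key-demo-only!!!".getBytes();
        byte[] challenge = new byte[8];
        new SecureRandom().nextBytes(challenge); // terminal's unpredictable number

        byte[] response = cardRespond(key, challenge);
        System.out.println("verified: " + issuerVerify(key, challenge, response));

        // A replayed response fails against a fresh challenge, which is what
        // makes this stronger than a static magstripe read.
        byte[] challenge2 = new byte[8];
        new SecureRandom().nextBytes(challenge2);
        System.out.println("replay accepted: " + issuerVerify(key, challenge2, response));
    }
}
```

The key insight is that skimming a response buys an attacker nothing: the next terminal will send a different challenge.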
This is a classic DEF CON talk about it. They developed their own cell network with their own SIM cards for an event and even built custom JavaCard applications their users could use. They have released all the information and tools used to build and compile Java Card software.
A lot of people assume this, but with contactless EMV there is a whole transaction flow between the card and the reader going on.
You know those 4 lights on a contactless card reader? They indicate different transaction stages between the card and the terminal. I don’t find them that useful because it’s so fast they all appear to light at the same time, but that’s what they are!
If they were just passive tags, they wouldn’t be very secure, they have cryptographic processors onboard with private keys that can sign stuff for the terminal and your bank. The specs for the interaction are all public if you’re interested (I wouldn’t be!) and lookup contactless EMV.
How about respecting other people instead of rushing to condescending judgement? The stakes are incredibly low here; I asked ChatGPT for fun, like tens of millions of others do every day.
For some reason you announced that you were subjecting us to a low-quality information retrieval method, and after 6 months of this, people are irritable. The social norm is to do that sort of thing in private; doing it in public, and seemingly proudly, came across as coarse and impolite.
It didn't help that it was clear from the initial post you were questioning someone with domain knowledge, which was later gently pointed out to you.
It's not asking ChatGPT that is anti social but posting it, diluting the entropy of the conversation. Thousands of brains read every posted word before they can discard redundant information. Together we can keep a high quality shared medium, which benefits everyone!
Oh, yeah, they do. They often have multiple 'applications' on them for different purposes (for example, withdrawing from an ATM is a different application from making a purchase, which is different from the ill-conceived idea of using your card and PIN as a two-factor authentication token).
Android isn't conventional Java. For starters, its runtime uses its own bytecode (dex) that's based on registers instead of a stack. But then, also, many things that aren't related to GUI are C++ with a thin Java wrapper on top.
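The stack-vs-register difference is easy to see with a toy example. This is my own illustration: the same expression, (2 + 3) * 4, evaluated both ways, with invented instruction names in the comments (loosely echoing real JVM and dex opcodes, but not actual bytecode):

```java
import java.util.ArrayDeque;
import java.util.Deque;

public class StackVsRegister {
    // JVM-style: operands are pushed to and popped from an implicit stack.
    static int stackMachine() {
        Deque<Integer> stack = new ArrayDeque<>();
        stack.push(2);                          // iconst_2
        stack.push(3);                          // iconst_3
        stack.push(stack.pop() + stack.pop());  // iadd
        stack.push(4);                          // iconst_4
        stack.push(stack.pop() * stack.pop());  // imul
        return stack.pop();                     // ireturn
    }

    // dex-style: instructions name their source and destination registers
    // directly, so no pushing and popping is needed.
    static int registerMachine() {
        int[] v = new int[3];
        v[0] = 2;            // const v0, #2
        v[1] = 3;            // const v1, #3
        v[0] = v[0] + v[1];  // add-int v0, v0, v1
        v[2] = 4;            // const v2, #4
        v[0] = v[0] * v[2];  // mul-int v0, v0, v2
        return v[0];         // return v0
    }

    public static void main(String[] args) {
        System.out.println(stackMachine() + " == " + registerMachine()); // prints "20 == 20"
    }
}
```

Register code tends to need fewer instructions (no stack shuffling), at the cost of each instruction being wider because it has to encode register numbers.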
When I think about a "Java OS", I imagine a JVM running in kernel mode, providing minimal OS functionality (scheduler, access to hardware I/O ports) and there not being any kind of userspace.