Really? How do they know what the representation of the object code was before compilation? It seems like that would be difficult to enforce even with a whitelist.
The biggest problem I faced with Rust and macOS/iOS is Apple's requirement that all code submitted to the App Store for Apple Watch and Apple TV applications include bitcode.
The gist is that Apple requires bitcode but doesn't give easy access to the LLVM version used by its own tools. That means there's no supported path to generating bitcode from Rust that aligns with Apple's requirements. This currently only affects watchOS and tvOS, so you can still target macOS and iPhone without issue, but I fear that the writing is on the wall.
I continue to be amazed that Apple is not considered to have a monopoly on "stores for Apple devices".
These are exactly the kind of archaic requirements that would be a nonstarter or otherwise kill market support for a store given any actual competition.
Instead, they are able to leverage it to push their 'approved' languages and development environments, furthering anticompetitive lock-in.
Apple making money from selling devices (or even distributing devices "at a loss", which they benefit from having exist so they can better act as a software vendor) and making money from their store are two separate revenue streams, after all.
"products made by one brand" isn't a category of products. If they're the only brand that makes products for a particular category, then they're a monopoly, but you cannot simply define a category as "stuff made by that company", because by that logic, every company is a monopoly on stuff made by them.
Actually, a maker's own products are a distinct market for antitrust purposes if people empirically don't substitute out of it, as shown by the producer having market (pricing) power.
I wouldn't be surprised if that's true for Apple for some of its offerings.
It's not. The only product you could even try to make the argument for is the iPhone, but the generally-accepted categorization here is that iPhone and Android phones (and Windows phones) are part of the same category, which makes sense because people absolutely do switch between them.
Actually, I have a better argument against this than my other one.
No, "products from brand A" is never a category. However, a company may create a brand new market with a product, and they may be the only company with a product in that market for a while. But that still doesn't mean the category is "products by that company", it just means it's whatever new category was created from the product.
For example, the iPhone arguably created a new category of smartphones. But competitors quickly introduced their own products in this same category (e.g. Android).
I imagine it would be easier to accept your assertion of anti-competitive behavior if it weren't trivial to completely ignore Apple for your phone needs, or if there were some sort of constitutional right guaranteeing you the Apple phone of your dreams.
It's not as bad as it sounds. The LLVM they ship in Xcode is basically stock, or at least it was in past releases where they posted the source - after all, Apple is the upstream for LLVM. And unlike the LLVM API, the LLVM bitcode format is stable and preserves backwards compatibility whenever possible, in the sense that newer versions of LLVM can read old bitcode. So there's a good chance that passing bitcode from rustc to Apple's tools will Just Work, and if it doesn't (like if rustc is using a newer LLVM than Xcode) then it should be fixable.
I agree that ostensibly it is possible. But what I would love is to see either: Apple show how to do it in a supported manner; or someone at least prove that it's possible. So far I haven't seen either.
To your comment about bitcode being stable, I don't think bitcode stability and forward compatibility were guaranteed until LLVM 4.0.
That's using Apple's compiler to compile bitcode from rustc. The .bc file has significant chunks of Rust's libstd embedded, so the test isn't as trivial as it seems. To embed bitcode as they want for the App Store, you can use -fembed-bitcode, except it doesn't work because liballoc_jemalloc wasn't compiled with that option (but that's trivial to fix).
I also tried compiling for iOS, which seems to work, but I didn't bother to test the resulting binary.
(As for stability, according to the announcement[1], the previous policy was that bitcode would be readable "up to and including the next major release", which was already reasonable from the perspective of keeping a third-party compiler's output compatible.)
You could get rid of jemalloc, but on the other hand you could also just compile it with -fembed-bitcode. (It's part of the Rust standard library, so that's a bit more work. Rust code doesn't have this issue because .rlibs contain serialized ASTs, and passing -C lto makes rustc only use that rather than use the precompiled bits. In any case, it's just a matter of rebuilding.)
Incidentally, I just tried updating my nightly Rust, and it stopped working - clang started failing to read the bitcode, bailing out with a vague "error: Invalid record". This is not too surprising, because Rust just landed a big LLVM upgrade two weeks ago. Newer LLVM can read older bitcode files but not the other way around, and even though LLVM 4.0 was released months ago, Apple seems to only sync with trunk yearly, along with major Xcode releases. (You can tell based on the --version output.)
However, Rust can be built against an external LLVM (rather than the fork it uses by default), and AFAIK it tries to preserve compatibility with older versions. So it should still be possible to use the latest rustc, you just have to compile it yourself against a slightly older LLVM.
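A sketch of that build, with illustrative paths; the exact flag for pointing at an external LLVM may differ between Rust versions (check the build documentation):

```shell
# Build rustc against a system LLVM instead of the bundled fork.
# The LLVM path is illustrative; point it at a version matching Xcode's.
git clone https://github.com/rust-lang/rust && cd rust
./configure --llvm-root=/usr/local/opt/llvm@4
./x.py build
```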
I'll note that I've obviously never written an iOS app, but I had figured that Apple required you to submit your app's source for their review process. Which sounds extreme, but this is Apple we're talking about. :)
There was indeed a rule at one point that apps had to be written in C, C++, or Objective-C. This was allegedly implemented in order to ban cross-compilers such as the Flash compiler[0]. Presumably they would either require the source code to verify this, or inspect the compiled binary for some identifying feature. However, these rules don't exist in the current version of the guidelines[1].
This rule got so much press 7 years ago that it has become one of those "truths" about the platform that never die on internet forums, much like the claim that the PlayStation platforms use OpenGL as their primary API...