It was always a stupid rumor. Can you imagine if Apple required a different USB-C cable for your new iPhone rather than what already works with iPads and MacBooks? They would never do that.
> Can you imagine if Apple required a different [... snip ...] cable for your new iPhone rather than what already works with iPads and MacBooks?
My alteration of your comment brings it in line with the status quo prior to the USB-C which, keep in mind, Apple fought tooth and nail against. It is a stupid idea, but Apple did it anyway. The world runs on stupid ideas daily; either because they are more profitable, or are too entrenched, or both.
Let's recall that Lightning came out before USB-C, because vendors within the USB-IF didn't want to upgrade their devices to replace micro-B with something more expensive and were stalling acceptance of the standard.
For everything you can say about Apple, they have at least been consistent on connectors, some would say to a fault. When they deprecated the 30-pin, they promised that the iPhone (not iPad, not laptop) would remain on Lightning for 10 years.
I'm not buying it. It's just a cable, and regular wear and tear (user error in Apple parlance) pretty much dictates a lifetime far less than 10 years. The reason is profit, no more, no less.
Most of the software I’ve written has gone into hardware products. Currently working on an embedded Linux system and writing a UI in ReactJS. We are late but only by a couple of months and I have every expectation that the software will be completed. The product has pre-sold about $1M already.
If you want to be working on shipping products, work on embedded stuff.
I’ve done customer-facing business software too. Still mostly successful, but that is where I’ve seen a few failures. One project where I wrote automated functional/integration tests didn’t see release, but I take no blame for that one. The tests worked. The product didn’t.
I haven't used one myself but the N200 looks pretty OK for a tablet that's supposed to run a long time on battery. Four Skylake-ish cores that turbo to 3.7GHz?
Yes, and in this case 6W TDP and cheap enough for a $500 tablet seem like key drivers of the N200's design. Of course the Apple M1 is probably twice as fast at similar cost and power but compared to everything outside Apple the N200 looks pretty decent.
7040 in its lowest TDP configuration is 15W. Intel N200 is 6W. Even accounting for some differences in how both companies measure TDP only one of these can be passively cooled in that sort of form factor.
The lack of low-TDP products on AMD's side was also one of the reasons PC Engines gave for discontinuing their embedded line. AMD's last 6W APU was the 2-core R1102G, which is now a couple of generations old.
I guess that's only for Safari, but not Chrome if you have that installed as well? Also, what if you never signed into your iCloud account? Is it impossible to disable?
> Also, what if you never signed into your iCloud account? Is it impossible to disable?
If you're not signed in, then it's not enabled, because the iCloud account is essential to the process: "An Apple server validates your device and Apple ID."
1) Apple Silicon won't support PCIe.
2) Apple doesn't want it to.
#2 means Apple is taking nVidia and AMD head-on in the GPU space. Apple wants to control everything, and allowing these competitors on their platform would give away too much. Because Apple Silicon scales better than competitors' hardware, the desire for third-party GPUs will probably evaporate within a few generations of Apple Silicon. I mean, we'll see, but that is my best guess, because it seems like this was an intentional decision rather than an oversight.
To suggest it is merely lack of drivers is an oversimplification. There are chip errata that prevent PCIe GPUs from working properly on Apple Silicon: the architectures are not compatible. Apple Silicon GPU drivers are deeply integrated into the system. Because of this integration, only graphics cards that use the same GPU architecture as Apple Silicon could be supported, and there just aren't any; I don't see how there could be unless Apple developed one and released it.
> To suggest it is merely lack of drivers is an oversimplification.
Not really. When you hook up an external PCI chassis with a graphics card inserted, it sees the PCI expansion slot and the GPU just fine, it just doesn't have a driver for the GPU.
Not really. Software drivers alone will not get it done, at least not adequately (Asahi-related developers may come up with a software solution for Asahi, but it will necessarily degrade performance, so the effort is likely to be abandoned).
Thunderbolt supports PCIe, and for most devices Apple Silicon does also, but a GPU is different enough from audio interfaces and NVMe that it isn't just a "load the driver and plug it in" situation. A GPU is vastly more complex than other PCIe devices. Apple Silicon and x86 architectures are not compatible, so a GPU built for x86 is not going to work with Apple Silicon with merely a software driver.
It's going to take hardware translation and other technology that is not yet available for Apple Silicon; hence Apple's recent patent applications,[1] which show Apple either exploring outboard GPU support or locking everyone else out of its method for doing so. Either way, there's no guarantee they'll complete development or ship it, because it seems just as likely, if not more so, that the roadmap for Apple Silicon GPU performance will outpace nVidia and AMD GPUs.
But, again, claiming, "it just doesn't have a driver for the GPU" is a staggering oversimplification.
> It could be that Apple does away with it entirely and the Mac Studio is the new Pro.
Or the Mac Pro will be released without PCIe GPU support, and Apple will be able to leverage increases in Apple Silicon GPU performance to eliminate any need or desire for PCIe GPU, drawing away high end GPU customers from nVidia and AMD and locking them into Apple Silicon and the Apple ecosystem.
> Or the Mac Pro will be released without PCIe GPU support, and Apple will be able to leverage increases in Apple Silicon GPU performance to eliminate any need or desire for PCIe GPU, drawing away high end GPU customers from nVidia and AMD and locking them into Apple Silicon and the Apple ecosystem.
Maybe. But graphics cards aren't the only thing people put into PCIe slots.
> Maybe. But graphics cards aren't the only thing people put into PCIe slots.
Right, so there will be PCIe slots for expansion; they just won't support PCIe GPUs, just like Thunderbolt PCIe expansion chassis now. It isn't the PCIe slots that break compatibility, it's the difference in architecture between x86 and Apple Silicon that makes them incompatible.
I’ve been doing ReactJS development with both the M1 and now the M2 MacBook Air for a year now. My app’s transpiler production build takes about 5 seconds and any incremental development build is essentially instantaneous.
I can’t imagine most developers’ work even stressing an M2 unless you are compiling Chrome or something.
Best thing: as a contractor, I don’t even bother to bring a power adapter. I work for 8 hours and leave with 20%-40% remaining every day.
And the product they were selling was a bare (well, populated, but not enclosed) PCB which the user was expected to wire up to power, a keyboard, and a monitor themselves. Long-term reliability was more the responsibility of the owner than the manufacturer.
Too bad Apple's management was so timid. That could have changed the course of computing back in 1985. I had a Macintosh IIx and it was a great computer, but I would have bought the Jonathan in a heartbeat back then. Thanks for the history; first I've heard of it.