
It was always a stupid rumor. Can you imagine if Apple required a different USB-C cable for your new iPhone rather than what already works with iPads and MacBooks? They would never do that.


> Can you imagine if Apple required a different [... snip ...] cable for your new iPhone rather than what already works with iPads and MacBooks?

My alteration of your comment brings it in line with the status quo prior to USB-C, which, keep in mind, Apple fought tooth and nail against. It is a stupid idea, but Apple did it anyway. The world runs on stupid ideas daily, either because they are more profitable, or too entrenched, or both.


Let's recall that Lightning came out before USB-C, because vendors within the USB-IF didn't want to upgrade their devices to replace micro-B with something more expensive and were stalling acceptance of the standard.

For everything you can say about Apple, they have at least been consistent on connectors, some would say to a fault. When they deprecated the 30-pin connector they promised that the iPhone (not the iPad, not laptops) would remain on Lightning for 10 years.


I'm not buying it. It's just a cable, and regular wear and tear (user error, in Apple parlance) pretty much dictates a lifetime far shorter than 10 years. The reason is profit, no more, no less.


Them: “Apple promised to ship their iPhones with Lightning for at least 10 years”

You: “I don’t buy it, cables wouldn’t last 10 years”

It’s a non-sequitur.

As for profit motives, if that were the case then they wouldn’t voluntarily switch their iPad line over to USB-C.


> It’s a non-sequitur.

This isn't a formal debate, I am allowed to have different reasoning from the rest of the group.


You’re allowed to say whatever you want and I’m allowed to point out that it makes no sense within the context of the discussion at hand.


Most of the software I’ve written has gone into hardware products. Currently I’m working on an embedded Linux system and writing a UI in ReactJS. We are late, but only by a couple of months, and I have every expectation that the software will be completed. The product has already pre-sold about $1M.

If you want to be working on shipping products, work on embedded stuff.

I’ve done customer-facing business software too. Still mostly successful, but that is where I’ve seen a few failures. One project where I wrote automated functional/integration tests didn’t see release, but I take no blame for that one. The tests worked. The product didn’t.


Yeah, but a 1 GHz N200? Yeesh. Everything else looks great.


It's a tablet, not a workstation PC. I have a Surface Go w/ 4GB of memory running Fedora 37 that's plenty fast for what I use it for.

Apparently this also supports hardware-accelerated AV1 decoding and H.265 encode/decode.


I haven't used one myself, but the N200 looks pretty OK for a tablet that's supposed to run a long time on battery. Four Skylake-ish cores that turbo to 3.7 GHz?


Notebookcheck says it's equivalent to a Core i5-8250U. That is not good in 2023.

https://www.notebookcheck.net/Intel-Processor-N200-CPU-Bench...


It depends on what you're doing. The 4th-gen low-power i5 in my laptop and the i3-7100 in my main desktop are just fine for web browsing and development.


Yes, and in this case a 6W TDP and a price cheap enough for a $500 tablet seem like the key drivers of the N200's design. Of course, the Apple M1 is probably twice as fast at similar cost and power, but compared to everything outside Apple the N200 looks pretty decent.


If they make a Ryzen 7040 APU version, I'll take it as my daily driver.


The 7040 in its lowest TDP configuration is 15W; the Intel N200 is 6W. Even accounting for some differences in how the two companies measure TDP, only one of these can be passively cooled in that sort of form factor.

The lack of low-TDP products on AMD's side was also one of the reasons PC Engines gave for discontinuing their embedded line. AMD's last 6W APU was the 2-core R1102G, which is now a couple of generations old.


Where did you get 1 GHz from? The N200 has a max clock of 3.7 GHz.


From the Specification section of the website:

> 1.00GHz quad-core Intel Alder Lake N200

> Turbo Boost up to 3.70GHz, with 6MB Smart Cache


You can turn it off in macOS 13 as well.

https://support.apple.com/en-us/HT213449

System Settings->iCloud Settings (your name)->Password & Security->Automatic Verification.


I guess that's only for Safari, but not Chrome if you have that installed as well? Also, what if you never signed into your iCloud account? Is it impossible to disable?


It's specifically a Safari technology. Chrome does not implement this.


And can't without Apple's blessing as it isn't a "verified" browser.


> Also, what if you never signed into your iCloud account? Is it impossible to disable?

If you're not signed in, then it's not enabled, because the iCloud account is essential to the process: "An Apple server validates your device and Apple ID."
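
For context, Automatic Verification is Safari's client side of the Privacy Pass "PrivateToken" HTTP auth scheme, with Apple acting as the attester. Here's a rough, hypothetical sketch of the origin-server side (an Express handler, purely for illustration; the challenge and token-key values would come from a real token issuer, and a real deployment would verify the token's signature rather than just checking for its presence):

    // Hypothetical sketch: origin server advertising Private Access Token
    // support via the Privacy Pass "PrivateToken" auth scheme.
    import express from "express";

    const app = express();

    app.get("/protected", (req, res) => {
      const auth = req.get("Authorization"); // "PrivateToken token=..." on retry
      if (!auth?.startsWith("PrivateToken")) {
        // challenge and token-key are base64url blobs provisioned with a
        // token issuer; placeholders here, not real values.
        res.set(
          "WWW-Authenticate",
          'PrivateToken challenge="<base64url>", token-key="<base64url>"'
        );
        res.status(401).end();
        return;
      }
      // A real deployment verifies the token against the issuer's public
      // key before trusting the client.
      res.send("verified");
    });

    app.listen(3000);

When Safari sees that challenge with Automatic Verification on, it fetches a token through Apple's attestation flow and retries the request, so the origin only learns that Apple vouched for the device, never the Apple ID itself.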


"These apps are also not immune to the same type of panic-based bank run that closed down Silicon Valley Bank and others recently, the agency added."

I'm thinking that worrying about Apple going the way of SVB isn't much of a risk.


Only on Intel Macs, which are going away soon (only the 2019 Mac Pro remains, outside of refurbished stock).

Apple Silicon doesn't currently have any provision for external graphics cards.


There are two issues.

1) Apple Silicon won't support PCIe GPUs.

2) Apple doesn't want it to.

#2 means Apple is taking Nvidia and AMD head-on in the GPU space. Apple wants to control everything, and allowing these competitors on their platform is giving away too much. Because Apple Silicon scales better than competitors' hardware, the desire for third-party GPUs is probably going to evaporate within a few generations of Apple Silicon. I mean, we'll see, but that is my best guess, because it seems like this was an intentional decision rather than an oversight.


AS supports Thunderbolt just fine. Isn't the lack of eGPU support due to a lack of ARM drivers for them?


To suggest it is merely a lack of drivers is an oversimplification. There is a chip erratum that prevents PCIe GPUs from working properly on Apple Silicon: the architectures are not compatible. Apple Silicon GPU drivers are deeply integrated into the system. Because of this integration, only graphics cards that use the same GPU architecture as Apple Silicon could be supported, and there just aren't any, and I don't see how there could be unless Apple developed one and released it.


> To suggest it is merely lack of drivers is an oversimplification.

Not really. When you hook up an external PCIe chassis with a graphics card inserted, it sees the expansion slot and the GPU just fine; it just doesn't have a driver for the GPU.


Not really. Software drivers alone will not get it done, at least not adequately (Asahi-related developers may come up with a software solution for Asahi, but it will necessarily degrade performance, so the effort is likely to be abandoned).

Thunderbolt supports PCIe, and for most devices Apple Silicon does also, but a GPU is different enough from audio interfaces and NVMe drives that it isn't just a "load the driver and plug it in" situation. A GPU is vastly more complex than other PCIe devices, and the Apple Silicon and x86 architectures are not compatible, so a GPU built for x86 is not going to work on Apple Silicon with merely a software driver.

It's going to take hardware translation and other technology that is not yet available for Apple Silicon, hence Apple's recent patent applications,[1] which show that Apple is either exploring support for outboard GPUs or locking everyone else out of its method of doing so. Either way, there is no guarantee they'll complete development or release anything, because it seems just as likely, if not more so, that the Apple Silicon GPU roadmap will outpace Nvidia and AMD GPUs.

But, again, claiming "it just doesn't have a driver for the GPU" is a staggering oversimplification.

[1] https://image-ppubs.uspto.gov/dirsearch-public/print/downloa...


It'll be interesting to see what happens with the Mac Pro.

It could be that Apple does away with it entirely and the Mac Studio is the new Pro.

Or they might make a machine with PCIe support, but make it so expensive that only people with a serious need get access to it.

Or something else.


> It could be that Apple does away with it entirely and the Mac Studio is the new Pro.

Or the Mac Pro will be released without PCIe GPU support, and Apple will be able to leverage increases in Apple Silicon GPU performance to eliminate any need or desire for PCIe GPUs, drawing high-end GPU customers away from Nvidia and AMD and locking them into Apple Silicon and the Apple ecosystem.


> Or the Mac Pro will be released without PCIe GPU support, and Apple will be able to leverage increases in Apple Silicon GPU performance to eliminate any need or desire for PCIe GPUs, drawing high-end GPU customers away from Nvidia and AMD and locking them into Apple Silicon and the Apple ecosystem.

Maybe. But graphics cards aren't the only thing people put into PCIe slots.


> Maybe. But graphics cards aren't the only thing people put into PCIe slots.

Right, so there will be PCIe slots for expansion; they just won't support PCIe GPUs, just like Thunderbolt PCIe expansion chassis now. It isn't the PCIe slots that break compatibility, it's the difference in architecture between x86 and Apple Silicon that makes them incompatible.


I’ve been doing ReactJS development on both the M1 and now the M2 MacBook Air for a year now. My app’s transpiler production build takes about 5 seconds, and any incremental development build is essentially instantaneous.

I can’t imagine most developers’ work even stressing an M2 unless you are compiling Chrome or something.

Best thing: as a contractor I don’t even bother to bring a power adapter. I work for 8 hours and leave with 20%-40% remaining every day.


As a Dell XPS user I'm so jealous of laptops with competent power management.


The company was two guys soldering in a garage.


Woz later claimed that story is a myth and that most of the soldering happened elsewhere, though he never gave any specifics as to where it was done.


And the product they were selling was a bare (well, populated, but not enclosed) PCB which the user was expected to wire up to power, a keyboard, and a monitor themselves. Long-term reliability was more the responsibility of the owner than the manufacturer.


I rather doubt Steve was doing any soldering. He was the marketing guy. And of course it was his parents who were providing the garage.


He was a hardware tech at Atari for his day job at this point.


And he wanted to leave that job. I wonder if Woz' books say anything about Jobs regarding their time in the garage.


Too bad Apple's management was so timid. That could have changed the course of computing back in 1985. I had a Macintosh IIx and it was a great computer, but I would have bought the Jonathan in a heartbeat back then. Thanks for the history; it's the first I've heard of it.


Sure! A bit more detail in this thread: https://twitter.com/bensyverson/status/1494532671623544840


You can always go with a VPX system if you want something like that, but obviously that's not cheap at all.


They should use AmorphousDiskMark. But I don't doubt the results in general. It looks like Apple has a hard time sourcing new 128 GB NAND chips.

